SEO

Is SEO Dead? A Fresh Look At The Age-Old Search Industry Question

The march of time is inevitable. And every year, some new technology pounds another nail into the coffin of something older.

Whether the horse and buggy are replaced by the automobile or the slide rule is replaced by the calculator, everything eventually becomes obsolete.

And if you listen to the rumors, this time around, it’s search engine optimization. Rest in peace, SEO: 1997-2022.

There’s just one tiny little problem.

SEO is still alive and kicking. It’s just as relevant today as it has ever been. If anything, it may even be more important.

Today, 53% of all website traffic comes from organic search.

In fact, the first search result in Google averages 26.9% of click-throughs on mobile devices and 32% on desktops.

And what’s helping Google determine which results belong at the top of search engine results pages (SERPs)? SEO, of course.

Need more proof? We have more statistics to back it up.

Thanks to consistent updates to the Google search algorithm, the entire SEO field is undergoing rapid evolution.

Even setting aside the many smaller changes the search engine’s algorithm has undergone, we’ve seen several major updates in the last decade. Some of the more important ones are:

  • Panda – First put into place in February 2011, Panda was focused on quality and user experience. It was designed to eliminate black hat SEO tactics and web spam.
  • Hummingbird – Unveiled in August 2013, Hummingbird made the search engine’s core algorithm faster and more precise in anticipation of the growth of mobile search.
  • RankBrain – Rolled out in spring 2015, this update was announced in October of that year. Integrating artificial intelligence (AI) into all queries, RankBrain uses machine learning to provide better answers to ambiguous queries.
  • BERT – Rolled out in October 2019 and expanded to more languages in December 2019, this update helps Google understand natural language better.
  • Vicinity – Put into place in December 2021, Vicinity was Google’s biggest local search update in five years. Using proximity targeting as a ranking factor, local businesses are weighted more heavily in query results.

Each of these updates changed the way Google works, so each required SEO professionals to rethink their approach and adjust their strategies to keep delivering results. But the need for their services remained.

Now that we’ve established that SEO is not dead, a question remains: Where did all this death talk come from in the first place?

Most of it is based on unfounded conjecture and wild speculation. The truth is that SEO is in a state of transition, which can be scary.

And that transition is driven by three things:

  1. Artificial intelligence and machine learning, particularly Google RankBrain.
  2. Shrinking organic space on SERPs.
  3. Digital personal assistants and voice search.

The Rise Of Machine Learning

You’ve probably already recognized the impact AI has had on the world.

This exciting new technology has started to appear everywhere, from voice assistants to predictive healthcare to self-driving automobiles.

And it has been a trending topic in SEO for quite a while.

Unfortunately, most of what’s out there is incomplete information gathered from reading patents, analyzing search engine behavior, and flat-out guessing.

And part of the reason it’s so difficult to get a handle on what’s happening with AI in search is its constant evolution.

However, we will examine two identifiable trends: machine learning and natural language.

Machine learning is just what it sounds like: machines that are learning.

For a more sophisticated definition, it can be described as “a method of data analysis that automates analytical model building… a branch of artificial intelligence based on the idea that systems can learn from data, identify patterns and make decisions with minimal human intervention.”

For SEO purposes, this means gathering and analyzing information on content, user behaviors, citations, and patterns, and then using that information to create new rankings factors that are more likely to answer user queries accurately.
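Google’s actual ranking models are vastly more complex and not public, but the core idea described above, learning how much weight to give each signal from data rather than hand-tuning it, can be sketched in a few lines of Python. The features and training data here are entirely made up for illustration:

```python
# Illustrative only: a toy "learned ranking factor" model, not Google's system.
# Each page is described by two hypothetical signals (content depth, citations),
# and we learn scoring weights from past user-satisfaction labels.

def train_weights(pages, labels, lr=0.1, epochs=200):
    """Fit linear relevance weights via simple gradient descent on squared error."""
    w = [0.0, 0.0]
    for _ in range(epochs):
        for x, y in zip(pages, labels):
            pred = w[0] * x[0] + w[1] * x[1]
            err = pred - y
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
    return w

def score(page, w):
    """Relevance score for a page given the learned weights."""
    return w[0] * page[0] + w[1] * page[1]

# Hypothetical training data: (content_depth, citations) -> user satisfaction
pages = [(0.9, 0.8), (0.2, 0.1), (0.7, 0.9), (0.1, 0.3)]
labels = [1.0, 0.0, 1.0, 0.0]

w = train_weights(pages, labels)
ranked = sorted(pages, key=lambda p: score(p, w), reverse=True)
```

The point of the sketch is the workflow, not the math: behavior data goes in, and a ranking function comes out with no human hand-writing the rules.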

You will want to read this article for a more in-depth explanation of how that will work.

One of the most important factors machine learning uses when determining how to rank websites is our other trend – natural language.

From their earliest days, computers have used their own unique languages. And because these were so unlike the language humans use, there was always a disconnect between user intent and what search engines delivered.

However, as technology has grown increasingly more advanced, Google has made great strides in this field.

The most important one for SEO professionals is RankBrain, Google’s machine learning system built upon the rewrite of Google’s core algorithm that we mentioned earlier, Hummingbird.

Nearly a decade ago, Google had the foresight to recognize that mobile devices were the future wave. Anticipating what this would mean for search, Hummingbird focused on understanding conversational speech.

RankBrain builds upon this, moving Google away from a search engine that follows the links between words toward one that understands the concepts they represent.

It moved the search engine away from matching keywords in a query to more precisely identifying user intent and delivering results that more accurately matched the search.

This meant identifying which words were important to the search and disregarding those that were not.

It also developed an understanding of synonyms, so if a webpage matches a query, it may appear in the results, even if it doesn’t include the searched-for keyword.
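Google’s synonym handling is learned at enormous scale, but the effect described above can be illustrated with a toy sketch. The synonym table here is a made-up stand-in, not real Google data:

```python
# A simplified sketch of synonym-aware matching. The synonym pairs below are
# assumed for illustration; Google learns these relationships from query data.
SYNONYMS = {
    "car": {"automobile", "vehicle"},
    "cheap": {"inexpensive", "affordable", "budget"},
}

def expand_query(terms):
    """Expand each query term with its known synonyms."""
    expanded = set()
    for term in terms:
        expanded.add(term)
        expanded.update(SYNONYMS.get(term, set()))
    return expanded

def page_matches(page_text, query_terms):
    """True if the page contains any query term or a synonym of one."""
    words = set(page_text.lower().split())
    return bool(expand_query(query_terms) & words)
```

With this in place, a page about “affordable automobiles” can match the query “cheap car” even though neither searched-for keyword appears verbatim on the page.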

The biggest impact of RankBrain and machine learning has been on long-tail keywords.

In the past, websites would often jam specific but rarely searched-for keywords into their content. This allowed them to show up in queries for those topics.

RankBrain changed how Google handled these, which meant primarily focusing on long-tail keywords was no longer a good strategy. It also helped eliminate content from spammers who sought rankings for these terms.

Honey, I Shrunk The Organic Search Space

Search engines are big business; no one can deny that.

And since 2016, Google has slowly encroached on organic search results in favor of paid advertising. That was when sponsored ads were removed from the sidebar and put at the top of SERPs.

As a result, organic results were pushed further down the page, or “below the fold,” to borrow an anachronistic idiom.

From Google’s business perspective, this makes sense. The internet has become a huge part of the global economy, which means an ever-increasing number of companies are willing to pay for ad placement.

As a result of this seeming de-prioritization, organic SEO professionals are forced to develop innovative new strategies for not only showing up on the first page but also competing with paid ads.

Changes to local search have also affected SERPs. In its never-ending quest to provide more relevant results to users, Google added a local pack to search results. This group of three nearby businesses appears to satisfy the query. They are listed at the top of the first page of results, along with a map showing their location.

This was good news for local businesses who compete with national brands. For SEO professionals, however, it threw a new wrinkle into their work.

In addition to creating competition for local search results, this also opened the door for, you guessed it, local paid search ads.

And these are not the only things pushing organic results down the page. Depending on the search, your link may also have to compete with:

  • Shopping ads.
  • Automated extensions.
  • Featured snippets.
  • Video or image carousels.
  • News stories.

Additionally, Google has begun directly answering questions (and suggesting related questions and answers). This has given birth to a phenomenon known as “zero-click searches,” which are searches that end on the SERP without a click-through to another site.

In 2020, nearly 65% of all searches received no clicks, which is troubling for anyone who makes their living by generating them.

With this in mind, and as organic results sink lower and lower, it’s easy to see why some SEO professionals are becoming frustrated. But savvy web marketers see these as more than challenges – they see them as opportunities.

For example, if you can’t get your link at the top of a SERP, you can use structured data markup to grab a featured snippet. While this isn’t technically an SEO tactic, it is a way to generate clicks and traffic, which is the ultimate goal.
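As a concrete illustration of that tactic, here is a minimal sketch of FAQPage markup, one of the schema.org structured data formats Google reads for rich results, generated with Python. The question and answer text are placeholders:

```python
import json

# A minimal FAQPage structured data sketch (schema.org JSON-LD).
# The question/answer content is placeholder text for illustration.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "Is SEO dead?",
            "acceptedAnswer": {
                "@type": "Answer",
                "text": "No. Organic search still drives the majority of website traffic.",
            },
        }
    ],
}

# The serialized JSON-LD is embedded in the page inside a
# <script type="application/ld+json"> tag.
json_ld = json.dumps(faq, indent=2)
```

Marking up genuinely useful Q&A content this way helps Google parse your answers, which is a prerequisite for being pulled into answer-style SERP features.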

Use Your Voice

Not long ago, taking a note or making out your grocery list meant locating some paper and writing on it with a pen. Like a caveman.

Thankfully, those days are gone, or at least on their way out, having been replaced by technology.

Whether you’re using Siri to play your favorite song, asking Cortana how much the moon weighs, or having Alexa check the price of Apple stock, much of the internet is now available just by using your voice.

In 2020, 4.2 billion digital voice assistants were in use worldwide. And that’s a number expected to double by 2024 as more and more people adopt the Amazon Echo, Sonos One, Google Nest Hub, and the like.

And users don’t even need to own one of these smart speakers to use the power of voice search. 90% of iPhone owners use Siri, and 75% of smartphone owners use Google Assistant.

With the advent of these virtual helpers, we’ve seen a big increase in voice searching.

Isn’t technology grand?

It depends on who you are. If you work in SEO (and because you’re reading this, we’re going to assume you do), this creates some problems.

After all, how do you generate clicks to your website if no clicks are involved?

The answer is quite obvious: You need to optimize for voice search.

Voice-controlled devices don’t operate like a manual search, so your SEO content needs to consider this.

The best way to do this, again, is to improve the quality of your information. Your content needs to be the best answer to a person’s question, ensuring it ranks at the very top and gets the verbal click-throughs (if that’s a term) you need.

And because people have figured out that more specific queries generate more specific responses, it’s important that your content fills that niche.

In general, specificity seems to be a growing trend in SEO, so it’s no longer enough to just have web copy that says, “t-shirts for sale.”

Instead, you need to drill down to exactly what your target is searching for, e.g., “medium Garfield t-shirts + yellow + long-sleeve.”

What Does All This Mean For SEO?

Now that we’ve looked at the major reasons why pessimists and cynics are falsely proclaiming the demise of SEO, let’s review what we’ve learned along the way:

  1. Google will never be satisfied with its algorithms. It will always feel there is room to grow and improve its ability to precisely answer a search query. And far from being the death knell for SEO, this ensures its importance moving forward.
  2. Machine learning, especially regarding natural language, allows Google to better understand the intent behind a search and as a result, present more relevant options. Your content should focus on answering queries instead of just including keywords.
  3. Long-tail keywords are important for answering specific questions, particularly in featured answer sections, but focusing solely on them is an outdated and ineffective strategy.
  4. Organic search results lose real estate to paid ads and other features. However, this presents opportunities for clever SEO professionals to shoot to the top via things like local search.
  5. Zero-click searches constitute nearly two-thirds of all searches, which hurts SEO numbers but allows you to claim a spot of prominence as a featured snippet using structured data.
  6. The use of voice search and personal digital assistants is on the rise. This calls for rethinking SEO strategies and optimizing content to be found and used by voice search.

Have you noticed a theme running through this entire piece? It’s evolution, survival of the fittest.

To ensure you don’t lose out on important web traffic, you must constantly monitor the SEO situation and adapt to changes, just like you always have.

Your strategy needs to become more sophisticated as new opportunities present themselves. It needs to be ready to pivot quickly.

And above all, you need to remember that your content is still the most important thing.

If you can best answer a query, your site will get the traffic you seek. If it can’t, you need to rework it until it does.

Just remember, like rock and roll, SEO will never die.

Featured Image: sitthiphong/Shutterstock





Google Hints At Improving Site Rankings In Next Update


Google’s John Mueller says the Search team is “explicitly evaluating” how to reward sites that produce helpful, high-quality content when the next core update rolls out.

The comments came in response to a discussion on X about the impact of March’s core update and September’s helpful content update.

In a series of tweets, Mueller acknowledged the concerns, stating:

“I imagine for most sites strongly affected, the effects will be site-wide for the time being, and it will take until the next update to see similar strong effects (assuming the new state of the site is significantly better than before).”

He added:

“I can’t make any promises, but the team working on this is explicitly evaluating how sites can / will improve in Search for the next update. It would be great to show more users the content that folks have worked hard on, and where sites have taken helpfulness to heart.”

What Does This Mean For SEO Professionals & Site Owners?

Mueller’s comments confirm Google is aware of critiques about the March core update and is refining its ability to identify high-quality sites and reward them appropriately in the next core update.

For websites, clearly demonstrating an authentic commitment to producing helpful and high-quality content remains the best strategy for improving search performance under Google’s evolving systems.

The Aftermath Of Google’s Core Updates

Google’s algorithm updates, including the September “Helpful Content Update” and the March 2024 update, have far-reaching impacts on rankings across industries.

While some sites experienced surges in traffic, others faced substantial declines, with some reporting visibility losses of up to 90%.

As website owners implement changes to align with Google’s guidelines, many question whether their efforts will be rewarded.

There’s genuine concern about the potential for long-term or permanent demotions for affected sites.

Recovery Pathway Outlined, But Challenges Remain

In a previous statement, Mueller acknowledged the complexity of the recovery process, stating that:

“some things take much longer to be reassessed (sometimes months, at the moment), and some bigger effects require another update cycle.”

Mueller clarified that not all changes would require a new update cycle but cautioned that “stronger effects will require another update.”

While affirming that permanent changes are “not very useful in a dynamic world,” Mueller added that “recovery” implies a return to previous levels, which may be unrealistic given evolving user expectations.

“It’s never ‘just-as-before’,” Mueller stated.

Improved Rankings On The Horizon?

Despite the challenges, Mueller has offered glimmers of hope for impacted sites, stating:

“Yes, sites can grow again after being affected by the ‘HCU’ (well, core update now). This isn’t permanent. It can take a lot of work, time, and perhaps update cycles, and/but a different – updated – site will be different in search too.”

He says the process may require “deep analysis to understand how to make a website relevant in a modern world, and significant work to implement those changes — assuming that it’s something that aligns with what the website even wants.”

Looking Ahead

Google’s search team is actively working on improving site rankings and addressing concerns with the next core update.

However, recovery requires patience, thorough analysis, and persistent effort.

The best way to spend your time until the next update is to remain consistent and produce the most exceptional content in your niche.


FAQ

How long does it generally take for a website to recover from the impact of a core update?

Recovery timelines can vary and depend on the extent and type of updates made to align with Google’s guidelines.

Google’s John Mueller noted that some changes might be reassessed quickly, while more substantial effects could take months and require additional update cycles.

Google acknowledges the complexity of the recovery process, indicating that significant improvements aligned with Google’s quality signals might be necessary for a more pronounced recovery.

What impact did the March and September updates have on websites, and what steps should site owners take?

The March and September updates had widespread effects on website rankings, with some sites experiencing traffic surges while others faced up to 90% visibility losses.

Publishing genuinely useful, high-quality content is key for website owners who want to bounce back from a ranking drop or maintain strong rankings. Stick to Google’s recommendations and adapt as they keep updating their systems.

To minimize future disruptions from algorithm changes, it’s a good idea to review your whole site thoroughly and build a content plan centered on what your users want and need.

Is it possible for sites affected by core updates to regain their previous ranking positions?

Sites can recover from the impact of core updates, but it requires significant effort and time.

Mueller suggested that recovery might happen over multiple update cycles and involves a deep analysis to align the site with current user expectations and modern search criteria.

While a return to previous levels isn’t guaranteed, sites can improve and grow by continually enhancing the quality and relevance of their content.


Featured Image: eamesBot/Shutterstock



Google Reveals Two New Web Crawlers


Google revealed details of two new crawlers that are optimized for scraping image and video content for “research and development” purposes. Although the documentation doesn’t explicitly say so, it’s presumed that there is no impact on rankings should publishers decide to block the new crawlers.

It should be noted that the data scraped by these crawlers is not explicitly for AI training; that’s what the Google-Extended crawler is for.

GoogleOther Crawlers

The two new crawlers are versions of Google’s GoogleOther crawler, which launched in April 2023. The original GoogleOther crawler was also designated for use by Google product teams for one-off research and development crawls, a description that offers clues about what the new GoogleOther variants will be used for.

The purpose of the original GoogleOther crawler is officially described as:

“GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.”

Two GoogleOther Variants

There are two new GoogleOther crawlers:

  • GoogleOther-Image
  • GoogleOther-Video

The new variants are for crawling binary data, which is data that’s not text. HTML is generally stored as text (ASCII or Unicode): if a file can be read in a text editor, it’s a text file. Binary files are files that can’t be opened in a text viewer, such as image, audio, and video files.

The new GoogleOther variants are for image and video content. Google lists user agent tokens for both of the new crawlers which can be used in a robots.txt for blocking the new crawlers.

1. GoogleOther-Image

User agent tokens:

  • GoogleOther-Image
  • GoogleOther

Full user agent string:

GoogleOther-Image/1.0

2. GoogleOther-Video

User agent tokens:

  • GoogleOther-Video
  • GoogleOther

Full user agent string:

GoogleOther-Video/1.0
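Putting the user agent tokens above to use, a robots.txt that blocks only the two new research crawlers (while leaving Googlebot and regular GoogleOther untouched) might look like this illustrative fragment; adjust the Disallow paths to match your own site:

```
# Block only the new image and video research crawlers
User-agent: GoogleOther-Image
Disallow: /

User-agent: GoogleOther-Video
Disallow: /
```

Note that blocking the broader GoogleOther token instead would opt you out of all GoogleOther crawling, not just the image and video variants.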

Newly Updated GoogleOther User Agent Strings

Google also updated the user agent strings for the regular GoogleOther crawler. For blocking purposes, you can continue using the same user agent token as before (GoogleOther). The new user agent strings are simply the data sent to servers to identify the full description of the crawler, in particular the technology used. In this case, the technology is Chrome, with the version number periodically updated to reflect the version in use (W.X.Y.Z is a Chrome version number placeholder in the examples listed below).

The full list of GoogleOther user agent strings:

  • Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; GoogleOther)
  • Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; GoogleOther) Chrome/W.X.Y.Z Safari/537.36

GoogleOther Family Of Bots

These new bots may show up in your server logs from time to time. This information will help you identify them as genuine Google crawlers, and it will help publishers who want to opt out of having their images and videos scraped for research and development purposes.
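As a sketch of that log-identification step, a simple token check over your access logs might look like the following. Token matching alone can be spoofed, so for certainty, pair it with verification of the requesting IP against Google’s published crawler ranges:

```python
# A simple sketch for spotting the GoogleOther family in server logs.
# Most specific tokens are listed first so variants are reported precisely.
GOOGLEOTHER_TOKENS = ("GoogleOther-Image", "GoogleOther-Video", "GoogleOther")

def classify_crawler(user_agent):
    """Return the most specific GoogleOther token found in a user agent string,
    or None if the request does not identify as a GoogleOther crawler."""
    for token in GOOGLEOTHER_TOKENS:
        if token in user_agent:
            return token
    return None

# Example user agent string from the updated GoogleOther documentation
# (124.0.0.0 stands in for the W.X.Y.Z Chrome version placeholder).
ua = ("Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
      "GoogleOther) Chrome/124.0.0.0 Safari/537.36")
```

Running `classify_crawler` over each logged user agent gives you a quick census of which GoogleOther variants are visiting your site.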

Read the updated Google crawler documentation for:

  • GoogleOther-Image
  • GoogleOther-Video

Featured Image by Shutterstock/ColorMaker



ChatGPT To Surface Reddit Content Via Partnership With OpenAI


Reddit partners with OpenAI to integrate content into ChatGPT.

  • Reddit and OpenAI announce a partnership.
  • Reddit content will be used in ChatGPT.
  • Concerns about accuracy of Reddit user-generated content.
