Google Algorithms And Updates Focusing On User Experience: A Timeline


As the role of search evolves to touch multiple marketing and consumer touchpoints, optimizing for the user has never been so important.

This is reflected in Google’s continual focus on the searcher experience, whether through its core algorithmic updates, new features, products, or SERP format changes.

While some of these Google changes have involved updates targeting low-quality content, links, and spam, other updates aim to understand consumer behavior and intent.

For example, most recent updates have focused on page speed, Core Web Vitals, and product reviews.

Considering the massive competition for SERP real estate from brands, even slight drops in position will critically impact traffic, revenue, and conversions.

In this article, I examine a combination of some (not all) Google updates and technological advancements that significantly reflect the search engine’s focus on the human user and their experiences online – from Panda in 2011 through to Page and Product Experience in 2021 and 2022.

Google Panda (2011)

First launched in February 2011, Panda was updated continuously and later incorporated into Google’s core algorithm.

Panda was announced to target sites with low-quality content; this was one of the first signals that Google focused on content for the user experience.

The focus: producing and optimizing unique and compelling content.

  • Avoid thin content and focus on producing high-quality information.
  • Measure quality over quantity.
  • Content length is not a significant factor but needs to contain information that answers the user’s needs.
  • Avoid duplicate content – initially a big concern for ecommerce sites. Most recently, Google’s John Mueller explained that duplicate content is not a negative ranking factor.

Google Hummingbird (2013)

Following the introduction of the Knowledge Graph came Hummingbird with a focus on semantic search.

Hummingbird was designed to help Google better understand the intent and context behind searches.

As users looked to enter queries more conversationally, it became essential to optimize for user experience by focusing on content beyond the keyword with a renewed focus on the long tail.

This was the first indication of Google using natural language processing (NLP) to identify black hat techniques and create personalized SERP results.

The focus: creating and optimizing content that audiences want and find helpful.

  • Long-tail keywords and intent model strategies became crucial.
  • Content creation is needed to address what users are interested in and would like to learn.
  • Expand keyword research to include conceptual and contextual factors.
  • Avoid keyword-stuffing and producing low-quality content to personalize experiences.
Image source: BrightEdge, July 2022

E-A-T (2014)

Although it gained attention in 2018, the Google E-A-T concept first appeared in 2014 in Google’s Quality Guidelines.

Now, it is part of Google’s guidelines on focusing on YMYL – your money or your life.

Marketers were advised to focus on content that could impact their readers’ future happiness, health, financial stability, or safety.

Google established E-A-T guidelines to help marketers tailor on and off-page SEO and content strategies to provide users with an experience containing the most relevant content from sources they could trust.

In other words: Expertise, Authoritativeness, and Trustworthiness.

The focus: ensuring websites offer expert and authoritative content that users can trust.

  • Create content that shows expertise and knowledge of the subject matter.
  • Focus on the credibility and authority of websites publishing content.
  • Improve the overall quality of websites – structure and security.
  • Earn off-page press coverage on reputable sites, reviews, testimonials, and expert authors.

Mobile Update (2015)

This was the first time Google gave marketers a heads-up (or a warning, for many) that an update was coming.

Focusing on the user’s experience on mobile was a significant signal reflecting the growing use of mobile as part of the customer search journey.

Google clearly communicated that this update would prioritize mobile-friendly websites on mobile SERPs. Many more mobile updates followed.

The focus: mobile content and users’ mobile site experience.

  • Focus on design factors such as responsive design and mobile page structures.
  • Enhance site navigation, so mobile users can quickly find what they need.
  • Avoid formatting issues on mobile that differ from the desktop experience.
  • Confirm that websites are mobile-optimized.

Just after the mobile update went live, Google quietly issued a Quality update.

Websites that focused on the user experience by focusing on quality content and avoiding too much irrelevant user-generated content and too many ads did well. This was another sign that Google was putting the user experience first.

RankBrain (2015)

Like the Hummingbird principles and NLP mentioned earlier, Google RankBrain was more of a change to the algorithm.

It gave us an indication of how vital machine learning was in all marketing and technology forms.

Utilizing machine learning to learn and predict user behavior, RankBrain powered search results based on an even better understanding of users’ intent.

The focus: ensuring that content reflects user intent and optimizing for conversational search.

  • Place greater focus and emphasis on creating content that matches the user’s intent.
  • Ensure that all aspects of technical SEO are updated (such as schema markup, for example).
  • Google indicated that RankBrain was the third-most important ranking signal.
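The bullet above mentions schema markup. As a concrete illustration, here is a minimal Python sketch that builds and prints a simple JSON-LD Article object of the kind a page would embed in a `<script type="application/ld+json">` tag. The headline, author, and date are invented placeholders, not values from any real page:

```python
import json

# A minimal, hypothetical JSON-LD Article snippet; real pages would
# embed the printed JSON inside a <script type="application/ld+json"> tag.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How To Choose Running Shoes",  # placeholder headline
    "author": {"@type": "Person", "name": "Jane Doe"},  # placeholder author
    "datePublished": "2022-07-01",
}

print(json.dumps(article_schema, indent=2))
```

Structured data like this helps search engines understand the page’s subject matter, which is in keeping with RankBrain-era advice to make intent explicit.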

Google Mobile-First Indexing (2018)

The Mobile-First Indexing Update meant that Google would use the mobile version of a webpage for indexation and ranking.

Once again, this was aimed at enhancing the user experience and helping users find what they are looking for.

Producing content for mobile and focusing on speed and performance became paramount to success.

The focus: re-affirming the importance of mobile optimization, content, speed, and mobile site performance.

  • Improve AMP and mobile page speed and performance.
  • Ensure that URL structures for mobile and desktop sites meet Google requirements.
  • Add structured data for both desktop and mobile versions.
  • Make sure the mobile site contains the same content as the desktop site.

Google said that March 2021 was the rollout date for its mobile-first index.

Shortly afterward, Google made mobile page speed a ranking factor so website owners would focus on load times and page speed to enhance the user experience.

Broad Core Algorithm Updates (2018)

2018 was a year in which Google released several broad core algorithm updates, covering areas such as social signals and the so-called Medic update.

After the August update, in particular, Google’s John Mueller suggested making content more relevant.

While there was some confusion on ranking factors and fixing specific issues, it did bring the concept of E-A-T and content for the user top of mind for many SEO professionals and content marketers.

On the topic of rater guidelines being key to the broad update, Google’s Danny Sullivan suggested:

“Want to do better with a broad change? Have great content. Yeah, the same boring answer. But if you want a better idea of what we consider great content, read our raters guidelines. That’s like almost 200 pages of things to consider.”

BERT (2019)

Following RankBrain, this neural network-based method for natural language processing allowed Google to understand conversational queries better.

BERT allows users to find valuable and accurate information more easily.

According to Google, this represented the most significant leap forward in the past five years and one of the greatest in search history.

The focus: improving the understanding of consumer intent through conversational search queries.

  • Increase the depth and specifics of the content.
  • Work more with long-tail queries and phrases using more than three words.
  • Ensure that content addresses the users’ questions or queries and is optimized correctly.
  • Focus on writing for humans clearly and concisely so that it is easy to understand.

Read more on BERT and SMITH here.

COVID-19 Pandemic (March 2020)

The global pandemic meant that consumer behavior and search patterns changed forever as Google continued to focus on E-A-T signals.

Google began to emphasize YMYL signals as the internet struggled to cope with misinformation and SEO pros struggled to keep up with the rapid shifts and dips in consumer behavior.

From setting up 24-hour incident response teams with the World Health Organization and policing content to helping people find helpful information and avoid misinformation, the user’s needs had never been more important.

The demand for SEO rose to an all-time high, and Google released a COVID-19 playbook.

Google Page Experience Update And Core Web Vitals Announced (May 2020)

This update focused on a site’s technical health, introducing metrics to measure the user experience of a page. These metrics include how quickly page content loads, how quickly a browser loading a webpage can respond to a user’s input, and how unstable the content is as it loads in the browser.

The focus: integrating new Core Web Vitals metrics to measure and improve on-page experiences.

  • Mobile-friendliness, safe browsing, HTTPS, and intrusive interstitials – The Google Page Experience Signal.
  • LCP (Largest Contentful Paint): Improve page load times for large images and video backgrounds.
  • FID (First Input Delay): Ensure your browser responds quickly to a user’s first interaction with a page.
  • CLS (Cumulative Layout Shift): Include the size attributes on your images and video elements or reserve the space with CSS aspect ratio boxes and ensure content is never inserted above existing content, except in response to user interaction.
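The CLS advice above (declare image dimensions up front) can be checked mechanically. Below is a minimal Python sketch, using only the standard library, that flags `<img>` tags missing explicit `width`/`height` attributes; the file names are hypothetical:

```python
from html.parser import HTMLParser

class CLSImageCheck(HTMLParser):
    """Flag <img> tags that lack explicit width/height attributes,
    a common cause of cumulative layout shift as the page loads."""

    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        if "width" not in attrs or "height" not in attrs:
            self.flagged.append(attrs.get("src", "(no src)"))

checker = CLSImageCheck()
checker.feed('<img src="hero.jpg"><img src="logo.png" width="120" height="40">')
print(checker.flagged)  # ['hero.jpg']
```

A crawl that runs a check like this over every template is a quick way to find the pages most likely to fail the CLS threshold.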

Broad Core Algorithm Updates (2020)

The third Google core algorithm update of the year rolled out in December 2020. This came in the form of slight changes that affect the order and weight of certain (not always disclosed) ranking signals.

According to SEJ Contributor Ryan Jones:

“Google aims to serve content that provides the best and most complete answers to searchers’ queries. Relevance is the one ranking factor that will always win out over all others.”

Read more on December’s Core Update here.

Passage Ranking (February 2021)

Google officially rolled out its passage-based indexing, designed to help users find answers to specific questions.

You’ve probably seen this in the wild, but essentially this allows Google to highlight pertinent elements of a passage within a piece of content that fits the question.

This means long-form content that may not be skimmable but provides valuable answers could be surfaced as a result.

Ultimately, this makes it easier for Google to connect users to content without making them hunt for the specific answer to their questions when they click on a page.

Screenshot from blog.google, July 2022

The key to success with passage ranking goes back to focusing on creating great content for the user.

Read more on the 16 Key Points You Should Know here.

Product Reviews Update (April 2021)

This new product review update was designed to improve a user’s experience when searching for product reviews.

Marketers were advised to avoid creating thin content, as this update rewards content that users find most helpful.

The focus: rewarding creators who provide users with authentic and detailed review content.

Google shared nine helpful questions to consider when creating and publishing product reviews.

  • Show expert knowledge about products.
  • Differentiate your product compared to competitors.
  • Highlight benefits and any drawbacks clearly and concisely.
  • Show how the product has evolved to fit the needs of the user.

Read more here.

MUM (May 2021)

Following RankBrain and BERT, MUM (Multitask Unified Model) technology utilizes AI and NLP to improve information retrieval.

For the end user, this technological advancement helps provide better information and results as it processes multiple media formats such as video, images, and audio.

Pandu Nayak, Google fellow and vice president of Search, said:

“But with a new technology called Multitask Unified Model, or MUM, we’re getting closer to helping you with these types of complex needs. So in the future, you’ll need fewer searches to get things done.”

Read more here.

Page Experience Update And Core Web Vitals (CWV) Rollout (June 2021)

The much-anticipated Page Experience Update, including Core Web Vitals, rolled out, with further updates to desktop following in March 2022.

Nine months after the rollout of Google’s Core Web Vitals and over a year since BrightEdge launched pre-roll predictive research, new research showed how many industries are adapting and improving their Core Web Vitals.

The focus: improving page experiences for users with speed and precision.

 

Image source: BrightEdge, July 2022
  • Retail giants have made significant strides in improving experiences.
  • In cases like Retail, CWV metrics like input delay have been cut in half.
  • Although Finance was the best prepared last year, it made the least performance gains in the categories ​evaluated.

Spam Update (June 2021) And Link Spam Algorithm Update (July 2021)

Image source: BrightEdge, July 2022

Ensuring users get the right results based on their searches is foundational to a good experience.

In addition, updates and algorithm changes help protect users’ privacy to keep searches safe and secure.

The focus: keeping user experiences safe.

Learn more in this video from Google here.

Local Search Update (November 2021)

Google has always provided local search updates for local search users and fine-tuned its algorithm for better user results.

Local search is a huge channel, not to be underestimated, but a whole other post.

This also includes guidance on how businesses can improve their local ranking for improved customer experiences.

Read more here.

Product Algorithm Update (March 2022)

On March 23, 2022, Google provided an update on how product reviews had been performing over the previous year.

This also informed the community of improved rollout updates that will help users surface accurate and relevant information to help with purchasing decisions.

The focus: user experience and surfacing results that help users make purchasing easier.

Screenshot from Google Search Central blog, July 2022
  • As always, showcase your expertise and ensure the content is authentic.
  • Share why you recommend products with evidence to support it.

Read more advice here and here.

Conclusion

A successful user experience requires a combination of content and technical expertise. Updates and guidance help marketers create content for the user.

In addition, algorithms and technological advancements help Google surface better results and showcase accurate, relevant, and trustworthy content.

Google will continue to focus on improving experiences for its users.

For marketers who want to optimize for both, ensuring your website performs well (navigation, speed, and reliability) and focusing on content are vital.

Many of Google’s updates signal that technical SEO, data science, and content marketing excellence are coming together.

Stay up to date and read through all of Google’s Updates here on SEJ.



Featured Image: Gorodenkoff/Shutterstock




Google Discusses Fixing 404 Errors From Inbound Links


Google’s John Mueller responded to a thread in Reddit about finding and fixing inbound broken links, offering a nuanced insight that some broken links are worth finding and fixing and others are not.

Reddit Question About Inbound Broken Links

Someone asked on Reddit if there’s a way to find broken links for free.

This is the question:

“Is it possible to locate broken links in a similar manner to identifying expired domain names?”

The person asking the question clarified if this was a question about an inbound broken link from an external site.

John Mueller Explains How To Find 404 Errors To Fix

John Mueller responded:

“If you want to see which links to your website are broken & “relevant”, you can look at the analytics of your 404 page and check the referrers there, filtering out your domain.

This brings up those which actually get traffic, which is probably a good proxy.

If you have access to your server logs, you could get it in a bit more detail + see which ones search engine bots crawl.

It’s a bit of technical work, but no external tools needed, and likely a better estimation of what’s useful to fix/redirect.”

In his response, John Mueller answers the question on how to find 404 responses caused by broken inbound links and identify what’s “useful to fix” or to “redirect.”
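Mueller’s log-based approach can be sketched in a few lines. The snippet below, which assumes Apache/Nginx combined log format and entirely hypothetical domains and paths, counts 404 hits whose referrer is an external site, i.e. likely broken inbound links worth fixing or redirecting:

```python
import re
from collections import Counter

# Matches the request, status, and referrer fields of a combined-log-format line:
# ... "GET /path HTTP/1.1" 404 512 "https://referrer.example/page" "user-agent"
LOG_RE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3}) \S+ "(?P<ref>[^"]*)"'
)

def broken_inbound_links(log_lines, own_domain):
    """Count 404 hits with an external referrer, sorted by frequency."""
    hits = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if not m or m.group("status") != "404":
            continue
        ref = m.group("ref")
        # Skip empty referrers and internal navigation on our own domain.
        if ref in ("", "-") or own_domain in ref:
            continue
        hits[(m.group("path"), ref)] += 1
    return hits.most_common()

sample = [
    '1.2.3.4 - - [01/Jan/2024] "GET /old-page HTTP/1.1" 404 512 "https://blog.example/post" "UA"',
    '1.2.3.4 - - [01/Jan/2024] "GET /about HTTP/1.1" 200 512 "https://mysite.example/" "UA"',
]
print(broken_inbound_links(sample, "mysite.example"))
# [(('/old-page', 'https://blog.example/post'), 1)]
```

The most frequent (path, referrer) pairs are, as Mueller puts it, a good proxy for which broken links actually get traffic and are therefore worth fixing.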

Mueller Advises On When Not To “Fix” 404 Pages

John Mueller next offered advice on when it doesn’t make sense to fix a 404 page.

Mueller explained:

“Keep in mind that you don’t have to fix 404 pages, having things go away is normal & fine.

The SEO ‘value’ of bringing a 404 back is probably less than the work you put into it.”

Some 404s Should Be Fixed And Some Don’t Need Fixing

John Mueller said that there are situations where a 404 error generated from an inbound link is easy to fix and suggested ways to find those errors and fix them.

Mueller also said that there are some cases where it’s basically a waste of time.

What wasn’t mentioned was the difference between the two, which may have caused some confusion.

Inbound Broken Links To Existing Webpages

There are times when another site links to your site but uses the wrong URL. Traffic from the broken link on the outside site will generate a 404 response code on your site.

These kinds of links are easy to find and useful to fix.

There are other situations when an outside site will link to the correct webpage but the webpage URL changed and the 301 redirect is missing.

Those kinds of inbound broken links are also easy to find and useful to fix. If in doubt, read our guide on when to redirect URLs.

In both of those cases the inbound broken links to the existing webpages will generate a 404 response and this will show up in server logs, Google Search Console and in plugins like the Redirection WordPress plugin.

If the site is on WordPress and it’s using the Redirection plugin, identifying the problem is easy because the Redirection plugin offers a report of all 404 responses with all the necessary information for diagnosing and fixing the problem.

In the case where the Redirection plugin isn’t used one can also hand code an .htaccess rule for handling the redirect.

Lastly, one can contact the other website that’s generating the broken link and ask them to fix it. There’s always a small chance that the other site might decide to remove the link altogether. So it might be easier and faster to just fix it on your side.

Whichever approach is taken to fix the external inbound broken link, finding and fixing these issues is relatively simple.
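As a rough illustration of the hand-coded .htaccess approach mentioned above, this Python sketch generates `Redirect 301` rules from an old-to-new URL mapping. The paths are hypothetical, and a real deployment would review each rule before adding it to the server config:

```python
# Hypothetical old-URL -> new-URL mapping gathered from 404 reports.
redirects = {
    "/2021/old-event": "/events",
    "/produts/widget": "/products/widget",  # inbound link with a typo
}

def htaccess_rules(mapping):
    """Emit one 'Redirect 301' line per entry, in Apache .htaccess syntax."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

print(htaccess_rules(redirects))
```

Each emitted line follows Apache’s `Redirect` directive syntax, so the output can be pasted directly into an .htaccess file.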

Inbound Broken Links To Removed Pages

There are other situations where an old webpage was removed for a legitimate reason, like an event passed or a service is no longer offered.

In that case it makes sense to just show a 404 response code because that’s one of the reasons why a 404 response should be shown. It’s not a bad thing to show a 404 response.

Some people might want to get some value from the inbound link and create a new webpage to stand in for the missing page.

But that might not be useful because the link is for something that is irrelevant and of no use because the reason for the page no longer exists.

Even if you create a new page for that reason, some of that link equity might flow to it, but it’s of little use because the topic of the inbound link is irrelevant to anything but the expired reason.

Redirecting the missing page to the home page is a strategy that some people use to benefit from the link to a page that no longer exists. But Google treats those links as Soft 404s, which then passes no benefit.

These are the cases that John Mueller was probably referring to when he said:

“…you don’t have to fix 404 pages, having things go away is normal & fine.

The SEO ‘value’ of bringing a 404 back is probably less than the work you put into it.”

Mueller is right, there are some pages that should be gone and totally removed from a website and the proper server response for those pages should be a 404 error response.


Site Quality Is Simpler Than People Think


Google’s John Mueller, Martin Splitt and Gary Illyes discussed site quality in a recent podcast, explaining the different ways of thinking about site quality and at one point saying it’s not rocket science. The discussion suggests that site quality could be simpler than most people know.

Site Quality Is Not Rocket Science

The first point they touched on is to recommend reading site quality documentation, insisting that site quality is not especially difficult to understand.

Gary Illyes said:

“So I would go to a search engine’s documentation.

Most of them have some documentation about how they function and just try to figure out where your content might be failing or where your page might be failing because honestly, okay, this is patronizing, but it’s not rocket science.”

No Tools For Site Quality – What To Do?

Gary acknowledged that there’s no tool for diagnosing site quality, not in the same way there are tools for objectively detecting technical issues.

The traffic metrics that show a downward movement don’t explain why, they just show that something changed.

Gary Illyes:

“I found the up-down metric completely useless because you still have to figure out what’s wrong with it or why people didn’t like it.

And then you’re like, “This is a perfectly good page. I wrote it, I know that it’s perfect.”

And then people, or I don’t know, like 99.7% of people are downvoting it. And you’re like, ‘Why?’”

Martin Splitt

“And I think that’s another thing.

How do I spot, I wrote the page, so clearly it is perfect and helpful and useful and amazing, but then people disagree, as you say.

How do you think about that? What do you do then?

How can I make my content more helpful, better, more useful? I don’t know.

…There’s all these tools that I can just look at and I see that something’s good or something’s bad.

But for quality, how do I go about that?”

Gary Illyes

“What if quality is actually simpler than at least most people think?

…What if it’s about writing the thing that will help people achieve whatever they need to achieve when they come to the page? And that’s it.”

Martin Splitt asked if Gary was talking about reviewing the page from the perspective of the user.

Illyes answered:

“No, we are reframing.”

Reframing generally means to think about the problem differently.

Gary’s example is to reframe the problem as whether the page delivers what it says it’s going to deliver (like helping users achieve X,Y,Z).

Something I see a lot with content is that the topic being targeted (for example, queries about how to catch a trout) isn’t matched by the content (which might actually be about tools for catching trout), which is not what the site visitor wants to achieve.

Quality In Terms Of Adding Value

There are different kinds of things that relate to site and page quality and in the next part of the podcast John Mueller and Gary Illyes discuss the issue about adding something of value.

Adding something of value came up in the context of where the SERPs offer good answers from websites that people not only enjoy but they expect to see those sites as answers for those queries.

You can tell when users expect specific sites for individual search queries when Google Suggest shows the brand name alongside the keyword.

That’s a clue that probably a lot of people are turning keywords into branded searches, which signals to Google what people want to see.

So, the problem of quality in those situations isn’t about being relevant for a query with the perfect answer.

For these situations, like for competitive queries, it’s not enough to be relevant or have the perfect answer.

John Mueller explains:

“The one thing I sometimes run into when talking with people is that they’ll be like, “Well, I feel I need to make this page.”

And I made this page for users in air quotes…

But then when I look at the search results, it’s like 9,000 other people also made this page.

It’s like, is this really adding value to the Internet?

And that’s sometimes kind of a weird discussion to have.

It’s like, ‘Well, it’s a good page, but who needs it?’

There are so many other versions of this page already, and people are happy with those.”

This is the type of situation where competitive analysis to “reverse engineer” the SERPs works against the SEO.

It’s stale because using what’s in the SERPs as a template for what to rank is feeding Google what it already has.

It’s like, as an example, let’s represent the site ranked in Google with a baseline of the number zero.

Let’s imagine everything in the SERPs has a baseline of zero. Less than zero is poor quality. Higher than zero is higher quality.

Zero is not better than zero, it’s just zero.

The SEOs who think they’re reverse engineering Google by copying entities, copying topics, they’re really just achieving an imperfect score of zero.

So, according to Mueller, Google responds with, “it’s a good page, but who needs it?”

What Google is looking for in this situation is not the baseline of what’s already in the SERPs, zero.

According to Mueller, they’re looking for something that’s not the same as the baseline.

So in my analogy, Google is looking for something above the baseline of what is already in the SERPs, a number greater than zero, which is a one.

You can’t add value by feeding Google back what’s already there. And you can’t add value by doing the same thing ten times bigger. It’s still the same thing.

Breaking Into The SERPs By The Side Door

Gary Illyes next discusses a way to break into a tough SERP, saying the way to do it is indirectly.

This is an old strategy but a good one that still works today.

So, rather than bringing a knife to a gunfight, Gary Illyes suggests choosing more realistic battles to compete in.

Gary continued the conversation about competing in tough SERPs.

He said:

“…this also is kind of related to the age-old topic that if you are a new site, then how can you break into your niche?

I think on today’s Internet, like back when I was doing ‘SEO’, it was already hard.

For certain topics or niches, it was absolutely a nightmare, like ….mesothelioma….

That was just impossible to break into. Legal topics, it was impossible to break into.

And I think by now, we have so much content on the Internet that there’s a very large number of topics where it is like 15 years ago or 20 years ago, that mesothelioma topic, where it was impossible to break into.

…I remember Matt Cutts, former head of Web Spam, …he was doing these videos.

And in one of the videos, he said try to offer something unique or your own perspective to the thing that you are writing about.

Then the number of perspective or available perspectives, free perspectives, is probably already gone.

But if you find a niche where people are not talking too much about, then suddenly, it’s much easier to break into.

So basically, this is me saying that you can break into most niches if you know what you are doing and if you are actually trying to help people.”

What Illyes is suggesting as a direction is to “know what you are doing and if you are actually trying to help people.”

That’s one of my secrets to staying one step ahead in SEO.

For example, before the reviews update, before Google added Experience to E-A-T, I was telling clients privately to do that for their review pages and I told them to keep it a secret, because I knew I had it dialed in.

I’m not psychic, I was just looking at what Google wants to rank and I figured it out several years before the reviews update that you need to have original photos, you need to have hands-on experience with the reviewed product, etc.

Gary’s right when he advises to look at the problem from the perspective of “trying to help people.”

He next followed up with this idea about choosing which battles to fight.

He said:

“…and I think the other big motivator is, as always, money. People are trying to break into niches that make the most money. I mean, duh, I would do the same thing probably.

But if you write about these topics that most people don’t write about, let’s say just three people wrote about it on the Internet, then maybe you can capture some traffic.

And then if you have many of those, then maybe you can even outdo those high-traffic niches.”

Barriers To Entry

What Gary is talking about is how to get around the barrier to entry, which are the established sites. His suggestion is to stay away from offering what everyone else is offering (which is a quality thing).

Creating content that the bigger sites can’t or don’t know to create is an approach I’ve used with a new site.

Weaknesses can be things that the big site does poorly, like their inability to resonate with a younger or older audience and so on.

Those are examples of offering something different that makes the site stand out from a quality perspective.

Gary is talking about picking the battles that can be won, planting a flag, then moving on to the next hill.

That’s a far better strategy than going toe to toe with a bigger opponent.

Analyzing For Quality Issues

It’s a lot easier to analyze a site for technical issues than it is for quality issues.

But a few of the takeaways are:

  • Be aware that the people closest to the content are not always the best judges of content quality.
  • Read Google’s search documentation (for on-page factors, content, and quality guidelines).
  • Content quality is simpler than it seems. Just think about knowing the topic well and being helpful to people.
  • Being original is about looking at the SERPs for things that you can do differently, not about copying what the competitors are doing.

In my experience, it’s super important to keep an open mind and not get locked into one way of thinking, especially when it comes to site quality. A fixed point of view can keep one from seeing the true cause of ranking issues.

Featured Image by Shutterstock/Stone36


Is Alt Text A Ranking Factor For Google Image Search?


Alt text is used to help computers read images.

But can the alt attribute affect your organic search rankings?

Read on to learn whether there is any connection between alt text and improved rankings in Google Image Search results.

The Claim: Alt Text Is A Ranking Factor

What is alt text?

Alt text is an HTML image attribute. It provides a text alternative for your image that is displayed if the image fails to load and is read aloud by screen readers.

Because of its importance to Google Image Search, it is considered a ranking factor.


Alt Text As A Ranking Factor: The Evidence

Google emphasizes how alt text plays a vital role in getting your images recognized by Google Image Search.

You will find a page on image best practices in Google Search Central’s Advanced SEO documentation. In a section called “about alt text,” Google discusses the use of alt text.

“Google uses alt text along with computer vision algorithms and the contents of the page to understand the subject matter of the image. Also, alt text in images is useful as anchor text if you decide to use an image as a link.”

While the company doesn’t specify that alt text will improve your rankings, it warns website owners that improper use can harm your website.

“When writing alt text, focus on creating useful, information-rich content that uses keywords appropriately and is in context of the content of the page.

Avoid filling alt attributes with keywords (also known as keyword stuffing) as it results in a negative user experience and may cause your site to be seen as spam.”

It also offers the following examples of good and bad alt text usage.

Screenshot from Google Search Central best practices for images, August 2023

Google Sites Help documentation indicates that images may come with pre-populated alt text, including keywords for which you may not want to optimize.

“Some images automatically include alt text, so it’s a good idea to check that the alt text is what you want.”

For example, when I download stock photos, a text description of the image is embedded in the file.

Screenshot by author, August 2023

When uploaded to a content management system (CMS) like WordPress, the text descriptions may need to be moved to the alt text field or modified to remove unnecessary keywords.

Screenshot from WordPress, August 2023
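Google’s guidance above – descriptive alt text, no keyword stuffing, nothing missing or left over from a stock-photo download – can be checked programmatically. Here is a minimal audit sketch using only Python’s standard library; the repeated-word threshold and the sample image tags are arbitrary assumptions for illustration, not anything Google publishes:

```python
from html.parser import HTMLParser
from collections import Counter

class AltAuditor(HTMLParser):
    """Collects <img> tags and flags missing or keyword-stuffed alt text."""
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "?")
        alt = (attrs.get("alt") or "").strip()
        if not alt:
            self.issues.append((src, "missing or empty alt text"))
            return
        words = alt.lower().split()
        # Arbitrary heuristic: any single word repeated 3+ times
        # suggests keyword stuffing and deserves a manual look.
        if words and Counter(words).most_common(1)[0][1] >= 3:
            self.issues.append((src, "possible keyword stuffing"))

page = """
<img src="dog.jpg" alt="Dalmatian puppy playing fetch">
<img src="dog2.jpg">
<img src="dog3.jpg" alt="dog dog dogs puppy dog pets buy dog">
"""
auditor = AltAuditor()
auditor.feed(page)
for src, issue in auditor.issues:
    print(f"{src}: {issue}")
```

A check like this won’t judge whether the description is actually useful – that still takes a human – but it catches the two mechanical problems Google’s documentation warns about.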

Google Search Central’s “Search Engine Optimization Starter Guide” offers the following advice about alt text when using images as links:

“…if you’re using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don’t recommend using too many images for links in your site’s navigation when text links could serve the same purpose.”

In 2020, John Mueller, Google Search Advocate, answered a question about the alt text of a quote image during a Google Webmaster Office Hours. In the answer, he talked about how Google uses it:

“For Search, what happens with the alt attribute is we use that to better understand the images themselves, in particular, for Image Search. So if you didn’t care about Image Search, then from a Search point of view, you don’t really need to worry about alt text.

But if you do want these images to be shown in Image Search, which sometimes it makes sense to show fancy quotes in Image Search as well, then using the alt attribute is a good way to tell us this is on that image and we’ll get extra information from around your page with regard to how we can rank that landing page.”

Moz also discusses alt text in the context of ranking factors. Rather than claiming alt text itself is a ranking factor, Moz advises:

“…alt text offers you another opportunity to include your target keyword. With on-page keyword usage still pulling weight as a search engine ranking factor, it’s in your best interest to create alt text that both describes the image and, if possible, includes a keyword or keyword phrase you’re targeting.”

In 2021, during a Twitter discussion about whether alt text benefits SEO, Google’s Martin Splitt said:

“Yep, alt text is important for SEO too!”

Later in 2021, during a conversation about optimizing for indexing, Mueller noted that alt text is not magic:

“My understanding was that alt attributes are required for HTML5 validation, so if you can’t use them with your platform, that sounds like a bug. That said, alt text isn’t a magic SEO bullet.”


Alt Text As A Ranking Factor: Our Verdict


Alt text is a confirmed ranking factor for image search only. You should craft descriptive, non-spammy alt text to help your images appear in Google Image Search results.

In regular web search, however, alt text is not a distinct ranking factor: Google has clarified that it is treated like normal on-page text. So it isn’t useless, but it isn’t weighed separately from the rest of your page content.

That doesn’t mean you should ignore alt text. It’s a helpful accessibility tool for screen readers. When you’re writing alt text, ask yourself what you want someone who can’t see the image to understand about it.


Featured Image: Paulo Bobita/SearchEngineJournal


