
Google Algorithms And Updates Focusing On User Experience: A Timeline


As the role of search evolves to touch multiple marketing and consumer touchpoints, optimizing for the user has never been so important.

This is reflected in Google’s continual focus on the searcher experience, whether in its core algorithmic updates, new features, products, or SERP format changes.

While some of these Google changes have involved updates targeting low-quality content, links, and spam, other updates aim to understand consumer behavior and intent.

For example, most recent updates have focused on page speed, Core Web Vitals, and product reviews.

Considering the massive competition for SERP real estate, even slight drops in position can critically impact traffic, revenue, and conversions.

In this article, I examine a combination of some (not all) Google updates and technological advancements that significantly reflect the search engine’s focus on the human user and their experiences online – from Panda in 2011 through to Page and Product Experience in 2021 and 2022.

Google Panda (2011)

Panda first launched in February 2011, with subsequent updates rolling out continuously before it was incorporated into Google’s core algorithm.

Panda was announced as an update targeting sites with low-quality content; this was one of the first signals that Google was focusing on content as part of the user experience.

The focus: producing and optimizing unique and compelling content.

  • Avoid thin content and focus on producing high-quality information.
  • Measure quality over quantity.
  • Content length is not a significant factor in itself, but content needs to contain information that answers the user’s needs.
  • Avoid duplicate content – initially a big concern for ecommerce sites. Most recently, Google’s John Mueller explained that duplicate content is not a negative ranking factor.

Google Hummingbird (2013)

Following the introduction of the Knowledge Graph came Hummingbird with a focus on semantic search.

Hummingbird was designed to help Google better understand the intent and context behind searches.

As users looked to enter queries more conversationally, it became essential to optimize for user experience by focusing on content beyond the keyword with a renewed focus on the long tail.

This was the first indication of Google using natural language processing (NLP) to identify black hat techniques and create personalized SERP results.

The focus: creating and optimizing content that audiences want and find helpful.

  • Long-tail keywords and intent model strategies became crucial.
  • Create content that addresses what users are interested in and would like to learn.
  • Expand keyword research to include conceptual and contextual factors.
  • Avoid keyword-stuffing and producing low-quality content to personalize experiences.
Image source: BrightEdge, July 2022

E-A-T (2014)

Although it gained attention in 2018, the Google E-A-T concept first appeared in 2014 in Google’s Quality Guidelines.

It is now part of Google’s guidelines on YMYL (Your Money or Your Life) content.

Marketers were advised to focus on content that could impact their readers’ future happiness, health, financial stability, or safety.

Google established E-A-T guidelines to help marketers tailor on-page and off-page SEO and content strategies to provide users with an experience containing the most relevant content from sources they could trust.

In other words: Expertise, Authoritativeness, and Trustworthiness.

The focus: ensuring websites offer expert and authoritative content that users can trust.

  • Create content that shows expertise and knowledge of the subject matter.
  • Focus on the credibility and authority of websites publishing content.
  • Improve the overall quality of websites – structure and security.
  • Earn off-page press coverage on reputable sites, reviews, testimonials, and expert authors.

Mobile Update (2015)

This was the first time Google gave marketers a heads-up (or a warning, for many) that an update was coming.

Focusing on the user’s experience on mobile was a significant signal reflecting the growing use of mobile as part of the customer search journey.

Google clearly communicated that this update would prioritize mobile-friendly websites on mobile SERPs. Many more mobile updates followed.

The focus: mobile content and users’ mobile site experience.

  • Focus on design factors such as responsive design and mobile page structures.
  • Enhance site navigation, so mobile users can quickly find what they need.
  • Avoid formatting issues on mobile that create a different experience from desktop.
  • Confirm that websites are mobile-optimized.

Just after the mobile update went live, Google quietly issued a Quality update.

Websites that focused on the user experience, with quality content, limited irrelevant user-generated content, and fewer ads, did well. This was another sign that Google was putting the user experience first.

RankBrain (2015)

Like the Hummingbird principles and NLP mentioned earlier, Google RankBrain was less a targeted update and more a change to how the core algorithm processes queries.

It gave us an indication of how vital machine learning was becoming across all forms of marketing and technology.

Utilizing machine learning to learn from and predict user behavior, RankBrain powered search results based on an even better understanding of users’ intent.

The focus: ensuring that content reflects user intent and optimizing for conversational search.

  • Place greater focus and emphasis on creating content that matches the user’s intent.
  • Ensure that all aspects of technical SEO are updated (such as schema markup, for example).
  • Google indicated that RankBrain was the third-most important ranking signal.

Google Mobile-First Indexing (2018)

The Mobile-First Indexing Update meant that Google would use the mobile version of a webpage for indexation and ranking.

Once again, this was aimed at enhancing the user experience and helping users find what they are looking for.

Producing content for mobile and focusing on speed and performance became paramount to success.

The focus: re-affirming the importance of mobile optimization, content, speed, and mobile site performance.

  • Improve AMP and mobile page speed and performance.
  • Ensure that URL structures for mobile and desktop sites meet Google requirements.
  • Add structured data for both desktop and mobile versions.
  • Make sure the mobile site contains the same content as the desktop site.

Google later announced March 2021 as the date by which mobile-first indexing would apply to all websites.

Shortly afterward, Google made mobile page speed a ranking factor so website owners would focus on load times and page speed to enhance the user experience.

Broad Core Algorithm Updates (2018)

2018 was a year in which Google released several core algorithm updates, covering areas such as social signals and the so-called Medic update.

After the August update, in particular, Google’s John Mueller suggested making content more relevant.

While there was some confusion about ranking factors and how to fix specific issues, it did bring the concept of E-A-T and user-focused content top of mind for many SEO professionals and content marketers.

On the topic of rater guidelines being key to the broad update, Google’s Danny Sullivan suggested:

“Want to do better with a broad change? Have great content. Yeah, the same boring answer. But if you want a better idea of what we consider great content, read our raters guidelines. That’s like almost 200 pages of things to consider.”

BERT (2019)

Following RankBrain, this neural network-based method for natural language processing allowed Google to understand conversational queries better.

BERT allows users to find valuable and accurate information more easily.

According to Google, this represented the most significant leap forward in the past five years and one of the greatest in search history.

The focus: improving the understanding of consumer intent in conversational search queries.

  • Increase the depth and specifics of the content.
  • Work more with long-tail queries and phrases using more than three words.
  • Ensure that content addresses the users’ questions or queries and is optimized correctly.
  • Focus on writing for humans clearly and concisely so that it is easy to understand.

Read more on BERT and SMITH here.

COVID-19 Pandemic (March 2020)

The global pandemic meant that consumer behavior and search patterns changed forever as Google continued to focus on E-A-T signals.

Google began to emphasize YMYL signals as the internet struggled to cope with misinformation and SEO pros struggled to keep up with the rapid shifts and dips in consumer behavior.

From setting up 24-hour incident response teams with the World Health Organization and policing content, to helping people find helpful information and avoid misinformation, the user’s needs had never been more important.

The demand for SEO rose to an all-time high, and Google released a COVID-19 playbook.

Google Page Experience Update And Core Web Vitals Announced (May 2020)

The Page Experience update focuses on a site’s technical health and introduces metrics that measure the user experience of a page: how quickly page content loads, how quickly a browser loading a webpage can respond to a user’s input, and how unstable the content is as it loads in the browser.

The focus: integrating new Core Web Vitals metrics to measure and improve on-page experiences.

  • Mobile-friendliness, safe browsing, HTTPS, and intrusive interstitials – The Google Page Experience Signal.
  • LCP (Largest Contentful Paint): Improve page load times for large images and video backgrounds.
  • FID (First Input Delay): Ensure the browser can respond quickly to a user’s first interaction with a page.
  • CLS (Cumulative Layout Shift): Include the size attributes on your images and video elements or reserve the space with CSS aspect ratio boxes, and ensure content is never inserted above existing content, except in response to user interaction (see the sketch after this list).
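To illustrate the CLS point above, here is a minimal sketch (assuming the requests and beautifulsoup4 packages, and an example URL) that flags <img> tags missing explicit width and height attributes, a common cause of layout shift. It does not measure CLS itself, which requires a real browser or field data.

```python
# Minimal sketch: flag <img> tags without explicit width/height attributes,
# a common cause of Cumulative Layout Shift (CLS).
# Assumes `requests` and `beautifulsoup4` are installed; the URL is an example.
import requests
from bs4 import BeautifulSoup

def find_unsized_images(url: str) -> list[str]:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [
        img.get("src", "(no src)")
        for img in soup.find_all("img")
        if not (img.get("width") and img.get("height"))
    ]

for src in find_unsized_images("https://example.com/"):
    print("Missing width/height:", src)
```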

Broad Core Algorithm Updates (2020)

The third Google core algorithm update of the year rolled out in December 2020. This came in the form of slight changes that affect the order and weight of certain (not always disclosed) ranking signals.

According to SEJ Contributor Ryan Jones:

“Google aims to serve content that provides the best and most complete answers to searchers’ queries. Relevance is the one ranking factor that will always win out over all others.”

Read more on December’s Core Update here.

Passage Ranking (February 2021)

Google officially rolled out its passage-based indexing, designed to help users find answers to specific questions.

You’ve probably seen this in the wild, but essentially this allows Google to highlight pertinent elements of a passage within a piece of content that fits the question.

This means long-form content that may not be skimmable but provides valuable answers could be surfaced as a result.

Ultimately, this makes it easier for Google to connect users to content without making them hunt for the specific answer to their questions when they click on a page.

Screenshot from blog.google, July 2022

The key to success with passage ranking goes back to focusing on creating great content for the user.

Read more on the 16 Key Points You Should Know here.

Product Reviews Update (April 2021)

This new product review update was designed to improve a user’s experience when searching for product reviews.

Marketers were advised to focus on avoiding creating thin content as this update will reward content that users find most helpful.

The focus: rewarding creators who provide users with authentic and detailed review content.

Google shared nine helpful questions to consider when creating and publishing product reviews.

  • Show expert knowledge about products.
  • Differentiate your product compared to competitors.
  • Highlight benefits and any drawbacks clearly and concisely.
  • Show how the product has evolved to fit the needs of the user.

Read more here.

MUM (May 2021)

Following RankBrain and BERT, MUM (Multitask Unified Model) technology utilizes AI and NLP to improve information retrieval.

For the end user, this technological advancement helps provide better information and results as it processes multiple media formats such as video, images, and audio.

Pandu Nayak, Google fellow and vice president of Search, said:

“But with a new technology called Multitask Unified Model, or MUM, we’re getting closer to helping you with these types of complex needs. So in the future, you’ll need fewer searches to get things done.”

Read more here.

Page Experience Update And Core Web Vitals (CWV) Rollout (June 2021)

The much-anticipated Page Experience Update, including Core Web Vitals, rolled out, with further updates to desktop following in March 2022.

Nine months after the rollout of Google’s Core Web Vitals and over a year since BrightEdge launched pre-rollout predictive research, new research showed how many industries are adapting and improving their Core Web Vitals.

The focus: improving page experiences for users with speed and precision.

 

Image source: BrightEdge, July 2022
  • Retail giants have made significant strides in improving experiences.
  • In categories like retail, CWV metrics such as First Input Delay have been cut in half.
  • Although Finance was the best prepared last year, it made the least performance gains in the categories evaluated.

Spam Update (June 2021) And Link Spam Algorithm Update (July 2021)

Image source: BrightEdge, July 2022

Ensuring users get the right results based on their searches is foundational to a good experience.

In addition, updates and algorithm changes help protect users’ privacy to keep searches safe and secure.

The focus: keeping user experiences safe.

Learn more in this video from Google here.

Local Search Update (November 2021)

Google has always provided updates for local search users, fine-tuning its algorithm to deliver better local results.

Local search is a huge channel that should not be underestimated, but it deserves a post of its own.

This also includes guidance on how businesses can improve their local ranking for improved customer experiences.

Read more here.

Product Algorithm Update (March 2022)

On March 23, 2022, Google provided updated guidance based on how product reviews had been performing over the previous year.

It also informed the community of further rollouts that will help surface accurate and relevant information to support purchasing decisions.

The focus: user experience and surfacing results that make purchasing decisions easier.

Google Algorithms & Updates Focused On User Experience: A TimelineScreenshot from Google Search Central blog, July 2022
  • As always, showcase your expertise and ensure the content is authentic.
  • Share why you recommend products with evidence to support it.

Read more advice here and here.

Conclusion

A successful user experience requires a combination of content and technical expertise. Updates and guidance help marketers create content for the user.

In addition, algorithms and technological advancements help Google surface better results and showcase accurate, relevant, and trustworthy content.

Google will continue to focus on improving experiences for its users.

As a marketer, optimizing for both is vital: ensure your website performs well (navigation, speed, and reliability) and keep focusing on content.

Many of Google’s updates signal that technical SEO, data science, and content marketing excellence are coming together.

Stay up to date and read through all of Google’s Updates here on SEJ.



Featured Image: Gorodenkoff/Shutterstock





9 Common Technical SEO Issues That Actually Matter


In this article, we’ll see how to find and fix technical SEO issues, but only those that can seriously affect your rankings.

If you’d like to follow along, get Ahrefs Webmaster Tools and Google Search Console (both are free) and check for the following issues.

1. Indexability issues

Indexability is a webpage’s ability to be indexed by search engines. Pages that are not indexable can’t be displayed on the search engine results pages and can’t bring in any search traffic.

Three requirements must be met for a page to be indexable:

  1. The page must be crawlable. If you haven’t blocked Googlebot from crawling the page in robots.txt, or if your website has fewer than 1,000 pages, you probably don’t have an issue here.
  2. The page must not have a noindex tag (more on that in a bit).
  3. The page must be canonical (i.e., the main version). 

Solution

In Ahrefs Webmaster Tools (AWT):  

  1. Open Site Audit
  2. Go to the Indexability report 
  3. Click on issues related to canonicalization and “noindex” to see affected pages
Indexability issues in Site Audit

For canonicalization issues in this report, you will need to replace bad URLs in the link rel="canonical" tag with valid ones (i.e., returning an “HTTP 200 OK”). 

As for pages marked by “noindex” issues, these are the pages with the “noindex” meta tag placed inside their code. Chances are most of the pages found in the report there should stay as is. But if you see any pages that shouldn’t be there, simply remove the tag. Do make sure those pages aren’t blocked by robots.txt first. 
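If you want to spot-check individual URLs outside of Site Audit, a rough sketch like the one below can report whether a page carries a noindex robots meta tag and whether its rel="canonical" target returns HTTP 200. It assumes the requests and beautifulsoup4 packages and an example URL; it is illustrative only and does not replicate the tool’s checks.

```python
# Rough spot-check of indexability signals for a single URL.
# Assumes `requests` and `beautifulsoup4` are installed; the URL is an example.
import requests
from bs4 import BeautifulSoup

def check_indexability(url: str) -> None:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    # 1. Look for a noindex robots meta tag.
    robots = soup.find("meta", attrs={"name": "robots"})
    noindex = bool(robots and "noindex" in robots.get("content", "").lower())

    # 2. Find the rel="canonical" target and check its status code.
    canonical = soup.find("link", rel="canonical")
    canonical_href = canonical.get("href") if canonical else None
    canonical_status = (
        requests.head(canonical_href, allow_redirects=True, timeout=10).status_code
        if canonical_href
        else None
    )

    print(f"{url}\n  noindex: {noindex}\n  canonical: {canonical_href} -> {canonical_status}")

check_indexability("https://example.com/some-page/")
```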

Recommendation

Click on the question mark on the right to see instructions on how to fix each issue. For more detailed instructions, click on the “Learn more” link. 

Instruction on how to fix an SEO issue in Site Audit

2. Sitemap issues

A sitemap should contain only pages that you want search engines to index.

When a sitemap isn’t regularly updated or an unreliable generator has been used to make it, a sitemap may start to show broken pages, pages that became “noindexed,” pages that were de-canonicalized, or pages blocked in robots.txt. 

Solution 

In AWT:

  1. Open Site Audit 
  2. Go to the All issues report
  3. Click on issues containing the word “sitemap” to find affected pages 
Sitemap issues shown in Site Audit

Depending on the issue, you will have to:

  • Delete the pages from the sitemap.
  • Remove the noindex tag on the pages (if you want to keep them in the sitemap). 
  • Provide a valid URL for the reported page. 
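To spot-check a sitemap yourself between crawls, a small script can flag entries that break, redirect, or are marked noindex via the X-Robots-Tag header. A minimal sketch, assuming the requests package, a plain (non-index) XML sitemap, and an example URL:

```python
# Minimal sitemap spot-check: flag URLs that break, redirect, or are noindexed.
# Assumes `requests` is installed and the sitemap is a plain (non-index) XML sitemap.
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def audit_sitemap(sitemap_url: str) -> None:
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    for loc in root.findall(".//sm:loc", NS):
        url = (loc.text or "").strip()
        if not url:
            continue
        resp = requests.get(url, allow_redirects=True, timeout=10)
        problems = []
        if resp.status_code != 200:
            problems.append(f"status {resp.status_code}")
        if resp.url.rstrip("/") != url.rstrip("/"):
            problems.append(f"redirects to {resp.url}")
        if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
            problems.append("noindex via X-Robots-Tag")
        if problems:
            print(url, "->", "; ".join(problems))

audit_sitemap("https://example.com/sitemap.xml")
```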

3. HTTPS issues

Google uses HTTPS encryption as a small ranking signal. This means you can experience lower rankings if you don’t have an SSL or TLS certificate securing your website.

But even if you do, some pages and/or resources on your pages may still use the HTTP protocol. 

Solution 

Assuming you already have an SSL/TLS certificate for all subdomains (if not, do get one), open AWT and do these: 

  1. Open Site Audit
  2. Go to the Internal pages report 
  3. Look at the protocol distribution graph and click on HTTP to see affected pages
  4. Inside the report showing pages, add a column for Final redirect URL 
  5. Make sure all HTTP pages are permanently redirected (301 or 308 redirects) to their HTTPS counterparts 
Protocol distribution graph
Internal pages issues report with added column

Finally, let’s check if any resources on the site still use HTTP: 

  1. Inside the Internal pages report, click on Issues
  2. Click on HTTPS/HTTP mixed content to view affected resources 
Site Audit reporting six HTTPS/HTTP mixed content issues

You can fix this issue by one of these methods:

  • Link to the HTTPS version of the resource (check this option first) 
  • Include the resource from a different host, if available 
  • Download and host the content on your site directly if you are legally allowed to do so
  • Exclude the resource from your site altogether
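You can also sanity-check both points with a short script. The sketch below, assuming the requests and beautifulsoup4 packages and an example URL, verifies that an HTTP URL permanently redirects to HTTPS and lists http:// resources referenced by the final page; it covers only a few common tags and is not exhaustive.

```python
# Sketch: check that an HTTP URL permanently redirects to HTTPS and look for
# http:// resources on the final page (a rough mixed-content check).
# Assumes `requests` and `beautifulsoup4`; the URL is an example.
import requests
from bs4 import BeautifulSoup

def check_https(http_url: str) -> None:
    resp = requests.get(http_url, allow_redirects=True, timeout=10)
    if resp.history:
        print("First redirect status:", resp.history[0].status_code)  # expect 301 or 308
    else:
        print("No redirect issued for", http_url)
    print("Final URL:", resp.url)

    soup = BeautifulSoup(resp.text, "html.parser")
    for tag, attr in (("img", "src"), ("script", "src"), ("link", "href"), ("iframe", "src")):
        for node in soup.find_all(tag):
            ref = node.get(attr, "")
            if ref.startswith("http://"):
                print(f"Mixed content: <{tag}> loads {ref}")

check_https("http://example.com/")
```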

Learn more: What Is HTTPS? Everything You Need to Know 

4. Duplicate content issues

Duplicate content happens when exact or near-duplicate content appears on the web in more than one place.

It’s bad for SEO mainly for two reasons: It can cause undesirable URLs to show in search results and can dilute link equity.

Content duplication is not necessarily a case of intentional or unintentional creation of similar pages. There are other, less obvious causes such as faceted navigation, tracking parameters in URLs, or using trailing and non-trailing slashes.

Solution 

First, check if your website is available under only one URL, because if your site is accessible as:

  • http://domain.com
  • http://www.domain.com
  • https://domain.com
  • https://www.domain.com

Then Google will see all of those URLs as different websites. 

The easiest way to check if users can browse only one version of your website: type in all four variations in the browser, one by one, hit enter, and see if they get redirected to the master version (ideally, the one with HTTPS). 
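If you’d rather script that check than type the variants by hand, a minimal sketch (assuming the requests package; swap in your own domain) requests all four versions and prints where each one ends up:

```python
# Sketch: confirm all four host variants resolve to a single canonical version.
# Assumes `requests` is installed; replace the domain with your own.
import requests

DOMAIN = "example.com"
variants = [
    f"http://{DOMAIN}/",
    f"http://www.{DOMAIN}/",
    f"https://{DOMAIN}/",
    f"https://www.{DOMAIN}/",
]

for url in variants:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    print(f"{url} -> {resp.url} (status {resp.status_code})")
```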

You can also go straight into Site Audit’s Duplicates report. If you see 100% bad duplicates, that is likely the reason.

Duplicates report showing 100% bad duplicates
Simulation (other types of duplicates turned off).

In this case, choose one version that will serve as canonical (likely the one with HTTPS) and permanently redirect other versions to it. 

Then run a New crawl in Site Audit to see if there are any other bad duplicates left. 

Running a new crawl in Site Audit

There are a few ways you can handle bad duplicates depending on the case. Learn how to solve them in our guide

Learn more: Duplicate Content: Why It Happens and How to Fix It 

5. Broken pages

Pages that can’t be found (4XX errors) and pages returning server errors (5XX errors) won’t be indexed by Google, so they won’t bring you any traffic.

Furthermore, if broken pages have backlinks pointing to them, all of that link equity goes to waste. 

Broken pages are also a waste of crawl budget—something to watch out for on bigger websites. 

Solution

In AWT, you should: 

  1. Open Site Audit.
  2. Go to the Internal pages report.
  3. See if there are any broken pages. If so, the Broken section will show a number higher than 0. Click on the number to show affected pages.
Broken pages report in Site Audit

In the report showing pages with issues, it’s a good idea to add a column for the number of referring domains. This will help you make the decision on how to fix the issue. 

Internal pages report with no. of referring domains column added

Now, fixing broken pages (4XX error codes) is quite simple, but there is more than one possibility. Here’s a short flowchart explaining the process:

How to deal with broken pages

Dealing with server errors (the ones returning a 5XX) can be tougher, as there are different possible reasons for a server to be unresponsive. Read this short guide for troubleshooting.
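If you keep a list of internal URLs (for example, exported from your CMS or sitemap), a short script can also surface 4XX and 5XX responses between full crawls. A minimal sketch, assuming the requests package and a urls.txt file with one URL per line:

```python
# Minimal sketch: report internal URLs returning 4XX or 5XX status codes.
# Assumes `requests` is installed and `urls.txt` holds one URL per line.
import requests

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        status = requests.get(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        print(f"{url} -> request failed ({exc})")
        continue
    if status >= 400:
        print(f"{url} -> {status}")
```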

Recommendation

With AWT, you can also see 404s that were caused by incorrect links to your website. While this is not a technical issue per se, reclaiming those links may give you an additional SEO boost.

  1. Go to Site Explorer
  2. Enter your domain 
  3. Go to the Best by links report
  4. Add a “404 not found” filter
  5. Then sort the report by referring domains from high to low
How to find broken backlinks in Site Explorer
In this example, someone linked to us, leaving a comma inside the URL.

6. Link issues

If you’ve already dealt with broken pages, chances are you’ve fixed most of the broken links issues.

Other critical issues related to links are: 

  • Orphan pages – These are pages without any internal links. Web crawlers have limited ability to access them (only from the sitemap or backlinks), and there is no link equity flowing to them from other pages on your site. Last but not least, users won’t be able to access these pages from the site navigation. 
  • HTTPS pages linking to internal HTTP pages – If an internal link on your website brings users to an HTTP URL, web browsers will likely show a warning about a non-secure page. This can damage your overall website authority and user experience.

Solution

In AWT, you can:

  1. Go to Site Audit.
  2. Open the Links report.
  3. Open the Issues tab. 
  4. Look for the following issues in the Indexable category. Click to see affected pages. 
Important SEO issues related to links

Fix HTTPS pages linking to internal HTTP pages by changing those links to HTTPS, or simply delete them if they are no longer needed.

Orphan pages need to be either linked to from some other page on your website or deleted if they hold no value to you.

Sidenote.

Ahrefs’ Site Audit can find orphan pages as long as they have backlinks or are included in the sitemap. For a more thorough search for this issue, you will need to analyze server logs to find orphan pages with hits. Find out how in this guide.
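For a rough orphan-page check without log files, you can also compare the URLs in your sitemap against the URLs actually reachable by following internal links from the homepage. The sketch below is a capped, same-host crawl assuming the requests and beautifulsoup4 packages and example URLs; it is an approximation, not a substitute for log analysis.

```python
# Rough orphan-page check: sitemap URLs never reached by following internal links.
# Assumes `requests` and `beautifulsoup4`; URLs are examples and the crawl is capped.
import xml.etree.ElementTree as ET
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

SITE = "https://example.com/"
SITEMAP = "https://example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
MAX_PAGES = 500

def normalize(url: str) -> str:
    return url.split("#")[0].rstrip("/")

# Collect the URLs listed in the sitemap.
root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
sitemap_urls = {normalize(loc.text) for loc in root.findall(".//sm:loc", NS) if loc.text}

# Breadth-first crawl of internal links, starting at the homepage.
host = urlparse(SITE).netloc
seen = {normalize(SITE)}
queue = deque([SITE])
while queue and len(seen) < MAX_PAGES:
    page = queue.popleft()
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = normalize(urljoin(page, a["href"]))
        if urlparse(link).netloc == host and link not in seen:
            seen.add(link)
            queue.append(link)

for url in sorted(sitemap_urls - seen):
    print("Possible orphan:", url)
```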

7. Mobile experience issues

Having a mobile-friendly website is a must for SEO. Two reasons: 

  1. Google uses mobile-first indexing – It’s mostly using the content of mobile pages for indexing and ranking.
  2. Mobile experience is part of the Page Experience signals – While Google will allegedly always “promote” the page with the best content, page experience can be a tiebreaker for pages offering content of similar quality. 

Solution

In GSC: 

  1. Go to the Mobile Usability report in the Experience section
  2. View affected pages by clicking on issues in the Why pages aren’t usable on mobile section 
Mobile Usability report in Google Search Console

You can read Google’s guide for fixing mobile issues here.  

8. Performance and stability issues 

Performance and visual stability are other aspects of Page Experience signals used by Google to rank pages. 

Google has developed a special set of metrics to measure user experience called Core Web Vitals (CWV). Site owners and SEOs can use those metrics to see how Google perceives their website in terms of UX. 

Google's search signals for page experience

While page experience can be a ranking tiebreaker, CWV is not a race. You don’t need to have the fastest website on the internet. You just need to score “good” ideally in all three categories: loading, interactivity, and visual stability. 

Three categories of Core Web Vitals

Solution 

In GSC: 

  1. First, click on Core Web Vitals in the Experience section of the reports.
  2. Then click Open report in each section to see how your website scores. 
  3. For pages that aren’t considered good, you’ll see a special section at the bottom of the report. Use it to see pages that need your attention.
How to find Core Web Vitals in Google Search Console
CWV issue report in Google Search Console

Optimizing for CWV may take some time. This may include things like moving to a faster (or closer) server, compressing images, optimizing CSS, etc. We explain how to do this in the third part of this guide to CWV. 
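If you want to pull the same field data programmatically, Google’s PageSpeed Insights API exposes CrUX metrics per URL. The sketch below assumes the requests package and an example URL; the endpoint and metric keys reflect the v5 API as I understand it and should be verified against the current documentation.

```python
# Sketch: fetch Core Web Vitals field data for a URL via the PageSpeed Insights v5 API.
# Assumes `requests`; verify the endpoint and metric keys against the current API docs.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def core_web_vitals(url: str, strategy: str = "mobile") -> None:
    data = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60).json()
    metrics = data.get("loadingExperience", {}).get("metrics", {})
    for key in (
        "LARGEST_CONTENTFUL_PAINT_MS",
        "FIRST_INPUT_DELAY_MS",
        "CUMULATIVE_LAYOUT_SHIFT_SCORE",
    ):
        m = metrics.get(key, {})
        print(f"{key}: percentile={m.get('percentile')} category={m.get('category')}")

core_web_vitals("https://example.com/")
```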

9. Website structure issues

Bad website structure in the context of technical SEO is mainly about having important organic pages too deep in the website structure.

Pages that are nested too deep (i.e., users need more than six clicks from the homepage to get to them) will receive less link equity from your homepage (likely the page with the most backlinks), which may affect their rankings. This is because link value diminishes with every link “hop.” 

Sidenote.

Website structure is important for other reasons too such as the overall user experience, crawl efficiency, and helping Google understand the context of your pages. Here, we’ll only focus on the technical aspect, but you can read more about the topic in our full guide: Website Structure: How to Build Your SEO Foundation.

Solution 

In AWT

  1. Open Site Audit
  2. Go to Structure explorer, switch to the Depth tab, and set the data type to Data table
  3. Configure the Segment to only valid HTML pages and click Apply
  4. Use the graph to investigate pages that are more than six clicks away from the homepage 
How to find site structure issues in Site Audit
Adding a new segment in Site Audit

The way to fix the issue is to link to these deeper nested pages from pages closer to the homepage. More important pages could find their place in site navigation, while less important ones can simply be linked from pages a few clicks closer.

It’s a good idea to weigh user experience and the business role of your website when deciding what goes into sitewide navigation. 

For example, we could probably give our SEO glossary a slightly higher chance to get ahead of organic competitors by including it in the main site navigation. Yet we decided not to because it isn’t such an important page for users who are not particularly searching for this type of information. 

We’ve moved the glossary only up a notch by including a link inside the beginner’s guide to SEO (which itself is just one click away from the homepage). 

Structure explorer showing glossary page is two clicks away from the homepage
One page from the glossary folder is two clicks away from the homepage.
Link that moved SEO glossary a click closer to the homepage
Just one link, even at the bottom of a page, can move a page higher in the overall structure.
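For a rough, do-it-yourself view of click depth, the same crawling approach used for the orphan-page check earlier can record how many clicks each URL is from the homepage. A minimal, capped sketch assuming the requests and beautifulsoup4 packages and an example homepage:

```python
# Sketch: approximate the click depth of internal URLs with a breadth-first crawl
# from the homepage, then flag anything more than six clicks away.
# Assumes `requests` and `beautifulsoup4`; the homepage URL is an example.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

HOME = "https://example.com/"
MAX_PAGES = 500

host = urlparse(HOME).netloc
depth = {HOME: 0}
queue = deque([HOME])

while queue and len(depth) < MAX_PAGES:
    page = queue.popleft()
    try:
        html = requests.get(page, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(page, a["href"]).split("#")[0]
        if urlparse(link).netloc == host and link not in depth:
            depth[link] = depth[page] + 1
            queue.append(link)

for url, clicks in sorted(depth.items(), key=lambda kv: kv[1], reverse=True):
    if clicks > 6:
        print(clicks, url)
```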

Final thoughts 

When you’re done fixing the more pressing issues, dig a little deeper to keep your site in perfect SEO health. Open Site Audit and go to the All issues report to see other issues regarding on-page SEO, image optimization, redirects, localization, and more. In each case, you will find instructions on how to deal with the issue. 

All issues report in Site Audit

You can also customize this report by turning issues on/off or changing their priority. 

Issue report in Site Audit is customizable

Did I miss any important technical issues? Let me know on Twitter or Mastodon.





New Google Ads Feature: Account-Level Negative Keywords


Google Ads Liaison Ginny Marvin has announced that account-level negative keywords are now available to Google Ads advertisers worldwide.

The feature, which was first announced last year and has been in testing for several months, allows advertisers to add keywords to exclude traffic from all search and shopping campaigns, as well as the search and shopping portion of Performance Max, for greater brand safety and suitability.

Advertisers can access this feature from the account settings page to ensure their campaigns align with their brand values and target audience.

This is especially important for brands that want to avoid appearing in contexts that may be inappropriate or damaging to their reputation.

In addition to the brand safety benefits, the addition of account-level negative keywords makes the campaign management process more efficient for advertisers.

Instead of adding negative keywords to individual campaigns, advertisers can manage them at the account level, saving time and reducing the chances of human error.

You no longer have to worry about duplicating negative keywords across multiple campaigns or missing any that are vital to your brand safety.

Additionally, account-level negative keywords can improve the accuracy of ad targeting by excluding irrelevant or low-performing keywords that may adversely impact campaign performance. This can result in higher-quality traffic and a better return on investment.

Google Ads offers a range of existing brand suitability controls, including inventory types, digital content labels, placement exclusions, and negative keywords at the campaign level.

Marvin added that Google Ads is expanding account-level negative keywords to address various use cases and will have more to share soon.

This rollout is essential in giving brands more control over their advertising and ensuring their campaigns target the appropriate audience.


Featured Image: Primakov/Shutterstock





Google’s Gary Illyes Answers Your SEO Questions On LinkedIn


Google Analyst Gary Illyes offers guidance on large robots.txt files, the SEO impact of website redesigns, and the correct use of rel-canonical tags.

Illyes is taking questions sent to him via LinkedIn direct message and answering them publicly, offering valuable insights for those in the SEO community.

It’s already newsworthy for a Google employee to share SEO advice. This is especially so given it’s Illyes, who isn’t as active on social media as colleagues like Search Advocate John Mueller and Developer Advocate Martin Splitt.

Throughout the past week, Illyes has shared advice and offered guidance on the following subjects:

  • Large robots.txt files
  • The SEO impact of website redesigns
  • The correct use of rel-canonical tags

Considering the engagement his posts are getting, there’s likely more to come. Here’s a summary of what you missed if you’re not following him on LinkedIn.

Keep Robots.txt Files Under 500KB

Regarding a previously published poll on the size of robots.txt files, Illyes shares a PSA for those with a file size larger than 500kb.

Screenshot from: linkedin.com/in/garyillyes/, January 2023.

Illyes advises paying attention to the size of your website’s robots.txt file, especially if it’s larger than 500kb.

Google’s crawlers only process the first 500kb of the file, so it’s crucial to ensure that the most important information appears first.

Doing this can help ensure that your website is properly crawled and indexed by Google.
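A trivial way to keep an eye on this is to check the file’s size and preview anything that falls beyond the limit. A minimal sketch, assuming the requests package and an example domain:

```python
# Sketch: check robots.txt size against the ~500KB limit Google processes.
# Assumes `requests` is installed; the URL is an example.
import requests

LIMIT = 500 * 1024  # bytes

body = requests.get("https://example.com/robots.txt", timeout=10).content
print(f"robots.txt is {len(body):,} bytes")
if len(body) > LIMIT:
    overflow = body[LIMIT:].decode("utf-8", errors="replace")
    print("Rules beyond the limit (Google may ignore these):")
    print(overflow[:500])  # preview the start of the overflow
```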

Website Redesigns May Cause Rankings To Go “Nuts”

When you redesign a website, it’s important to remember that its rankings in search engines may be affected.

As Illyes explains, this is because search engines use the HTML of your pages to understand and categorize the content on your site.

If you make changes to the HTML structure, such as breaking up paragraphs, using CSS styling instead of H tags, or adding unnecessary break tags, it can cause HTML parsers to produce different results.

This can significantly impact your site’s rankings in search engines. Or, as Illyes phrases it, it can cause rankings to go “nuts”:

Screenshot from: linkedin.com/in/garyillyes/, January 2023.

Illyes advises using semantically similar HTML when redesigning the site and avoiding adding tags that aren’t necessary to minimize the SEO impact.

This will allow HTML parsers to better understand the content on your site, which can help maintain search rankings.

Don’t Use Relative Paths In Your Rel-Canonical

Don’t take shortcuts when implementing rel-canonical tags. Illyes strongly advises spelling out the entire URL path:

Screenshot from: linkedin.com/in/garyillyes/, January 2023.

Saving a few bytes using a relative path in the rel-canonical tag isn’t worth the potential issues it could cause.

Using a relative path may result in search engines resolving it to a different URL than intended, which can cause confusion.

Spelling out the full URL path eliminates potential ambiguity and ensures that search engines identify the correct URL as the preferred version.
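A quick way to audit this across a set of pages is to flag rel-canonical values that are not absolute URLs. A minimal sketch, assuming the requests and beautifulsoup4 packages and a urls.txt file with one URL per line:

```python
# Sketch: flag rel="canonical" tags whose href is not an absolute URL.
# Assumes `requests` and `beautifulsoup4`; `urls.txt` holds one URL per line.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    if not canonical or not canonical.get("href"):
        print(f"{url}: no rel-canonical found")
        continue
    href = canonical["href"]
    if not urlparse(href).scheme:  # no scheme means a relative path
        print(f"{url}: relative canonical -> {href}")
```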

In Summary

By answering questions sent to him via direct message and offering his expertise, Illyes is giving back to the community and providing valuable insights on various SEO-related topics.

This is a testament to Illyes’ dedication to helping people understand how Google works. Send him a DM, and your question may be answered in a future LinkedIn post.


Source: LinkedIn

Featured Image: SNEHIT PHOTO/Shutterstock


