Mastering Website Navigation: The Ultimate Guide


Great navigation is a crucial element to get right on your website for two main reasons.

First, it affects your user experience—helping users find what they want will result in more conversions. 

Second, it helps search engines: good navigation helps them better understand how you’ve organized your site and ensures PageRank flows to the most beneficial pages.

But how can you improve navigation to help users find what they’re looking for and benefit your SEO?

This article will explain precisely that.

What is website navigation?

Website navigation is the process of a user interacting with a website by selecting links, buttons, and other elements to discover the content on a site.

While website navigation describes users clicking on links, there are standard components that facilitate that navigation. Some of these include: 

  • Breadcrumbs
  • Sidebars
  • Mega menus
  • Dropdowns
  • Tabs
  • Accordions 

I’ll explain each of these, and how you can use them, later in the guide.

How people navigate websites

Before we dive into the navigation components and how they can help users navigate websites, it’s important to understand how users actually navigate sites.

There are three main ways, starting with forward navigation.

Forward navigation

This type of navigation helps users explore deeper within a site hierarchy, often to find more specific content within the same topic area.

Here’s an example of typical forward navigation using Stripe’s documentation area.

Stripe's documentation

First, a user may start on the main documentation page. They then scroll and select the payments documentation link.

Stripe's documentation payment link

On the next page, they select to view all “After the payment” documentation.

Stripe's "after payment" page

Then they navigate to “Receive payouts.”

Stripe's "receive payouts" link

And finally, to “Instant Payouts.”

Stripe's "instant payouts"

In this scenario, the user has navigated deeper within the site hierarchy, going from a broad topic like documentation to more specific subtopics like payments and payouts.

Forward navigation illustration

This is forward navigation.

Sideward navigation

Second, there is sideward navigation. This helps users explore content that’s related to what they’re currently viewing but isn’t a subtopic of it.

For example, if you’re on Stripe’s “Instant Payouts” page, the sidebar links you to other payout-related content, such as “Alternative currencies.” 

Stripe's "after payment" page, highlighting the "alternative currencies" link

Backward navigation

Finally, there is backward navigation. This navigation helps users return to a previous page or step within a process.

The most common way to facilitate this is with breadcrumbs, which you can see within the Stripe documentation.

Backward navigation on Stripe

Common navigation components

When you build your website, you’ll think of it in terms of navigation components. Here are some examples of those.

Navigation menus

Starting with a simple navigation component—menus.

You’ll see these in multiple website areas, often horizontally in a header (sometimes called a navigation bar).

Header navigation menu on AO.com
Header navigation on AO.com.

Vertically in the footer:

Navigation menu on ASOS
A navigation menu on ASOS, with a header to group the links in the menu.

Within sidebars:

Sidebar navigation

Sometimes, their appearance can change. But the concept is the same—a list of links, vertically or horizontally, often organized by topic.

Dropdown menus

The key element of a dropdown menu is that it’s hidden, and then it displays when users hover or click on another element, like a navigation bar in the header.

Dropdown navigation is often straightforward, with a single vertical list of links, like this example on FreeAgent.

Dropdown menu on FreeAgent

Mega menus

A mega menu is a more visually complex type of dropdown. Both display in the same way (via hover or click)—the main difference is the amount of content and number of links they contain, as in this example from ASOS.

Mega menu on ASOS

Mega menus are better than simple dropdowns, as the Nielsen Norman Group confirmed.

Quote on mega menus from the Nielsen Norman Group

Why mega menus work better comes down to a few key reasons:

  • You can include more links within a mega menu.
  • You can group related links more easily.
  • You can include imagery and illustrations.

Burger menus

Burger menus (or mobile menus) are similar to dropdown menus, but sites hide them behind an icon that looks like a burger (hence the name).

This type of navigation is often seen on mobile devices, like in this example on my website.

Burger menu–style navigation

But they’re also becoming more commonly seen on desktop sites, like this example on Amazon.

Burger menu–style navigation on Amazon

Hidden menus like this should not be used as the primary way you expect users to navigate. Hidden navigation is less discoverable, so make it visible by default if it’s important. 

Tabbed navigation

Tabbed navigation swaps out the visible content when users select a tab to view a subset of information.

Here’s how I’ve implemented that on my SEOToolbelt resource.

Tabbed navigation

You can also see tabs implemented on the Ahrefs Blog’s homepage.

Tabbed navigation on Ahrefs Blog

Sometimes, sites have tabbed navigation within mega menus, and content is shown on hover—like in this example on Virgin Experience Days.

Tabbed navigation within mega menu

Accordion navigation

Accordions are similar to tabs, but there are two main differences:

  • An accordion can be fully collapsed, with no content displayed.
  • Multiple panels can optionally be open at once.

I made an example of an accordion where only one panel can be open at a time.

Accordion tab where only one dropdown menu can be opened at a time

And here it is if you allow multiple accordions to open at once.

Accordion tab where all tabs on the dropdown menu can be opened at a time

Tabs and accordions can present a challenge for JavaScript SEO. If you want the content to be ranked, the HTML source should include it rather than injecting it via JavaScript on click.
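For example, a minimal sketch of an SEO-friendly accordion looks like this: the panel content already exists in the HTML source, and the script only toggles its visibility. The class names here are illustrative, not from the article.

```typescript
// Minimal sketch: the panel markup already exists in the HTML source, so
// crawlers reading the raw HTML still see the content. The script only
// toggles visibility; nothing is injected on click. Class names are illustrative.
document.querySelectorAll<HTMLElement>(".accordion-item").forEach((item) => {
  const trigger = item.querySelector<HTMLButtonElement>(".accordion-trigger");
  const panel = item.querySelector<HTMLElement>(".accordion-panel");
  if (!trigger || !panel) return;

  trigger.addEventListener("click", () => {
    const isOpen = trigger.getAttribute("aria-expanded") === "true";
    trigger.setAttribute("aria-expanded", String(!isOpen));
    panel.hidden = isOpen; // hide when it was open, show when it was closed
  });
});
```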

Faceted navigation

Faceted navigation (also known as filtered navigation) allows users to select different filters to view subsets of information on the page. Again, I’ve implemented this on my blog using the brilliant Isotope JS package.

You’ll also commonly see filtered navigation on e-commerce stores, like this example on John Lewis.

Filtered navigation on an e-commerce store
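As a rough sketch of the concept (not the Isotope-based implementation mentioned above), client-side faceted filtering can be as simple as showing and hiding items that match the selected filters. The data attributes and facet names below are illustrative assumptions.

```typescript
// Rough sketch of client-side faceted filtering: cards carry data attributes,
// and selecting facets simply shows or hides matching cards. Attribute and
// facet names are illustrative assumptions.
type Facets = { brand?: string; resolution?: string };

function applyFacets(facets: Facets): void {
  document.querySelectorAll<HTMLElement>("[data-product]").forEach((card) => {
    const matchesBrand = !facets.brand || card.dataset.brand === facets.brand;
    const matchesResolution =
      !facets.resolution || card.dataset.resolution === facets.resolution;
    card.hidden = !(matchesBrand && matchesResolution); // hide non-matching cards
  });
}

// Example: show only 4K TVs from one (hypothetical) brand.
applyFacets({ brand: "acme", resolution: "4k" });
```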

Anchor link navigation

Anchor links help users jump to a particular part of a long page, like an article.

The Ahrefs Blog uses this, like in this link building article.

Anchor link navigation on Ahrefs

Hierarchical navigation

Hierarchical navigation helps users move backward or forward across a site hierarchy. Often, you’ll see this in practice by using breadcrumbs, where a user is shown the current page’s parent page. Here’s an example of that on an e-commerce store called Projectorpoint.

Breadcrumbs on Projectorpoint

You also sometimes see hierarchical navigation in the form of sidebar links. For example, on Sephora, the entire site hierarchy is shown in these links (alongside the current page’s sibling pages).

Sidebar links on Sephora

Related navigation

Related navigation helps users move sideways across a site hierarchy to other associated pages (sometimes called sibling pages). There are a few ways sites usually implement related navigation:

  • By pages sharing the same taxonomy – For example, if two products on an e-commerce store were in the 4K TV category, they’d display as related products.
  • By pages having the same parent page – For example, if the 4K TV category and the 1080p TV category both had “TV set” as their parent category, links would be added between the two.
  • By pages having similar content – Sometimes, related navigation is built based on an index of page content, and then links are automatically added between pages depending on the content similarity. 
  • By products being frequently bought together.

You can manually add these types of links. But ideally, you’d automate them to reduce the burden on site admins. 
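As a hedged sketch of what that automation could look like, here’s one way to pick related products by shared taxonomy. The data shape and sample catalog are illustrative assumptions, not taken from any particular platform.

```typescript
// Hedged sketch: rank candidate pages by how many categories they share with
// the current product, then link to the top few. The Product shape is an
// illustrative assumption.
interface Product {
  url: string;
  title: string;
  categories: string[];
}

function relatedByTaxonomy(current: Product, all: Product[], limit = 4): Product[] {
  return all
    .filter((p) => p.url !== current.url)
    .map((p) => ({
      page: p,
      shared: p.categories.filter((c) => current.categories.includes(c)).length,
    }))
    .filter((x) => x.shared > 0)           // keep only pages with overlap
    .sort((a, b) => b.shared - a.shared)   // most categories in common first
    .slice(0, limit)
    .map((x) => x.page);
}

// Example: related links for a 4K TV product page.
const tv: Product = { url: "/tv-a", title: "TV A", categories: ["tv", "4k-tv"] };
const catalog: Product[] = [
  tv,
  { url: "/tv-b", title: "TV B", categories: ["tv", "4k-tv"] },
  { url: "/remote", title: "Remote", categories: ["accessories"] },
];
console.log(relatedByTaxonomy(tv, catalog).map((p) => p.url)); // ["/tv-b"]
```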

Pagination navigation

Sites use pagination to show a subset of content at a time, most often on pages that list links to other pages, like the Ahrefs Blog archive pages.

Pagination navigation on the Ahrefs Blog's archive pages

Or even Google search results.

Numbered pagination on the Google homepage

Nowadays, you don’t just get numbered pagination, as I’ve shown above. You also get “load more” buttons, like the one below.

Load button saying "show me more"

Most studies, such as this one by Smashing Magazine, suggest that users prefer “load more” buttons.

Quote from Smashing Magazine

12 best practices to master website navigation

Now that you know how people navigate sites and the common components that support that navigation, here are 12 best practices for creating website navigation that users and search engines will love.

1. Research the pages you want to create

Excellent navigation starts with great pages to navigate to. So you’ll want to start by planning the pages on your site into a sitemap (not an HTML or XML sitemap, just a list of the pages you want on your site in a spreadsheet).

Ideally, you only want to add pages to your sitemap if a user will be interested in the contents of that page (or you’ll waste your time creating it).

But how do you decide if a user will be interested?

One of the best methods to figure that out is keyword research. That’s a vast topic in itself, so if you don’t know how to do it, head to this keyword research guide. In short, the process goes like this:

  1. Use a keyword research tool, like Keywords Explorer, to find what users search for, or check competitors’ sites using Site Explorer.
  2. Group semantically related keywords together (Ahrefs’ Parent Topic feature can help here).
  3. Create a list of all the parent topics and secondary keywords. Below is an example of that for a gifting client of mine.
Keyword research results table

2. Create a hierarchy

Now that you have your list of pages, group them to create a site hierarchy (also called an information architecture).

First, you should group them into a few broad page types, such as top-level product categories, articles, company information, or whatever suits your site. This shows the types of content you’ll have.

Illustration showing page types coming off the homepage

You’ll then want to create a hierarchy of pages within each page type. Hierarchies get more specific the deeper into them you go. So for a clothing e-commerce store, that could look something like this:

Home > Men’s clothing > Shoes > Boots > Black boots

Home > Blog > Trends > Winter clothing trends 2022

I use a Mac App called MindNode that makes it easy to create organization charts that display an information architecture. Here’s a quick example of a category structure for a clothing brand.

Organization chart showing a clothing brand's information architecture

Card sorting is a valuable technique for understanding how different people organize content. Sometimes, it’s not clear-cut. There can be many ways to organize a site. 

It can also help to look at what competing sites do. By examining breadcrumbs and other navigation elements, you can understand how even large, complex sites have decided to organize information.

Breadcrumb navigation on John Lewis
An example of how John Lewis has organized its content.

One method to understand site structure is to examine URL structure. That’s easy, thanks to the Structure explorer report in Ahrefs’ Site Audit.

Structure Explorer tool, via Ahrefs' Site Audit

This report has a brilliant feature where you can view various types of data according to the structure. For instance, when auditing any website, you can view Ahrefs’ organic traffic estimate by directory.

Ahrefs organic traffic by directory

RECOMMENDATION

Are you analyzing a website without structured URLs? If so, I recently wrote a guide on using breadcrumbs and URL structure to analyze competitors’ site structures quickly. It provides great insights for planning your structure, so it’s worth a read.

3. Build navigation elements around that hierarchy

Now that you have set up your hierarchy, you can develop a clear plan for how breadcrumbs will work. You can also begin using different navigation components to help with forward and sideward navigation.

For example, if the current page has child pages, you can use different navigation components to ensure they are linked.

Google has built the sidebar of its SEO documentation around that hierarchy.

Sidebar hierarchy navigation on Google Search Central

And then, on topic overview pages, like its “crawling” one, it’s created a table that links to all of that page’s subpages.

Table that links to all subpages on Google's SEO documentation

Sites with simpler navigation implement small menus within the main content that link to subpages. On e-commerce stores, you’ll often see this presented as a list of horizontal links, like this example:

Horizontal links on the Reiss website

You can automate these links entirely by querying the parent/child relationships between pages, which I’ve written about recently in my guide for improving e-commerce category pages.
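Here’s a minimal sketch of that idea: given pages that store a reference to their parent, you can generate the subcategory link block automatically. The field names and HTML structure are illustrative assumptions.

```typescript
// Minimal sketch: generate a block of links to the current category's
// subcategories by querying parent/child relationships. Field names and the
// HTML structure are illustrative assumptions.
interface CategoryPage {
  url: string;
  title: string;
  parentUrl: string | null;
}

function subcategoryLinks(current: CategoryPage, pages: CategoryPage[]): string {
  const children = pages.filter((p) => p.parentUrl === current.url);
  if (children.length === 0) return ""; // nothing to render on leaf categories
  const items = children
    .map((c) => `<li><a href="${c.url}">${c.title}</a></li>`)
    .join("");
  return `<ul class="subcategory-links">${items}</ul>`;
}

// Example: a "Necklaces" category linking down to its child categories.
const necklaces: CategoryPage = { url: "/necklaces", title: "Necklaces", parentUrl: "/" };
const pages: CategoryPage[] = [
  necklaces,
  { url: "/necklaces/gold", title: "Gold necklaces", parentUrl: "/necklaces" },
  { url: "/necklaces/initial", title: "Initial necklaces", parentUrl: "/necklaces" },
];
console.log(subcategoryLinks(necklaces, pages));
```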

4. Don’t forget forward navigation

When planning your website navigation, it’s surprisingly easy to miss out on forward navigation. 

A common culprit is e-commerce category pages. Because users can navigate forward using JavaScript-based filtering, links to subcategories of the current category are sometimes omitted.

I worked with jewelry brand Abbott Lyon in mid-2021. I found it needed longer-tail categories and components for forward navigation on categories. 

It fixed the issue by December 2021 and started to see the reward by March 2022. It has continuously grown since the fix, setting itself up for an excellent peak season in December 2022.

Line graph showing how traffic increased over time after a navigation strategy was put in place
Chart is from the Overview report in Site Explorer.

But how specifically did it improve navigation? It was as simple as adding a block of links to subcategories in each category. 

Block links on a category page for necklaces

This nicely displays the influence navigation components can have on SEO. 

5. Link to important pages globally

Burger and mega menus help users quickly and easily find the important pages of your site. I’ve seen advice to make all your site pages one click away using mega menus—don’t do that.

Mega menus give you the option to link to more pages. But you should still only include the pages that are most important to you for two reasons:

  1. You don’t want to clutter the interface, making it harder for users to find important information.
  2. You want to consolidate PageRank into the pages that would bring the most commercial benefit to your business if they ranked well.

Refer to the sitemap I suggested in tip #1 and use the search demand data to decide what to include. Then, use the hierarchy you’ve set to determine how to organize it.

I take a data-led approach. So I usually end up with a sheet with multiple tables; each table looks like this.

Data table example
In this example, “Gifts by recipient” is a subsection of a mega menu panel.

I used the potential traffic metric from Ahrefs’ Keywords Explorer, GA sessions, and revenue data. Then, I created a 0–100 priority score based on those three metrics, which makes it easier to figure out which pages to include.
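The article doesn’t spell out the exact formula, so here’s a hedged sketch of one way such a 0–100 priority score could be computed: normalize each metric against its maximum and weight the three equally. The equal weighting and metric names are assumptions, not the author’s actual method.

```typescript
// Hedged sketch of a 0-100 priority score: normalize each metric against its
// maximum across all candidate pages and average the three. The equal
// weighting (and the metric names) are assumptions, not the article's formula.
interface PageMetrics {
  url: string;
  potentialTraffic: number; // e.g. from keyword research
  sessions: number;         // e.g. from Google Analytics
  revenue: number;          // e.g. from e-commerce reporting
}

function priorityScores(pages: PageMetrics[]): Map<string, number> {
  const maxOf = (f: (p: PageMetrics) => number) => Math.max(...pages.map(f), 1);
  const maxTraffic = maxOf((p) => p.potentialTraffic);
  const maxSessions = maxOf((p) => p.sessions);
  const maxRevenue = maxOf((p) => p.revenue);

  return new Map(
    pages.map((p) => [
      p.url,
      Math.round(
        (100 / 3) *
          (p.potentialTraffic / maxTraffic +
            p.sessions / maxSessions +
            p.revenue / maxRevenue)
      ),
    ])
  );
}

// Example: the highest-scoring pages are the strongest mega menu candidates.
console.log(
  priorityScores([
    { url: "/gifts-for-her", potentialTraffic: 12000, sessions: 8000, revenue: 30000 },
    { url: "/gift-wrap", potentialTraffic: 900, sessions: 400, revenue: 1200 },
  ])
);
```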

6. Show your site hierarchy with breadcrumbs

Breadcrumbs help users understand where they are within your site. If your site doesn’t have them, I recommend you add them.

In addition to helping users, they also benefit SEO by helping Google understand your hierarchy and distributing PageRank, as confirmed by Gary Illyes.

If you add breadcrumb structured data, you also increase the likelihood of breadcrumbs showing on search results, like in this example in Google documentation:

Breadcrumb structured data example in Google's SEO documentation
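For reference, here’s a minimal sketch of generating BreadcrumbList structured data in the schema.org format and injecting it as a JSON-LD script tag. The example trail and URLs are illustrative.

```typescript
// Minimal sketch of BreadcrumbList structured data (schema.org) generated and
// injected as a JSON-LD script tag. The example trail and URLs are illustrative.
interface Crumb {
  name: string;
  url: string;
}

function breadcrumbJsonLd(trail: Crumb[]): string {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    itemListElement: trail.map((crumb, i) => ({
      "@type": "ListItem",
      position: i + 1,
      name: crumb.name,
      item: crumb.url,
    })),
  });
}

const script = document.createElement("script");
script.type = "application/ld+json";
script.textContent = breadcrumbJsonLd([
  { name: "Home", url: "https://example.com/" },
  { name: "Men's clothing", url: "https://example.com/mens/" },
  { name: "Boots", url: "https://example.com/mens/boots/" },
]);
document.head.appendChild(script);
```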

The convention for breadcrumb placement is high up on the page under the header. Here’s an example on Search Engine Roundtable. 

Breadcrumb links on Search Engine Roundtable

You don’t necessarily need to place them that high, though. It’s something to test to see how it affects users achieving the primary goals you’ve set for your site. The placement won’t impact your SEO.

7. Feature popular pages throughout your site

Wherever it makes sense, add links to your most popular pages throughout your site.

Sephora does a great job of this on top-level categories, like its makeup page. This category has a grid of links to various makeup-related categories. 

Grid links to different makeup categories on the Sephora site

None of these are direct child pages of the makeup category; there is another category below makeup before you reach these pages.

Breadcrumbs links on Sephora

Still, Sephora has added links because they’re popular with users.

This keeps the site structure flat rather than deep. If internal links were based purely on hierarchy, users would have to make multiple clicks to reach pages deeper within the site, even if they’re popular.

8. Don’t hide important navigation on mobile

Mobile navigation can be more challenging due to the reduced amount of space. But the solution isn’t hiding essential navigation elements on mobile.

Take a look at the mega menu on Stripe, for example.

Mega menu on Stripe's mobile site

There are a lot of links on desktop; ideally, you also want to make it easy for users to find those links on mobile.

Stripe’s solution is to use a slider and take advantage of vertical space and users’ tendency to scroll on mobile. 

Stripe's mobile navigation menu

This is a brilliant example of keeping consistency between mobile and desktop UX. If you’re looking for a JS library to simplify creating a similar menu, I recommend mmenu.

For SEO, using tabs, accordions, and sliders on mobile rather than removing content also means you’re less likely to be negatively impacted by Google’s mobile-first indexing.

Mobile indexing documentation

That said, you’re unlikely to run into issues with hiding content on mobile if your site is responsive (unless you’re using JS to remove content from the HTML on load).

9. Keep anchor text descriptive

Anchor text that is too short can confuse users about which page they will end up on after clicking the link.

This can also cause problems for SEO since anchor text is one way search engines determine page relevancy, as John Mueller has confirmed. 

For example, say you had a site that sold gaming equipment. Within a menu, you could have a list of links like this.

List of three links

Rather than just saying “laptops,” be more specific about the type of accessories, laptops, and desktops you’re selling. In this example, it’s likely as simple as prepending “Gaming” to each link.

List of three links

This will reduce uncertainty for users and help search engines figure out which of these two topics you’re targeting. 

List of keywords, via Ahrefs' Keywords Explorer
Source: Ahrefs’ Keywords Explorer.

Keep your anchor text descriptive, but don’t go overboard. For example, you don’t need to repeat words multiple times in breadcrumbs like this:

Long breadcrumbs

As breadcrumbs are hierarchical, you can shorten them; they’ll still make sense to users, and you can put more descriptive anchor text elsewhere on the site, like in menus or hierarchical links.

Short breadcrumbs

10. Order navigation by popularity

When creating your navigation components, order them based on how likely a user is to click a link. If you’re making a new site and doing thorough keyword research, you can understand the relative popularity of different topics by looking at search volumes. 

Keywords Explorer shows that gaming laptops are much more popular than the other topics. So unless there’s a business reason not to, I should promote them more prominently.

Keyword research for gaming laptop, accessories, and consoles

If your site receives a reasonable amount of traffic, you can use Google Analytics event tracking to see which links users click most. 

Doing this using Google Tag Manager is simple. You’ll need to set up a GA event tag with an appropriate category and action label, then set the “Label” field to the built-in “Click Text” variable, like this.

Tag configuration page on Google Tag Manager

“Click Text” will use the anchor text of the clicked link as the label, making it easy to identify the link.

Next, add a trigger to the tag so the event fires when a user clicks the links you want to track. To do this, set the trigger type to “Click – Just Links,” set it to only trigger on some link clicks, and fire the tag when the “Click Element” matches a CSS selector.

GTM trigger configuration

In the above example, I’m tracking a sub-menu in a dropdown.

If you’ve moved to GA4, the process is quite similar. You can reuse the same trigger and configure the event parameters to send the click text and URL. Here’s how that should look:

Event tracking tag configuration
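If you’d rather instrument these clicks in your own code instead of relying solely on GTM’s built-in click variables, a hedged sketch is to push a custom event to the dataLayer and trigger your GA4 event tag on it. The event and parameter names (nav_click, nav_click_text, nav_click_url) are illustrative assumptions, not from the article.

```typescript
// Hedged sketch: push a custom event to the GTM dataLayer when a navigation
// link is clicked, then map it to a GA4 event tag in GTM. The event and
// parameter names (nav_click, nav_click_text, nav_click_url) are illustrative.
declare global {
  interface Window {
    dataLayer: Record<string, unknown>[];
  }
}

window.dataLayer = window.dataLayer || [];

document.querySelectorAll<HTMLAnchorElement>("nav a").forEach((link) => {
  link.addEventListener("click", () => {
    window.dataLayer.push({
      event: "nav_click",
      nav_click_text: link.textContent?.trim() ?? "",
      nav_click_url: link.href,
    });
  });
});

export {};
```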

Here’s an example of the report you’ll see in GA once you’ve set this up:

Event tracking report in Google Analytics

Now you can easily decide how to order different items on your menus.

You may be wondering, “Does this impact SEO?” 

Potentially, yes. Google has filed patents on something called the “Reasonable Surfer Model.” This replaces part of the old “PageRank” algorithm, which relied on a “Random Surfer Model.”

The main difference between the two models is how they weigh the amount of PageRank passed. The Reasonable Surfer Model weighs the amount of PageRank passed based on the likelihood of a user clicking a link. The other model evenly distributes PageRank between all links. 

Explanation of the Reasonable Surfer Model

In short, if you reorganize links within a component or move different navigation components higher up the page, the more prominently displayed links may have more PageRank flow through them, so the target page could end up ranking better.
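To make the difference between the two models concrete, here’s a toy illustration (purely an assumption for explanation, not Google’s actual implementation) of how each would split a page’s PageRank across its outlinks.

```typescript
// Toy illustration (an assumption for explanation, not Google's implementation):
// the random surfer splits PageRank evenly across outlinks, while a
// "reasonable surfer" weights it by estimated click likelihood.
interface Outlink {
  target: string;
  clickLikelihood: number; // e.g. estimated from link position and prominence
}

function randomSurferShares(links: Outlink[]): Map<string, number> {
  return new Map(links.map((l) => [l.target, 1 / links.length]));
}

function reasonableSurferShares(links: Outlink[]): Map<string, number> {
  const total = links.reduce((sum, l) => sum + l.clickLikelihood, 0);
  return new Map(links.map((l) => [l.target, l.clickLikelihood / total]));
}

const links: Outlink[] = [
  { target: "/gaming-laptops", clickLikelihood: 0.6 }, // prominent header link
  { target: "/terms", clickLikelihood: 0.05 },         // buried footer link
];
console.log(randomSurferShares(links));      // each link gets 0.5
console.log(reasonableSurferShares(links));  // ~0.92 vs. ~0.08
```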

11. Don’t make users think

The best website navigation is the one that follows web design conventions, and the same goes for all aspects of web design. That means following standard practices like:

  • Keeping your primary header navigation at the top of the page rather than on the side.
  • Using breadcrumbs to help users understand structure.
  • Adding a link to the logo that takes you back to the homepage.
  • Keeping the header/footer consistent between pages.
  • Styling links and other elements in a standardized way.

If you’d like to explore this topic further, I recommend Steve Krug’s book titled “Don’t Make Me Think.”

12. Test, test, test

To make your navigation more effective, use split-testing tools like Optimizely or VWO to create variants of your navigation and identify areas for improvement. Test things like link labels, placements, colors, and styles, as well as different types of navigation like dropdown menus, tabs, and mega menus.

By making small changes and measuring their performance, you can improve the usability of your navigation and help visitors easily find the content they’re looking for. 

If you’re unsure of what to test, consider using a behavior analytics tool like Hotjar to gather insights via heatmaps, surveys, and recordings.

Keep navigation collaborative

Changing navigation can be challenging. This is because multiple stakeholders with different areas of expertise have their own demands. The merchandising, UX, customer service, and SEO teams all need to be satisfied.

Often, difficulties with changing navigation occur for two reasons:

  1. Usability isn’t the primary reason for making the change.
  2. There is a lack of data to suggest the change will be worthwhile.

For reason #1, if the change does not improve usability, the UX team will likely reject it. For instance, SEO teams often concentrate on PageRank and what search engines want. To persuade a UX expert, do not talk about that. Instead, discuss the secondary effect the change will have on users—that’s the more important element after all.

For reason #2, any team could reject the change. Start by gathering insights on why you think the change will be positive. These can include:

  • Search demand data.
  • Behavioral analytics data from Google Analytics or similar.
  • Customer service feedback.
  • Sales and revenue data.

The ideal scenario is that each team works together, using the data it relies on daily, to shape the navigation based on its expertise.

Final thoughts

Website navigation is a broad topic. Each component has a lot of nuance, and decisions about which components to use and how they’ll function usually involve input from people with different types of expertise.

Still, the return on implementing excellent site navigation is worth the time investment. Hopefully, this article has given you some ideas for how yours will work.

Any further questions on website navigation? Find me on Twitter.



The Search Algorithm Exposed: Inside Google’s Search API Documents Leak

Google’s search algorithm is, essentially, one of the biggest influencers of what gets found on the internet. It decides who gets to be at the top and enjoy the lion’s share of the traffic, and who gets relegated to the dark corners of the web — a.k.a. page two and beyond of the search results.

It’s the most consequential system of our digital world. How that system works has been largely a mystery for years, but no longer. The Google search document leak, which went public just yesterday, drops thousands of pages of purported ranking algorithm factors into our laps.

The Leak

There’s some debate as to whether the documentation was “leaked” or “discovered.” But what we do know is that the API documentation was (likely accidentally) pushed live on GitHub, where it was then found.

The thousands and thousands of pages in these documents, which appear to come from Google’s internal Content API Warehouse, give us an unprecedented look into how Google search and its ranking algorithms work. 

Fast Facts About the Google Search API Documentation

  • Reported to be the internal documentation for Google Search’s Content Warehouse API.
  • The documentation indicates this information is accurate as of March 2024.
  • 2,596 modules are represented in the API documentation with 14,014 attributes. These are what we might call ranking factors or features, but not all attributes may be considered part of the ranking algorithm. 
  • The documentation did not indicate how these ranking factors are weighted.

And here’s the kicker: several factors found in this document are ones that Google has said, on record, it didn’t track and didn’t include in its algorithms.

That’s invaluable to the SEO industry, and undoubtedly something that will direct how we do SEO for the foreseeable future.

Is The Document Real? 

Another subject of debate is whether these documents are real. On that point, here’s what we know so far:

  • The documentation was on GitHub and was briefly made public from March to May 2024.
  • The documentation contained links to private GitHub repositories and internal pages — these required specific, Google-credentialed logins to access.
  • The documentation uses similar notation styles, formatting, and process/module/feature names and references seen in public Google API documentation.
  • Ex-Googlers say documentation similar to this exists on almost every Google team, i.e., with explanations and definitions for various API attributes and modules.

No doubt Google will deny this is its work (as of writing, it has declined to comment on the leak). But all signs so far point to this document being the real deal, though I still caution everyone to take everything you learn from it with a grain of salt.

What We Learnt From The Google Search Document Leak

With over 2,500 technical documents to sift through, the insights we have so far are just the tip of the iceberg. I expect that the community will be analyzing this leak for months (possibly years) to gain more SEO-applicable insights.

Other articles have gotten into the nitty-gritty of it already. But if you’re having a hard time understanding all the technical jargon in those breakdowns, here’s a quick and simple summary of the points of interest identified in the leak so far:

  • Google uses something called “Twiddlers.” These are functions that help rerank a page (think boosting or demotion calculations). 
  • Content can be demoted for reasons such as SERP signals (aka user behavior) indicating dissatisfaction, a link not matching the target site, using exact match domains, product reviews, location, or sexual content.
  • Google uses a variety of measurements related to clicks, including “badClicks,” “goodClicks,” “lastLongestClicks,” and “unsquashedClicks.”
  • Google keeps a copy of every version of every page it has ever indexed. However, it only uses the last 20 changes of any given URL when analyzing a page.
  • Google uses a domain authority metric called “siteAuthority.”
  • Google uses a system called “NavBoost” that uses click data for evaluating pages.
  • Google has a “sandbox” that websites are segregated into, based on age or lack of trust signals. This is indicated by an attribute called “hostAge.”
  • May be related to the last point, but there is an attribute called “smallPersonalSite” in the documentation. Unclear what this is used for.
  • Google does identify entities on a webpage and can sort, rank, and filter them.
  • So far, the only attributes that can be connected to E-E-A-T are author-related attributes.
  • Google uses Chrome data as part of its page quality scoring, with a module featuring a site-level measure of views from Chrome (“chromeInTotal”).
  • The number, diversity, and source of your backlinks matter a lot, even if PageRank has not been mentioned by Google in years.
  • Title tags being keyword-optimized and matching search queries is important.
  • The “siteFocusScore” attribute measures how much a site is focused on a given topic.
  • Publish dates and how frequently a page is updated determine content “freshness” — which is also important.
  • Font size and text weight for links are things that Google notices. It appears that larger links are more positively received by Google.

Author’s Note: This is not the first time a search engine’s ranking algorithm was leaked. I covered the Yandex hack and how it affects SEO in 2023, and you’ll see plenty of similarities in the ranking factors both search engines use.

Action Points for Your SEO

I did my best to review as many of the leaked “ranking features” as I could, as well as the original articles by Rand Fishkin and Mike King. From there, I have some insights I want to share with other SEOs and webmasters who want to know how to proceed with their SEO.

Links Matter — Link Value Affected by Several Factors 

Links still matter. Shocking? Not really. It’s something I and other SEOs have been saying, even if link-related guidelines barely show up in Google news and updates nowadays.

Still, we need to emphasize link diversity and relevance in our off-page SEO strategies. 

Some insights from the documentation:

  • PageRank of the referring domain’s homepage (also known as Homepage Trust) affects the value of the link.
  • Indexing tier matters. Regularly updated and accessed content is of the highest tier, and provides more value for your rankings.

If you want your off-page SEO to actually do something for your website, then focus on building links from websites that have authority, and from pages that are either fresh or are otherwise featured in the top tier. 

Some PR might help here — news publications tend to drive the best results because of how well they fulfill these factors.

As for guest posts, there’s no clear indication that these will hurt your site, but I definitely would avoid approaching them as a way to game the system. Instead, be discerning about your outreach and treat it as you would if you were networking for new business partners.

Aim for Successful Clicks 

The fact that clicks are a ranking factor should not be a surprise. Despite what Google’s team says, clicks are the clearest indicator of user behavior and how good a page is at fulfilling their search intent.

Google’s whole deal is providing the answers you want, so why wouldn’t they boost pages that seem to do just that?

The core of your strategy should be creating great user experiences. Great content that provides users with the right answers is how you do that. Aiming for qualified traffic is how you do that. Building a great-looking, functioning website is how you do that.

Go beyond just picking clickbait title tags and meta descriptions, and focus on making sure users get what they need from your website.

Author’s Note: If you haven’t been paying attention to page quality since the concepts of E-E-A-T and the HCU were introduced, now is the time to do so. Here’s my guide to ranking for the HCU to help you get started.

Keep Pages Updated

An interesting click-based measurement is the “last good click.” The fact that it sits in a module related to indexing signals suggests that content decay can affect your rankings.

Be vigilant about which pages on your website are not driving the expected amount of clicks for their SERP position. Outdated posts should be audited to ensure they have up-to-date, accurate information to help users in their search journey.

This should revive those posts and drive clicks, preventing content decay. 

It’s especially important to start on this if you have content pillars on your website that aren’t driving the same traffic as they used to.

Establish Expertise & Authority  

Google does notice the entities on a webpage, which include a bunch of things, but what I want to focus on are those related to your authors.

E-E-A-T as a concept is pretty nebulous — because scoring “expertise” and “authority” of a website and its authors is nebulous. So, a lot of SEOs have been skeptical about it.

However, the presence of an “author” attribute combined with the in-depth mapping of entities in the documentation shows there is some weight to having a well-established author on your website.

So, apply author markup, create an author bio page and archive, and showcase your official profiles on your website to prove your expertise.

Build Your Domain Authority

After countless Q&As and interviews where statements like “we don’t have anything like domain authority” and “we don’t have a website authority score” were thrown around, we find there does exist an attribute called “siteAuthority.”

Though we don’t know specifically how this measure is computed, or how much weight it carries in the overall scoring for your website, we know it does matter to your rankings.

So, what do you need to do to improve site authority? It’s simple — keep following best practices and white-hat SEO, and you should be able to grow your authority within your niche. 

Stick to Your Niche

Speaking of niches — I found the “siteFocusScore” attribute interesting. It appears that building more and more content within a specific topic is considered a positive.

It’s something other SEOs have hypothesized before. After all, the more you write about a topic, the more you must be an authority on that topic, right?

But anyone can write tons of blogs on a given topic nowadays with AI, so how do you stand out (and avoid the risk of sounding artificial and spammy)?

That’s where author entities and link building come in. I do think that great content should be supplemented by link-building efforts, as a way of showing: “I’m an authority with these credentials, and these other people think I’m an authority on the topic as well.”

Key Takeaway

Most of the insights from the Google search document leak are things that SEOs have been working on for months (if not years). However, we now have solid evidence behind a lot of our hunches, proving that our theories are in fact best practices.

The biggest takeaway I have from this leak: Google relies on user behavior (click data and post-click behavior in particular) to find the best content. Other ranking factors supplement that. Optimize to get users to click on and then stay on your page, and you should see benefits to your rankings.

Could Google remove these ranking factors now that they’ve been leaked? They could, but it’s highly unlikely that they’ll remove vital attributes in the algorithm they’ve spent years building. 

So my advice is to follow these now validated SEO practices and be very critical about any Google statements that follow this leak.

Google Search Leak: Conflicting Signals, Unanswered Questions

An apparent leak of Google Search API documentation has sparked intense debate within the SEO community, with some claiming it proves Google’s dishonesty and others urging caution in interpreting the information.

As the industry grapples with the allegations, a balanced examination of Google’s statements and the perspectives of SEO experts is crucial to understanding the whole picture.

Leaked Documents Vs. Google’s Public Statements

Over the years, Google has consistently maintained that specific ranking signals, such as click data and user engagement metrics, aren’t used directly in its search algorithms.

In public statements and interviews, Google representatives have emphasized the importance of relevance, quality, and user experience while denying the use of specific metrics like click-through rates or bounce rates as ranking-related factors.

However, the leaked API documentation appears to contradict these statements.

It contains references to features like “goodClicks,” “badClicks,” “lastLongestClicks,” impressions, and unicorn clicks, tied to systems called Navboost and Glue, which Google VP Pandu Nayak confirmed in DOJ testimony are parts of Google’s ranking systems.

The documentation also alleges that Google calculates several metrics using Chrome browser data on individual pages and entire domains, suggesting the full clickstream of Chrome users is being leveraged to influence search rankings.

This contradicts past Google statements that Chrome data isn’t used for organic searches.

The Leak’s Origins & Authenticity

Erfan Azimi, CEO of digital marketing agency EA Eagle Digital, alleges he obtained the documents and shared them with Rand Fishkin and Mike King.

Azimi claims to have spoken with ex-Google Search employees who confirmed the authenticity of the information but declined to go on record due to the situation’s sensitivity.

While the leak’s origins remain somewhat ambiguous, several ex-Googlers who reviewed the documents have stated they appear legitimate.

Fishkin states:

“A critical next step in the process was verifying the authenticity of the API Content Warehouse documents. So, I reached out to some ex-Googler friends, shared the leaked docs, and asked for their thoughts.”

Three ex-Googlers responded, with one stating, “It has all the hallmarks of an internal Google API.”

However, without direct confirmation from Google, the authenticity of the leaked information is still debatable. Google has not yet publicly commented on the leak.

It’s important to note that, according to Fishkin’s article, none of the ex-Googlers confirmed that the leaked data was from Google Search. Only that it appears to have originated from within Google.

Industry Perspectives & Analysis

Many in the SEO community have long suspected that Google’s public statements don’t tell the whole story. The leaked API documentation has only fueled these suspicions.

Fishkin and King argue that if the information is accurate, it could have significant implications for SEO strategies and website search optimization.

Key takeaways from their analysis include:

  • Navboost and the use of clicks, CTR, long vs. short clicks, and user data from Chrome appear to be among Google’s most powerful ranking signals.
  • Google employs safelists for sensitive topics like COVID-19, elections, and travel to control what sites appear.
  • Google uses Quality Rater feedback and ratings in its ranking systems, not just as a training set.
  • Click data influences how Google weights links for ranking purposes.
  • Classic ranking factors like PageRank and anchor text are losing influence compared to more user-centric signals.
  • Building a brand and generating search demand is more critical than ever for SEO success.

However, just because something is mentioned in API documentation doesn’t mean it’s being used to rank search results.

Other industry experts urge caution when interpreting the leaked documents.

They point out that Google may use the information for testing purposes or apply it only to specific search verticals rather than use it as active ranking signals.

There are also open questions about how much weight these signals carry compared to other ranking factors. The leak doesn’t provide the full context or algorithm details.

Unanswered Questions & Future Implications

As the SEO community continues to analyze the leaked documents, many questions still need to be answered.

Without official confirmation from Google, the authenticity and context of the information are still a matter of debate.

Key open questions include:

  • How much of this documented data is actively used to rank search results?
  • What is the relative weighting and importance of these signals compared to other ranking factors?
  • How have Google’s systems and use of this data evolved?
  • Will Google change its public messaging and be more transparent about using behavioral data?

As the debate surrounding the leak continues, it’s wise to approach the information with a balanced, objective mindset.

Unquestioningly accepting the leak as gospel truth or completely dismissing it are both shortsighted reactions. The reality likely lies somewhere in between.

Potential Implications For SEO Strategies and Website Optimization

It would be highly inadvisable to act on information shared from this supposed ‘leak’ without confirming whether it’s an actual Google search document.

Further, even if the content originates from search, the information is a year old and could have changed. Any insights derived from the leaked documentation should not be considered actionable now.

With that in mind, while the full implications remain unknown, here’s what we can glean from the leaked information.

1. Emphasis On User Engagement Metrics

If click data and user engagement metrics are direct ranking factors, as the leaked documents suggest, it could place greater emphasis on optimizing for these metrics.

This means crafting compelling titles and meta descriptions to increase click-through rates, ensuring fast page loads and intuitive navigation to reduce bounces, and strategically linking to keep users engaged on your site.

Driving traffic through other channels like social media and email can also help generate positive engagement signals.

However, it’s important to note that optimizing for user engagement shouldn’t come at the expense of creating reader-focused content. Gaming engagement metrics is unlikely to be a sustainable, long-term strategy.

Google has consistently emphasized the importance of quality and relevance in its public statements, and based on the leaked information, this will likely remain a key focus. Engagement optimization should support and enhance quality content, not replace it.

2. Potential Changes To Link-Building Strategies

The leaked documents contain information about how Google treats different types of links and their impact on search rankings.

This includes details about the use of anchor text, the classification of links into different quality tiers based on traffic to the linking page, and the potential for links to be ignored or demoted based on various spam factors.

If this information is accurate, it could influence how SEO professionals approach link building and the types of links they prioritize.

Links that drive real click-throughs may carry more weight than links on rarely visited pages.

The fundamentals of good link building still apply—create link-worthy content, build genuine relationships, and seek natural, editorially placed links that drive qualified referral traffic.

The leaked information doesn’t change this core approach but offers some additional nuance to be aware of.

3. Increased Focus On Brand Building and Driving Search Demand

The leaked documents suggest that Google uses brand-related signals and offline popularity as ranking factors. This could include metrics like brand mentions, searches for the brand name, and overall brand authority.

As a result, SEO strategies may emphasize building brand awareness and authority through both online and offline channels.

Tactics could include:

  • Securing brand mentions and links from authoritative media sources.
  • Investing in traditional PR, advertising, and sponsorships to increase brand awareness.
  • Encouraging branded searches through other marketing channels.
  • Optimizing for higher search volumes for your brand vs. unbranded keywords.
  • Building engaged social media communities around your brand.
  • Establishing thought leadership through original research, data, and industry contributions.

The idea is to make your brand synonymous with your niche and build an audience that seeks you out directly. The more people search for and engage with your brand, the stronger those brand signals may become in Google’s systems.

4. Adaptation To Vertical-Specific Ranking Factors

Some leaked information suggests that Google may use different ranking factors or algorithms for specific search verticals, such as news, local search, travel, or e-commerce.

If this is the case, SEO strategies may need to adapt to each vertical’s unique ranking signals and user intents.

For example, local search optimization may focus more heavily on factors like Google My Business listings, local reviews, and location-specific content.

Travel SEO could emphasize collecting reviews, optimizing images, and directly providing booking/pricing information on your site.

News SEO requires focusing on timely, newsworthy content and optimized article structure.

While the core principles of search optimization still apply, the leaks suggest that understanding your particular vertical’s nuances, backed by real-world testing, could give you a competitive advantage.

Conclusion

The Google API documentation leak has created a vigorous discussion about Google’s ranking systems.

As the SEO community continues to analyze and debate the leaked information, it’s important to remember a few key things:

  1. The information isn’t fully verified and lacks context. Drawing definitive conclusions at this stage is premature.
  2. Google’s ranking algorithms are complex and constantly evolving. Even if entirely accurate, this leak only represents a snapshot in time.
  3. The fundamentals of good SEO – creating high-quality, relevant, user-centric content and promoting it effectively – still apply regardless of the specific ranking factors at play.
  4. Real-world testing and results should always precede theorizing based on incomplete information.

What To Do Next

As an SEO professional, the best course of action is to stay informed about the leak.

Because details about the document remain unknown, it’s not a good idea to consider any takeaways actionable.

Most importantly, remember that chasing algorithms is a losing battle.

The only winning strategy in SEO is to make your website the best result for your message and audience. That’s Google’s endgame, and that’s where your focus should be, regardless of what any particular leaked document suggests.



Google's AI Overviews Shake Up Ecommerce Search Visibility

An analysis of 25,000 ecommerce queries by Bartosz Góralewicz, founder of Onely, reveals the impact of Google’s AI overviews on search visibility for online retailers.

The study found that 16% of ecommerce queries now return an AI overview in search results, accounting for 13% of total search volume in this sector.

Notably, 80% of the sources listed in these AI overviews do not rank organically for the original query.

“Ranking #1-3 gives you only an 8% chance of being a source in AI overviews,” Góralewicz stated.

Shift Toward “Accelerated” Product Experiences

International SEO consultant Aleyda Solis analyzed the disconnect between traditional organic ranking and inclusion in AI overviews.

According to Solis, for product-related queries, Google is prioritizing an “accelerated” approach over summarizing currently ranking pages.

She commented on Góralewicz’s findings, stating:

“… rather than providing high level summaries of what’s already ranked organically below, what Google does with e-commerce is “accelerate” the experience by already showcasing what the user would get next.”

Solis explains that for queries where Google previously ranked category pages, reviews, and buying guides, it’s now bypassing this level of results with AI overviews.

Assessing AI Overview Traffic Impact

To help retailers evaluate their exposure, Solis has shared a spreadsheet that analyzes the potential traffic impact of AI overviews.

As Góralewicz notes, this could be an initial rollout; he speculates that “Google will expand AI overviews for high-cost queries when enabling ads,” based on data showing they are currently excluded for high cost-per-click keywords.

An in-depth report across ecommerce and publishing is expected soon from Góralewicz and Onely, with additional insights into this search trend.

Why SEJ Cares

AI overviews represent a shift in how search visibility is achieved for ecommerce websites.

With most overviews currently pulling product data from non-ranking sources, the traditional connection between organic rankings and search traffic is being disrupted.

Retailers may need to adapt their SEO strategies for this new search environment.

How This Can Benefit You

While unsettling for established brands, AI overviews create new opportunities for retailers to gain visibility without competing for the most commercially valuable keywords.

Ecommerce sites can potentially circumvent traditional ranking barriers by optimizing product data and detail pages for Google’s “accelerated” product displays.

The detailed assessment framework provided by Solis enables merchants to audit their exposure and prioritize optimization needs accordingly.


FAQ

What are the key findings from the analysis of AI overviews & ecommerce queries?

Góralewicz’s analysis of 25,000 ecommerce queries found:

  • 16% of ecommerce queries now return an AI overview in the search results.
  • 80% of the sources listed in these AI overviews do not rank organically for the original query.
  • Ranking positions #1-3 only provides an 8% chance of being a source in AI overviews.

These insights reveal significant shifts in how ecommerce sites need to approach search visibility.

Why are AI overviews pulling product data from non-ranking sources, and what does this mean for retailers?

Google’s AI overviews prioritize “accelerated” experiences over summarizing currently ranked pages for product-related queries.

This shift focuses on showcasing directly what users seek instead of traditional organic results.

For retailers, this means:

  • A need to optimize product pages beyond traditional SEO practices, catering to the data requirements of AI overviews.
  • Opportunities to gain visibility without necessarily holding top organic rankings.
  • Potential to bypass traditional ranking barriers by focusing on enhanced product data integration.

Retailers must adapt quickly to remain competitive in this evolving search environment.

What practical steps can retailers take to evaluate and improve their search visibility in light of AI overview disruptions?

Retailers can take several practical steps to evaluate and improve their search visibility:

  • Utilize the spreadsheet provided by Aleyda Solis to assess the potential traffic impact of AI overviews.
  • Optimize product and detail pages to align with the data and presentation style preferred by AI overviews.
  • Continuously monitor changes and updates to AI overviews, adapting strategies based on new data and trends.

These steps can help retailers navigate the impact of AI overviews and maintain or improve their search visibility.


Featured Image: Marco Lazzarini/Shutterstock


