Single-Page Websites and SEO: The Essential Guide

Almost all SEO strategies involve launching several pages on a website. In most instances, this is undoubtedly the way to go, but can you run a successful SEO campaign with just a single-page website?

In this article, we’ll define what a single-page website is, review whether it is good or bad for SEO, and run through some top SEO tips for single-page websites.

What is a single-page website?

A single-page website contains all of the site’s content on only one landing page. Unlike multi-page websites, the user is unable to navigate to content on separate URLs via internal linking.

[Image: Structure comparison between single-page and multi-page websites]

A single-page website is pretty much what it says on the tin. It shouldn’t, however, be confused with a single-page application (SPA).

A SPA is a website that loads a single web document when the user first visits the site. The rest of the content is then loaded dynamically as and when the user interacts with the site.

Within a SPA, the user can still technically visit different pages. These are typically rendered “client-side” via JavaScript to dynamically serve the new content.
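
To make that concrete, here's a minimal sketch of hash-based client-side routing, the kind of thing an SPA framework does in a far more sophisticated way. The routes, element ID, and content here are all hypothetical:

  // Minimal sketch of client-side routing in an SPA (illustrative only).
  // Assumes the page contains <div id="app"></div>; routes are made up.
  const routes = {
    "#/": "<h1>Home</h1><p>Welcome.</p>",
    "#/contact": "<h1>Contact</h1><p>Get in touch.</p>",
  };

  function render() {
    // Swap new content into the page without a full reload.
    const view = routes[location.hash || "#/"] || "<h1>Not found</h1>";
    document.getElementById("app").innerHTML = view;
  }

  // Re-render whenever the URL hash changes, i.e., the user "navigates."
  window.addEventListener("hashchange", render);
  window.addEventListener("DOMContentLoaded", render);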

Are single-page websites good for SEO?

In most instances, you’ll want to launch your website with more than one page. Single-page websites limit your potential to grow organic traffic.

Here are my top reasons why single-page websites aren’t good for SEO.

Lack of content compromises keyword targeting

One of the biggest issues with a single-page website is that you are limited to targeting a small group of keywords.

While it’s possible to target different keywords via a single landing page, in SEO, it’s often more effective to split keyword targeting out via separate pages with a dedicated focus.

This process is often referred to as “keyword mapping,” where keywords are mapped to dedicated landing pages based on Google’s perceived search intent behind the keyword.

The best way to understand the search intent behind a keyword is to simply search the terms manually and see what results Google serves up. If Google ranks single-page websites in the top positions for your targeted query, then chances are you can rank in the top positions with a single page too.

In my experience, however, Google prefers to rank content that is super relevant to the search term. Even if you were to target keywords with different focuses through separate passages of content on a single page, you'd be diluting the overall relevancy of that page. Splitting this content out into hyper-focused landing pages is a far more effective content strategy.

Building out a strong breadth of content relevant to your niche also helps build authority and topical relevance to your industry in the eyes of Google.

Let’s say, for example, you are looking to purchase contact lenses. Store A provides a single product landing page to purchase the lenses.

On the other hand, Store B has the product page, a “contact us” page, and several relevant and informative blog posts answering common queries to do with contact lens eye care.

[Image: Structure comparison with Store B (multi-page) housing more content than Store A (single-page)]

Of course, there’s much more at play here. But generally, as a potential customer, you’ll be more likely to trust Store B. Similarly, in the eyes of Google, a wider breadth of trustworthy content is a signal of expertise and builds topical relevance and association to your niche.

Lack of structure and organization leads to poor user experience

Single-page websites often lack clear structure and organization. With all content thrown into a single page, it’s common for users to have a confusing and frustrating experience.

This is because the only way for a user to navigate a single-page site is to scroll the page and click on anchor links (if available). The more content you squeeze onto a single page, the more frustrating this experience can become for the user, as it takes more effort to find the desired content.

Multi-page websites typically have a clear hierarchy of content defined by a header navigation menu and breadcrumbs. Users are familiar with navigating these setups. When configured correctly, these also provide a seamless experience for users to hop between pages at their leisure.

Content is often truncated to help users more easily navigate a single-page website. However, this approach does come with its drawbacks. By streamlining your content, you could be failing to include information that your user is looking for and stripping out content with SEO value.

Limited potential to acquire backlinks

Single-page websites are often transactional and conversion-focused. They typically include minimal content of an informational nature, such as blog posts, studies, or campaign-style pages.

In my first Ahrefs blog post titled “Here’s Why You Should Prioritize Internal Linking,” I mentioned how webmasters typically link to content of an informational nature as opposed to a transactional one.

So by going for a single-page website, you are likely to be compromising your ability to build quality backlinks. 

Link building is one of the three key pillars of SEO. By rolling out a website structure that is far from optimal for building links, you’ll be limiting your ranking potential.

Having a multi-page website gives you more flexibility to roll out more of the content that naturally attracts links.

What are the SEO benefits of running a single-page website?

At this stage, you may be wondering why anyone would run a single-page website. Even though I’ve outlined plenty of reasons not to, a single-page website may just be the right fit for you at this moment in time.

Many webmasters may opt to run a single-page website in the short term, with a view to expanding and scaling up the site in the long term. In this instance, a single-page website makes for a nice placeholder or MVP version of a site.

They are also relatively cheap and easy to set up. You only need the resources to design, create, and host a single page as opposed to several.

These are some obvious non-SEO reasons why a single-page website may be the right fit for you. There are some SEO benefits too.

They provide a great starting point for brand launches

If you are in the process of launching a new brand, you’ll likely be working relentlessly behind the scenes to ensure your full-scale website is ready for launch.

A single-page website often makes the perfect placeholder site prior to a brand’s launch (where you will likely switch to the full website upon launch). Having the single-page website in place ensures Google has, at the very least, crawled and indexed your website in time for the brand’s launch.

This helps to avoid a situation where your brand (and website) launches without being indexed on Google. This could be catastrophic, with your website potentially missing out on valuable clicks on the big day of your brand’s launch.

Having at least a single-page website in place allows you to be indexed and build up crucial rankings for key branded terms ahead of a launch.

PageRank is focused on a single page

I mentioned earlier that it’s generally more difficult to build backlinks toward a single-page website. This is certainly the case. However, one advantage to having a single-page website is that you are less likely to suffer from PageRank dilution.

All backlinks that are built to your site will point toward a single URL. This means that all PageRank built toward your site is associated with a single page, as opposed to being diluted as authority is passed on through internal links.

When PageRank is passed on through internal links on multi-page websites, a little of the backlink's overall value is lost at each step. This is the "PageRank Damping Factor": the value passed on diminishes with each "hop."

[Image: Example showing the PageRank Damping Factor]

Let’s say, for example, we have Store A and Store B that sell the same product, except Store A is a single-page website and Store B is a multi-page website. They both receive a backlink of the same PageRank value, targeting the root domain of the respective websites.

Because Store A’s product sits on the homepage (root domain) as a single-page website, this product receives the maximum value from the backlink.

But Store B has to use an internal link to pass on the PageRank from the homepage to the dedicated product page. Because the value dampens through each internal link “hop,” Store B’s product doesn’t receive as much of a PageRank boost as Store A’s.

[Image: PageRank damping on a multi-page website vs. no damping on a single-page website]
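
To put rough numbers on it, here's a toy calculation. The 0.85 damping factor comes from the original PageRank paper; the backlink value of 100 is an arbitrary example number, and real PageRank also splits value among all of a page's outgoing links, which this sketch ignores:

  // Toy illustration of PageRank damping across internal link "hops".
  // Numbers are illustrative only; real PageRank also divides value
  // among all outgoing links on a page, which is ignored here.
  const DAMPING = 0.85;       // damping factor from the original paper
  const backlinkValue = 100;  // arbitrary example value

  // Store A: the backlink hits the product page directly (0 hops).
  const storeA = backlinkValue;            // 100

  // Store B: homepage -> product page is one internal-link hop.
  const storeB = backlinkValue * DAMPING;  // 85

  console.log(`Store A product page receives: ${storeA}`);
  console.log(`Store B product page receives: ${storeB}`);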

Having said that, in most cases, I’d still go with a multi-page website due to its natural ability to acquire more backlinks in comparison.

They naturally offer a good experience to mobile users

Another benefit of single-page websites is that they are naturally well suited to mobile users. In fact, they are often built with mobile users in mind.

Navigating via anchor links often lands well with mobile users, and the more succinct and snappy nature of the copy is well aligned with mobile optimization best practices.

Ensuring your users have a good experience on mobile devices is more important than ever. According to Statista, a majority of all worldwide website visits come from users on mobile devices.

Google will also predominantly crawl the mobile version of your site and evaluate mobile-friendliness as part of its ranking signals. It’s never been more important to optimize for mobile.

Tips for optimizing a single-page website

So we’ve been through the advantages and disadvantages (relating to SEO) of running a single-page website. 

If a single-page website is currently the right option for you, here are my top SEO tips for creating and running one.

Use a clear hierarchy

As we highlighted earlier on, single-page websites often lack a clear architecture instilled by navigation menus and breadcrumbs. 

With that in mind, you’ll want to set a clear on-page hierarchy for your content. Using a logical heading structure consisting of a single H1 for the main heading and H2s and H3s for the subheadings makes a great starting point.

[Image: Example showing logical heading tag structure vs. illogical structure]

Using these heading tags in a logical order to break up your content makes it easier for users to scan and navigate the page. A messy heading structure forces users to work out the page's organization for themselves as they navigate.

This is even more critical for users who are visually impaired and may be using a screen reader, so be sure not to skip any heading levels (e.g., nesting an H4 directly under an H2).
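
As a quick sanity check, here's a browser-console sketch (plain JavaScript, illustrative only) that flags skipped heading levels on whatever page you run it on:

  // Browser-console sketch: warn about skipped heading levels
  // (e.g., an <h4> appearing directly after an <h2>).
  let previousLevel = 0;
  document.querySelectorAll("h1, h2, h3, h4, h5, h6").forEach((heading) => {
    const level = Number(heading.tagName[1]);
    if (previousLevel && level > previousLevel + 1) {
      console.warn(
        `Skipped from h${previousLevel} to h${level}:`,
        heading.textContent.trim()
      );
    }
    previousLevel = level;
  });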

Don’t overlook image optimization

Image optimization is often overlooked in SEO. Given how single-page websites often pack in lots of imagery, image SEO should not be skipped.

There aren’t any extra rules to follow when it comes to single-page websites, so be sure to follow best practices such as:

  • Using descriptive alt text and file names.
  • Compressing image file sizes and using next-gen file types.
  • Loading images via a CDN (content delivery network).

Following these best practices will not only support potential rankings via image search but also enhance page speed performance.
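
For the first two points, a quick client-side audit can surface obvious misses. Here's a sketch; the 2x oversize threshold is an arbitrary rule of thumb of ours, not an official cutoff:

  // Browser-console sketch: flag images missing alt text, plus images
  // whose source file is much larger than their displayed size (a hint
  // that compression or responsive sizing is needed).
  document.querySelectorAll("img").forEach((img) => {
    if (!img.alt || !img.alt.trim()) {
      console.warn("Missing alt text:", img.currentSrc || img.src);
    }
    if (img.clientWidth > 0 && img.naturalWidth > img.clientWidth * 2) {
      console.warn("Possibly oversized file:", img.currentSrc || img.src);
    }
  });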

Don’t neglect page speed

This leads me nicely to my next tip: don't neglect page speed. As it's the only page your users will load, be sure to make it fast and responsive.

Sure, out of the hundreds of ranking signals at play, page speed isn’t at the top of the list when it comes to SEO priorities. However, it’s hard to ignore that having a fast and responsive website supports not only organic rankings but also user experience.

Discussing Core Web Vitals on Reddit, John Mueller described Core Web Vitals as being “more than a tie breaker” signal.

Think with Google states that 53% of visits are abandoned if a mobile site takes longer than three seconds to load.

Couple these points together, and we have a pretty good case for ensuring page speed remains a relevant consideration.

Lazy-loading below-the-fold content should be a key consideration for single-page websites. This is because you’ll likely be packing in more content than usual into the single page, weighing down page load times.

This means that any resources that require the user to scroll in order to be seen will be delayed in the initial page load. Instead, these resources will load as the user scrolls.

[Image: Below-the-fold images not being rendered in the initial page load]
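
Modern browsers handle the simple case natively via the loading="lazy" attribute on images. For finer control, the IntersectionObserver pattern below is common; note that the data-src convention is our own for this sketch, not a standard:

  // Sketch: lazy-load below-the-fold images with IntersectionObserver.
  // Markup convention (ours, for this example): <img data-src="...">
  // so the browser doesn't download the file up front.
  const lazyObserver = new IntersectionObserver(
    (entries, observer) => {
      entries.forEach((entry) => {
        if (!entry.isIntersecting) return;
        const img = entry.target;
        img.src = img.dataset.src; // trigger the real download
        observer.unobserve(img);   // each image only needs this once
      });
    },
    { rootMargin: "200px" } // start loading shortly before the image scrolls into view
  );

  document.querySelectorAll("img[data-src]").forEach((img) => lazyObserver.observe(img));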

Double down on link building

As we mentioned earlier in the article, single-page websites suffer when it comes to naturally pulling in backlinks. With this in mind, you’ll likely have to dedicate even more time to link building than usual.

You’ll need to double down on strategies that do not require the creation of new pages.

This is because you are unable to roll out pages dedicated to attracting links, for example, using strategies such as link baiting.

Providing expert comments/quotes for third-party websites is a great way to build links without the need to launch new pages.

The process is pretty simple:

  1. A journalist shares a request for an expert comment.
  2. You pitch to provide a comment.
  3. If successful, the journalist includes your comment in their piece.

There’s no guarantee the journalist will actually include a backlink alongside the comment. That said, there is a strong chance the journalist will include a link to credit the contributor. Once you start to build up the expert comments, the backlinks will start to build up too.

When it comes to finding journalist requests for expert comments, Twitter makes a great starting point. Journalists will often include #journorequest, making it easy to find relevant requests with a custom search.

[Image: Example of a #journorequest tweet]

There are also plenty of third-party platforms that journalists will use to submit such requests, such as HARO.

Finding unlinked brand mentions is another great strategy for building backlinks where you do not need to create any new content.

An unlinked brand mention is an online mention (citation) of your brand name or even a key spokesperson from your company where the publisher doesn’t include a backlink.

Overall, the process is relatively straightforward:

  1. Discover brand mentions via Ahrefs' Content Explorer.
  2. Crawl the mentions with a custom search to filter out pages that already provide a link.
  3. Perform outreach to the publishers that do not link, requesting that they add a link.

Ahrefs’ Joshua Hardwick has provided a detailed guide on converting unlinked brand mentions into backlinks. It’s well worth a read for more details on the above steps.
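
For step 2 specifically, you don't necessarily need a full crawler. Here's a minimal Node.js sketch (assumes Node 18+ for the built-in fetch; the URLs and domain are placeholders, and the href check is deliberately crude):

  // Sketch: given pages that mention your brand, keep only those that
  // do NOT already link to your domain. Node 18+ (built-in fetch).
  // URLs and domain below are hypothetical placeholders.
  const mentionUrls = [
    "https://news.example.com/industry-roundup",
    "https://blog.example.org/tools-review",
  ];
  const myDomain = "yourbrand.com";

  async function findUnlinkedMentions(urls, domain) {
    const unlinked = [];
    for (const url of urls) {
      const html = await (await fetch(url)).text();
      // Crude check: does any href on the page point at our domain?
      const linksToUs = new RegExp(`href=["'][^"']*${domain}`, "i").test(html);
      if (!linksToUs) unlinked.push(url);
    }
    return unlinked;
  }

  findUnlinkedMentions(mentionUrls, myDomain).then((urls) => {
    console.log("Outreach candidates:", urls);
  });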

These are just two examples of link building strategies that don't require new content. Check out “9 Easy Link Building Strategies (That Anyone Can Use)” for eight more valid strategies that I haven’t mentioned here.

Follow on-page best practices

My last tip is a pretty simple one, but it is probably one of the most crucial tips. Following on-page SEO best practices is a must.

You will be limited by the constraints of having just a single page, so you’ll have to forgo internal linking, for example. 

That said, it remains essential to optimize key on-page elements, from meta titles and descriptions to keyword targeting.

Final thoughts

Single-page websites aren’t for everybody. In most cases, I personally will go for a multi-page setup instead.

That said, there are some instances where a single-page website is practical, particularly as a short-term solution or MVP version of a website.

Key takeaways:

  • Keyword targeting is compromised due to a lack of content.
  • Single-page websites struggle to build and show expertise.
  • You’ll need to double down on link building, as single-page websites naturally acquire fewer backlinks.

Have any questions? Ping me on Twitter and let me know.




How To Use The Google Ads Search Terms Report

One of the most essential aspects of a profitable Google Ads strategy is reaching the right people, with the right message, while they’re searching.

To do this correctly, you need to know exactly how your ads are doing and what words potential customers are using to search.

This is where the Google Ads search terms report comes in handy.

This report is a goldmine and an invaluable asset to every Google Ads account.

With insights into exact phrases being used to trigger your ads, the search terms report can help:

  • Significantly refine your keyword strategy.
  • Enhance your targeting.
  • Boost your return on investment (ROI).

Let’s get into why the Google Ads search terms report is not only helpful but essential for maximizing Google Ads profitability.

What Is The Google Ads Search Terms Report?

The search terms report is a performance tool that shows how your ad performed when triggered by actual searches on the Google Search Network.

The report shows specific terms and phrases that triggered your ad to show, which helps determine if you’re bidding on the right keywords or using the right match types.

If you find search terms that aren’t relevant to your business, you can easily add them to your negative keyword lists.

This helps you spend your budget more effectively by ensuring your ads are only triggered for relevant, useful searches by potential customers.

Keep in mind that there is a difference between a search term and a keyword:

  • Search term: Shows the exact word or phrase a customer enters on the Google Search Network to trigger an ad.
  • Keyword: The word or phrase that Google Ads advertisers target and bid on to show their ads to customers.

How To Create A Search Terms Report

Creating a search terms report in your Google Ads account is simple, and better yet – it can be automated!

To view your search terms report, you’ll need to:

  • Log into your Google Ads account.
  • Navigate to “Campaigns” >> “Insights & reports” >> “Search terms”.

Below is an example of where to navigate in your Google Ads account to find the search terms report.

[Image: Screenshot of the search terms report location in Google Ads, taken by author, April 2024]

After running this report, there are multiple actions you can take as a marketer:

  • Add top-performing searches to corresponding ad groups as keywords.
  • Select the desired match type (e.g. broad, phrase, exact) if adding new keywords.
  • Add irrelevant search terms to a negative keyword list.
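
The automation mentioned earlier can be handled with Google Ads Scripts. Here's a rough sketch that pulls the last 30 days of search terms via a GAQL report; treat the exact field names and query syntax as assumptions to verify against the current AdsApp.report documentation:

  // Google Ads Script sketch: pull the last 30 days of search terms.
  // Verify field names and GAQL syntax against the current AdsApp docs.
  function main() {
    const report = AdsApp.report(
      "SELECT search_term_view.search_term, campaign.name, " +
      "metrics.impressions, metrics.clicks, metrics.conversions " +
      "FROM search_term_view " +
      "WHERE segments.date DURING LAST_30_DAYS " +
      "ORDER BY metrics.clicks DESC"
    );

    const rows = report.rows();
    while (rows.hasNext()) {
      const row = rows.next();
      Logger.log(
        row["search_term_view.search_term"] + " | clicks: " + row["metrics.clicks"]
      );
    }
  }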

3 Ways To Use Search Terms Report Data

As mentioned above, there are numerous ways you can use the search terms report data to optimize campaign performance.

Let’s take a look at three examples of how to use this report to get the best bang for your buck.

1. Refine Existing Keyword Lists

The first area the search terms report can help with is refining existing keyword lists.

By combing through the search terms report, you can find areas of opportunities, including:

  • What searches are leading to conversions.
  • What searches are irrelevant to the product or service.
  • What searches have high impressions but low clicks.
  • How searches are being mapped to existing keywords and ad groups.

For searches leading to conversions, it likely makes sense to add those as keywords to an existing ad group or create a new ad group.

If you’re finding some searches to be irrelevant to what you’re selling, it’s best to add them as negative keywords. That prevents your ad from showing up for that search moving forward.

If some searches have a high volume of impressions but very few clicks, these warrant further consideration. If it’s a keyword worth bidding on, low clicks may indicate that your bid strategy isn’t competitive enough and needs adjusting.

If a search term is being triggered by multiple keywords and ad groups, this is a case of cross-pollution of keywords. It can lower ROI because multiple keywords are effectively bidding on the same search term, which can drive up the cost. If this happens, you have a few options:

  • Review and update existing keyword match types as necessary.
  • Add negative keywords where appropriate at the ad group or campaign level to avoid cross-pollution.

Ultimately, using the search terms report in this way allows you to determine what is performing well and eliminate poor performers.
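
To act on the irrelevant terms you find, negatives can also be added by script rather than by hand. A sketch follows; the ad group name and terms are hypothetical, and the selector condition syntax should be verified against the current AdsApp documentation:

  // Google Ads Script sketch: add irrelevant search terms as exact-match
  // negatives on a specific ad group. Names and terms are placeholders.
  function main() {
    const irrelevantTerms = ["free contact lenses", "contact lens jobs"];

    const adGroups = AdsApp.adGroups()
      .withCondition("ad_group.name = 'Contact Lenses - Exact'")
      .get();

    while (adGroups.hasNext()) {
      const adGroup = adGroups.next();
      irrelevantTerms.forEach((term) => {
        // Square brackets make the negative keyword exact match.
        adGroup.createNegativeKeyword("[" + term + "]");
      });
    }
  }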

2. Understand How Your Audience Is Actually Searching For Your Product

Something I often see is a mismatch of how a company talks about its product or service vs. how a customer is actually searching for it in the real world.

If you’re bidding on keywords you think describe your product or service but are not getting any traction, you could be misaligning expectations.

Oftentimes, searches that lead to conversions are from terms you wouldn’t have thought to bid on without looking at the search terms report.

One of this report’s most underutilized use cases is finding lesser-known ways customers are searching for and finding your product.

Finding these types of keywords may result in the creation of a new campaign, especially if the search terms don’t fit existing ad group structures.

Building out campaigns by different search themes allows for appropriate bidding strategies for each because not all keyword values are created equal!

Understanding how a customer is describing their need for a product or service not only helps your keyword strategy but can lead to better-aligned product positioning.

This leads us to a third way the search term report can help your campaigns.

3. Optimize Ad Copy and Landing Pages

As discussed in #2, customers’ language and phrases can provide valuable insights into their needs and preferences.

Marketers can use the search terms report to better tailor ad copy, making it more relevant and appealing to prospective customers.

And let’s not forget about the corresponding landing page!

Once a user clicks on an ad, they expect to see an alignment of what they searched for and what is presented on a website.

Make sure that landing page content is updated regularly to better match the searcher’s intent.

This can result in a better user experience and an improvement in conversion rates.

How Using The Search Terms Report Can Help ROI

All three examples above are ways that the search terms report can improve campaign ROI.

How so?

Let’s take a look at each example further.

How Refining Keywords Helps ROI

Part of refining existing keywords is negating any irrelevant search terms that trigger an ad.

Having a solid negative keyword strategy gets rid of “unwanted” spending on keywords that don’t make sense.

That previously “wasted” spend then gets redirected to campaigns that regularly drive higher ROI.

Additionally, adding top-performing search terms gives you better control from a bid strategy perspective.

Being able to pull the appropriate levers and set proper bid strategies by search theme ultimately leads to better ROI.

How Understanding Audience Intent Helps ROI

By understanding the exact language and search terms that potential customers use, marketers can update ad copy and landing pages to better match those searches.

This can increase ad relevance and Ad Rank within Google Ads.

These items help with keyword Quality Score, and as your Quality Score increases, your CPCs can decrease.

More relevant ads likely lead to higher click-through rates, which leads to a higher likelihood of converting those users!

How Updating Ad Copy And Landing Pages Helps ROI

This example goes hand-in-hand with the above recommendation.

As you start to better understand the audience’s search intent, updating ad copy and landing pages to reflect their search indicates better ad relevance.

Once a user clicks on that relevant ad, they find the content of the landing page matches better to what they’re looking for.

This enhanced relevance can significantly increase the likelihood of conversion, which ultimately boosts ROI.

Use This Report To Make Data-Driven Decisions

Google Ads is an integral part of any digital marketing strategy, often accounting for a large portion of your marketing budget.

By regularly reviewing the search terms report, you can refine your marketing budget to make your Google Ads campaigns more effective.

Using this report to make data-driven decisions that fine-tune multiple facets of campaign management leads to more effective ad spending, higher conversions, and ultimately higher ROI.



The Search Algorithm Exposed: Inside Google’s Search API Documents Leak

Google’s search algorithm is, essentially, one of the biggest influencers of what gets found on the internet. It decides who gets to be at the top and enjoy the lion’s share of the traffic, and who gets relegated to the dark corners of the web, a.k.a. the second page of the search results and beyond.

It’s the most consequential system of our digital world. And how that system works has been largely a mystery for years, but no longer. The Google search document leak, which went public just yesterday, drops thousands of pages of purported ranking algorithm factors into our laps.

The Leak

There’s some debate as to whether the documentation was “leaked” or “discovered.” But what we do know is that the API documentation was (likely accidentally) pushed live on GitHub, where it was then found.

The thousands and thousands of pages in these documents, which appear to come from Google’s internal Content API Warehouse, give us an unprecedented look into how Google search and its ranking algorithms work. 

Fast Facts About the Google Search API Documentation

  • Reported to be the internal documentation for Google Search’s Content Warehouse API.
  • The documentation indicates this information is accurate as of March 2024.
  • 2,596 modules are represented in the API documentation with 14,014 attributes. These are what we might call ranking factors or features, but not all attributes may be considered part of the ranking algorithm. 
  • The documentation did not indicate how these ranking factors are weighted.

And here’s the kicker: several factors found in these documents are ones that Google has said, on record, it didn’t track and didn’t include in its algorithms.

That’s invaluable to the SEO industry, and undoubtedly something that will direct how we do SEO for the foreseeable future.

Is The Document Real? 

Another subject of debate is whether these documents are real. On that point, here’s what we know so far:

  • The documentation was on GitHub and was briefly made public from March to May 2024.
  • The documentation contained links to private GitHub repositories and internal pages — these required specific, Google-credentialed logins to access.
  • The documentation uses similar notation styles, formatting, and process/module/feature names and references seen in public Google API documentation.
  • Ex-Googlers say documentation similar to this exists on almost every Google team, i.e., with explanations and definitions for various API attributes and modules.

No doubt Google will deny this is their work (as of writing they refuse to comment on the leak). But all signs, so far, point to this document being the real deal, though I still caution everyone to take everything you learn from it with a grain of salt.

What We Learnt From The Google Search Document Leak

With over 2,500 technical documents to sift through, the insights we have so far are just the tip of the iceberg. I expect that the community will be analyzing this leak for months (possibly years) to gain more SEO-applicable insights.

Other articles have gotten into the nitty-gritty of it already. But if you’re having a hard time understanding all the technical jargon in those breakdowns, here’s a quick and simple summary of the points of interest identified in the leak so far:

  • Google uses something called “Twiddlers.” These are functions that help rerank a page (think boosting or demotion calculations).
  • Content can be demoted for reasons such as SERP signals (aka user behavior) indicating dissatisfaction, a link not matching the target site, using exact match domains, product reviews, location, or sexual content.
  • Google uses a variety of measurements related to clicks, including “badClicks”, “goodClicks”, “lastLongestClicks” and “unsquashedClicks”.
  • Google keeps a copy of every version of every page it has ever indexed. However, it only uses the last 20 changes of any given URL when analyzing a page.
  • Google uses a domain authority metric called “siteAuthority”.
  • Google uses a system called “NavBoost” that uses click data for evaluating pages.
  • Google has a “sandbox” that websites are segregated into, based on age or a lack of trust signals, indicated by an attribute called “hostAge”.
  • Possibly related to the last point: there is an attribute called “smallPersonalSite” in the documentation. It’s unclear what this is used for.
  • Google does identify entities on a webpage and can sort, rank, and filter them.
  • So far, the only attributes that can be connected to E-E-A-T are author-related attributes.
  • Google uses Chrome data as part of its page quality scoring, with a module featuring a site-level measure of views from Chrome (“chromeInTotal”).
  • The number, diversity, and source of your backlinks matter a lot, even if PageRank has not been mentioned by Google in years.
  • Title tags being keyword-optimized and matching search queries is important.
  • A “siteFocusScore” attribute measures how much a site is focused on a given topic.
  • Publish dates and how frequently a page is updated determine content “freshness”, which is also important.
  • Font size and text weight for links are things that Google notices. It appears that larger links are more positively received by Google.

Author’s Note: This is not the first time a search engine’s ranking algorithm was leaked. I covered the Yandex hack and how it affects SEO in 2023, and you’ll see plenty of similarities in the ranking factors both search engines use.

Action Points for Your SEO

I did my best to review as many of the leaked “ranking features” as I could, as well as the original articles by Rand Fishkin and Mike King. From there, I have some insights I want to share with other SEOs and webmasters out there who want to know how to proceed with their SEO.

Links Matter — Link Value Affected by Several Factors 

Links still matter. Shocking? Not really. It’s something I and other SEOs have been saying, even if link-related guidelines barely show up in Google news and updates nowadays.

Still, we need to emphasize link diversity and relevance in our off-page SEO strategies. 

Some insights from the documentation:

  • PageRank of the referring domain’s homepage (also known as Homepage Trust) affects the value of the link.
  • Indexing tier matters. Regularly updated and accessed content is of the highest tier, and provides more value for your rankings.

If you want your off-page SEO to actually do something for your website, then focus on building links from websites that have authority, and from pages that are either fresh or are otherwise featured in the top tier. 

Some PR might help here — news publications tend to drive the best results because of how well they fulfill these factors.

As for guest posts, there’s no clear indication that these will hurt your site, but I definitely would avoid approaching them as a way to game the system. Instead, be discerning about your outreach and treat it as you would if you were networking for new business partners.

Aim for Successful Clicks 

The fact that clicks are a ranking factor should not be a surprise. Despite what Google’s team says, clicks are the clearest indicator of user behavior and how good a page is at fulfilling their search intent.

Google’s whole deal is providing the answers you want, so why wouldn’t they boost pages that seem to do just that?

The core of your strategy should be creating great user experiences. Great content that provides users with the right answers is how you do that. Aiming for qualified traffic is how you do that. Building a great-looking, functioning website is how you do that.

Go beyond just picking clickbait title tags and meta descriptions, and focus on making sure users get what they need from your website.

Author’s Note: If you haven’t been paying attention to page quality since the concepts of E-E-A-T and the HCU were introduced, now is the time to do so. Here’s my guide to ranking for the HCU to help you get started.

Keep Pages Updated

An interesting click-based measurement is the “last good click.” The fact that it sits in a module related to indexing signals suggests that content decay can affect your rankings.

Be vigilant about which pages on your website are not driving the expected amount of clicks for their SERP positions. Outdated posts should be audited to ensure content has up-to-date and accurate information to help users in their search journey.

This should revive those posts and drive clicks, preventing content decay. 

It’s especially important to start on this if you have content pillars on your website that aren’t driving the same traffic as they used to.
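
The check itself is simple arithmetic once you have click data per page. Here's a sketch with made-up numbers; the data shape and the 50% threshold are our own conventions, and in practice you'd plug in an export from Search Console or your analytics tool of choice:

  // Sketch: flag pages whose clicks have decayed period-over-period.
  // Data, shape, and threshold are all illustrative.
  const pages = [
    { url: "/guide-to-x", clicksPrev: 1200, clicksNow: 450 },
    { url: "/pillar-post", clicksPrev: 900, clicksNow: 880 },
  ];

  const DECAY_THRESHOLD = 0.5; // flag pages that lost 50%+ of their clicks

  const decayed = pages.filter(
    (p) => p.clicksPrev > 0 && p.clicksNow / p.clicksPrev < DECAY_THRESHOLD
  );

  decayed.forEach((p) => {
    console.log(`Audit candidate: ${p.url} (${p.clicksPrev} -> ${p.clicksNow} clicks)`);
  });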

Establish Expertise & Authority  

Google does notice the entities on a webpage, which include a bunch of things, but what I want to focus on are those related to your authors.

E-E-A-T as a concept is pretty nebulous — because scoring “expertise” and “authority” of a website and its authors is nebulous. So, a lot of SEOs have been skeptical about it.

However, the presence of an “author” attribute combined with the in-depth mapping of entities in the documentation shows there is some weight to having a well-established author on your website.

So, apply author markups, create an author bio page and archive, and showcase your official profiles on your website to prove your expertise. 
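
One standard way to apply author markup is schema.org data in JSON-LD. Here's a sketch that injects it client-side; all names and URLs are placeholders, and server-rendered markup is generally preferable so crawlers see it without executing JavaScript:

  // Sketch: inject schema.org author markup as JSON-LD.
  // All names and URLs are placeholders.
  const articleSchema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Your Article Title",
    "author": {
      "@type": "Person",
      "name": "Jane Doe",
      "url": "https://example.com/authors/jane-doe",
      "sameAs": [
        "https://www.linkedin.com/in/janedoe",
        "https://x.com/janedoe",
      ],
    },
  };

  const script = document.createElement("script");
  script.type = "application/ld+json";
  script.textContent = JSON.stringify(articleSchema);
  document.head.appendChild(script);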

Build Your Domain Authority

After countless Q&As and interviews where statements like “we don’t have anything like domain authority” and “we don’t have a website authority score” were thrown around, we find there does exist an attribute called “siteAuthority”.

Though we don’t know specifically how this measure is computed or how it’s weighted in the overall scoring for your website, we know it does matter to your rankings.

So, what do you need to do to improve site authority? It’s simple — keep following best practices and white-hat SEO, and you should be able to grow your authority within your niche. 

Stick to Your Niche

Speaking of niches — I found the “siteFocusScore” attribute interesting. It appears that building more and more content within a specific topic is considered a positive.

It’s something other SEOs have hypothesized before. After all, the more you write about a topic, the more you must be an authority on that topic, right?

But anyone can write tons of blogs on a given topic nowadays with AI, so how do you stand out (and avoid the risk of sounding artificial and spammy)?

That’s where author entities and link-building come in. I do think that great content should be supplemented by link-building efforts, as a sort of way to show that hey, “I’m an authority with these credentials, and these other people think I’m an authority on the topic as well.”

Key Takeaway

Most of the insights from the Google search document leak are things that SEOs have been working on for months (if not years). However, we now have solid evidence behind a lot of our hunches, proving that our theories are in fact best practices.

The biggest takeaway I have from this leak: Google relies on user behavior (click data and post-click behavior in particular) to find the best content. Other ranking factors supplement that. Optimize to get users to click on and then stay on your page, and you should see benefits to your rankings.

Could Google remove these ranking factors now that they’ve been leaked? They could, but it’s highly unlikely that they’ll remove vital attributes in the algorithm they’ve spent years building. 

So my advice is to follow these now validated SEO practices and be very critical about any Google statements that follow this leak.


Google Search Leak: Conflicting Signals, Unanswered Questions

An apparent leak of Google Search API documentation has sparked intense debate within the SEO community, with some claiming it proves Google’s dishonesty and others urging caution in interpreting the information.

As the industry grapples with the allegations, a balanced examination of Google’s statements and the perspectives of SEO experts is crucial to understanding the whole picture.

Leaked Documents Vs. Google’s Public Statements

Over the years, Google has consistently maintained that specific ranking signals, such as click data and user engagement metrics, aren’t used directly in its search algorithms.

In public statements and interviews, Google representatives have emphasized the importance of relevance, quality, and user experience while denying the use of specific metrics like click-through rates or bounce rates as ranking-related factors.

However, the leaked API documentation appears to contradict these statements.

It contains references to features like “goodClicks,” “badClicks,” “lastLongestClicks,” impressions, and unicorn clicks, tied to systems called Navboost and Glue, which Google VP Pandu Nayak confirmed in DOJ testimony are parts of Google’s ranking systems.

The documentation also alleges that Google calculates several metrics using Chrome browser data on individual pages and entire domains, suggesting the full clickstream of Chrome users is being leveraged to influence search rankings.

This contradicts past Google statements that Chrome data isn’t used for organic searches.

The Leak’s Origins & Authenticity

Erfan Azimi, CEO of digital marketing agency EA Eagle Digital, alleges he obtained the documents and shared them with Rand Fishkin and Mike King.

Azimi claims to have spoken with ex-Google Search employees who confirmed the authenticity of the information but declined to go on record due to the situation’s sensitivity.

While the leak’s origins remain somewhat ambiguous, several ex-Googlers who reviewed the documents have stated they appear legitimate.

Fishkin states:

“A critical next step in the process was verifying the authenticity of the API Content Warehouse documents. So, I reached out to some ex-Googler friends, shared the leaked docs, and asked for their thoughts.”

Three ex-Googlers responded, with one stating, “It has all the hallmarks of an internal Google API.”

However, without direct confirmation from Google, the authenticity of the leaked information is still debatable. Google has not yet publicly commented on the leak.

It’s important to note that, according to Fishkin’s article, none of the ex-Googlers confirmed that the leaked data was from Google Search. Only that it appears to have originated from within Google.

Industry Perspectives & Analysis

Many in the SEO community have long suspected that Google’s public statements don’t tell the whole story. The leaked API documentation has only fueled these suspicions.

Fishkin and King argue that if the information is accurate, it could have significant implications for SEO strategies and website search optimization.

Key takeaways from their analysis include:

  • Navboost and the use of clicks, CTR, long vs. short clicks, and user data from Chrome appear to be among Google’s most powerful ranking signals.
  • Google employs safelists for sensitive topics like COVID-19, elections, and travel to control what sites appear.
  • Google uses Quality Rater feedback and ratings in its ranking systems, not just as a training set.
  • Click data influences how Google weights links for ranking purposes.
  • Classic ranking factors like PageRank and anchor text are losing influence compared to more user-centric signals.
  • Building a brand and generating search demand is more critical than ever for SEO success.

However, just because something is mentioned in API documentation doesn’t mean it’s being used to rank search results.

Other industry experts urge caution when interpreting the leaked documents.

They point out that Google may use the information for testing purposes or apply it only to specific search verticals rather than use it as active ranking signals.

There are also open questions about how much weight these signals carry compared to other ranking factors. The leak doesn’t provide the full context or algorithm details.

Unanswered Questions & Future Implications

As the SEO community continues to analyze the leaked documents, many questions still need to be answered.

Without official confirmation from Google, the authenticity and context of the information are still a matter of debate.

Key open questions include:

  • How much of this documented data is actively used to rank search results?
  • What is the relative weighting and importance of these signals compared to other ranking factors?
  • How have Google’s systems and use of this data evolved?
  • Will Google change its public messaging and be more transparent about using behavioral data?

As the debate surrounding the leak continues, it’s wise to approach the information with a balanced, objective mindset.

Unquestioningly accepting the leak as gospel truth or completely dismissing it are both shortsighted reactions. The reality likely lies somewhere in between.

Potential Implications For SEO Strategies and Website Optimization

It would be highly inadvisable to act on information shared from this supposed ‘leak’ without confirming whether it’s an actual Google search document.

Further, even if the content originates from search, the information is a year old and could have changed. Any insights derived from the leaked documentation should not be considered actionable now.

With that in mind, while the full implications remain unknown, here’s what we can glean from the leaked information.

1. Emphasis On User Engagement Metrics

If click data and user engagement metrics are direct ranking factors, as the leaked documents suggest, it could place greater emphasis on optimizing for these metrics.

This means crafting compelling titles and meta descriptions to increase click-through rates, ensuring fast page loads and intuitive navigation to reduce bounces, and strategically linking to keep users engaged on your site.

Driving traffic through other channels like social media and email can also help generate positive engagement signals.

However, it’s important to note that optimizing for user engagement shouldn’t come at the expense of creating reader-focused content. Gaming engagement metrics is unlikely to be a sustainable, long-term strategy.

Google has consistently emphasized the importance of quality and relevance in its public statements, and based on the leaked information, this will likely remain a key focus. Engagement optimization should support and enhance quality content, not replace it.

2. Potential Changes To Link-Building Strategies

The leaked documents contain information about how Google treats different types of links and their impact on search rankings.

This includes details about the use of anchor text, the classification of links into different quality tiers based on traffic to the linking page, and the potential for links to be ignored or demoted based on various spam factors.

If this information is accurate, it could influence how SEO professionals approach link building and the types of links they prioritize.

Links that drive real click-throughs may carry more weight than links on rarely visited pages.

The fundamentals of good link building still apply—create link-worthy content, build genuine relationships, and seek natural, editorially placed links that drive qualified referral traffic.

The leaked information doesn’t change this core approach but offers some additional nuance to be aware of.

3. Increased Focus On Brand Building and Driving Search Demand

The leaked documents suggest that Google uses brand-related signals and offline popularity as ranking factors. This could include metrics like brand mentions, searches for the brand name, and overall brand authority.

As a result, SEO strategies may emphasize building brand awareness and authority through both online and offline channels.

Tactics could include:

  • Securing brand mentions and links from authoritative media sources.
  • Investing in traditional PR, advertising, and sponsorships to increase brand awareness.
  • Encouraging branded searches through other marketing channels.
  • Optimizing for higher search volumes for your brand vs. unbranded keywords.
  • Building engaged social media communities around your brand.
  • Establishing thought leadership through original research, data, and industry contributions.

The idea is to make your brand synonymous with your niche and build an audience that seeks you out directly. The more people search for and engage with your brand, the stronger those brand signals may become in Google’s systems.

4. Adaptation To Vertical-Specific Ranking Factors

Some leaked information suggests that Google may use different ranking factors or algorithms for specific search verticals, such as news, local search, travel, or e-commerce.

If this is the case, SEO strategies may need to adapt to each vertical’s unique ranking signals and user intents.

For example, local search optimization may focus more heavily on factors like Google My Business listings, local reviews, and location-specific content.

Travel SEO could emphasize collecting reviews, optimizing images, and directly providing booking/pricing information on your site.

News SEO requires focusing on timely, newsworthy content and optimized article structure.

While the core principles of search optimization still apply, understanding your particular vertical’s nuances, based on the leaked information and real-world testing, could give you a competitive advantage.

Conclusion

The Google API documentation leak has created a vigorous discussion about Google’s ranking systems.

As the SEO community continues to analyze and debate the leaked information, it’s important to remember a few key things:

  1. The information isn’t fully verified and lacks context. Drawing definitive conclusions at this stage is premature.
  2. Google’s ranking algorithms are complex and constantly evolving. Even if entirely accurate, this leak only represents a snapshot in time.
  3. The fundamentals of good SEO – creating high-quality, relevant, user-centric content and promoting it effectively – still apply regardless of the specific ranking factors at play.
  4. Real-world testing and results should always precede theorizing based on incomplete information.

What To Do Next

As an SEO professional, the best course of action is to stay informed about the leak.

Because details about the document remain unknown, it’s not a good idea to consider any takeaways actionable.

Most importantly, remember that chasing algorithms is a losing battle.

The only winning strategy in SEO is to make your website the best result for your message and audience. That’s Google’s endgame, and that’s where your focus should be, regardless of what any particular leaked document suggests.


