Baidu Ranking Factors for 2024: A Comprehensive Data Study

As China’s largest search engine and a global AI and Internet technology leader, Baidu is a powerhouse of innovation. The ERNIE language model, surpassing Google’s BERT in Chinese language processing, positions Baidu at the cutting edge of technological advancement.

In our comprehensive Baidu SEO Ranking Factors Correlation Study*, we analyzed the SERPs for 10,000 Chinese keywords, delving into the top 20 rankings to uncover the factors influencing Baidu’s search engine algorithms.

Search Engine Insights

This study is a goldmine for SEO practitioners globally, not just those targeting the Chinese market. Baidu’s unique approach to search engine technology offers invaluable insights, especially in an era where a deep understanding of algorithms and how search engines work is crucial for SEO success.

Similar to how the SEO community has extensively studied the leaked Yandex papers, understanding Baidu’s SERP construction is equally critical.

Baidu Services in Baidu SERPs

In understanding Baidu’s influence in SEO, it’s important to recognize its array of proprietary services that often dominate the search results. For example, services like Baidu Maps are integral for local searches, similar to the role of Google Maps in other regions.

A notable 34.9% of the top 10 search results are dominated by Baidu’s own services, marking a significant increase from 24.7%, as reported in Searchmetrics’ Baidu Ranking Factors Study in 2020**.

Baidu’s own results in Baidu’s SERPs, 2020 vs. 2023:

  • Share of the top 10: 24.70% → 34.91%
  • Share of the top 20: n/a → 24.91%
  • Share of position #1: 39.00% → 60.13%

This dominance extends to 60.13% of first-place positions, up from 39%.

[Chart: share of top rankings claimed by Baidu’s own services, 2020 vs. 2023. Image by author, December 2023.]

This data isn’t just informative; it’s a clear directive for SEO experts to recalibrate their strategies in China’s unique digital space.

Baidu’s prioritization of its platforms, from Baike to Wenku, signifies more than a preference – it’s a strategic move to retain users within its ecosystem.

[Chart: the most important Baidu services ranking in Baidu’s SERPs. Image by author, December 2023.]

Baidu Baike, their version of Wikipedia, stands out for its heavily moderated content, ensuring quality but also presenting a challenge for content creators.

The Q&A platform Baidu Zhidao, akin to Quora, and Baidu Wenku, a comprehensive file-sharing service, also frequently appear in search results, reflecting Baidu’s unique algorithm preferences.

These platforms, especially Wenku, tend to have a more prominent presence in Baidu’s SERPs compared to similar platforms in Google’s ecosystem, underscoring the tailored approach Baidu takes in meeting its users’ search needs.

China SEO experts like Stephanie Qian (of The Egg Company) and Veronique Duong (of Rankwell) highlight the potential of leveraging these high-authority domains for enhanced visibility.

This isn’t just a shift in Baidu’s SERPs; it’s a new playbook for Baidu’s SEO success in 2024.

The Unique SEO Landscape In China

Navigating China’s SEO landscape involves understanding unique factors beyond typical SEO strategies. Central to this is China’s rigorous internet regulation, the Great Firewall of China, which aims to shield its populace from content considered harmful.

This leads to slower load times for sites hosted outside China due to content scanning and potential blocking. Furthermore, websites on servers flagged for illegal content risk being completely inaccessible in mainland China.

Baidu, the dominant search engine in China, primarily serves the mainland’s Mandarin-speaking audience, favoring content in Simplified Chinese. This contrasts with the Traditional Chinese used in Taiwan and Hong Kong.

Although Baidu indexes global content, its algorithm shows a clear preference for Simplified Chinese, a crucial consideration for SEO targeting this region.

Regarding market share, our study counters the narrative of Bing overtaking Baidu.

In the Chinese market, Baidu remains the primary source of organic traffic for our B2B clients in China, contributing around 70% of it, while Bing-China accounts for about 20%, based on those clients’ analytics data.

This contradicts reports based on StatCounter data – StatCounter’s tracking code is installed on only 0.01% of top-ranking pages on Baidu and, per BuiltWith, on just 946 websites.

In-Depth Analysis Of 2024 Baidu Ranking Factors

Domain And URL Structures

The findings paint a clear picture: Baidu’s ranking algorithm shows a distinct preference for certain TLDs and URL structures, with a notable lean towards Chinese TLDs and simplified, linguistically uniform URLs.

For global clients targeting the Chinese market, adapting to these preferences is key.

TLDs: The Rise Of Chinese Top-Level Domains

The distribution of Top-Level Domains (TLDs) among Baidu’s top-ranking results shows a clear preference:

[Chart: top TLDs in Baidu’s SERPs, excluding Baidu.com subdomains. Image from author, December 2023.]
  • .com domains lead with 72.59%.
  • .cn domains have seen a significant rise, from 3.8% in 2020 (via Searchmetrics) to 14.06% in 2023.
  • .com.cn follows with an increase from 5.5% in 2020 to 6.55%.

This upward trend for Chinese TLDs, notably .cn, suggests their growing importance as a potential ranking factor for 2024.

[Chart: percentage of ranking URLs per position from a .cn domain. Image from author, December 2023.]

Subdomains and URL Structures

A majority of ranking pages, 58.42%, are found on a ‘www’ subdomain.

Interestingly, URLs with Chinese characters are rare, constituting only 0.8% of ranking URLs and even fewer in domain names, at just 0.0035%.

[Chart: rankings in Baidu’s top 20 from domains containing Chinese characters in their names. Image from author, December 2023.]

Stephanie Qian from The Egg Company comments,

“Baidu’s official stance discourages the use of Chinese characters in URLs, dispelling myths about their potential ranking benefits.”

URL Length and Language Indicators

Contrary to the belief that shorter URLs rank better on Baidu, our study found the average URL length of well-ranking pages to be 48.25 characters, with 2.3 folders/directories.

This finding suggests that the internal linking structure might play a more crucial role than URL length or proximity to the root domain.
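
As a rough illustration of how these two URL metrics can be measured, here is a minimal TypeScript sketch; the function name and example URL are invented for illustration:

```typescript
// Minimal sketch: measuring the URL metrics reported in the study.
// The averages in the comments come from the study; the rest is assumption.
function urlMetrics(url: string): { length: number; depth: number } {
  const u = new URL(url);
  // Count non-empty path segments; note the last segment may be a file name.
  const segments = u.pathname.split("/").filter((s) => s.length > 0);
  return {
    length: url.length, // study average: 48.25 characters
    depth: segments.length, // study average: 2.3 folders/directories
  };
}

console.log(urlMetrics("https://example.cn/products/widgets/blue-widget.html"));
// -> { length: 52, depth: 3 }
```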

Further, only 2.3% of top-ranking pages use Chinese language indicators in their URLs (such as a ‘cn.’ subdomain or a ‘/cn/’ folder), supporting the narrative that Baidu favors monolingual Chinese websites.

This insight is particularly relevant for multi-lingual international websites aiming to optimize for Baidu.

Onpage Best Practices For Chinese SEO

For Baidu SEO in 2024, it’s not just about including keywords but strategically placing them within well-structured, relevant content. This approach aligns with modern SEO practices where user experience and content relevance reign supreme.

Title Tags And Meta-Descriptions

The average length of title tags on top-ranking pages is 25 Chinese characters, while meta-descriptions average 86 characters. These lengths ensure visibility in Baidu’s SERPs without being truncated.

Interestingly, 36% of top-ranking pages use exact match keywords in the title tags, a figure that rises to 54.4% for more competitive short-head keywords.

[Chart: presence of the exact match keyword in title tags of top-ranking pages in Baidu organic search. Image from author, December 2023.]
Exact match keyword usage in title tags, by keyword set:

  • Whole keyword set: 36% (correlation score -0.1)
  • Short-head keywords: 54.4% (correlation score -0.17)
  • Mid-tail keywords: 41.7% (correlation score -0.14)
  • Long-tail keywords: 18.6% (correlation score -0.02)

For meta-descriptions, 22.2% of top-ranking pages include the exact match keyword, increasing to 34.4% for short-head keywords.

The positioning of the keyword also matters: it’s typically at the front of the title tag but around the 10th position in meta-descriptions.
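
For readers wondering where scores like the -0.17 above come from: correlation studies of this kind typically use Spearman rank correlation between the SERP position (1–20) and a factor value, so a factor that is more common at better (numerically lower) positions produces a negative score. Below is a minimal sketch of that standard method; the study’s exact computation may differ:

```typescript
// Assign average ranks to values (ties share the mean of their rank range).
function ranks(values: number[]): number[] {
  const sorted = values.map((v, i) => [v, i] as const).sort((a, b) => a[0] - b[0]);
  const r = new Array(values.length).fill(0);
  let i = 0;
  while (i < sorted.length) {
    let j = i;
    while (j + 1 < sorted.length && sorted[j + 1][0] === sorted[i][0]) j++;
    const avgRank = (i + j) / 2 + 1; // average rank for tied values
    for (let k = i; k <= j; k++) r[sorted[k][1]] = avgRank;
    i = j + 1;
  }
  return r;
}

// Spearman correlation = Pearson correlation computed on the ranks.
function spearman(x: number[], y: number[]): number {
  const rx = ranks(x), ry = ranks(y);
  const mean = (a: number[]) => a.reduce((s, v) => s + v, 0) / a.length;
  const mx = mean(rx), my = mean(ry);
  let num = 0, dx = 0, dy = 0;
  for (let k = 0; k < x.length; k++) {
    num += (rx[k] - mx) * (ry[k] - my);
    dx += (rx[k] - mx) ** 2;
    dy += (ry[k] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}

// Positions 1..10 and whether each result had the exact keyword in its title:
const position = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const hasKeyword = [1, 1, 0, 1, 0, 0, 1, 0, 0, 0];
console.log(spearman(position, hasKeyword).toFixed(2)); // -> "-0.57"
```

Here the keyword is present mostly near the top, so the score is negative, i.e., the factor correlates with better rankings.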

Headings: Hierarchy And Keyword Placement

Headings play a vital role in Baidu SEO:

  • 71.2% of top-ranking pages correctly use one H1 tag.
  • Nearly half (47.8%) use hierarchical headline structures effectively.
  • 21.1% incorporate the exact match keyword in H1, usually around the 4th or 5th position.
  • H2 and H3 tags are used by 44% and 46% of top-ranking pages, respectively, averaging around nine headlines each.
  • Lesser used H4 headlines appear in 22.4% of top-ranking pages, while H5 and H6 are used by less than 10%.
[Chart: headline usage on Chinese websites. Image from author, December 2023.]

Content And Keyword Density

Content length is a significant factor, with top-ranking pages averaging 4929 characters, although the median is 3147 characters.

About 85% of the content is in Chinese Characters, a vital benchmark for international companies localizing content.

Exact match keywords are used in the content of 49% of top-ranking pages, with the likelihood increasing for more competitive keywords (57% for mid-tail and 66% for short-head keywords).

However, keyword density is less than 1% on average, indicating a move away from over-optimized, spammy content.

The first appearance of the keyword is often within the first 18% of the content.
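
Both of these content metrics are simple to compute. Here is an illustrative sketch; counting by characters rather than words is an assumption on my part, but it suits Chinese text, which has no word boundaries:

```typescript
// Sketch: keyword density (keyword characters as a share of all characters)
// and the relative position of the keyword's first appearance.
function contentMetrics(text: string, keyword: string) {
  let occurrences = 0;
  const firstAt = text.indexOf(keyword); // -1 if the keyword never appears
  for (let i = firstAt; i !== -1; i = text.indexOf(keyword, i + keyword.length)) {
    occurrences++;
  }
  return {
    // Study average for top-ranking pages: under 1%.
    densityPct: (occurrences * keyword.length * 100) / text.length,
    // Study observation: often within the first 18% of the content.
    firstAppearancePct: firstAt === -1 ? null : (firstAt * 100) / text.length,
  };
}

console.log(contentMetrics("百度搜索引擎优化指南。本文介绍搜索引擎优化的基本方法。", "搜索引擎优化"));
```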

The Role Of Images

Images are crucial. More than 94% of top-ranking pages feature an average of 27.5 images; 55.4% use alt-tags, and 12.8% include the keyword in at least one alt-tag.

Internal Links

Interestingly, using the keyword in the anchor text of outbound links does not appear to dampen ranking potential, as 20.3% of top-ranking pages do so.

Backlinks: A Key Factor In Baidu’s SEO Rankings

In addition to on-page SEO elements, backlinks play a crucial role in determining rankings on Baidu.

Our analysis, backed by data from DataForSEO and Majestic, reveals a strong positive correlation between the number of referring domains and improved rankings.

Quantity And Quality Of Referring Domains

The quantity of referring domains significantly impacts Baidu rankings. Websites with a higher number of referring domains generally achieve better positions.

[Chart: the number of referring domains correlated with better rankings on Baidu in 2020 and still does in 2023.]

Interestingly, data shows that even sites with fewer referring domains can rank well. The 50 lowest-ranked domains had an average of only 1.1 linking domains according to DataForSEO, and 1.3 as per Majestic’s data.

This indicates that while the number of backlinks is important, there are opportunities for sites with fewer links to still perform well on Baidu.

The Impact Of Link Quality

Link quality is equally crucial.

There’s a strong correlation between high-quality links (as measured by Majestic’s Trust Flow/Citation Flow and DataForSEO Rank) and better rankings on Baidu.

Sites with higher-quality links tend to rank more favorably.

[Chart: higher Majestic Trust Flow scores correlate with better rankings on Baidu. Screenshot from Majestic and DataForSEO, December 2023.]

Additionally, top-ranking sites typically have a lower DataForSEO Backlinks Spam Score, underlining the importance of not just the quantity but the quality and trustworthiness of backlinks.

These insights highlight that a well-rounded backlink profile, combining a healthy number of links with high quality, is essential for achieving and maintaining high rankings on Baidu.

It’s a balance of garnering enough attention to be seen as authoritative yet ensuring that attention comes from reputable, high-quality sources.

This approach aligns with broader SEO best practices, emphasizing the importance of building a natural and reputable backlink profile for sustained SEO success.

Emerging Trends And Practical SEO Strategies For Baidu

As SEO strategies evolve, understanding the impact of specific elements like tags, security protocols, and social media integrations is crucial, especially for Baidu.

The analysis sheds light on these advanced aspects.

Tag Usage And Structure

  • List Usage: A significant 86.5% of top-ranking pages employ <ul> lists, averaging 10.8 lists per page with 7.9 points each. Interestingly, 12.9% incorporate the target keyword within these lists.
  • Tables: 18.2% of top-ranking pages use <table> tags, but a mere 3.1% include the target keyword within these tables, suggesting tables are less about keyword placement and more about structured data presentation.
  • Emphasizing Tags: 9.7% of top-ranking pages use emphasizing tags like <strong>, <em>, and <i>, indicating a selective approach to their usage.

Technical SEO And Security

  • HTTPS: Now an official ranking factor for Baidu, the adoption of HTTPS has risen from 55% in 2020 (Searchmetrics’ study) to 69.6% among top-ranking pages.
  • Mobile Optimization: A significant trend is the decline in referencing separate mobile pages, from 35% in 2020 to just 10.3% today, reflecting a shift towards responsive design.
  • Google Tag Manager: Usage among top-ranking pages has decreased from 8% in 2020 to only 2.5%, possibly reflecting localization preferences in tools and technologies.

Hreflang And International SEO

  • Hreflang Usage: Just 1.5% of top-ranking pages utilize hreflang, with experts like Dan Taylor and Owain Lloyd-Williams noting that Baidu does not support this tag. Simon Lesser’s observation highlights the dominance of domestic Chinese-only sites on Baidu.

Emerging Trends In Code And Markup

  • HTML5 Adoption: From less than 30% in 2020, HTML5 usage among top-ranking pages has jumped to 53.2%.
  • Schema.org: Despite Baidu’s official non-support, 11% of top-ranking pages implement Schema.org structured data, with expert Owain Lloyd-Williams suggesting its potential benefits, while Adam Di Frisco advises caution due to Baidu’s current stance.

Social Media Integration

  • Chinese Social Media: 60% of top-ranking pages include Chinese social media integrations, indicating its significance in Baidu’s SEO.
  • Western Social Media: In contrast, only 2% integrate Western platforms like Facebook or YouTube, reflecting Baidu’s regional focus.

These findings underscore the evolving complexities of Baidu SEO. While some global best practices apply, others require adaptation for this unique market.

The strategic use of tags, embracing new technologies like HTML5, and localizing social media integrations emerge as pivotal elements for achieving top rankings on Baidu.

Beyond The Study: Other Influential Factors In Baidu SEO

In Baidu SEO, certain key ranking factors, while not directly measurable, are critical.

Experienced Baidu SEO professionals recognize the importance of user signals, like click-through rates in the SERPs, as influential for rankings. This aligns with insights from Google’s antitrust trial documents, suggesting a similar approach by Baidu.

Equally important is Baidu’s advancement in AI, especially with Baidu ERNIE, surpassing Google’s BERT in understanding Chinese language nuances.

This suggests that Baidu uses advanced NLP in its content analysis algorithms, making techniques like WDF-IDF, tailored for Chinese, vital for creating high-quality content that resonates with both users and Baidu’s AI-driven analysis.

Debunking 4 Common Baidu SEO Myths

Let’s debunk some of the prevalent Baidu SEO myths with insights from our recent study.

Myth 1: Necessity Of A .cn Domain

The common belief is that without a .cn domain, success on Baidu is unattainable.

However, our study shows that .com domains actually dominate Baidu’s search results. While there is a growing trend of Chinese TLDs in top SERPs, the idea that a .cn domain is essential is more myth than reality.

Myth 2: ICP License As A Ranking Requirement

Another myth is that an ICP (Internet Content Provider) license is mandatory for ranking on Baidu.

Contrary to this belief, less than half (48%) of the top-ranking pages have an ICP reference. This is corroborated by our experience with client websites without licenses still achieving rankings.

[Chart: top rankings on Baidu with an ICP license referenced in the footer.]

Myth 3: Only Mainland China-Hosted Websites Rank

The misconception that only websites hosted in Mainland China can rank on Baidu is widespread. In reality, any website accessible in China can rank.

However, it’s worth noting that websites hosted outside of China may experience slower loading speeds, which could impact rankings.

Myth 4: Meta Keywords As A Ranking Factor

Many believe that meta keywords are still a relevant ranking factor for Baidu.

Despite this belief, Baidu’s official stance, as noted by spokesperson Lee, is that meta keywords are no longer considered in their ranking algorithm.

These insights hopefully help clear the air around Baidu SEO. It is important to adapt to factual strategies rather than following outdated myths.

Conclusion: Navigating The Future Of Baidu SEO

As we demystify the landscape of Baidu SEO for 2024, it’s evident that success hinges on a blend of embracing new trends and dismissing outdated myths.

From recognizing the dominance of .com domains, to the rise of .cn and .com.cn TLDs, to understanding the non-essential (but recommended) nature of ICP licenses and the reduced emphasis on meta keywords, SEO strategies must evolve with these insights.

The rise of AI, the significance of user signals, and the nuanced approach to content and backlinks underscore the need for sophisticated, data-driven strategies.

As Baidu continues to refine its algorithms, SEO professionals must adapt, ensuring their tactics not only align with current best practices but are also poised to leverage future advancements.

This journey through Baidu’s SEO terrain equips practitioners with the knowledge and tools to navigate the complexities of ranking on China’s leading search engine, setting the stage for success in the dynamic world of digital marketing.

*We invite you to read the full Baidu SEO Ranking Factors Study we created for you and draw your own conclusions.

**You can also read and compare to the Searchmetrics’ Baidu Ranking Factors Study from 2020.


The Three Pillars Of SEO: Authority, Relevance, And Experience

If there’s one thing we SEO pros are good at, it’s making things complicated.

That’s not necessarily a criticism.

Search engine algorithms, website coding and navigation, choosing and evaluating KPIs, setting content strategy, and more are highly complex tasks involving lots of specialized knowledge.

But as important as those things all are, at the end of the day, there is really just a small set of things that will make most of the difference in your SEO success.

In SEO, there are really just three things – three pillars – that are foundational to achieving your SEO goals.

  • Authority.
  • Relevance.
  • Experience (of the users and bots visiting the site).

Nutritionists tell us our bodies need protein, carbohydrates, and fats in the right proportions to stay healthy. Neglect any of the three, and your body will soon fall into disrepair.

Similarly, a healthy SEO program involves a balanced application of authority, relevance, and experience.

Authority: Do You Matter?

In SEO, authority refers to the importance or weight given to a page relative to other pages that are potential results for a given search query.

Modern search engines such as Google use many factors (or signals) when evaluating the authority of a webpage.

Why does Google care about assessing the authority of a page?

For most queries, there are thousands or even millions of pages available that could be ranked.

Google wants to prioritize the ones that are most likely to satisfy the user with accurate, reliable information that fully answers the intent of the query.

Google cares about serving users the most authoritative pages for their queries because users who are satisfied by the pages they click through to are more likely to use Google again, and thus get more exposure to Google’s ads, the primary source of its revenue.

Authority Came First

Assessing the authority of webpages was the first fundamental problem search engines had to solve.

Some of the earliest search engines relied on human evaluators, but as the World Wide Web exploded, that quickly became impossible to scale.

Google overtook all its rivals because its creators, Larry Page and Sergey Brin, developed the idea of PageRank, using links from other pages on the web as weighted citations to assess the authoritativeness of a page.

Page and Brin realized that links were an already-existing system of constantly evolving polling, in which other authoritative sites “voted” for pages they saw as reliable and relevant to their users.

Search engines use links much like we might treat scholarly citations; the more relevant scholarly papers that cite a source document, the better.

The relative authority and trustworthiness of each of the citing sources come into play as well.

So, of our three fundamental categories, authority came first because it was the easiest to crack, given the ubiquity of hyperlinks on the web.

The other two, relevance and user experience, would be tackled later, as machine learning/AI-driven algorithms developed.

Links Still Primary For Authority

The big innovation that made Google the dominant search engine in a short period was that it used an analysis of links on the web as a ranking factor.

This started with a paper by Larry Page and Sergey Brin called The Anatomy of a Large-Scale Hypertextual Web Search Engine.

The essential insight behind this paper was that the web is built on the notion of documents inter-connected with each other via links.

Since putting a link on your site to a third-party site might cause a user to leave your site, there was little incentive for a publisher to link to another site unless it was really good and of great value to their site’s users.

In other words, linking to a third-party site acts a bit like a “vote” for it, and each vote could be considered an endorsement, endorsing the page the link points to as one of the best resources on the web for a given topic.

Then, in principle, the more votes you get, the better and the more authoritative a search engine would consider you to be, and you should, therefore, rank higher.

Passing PageRank

A significant piece of the initial Google algorithm was based on the concept of PageRank, a system for evaluating which pages are the most important based on scoring the links they receive.

So, a page that has large quantities of valuable links pointing to it will have a higher PageRank and will, in principle, be likely to rank higher in the search results than other pages without as high a PageRank score.

When a page links to another page, it passes a portion of its PageRank to the page it links to.

Thus, pages accumulate more PageRank based on the number and quality of links they receive.
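
To make the mechanics concrete, here is a toy power-iteration sketch of the original idea: each page splits its current score among its outlinks, plus a damping term. This follows the published 1998 formulation, not Google’s production system, and the dangling-page handling is deliberately simplified:

```typescript
// Toy PageRank via power iteration. links[i] lists the pages that page i links to.
function pageRank(links: number[][], iterations = 50, d = 0.85): number[] {
  const n = links.length;
  let rank: number[] = new Array(n).fill(1 / n);
  for (let it = 0; it < iterations; it++) {
    const next: number[] = new Array(n).fill((1 - d) / n); // damping term
    for (let page = 0; page < n; page++) {
      const outlinks = links[page];
      if (outlinks.length === 0) continue; // simplification: dangling pages pass nothing on
      const share = (d * rank[page]) / outlinks.length; // each link passes an equal share
      for (const target of outlinks) next[target] += share;
    }
    rank = next;
  }
  return rank;
}

// Page 0 links to pages 1 and 2; pages 1 and 2 each link back to page 0.
console.log(pageRank([[1, 2], [0], [0]]).map((r) => r.toFixed(3)));
// -> [ '0.486', '0.257', '0.257' ]: page 0 receives two full "votes" and scores highest.
```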

Not All Links Are Created Equal

So, more votes are better, right?

Well, that’s true in theory, but it’s a lot more complicated than that.

PageRank scores range from a base value of one to values that likely exceed trillions.

Higher PageRank pages can have a lot more PageRank to pass than lower PageRank pages. In fact, a link from one page can easily be worth more than one million times a link from another page.

But the PageRank of the source page of a link is not the only factor in play.

Google also looks at the topic of the linking page and the anchor text of the link, but those have to do with relevance and will be referenced in the next section.

It’s important to note that Google’s algorithms have evolved a long way from the original PageRank thesis.

The way that links are evaluated has changed in significant ways – some of which we know, and some of which we don’t.

What About Trust?

You may hear many people talk about the role of trust in search rankings and in evaluating link quality.

For the record, Google says it doesn’t have a concept of trust it applies to links (or ranking), so you should take those discussions with many grains of salt.

These discussions began because of a Yahoo patent on the concept of TrustRank.

The idea was that if you started with a seed set of hand-picked, highly trusted sites and then counted the number of clicks it took you to go from those sites to yours, the fewer clicks, the more trusted your site was.
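
Conceptually, that is a breadth-first search outward from the seed set, where a site’s “trust distance” is the minimum number of link hops from any seed. A sketch of the idea follows; the graph and site names are invented, and this mirrors the patent’s concept rather than any live system:

```typescript
// Sketch: TrustRank-style distance from a hand-picked seed set via BFS.
function trustDistance(graph: Map<string, string[]>, seeds: string[]): Map<string, number> {
  const dist = new Map<string, number>();
  const queue: string[] = [];
  for (const seed of seeds) {
    dist.set(seed, 0); // seeds are maximally trusted
    queue.push(seed);
  }
  while (queue.length > 0) {
    const site = queue.shift()!;
    for (const linked of graph.get(site) ?? []) {
      if (!dist.has(linked)) {
        dist.set(linked, dist.get(site)! + 1); // one more hop from the seeds
        queue.push(linked);
      }
    }
  }
  return dist; // sites absent from the map are unreachable from any seed
}

const web = new Map<string, string[]>([
  ["trusted-news.example", ["blog.example"]],
  ["blog.example", ["your-site.example"]],
  ["your-site.example", []],
]);
console.log(trustDistance(web, ["trusted-news.example"]));
// -> Map { 'trusted-news.example' => 0, 'blog.example' => 1, 'your-site.example' => 2 }
```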

Google has long said it doesn’t use this type of metric.

However, in 2013, Google was granted a patent related to evaluating the trustworthiness of links. We should note, though, that the existence of a granted patent does not mean it is used in practice.

For your own purposes, however, if you want to assess a site’s trustworthiness as a link source, using the concept of trusted links is not a bad idea.

If a site does any of the following, then it probably isn’t a good source for a link:

  • It sells links to others.
  • It has less than great content.
  • It otherwise doesn’t appear reputable.

Google may not be calculating trust the way you do in your analysis, but chances are good that some other aspect of its system will devalue that link anyway.

Fundamentals Of Earning & Attracting Links

Now that you know that obtaining links to your site is critical to SEO success, it’s time to start putting together a plan to get some.

The key to success is understanding that Google wants this entire process to be holistic.

Google actively discourages, and in some cases punishes, schemes to get links in an artificial way. This means certain practices are seen as bad, such as:

  • Buying links for SEO purposes.
  • Going to forums and blogs and adding comments with links back to your site.
  • Hacking people’s sites and injecting links into their content.
  • Distributing poor-quality infographics or widgets that include links back to your pages.
  • Offering discount codes or affiliate programs as a way to get links.
  • And many other schemes where the resulting links are artificial in nature.

What Google really wants is for you to make a fantastic website and promote it effectively, with the result that you earn or attract links.

So, how do you do that?

Who Links?

The first key insight is understanding who it is that might link to the content you create.

Here is a chart that profiles the major groups of people in any given market space (based on research by the University of Oklahoma):

[Chart: the major market adoption groups, from innovators and early adopters through the early and late majority to laggards.]

Who do you think are the people that might implement links?

It’s certainly not the laggards, and it’s also not the early or late majority.

It’s the innovators and early adopters. These are the people who write on media sites or have blogs and might add links to your site.

There are also other sources of links, such as locally oriented sites – for example, the local chamber of commerce or local newspapers.

You might also find some opportunities with colleges and universities if they have pages that relate to some of the things you’re doing in your market space.

Relevance: Will Users Swipe Right On Your Page?

You have to be relevant to a given topic.

Think of every visit to a page as an encounter on a dating app. Will users “swipe right” (thinking, “This looks like a good match!”)?

If you have a page about Tupperware, it doesn’t matter how many links you get – you’ll never rank for queries related to used cars.

This defines a limitation on the power of links as a ranking factor, and it shows how relevance also impacts the value of a link.

Consider a page on a site that is selling a used Ford Mustang. Imagine that it gets a link from Car and Driver magazine. That link is highly relevant.

Also, think of this intuitively. Is it likely that Car and Driver magazine has some expertise related to Ford Mustangs? Of course it does.

In contrast, imagine a link to that Ford Mustang from a site that usually writes about sports. Is the link still helpful?

Probably, but not as helpful because there is less evidence to Google that the sports site has a lot of knowledge about used Ford Mustangs.

In short, the relevance of the linking page and the linking site impacts how valuable a link might be considered.

What are some ways that Google evaluates relevance?

The Role Of Anchor Text

Anchor text is another aspect of links that matters to Google.

The anchor text helps Google confirm what the content on the page receiving the link is about.

For example, if the anchor text is the phrase “iron bathtubs” and the page has content on that topic, the anchor text, plus the link, acts as further confirmation that the page is about that topic.

Thus, links help Google evaluate both the page’s relevance and its authority.

Be careful, though: you don’t want to aggressively obtain links to your page that all use your main keyphrase as the anchor text.

Google also looks for signs that you are manually manipulating links for SEO purposes.

One of the simplest indicators is if your anchor text looks manually manipulated.

Internal Linking

There is growing evidence that Google uses internal linking to evaluate how relevant a site is to a topic.

Properly structured internal links connecting related content are a way of showing Google that you have the topic well-covered, with pages about many different aspects.

By the way, anchor text is as important when creating internal links as it is for external, inbound links.

Your overall site structure is related to internal linking.

Think strategically about where your pages fall in your site hierarchy. If it makes sense for users, it will probably be useful to search engines.

The Content Itself

Of course, the most important indicator of the relevance of a page has to be the content on that page.

Most SEO professionals know that assessing content’s relevance to a query has become way more sophisticated than merely having the keywords a user is searching for.

Due to advances in natural language processing and machine learning, search engines like Google have vastly increased their competence in being able to assess the content on a page.

What are some things Google likely looks for in determining what queries a page should be relevant for?

  • Keywords: While the days of keyword stuffing as an effective SEO tactic are (thankfully) way behind us, having certain words on a page still matters. My company has numerous case studies showing that merely adding key terms that are common among top-ranking pages for a topic is often enough to increase organic traffic to a page.
  • Depth: The top-ranking pages for a topic usually cover the topic at the right depth. That is, they have enough content to satisfy searchers’ queries and/or are linked to/from pages that help flesh out the topic.
  • Structure: Structural elements like H1, H2, and H3, bolded topic headings, and schema-structured data may help Google better understand a page’s relevance and coverage.

What About E-E-A-T?

E-E-A-T is a Google initialism standing for Experience-Expertise-Authoritativeness-Trustworthiness.

It is the framework of the Search Quality Rater Guidelines, a document used to train Google’s Search Quality Raters.

Search Quality Raters evaluate pages that rank in search for a given topic using defined E-E-A-T criteria to judge how well each page serves the needs of a search user who visits it as an answer to their query.

Those ratings are accumulated in aggregate and used to help tweak the search algorithms. (They are not used to affect the rankings of any individual site or page.)

Of course, Google encourages all site owners to create content that makes a visitor feel that it is authoritative, trustworthy, and written by someone with expertise or experience appropriate to the topic.

The main thing to keep in mind is that the more YMYL (Your Money or Your Life) your site is, the more attention you should pay to E-E-A-T.

YMYL sites are those whose main content addresses things that might have an effect on people’s well-being or finances.

If your site is YMYL, you should go the extra mile in ensuring the accuracy of your content and in showing that you have qualified experts writing it.

Building A Content Marketing Plan

Last but certainly not least, create a real plan for your content marketing.

Don’t just suddenly start doing a lot of random stuff.

Take the time to study what your competitors are doing so you can invest your content marketing efforts in a way that’s likely to provide a solid ROI.

One approach is to pull their backlink profiles using tools built for that purpose.

With this information, you can see what types of links they’ve been getting and, based on that, figure out what links you need to get to beat them.

Take the time to do this exercise and also to map which links are going to which pages on the competitors’ sites, as well as what each of those pages rank for.

Building out this kind of detailed view will help you scope out your plan of attack and give you some understanding of what keywords you might be able to rank for.

It’s well worth the effort!

In addition, study the competitor’s content plans.

Learn what they are doing and carefully consider what you can do that’s different.

Focus on developing a clear differentiation in your content for topics that are in high demand with your potential customers.

This is another investment of time that will be very well spent.

Experience

As we traced above, Google started by focusing on ranking pages by authority, then found ways to assess relevance.

The third evolution of search was evaluating the site and page experience.

This actually has two separate but related aspects: the technical health of the site and the actual user experience.

We say the two are related because a site that is technically sound is going to create a good experience for both human users and the crawling bots that Google uses to explore, understand a site, and add pages to its index, the first step to qualifying for being ranked in search.

In fact, many SEO pros (and I’m among them) prefer to speak of SEO not as Search Engine Optimization but as Search Experience Optimization.

Let’s talk about the human (user) experience first.

User Experience

Google realized that authoritativeness and relevancy, as important as they are, were not the only things users were looking for when searching.

Users also want a good experience on the pages and sites Google sends them to.

What is a “good user experience”? It includes at least the following:

  • The page the searcher lands on is what they would expect to see, given their query. No bait and switch.
  • The content on the landing page is highly relevant to the user’s query.
  • The content is sufficient to answer the intent of the user’s query but also links to other relevant sources and related topics.
  • The page loads quickly, the relevant content is immediately apparent, and page elements settle into place quickly (all aspects of Google’s Core Web Vitals – see the measurement sketch just below this list).
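
As a concrete example of that last point, Core Web Vitals can be measured in the field with Google’s open-source web-vitals package (npm i web-vitals). A minimal sketch; the reporting endpoint is a placeholder:

```typescript
// Field measurement of Core Web Vitals using the web-vitals package.
import { onLCP, onCLS, onINP } from "web-vitals";

function report(metric: { name: string; value: number }): void {
  // Beacon each metric to your own analytics endpoint (placeholder URL).
  navigator.sendBeacon("/analytics/vitals", JSON.stringify(metric));
}

onLCP(report); // Largest Contentful Paint: how quickly the main content appears
onCLS(report); // Cumulative Layout Shift: how much elements move while loading
onINP(report); // Interaction to Next Paint: responsiveness to user input
```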

In addition, many of the suggestions above about creating better content also apply to user experience.

Technical Health

In SEO, the technical health of a site is how smoothly and efficiently it can be crawled by Google’s search bots.

Broken connections or even things that slow down a bot’s progress can drastically affect the number of pages Google will index and, therefore, the potential traffic your site can qualify for from organic search.

The practice of maintaining a technically healthy site is known as technical SEO.

The many aspects of technical SEO are beyond the scope of this article, but you can find many excellent guides on the topic, including Search Engine Journal’s Advanced Technical SEO.

In summary, Google wants to rank pages that it can easily find, that satisfy the query, and that make it as easy as possible for the searcher to identify and understand what they were searching for.

What About the Google Leak?

You’ve probably heard by now about the leak of Google documents containing thousands of labeled API calls and many thousands of attributes for those data buckets.

Many assume that these documents reveal the secrets of the Google algorithms for search. But is that a warranted assumption?

No doubt, perusing the documents is interesting and reveals many types of data that Google may store or may have stored in the past. But some significant unknowns about the leak should give us pause.

  • As Google has pointed out, we lack context around these documents and how they were used internally by Google, and we don’t know how out of date they may be.
  • It is a huge leap from “Google may collect and store data point x” to “therefore data point x is a ranking factor.”
  • Even if we assume the document does reveal some things that are used in search, we have no indication of how they are used or how much weight they are given.

Given those caveats, it is my opinion that while the leaked documents are interesting from an academic point of view, they should not be relied upon for actually forming an SEO strategy.

Putting It All Together

Search engines want happy users who will come back to them again and again when they have a question or need.

They create and sustain happiness by providing the best possible results that satisfy that question or need.

To keep their users happy, search engines must be able to understand and measure the relative authority of webpages for the topics they cover.

When you create content that is highly useful (or engaging or entertaining) to visitors – and when those visitors find your content reliable enough that they would willingly return to your site or even seek you out above others – you’ve gained authority.

Search engines work hard to continually improve their ability to match the human quest for trustworthy authority.

As we explained above, that same kind of quality content is key to earning the kinds of links that assure the search engines you should rank highly for relevant searches.

That can be either content on your site that others want to link to or content that other quality, relevant sites want to publish, with appropriate links back to your site.

Focusing on these three pillars of SEO – authority, relevance, and experience – will increase the opportunities for your content and make link-earning easier.

You now have everything you need to know for SEO success, so get to work!


Google’s Web Crawler Fakes Being “Idle” To Render JavaScript

In a recent episode of the Search Off The Record podcast, it was revealed that Google’s rendering system now pretends to be “idle” to trigger certain JavaScript events and improve webpage rendering.

The podcast features Zoe Clifford from Google’s rendering team, who discussed how the company’s web crawlers deal with JavaScript-based sites.

This revelation is insightful for web developers who use such methods to defer content loading.

Google’s “Idle” Trick

Googlebot simulates “idle” states during rendering, which triggers JavaScript events like requestIdleCallback.

Developers use this function to defer loading less critical content until the browser is free from other tasks.

Before this change, Google’s rendering process was so efficient that the browser was always active, causing some websites to fail to load important content.

Clifford explained:

“There was a certain popular video website which I won’t name…which deferred loading any of the page contents until after requestIdleCallback was fired.”

Since the browser was never idle, this event wouldn’t fire, preventing much of the page from loading properly.

Faking Idle Time To Improve Rendering

Google implemented a system where the browser pretends to be idle periodically, even when it’s busy rendering pages.

This tweak ensures that idle callbacks are triggered correctly, allowing pages to fully load their content for indexing.
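
For illustration, here is the deferral pattern under discussion, written defensively: non-critical work is scheduled with requestIdleCallback plus a timeout, and a fallback covers environments that lack the API. The loadSecondaryContent function is a hypothetical stand-in for a site’s own code:

```typescript
// Placeholder for deferred work: hydrate comments, related articles, widgets...
function loadSecondaryContent(): void {
  // ...fetch and render non-critical content here...
}

if ("requestIdleCallback" in window) {
  // The timeout option forces the callback after 2s even if the browser
  // (or a rendering crawler) never reports being idle.
  requestIdleCallback(() => loadSecondaryContent(), { timeout: 2000 });
} else {
  // Fallback for environments without requestIdleCallback (e.g. older Safari).
  setTimeout(loadSecondaryContent, 200);
}
```

The key design choice is to defer only non-critical content: the unnamed video site’s problem came from gating the entire page on an idle callback that never fired.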

Importance Of Error Handling

Clifford emphasized the importance of developers implementing graceful error handling in their JavaScript code.

Unhandled errors can lead to blank pages, redirects, or missing content, negatively impacting indexing.

She advised:

“If there is an error, I just try and handle it as gracefully as possible…web development is hard stuff.”
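
In practice, that advice amounts to wrapping non-essential enhancements so a failure degrades the page rather than blanking it. A minimal sketch; the function name and selector are hypothetical:

```typescript
// Hypothetical enhancement that might fail (missing element, network error...).
function enhancePage(): void {
  const container = document.querySelector("#related-articles");
  if (!container) throw new Error("related-articles container missing");
  // ...fetch and render related articles into the container...
}

try {
  enhancePage();
} catch (err) {
  // Log for debugging, but leave the core server-rendered content intact
  // so users and crawlers still see the page's primary content.
  console.error("Non-critical enhancement failed:", err);
}
```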

What Does This Mean?

Implications For Web Developers

  • Graceful Error Handling: Implementing graceful error handling ensures pages load as intended, even if certain code elements fail.
  • Cautious Use of Idle Callbacks: While Google has adapted to handle idle callbacks, be wary of over-relying on these functions.

Implications For SEO Professionals

  • Monitoring & Testing: Implement regular website monitoring and testing to identify rendering issues that may impact search visibility.
  • Developer Collaboration: Collaborate with your development team to create user-friendly and search engine-friendly websites.
  • Continuous Learning: Stay updated with the latest developments and best practices in how search engines handle JavaScript, render web pages, and evaluate content.

See also: Google Renders All Pages For Search, Including JavaScript-Heavy Sites

Other Rendering-Related Topics Discussed

The discussion also touched on other rendering-related topics, such as the challenges posed by user agent detection and the handling of JavaScript redirects.

The whole podcast provides valuable insights into web rendering and the steps Google takes to assess pages accurately.


Google’s Indifference To Site Publishers Explained

A publisher named Brandon Saltalamacchia interviewed Google’s SearchLiaison, who offered hope that quality sites hit by Google’s algorithms may soon see their traffic levels bounce back. But that interview, together with a recent Google podcast, reveals deeper issues that may explain why Google seems indifferent to publishers with every update.

Google Search Relations

Google has a team whose job is to communicate how site owners can do well on Google. So it’s not that Googlers themselves are indifferent to site publishers and creatives. Google provides a lot of feedback to publishers, especially through Google Search Console. The area in which Google is indifferent to publishers is directly in search at its most fundamental level.

Google’s algorithms are built on the premise that it has to provide a good user experience and is internally evaluated to that standard. This creates the situation where from Google’s perspective the algorithm is working the way it should. But from the perspective of website publishers Google’s ranking algorithms are failing. Putting a finger on why that’s happening is what this article is about.

Publishers Are Not Even An Afterthought To Google

The interview by Brandon Saltalamacchia comes against the background of many websites having lost traffic due to Google’s recent algorithm updates. From Google’s point of view their algorithms are working fine for users. But the steady feedback from website publishers is no, it’s not working. Google’s response for the past month is that they’re investigating how to improve.

What all of this reveals is that there is a real disconnect between how Google measures how its algorithms are working and how website publishers experience them in the real world. It may surprise most people to learn that this disconnect begins with Google’s mission statement to make information “universally accessible and useful” and ends with the rollout of an algorithm that is tested against metrics that take into account how users experience it but is 100% blind to how publishers experience it.

Some of the complaints about Google’s algorithms:

  • Ranking algorithms for reviews, travel and other topics are favoring big brands over smaller publishers.
  • Google’s decision to firehose traffic at Reddit contributes to the dismantling of the website publishing ecosystem.
  • AI Overviews summarizes web pages and deprives websites of search traffic.

The stated goal of Google’s algorithm decisions is to increase user satisfaction, but the problem with that approach is that website publishers are left out of the equation. Consider this: Google’s Search Quality Rater Guidelines say nothing about checking whether big brands are dominating the search results. Zero.

Website publishers aren’t even an afterthought for Google. Publishers are not considered at any stage of the creation, testing, and rollout of ranking algorithms.

Google Historically Doesn’t Focus On Publishers

A remark by Gary Illyes in a recent Search Off The Record episode indicated that, in Gary’s opinion, Google is all about the user experience, because if search is good for the user, that will trickle down to publishers and be good for them, too.

In the context of Gary explaining whether Google will announce that something is broken in search, Gary emphasized that search relations is focused on the search users and not the publishers who may be suffering from whatever is broken.

John Mueller asked:

“So, is the focus more on what users would see or what site owners would see? Because, as a Search Relations team, we would focus more on site owners. But it sounds like you’re saying, for these issues, we would look at what users would experience.”

Gary Illyes answered:

“So it’s Search Relations, not Site Owners Relations, from Search perspective.”

Google’s Indifference To Publishers

Google’s focus on satisfying search users can, in practice, turn into indifference toward publishers. If you read all the Google patents and research papers related to information retrieval (search technology), the one thing that becomes apparent is that the measure of success is always about the users. The impact on site publishers is consistently ignored. That’s why Google Search is perceived as indifferent to site publishers: publishers have never been part of the search satisfaction equation.

This is something that publishers and Google may not have wrapped their minds around just yet.

Later on in the Search Off The Record podcast, the Googlers specifically discuss how an update is deemed to be working well even if a (relatively) small number of publishers complain that Google Search is broken, because what matters is whether Google perceives that it is doing the right thing from its own perspective.

John said:

“…Sometimes we get feedback after big ranking updates, like core updates, where people are like, “Oh, everything is broken.”

At the 12:06 minute mark of the podcast Gary made light of that kind of feedback:

“Do we? We get feedback like that?”

Mueller responded:

“Well, yeah.”

Then Mueller completed his thought:

“I feel bad for them. I kind of understand that. I think those are the kind of situations where we would look at the examples and be like, “Oh, I see some sites are unhappy with this, but overall we’re doing the right thing from our perspective.”

And Gary responded:

“Right.”

And John asks:

“And then we wouldn’t see it as an issue, right?”

Gary affirmed that Google wouldn’t see it as an issue if a legit publisher loses traffic when overall the algorithm is working as they feel it should.

“Yeah.”

It is precisely that shrugging indifference that a website publisher, Brandon Saltalamacchia, is concerned about and discussed with SearchLiaison in a recent blog post.

Lots of Questions

SearchLiaison asked many questions about how Google could better support content creators, which is notable because Google has a long history of focusing on the user experience while seemingly not also considering the impact on businesses with an online presence.

That’s a good sign from SearchLiaison but not entirely a surprise because unlike most Googlers, SearchLiaison (aka Danny Sullivan) has decades of experience as a publisher so he knows what it’s like on our side of the search box.

It will be interesting if SearchLiaison’s concern for publishers makes it back to Google in a more profound way so that there’s a better understanding that the Search Ecosystem is greater than Google’s users and encompasses website publishers, too. Algorithm updates should be about more than how they impact users, the updates should also be about how they impact publishers.

Hope For Sites That Lost Traffic

Perhaps the most important news from the interview is that SearchLiaison expressed that there may be changes coming over the next few months that will benefit the publishers who have lost rankings over the past few months of updates.

Brandon wrote:

“One main take away from my conversation with Danny is that he did say to hang on, to keep doing what we are doing and that he’s hopeful that those of us building great websites will see some signs of recovery over the coming months.”

Yet despite those promises from Danny, Brandon didn’t come away with hope.

Brandon wrote:

“I got the sense things won’t change fast, nor anytime soon.”

Read the entire interview:

A Brief Meeting With Google After The Apocalypse

Listen to the Search Off The Record Podcast
