The Ranking Factors & The Myths We Found


Celebrate the Holidays with some of SEJ’s best articles of 2023.

Our Festive Flashback series runs from December 21 – January 5, featuring daily reads on significant events, fundamentals, actionable strategies, and thought leader opinions.

2023 has been quite eventful in the SEO industry and our contributors produced some outstanding articles to keep pace and reflect these changes.

Catch up on the best reads of 2023 to give you plenty to reflect on as you move into 2024.


Yandex is the search engine with the majority of market share in Russia and the fourth-largest search engine in the world.

On January 27, 2023, it suffered what is arguably one of the largest data leaks a modern tech company has endured in years – and it is the second such leak in less than a decade.

In 2015, a former Yandex employee attempted to sell Yandex’s search engine code on the black market for around $30,000.

The initial leak in January this year revealed 1,922 ranking factors, of which more than 64% were listed as unused or deprecated (superseded and best avoided).

This leak was just the file labeled kernel, but as the SEO community and I delved deeper, more files were found that combined contain approximately 17,800 ranking factors.

When it comes to practicing SEO for Yandex, the guide I wrote two years ago, for the most part, still applies.

Yandex, like Google, has always been public with its algorithm updates and changes, and in recent years, how it has adopted machine learning.

Notable updates from the past two to three years include:

  • Vega (which doubled the size of the index).
  • Mimicry (penalizing fake websites impersonating brands).
  • Y1 update (introducing YATI).
  • Y2 update (late 2022).
  • Adoption of IndexNow.
  • A fresh rollout and assumed update of the PF filter.

On a personal note, this data leak is like a second Christmas.

Since January 2020, I’ve run an SEO news website as a hobby, dedicated to covering Yandex SEO and search news in Russia, with 600+ articles – so this leak is probably the peak event in the site’s history.

I’ve also spoken twice at the Optimization conference – the largest SEO conference in Russia.

This is also a good test to see how closely Yandex’s public statements match the codebase secrets.

In 2019, working with Yandex’s PR team, I was able to interview engineers in their Search team and ask a number of questions sourced from the wider Western SEO community.

You can read the interview with the Yandex Search team here.

Whilst Yandex is primarily known for its presence in Russia, the search engine also has a presence in Turkey, Kazakhstan, and Georgia.

The data leak was believed to be politically motivated and the actions of a rogue employee, and contains a number of code fragments from Yandex’s monolithic repository, Arcadia.

Within the 44GB of leaked data, there’s information relating to a number of Yandex products including Search, Maps, Mail, Metrika, Disk, and Cloud.

What Yandex Has Had To Say

As I write this post (January 31st, 2023), Yandex has publicly stated that:

the contents of the archive (leaked code base) correspond to the outdated version of the repository – it differs from the current version used by our services

And:

It is important to note that the published code fragments also contain test algorithms that were used only within Yandex to verify the correct operation of the services.

So, how much of this code base is actively used is questionable.

Yandex has also revealed that during its investigation and audit, it found a number of errors that violate its own internal principles, so it is likely that portions of this leaked code (that are in current use) may be changing in the near future.

Factor Classification

Yandex classifies its ranking factors into three categories.

This has been outlined in Yandex’s public documentation for some time, but I feel is worth including here, as it better helps us understand the ranking factor leak.

  • Static factors – Factors that are related directly to the website (e.g. inbound backlinks, inbound internal links, headers, and ads ratio).
  • Dynamic factors – Factors that are related to both the website and the search query (e.g. text relevance, keyword inclusions, TF*IDF).
  • User search-related factors – Factors relating to the user query (e.g. where is the user located, query language, and intent modifiers).
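As an aside on the dynamic factors above, TF*IDF scores how relevant a term is to a document relative to a wider corpus. A minimal sketch of the classic formula (illustrative only, not Yandex’s implementation – the smoothing and tokenization here are my own choices):

```python
import math

def tf_idf(term, doc, corpus):
    """Score how relevant `term` is to `doc`, relative to `corpus`.

    tf: how often the term appears in the document (length-normalized).
    idf: how rare the term is across all documents (log-scaled).
    """
    tf = doc.count(term) / len(doc)
    docs_with_term = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / (1 + docs_with_term))
    return tf * idf

corpus = [
    ["yandex", "search", "ranking", "factors"],
    ["google", "search", "engine"],
    ["yandex", "metrika", "analytics"],
]
# "ranking" appears in only one document, so it scores higher for
# corpus[0] than the ubiquitous "search" does.
print(tf_idf("ranking", corpus[0], corpus))
print(tf_idf("search", corpus[0], corpus))
```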

The ranking factors in the document are tagged to match the corresponding category, with TG_STATIC and TG_DYNAMIC, and then TG_QUERY_ONLY, TG_QUERY, TG_USER_SEARCH, and TG_USER_SEARCH_ONLY.

Yandex Leak Learnings So Far

From the data thus far, below are some of the affirmations and learnings we’ve been able to make.

There is so much data in this leak, it is very likely that we will be finding new things and making new connections in the next few weeks.

These include:

  • A form of PageRank is used.
  • At some point Yandex utilized TF*IDF.
  • Yandex still uses meta keywords, which are also highlighted in its documentation.
  • Yandex has specific factors for medical, legal, and financial topics (YMYL).
  • It also uses a form of page quality scoring, but this is known (ICS score).
  • Links from high-authority websites have an impact on rankings.
  • There’s nothing new to suggest Yandex can crawl JavaScript yet outside of already publicly documented processes.
  • Server errors and excessive 4xx errors can impact ranking.
  • The time of day is taken into consideration as a ranking factor.

Below, I’ve expanded on some other affirmations and learnings from the leak.

Where possible, I’ve also tied these leaked ranking factors to the algorithm updates and announcements that relate to them, or where we were told about them being impactful.

MatrixNet

MatrixNet is mentioned in a few of the ranking factors and was announced in 2009, and then superseded in 2017 by Catboost, which was rolled out across the Yandex product sphere.

This further adds validity to comments directly from Yandex, and from one of the factor authors, DenPlusPlus (Den Raskovalov), that this is, in fact, an outdated code repository.

MatrixNet was originally introduced as a new, core algorithm that took into consideration thousands of ranking factors and assigned weights based on the user location, the actual search query, and perceived search intent.

It is typically seen as an early equivalent of Google’s RankBrain, when in fact they are two very different systems. MatrixNet was launched six years before RankBrain was announced.

MatrixNet has also been built upon, which isn’t surprising, given it is now 14 years old.

In 2016, Yandex introduced the Palekh algorithm that used deep neural networks to better match documents (webpages) and queries, even if they didn’t contain the right “levels” of common keywords, but satisfied the user intents.

Palekh was capable of processing 150 pages at a time, and in 2017 was updated with the Korolyov update, which took into account more depth of page content, and could work off 200,000 pages at once.

URL & Page-Level Factors

From the leak, we have learned that Yandex takes into consideration URL construction, specifically:

  • The presence of numbers in the URL.
  • The number of trailing slashes in the URL (and if they are excessive).
  • The number of capital letters in the URL.
Screenshot from author, January 2023: Yandex leak of ranking factors

The age of a page (document age) and the last updated date are also important, and this makes sense.

As well as document age and last update, a number of factors in the data relate to freshness – particularly for news-related queries.

Yandex formerly used timestamps, specifically not for ranking purposes but for “reordering” purposes; however, this is now classified as unused.

Also in the deprecated column is the use of keywords in the URL. Yandex had previously measured that three keywords from the search query in the URL would be an “optimal” result.

Internal Links & Crawl Depth

Whilst Google has gone on the record to say that for its purposes, crawl depth isn’t explicitly a ranking factor, Yandex appears to have an active piece of code that dictates that URLs that are reachable from the homepage have a “higher” level of importance.

Screenshot from author, January 2023: Yandex factors

This mirrors John Mueller’s 2018 statement that Google gives “a little more weight” to pages linked directly from the homepage.

The ranking factors also highlight a specific token weighting for webpages that are “orphans” within the website linking structure.

Clicks & CTR

In 2011, Yandex released a blog post talking about how the search engine uses clicks as part of its rankings and also addresses the desires of the SEO pros to manipulate the metric for ranking gain.

Specific click factors in the leak look at things like:

  • The ratio of the number of clicks on the URL, relative to all clicks on the search.
  • The same as above, but broken down by region.
  • How often users click on the URL for the search.
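These click factors amount to simple share calculations. A hypothetical sketch of the first two (the log format and names here are my own, not from the leak):

```python
from collections import defaultdict

def click_shares(click_log):
    """click_log: list of (query, url, region) click events.

    Returns each URL's share of all clicks on its query,
    both overall and broken down by region.
    """
    per_query = defaultdict(int)        # clicks per query
    per_query_url = defaultdict(int)    # clicks per (query, url)
    per_region = defaultdict(int)       # clicks per (query, region)
    per_region_url = defaultdict(int)   # clicks per (query, url, region)
    for query, url, region in click_log:
        per_query[query] += 1
        per_query_url[(query, url)] += 1
        per_region[(query, region)] += 1
        per_region_url[(query, url, region)] += 1
    overall = {k: v / per_query[k[0]] for k, v in per_query_url.items()}
    regional = {k: v / per_region[(k[0], k[2])] for k, v in per_region_url.items()}
    return overall, regional

log = [
    ("buy tickets", "kassir.ru", "RU"),
    ("buy tickets", "kassir.ru", "RU"),
    ("buy tickets", "afisha.ru", "RU"),
    ("buy tickets", "kassir.ru", "KZ"),
]
overall, regional = click_shares(log)
print(overall[("buy tickets", "kassir.ru")])         # 0.75 of all clicks
print(regional[("buy tickets", "kassir.ru", "RU")])  # ≈0.67 within RU
```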

Manipulating Clicks

Manipulating user behavior, specifically “click-jacking”, is a known tactic within Yandex.

Yandex has a filter, known as the PF filter, that actively seeks out and penalizes websites that engage in this activity using scripts that monitor IP similarities and then the “user actions” of those clicks – and the impact can be significant.

The below screenshot shows the impact on organic sessions (сессии) after being penalized for imitating user clicks.

Image from Russian Search News, January 2023

User Behavior

The user behavior takeaways from the leak are some of the more interesting findings.

User behavior manipulation is a common SEO violation that Yandex has been combating for years. At the 2020 Optimization conference, then Head of Yandex Webmaster Tools Mikhail Slevinsky said the company is making good progress in detecting and penalizing this type of behavior.

Yandex penalizes user behavior manipulation with the same PF filter used to combat CTR manipulation.

Dwell Time

102 of the ranking factors contain the tag TG_USERFEAT_SEARCH_DWELL_TIME, and reference the device, user duration, and average page dwell time.

All but 39 of these factors are deprecated.

Screenshot from author, January 2023: Yandex factors

Bing first used the term “dwell time” in a 2011 blog post, and in recent years Google has made it clear that it doesn’t use dwell time (or similar user interaction signals) as a ranking factor.

YMYL

YMYL (Your Money, Your Life) is a concept well-known within Google and is not a new concept to Yandex.

Within the data leak, there are specific ranking factors for medical, legal, and financial content – but this was notably revealed in 2019 at the Yandex Webmaster conference, when it announced the Proxima Search Quality Metric.

Metrika Data Usage

Six of the ranking factors relate to the usage of Metrika data for the purposes of ranking. However, one of them is tagged as deprecated:

  • The number of similar visitors from the YandexBar (YaBar/Ябар).
  • The average time spent on URLs from those same similar visitors.
  • The “core audience” of pages on which there is a Metrika counter [deprecated].
  • The average time a user spends on a host when accessed externally (from another non-search site) from a specific URL.
  • Average ‘depth’ (number of hits within the host) of a user’s stay on the host when accessed externally (from another non-search site) from a particular URL.
  • Whether or not the domain has Metrika installed.

In Metrika, user data is handled differently.

Unlike Google Analytics, there are a number of reports focused on user “loyalty” combining site engagement metrics with return frequency, duration between visits, and source of the visit.

For example, I can see a report in one click to see a breakdown of individual site visitors:

Screenshot from Metrika, January 2023

Metrika also comes “out of the box” with heatmap tools and user session recording, and in recent years the Metrika team has made good progress in being able to identify and filter bot traffic.

With Google Analytics, there is an argument that Google doesn’t use UA/GA4 data for ranking purposes because of how easy it is to modify or break the tracking code – but Metrika counters are a lot more rigid, and a lot of the reports are unchangeable in terms of how the data is collected.

Impact Of Traffic On Rankings

Following on from Metrika data as a ranking factor, these factors effectively confirm that direct traffic and paid traffic (buying ads via Yandex Direct) can impact organic search performance:

  • Share of direct visits among all incoming traffic.
  • Green traffic share (aka direct visits) – Desktop.
  • Green traffic share (aka direct visits) – Mobile.
  • Search traffic – transitions from search engines to the site.
  • Share of visits to the site not by links (set by hand or from bookmarks).
  • The number of unique visitors.
  • Share of traffic from search engines.

News Factors

There are a number of factors relating to “News”, including two that mention Yandex.News directly.

Yandex.News was an equivalent of Google News, but was sold to the Russian social network VKontakte in August 2022, along with another Yandex product “Zen”.

So, it’s not clear if these factors relate to a product no longer owned or operated by Yandex, or to how news websites are ranked in “regular” search.

Backlink Importance

Yandex has algorithms similar to Google’s to combat link manipulation – and has had them since the Nepot filter in 2005.

From reviewing the backlink ranking factors and some of the specifics in the descriptions, we can assume that the best practices for building links for Yandex SEO would be to:

  • Build links with a more natural frequency and varying amounts.
  • Build links with branded anchor texts as well as use commercial keywords.
  • If buying links, avoid buying links from websites that have mixed topics.

Below is a list of link-related factors that can be considered affirmations of best practices:

  • The age of the backlink is a factor.
  • Link relevance based on topics.
  • Backlinks built from homepages carry more weight than internal pages.
  • Links from the top 100 websites by PageRank (PR) can impact rankings.
  • Link relevance based on the quality of each link.
  • Link relevance, taking into account the quality of each link, and the topic of each link.
  • Link relevance, taking into account the non-commercial nature of each link.
  • Percentage of inbound links with query words.
  • Percentage of query words in links (up to a synonym).
  • The links contain all the words of the query (up to a synonym).
  • Dispersion of the number of query words in links.
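To illustrate a factor like “percentage of inbound links with query words”, here is a hypothetical calculation over backlink anchor texts (it ignores the “up to a synonym” handling the leaked factors describe):

```python
def pct_anchors_with_query_words(query, anchors):
    """Share of backlink anchor texts containing at least one query word."""
    query_words = set(query.lower().split())
    hits = sum(
        1 for anchor in anchors
        if query_words & set(anchor.lower().split())  # any word overlap
    )
    return hits / len(anchors) if anchors else 0.0

anchors = ["best concert tickets", "click here", "tickets online", "homepage"]
print(pct_anchors_with_query_words("buy tickets", anchors))  # 0.5
```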

However, there are some link-related factors that are additional considerations when planning, monitoring, and analyzing backlinks:

  • The ratio of “good” versus “bad” backlinks to a website.
  • The frequency of links to the site.
  • The number of incoming SEO trash links between hosts.

The data leak also revealed that the link spam calculator has around 80 active factors that are taken into consideration, with a number of deprecated factors.

This raises the question of how well Yandex is able to recognize negative SEO attacks, given that it looks at the ratio of good versus bad links, and how it determines what a bad link is.

A negative SEO attack is also likely to be a short burst (high frequency) link event in which a site will unwittingly gain a high number of poor quality, non-topical, and potentially over-optimized links.

Yandex uses machine learning models to identify Private Blog Networks (PBNs) and paid links, drawing on the same relationship between link velocity and the time period over which links are acquired.

Typically, paid-for links are generated over a longer period of time, and these patterns (including link origin site analysis) are what the Minusinsk update (2015) was introduced to combat.

Yandex Penalties

There are two ranking factors, both deprecated, named SpamKarma and Pessimization.

Pessimization refers to reducing PageRank to zero and aligns with the expectations of severe Yandex penalties.

SpamKarma also aligns with assumptions made around Yandex penalizing hosts and individuals, as well as individual domains.

Onpage Advertising

There are a number of factors relating to advertising on the page, some of them deprecated (like the screenshot example below).

Screenshot from author, January 2023: Yandex factors

It’s not known from the description exactly what the thought process with this factor was, but it could be assumed that a high ratio of adverts to visible screen was a negative factor – much like how Google takes umbrage if adverts obfuscate the page’s main content, or are obtrusive.

Tying this back to known Yandex mechanisms, the Proxima update also took into consideration the ratio of useful and advertising content on a page.

Can We Apply Any Yandex Learnings To Google?

Yandex and Google are disparate search engines with a number of differences, despite the dozens of engineers who have worked for both companies.

Because of this overlap in talent, we can infer that some of these engineers will have built things in a similar fashion (though not direct copies), and applied learnings from previous iterations of their builds with their new employers.

What Russian SEO Pros Are Saying About The Leak

Much like the Western world, SEO professionals in Russia have been having their say on the leak across the various Runet forums.

The reaction in these forums has been different to SEO Twitter and Mastodon, with a focus more on Yandex’s filters, and other Yandex products that are optimized as part of wider Yandex optimization campaigns.

It is also worth noting that a number of conclusions and findings from the data match what the Western SEO world is also finding.

Common themes in the Russian search forums:

  • Webmasters asking for insights into recent filters, such as Mimicry and the updated PF filter.
  • The age and relevance of some of the factors, given that some named authors are no longer at Yandex and some long-retired Yandex products are mentioned.
  • The main interesting learnings are around the use of Metrika data, and information relating to the Crawler & Indexer.
  • A number of factors outline the usage of DSSM, which in theory was superseded by Palekh, a machine-learning search algorithm Yandex announced in 2016.
  • A debate around ICS scoring in Yandex, and whether or not Yandex may provide more traffic to a site and influence its own factors by doing so.

The leaked factors, particularly around how Yandex evaluates site quality, have also come under scrutiny.

There is a long-standing sentiment in the Russian SEO community that Yandex oftentimes favors its own products and services in search results ahead of other websites, and webmasters are asking questions like:

Why does it bother going to all this trouble, when it just nails its services to the top of the page anyway?

In loosely translated documents, these are referred to as the Sorcerers or Yandex Sorcerers. In Google, we’d call these SERP (search engine results page) features – like Google Hotels, etc.

In October 2022, Kassir (a Russian ticket portal) claimed ₽328m compensation from Yandex due to lost revenue, caused by the “discriminatory conditions” in which Yandex Sorcerers took the customer base away from the private company.

This is off the back of a 2020 class action in which multiple companies raised a case with the Federal Antimonopoly Service (FAS) for anticompetitive promotion of its own services.


Featured Image: FGC/Shutterstock


Google Reveals Two New Web Crawlers

Google revealed details of two new crawlers that are optimized for scraping image and video content for “research and development” purposes. Although the documentation doesn’t explicitly say so, it’s presumed that there is no ranking impact should publishers decide to block the new crawlers.

It should be noted that the data scraped by these crawlers is not explicitly for AI training; that’s what the Google-Extended crawler is for.

GoogleOther Crawlers

The two new crawlers are versions of Google’s GoogleOther crawler that was launched in April 2023. The original GoogleOther crawler was also designated for use by Google product teams for research and development in what is described as one-off crawls, the description of which offers clues about what the new GoogleOther variants will be used for.

The purpose of the original GoogleOther crawler is officially described as:

“GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.”

Two GoogleOther Variants

There are two new GoogleOther crawlers:

  • GoogleOther-Image
  • GoogleOther-Video

The new variants are for crawling binary data, which is data that’s not text. HTML is generally referred to as a text format (ASCII or Unicode); if a file can be viewed in a text editor, it’s a text file. Binary files are files that can’t be opened in a text viewer – files like images, audio, and video.

The new GoogleOther variants are for image and video content. Google lists user agent tokens for both of the new crawlers which can be used in a robots.txt for blocking the new crawlers.

1. GoogleOther-Image

User agent tokens:

  • GoogleOther-Image
  • GoogleOther

Full user agent string:

GoogleOther-Image/1.0

2. GoogleOther-Video

User agent tokens:

  • GoogleOther-Video
  • GoogleOther

Full user agent string:

GoogleOther-Video/1.0
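Using the tokens above, a site that wants to opt only its images and videos out of these one-off crawls could target the specific tokens in robots.txt. A sketch (blocking the whole site here; adjust the Disallow paths to suit your own content):

```
User-agent: GoogleOther-Image
Disallow: /

User-agent: GoogleOther-Video
Disallow: /
```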

Newly Updated GoogleOther User Agent Strings

Google also updated the user agent strings for the regular GoogleOther crawler. For blocking purposes, you can continue using the same user agent token as before (GoogleOther). The new user agent strings are simply the data sent to servers to identify the crawler in full, in particular the technology used. In this case the technology is Chrome, with the version number periodically updated to reflect which version is in use (W.X.Y.Z is a Chrome version number placeholder in the examples listed below).

The full list of GoogleOther user agent strings:

  • Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; GoogleOther)
  • Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; GoogleOther) Chrome/W.X.Y.Z Safari/537.36

GoogleOther Family Of Bots

These new bots may from time to time show up in your server logs. This information will help identify them as genuine Google crawlers, and will help publishers who may want to opt out of having their images and videos scraped for research and development purposes.
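Since the Chrome version in the full user agent string changes over time, matching on the stable tokens is a more robust way to spot these crawlers in raw logs. A small illustrative sketch (note that string matching alone doesn’t prove a request is genuinely from Google – for that, Google recommends verifying against its published crawler IP ranges or via reverse DNS):

```python
# Check variant tokens before the generic one, since every
# GoogleOther-Image/-Video UA string also contains "GoogleOther".
GOOGLEOTHER_TOKENS = ("GoogleOther-Image", "GoogleOther-Video", "GoogleOther")

def classify_googleother(user_agent):
    """Return the most specific GoogleOther token in a UA string, or None."""
    for token in GOOGLEOTHER_TOKENS:
        if token in user_agent:
            return token
    return None

ua = ("Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; "
      "GoogleOther) Chrome/123.0.0.0 Safari/537.36")
print(classify_googleother(ua))                      # GoogleOther
print(classify_googleother("GoogleOther-Image/1.0")) # GoogleOther-Image
```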

Read the updated Google crawler documentation

GoogleOther-Image

GoogleOther-Video

Featured Image by Shutterstock/ColorMaker


ChatGPT To Surface Reddit Content Via Partnership With OpenAI

Reddit partners with OpenAI to integrate content into ChatGPT.

  • Reddit and OpenAI announce a partnership.
  • Reddit content will be used in ChatGPT.
  • Concerns about accuracy of Reddit user-generated content.

All You Need to Know

SEO tracking involves regularly checking a set of metrics to evaluate a website’s performance in search engine results. Some of the most widely adopted metrics include keyword rankings, organic traffic, conversions, and referring domain growth.

Tracking the right metrics is crucial for SEO (search engine optimization) success. You need them to analyze your SEO performance, report to stakeholders, and take the right kind of action to improve your site’s visibility (such as improving content or building more backlinks).

Besides keeping an eye on your own website’s key metrics, it’s also smart to check out how your competitors are doing on the same metrics as you. If you notice they’re getting good results, you can figure out what tactics they’re using and consider using them too.

You can track SEO for your site to a fair degree using free tools like Google Search Console or Ahrefs Webmaster Tools. If you want deeper insights, better data, and the ability to analyze your competitors’ websites, you’ll need a tool like Ahrefs.

This guide is aimed at getting you started with tracking your SEO progress the right way. We’ll cover:

  • What metrics are worth tracking in SEO.
  • How to set up the tools to get the data you need.
  • How to track your competitors.
  • How to go a step further and build an SEO report.

While there are numerous metrics and KPIs you could track, it’s not necessary to monitor all of them continuously. You really just need these seven key metrics to effectively gauge whether your SEO efforts are working.

1. Keyword rankings

Keyword ranking refers to where your page shows up on the search engine results page (SERP) for a specific keyword. It’s like a spot on a list, and you want your page to be as high up on that list as possible — the higher the spot, the more visitors you can attract.

A typical relationship between position and traffic. Traffic drops dramatically with every position in the SERPs.

It’s important to keep an eye on where your keywords are ranking because if they drop lower on the list, your website might get fewer visitors. But you don’t have to watch the rankings for every single keyword, just the main ones that matter most for your key pages.

Also, if you notice your rankings are climbing higher, that’s a good sign. It means that your SEO efforts are paying off.

How to track keyword rankings

To track your keyword rankings, it’s best to use a rank tracker tool like Ahrefs’ Rank Tracker – a tool that allows you to create a list of keywords and automatically monitor their positions in the SERPs for different locations, both for mobile and desktop.

Rank Tracker will suggest keywords for tracking when you set up a new project. Just make sure you’re tracking them in the locations you want to rank (that is, countries where you can serve clients and languages in which you create content).

Adding keywords to track in Ahrefs.

No need to add each and every keyword from that list. Just add the ones that are important to you – the ones you’ll likely want to track and improve. Typically, you’ll want to track target keywords: the main topic of the page and the main keyword you optimize for.

Once added, you can see your keywords in Rank Tracker’s Overview report.

Overview of tracked keywords in Ahrefs.

Another way to start tracking keywords is to hit Add keywords in the top right corner — best for adding single keywords or importing a list from a document.

Adding single keywords or keywords from a list.

And once data starts rolling in, you will be able to see your ranking progress over time. The screenshot below shows the Ranking history report, with a quick insight into recent ranking history and a full ranking history graph.

Ranking history in Ahrefs.

Why do you need an SEO tool in the first place?

Google’s search results are personalized based on things like your location, browsing history, language, and device.

So when you check the SERPs manually, you might see results that are tailored specifically to you, which might not reflect the more general or widespread rankings.

2. Share of voice

Share of voice (SOV) is a measure of how many clicks your website gets from search engines compared to the total number of clicks available for the keywords you’re tracking.

The higher your rankings, the higher your Share of Voice, and the larger your slice of the market pie.

SOV is a one-of-a-kind metric because of two things:

  • It considers your performance in context to your competitors, giving you a more accurate picture of where you stand in your industry.
  • It doesn’t take into account the search volume of keywords with all of their fluctuations. If you see that your traffic has gone down but your Share of Voice (SOV) remains high, it suggests that the lower traffic is because the keywords you’re targeting have become less popular overall, rather than a decrease in the effectiveness of your SEO strategies.
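As a rough sketch of the underlying idea (rank trackers estimate clicks from ranking positions and CTR curves; the sites and numbers here are purely illustrative):

```python
def share_of_voice(clicks_by_site, your_site):
    """Your site's share of all estimated clicks across tracked keywords."""
    total = sum(clicks_by_site.values())
    return clicks_by_site.get(your_site, 0) / total if total else 0.0

# Estimated monthly clicks across one tracked keyword set
clicks = {"yoursite.com": 1200, "competitor-a.com": 2400, "competitor-b.com": 400}
print(round(share_of_voice(clicks, "yoursite.com"), 2))  # 0.3
```

Because SOV is a ratio against competitors, a drop in total keyword popularity lowers everyone’s clicks but leaves the shares intact – which is exactly why the metric is robust to search volume fluctuations.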

How to track share of voice

The share of voice metric is another reason to get a rank tracking tool. If the feature is supported, these kinds of tools calculate the metric automatically, so there’s no need to keep a spreadsheet with manually tracked numbers.

In Ahrefs’s Rank Tracker, you’ll find SOV under the Competitors tab.

SOV metric in Ahrefs.

SOV is calculated by taking all of the tracked keywords into account, yet some of your keywords might be more important than others. If that’s the case, you can track SOV only for a certain topic, SEO campaigns, specific authors, etc. Just select a set of keywords and assign a tag for them.

Adding tags in Ahrefs Rank Tracker.

Then, simply select that tag in the Competitors report.

Competitors overview in Rank Tracker.

3. Organic traffic

Organic traffic is basically the number of clicks that come to your website from people finding it through Google. If your website shows up higher on the SERPs, usually more people will click on it and visit your site.

Keeping track of how many visitors come to your site from search engines helps you understand if what you’re doing with SEO is actually working. If you see more visitors over time, your SEO efforts are paying off.

Organic traffic is the pinnacle of SEO, but it’s also important to understand which keywords drive that traffic. So although it’s arguably the most important metric, it’s never a good idea to track this metric alone.

How to track organic traffic

There are basically two ways to track organic traffic: through Google Search Console (and integrations) and through SEO tools.

In terms of raw organic traffic from Google Search, the most accurate data will likely come from their Search Console (for Bing, that would be Webmaster Tools). You can view this data right inside the tool or integrate it with analytics tools like Google Analytics, Hubspot, and Ahrefs for more convenience.

Performance report in GSC.
GSC integration in Ahrefs.
The cool thing about using Ahrefs for your GSC data is that the weekly and monthly views make it easier to spot trends.

Raw traffic data is useful for getting a quick snapshot of your current performance, tracking growth trends, and calculating traffic growth for your reports.

But to dive a bit deeper into your organic traffic data, you might want to use a tool like Ahrefs’ Site Explorer because it makes it easier to analyze performance. Here are a few ways you can use the Overview and Top pages report in that tool.

Overlay competitor data on top for a quick performance analysis.

Organic traffic comparison of four sites on one graph.

Overlay organic pages to see how adding new content correlates with traffic.

Clear correlation between the number of published organic pages and organic traffic.
In this example, we see a clear correlation between the number of published organic pages and organic traffic (a sign of effective SEO).

See performance in a year-over-year comparison to gauge the impact of long-term projects.

In this example, a long-term content project allowed for the reclaiming of lost traffic from 2020.

Use the daily traffic chart to pinpoint the exact day when a traffic increase or decline happened (for instance, due to a Google core update).

Organic traffic affected by Google core update.

Identify pages that account for the biggest traffic losses and improve them. You’ll find this in the Top pages report inside Site Explorer.

Top pages report in Ahrefs.

4. Conversions

Conversions measure how effectively your content translates into tangible results, like profits, content downloads, free trial sign-ups, or any other user action valuable to your business that indicates you’re dealing with a potential customer.

Conversions from organic visits to paid customers are typically hard to measure since this comes down to measuring the ROI of content, which is complicated in itself. However, when we asked marketers about this metric, we found a few interesting ways to solve that problem. For your inspiration, here’s what they measure:

  • Conversion as revenue/signups correlation with traffic. This metric assumes that more website visitors increase your chances of turning them into subscribers or buyers.
  • Conversion growth from bottom-funnel content. Content aimed at users who are on the brink of purchasing can greatly boost sales because it provides that last bit of persuasion they need to complete a purchase.
  • Conversion from first page to paying customer. If the first page a visitor lands on leads to a sale, it’s a clear sign that your content is doing its job effectively.
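The first idea in the list above is essentially a conversion rate: the share of organic visits that complete a valuable action. A minimal sketch with hypothetical figures:

```python
def conversion_rate(conversions, visits):
    """Share of organic visits that complete a key action, in percent."""
    return round(100 * conversions / visits, 2) if visits else 0.0

# Hypothetical monthly figures pulled from analytics.
organic_visits = 25_000
trial_signups = 450

rate = conversion_rate(trial_signups, organic_visits)
```

Tracking this rate alongside traffic shows whether traffic growth is bringing in the right visitors, not just more of them.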

How to track conversions

Conversions are usually tracked with website analytics tools like Google Analytics 4 (GA4) or Matomo. They always require a custom setup for each website you want to track, but it’s not an overly complicated process.

For example, in GA4, conversions are called “key events” and are based on tracking user interaction. If a specific event takes place, such as a purchase, a file download, or a form completion, the tool records this as a conversion.

To set up conversion tracking in GA4, you first need to create an event that will be counted as a conversion and mark it as a key event in the Admin panel of your site (aka property).

Key events control panel in GA4.

Then, to see conversions from the organic traffic channel (the channel you’re optimizing with SEO), go to the Advertising panel.

Advertising panel in GA4.

Here are a few ideas to use this report:

  • See how many and which key events were driven by organic search in the last month or quarter.
  • See how organic traffic stacks up to other acquisition channels.
  • See the share of organic traffic for events with longer conversion paths (the attribution paths tab).

For more information about how to properly set up GA4 for conversion tracking, see this guide.
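Besides browser-side tracking, GA4 also accepts events server-side through its Measurement Protocol, which is handy when a conversion happens outside the browser (e.g., a signup confirmed in your backend). Here is a minimal sketch of building such an event; the client ID, measurement ID, and API secret are placeholders, and the actual network call is shown commented out.

```python
def build_ga4_event(client_id, event_name, params):
    """Build a GA4 Measurement Protocol payload for a single event."""
    return {
        "client_id": client_id,
        "events": [{"name": event_name, "params": params}],
    }

payload = build_ga4_event(
    client_id="555.12345",       # pseudonymous visitor ID (placeholder)
    event_name="generate_lead",  # a GA4 recommended event name
    params={"currency": "USD", "value": 99.0},
)

# To actually send it (placeholder credentials, do not ship as-is):
# import json, urllib.request
# url = ("https://www.google-analytics.com/mp/collect"
#        "?measurement_id=G-XXXXXXX&api_secret=YOUR_SECRET")
# urllib.request.urlopen(url, data=json.dumps(payload).encode())
```

For the event to show up as a conversion, its name still has to be marked as a key event in the GA4 Admin panel, as described above.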

5. Referring domain growth

Referring domains are essentially the individual websites that link back to your website. By monitoring these, you get a clear picture of how your link profile is expanding over time.

As your link profile grows with more quality links from diverse domains, it helps to build your site’s authority. This authority is crucial because search engines use it as one of the key factors to determine where your pages should rank in search results.

Essentially, the more authoritative your site becomes, the higher your pages are likely to rank and the harder it becomes for others to outrank you.
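If you ever need to compute referring-domain growth yourself, e.g. from a raw backlink export, it comes down to counting unique linking hostnames per period. A small sketch with hypothetical URLs:

```python
from urllib.parse import urlparse

def referring_domains(backlink_urls):
    """Unique hostnames linking to the site, from a list of source URLs."""
    return {urlparse(u).hostname for u in backlink_urls}

# Hypothetical backlink exports: source URLs of pages linking to you.
january = referring_domains([
    "https://blog.example.com/post-1",
    "https://blog.example.com/post-2",
    "https://news.example.org/story",
])
february = referring_domains([
    "https://blog.example.com/post-3",
    "https://news.example.org/story",
    "https://forum.example.net/thread/42",
])

net_new_domains = len(february | january) - len(january)
```

Note that two links from the same blog count as one referring domain, which is why this metric is a better diversity signal than raw backlink counts.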

How to track referring domain growth

Here’s how to track referring domain growth using Ahrefs.

  • Set up a project (if you haven’t done so yet) and go to your Dashboard.
  • Click on the Backlinks card, which gives you a quick insight into backlinks growth.
  • Click on the card to get more data (if you need it).
Referring domains overview in Ahrefs Site Audit.
Referring domains report in Ahrefs Site Audit.

Aim to build at least as many links from unique domains as your competitors, and ideally more, to increase your chances of ranking. Read our link-building guide to learn how:

6. Technical SEO issues

Technical SEO issues, often referred to as SEO health issues, encompass a range of potential hiccups that can hinder Google from effectively finding, crawling, and indexing your website. If Google struggles with any of these steps, your site might not show up correctly — or at all — in search results.

There are eight types of SEO issues you should keep a close eye on because they can impact your ranking the most:

Besides these issues, there are more than 100 other possible issues related to less important technical SEO factors and on-page SEO. I won’t cover all of them here since you can learn what they are and how to fix them right inside Ahrefs.

How to track technical SEO issues (aka SEO health)

Use Ahrefs’ Site Audit (free in Ahrefs Webmaster Tools) to monitor for serious technical issues, marked in the tool as “errors”.

  • Open the Site Audit tool inside Ahrefs.
Where to find Site Audit in Ahrefs.
  • Click on Errors in the “Issues distribution” card.
Issues distribution in Ahrefs.
  • Go to the issue list, then click on the question mark next to the error and follow the instructions.
All issues report in Ahrefs.

To keep your site in good SEO health, schedule regular crawls in Site Audit and fix the most pressing issues.
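To make the idea of these checks concrete, here is a toy sketch of what a crawler tests on each page, such as HTTP errors, a noindex directive, and a missing title. It is a hypothetical illustration, not a substitute for a full auditing tool, and the checks shown are a tiny subset of what real crawlers cover.

```python
import re

def find_seo_issues(status_code, html):
    """Flag a few common technical SEO errors on a single page.
    Illustrative only; real audit tools check far more conditions."""
    issues = []
    if status_code >= 400:
        issues.append(f"HTTP error status: {status_code}")
    # A robots meta tag containing "noindex" blocks the page from indexing.
    if re.search(r"<meta[^>]+name=['\"]robots['\"][^>]*noindex", html, re.I):
        issues.append("Page is blocked from indexing (noindex)")
    # A page with no non-empty <title> can't present itself in search results.
    if not re.search(r"<title[^>]*>\s*\S", html, re.I):
        issues.append("Missing or empty <title>")
    return issues

html = '<html><head><meta name="robots" content="noindex"></head><body></body></html>'
issues = find_seo_issues(200, html)
```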

Note

Before we wrap up this section, here are some other popular metrics and why they haven’t made our list of recommended metrics to track regularly (although they may be useful for other things).

  • Domain Rating (DR). This metric indicates the overall strength of your website’s backlink profile. It’s a handy measure for quickly assessing other websites, particularly for link building purposes. However, it’s not the best metric for ongoing monitoring of your own site since it doesn’t provide specific actionable insights.
  • Click-through Rate (CTR). This measures the percentage of impressions on SERPs that result in clicks, and this data is accessible through Google Search Console. While CTR can be confusing as a metric for the entire site, it proves useful when analyzed at the individual page level.
  • Engagement metrics. Metrics such as bounce rate, engagement rate, dwell time, time on page, and session duration are often discussed in the context of SEO. However, they are either not directly relevant to SEO effectiveness or unreliable for content analysis.
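For completeness, page-level CTR, the form in which the metric is actually useful, is just clicks divided by impressions from Google Search Console data:

```python
def page_ctr(clicks, impressions):
    """Click-through rate for a single page, in percent."""
    return round(100 * clicks / impressions, 2) if impressions else 0.0

# Hypothetical Google Search Console figures for one page.
ctr = page_ctr(clicks=320, impressions=8_000)
```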

There are three ways you can track competitors using SEO tools.

  • Track competitors’ rankings for benchmarking.
  • Track multiple metrics for a portfolio of pages.
  • Monitor for noteworthy events: new keywords, backlinks and brand mentions.

Let’s look at them in more detail.

How to track competitors’ keyword rankings

To track your competitors’ rankings, use a rank tracking tool that allows you to automatically monitor their positions on the keywords you target yourself. So whenever you add keywords you want to target in your strategy, the tool will track both your and your competitors’ rank for that keyword.

In Ahrefs’ Rank Tracker, all you need to do is add your competitors’ URLs (you can track entire domains or specific directories). You can do this when setting up your project or add them later in the Competitors section.

Competitors overview report in Ahrefs Rank Tracker.

You can use competitor ranking data to:

  • Improve the pages where your competitors outrank you to gain more SOV.
  • Set goals and benchmarks.
  • Compare historical rankings to your performance over time.
  • Quickly see the competitive landscape and how well you’re doing compared to competitors.
  • See how much more traffic you could gain if you outranked competitors.

How to track multiple metrics for a portfolio of pages

You can also track more than just rankings. Using the Portfolios feature in Ahrefs, you can monitor key metrics such as traffic growth and the increase in referring domains for multiple competitors all at once to analyze their overall SEO performance.

Portfolios feature in Ahrefs.

You can use this feature to monitor specific pages on your competitors’ sites (such as topics on a blog) or combine all your competitors’ sites to see how your entire niche performs in organic search.

To create a portfolio in Ahrefs, go to the Dashboard and click New > Portfolio, then fill in the URLs you want to track.

How to create a new portfolio in Ahrefs.

Tip

This feature is especially useful if you’re managing SEO for multiple clients — you can track their entire portfolio as one.

It’s also handy if you have multiple authors on your content team; for example, you can track all articles written by a particular author or keep tabs on all guest and freelance posts.

How to track competitors’ new keywords, backlinks, and web mentions

The final method of tracking your competitors allows you to get email alerts when a competitor:

  • Ranks for a new keyword. Useful for getting content ideas from your competitors’ new content.
  • Rises or falls in keyword rankings. For example, if you see an important keyword suddenly climbing into the top 3, that means your competitor is doing something right, and it’s worth investigating. Note that this feature scans all of the keywords the site ranks for, not only the ones you track, so it gives you a much wider scope.
  • Gains or loses backlinks. Both situations are potential link building opportunities.
  • Gets their brand or product mentioned online. So, when a competitor gets featured in a review, ranking, or digital PR, you can add that site to your list of link building/PR prospects.
Example keyword alert delivered by mail.

To set it up:

  1. Go to Ahrefs Alerts (in the More dropdown menu).
  2. Choose the type of Alert you want to set up.
  3. Click New alert or choose from one of the projects and fill out the details. For mention alerts, see our documentation to take advantage of advanced queries.
How to add a new keywords alert.

Tip

You can also set this feature for your own website. Since Ahrefs Alerts monitors all keywords you rank for, you’ll know if any of your keywords suddenly rise or fall in rankings. 

This is especially useful to spot important keywords you haven’t yet added to Rank Tracker.

If you’re doing SEO for someone else, at some point, you will need to put all of those metrics in a report.

In some cases, it may be enough to show the raw data with a few sentences of commentary. This is true in in-house environments when you’re reporting to someone who can interpret the data themselves, especially if you’ve worked with them for a long time.

But if you’re reporting for a client, raw numbers won’t be enough. Additionally, you will need at least these three elements:

  • Executive summary: Summarizes the entire report, focusing on major points and outcomes for quick reading by senior stakeholders.
  • Opportunities for improvement: Identifies potential areas for SEO enhancements.
  • Roadmap: Outlines past achievements and future steps in the SEO strategy.

It’s also important to present the data in a way that works for you and your stakeholders. For instance, many clients require a live interactive dashboard with all the data available at all times (similar to these Ahrefs templates for Looker Studio).

Example of a live reporting dashboard with SEO data.

Others prefer a document where everything is laid out in layman’s terms — they appreciate the data but they don’t really want to deal with it.

Excerpt from an SEO reporting template.

We’ve put together some resources, including a template, to help you quickly and efficiently create a solid report:

Final thoughts

A few tips before we wrap this up:

Got questions or comments? Let me know on X or LinkedIn.


