
How to Fit SEO Into Your Marketing Strategy


As digital marketing specialists, it’s easy to “have the blinkers on” and focus solely on the channel or channels nearest to your heart.

In my case, this is SEO. Having worked with several businesses and top brands over the past few years, I’ve learned that some specialists and marketing generalists are:

  • Cannibalizing efforts between channels.
  • Undervaluing organic search as a channel.
  • Failing to align SEO with other disciplines effectively.

In this guide, I’ll explain why you should include SEO in your marketing strategy. I’ll also explain how you can align SEO with other disciplines—from PPC to brand building.

How SEO helps you achieve strategic marketing objectives

Getting the “buy-in” for SEO investment can be tricky. In some businesses, this can lead to SEO being underutilized through a lack of investment and implementation. Here are four reasons why SEO should receive the focus it deserves.

You don’t have to pay per click

SEO is seen by many as an “always on” channel. It can take substantial investment to get going and some patience to see a return. However, once you build up your rankings, you’ll receive traffic essentially for “free” (no additional cost per click).

With SEO, a drop in spending won’t lead to losing all your traffic overnight. Paid advertising, on the other hand, is seen as a tap because you can switch the spending (and, with it, the traffic you receive) on and off.

Organic traffic is relatively sustainable

In SEO, you have to play the long game. Unless you hold the brand authority of Wikipedia or Amazon, it’s hard to gain quality traffic overnight.

Once you build up your rankings through a solid SEO strategy, the rewards are often here to stay without the need for continuous spending and reinvestment. This makes SEO more like a waterfall than a tap.

Organic traffic is continuous like a waterfall; traffic acquired via PPC can be turned on and off like a water tap

Building a sustainable stream of high-quality organic traffic to your website could be the difference between your business surviving or not surviving economic uncertainties. In challenging financial periods such as recessions, marketing budgets often get slashed, leaving channels like PPC stranded. With solid SEO foundations, however, you’ll continue to acquire users organically, even if you decide to tighten your budget for a short while.

That said, I don’t recommend making cuts to SEO budgets. Continuing your SEO efforts will ensure you’re in the best position to gain an edge over your competitors.

SEO is targeted

Results served via organic search are inherently relevant to the user’s query. This means that, through organic search, you’re serving users content they actually want to see. The algorithm isn’t 100% perfect, but it’s fair to say that Google does a great job of ranking relevant organic search results.

The keyword also tells us a lot of information about what the user is looking to find. This allows us to target potential customers looking for our product or service.

Let’s say, for example, you run an online shop selling discounted football kits. Among several other search terms, you’ll be very interested in attracting potential customers searching for “cheap football kits.”

From this search term alone, we know that the users who search for this keyword want what we sell. Using Ahrefs’ Keywords Explorer, we can also see that the keyword “cheap football kits” attracts 6,300 searches per month (globally).

Overview of "cheap football kits," via Ahrefs' Keywords Explorer

Alternative channels, on the other hand, are a lot less straightforward. In paid search, there are instances where Google may show your ad for unwanted search terms.

Since 2018, targeting paid keywords via “exact match” means you will also appear for other search terms that Google decides have the “same meaning” as the targeted term. Therefore, “exact match” targeting isn’t really an exact match anymore. And it gets worse with broader targeting options.

Ability to target users at various stages in the funnel

In SEO, you’re not just limited to targeting users at one stage of the marketing funnel. The ability to target potential customers through informational blog content and transactional product/service-focused landing pages is what makes SEO both exciting and lucrative.

People use Google regularly to search for:

  • Answers to questions (informational search).
  • Solutions to problems (informational or transactional search).
  • Products or services (transactional search).
  • A specific website (navigational search).

SEOs can target all of the above by creating different types of content to suit users’ needs, as determined by the keywords they search for.

Let’s say, for example, I run an online store selling kayaks. Here’s how we can target customers at various funnel stages through different types of content. 

Target keywords across the marketing funnel (stages include awareness, research, service or product, and brand)

For keywords such as “how to store a kayak” and “what size kayak do I need,” we are best suited to rank for these queries by providing dedicated informational content. 

Sure, the user may not be in a position to purchase a kayak right away. But now that we’ve helped them out, they may come back to us when they are ready to make a purchase.

For users searching “kayaks for sale,” we know from the search term that they are potentially looking to make a purchase right away. In this case, a product page best suits their needs, allowing users to make a swift purchase. 

Don’t fall into the trap of assuming the best type of content from the query alone, though. Remember that Google is a bot, and your idea of a page that meets users’ needs and Google’s idea could be completely different.

This is why you should always manually check Google’s search results to confirm the best page type (or page template) that Google likes to serve for your targeted keyword.

Using Ahrefs’ Keywords Explorer, simply enter your keyword and scroll down to the “SERP overview” to see what kind of pages are ranking. This method is great for seeing the search results alongside useful backlink and keyword data. 

SERP overview for the keyword "kayaks for sale," via Ahrefs' Keywords Explorer

Aligning SEO with other disciplines

Specialists can be guilty of becoming isolated from other channels. Often, you’ll hear debates about one discipline versus another, such as SEO vs. PPC. The reality is that having multiple strong-performing channels is vital for business success, and there’s often more opportunity to align than most specialists realize.

SEO and brand building/traditional advertising

Traditional advertising, such as TV, radio, and billboard advertising, can create a lot of search demand. How often within a TV advert are we prompted to “search” for a brand name or a product? 

The SEO team can ensure you are maximizing “SERP real estate” by being an entity in the Knowledge Graph and targeting search features such as People Also Ask. Furthermore, the SEO team can ensure all applicable content is up to date and well optimized.

Another area where the SEO department can help out traditional marketers is by using organic search to assist in market share calculations. Calculating market share is tricky, and the SEO team can help you calculate it through a metric called “share of search.”

At the EffWorks Global 2020 event hosted by IPA (a U.K. trade body), effectiveness guru Les Binet shared how he was experimenting with “share of search” to predict market share “sometimes up to a year ahead.” Les described the metric as a fast and predictive measure for short- and long-term ad effects. 

This metric looks specifically at branded, organic search volume data. To calculate your “share of search,” you divide the total search volume of your brand by the total search volume of all brands in your niche (including your own).

Equation of Les Binet's "share of search"

For example, I’ve taken five popular U.S. donut brands and put them into Ahrefs’ Keywords Explorer.

Respective search volumes of five major U.S. donut brands, via Ahrefs' Keywords Explorer

We can see that Dunkin Donuts is far and away the most popular, with a 69% share of search across these five brands (8.3 million/12 million).

Of course, there are more than five big donut brands in the U.S. The more expansive you go with your list, the more accurate your calculation will be.
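If you want to automate this, the math is simple enough to script. Here’s a minimal Python sketch using the Dunkin Donuts figure above; the other four brands’ volumes are hypothetical placeholders chosen to sum to the 12 million total, so swap in real volumes from a keyword tool for your own niche:

# Share of search = a brand's search volume divided by the total
# search volume of all brands in the niche (including your own).

monthly_search_volume = {
    "Dunkin Donuts": 8_300_000,  # figure from Keywords Explorer (above)
    "Brand B": 1_900_000,        # hypothetical placeholders -- replace
    "Brand C": 900_000,          # with real volumes for your niche
    "Brand D": 500_000,
    "Brand E": 400_000,
}

total = sum(monthly_search_volume.values())  # 12,000,000

for brand, volume in monthly_search_volume.items():
    print(f"{brand}: {volume / total:.0%} share of search")

# Dunkin Donuts: 69% share of search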

SEO and paid search

Both SEO and paid search teams work with keywords a lot. This provides the perfect opportunity for sharing resources, particularly those keyword research files that often take hours to compile. But it’s not just about the keyword data. Sharing analytics data between teams is also useful, such as click-through rates, conversion rates, and other metrics.

As highlighted earlier in this article, PPC is instant, whereas SEO requires more of a “runway” to achieve results. This is the exact reason that these two teams should align on strategy. 

Let’s say you have identified some top new keywords to target and want to gain traffic via these keywords right away. While you wait for your optimized content to be crawled by Google, to mature, and to subsequently rank, the PPC team can immediately acquire traffic for these keywords.

Acquire traffic instantly via PPC during the "SEO runway" period

Once you are through the “SEO runway” and generating organic traffic for these keywords, the PPC team may then consider moving that spending to alternative keywords.

A common question is, “Should PPC target keywords that already perform well in SEO?” There is no right or wrong answer to this question, as all approaches have pros and cons.

By targeting the same keywords through SEO and PPC, you are fielding two results that compete against each other. Many believe this is a good thing, as it leads to more SERP “real estate,” which ultimately leads to more clicks overall.

That said, you will inevitably be paying for some clicks you would have already received for free through the organic result. This leads to a drop in organic traffic for the respective keywords.

Jamie’s verdict

I always review this on a case-by-case basis. More often than not, my recommendation is not to target the same keywords through both SEO and PPC. It’s impossible to rank in position #1 organically for every keyword relevant to your business. So I find it more effective to avoid the overlap and ensure PPC teams use their budget to target keywords that are yet to rank or are underperforming in SEO.

That said, if certain keywords are critical to a business, then there is certainly a business case to go for “SERP dominance” and target through both SEO and PPC.

Successful PPC campaigns can also indirectly have a positive impact on SEO. Backlinks are a key ranking factor in SEO. The more visibility your content receives, the more likely people are to link to your site. In the video below, Ahrefs’ Sam Oh explains how PPC advertising can help build those all-important links.

SEO and UX

SEOs and user experience teams have been prone to the odd fallout over strategy. In modern SEO, however, the two teams should be more aligned than ever.

Shady tactics that game the algorithm and offer a poor experience no longer work in SEO. Google’s algorithm is now much more advanced and looks to reward high-quality websites that provide a good experience to their users.

There are several user experience factors that influence SEO. Mobile optimization is one of the more prominent examples.

Most web users now use a mobile device instead of a desktop or tablet. This is reflected in Google’s algorithm, with mobile usability being an important ranking factor. Google will also predominantly crawl the mobile version of your website.

Another UX optimization, which is also a ranking signal in SEO, is page speed.

Page speed, albeit a more minor ranking signal, is used in the algorithm and is more important than ever in SEO following the introduction of Core Web Vitals as a ranking factor in 2021. Core Web Vitals focus on three key metrics that have a big impact on the user’s experience: Largest Contentful Paint (loading), First Input Delay (interactivity), and Cumulative Layout Shift (visual stability).

Both Core Web Vitals and mobile friendliness fall under Google’s “Page Experience” set of ranking signals. This also includes site security via SSL certification (HTTPS over HTTP) and not displaying intrusive interstitials (pop-ups).

Google's search signals for Page Experience

A third key optimization used in both UX and SEO is site structure. Ensuring your content is organized and internally linked helps users and bots discover your content.

Keen to hear more about the importance of site structure for both UX and SEO? Be sure to check out Michal Pecánek’s guide to website structure.

Bonus tip

Breadcrumbs are great for user experience. They allow users (and bots) to navigate through the site’s structure easily.

Breadcrumb linking is an undervalued aspect of internal linking. Breadcrumb links are highly effective at passing PageRank due to their prominent on-page location.

SEO and PR

Public relations (PR) can have a significant influence on SEO performance. So much so that SEOs have formed digital PR (DPR or sometimes “SEO PR”), a spin-off of traditional PR designed to focus on the areas that benefit SEO the most. 

While similar to traditional PR, DPR is more focused on building backlinks and growing brand awareness through online publications.

Pie chart showing the differences and overlapping aspects of traditional PR and digital PR

Link building is one of three key pillars in SEO. What sets DPR link building apart from the rest is that you build links from authoritative publications in a natural, “white hat,” and high-quality way.

SEOs, PRs, or DPRs can align with traditional PR teams by sharing media lists (often journalist contacts) and data. This allows for more efficiency as they work toward their respective goals.

Bonus tip

Be aware that PR experts can be territorial when it comes to outreach, but this is perfectly understandable. Let’s put ourselves in their shoes. They won’t want us to dive in and ruin relationships they have spent a lot of time building.

So how can we go about this? My colleague, Charlotte Crowther, who is the digital PR manager at Kaizen, shares her top three tips to ease this situation:

  1. Remind traditional PRs of the shared interests – Although we may have slightly different KPIs, we are working toward the same goal: getting the best coverage for our business.
  2. Give them more of an understanding of our process – Being transparent about processes can help ease concerns. Despite having PR in the name, DPRs approach things quite differently from traditional PRs.
  3. Set out the rules from the very beginning – Starting the relationship with strong communication will help create any required workarounds and avoid bumps in the road caused by a lack of communication.

Here’s an example of how you can build natural, high-quality backlinks through exciting digital PR campaigns.

At Kaizen, we worked with the folks at the startup DirectlyApply, who tasked us with a link building campaign amid the COVID-19 pandemic.

Enter Susan, the future of the remote worker. Susan is a shocking 3D model of a remote worker’s appearance after staying at home for 25 years.

Visualization of Susan, the future remote worker alongside examples of media coverage

Susan was the talk of the U.K., with several media outlets talking about the physical impacts of working from home. The campaign resulted in over 200 backlinks and over 400 pieces of coverage.

Graph showing backlinks built following email outreach; initial spike in backlinks acquired is followed by a stable increase
Image from the Overview report, via Ahrefs’ Site Explorer.

Not only did this campaign generate those all-important backlinks, but it also drove huge traction on social media. Susan generated over 60,000 shares, raising brand awareness even further.

SEO and social media

You might assume that SEO and social media teams have little in common. But there are tons of ways these teams should work together.

Social media is a great way to get eyes on your site, whether through traditional social media sites (such as Twitter, Instagram, and Facebook) or video marketing sites (such as YouTube and TikTok). As with all channels, the more people we have reading our content, the more likely we are to naturally build relevant backlinks.

Social media is great for generating that initial “buzz” around new content and directing traffic to our pages. Rand Fishkin calls this the “spike of hope.” After a short while, however, this excitement wanes and clicks dry up, leading to the “flatline of nope.”

The initial spike in traffic is followed by an immediate drop-off

This isn’t necessarily a bad thing. It’s how social media marketing works. You focus on one piece of content and move on to the next exciting piece of content quickly.

That’s precisely why these two channels should be working together to avoid the “spike of hope, flatline of nope” scenario. The social media team is on hand to deliver that instant boost in traffic for new content. Then the SEO team is on hand to provide consistent traffic.

Initial spike in traffic is followed by a consistent stream of traffic that's acquired organically

Not all content intended for SEO is guaranteed instant success on socials. Campaigns led by DPRs, however, are often exciting, engaging, and shareable. Keeping DPRs involved in this relationship benefits social media teams, as they can boost these campaigns through social media and reformat future content for social channels.

Looking to acquire traffic through Google Discover? In Michal’s blog on this topic, he discusses the correlation between posts that get traction on social media and those that perform well on Google Discover.

In a quirky social media test, JR Oakes encouraged his followers to engage with a low-quality post, receiving over 100 retweets, 50+ likes, and many replies. The result? JR’s article indeed landed in Google Discover.

Correlation does not equal causation, of course. That said, there’s no harm in giving your SEO content that extra boost through social media.

Final thoughts

We’ve seen how SEO can interact and work with other marketing channels and how important strong alignment is in today’s omnichannel marketing world.

It’s important to remember that all channels are working toward driving growth for your business. So working together well will bring out the best in each channel for optimal growth.

Key takeaways:

  • Align your SEO efforts with your strategic objectives
  • Use “share of search” as a predictive metric to calculate market share
  • Lean on PPC and social media to generate traffic during the “SEO runway” period
  • SEO and UX teams have a lot more in common in modern times
  • Ensure traditional PR and DPR teams are on the same page

Have any questions? Ping me on Twitter and let me know.




Ranking Factors & The Myths We Found


Yandex is the search engine with the majority of market share in Russia and the fourth-largest search engine in the world.

On January 27, 2023, it suffered what is arguably one of the largest data leaks a modern tech company has endured in many years – and its second leak in less than a decade.

In 2015, a former Yandex employee attempted to sell Yandex’s search engine code on the black market for around $30,000.

The initial leak in January this year revealed 1,922 ranking factors, of which more than 64% were listed as unused or deprecated (superseded and best avoided).

That initial leak was just the file labeled “kernel,” but as the SEO community and I delved deeper, more files were found; combined, they contain approximately 17,800 ranking factors.

When it comes to practicing SEO for Yandex, the guide I wrote two years ago, for the most part, still applies.

Yandex, like Google, has always been public with its algorithm updates and changes and, in recent years, with how it has adopted machine learning.

Notable updates from the past two to three years include:

  • Vega (which doubled the size of the index).
  • Mimicry (penalizing fake websites impersonating brands).
  • Y1 update (introducing YATI).
  • Y2 update (late 2022).
  • Adoption of IndexNow.
  • A fresh rollout and assumed update of the PF filter.

On a personal note, this data leak is like a second Christmas.

Since January 2020, I’ve run an SEO news website as a hobby dedicated to covering Yandex SEO and search news in Russia with 600+ articles, so this is probably the peak event of the hobby site.

I’ve also spoken twice at the Optimization conference – the largest SEO conference in Russia.

This is also a good test to see how closely Yandex’s public statements match the codebase secrets.

In 2019, working with Yandex’s PR team, I was able to interview engineers in their Search team and ask a number of questions sourced from the wider Western SEO community.

You can read the interview with the Yandex Search team here.

Whilst Yandex is primarily known for its presence in Russia, the search engine also has a presence in Turkey, Kazakhstan, and Georgia.

The data leak was believed to be politically motivated and the actions of a rogue employee, and contains a number of code fragments from Yandex’s monolithic repository, Arcadia.

Within the 44GB of leaked data, there’s information relating to a number of Yandex products, including Search, Maps, Mail, Metrika, Disk, and Cloud.

What Yandex Has Had To Say

As I write this post (January 31st, 2023), Yandex has publicly stated that:

the contents of the archive (leaked code base) correspond to the outdated version of the repository – it differs from the current version used by our services

And:

It is important to note that the published code fragments also contain test algorithms that were used only within Yandex to verify the correct operation of the services.

So, how much of this code base is actively used is questionable.

Yandex has also revealed that during its investigation and audit, it found a number of errors that violate its own internal principles, so it is likely that portions of this leaked code (that are in current use) may be changing in the near future.

Factor Classification

Yandex classifies its ranking factors into three categories.

This has been outlined in Yandex’s public documentation for some time, but I feel it is worth including here, as it helps us better understand the ranking factor leak.

  • Static factors – Factors that are related directly to the website (e.g. inbound backlinks, inbound internal links, headers, and ads ratio).
  • Dynamic factors – Factors that are related to both the website and the search query (e.g. text relevance, keyword inclusions, TF*IDF).
  • User search-related factors – Factors relating to the user’s query (e.g. the user’s location, query language, and intent modifiers).

The ranking factors in the document are tagged to match the corresponding category, with TG_STATIC and TG_DYNAMIC, and then TG_QUERY_ONLY, TG_QUERY, TG_USER_SEARCH, and TG_USER_SEARCH_ONLY.

Yandex Leak Learnings So Far

From the data thus far, below are some of the affirmations and learnings we’ve been able to make.

There is so much data in this leak, it is very likely that we will be finding new things and making new connections in the next few weeks.

These include:

  • A form of PageRank.
  • At some point Yandex utilized TF*IDF.
  • Yandex still uses meta keywords, which are also highlighted in its documentation.
  • Yandex has specific factors for medical, legal, and financial topics (YMYL).
  • It also uses a form of page quality scoring, but this is known (ICS score).
  • Links from high-authority websites have an impact on rankings.
  • There’s nothing new to suggest Yandex can crawl JavaScript yet outside of already publicly documented processes.
  • Server errors and excessive 4xx errors can impact ranking.
  • The time of day is taken into consideration as a ranking factor.

Below, I’ve expanded on some other affirmations and learnings from the leak.

Where possible, I’ve also tied these leaked ranking factors to the algorithm updates and announcements that relate to them, or where we were told about them being impactful.

MatrixNet

MatrixNet is mentioned in a few of the ranking factors; it was announced in 2009 and then superseded in 2017 by CatBoost, which was rolled out across the Yandex product sphere.

This further adds validity to comments directly from Yandex, and from one of the factor authors, DenPlusPlus (Den Raskovalov), that this is, in fact, an outdated code repository.

MatrixNet was originally introduced as a new, core algorithm that took into consideration thousands of ranking factors and assigned weights based on the user location, the actual search query, and perceived search intent.

It is typically seen as an early version of Google’s RankBrain, though they are in fact two very different systems. MatrixNet was launched six years before RankBrain was announced.

MatrixNet has also been built upon, which isn’t surprising, given it is now 14 years old.

In 2016, Yandex introduced the Palekh algorithm that used deep neural networks to better match documents (webpages) and queries, even if they didn’t contain the right “levels” of common keywords, but satisfied the user intents.

Palekh was capable of processing 150 pages at a time, and in 2017 was updated with the Korolyov update, which took into account more depth of page content, and could work off 200,000 pages at once.

URL & Page-Level Factors

From the leak, we have learned that Yandex takes into consideration URL construction, specifically:

  • The presence of numbers in the URL.
  • The number of trailing slashes in the URL (and whether they are excessive).
  • The number of capital letters in the URL.

Screenshot from author, January 2023
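As an illustration only – the leak names these signals but not how they are computed or weighted – a hypothetical Python sketch of extracting them from a URL might look like this:

from urllib.parse import urlparse

def url_factors(url: str) -> dict:
    # Hypothetical sketch of the URL-level signals named in the leak;
    # Yandex's actual computation and weighting are unknown.
    path = urlparse(url).path
    return {
        "digit_count": sum(ch.isdigit() for ch in path),
        "trailing_slashes": len(path) - len(path.rstrip("/")),
        "capital_letters": sum(ch.isupper() for ch in path),
    }

print(url_factors("https://example.com/Shop/Kayaks//123/"))
# {'digit_count': 3, 'trailing_slashes': 1, 'capital_letters': 2}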

The age of a page (document age) and the last updated date are also important, and this makes sense.

As well as document age and last update, a number of factors in the data relate to freshness – particularly for news-related queries.

Yandex formerly used timestamps, specifically not for ranking purposes but “reordering” purposes, but this is now classified as unused.

Also in the deprecated column is the use of keywords in the URL. Yandex has previously measured that three keywords from the search query in the URL would be an “optimal” result.

Internal Links & Crawl Depth

Whilst Google has gone on the record to say that for its purposes, crawl depth isn’t explicitly a ranking factor, Yandex appears to have an active piece of code that dictates that URLs that are reachable from the homepage have a “higher” level of importance.

Screenshot from author, January 2023

This mirrors John Mueller’s 2018 comments that Google gives “a little more weight” to pages linked directly from the homepage.

The ranking factors also highlight a specific token weighting for webpages that are “orphans” within the website linking structure.

Clicks & CTR

In 2011, Yandex released a blog post talking about how the search engine uses clicks as part of its rankings and also addresses the desires of the SEO pros to manipulate the metric for ranking gain.

Specific click factors in the leak look at things like:

  • The ratio of the number of clicks on the URL, relative to all clicks on the search.
  • The same as above, but broken down by region.
  • How often users click on the URL for the search query.
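To make these concrete, here is a small hypothetical Python sketch of the first two factors. The click log, URLs, and regions are made up for illustration; Yandex’s real aggregation is of course unknown:

from collections import defaultdict

# Hypothetical click log for one query: (region, clicked_url) pairs.
clicks = [
    ("moscow", "site-a.ru"), ("moscow", "site-b.ru"),
    ("moscow", "site-a.ru"), ("spb", "site-a.ru"),
    ("spb", "site-c.ru"),
]

def click_share(clicks, url):
    # Share of all recorded clicks for the query that landed on `url`.
    return sum(u == url for _, u in clicks) / len(clicks)

def click_share_by_region(clicks, url):
    # The same ratio, broken down by region.
    totals, hits = defaultdict(int), defaultdict(int)
    for region, u in clicks:
        totals[region] += 1
        hits[region] += (u == url)
    return {r: hits[r] / totals[r] for r in totals}

print(click_share(clicks, "site-a.ru"))            # 0.6
print(click_share_by_region(clicks, "site-a.ru"))  # {'moscow': 0.67, 'spb': 0.5}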

Manipulating Clicks

Manipulating user behavior, specifically “click-jacking”, is a known tactic within Yandex.

Yandex has a filter, known as the PF filter, that actively seeks out and penalizes websites that engage in this activity using scripts that monitor IP similarities and then the “user actions” of those clicks – and the impact can be significant.

The below screenshot shows the impact on organic sessions (сессии) after being penalized for imitating user clicks.

Image from Russian Search News, January 2023

User Behavior

The user behavior takeaways from the leak are some of the more interesting findings.

User behavior manipulation is a common SEO violation that Yandex has been combating for years. At the 2020 Optimization conference, then Head of Yandex Webmaster Tools Mikhail Slevinsky said the company is making good progress in detecting and penalizing this type of behavior.

Yandex penalizes user behavior manipulation with the same PF filter used to combat CTR manipulation.

Dwell Time

102 of the ranking factors contain the tag TG_USERFEAT_SEARCH_DWELL_TIME, and reference the device, user duration, and average page dwell time.

All but 39 of these factors are deprecated.

Screenshot from author, January 2023

Bing first used the term “dwell time” in a 2011 blog post, and in recent years, Google has made it clear that it doesn’t use dwell time (or similar user interaction signals) as a ranking factor.

YMYL

YMYL (Your Money, Your Life) is a concept well-known within Google and is not a new concept to Yandex.

Within the data leak, there are specific ranking factors for medical, legal, and financial content – but this was notably revealed in 2019 at the Yandex Webmaster conference, when the company announced the Proxima Search Quality Metric.

Metrika Data Usage

Six of the ranking factors relate to the usage of Metrika data for the purposes of ranking. However, one of them is tagged as deprecated:

  • The number of similar visitors from the YandexBar (YaBar/Ябар).
  • The average time spent on URLs from those same similar visitors.
  • The “core audience” of pages on which there is a Metrika counter [deprecated].
  • The average time a user spends on a host when accessed externally (from another non-search site) from a specific URL.
  • Average ‘depth’ (number of hits within the host) of a user’s stay on the host when accessed externally (from another non-search site) from a particular URL.
  • Whether or not the domain has Metrika installed.

In Metrika, user data is handled differently.

Unlike Google Analytics, there are a number of reports focused on user “loyalty” combining site engagement metrics with return frequency, duration between visits, and source of the visit.

For example, with one click, I can see a report breaking down individual site visitors:

Screenshot from Metrika, January 2023

Metrika also comes “out of the box” with heatmap tools and user session recording, and in recent years the Metrika team has made good progress in being able to identify and filter bot traffic.

With Google Analytics, there is an argument that Google doesn’t use UA/GA4 data for ranking purposes because of how easy it is to modify or break the tracking code – but Metrika counters are a lot more linear, and a lot of the reports are unchangeable in terms of how the data is collected.

Impact Of Traffic On Rankings

Following on from Metrika data as a ranking factor, these factors effectively confirm that direct traffic and paid traffic (buying ads via Yandex Direct) can impact organic search performance:

  • Share of direct visits among all incoming traffic.
  • Green traffic share (aka direct visits) – Desktop.
  • Green traffic share (aka direct visits) – Mobile.
  • Search traffic – transitions from search engines to the site.
  • Share of visits to the site not via links (i.e., the URL typed by hand or from bookmarks).
  • The number of unique visitors.
  • Share of traffic from search engines.

News Factors

There are a number of factors relating to “News”, including two that mention Yandex.News directly.

Yandex.News was an equivalent of Google News, but was sold to the Russian social network VKontakte in August 2022, along with another Yandex product “Zen”.

So, it’s not clear if these factors related to a product no longer owned or operated by Yandex, or to how news websites are ranked in “regular” search.

Backlink Importance

Yandex has similar algorithms to Google’s for combating link manipulation – and has had since the Nepot filter in 2005.

From reviewing the backlink ranking factors and some of the specifics in the descriptions, we can assume that the best practices for building links for Yandex SEO would be to:

  • Build links with a more natural frequency and varying amounts.
  • Build links with branded anchor texts as well as use commercial keywords.
  • If buying links, avoid buying links from websites that have mixed topics.

Below is a list of link-related factors that can be considered affirmations of best practices:

  • The age of the backlink is a factor.
  • Link relevance based on topics.
  • Backlinks built from homepages carry more weight than those from internal pages.
  • Links from the top 100 websites by PageRank (PR) can impact rankings.
  • Link relevance based on the quality of each link.
  • Link relevance, taking into account the quality of each link, and the topic of each link.
  • Link relevance, taking into account the non-commercial nature of each link.
  • Percentage of inbound links with query words.
  • Percentage of query words in links (up to a synonym).
  • The links contain all the words of the query (up to a synonym).
  • Dispersion of the number of query words in links.

However, there are some link-related factors that are additional considerations when planning, monitoring, and analyzing backlinks:

  • The ratio of “good” versus “bad” backlinks to a website.
  • The frequency of links to the site.
  • The number of incoming SEO trash links between hosts.

The data leak also revealed that the link spam calculator has around 80 active factors that are taken into consideration, with a number of deprecated factors.

This raises the question of how well Yandex is able to recognize negative SEO attacks, given that it looks at the ratio of good versus bad links, and how it determines what a bad link is.

A negative SEO attack is also likely to be a short burst (high frequency) link event in which a site will unwittingly gain a high number of poor quality, non-topical, and potentially over-optimized links.

Yandex uses machine learning models to identify Private Blog Networks (PBNs) and paid links, and it makes similar assumptions about link velocity and the time period over which links are acquired.

Typically, paid-for links are generated over a longer period of time, and these patterns (including link origin site analysis) are what the Minusinsk update (2015) was introduced to combat.

Yandex Penalties

There are two ranking factors, both deprecated, named SpamKarma and Pessimization.

Pessimization refers to reducing PageRank to zero and aligns with the expectations of severe Yandex penalties.

SpamKarma also aligns with assumptions made around Yandex penalizing hosts and individuals, as well as individual domains.

Onpage Advertising

There are a number of factors relating to advertising on the page, some of them deprecated (like the screenshot example below).

Screenshot from author, January 2023

It’s not known from the description exactly what the thought process behind this factor was, but it could be assumed that a high ratio of adverts to visible screen space was a negative factor – much like how Google takes umbrage when adverts obscure a page’s main content or are obtrusive.

Tying this back to known Yandex mechanisms, the Proxima update also took into consideration the ratio of useful and advertising content on a page.

Can We Apply Any Yandex Learnings To Google?

Yandex and Google are disparate search engines, with a number of differences, despite the tens of engineers who have worked for both companies.

Because of this overlap in talent, we can infer that some of these master builders and engineers will have built things in a similar fashion (though not direct copies) and applied learnings from previous iterations of their builds with their new employers.

What Russian SEO Pros Are Saying About The Leak

Much like the Western world, SEO professionals in Russia have been having their say on the leak across the various Runet forums.

The reaction in these forums has been different from that on SEO Twitter and Mastodon, with more of a focus on Yandex’s filters and on the other Yandex products that are optimized as part of wider Yandex optimization campaigns.

It is also worth noting that a number of conclusions and findings from the data match what the Western SEO world is also finding.

Common themes in the Russian search forums:

  • Webmasters asking for insights into recent filters, such as Mimicry and the updated PF filter.
  • The age and relevance of some of the factors, due to author names no longer being at Yandex, and mentions of long-retired Yandex products.
  • The main interesting learnings are around the use of Metrika data, and information relating to the Crawler & Indexer.
  • A number of factors outline the usage of DSSM, which in theory was superseded by Palekh, the machine learning search algorithm Yandex announced in 2016.
  • A debate around ICS scoring in Yandex, and whether or not Yandex may provide more traffic to a site and influence its own factors by doing so.

The leaked factors, particularly around how Yandex evaluates site quality, have also come under scrutiny.

There is a long-standing sentiment in the Russian SEO community that Yandex oftentimes favors its own products and services in search results ahead of other websites, and webmasters are asking questions like:

Why does it bother going to all this trouble, when it just nails its services to the top of the page anyway?

In loosely translated documents, these are referred to as the Sorcerers or Yandex Sorcerers. In Google, we’d call these search engine results page (SERP) features – like Google Hotels, etc.

In October 2022, Kassir (a Russian ticket portal) claimed ₽328m compensation from Yandex due to lost revenue, caused by the “discriminatory conditions” in which Yandex Sorcerers took the customer base away from the private company.

This is off the back of a 2020 class action in which multiple companies raised a case with the Federal Antimonopoly Service (FAS) over Yandex’s anticompetitive promotion of its own services.


Google Updates Search Console Video Indexing Report


Google’s updated Search Console Video indexing report now includes daily video impressions and a sitemap filter feature.

  • Google has updated the Search Console Video indexing report to provide more comprehensive insights into video performance in search results.
  • The updated report includes daily video impressions, which are grouped by page, and a new sitemap filter feature to focus on the most important video pages.
  • These updates are part of Google’s ongoing efforts to help website owners and content creators understand and improve the visibility of their videos in search results.




Bing Revamps Crawl System To Enhance Efficiency


According to a recent study by Bing, most websites have XML sitemaps, with the “lastmod” tag being the most critical component of these sitemaps.

The “lastmod” tag indicates the last time the webpages linked by the sitemap were modified and is used by search engines to determine how often to crawl a site and which pages to index.

However, the study also revealed that a significant number of “lastmod” values in XML sitemaps were set incorrectly, with the most prevalent issue being identical dates on all sitemaps.

Upon consulting with web admins, Microsoft discovered that the dates were set to the date of sitemap generation rather than content modification.

To address this issue, Bing is revamping its crawl scheduling stack to better utilize the information provided by the “lastmod” tag in sitemaps.

This will improve crawl efficiency by reducing unnecessary crawling of unchanged content and prioritizing recently updated content.

The improvements have already begun on a limited scale and are expected to fully roll out by June.

Additionally, Microsoft has updated sitemap.org for improved clarity by adding the following line:

“Note that the date must be set to the date the linked page was last modified, not when the sitemap is generated.”

How To Use The Lastmod Tag Correctly

To correctly set the “lastmod” tag in a sitemap, you should include it in the <url> tag for each page in the sitemap.

The date should be in W3C Datetime format, with the most commonly used formats being YYYY-MM-DD or YYYY-MM-DDThh:mm:ssTZD.

The date should reflect the last time the page was modified and should be updated regularly to ensure that search engines understand the relevance and frequency of updates.

Here’s an example code snippet:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
   <url>
      <loc>http://www.example.com/</loc>
      <lastmod>2023-01-23</lastmod>
   </url>
</urlset>
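To avoid the mistake Bing describes – stamping every URL with the sitemap’s generation date – each lastmod should be derived from the page’s own modification date. Here is a minimal Python sketch; the example URLs and dates are hypothetical, and in practice the dates would come from your CMS or source files:

from datetime import date

def build_sitemap(pages: dict[str, date]) -> str:
    # Each <lastmod> reflects the page's own last content modification,
    # not the date the sitemap was generated.
    entries = "\n".join(
        "   <url>\n"
        f"      <loc>{loc}</loc>\n"
        f"      <lastmod>{modified.isoformat()}</lastmod>\n"  # W3C YYYY-MM-DD
        "   </url>"
        for loc, modified in pages.items()
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

# Hypothetical pages and their last content-modification dates.
pages = {
    "http://www.example.com/": date(2023, 1, 23),
    "http://www.example.com/blog/": date(2022, 11, 5),
}
print(build_sitemap(pages))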

Google’s Advice: Use Lastmod Tag After Significant Changes Only

Google’s crawlers also utilize the “lastmod” tag, and the suggestions on using it by both major search engines are similar.

Google Search Advocate John Mueller recently discussed the lastmod tag in the January edition of Google’s office-hours Q&A sessions.

It’s worth noting that Google recommends only using the “lastmod” tag for substantial modifications, which was not mentioned in Microsoft’s blog post.

Changing the date in the lastmod tag after minor edits can be viewed as an attempt to manipulate search snippets.

In Summary

Microsoft’s recent study and efforts to improve the utilization of the “lastmod” tag in sitemaps will result in more efficient and effective webpage crawling.

Publishers are encouraged to regularly update their sitemaps and lastmod tags to ensure that their pages are correctly indexed and easily accessible by search engines.



Source: Microsoft


