

Google’s John Mueller On Link Velocity and Penalties


Google’s John Mueller answered a question about whether acquiring links too quickly can trigger a penalty. The rate at which links are acquired is known in the SEO community as link velocity, and Mueller’s answer provided insight into whether getting links too fast results in penalties.

Background of Link Velocity

Some of the people who promote the idea of link velocity don’t cite patents or research papers to support their ideas, which makes their claims speculative rather than factual.

It’s important to point out that the idea of link velocity was created by the SEO community.

The idea is based on the discovery of a patent named Information Retrieval Based on Historical Data, which, among many other things, mentions measuring the growth of links over time.

This patent covers a lot of ground. For example, it discusses understanding whether an older web page is outdated and whether a newer page is more relevant.

Link velocity is the idea that a high rate of link growth is a bad thing. The patent describes how a new site with a high rate of link growth can be judged to be more relevant than an older site.

The patent contains information that contradicts the idea of link velocity.

This is what it says:


“Consider the example of a document with an inception date of yesterday that is referenced by 10 back links.

This document may be scored higher by search engine 125 than a document with an inception date of 10 years ago that is referenced by 100 back links because the rate of link growth for the former is relatively higher than the latter.”

See how that contradicts the idea of link velocity?

That passage highlights the propensity of some SEOs to pick which part of a patent they will believe because it fits their experience and which part they choose to ignore because it does not fit their narrative of how search engines work.

The patent has more to say about links:

“While a spiky rate of growth in the number of back links may be a factor used by search engine 125 to score documents, it may also signal an attempt to spam search engine 125. Accordingly, in this situation, search engine 125 may actually lower the score of a document(s) to reduce the effect of spamming.”

There it is. That’s where the idea of link velocity originated. Except it isn’t actually proof that link velocity exists.

The patent doesn’t explicitly say that the rate of growth by itself is the reason the search engine might lower a document’s score.

It says that a “spiky rate of growth” in backlinks could cause the search engine to lower the score.

That’s not just semantics. The patent uses the word “spiky” one more time in the context of web graphs. A web graph is a map of the Internet as connected by links.


This is what the patent says:

“Naturally developed web graphs typically involve independent decisions.

Synthetically generated web graphs, which are usually indicative of an intent to spam, are based on coordinated decisions, causing the profile of growth in anchor words/bigrams/phrases to likely be relatively spiky.”

What that patent is really talking about is the smooth natural rate of growth versus a spiky and unnatural rate of growth.

A spiky rate of growth can manifest over the course of months. That’s a big difference from the link velocity idea, which proposes that a large number of links acquired in a short period will result in a penalty.
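
To make the “smooth versus spiky” distinction concrete, here is a purely hypothetical Python sketch that flags a monthly link profile whose biggest month dwarfs its typical month. The counts, the helper name looks_spiky, and the ratio threshold are all invented for illustration; this is not how Google scores anything.

```python
from statistics import median

# Hypothetical monthly counts of newly acquired backlinks for two sites.
natural_growth = [12, 15, 14, 18, 17, 20, 22, 21]  # steady, independent growth
spiky_growth = [3, 2, 4, 180, 2, 3, 150, 4]        # isolated, coordinated bursts

def looks_spiky(monthly_new_links, spike_ratio=5.0):
    """Flag a link profile whose largest month dwarfs its typical month."""
    typical_month = median(monthly_new_links)
    return max(monthly_new_links) > spike_ratio * typical_month

print(looks_spiky(natural_growth))  # False: growth is relatively even
print(looks_spiky(spiky_growth))    # True: a few months tower over the rest
```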

A site that suddenly becomes popular and acquires a lot of links quickly could be a sign of increased topicality. In that case Google would actually promote that page higher. That’s part of the idea behind Query Deserves Freshness.

A Google update from 2011, Query Deserves Freshness, promotes new content that is topical, which can be signaled by an increase in recent links.

So to wrap up:

  1. The patent does not mention link velocity. The word velocity isn’t mentioned.
  2. The patent describes a “spiky” rate of growth as a spam signal.
  3. It discusses rewarding sites that obtain links quickly.
  4. The patent is from 2003.

Yes, that’s an old patent. So apart from the fact that the patent discusses rewarding quickly obtained links and describes spiky rates of growth rather than link velocity, its age matters too.

That makes it less likely to still be a significant part of today’s algorithms. Even PageRank was replaced in 2006.


So all of that is the background on link velocity.

This is What Mueller Says About Link Velocity

Will Link Velocity Cause a Penalty?

This is the question:

“If I build 200 backlinks in two days and didn’t perform any link building for years will Google still see this as black hat and penalize me?

What about link velocity?”

John Mueller answered:

“From my point of view if you’re jumping in with a question like this and you’re saying I’m going to get 200 backlinks in two days… then that sounds a lot like you’re not getting natural backlinks.

That sounds a lot like you’re going off and just buying them or having someone buy them for you. And that itself would be kind of the thing that we would not be happy with.”

Whether a Link is Natural is What Counts

Mueller is setting aside the link velocity question and focusing on how natural the links are.

He specifically says that the unnatural nature of purchased links is what will cause Google to take action, not the speed of the link acquisition.

Google’s John Mueller Addresses Link Velocity

Mueller then circles back and addresses the so-called “link velocity.”


This is what John Mueller says about link velocity:

“So it’s not so much a matter of how many links you get in which time period. It’s really just… if these are links that are unnatural or from our point of view problematic then they would be problematic.

It’s like it doesn’t really matter how many or in which time.”

That is a clear statement that the quality of the links, whether they are natural or unnatural, is what counts.

Mueller states that the rate of link acquisition and the time period in which those links are acquired are not factors.

Some in the industry will continue to hold on to the idea of link velocity. Many will say that their experience proves it exists.

But what one sees is one thing; what caused it is another. Those are two different things.

I have provided the background showing where the idea of link velocity came from and why it has never been an accurate SEO theory. John Mueller’s response seems to confirm that the concept of link velocity is not a factor. More importantly, Mueller confirms that it’s factors specific to the links themselves that matter.



Freshness & SEO: An Underrated Concept




During my time in search, I’ve changed my perspective on certain ranking factors. For instance, after coming to Go Fish Digital and working on internal linking initiatives, I started to realize the power of internal links over time. By implementing internal links at scale, we were able to see consistent success.

Freshness is another one of these factors. After working with a news organization and testing the learnings gained from that work on other sites, I started to see the immense power that content refreshes could produce. As a result, I think the entire SEO community has underrated this concept for quite some time. Let’s dig into why.

Reviewing news sites

This all started when we began working with a large news publisher who was having trouble getting into Google’s Top Stories for highly competitive keywords. They consistently found that their content couldn’t get included in this feature, and they wanted to know why.

Inclusion in “Top stories”

We began to perform a lot of research around news outlets that seemed quite adept at getting included in Top Stories. This immediately turned our attention to CNN, the site that is by far the most skilled in acquiring coveted Top Stories positions.

Diving into their strategies, we noticed one consistent trend: they would always create a brand new URL on the day they wanted to be included in the Top Stories carousel.

As an example, here you can see that they create a unique URL for their rolling coverage of the Russia-Ukraine war. Since they know that Google will show Top Stories results daily for queries around this, they create brand new URLs every single day:

    • cnn.com/europe/live-news/russia-ukraine-war-news-05-16-22/index.html

    • cnn.com/europe/live-news/russia-ukraine-war-news-05-21-22/index.html

    • cnn.com/europe/live-news/russia-ukraine-war-news-05-23-22/index.html

This flies in the face of traditional SEO advice, which says site owners should keep URLs consistent so that link equity isn’t diluted and keywords aren’t cannibalized. But to be eligible for Top Stories, a “fresh” URL needs to be indexed in order for the content to qualify.
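
To make the pattern concrete, here is a minimal Python sketch of how a publisher’s CMS might template these dated URLs. The helper name and the hard-coded base path are assumptions for illustration, not CNN’s actual implementation.

```python
from datetime import date, timedelta

def daily_live_url(topic_slug: str, day: date) -> str:
    """Build a dated live-coverage URL following the MM-DD-YY pattern shown above."""
    return (
        "https://www.cnn.com/europe/live-news/"
        f"{topic_slug}-{day.strftime('%m-%d-%y')}/index.html"
    )

# Generate a fresh URL for each day of rolling coverage.
start = date(2022, 5, 21)
for offset in range(3):
    print(daily_live_url("russia-ukraine-war-news", start + timedelta(days=offset)))
# .../russia-ukraine-war-news-05-21-22/index.html, then -05-22-22, then -05-23-22
```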

After we started implementing the strategy of creating unique URLs every day, we saw much more consistent inclusion for this news outlet in Top Stories for their primary keywords.

However, the next question we wanted to address was not just how to get included in this feature, but also how to maintain strong ranking positions once there.

Ranking in “Top stories”

The next element we looked at was how frequently competitors were updating their stories once in the Top Stories carousel, and we were surprised at how often top news outlets refresh their content.

We found that competitors were aggressively updating their timestamps. For one query, when reviewing three articles over a four-hour period, we found the average time between updates for major outlets:

  1. USA Today: Every 8 minutes

  2. New York Times: Every 27 minutes

  3. CNN: Every 28 minutes

For this particular query, USA Today was literally updating their page every 8 minutes and maintaining the #1 ranking position for Top Stories. Clearly, they were putting a lot of effort into the freshness of their content.
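
For anyone who wants to replicate that measurement, the arithmetic is straightforward. Here is a minimal Python sketch using hypothetical observed timestamps rather than the actual data from our review.

```python
from datetime import datetime

def average_minutes_between_updates(timestamps):
    """Average gap, in minutes, between consecutive observed timestamp updates."""
    times = sorted(datetime.fromisoformat(t) for t in timestamps)
    gaps = [(later - earlier).total_seconds() / 60
            for earlier, later in zip(times, times[1:])]
    return sum(gaps) / len(gaps)

# Hypothetical timestamps scraped from one article during the review window.
observed = [
    "2022-05-25T09:00:00",
    "2022-05-25T09:08:00",
    "2022-05-25T09:17:00",
    "2022-05-25T09:24:00",
]
print(round(average_minutes_between_updates(observed)))  # 8
```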

But what about the rest of us?

Of course, it’s obvious how this would apply to news sites. There is certainly no other vertical where the concept of “freshness” carries more weight with the algorithm. However, this got us thinking about how valuable this concept would be to the broader web. Are other sites doing this, and would it be possible to see SEO success by updating content more frequently?

Evergreen content

Fortunately, we were able to perform even more research in this area. Our news client also had many non-news specific sections of their site. These sections contain more “evergreen” articles where more traditional SEO norms and rules should apply. One section of their site contains more “reviews” type of content, where they find the best products for a given category.

When reviewing articles for these topics, we also noticed patterns around freshness. In general, high ranking articles in competitive product areas (electronics, bedding, appliances) would aggressively update their timestamps on a monthly (sometimes weekly) cadence.


For example, as of the date of this writing (May 25th, 2022), I can see that all of the top three articles for “best mattress” have been updated within the last 7 days.

Looking at the term “best robot vacuum”, it looks like all of the articles have been updated in the last month (as of May 2022).

Even though these articles are more “evergreen” and not tied to the news cycle, it’s obvious that these sites are placing a high emphasis on freshness with frequent article updates. This indicated to us that there might be more benefits to freshness than just news story results.

Performing a test

We decided to start testing the concept of freshness on our own blog to see what the impact of these updates could be. We had an article on automotive SEO that used to perform quite well for “automotive seo” queries. However, in recent years, this page lost a lot of organic traffic.

The article still contained evergreen information, but it hadn’t been updated since 2016.

It was the perfect candidate for our test. To perform this test, we made only three changes to the article:

  1. Updated the content to ensure it was all current. This changed less than 5% of the text.

  2. Added “2022” to the title tag.

  3. Updated the timestamp.

Immediately, we saw rankings improve for the keyword “automotive seo”. We moved from ranking on the third page to the first page the day after we updated the content.

To verify these results, we tested this concept on another page. For this next article, we only updated the timestamp and title tag with no changes to the on-page content. While we normally wouldn’t recommend doing this, this was the only way we could isolate whether “freshness” was the driving change, and not the content adjustments.

However, after making these two updates, we could clearly see an immediate improvement to the traffic of the second page.

These two experiments combined with other tests we’ve performed are showing us that Google places value on the recency of content. This value extends beyond just articles tied to the news cycle.

Why does Google care?

E-A-T considerations

Thinking about this more holistically, Google’s use of freshness makes sense in the context of its E-A-T initiatives. The whole concept of E-A-T is that Google wants to rank content it can trust (written by experts, citing facts) above other search results. Google has a borderline public responsibility to ensure that the content it serves is accurate, so it’s in the search giant’s best interest to surface content it thinks it can trust.

So how does freshness play into this? Well, if Google thinks content is outdated, how is it supposed to trust that the information is accurate? If the search engine sees that your article hasn’t been updated in five years while competitors have more recent content, that might be a signal that their content is more trustworthy than yours.

For example, for the term “best camera phones”, would you want to read an article last updated two years ago? For that matter, would you even want an article last updated six months ago?

As we can see, Google is only ranking pages that have been updated within the last one or two months. That’s because the technology changes so rapidly in this space that, unless you’re updating your articles every couple of months or so, you’re dramatically behind the curve.

Marketplace threats

The concept of freshness also makes sense from a competitive perspective. One of the biggest weaknesses of an indexation engine is that it’s inherently hard to serve real-time results. To find when content changes, a search engine needs time to recrawl and reindex content. When combined with the demands of crawling the web at scale, this becomes extremely difficult.


On the other hand, social media sites like Twitter don’t have this issue and are made to serve real-time content. The platform isn’t tasked with indexing results, and engagement metrics can help quickly surface content that’s gaining traction. As a result, Twitter does a much better job of surfacing trending content.

Thinking about the web from a platform-based perspective, it makes sense that most users would choose Twitter over Google when looking for real-time information. This poses a big threat to Google, as it gives users a reason to migrate off the ecosystem, presenting fewer opportunities to serve ads.

Recently, you’ll see a lot more “live blog posts” in Top Stories. These articles use LiveBlogPosting structured data, which signals to Google that the content is being updated in real time. While looking for real-time URLs across the entire web is daunting, this structured data type can help Google home in on the content it needs to crawl and index more frequently.
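
As a rough illustration of what that markup can look like, here is a minimal LiveBlogPosting payload built and serialized in Python. All values are placeholders, and Google’s structured data documentation defines the full set of required and recommended properties.

```python
import json

# Minimal live-blog payload using the schema.org LiveBlogPosting type.
# Every value below is a placeholder for illustration only.
live_blog = {
    "@context": "https://schema.org",
    "@type": "LiveBlogPosting",
    "headline": "Live coverage: example rolling news event",
    "datePublished": "2022-05-25T08:00:00+00:00",
    "dateModified": "2022-05-25T09:24:00+00:00",
    "coverageStartTime": "2022-05-25T08:00:00+00:00",
    "coverageEndTime": "2022-05-25T20:00:00+00:00",
    "liveBlogUpdate": [
        {
            "@type": "BlogPosting",
            "headline": "First update of the day",
            "datePublished": "2022-05-25T08:05:00+00:00",
            "articleBody": "Short summary of the latest development.",
        }
    ],
}

# Print the JSON-LD that would be embedded in the page inside a script tag
# with type="application/ld+json".
print(json.dumps(live_blog, indent=2))
```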

Google seems to be aggressively pushing these live blogs, as they often see strong visibility in Top Stories results.

This might be a strategic move to encourage publishers to create real-time content. The goal here could be increased adoption of content that’s updated in real-time with the end result of showcasing to users that they can get this type of content on Google, not just Twitter.

Utilizing these concepts moving forward

I think that as an industry, there’s sometimes room for us to be more creative when thinking about our on-page optimizations. When looking at pages that have lost traffic and rankings over time, we could take freshness into consideration and check whether the content has also become outdated. Through testing and experimentation, you can see whether refreshing your content has a noticeable positive impact on rankings.
