
Google Considers Reducing Webpage Crawl Rate


Google may reduce how frequently it crawls webpages as it grows more conscious of the sustainability of crawling and indexing.

This topic is discussed by Google’s Search Relations team, which is made up of John Mueller, Martin Splitt, and Gary Illyes.

Together, in the latest episode of the Search Off the Record podcast, they discuss what to expect from Google in 2022 and beyond.

Among the topics they address is crawling and indexing, which SEO professionals and website owners say they’ve seen less of over the past year.

That’s going to be a key focus for Google this year as it aims to make crawling more sustainable by conserving computing resources.


Here’s what that will mean for your website and its performance in search results.

Sustainability Of Crawling & Indexing

Because Googlebot's crawling and indexing happens virtually, you might not think of it as something that has an impact on the environment.

Illyes calls attention to the issue, saying that computing in general isn't sustainable:

“… what I mean is that computing, in general, is not really sustainable. And if you think of Bitcoin, for example, Bitcoin mining has real impact on the environment that you can actually measure, especially if the electricity is coming from coal plants or other less sustainable plants.

We are carbon-free, since I don’t even know, 2007 or something, 2009, but it doesn’t mean that we can’t reduce even more our footprint on the environment. And crawling is one of those things that early on, we could chop off some low-hanging fruits.”

The low-hanging fruit, in this instance, refers to unnecessary web crawling, such as crawling webpages that haven't had any recent updates.

How Will Google Make Crawling More Sustainable?

Illyes goes on to explain that web crawling can be made more sustainable by cutting down on refresh crawls.


There are two types of Googlebot crawling: crawling to discover new content and crawling to refresh existing content.

Google is considering scaling back on crawling to refresh content.

Illyes continues:

“… one thing that we do, and we might not need to do that much, is refresh crawls. Which means that once we discovered a document, a URL, then we go, we crawl it, and then, eventually, we are going to go back and revisit that URL. That is a refresh crawl.

And then every single time we go back to that one URL, that will always be a refresh crawl. Now, how often do we need to go back to that URL?”

He goes on to give an example of websites that warrant frequent refresh crawls for some parts of the site, but not others.

A website like The Wall Street Journal constantly updates its homepage with new content, so it warrants a lot of refresh crawls.


However, the WSJ likely isn't updating its About page as frequently, so Google doesn't need to keep refresh crawling those types of pages.

“So you don’t have to go back there that much. And often, we can’t estimate this well, and we definitely have room for improvement there on refresh crawls. Because sometimes it just seems wasteful that we are hitting the same URL over and over again.

Sometimes we are hitting 404 pages, for example, for no good reason or no apparent reason. And all these things are basically stuff that we could improve on and then reduce our footprint even more.”
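If you're curious how much of this repeat crawling your own site receives, one rough way to check is to tally Googlebot requests in your server access logs. The snippet below is a minimal sketch, not an official tool: it assumes a combined-format log at ./access.log, a simple user-agent substring check, and GET/HEAD requests, all of which you'd adapt to your own setup.

```python
import re
from collections import Counter

# Minimal sketch: count how often Googlebot re-requests the same URL,
# and how many of those requests returned 404. Assumes a combined-format
# access log at ./access.log; adjust the path and parsing for your server.
LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

hits = Counter()       # Googlebot requests per URL
not_found = Counter()  # Googlebot requests per URL that returned 404

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:  # crude user-agent check (assumption)
            continue
        match = LOG_LINE.search(line)
        if not match:
            continue
        path, status = match.group("path"), match.group("status")
        hits[path] += 1
        if status == "404":
            not_found[path] += 1

# URLs requested more than once are candidates for repeat (refresh) crawls.
refresh_candidates = {url: n for url, n in hits.items() if n > 1}
print(f"URLs re-crawled by Googlebot: {len(refresh_candidates)}")
print(f"Googlebot requests that returned 404: {sum(not_found.values())}")
for url, n in sorted(refresh_candidates.items(), key=lambda item: -item[1])[:10]:
    print(f"{n:5d}  {url}")
```

Note this only shows what Googlebot is doing on your server today; it doesn't change how Google schedules crawls.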

If Google were to cut down on refresh crawls, which is not 100% confirmed, here’s the impact that could have on your website.

What Does A Reduction In Crawl Rate Mean For Your Website?

There’s a belief out there that a high crawl rate is a positive SEO signal, even if you’re not updating your content as often as Google is crawling it.

That’s a misconception, Illyes says, as content will not necessarily rank better because it gets crawled more.

Mueller:


“So I guess that’s kind of also a misconception that people have in that they think if a page gets crawled more, it’ll get ranked more. Is that correct that that’s a misconception, or is that actually true?”

Illyes:

“It’s a misconception.”

Mueller:

“OK, so no need to try to force something to be re-crawled if it doesn’t actually change. It’s not going to rank better.”

Again, it’s not confirmed that Google will reduce refresh crawls, but it’s an idea the team is actively considering.

If Google follows through on this idea, it won't be a bad thing for your website. More crawling doesn't mean better rankings.

Moreover, the idea is to learn which pages need refresh crawls and which pages do not. That means the pages you change more often will continue to be refreshed and updated in search results.
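One signal you control that can help crawlers tell which pages actually changed is an accurate sitemap with lastmod dates. The sketch below is a hypothetical illustration, not something Google prescribes: it rebuilds a sitemap from file modification times, and the site URL and "public" directory of HTML files are assumptions you'd replace with your own build process.

```python
from datetime import datetime, timezone
from pathlib import Path
from xml.etree import ElementTree as ET

# Minimal sketch: generate a sitemap whose <lastmod> values reflect when each
# page file was actually modified. SITE and PAGES_DIR are illustrative.
SITE = "https://www.example.com"
PAGES_DIR = Path("public")  # hypothetical directory of published HTML files

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")

for page in sorted(PAGES_DIR.rglob("*.html")):
    modified = datetime.fromtimestamp(page.stat().st_mtime, tz=timezone.utc)
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = f"{SITE}/{page.relative_to(PAGES_DIR).as_posix()}"
    ET.SubElement(url, "lastmod").text = modified.strftime("%Y-%m-%d")

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Keeping lastmod honest, rather than stamping every URL with today's date, is the point: the goal is to make real changes easy to spot, not to invite more crawling.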

For more details on how Google plans to pull this off, listen to the full discussion in the podcast below (starting at the 2:40 mark):



Featured Image: Alena Veasey/Shutterstock



