Google Says Stop Using 403s or 404s To Reduce Googlebot Crawl Rates

Gary Illyes posted a new blog post on the Google Search Central site asking all of you to stop using 403 and 404 server status codes to reduce the crawl rate of Googlebot. He said they have seen an uptick in the number of sites and CDNs doing this and they need to cut it out.

Gary wrote, “Over the last few months we noticed an uptick in website owners and some content delivery networks (CDNs) attempting to use 404 and other 4xx client errors (but not 429) to attempt to reduce Googlebot’s crawl rate.” He added, “The short version of this blog post is: please don’t do that.”

Instead, he said Google has documentation about how to reduce Googlebot’s crawl rate. “Read that instead and learn how to effectively manage Googlebot’s crawl rate,” he added.
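To make the distinction concrete, here is a minimal sketch of the idea in Python. The function, threshold, and header values are hypothetical, not from Google's documentation; the point it illustrates is the one from the post: 403/404 are not crawl-rate signals, while 429 (the one 4xx code Gary explicitly excluded) does tell a crawler to slow down.

```python
def crawler_response(requests_in_window: int, limit: int) -> tuple[int, dict]:
    """Pick a status code for a crawler request when the server is overloaded.

    Hypothetical example: rather than answering 403 or 404 (which, per the
    post, will not reduce Googlebot's crawl rate and may have the opposite
    effect), answer 429 Too Many Requests with a Retry-After hint.
    """
    if requests_in_window > limit:
        # 429 is the 4xx exception Gary called out ("but not 429");
        # Retry-After suggests when the crawler should come back (seconds).
        return 429, {"Retry-After": "3600"}
    # Under the limit: serve the page normally.
    return 200, {}
```

For example, `crawler_response(150, 100)` yields a 429 with a `Retry-After` header, while `crawler_response(50, 100)` serves normally with a 200.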

Gary also posted on LinkedIn saying, “Friday rumble… ramble? One of those. Anyway: the 403 and 404 status codes will not help you quickly reduce crawl rate. If anything, they might have the opposite effect. We have documentation about how to reduce crawl rate and unsurprisingly 403/404 is not in them.”

There are more details in the blog post.

Forum discussion at LinkedIn.
