SEARCH ENGINES
Google Says You Can Restrict XML Sitemaps To Search Engines
Google’s John Mueller said it is acceptable to restrict access to your XML sitemaps to search engines only. In other words, you can technically block humans from seeing your XML sitemaps while still allowing Google to access them.
This came up when Christoph Cemper asked about some sites using Cloudflare to restrict access to their XML sitemap files. Christoph asked, “Seen an xml sitemap “protected” by Cloudflare. Wondering if Googlebot would type in that Captcha, or just retry later, or just ignore the map. Any experiences/guidance @JohnMu on such a case?”
John responded on Twitter saying “That’s fine. These sitemap files are for search engines, and some sites prefer to restrict their access accordingly.”
Here are those tweets:
That’s fine. These sitemap files are for search engines, and some sites prefer to restrict their access accordingly.
— 🐝 johnmu.xml (personal) 🐝 (@JohnMu) June 6, 2022
I guess it depends on what you’re trying to do. If you don’t want random people crawling your sitemap, then by all means block them all.
— 🐝 johnmu.xml (personal) 🐝 (@JohnMu) June 8, 2022
This would clearly not be a form of cloaking: XML sitemap files are designed for search engines, not humans, so people do not need to see them. This obviously would not apply to HTML sitemaps, which are meant for human visitors.
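Neither John nor the article spells out how such a restriction should be implemented; the sites Christoph saw used a Cloudflare challenge. As a rough sketch of the general idea, the snippet below applies the reverse-then-forward DNS check that Google documents for verifying Googlebot and serves the sitemap only to requests that pass it. The Flask route, file path, and function names are illustrative assumptions, not anything from the article; checking the User-Agent string alone would be trivially spoofable, which is why the DNS verification is used here.

```python
import socket

# Hostname suffixes Google documents for genuine Googlebot crawlers.
GOOGLEBOT_SUFFIXES = (".googlebot.com", ".google.com")


def is_verified_googlebot(ip: str) -> bool:
    """Verify an IP as Googlebot via the reverse-then-forward DNS lookup
    that Google recommends for crawler verification."""
    try:
        # Reverse lookup: the PTR record should end in googlebot.com or google.com.
        hostname, _, _ = socket.gethostbyaddr(ip)
        if not hostname.endswith(GOOGLEBOT_SUFFIXES):
            return False
        # Forward lookup: the hostname must resolve back to the same IP,
        # otherwise the PTR record could simply be spoofed.
        return socket.gethostbyname(hostname) == ip
    except (socket.herror, socket.gaierror):
        # No PTR record, or the hostname did not resolve.
        return False


# Hypothetical Flask route that only serves sitemap.xml to verified crawlers.
from flask import Flask, abort, request, send_file

app = Flask(__name__)


@app.route("/sitemap.xml")
def sitemap():
    # Assumes the app sees the real client IP; behind a CDN or proxy you
    # would need the forwarded client address instead of remote_addr.
    if not is_verified_googlebot(request.remote_addr):
        abort(403)  # Human visitors and unverified bots get a plain 403.
    return send_file("sitemap.xml", mimetype="application/xml")
```

In practice the DNS lookups would typically be cached, and other search engines (Bing, Yandex, etc.) publish similar verification methods if you want to allow them as well.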
Forum discussion at Twitter.
Source: www.seroundtable.com