Everyone panic – Google may not just make larger updates to the Google Webmaster Guidelines but also rename them. Gary Illyes from Google said that over time the Webmaster Guidelines have become confusing and somewhat outdated, and they do need an update.
He said this in the latest Search Off The Record podcast, at about 15 minutes in:
Gary Illyes said the Google Webmaster Guidelines are now more confusing, specifically because Google “started putting more stuff into it whenever there was a push for something like, for example, HTTPS that ended up in the Webmaster Guidelines. When it was speed, then it ended up in the Webmaster Guidelines. When there was something else, then it ended up in the Webmaster Guidelines,” he said. “And that’s not necessarily good, because the Webmaster Guidelines is– it should be more about what are the basic requirements for you to get into Search. Speed is not one of them, HTTPS is not one of them, so what are they doing there? And the other thing is that many of the quality guidelines were created, how many years is that? 15 years ago at least. And some of the things that we have in our Webmaster Guidelines don’t even exist anymore,” he added. Other examples he gave were the mention of guestbooks and an old SEO software package named WebPositionGold.
How can they be improved? One example he gave was “doorway pages.” He said “if you take a look at that page, it explains what doorway pages are, but it’s very general. And for example, if you have a site for every single state in the US, because you have shops in all those states, then technically, that’s a doorway page. But does it violate our guidelines? I don’t know. It depends on other factors, like are you trying to manipulate the search results or not?”
John asked if Google should split up the guidelines or not. He said, “I wonder if we should at some point kind of split them off into more technical guidelines, like what you need to do, and separate out the kind of the spam, low quality, manipulative stuff a little bit more. Because then, from a technical point of view, the developers would still have enough to look at and be like, ‘Oh, I’m fulfilling all of these requirements.’”
Gary suggested that if Google were to update the guidelines, it might cause a lot of concern and make “people freak out.” He said everyone “will start analyzing the changes, and then they might read something into it that’s not there. And that’s what we want to preempt, ideally.”
They then go through more examples of issues with the current Google Webmaster Guidelines, such as misunderstandings around the cloaking issue and how intent really matters. Plus, Google might rename the Google Webmaster Guidelines to Guideliney McGuidelineFace.
So expect more changes to the Google Webmaster Guidelines in 2022, but don’t freak out about it.
Forum discussion at Twitter.
Google Says They Have Algorithms To Detect & Demote AI Plagiarized Content
Duy Nguyen from Google’s search quality team said in the Google office hours video that Google has “algorithms to go after” those who post AI plagiarized content, then the algorithms can “demote site scraping content from other sites.”
The question, asked at the 9:19 mark, was: “How should content creators respond to sites that use AI to plagiarize the content, modify it, and then outrank them in search results?”
Duy Nguyen said, “Scraping content, even with some modification, is against our spam policy.” Duy added that Google has “many algorithms to go after such behaviors and demote site scraping content from other sites.”
And if Google’s systems miss something? “If you come across sites that repeatedly scrape content, that perform well on Search, please feel free to report them to us, with our spam report form so that we can further improve our systems, both in detecting the spam and also ranking overall,” he added.
Later on in the video, at the 17:05 mark, a similar question was asked and answered by Duy:
Kunal asked why Google is not taking action on copied or spun web stories, and whether Google can check on Discover.
Duy answered: “Thank you for the report. We are aware of these attempts and we are looking into them. In general, sites with spammy scraped content violate our spam policy, and our algorithms do a pretty good job of demoting them in search results.”
Forum discussion at Twitter.