SEARCH ENGINES
Listen To Google Talk About de-SEOing The Search Central Website

Google’s Martin Splitt, Gary Illyes, and Lizzi Sassman sat down on the latest Search Off the Record to talk about de-SEOing – what we would probably call over-optimization strategies. It is an interesting listen; it is stuff you have probably heard before, but it is interesting to hear Googlers who are intimately familiar with how Google Search works tackle this issue on their own site.
Here is the embed:
One thing: they take credit for coining the name “de-SEOing,” but what they describe as de-SEOing is what SEOs would call working on over-optimization – something both Gary (who was on the podcast) and Matt Cutts have talked about before. Here is where Gary mentioned it:
That is totally a thing, but I can’t think of a better name for it. It is literally optimizing so much that eventually it starts hurting
— Gary 鯨理/경리 Illyes (@methode) May 24, 2017
Before listening, it might make sense to read about Google relaunching the Google Webmasters brand to the Search Central brand and then the year later progress report.
One final point: at the end, around the 27-minute mark, Gary talks about how he finds it annoying that he cannot use the internal Google tools to debug these issues. He has to use the public tools that you or I use, such as Google Search Console, because otherwise it would not be fair. He said:
And it’s also kind of annoying because when we are looking into how people reach those pages, I could use the internal tools. It could be very simple to do it. Like I could just hit up the debug interfaces that we have and tools and just look at how it happens. But we can’t do that. We actually have to use the same tools that anyone else, external to Google, has to use or could use. And it’s very annoying because the information is less obviously. I know why it’s less than what we have internally. Still annoying. But yeah, we will have to figure it out with Search Console and other similar tools.
Forum discussion at Twitter.
SEARCH ENGINES
Google Blasts Agencies That Sell Link Building & Disavow Link Services

John Mueller of Google blasted SEO or marketing agencies that sell both link-building services and disavow link services. He said on Twitter, “These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.”
John added that it’s “all made up and irrelevant.”
Also, when asked if they should disavow links, John replied yesterday on Twitter, “Don’t waste your time on it; do things that build up your site instead.”
Here is the chain of tweets, so you can see the context that John is replying to.
Ryan Jones does his rant:
I’m still shocked at how many seos regularly disavow links. Why? Unless you spammed them or have a manual action you’re probably doing more harm than good.
— Ryan Jones (@RyanJones) January 31, 2023
Here is the chain that follows:
There are no clear instructions from google’s side that what kind of links we should disavow even Google created confusion on this by saying no need to disavow blogspot, shaddy links, adult site links.
— Saurabh Rawat (Tech SEO) (@SEOGuruJaipur) January 31, 2023
I’ve personally never seen that type of negative seo actually work without some sort of hacking or the site itself having some sort of issue the links exploit.
— Ryan Jones (@RyanJones) January 31, 2023
That’s all made up & irrelevant. These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.
— John Mueller is watching out for Google+ 🐀 (@JohnMu) January 31, 2023
Pretty strong words from John, don’t you think?
Here is the second part:
Not unless it was placed with ‘intent to manipulate’ by someone. Its important to know you have those though as they could be impacting relevancy or the site owners opinion on how many valid links you actually have etc
— Paul Madden (@PaulDavidMadden) January 31, 2023
Don’t waste your time on it; do things that build up your site instead.
— John Mueller is watching out for Google+ 🐀 (@JohnMu) January 31, 2023
One note:
Thanks for posting on this issue. I think you should add my conclusion tweet as well that all this is claimed by an agency, not by me.
I am not a native English speaker, not sure how people are taking this. I just tried to put some serious points here to help our SEO community.
— Saurabh Rawat (Tech SEO) (@SEOGuruJaipur) February 1, 2023
Just yesterday we covered the topic of disavowing spammy porn links and also noted how Google has downplayed disavowing links for a while. John said in that SEO office hours help video yesterday, “That said, this will not position your site as it was before, but it can help our algorithms to recognize that they can trust your site again, giving you a chance to work up from there.” I don’t think he fully meant that, based on what he said yesterday and previously.
But this is pretty strong language for not bothering with the disavow file.
Forum discussion at Twitter.
SEARCH ENGINES
Google Search Console Verification Does Not Impact Your Ranking In Google Search

Gary Illyes of Google said in yesterday’s Google SEO office-hours that verifying your website in Google Search Console won’t impact your Google Search indexing or ranking whatsoever.
Gary said, “Having your site verified in Search Console or changing the verification code and method has no effect on indexing or ranking whatsoever.”
John Mueller of Google previously said that Search Console verification doesn’t help with crawling either.
Gary added later that Search Console gives you data and analytics that can help you make improvements to your site, which can potentially help you rank better in Google Search. “You can use the data that Search Console gives you to improve your site and thus potentially do better in Search with your site, but otherwise has no effect on search whatsoever,” he added.
Here is the video embed at the 15:27 mark:
Forum discussion at Twitter.
SEARCH ENGINES
Google Says The Most Common Reason For Blocking Googlebot Is Firewall Or CDN Issues

Gary Illyes from Google posted a new PSA on LinkedIn saying that the most common reason a site unexpectedly blocks Googlebot from crawling is a misconfigured firewall or CDN.
Gary wrote, “check what traffic your firewalls and CDN are blocking,” adding, “By far the most common issue in my inbox is related to firewalls or CDNs blocking googlebot traffic. If I reach out to the blocking site, in the vast majority of the cases the blockage is unintended.”
So what can you do? Gary said, “I’ve said this before, but want to emphasize it again: make a habit of checking your block rules. We publish our IP ranges so it should be very easy to run an automation that checks the block rules against the googlebot subnets.”
Gary linked to this help document for more details.
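To illustrate the kind of automation Gary describes, here is a minimal sketch (not an official tool) that pulls Google’s published Googlebot IP ranges and flags firewall block rules that overlap them. It assumes the ranges are available as JSON at the googlebot.json URL below and that your block rules can be exported as a simple list of CIDR strings; the rules shown are hypothetical.

```python
# Sketch: flag firewall/CDN block rules that overlap Googlebot's published IP ranges.
import ipaddress
import json
import urllib.request

# Assumed location of Google's published Googlebot ranges (JSON with ipv4Prefix/ipv6Prefix entries).
GOOGLEBOT_RANGES_URL = "https://developers.google.com/static/search/apis/ipranges/googlebot.json"

def fetch_googlebot_networks():
    """Download and parse Googlebot's published IPv4/IPv6 ranges."""
    with urllib.request.urlopen(GOOGLEBOT_RANGES_URL) as resp:
        data = json.load(resp)
    networks = []
    for prefix in data.get("prefixes", []):
        cidr = prefix.get("ipv4Prefix") or prefix.get("ipv6Prefix")
        if cidr:
            networks.append(ipaddress.ip_network(cidr))
    return networks

def blocked_googlebot_ranges(block_rules, googlebot_networks):
    """Return (block_rule, googlebot_range) pairs where a block rule overlaps Googlebot."""
    overlaps = []
    for rule in block_rules:
        rule_net = ipaddress.ip_network(rule, strict=False)
        for gbot_net in googlebot_networks:
            if rule_net.version == gbot_net.version and rule_net.overlaps(gbot_net):
                overlaps.append((rule, str(gbot_net)))
    return overlaps

if __name__ == "__main__":
    # Hypothetical block rules exported from a firewall or CDN configuration.
    my_block_rules = ["66.249.66.0/27", "203.0.113.0/24"]
    for rule, gbot in blocked_googlebot_ranges(my_block_rules, fetch_googlebot_networks()):
        print(f"Block rule {rule} overlaps Googlebot range {gbot}")
```

Running something like this on a schedule against your exported block rules is one way to catch an accidental Googlebot block before it shows up as a crawling drop.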
In short, do what you can to test whether your site is accessible to Googlebot. You can use the URL inspection tool in Google Search Console as one method. Also, confirm with your CDN or firewall company that they are allowing Googlebot and ask them to prove it; a sketch of one way to spot-check a specific IP follows below.
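If your firewall or CDN reports a specific IP it is blocking, a reverse-then-forward DNS lookup is the method Google documents for verifying whether traffic really comes from Googlebot. A rough standard-library sketch, using an example IP:

```python
# Sketch: check whether a single IP belongs to Googlebot via reverse + forward DNS.
import socket

def is_googlebot_ip(ip_address):
    """Reverse-resolve the IP, check the domain, then confirm with a forward lookup."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip_address)
    except (socket.herror, socket.gaierror):
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return socket.gethostbyname(hostname) == ip_address
    except socket.gaierror:
        return False

if __name__ == "__main__":
    # Example IP; replace with the IP your firewall or CDN says it is blocking.
    print(is_googlebot_ip("66.249.66.1"))
```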
Forum discussion at LinkedIn.