
SEARCH ENGINES

Listen To Google Talk About de-SEOing The Search Central Website

Google’s Martin Splitt, Gary Illyes, and Lizzi Sassman sat together on the latest Search Off the Record to talk about de-SEOing – what we would probably call over-optimization strategies. It is an interesting listen; much of it you have probably heard before, but it is interesting to hear Googlers who are intimate with how Google Search works tackle this issue on their own site.

Here is the embed:

One thing: they take credit for coining the name “de-SEOing,” but what they describe as de-SEOing is what SEOs would call working on over-optimization – something both Gary (who was on the podcast) and Matt Cutts have talked about before. Here is where Gary mentioned it:

Before listening, it might make sense to read about Google relaunching the Google Webmasters brand as the Search Central brand and then the progress report a year later.

One final point: at the end, around the 27-minute mark, Gary talks about how he finds it annoying that he cannot use the internal Google tools to debug these issues. He has to use the public tools that you or I use, such as Google Search Console, because otherwise it would not be fair. He said:

And it’s also kind of annoying because when we are looking into how people reach those pages, I could use the internal tools. It could be very simple to do it. Like I could just hit up the debug interfaces and tools that we have and just look at how it happens. But we can’t do that. We actually have to use the same tools that anyone else, external to Google, has to use or could use. And it’s very annoying because the information is less, obviously. I know why it’s less than what we have internally. Still annoying. But yeah, we will have to figure it out with Search Console and other similar tools.

Forum discussion at Twitter.






SEARCH ENGINES

Google Blasts Agencies That Sell Link Building & Disavow Link Services

John Mueller of Google blasted SEO or marketing agencies that sell both link-building services and disavow link services. He said on Twitter, “These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.”

John added that it’s “all made up and irrelevant.”

Also, when asked if they should disavow links, John replied yesterday on Twitter, “Don’t waste your time on it; do things that build up your site instead.”

Here is the chain of tweets, so you see the context that John is replying to.

Ryan Jones kicks it off with his rant:

Here is the chain that follows:

Pretty strong words from John, don’t you think?

Here is the second part:

One note:

Just yesterday we covered the topic of disavowing spammy porn links and also noted how Google has downplayed disavowing links for a while now. John said in that SEO office hours help video yesterday, “That said, this will not position your site as it was before, but it can help our algorithms to recognize that they can trust your site again, giving you a chance to work up from there.” I don’t think he meant that fully, based on what he said yesterday and previously.

But this is pretty strong language for not bothering with the disavow file.
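
For context, for anyone who does still maintain one: the disavow file Google’s tool accepts is just a plain-text list, one URL or domain per line, with a domain: prefix to disavow everything from a domain and # for comments. A minimal example (the domains here are hypothetical):

```
# Example disavow file -- plain text, one entry per line.
# Disavow a single spammy page:
http://spam.example.com/stuff/bad-links.html
# Disavow every link from an entire domain:
domain:shadyseo.example.com
```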

Forum discussion at Twitter.




SEARCH ENGINES

Google Search Console Verification Does Not Impact Your Ranking In Google Search

Gary Illyes of Google said in the Google SEO office-hours from yesterday that verifying your website in Google Search Console won’t impact your Google Search indexing or ranking whatsoever.

Gary said, “Having your site verified in Search Console or changing the verification code and method has no effect on indexing or ranking whatsoever.”

John Mueller of Google previously said that Search Console verification doesn’t help with crawling either.

Gary added later that Search Console gives you data and analytics that can help you make improvements to your site, which can potentially help you rank better in Google Search. “You can use the data that Search Console gives you to improve your site and thus potentially do better in Search with your site, but otherwise has no effect on search whatsoever,” he added.
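
To illustrate the kind of data Gary is referring to, here is a minimal sketch of pulling the top queries for a verified property from the Search Console Search Analytics API via the google-api-python-client library. It assumes you already have OAuth credentials for a verified property; the property URL and dates are placeholders.

```python
# Minimal sketch: top queries by clicks from the Search Console
# Search Analytics API. Requires google-api-python-client and
# OAuth credentials authorized for the property.
from googleapiclient.discovery import build

def top_queries(credentials, site_url, start, end, limit=25):
    """Return the top queries (by clicks) for a verified property."""
    service = build("searchconsole", "v1", credentials=credentials)
    body = {
        "startDate": start,       # e.g. "2023-01-01"
        "endDate": end,           # e.g. "2023-01-31"
        "dimensions": ["query"],
        "rowLimit": limit,
    }
    response = service.searchanalytics().query(
        siteUrl=site_url, body=body
    ).execute()
    return response.get("rows", [])

# Hypothetical usage:
# for row in top_queries(creds, "https://www.example.com/",
#                        "2023-01-01", "2023-01-31"):
#     print(row["keys"][0], row["clicks"], row["impressions"])
```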

Here is the video embed at the 15:27 mark:

Forum discussion at Twitter.




SEARCH ENGINES

Google Says The Most Common Reason For Blocking Googlebot Is Firewall Or CDN Issues

Gary Illyes from Google posted a new PSA on LinkedIn saying that the most common reason a site unexpectedly blocks Googlebot from crawling is a misconfigured firewall or CDN.

Gary wrote, “check what traffic your firewalls and CDN are blocking,” adding, “By far the most common issue in my inbox is related to firewalls or CDNs blocking googlebot traffic. If I reach out to the blocking site, in the vast majority of the cases the blockage is unintended.”

So what can you do? Gary said, “I’ve said this before, but want to emphasize it again: make a habit of checking your block rules. We publish our IP ranges so it should be very easy to run an automation that checks the block rules against the googlebot subnets.”

Gary linked to this help document for more details.
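
As a sketch of the automation Gary describes: Google publishes Googlebot’s IP ranges as a JSON file (googlebot.json, linked from its crawler verification documentation), so a script can load those subnets and check any IP your firewall is blocking against them. The URL below is the one documented at the time of writing; confirm it against the help document Gary links to.

```python
# Sketch: check an IP your firewall blocked against Google's
# published Googlebot subnets. Uses only the standard library.
import ipaddress
import json
from urllib.request import urlopen

# URL as documented at the time of writing; verify before relying on it.
GOOGLEBOT_RANGES = "https://developers.google.com/static/search/apis/ipranges/googlebot.json"

def googlebot_networks():
    """Load the published Googlebot IPv4/IPv6 subnets."""
    with urlopen(GOOGLEBOT_RANGES) as resp:
        data = json.load(resp)
    return [
        ipaddress.ip_network(p.get("ipv4Prefix") or p.get("ipv6Prefix"))
        for p in data["prefixes"]
    ]

def is_googlebot_ip(ip, networks):
    """True if the IP falls inside any published Googlebot subnet."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in networks)

# Example: run the IPs from your firewall's block log through this.
nets = googlebot_networks()
print(is_googlebot_ip("66.249.66.1", nets))  # a known Googlebot range
```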

In short, do what you can to test whether your site is accessible to Googlebot. The URL Inspection tool in Google Search Console is one method. Also, confirm with your CDN or firewall company that they are allowing Googlebot, and ask them to prove it.
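
And for the confirmation step, Google’s documented way to verify that a visitor claiming to be Googlebot really came from Google is a reverse-then-forward DNS check. A rough Python version (IPv4 only, for illustration):

```python
# Sketch of the documented reverse-then-forward DNS verification:
# resolve the IP to a hostname, check the domain, then resolve the
# hostname forward and confirm it maps back to the same IP.
import socket

def verify_googlebot(ip):
    """Return True if the IP reverse-resolves to a Google crawler host."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)        # reverse lookup
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        _, _, addrs = socket.gethostbyname_ex(host)  # forward confirm
    except socket.gaierror:
        return False
    return ip in addrs

print(verify_googlebot("66.249.66.1"))  # e.g. crawl-66-249-66-1.googlebot.com
```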

Forum discussion at LinkedIn.
