Google Says Indexing Issues Can Be Spam Related But Likely Not Adult Content Related

Having issues with Google indexing your web pages? Google’s John Mueller said that while Google may not index pages that are spammy, it will index pages that are adult-oriented. John said that if your domain had old spammy content on it and you took it over and removed the spam, your new pages may still be slow to get indexed. But if the site previously hosted adult content, you probably would not see indexing issues related to that adult content.

In other words, it can take time for Google to trust a domain that has a history of spammy content on it. Google has been saying for a decade that it does not like to index spammy content. Google has said numerous times that indexing issues can be related to quality issues with the overall site. For pages to be indexed they need to pass quality checks. And if you see your site being deindexed, that can be a sign of a quality issue.
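If you want to check whether specific URLs have actually been indexed rather than inferring it from traffic, Google’s Search Console URL Inspection API reports per-URL index status. Here is a minimal sketch in TypeScript (Node 18+), assuming you already have an OAuth 2.0 access token for a verified Search Console property; the token and URLs below are placeholders:

```typescript
// Check a page's index status via the Search Console URL Inspection API.
// ACCESS_TOKEN is a placeholder; obtain a real token via OAuth 2.0 with
// the webmasters scope for a property you have verified.
const ACCESS_TOKEN = "ya29.placeholder-token";

async function inspectUrl(pageUrl: string, property: string): Promise<void> {
  const res = await fetch(
    "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${ACCESS_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ inspectionUrl: pageUrl, siteUrl: property }),
    },
  );
  if (!res.ok) throw new Error(`Inspection failed: ${res.status}`);
  const data = await res.json();

  // indexStatusResult carries the verdict ("PASS"/"FAIL"/"NEUTRAL") and a
  // human-readable coverageState such as "Submitted and indexed" or
  // "Crawled - currently not indexed".
  const status = data.inspectionResult?.indexStatusResult;
  console.log(pageUrl, status?.verdict, status?.coverageState);
}

inspectUrl("https://example.com/new-page", "https://example.com/");
```

A trust-related delay of the kind John describes tends to surface in coverageState as “Discovered - currently not indexed” or “Crawled - currently not indexed” rather than as an error.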

But when it comes to adult-oriented content and the SafeSearch filter, that normally won’t result in indexing issues. Google will index adult content; it just does not like to index spammy content.

John said at the 51:42 mark, “Usually the indexing side of things would not be related to, kind of like, if there was adult content on the website before. The indexing side might be affected if the content that was on there before was very spammy. So that might be something where, from an indexing point of view, it just takes a while to figure out, oh, this new website is actually not spammy at all.”

Forum discussion at YouTube Community.

Google Advice For JavaScript Heavy Sites: Have Content Load First


Gary Illyes from Google posted a PSA (public service announcement) of sorts on Mastodon and LinkedIn for JavaScript-heavy sites. He said that if your site is heavy on JavaScript, you should try to load the content, including the “marginal boilerplate,” first.

He said this is recent advice, based on seeing a “bunch” of emails with complaints from SEOs who see “lots of dups reported in Search Console.” His explanation was that the centerpiece content “took forever to load,” “so rendering timed out (my most likely explanation) and we were left with a bunch of pages that only had the boilerplate. With only the boilerplate, those pages are dups,” he added.

So if you see this issue in Google Search Console for your JavaScript-heavy site, try restructuring your JavaScript calls so the content loads first.

Here is what Gary posted:

Do you have a JavaScript-heavy site and you see lots of dups reported in Search Console? Try to restructure the js calls such that the content (including marginal boilerplate) loads first and see if that helps.

I have a bunch of emails in my inbox where the issue is that the centerpiece took forever to load, so rendering timed out (my most likely explanation) and we were left with a bunch of pages that only had the boilerplate. With only the boilerplate, those pages are dups.

Gary later added on Mastodon, “Search engines are in fact very similar to a user’s browser when it comes to indexing, but a user doesn’t access billions of pages (or however many search engines typically access) every day, so they must have stricter limits.”
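As a concrete illustration of the restructuring Gary suggests, here is a minimal client-side sketch in TypeScript; the module names and data shape are hypothetical, not from his post. The point is ordering: commit the page’s unique content to the DOM first, then dynamically import the heavy extras, so that if rendering times out, the crawler is still left with more than boilerplate.

```typescript
// Hypothetical article payload; the shape is illustrative only.
interface Article {
  title: string;
  body: string;
}

async function renderPage(article: Article): Promise<void> {
  const main = document.querySelector("main");
  if (!main) return;

  // 1. Content first: cheap, synchronous DOM work so the page's unique
  //    text exists in the rendered output as early as possible.
  main.innerHTML = `<h1>${article.title}</h1><article>${article.body}</article>`;

  // 2. Heavy, non-essential widgets afterwards, via dynamic import,
  //    keeping them off the critical rendering path. Both modules are
  //    hypothetical stand-ins for comments, related posts, etc.
  const [{ initComments }, { initRelated }] = await Promise.all([
    import("./comments.js"),
    import("./related.js"),
  ]);
  initComments();
  initRelated();
}
```

Server-side rendering or static generation of the main content accomplishes the same goal even more reliably, since the unique text then exists before any JavaScript runs at all.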

Forum discussion at Mastodon and LinkedIn.

Source: www.seroundtable.com
