SEARCHENGINES

Google Gives Advice On When Copy Sites Outrank Your Original Content

Danny Sullivan from Google provided some advice and took some feedback on a specific example of when an original piece of research from a website was being outranked by a site copying its information.

This specific example is for the query [r kelly net worth], where the original source of the information supposedly comes from celebritynetworth.com, but according to Celebrity Net Worth, the site ranking above them for that query is a “content regurgitator.” Celebrity Net Worth said the change happened with the September core update, and they asked what could be done.

Here are those tweets:

Danny Sullivan of Google replied with a few bits of advice: (1) ask the sites taking your original research to cite and link to your content, (2) make it clear on your page that the original research is yours and yours alone, (3) ensure the details on the page show it was uniquely done, and (4) explain how you arrived at your original research. This is not to say that this site did anything wrong; Danny said he would pass this information along to the Google Search team.

Here are those tweets:

Previously, Google has said that when scrapers outrank you, it might be due to quality issues with your site, or because your site was penalized. This is why you often see complaints like this after a core algorithm update, which assesses site quality.

Forum discussion at Twitter.



Source: www.seroundtable.com

Google Advice For JavaScript Heavy Sites: Have Content Load First

Gary Illyes from Google posted a PSA (public service announcement) of sorts on Mastodon and LinkedIn for sites that are heavy with JavaScript. He said that you should try to load the content, including the “marginal boilerplate” content, first when you have a JavaScript-heavy site.

He said this is recent advice based on seeing a “bunch” of emails with complaints from SEOs where they see “lots of dups reported in Search Console.” He said when he tried to load the pages in his browser, it “took forever to load,” “so rendering timed out (my most likely explanation) and we were left with a bunch of pages that only had the boilerplate. With only the boilerplate, those pages are dups,” he added.

So if you see this issue in Google Search Console for your JavaScript-heavy site, try restructuring your JavaScript so the content loads first.

Here is what Gary posted:

Do you have a JavaScript-heavy site and you see lots of dups reported in Search Console? Try to restructure the js calls such that the content (including marginal boilerplate) loads first and see if that helps.

I have a bunch of emails in my inbox where the issue is that the centerpiece took forever to load, so rendering timed out (my most likely explanation) and we were left with a bunch of pages that only had the boilerplate. With only the boilerplate, those pages are dups.

Gary later added on Mastodon, “Search engines are in fact very similar to a user’s browser when it comes to indexing, but a user doesn’t access billions of pages (or however many search engines typically access) every day, so they must have stricter limits.”
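Gary's suggestion can be sketched as a simple reordering of the page's load sequence: commit the unique content to the DOM first, and only afterwards kick off the heavy, non-critical scripts. The sketch below is illustrative only; the function names and the widget-loading step are assumptions, not a pattern Gary prescribed.

```javascript
// Hypothetical sketch: render the main content before non-critical widgets,
// so that even if a renderer times out partway through, the page already
// shows more than boilerplate. Names here are illustrative, not a real API.
async function renderPage(fetchContent, loadWidgets) {
  const order = [];

  // 1. Fetch and render the unique page content first.
  const content = await fetchContent();
  order.push("content:" + content);

  // 2. Only afterwards load heavy, non-critical scripts
  //    (ads, comment widgets, recommendation carousels, ...).
  await loadWidgets();
  order.push("widgets");

  return order;
}

// Usage: even with slow widgets, the content is committed first.
renderPage(
  async () => "article-body",
  () => new Promise((resolve) => setTimeout(resolve, 50)),
).then((order) => console.log(order.join(",")));
```

The point of the ordering is that a timeout during the widget step still leaves the indexable content in place, rather than leaving only the boilerplate shell Gary describes.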

Forum discussion at Mastodon and LinkedIn.

Source: www.seroundtable.com
