Empty Google Cafe Lunch Room


Two years into COVID, the Google offices are still pretty empty. Here is a recent photo from one of the cafes at the main Google headquarters, the GooglePlex. You can see more photos in the Instagram embed below.

Some Googlers do go into the office here and there, but essential workers, including the cafe staff, are there daily.

This post is part of our daily Search Photo of the Day column, where we find fun and interesting photos related to the search industry and share them with our readers.






Google Advice For JavaScript Heavy Sites: Have Content Load First


Gary Illyes from Google posted a PSA (public service announcement) of sorts on Mastodon and LinkedIn for sites that are heavy with JavaScript. He said that you should try to load the content, including the “marginal boilerplate” content, first when you have a JavaScript-heavy site.

He said this is recent advice based on seeing a "bunch" of emails with complaints from SEOs who see "lots of dups reported in Search Console." In those cases, the centerpiece content "took forever to load," "so rendering timed out (my most likely explanation) and we were left with a bunch of pages that only had the boilerplate. With only the boilerplate, those pages are dups," he added.

So if you see this issue in Google Search Console for your JavaScript-heavy site, then try it: restructure things so the content loads first (a rough sketch of what that can look like is below the quotes).

Here is what Gary posted:

Do you have a JavaScript-heavy site and you see lots of dups reported in Search Console? Try to restructure the js calls such that the content (including marginal boilerplate) loads first and see if that helps.

I have a bunch of emails in my inbox where the issue is that the centerpiece took forever to load, so rendering timed out (my most likely explanation) and we were left with a bunch of pages that only had the boilerplate. With only the boilerplate, those pages are dups.

Gary later added on Mastodon, “Search engines are in fact very similar to a user’s browser when it comes to indexing, but a user doesn’t access billions of pages (or however many search engines typically access) every day, so they must have stricter limits.”
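For what it is worth, here is a rough sketch of the kind of restructuring Gary is describing, i.e. rendering the main content before any non-essential scripts run. The /api/article endpoint, element IDs and widget module names below are made up for illustration; they are not from Google or from Gary's posts.

```typescript
// Rough sketch: make the centerpiece content render first, then load everything else.
// The endpoint, element IDs and widget modules are hypothetical examples.

async function renderMainContent(): Promise<void> {
  const main = document.getElementById("main-content");
  if (!main) return;

  // 1. Fetch and render the article body before any other JavaScript work kicks off,
  //    so a rendering timeout still leaves the real content on the page.
  const response = await fetch("/api/article?id=123");
  const article: { title: string; body: string } = await response.json();
  main.innerHTML = `<h1>${article.title}</h1><div>${article.body}</div>`;
}

function loadNonEssentialWidgets(): void {
  // 2. Defer the non-essential extras (comments, related posts, trackers) until the
  //    browser is idle, so they cannot block the main content from loading.
  const whenIdle = (callback: () => void) =>
    "requestIdleCallback" in window
      ? (window as any).requestIdleCallback(callback)
      : setTimeout(callback, 2000);

  whenIdle(() => {
    void import("./comments-widget").then((widget) => widget.init());
    void import("./related-posts").then((widget) => widget.init());
  });
}

// Content first, everything else after.
renderMainContent().then(loadNonEssentialWidgets);
```

The specifics will vary by framework; the point is simply that the unique content (plus that "marginal boilerplate") should not be the last thing your JavaScript gets around to rendering.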

Forum discussion at Mastodon and LinkedIn.

Source: www.seroundtable.com
