
Google On The Most Expensive Pages To Crawl


There is a fun Twitter thread in which Google’s John Mueller and SEO Elmer Boutin discuss which pages are expensive for Google to crawl. John starts off by explaining that Google doesn’t think about it in terms of expense; what Google cares about is whether the page is relevant and useful.

John goes on to say that the “most ‘expensive’ pages are those we’ve been crawling & indexing for years, and nobody has looked for them ever.” But Google crawls and indexes those pages because “what if tomorrow is different? It’s good to be prepared,” he added. He then threw out the line “15% of all searches are new every day, you never know.” Indeed.

Here are those tweets:

Sure, Google tries to be as efficient as possible in everything it does. That means discovering, crawling, indexing, ranking, and serving are all done with efficiency in mind – but not at the expense of providing the most useful and relevant search results to the searcher.

Forum discussion at Twitter.



Source: www.seroundtable.com



Google Business Profile Services Showing Incorrect Pricing


Google Business Profiles lets you see what services business listings offer. This feature has been around for a while, and those services now seem to impact local rankings. But what is new and scary is that Google is making up pricing for your services – pricing that is almost always incorrect and sometimes harmful to the business.

Carrie Hill and Sukhjit S Matharu spotted this and posted a couple of examples, one from a client and one from some random business. In both cases, the pricing Google listed is incorrect. Carrie said her client is not offering these services for free, despite what Google says. And her client’s competitors are not offering bed bug inspections for only $1 and $100.

Here is what Sukhjit S Matharu shared on Twitter, saying “when looking into a client’s services in their GBP, we noticed that some of the predefined services had a “free” label which we nor the client added.”


Here is what Carrie Hill shared on Twitter saying “Here’s another where pricing is arbitrarily added in – not from the client… certainly not correct!”


Joy Hawkins, a local SEO, also confirmed this is new.

I wonder if this is easy for the business to fix by going into their Google Business Profiles and editing their services. But I suspect most of these businesses have no clue Google added these prices to their services and it might lead to some bad reviews if a customer is charged or quoted more than what is listed in Google Search.

This reminds me of when Google Local Services Ads estimated pricing, which upset many businesses.

Forum discussion at Twitter.






Google’s 15MB Googlebot Limit Applies To Each Individual Subresource


Google has updated the Googlebot help document on crawling to clarify that the 15MB fetch size limit applies to each fetch of the individual subresources referenced in the HTML as well, such as JavaScript and CSS files.

Google initially added information about this 15MB limit several months ago, which caused a lot of concern in the SEO industry, so make sure to read that story.

The help document now reads:

Googlebot can crawl the first 15MB of an HTML file or supported text-based file. Each resource referenced in the HTML such as CSS and JavaScript is fetched separately, and each fetch is bound by the same file size limit. After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of the file for indexing. The file size limit is applied on the uncompressed data. Other Google crawlers, for example Googlebot Video and Googlebot Image, may have different limits.

It previously read:

Googlebot can crawl the first 15MB of an HTML file or supported text-based file. Any resources referenced in the HTML such as images, videos, CSS, and JavaScript are fetched separately. After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of the file for indexing. The file size limit is applied on the uncompressed data. Other Google crawlers may have different limits.

Gary added on LinkedIn, “PSA from my inbox: The 15MB resource fetch threshold applies on the JavaScript resources, too, so if your JavaScript files are larger than that, your site might have a bad time in Google Search. See googlebot.com for information about the fetch features and limitations. Yes, there are JavaScript resources out there that are larger than 15MB. No, it’s not a good idea to have JavaScript files that are that large.”
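In practical terms, the limit described in the help document applies per fetch: the HTML file itself and each referenced JavaScript or CSS file are each capped at 15MB of uncompressed data, rather than the page and its resources sharing one budget. Below is a minimal sketch of how a site owner might audit this – the 15MB figure comes from the help document quoted above, but the class and function names, and the idea of collecting `script src` and stylesheet `href` URLs with Python’s standard-library HTML parser, are my own illustration, not a Google tool:

```python
# Sketch: flag individual subresources that would exceed Googlebot's
# 15MB per-fetch limit. The limit applies to UNCOMPRESSED data, and to
# each fetch separately (HTML, each JS file, each CSS file).
from html.parser import HTMLParser

GOOGLEBOT_LIMIT_BYTES = 15 * 1024 * 1024  # 15MB, per the help document


class SubresourceCollector(HTMLParser):
    """Collects script src and stylesheet href URLs from an HTML document."""

    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("src"):
            self.urls.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.urls.append(attrs["href"])


def over_limit(uncompressed_size_bytes: int) -> bool:
    """True if a single fetch would exceed the 15MB cap; Googlebot would
    only consider the first 15MB of such a file for indexing."""
    return uncompressed_size_bytes > GOOGLEBOT_LIMIT_BYTES
```

In a real audit, you would fetch each collected URL, decompress the response, and pass its byte length to `over_limit` – checking each file on its own, since the cap is per fetch rather than per page.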

Here is some early feedback from the SEO community on this:

Forum discussion at LinkedIn.






Sam Michelson On Google Algorithm Updates With Reputation Management


In part one, we spoke about who Sam Michelson is and his business development techniques. In part two, we dove into CRM software and how his team uses it, including a look at a new piece of software for partner management. In part three, we went into pricing your services, and in part four, we talked about human resources. In part five, we talk about Google algorithm updates.

He said that in the old days of SEO, you lived and died by these Google algorithm updates. With reputation management, it is about managing the first ten to twenty results, and about doing SEO not just for your own site but for other sites you don’t manage.

They take a data-centric approach to Google algorithms: they look at the competing search results for a query and try to replicate what Google tends to rank for those queries. Once you find a pattern, you try to repeat it.

Reputation management aims to build a search result set that Google wants to show, that people want to see, and that your clients approve of. That content tends to be properties you own, earned media, and profiles, and it is all very deliberate.

For more on Sam Michelson, visit Five Blocks and find him on LinkedIn.


You can subscribe to our YouTube channel by clicking here so you don’t miss the next vlog, where I interview people in the industry. I have a nice lineup of interviews scheduled with SEOs and SEMs, many of which you won’t want to miss – and I promise to continue to make these vlogs better over time. If you want to be interviewed, please fill out this form with your details.

Forum discussion at YouTube.


