
Daily Search Forum Recap: October 31, 2022


Here is a recap of what happened in the search forums today, through the eyes of the Search Engine Roundtable and other search forums on the web.


A new Google local ranking study shows that keywords in reviews do not improve local rankings in Google Search. Want to see a site that was hit hard by the Google spam update? We have one. Google is showing on-time delivery and order accuracy data in search listings. Google Search Console’s geographic setting is still available, but it likely does not work. Google has new fall decorations for some search results. And a new vlog is out today, this one with Rick Mariano.

Search Engine Roundtable Stories:

Other Great Search Threads:

Search Engine Land Stories:

Other Great Search Stories:

Analytics

Industry & Business

Links & Content Marketing

Local & Maps

Mobile & Voice

SEO

PPC

Other Search

Feedback:


Have feedback on this daily recap? Let me know on Twitter @rustybrick or @seroundtable. You can follow us on Facebook, and make sure to subscribe to the YouTube channel, Apple Podcasts, Spotify, Google Podcasts, or just contact us the old-fashioned way.



Source: www.seroundtable.com



Can Bing Chat Access Content Behind Paywalls?


There is some concern and speculation on the internet that Microsoft Bing is feeding in content from behind paywalls and using such content to provide answers in Bing Chat. I asked Bing Chat if it can give answers based on content behind paywalls, and it said no, it cannot.

But I am not sure if this answer is 100% true.


One thread on Twitter shows Bing Chat referencing and citing content behind a paywall to provide an answer.

Now, is this possible? Well, there are a few possible explanations for why Bing was able to access this content:

(1) Maybe the content was open for a period of time when it was not behind a paywall, and Bing indexed it then?

(2) Maybe the content provider is giving this paywalled content to Bingbot without a paywall. There are approved ways to give paywalled content to search engines, like the old First Click Free program and the flexible sampling approach that replaced it.

In that case, the content is technically behind a paywall for users but not for search engines: Bingbot doesn’t see the paywall, but users do. That is a possible technical explanation.
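For context on how option (2) works in practice, here is a minimal sketch of the schema.org paywall markup that Google documents for flexible sampling, generated with Python purely for illustration. Whether Bing Chat honors this same signal is an assumption, and the headline and CSS class below are hypothetical:

```python
import json

# JSON-LD declaring which part of the page is paywalled for users.
# The publisher still serves the full article body to crawlers; the
# markup just tells them the content is not free. (Hypothetical values.)
markup = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example paywalled article",
    "isAccessibleForFree": False,
    "hasPart": {
        "@type": "WebPageElement",
        "isAccessibleForFree": False,
        # CSS class wrapping the paywalled section (hypothetical)
        "cssSelector": ".paywalled-content",
    },
}

# Emit the script tag a publisher would place in the page head.
print('<script type="application/ld+json">')
print(json.dumps(markup, indent=2))
print("</script>")
```

With this setup, the search engine fetches the full text while regular visitors hit the paywall, which matches the behavior described above.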

Forum discussion at Twitter.






Google Business Profile Services Showing Incorrect Pricing


Google Business Profiles lets you see what services business listings offer. This feature has been around for a while, and these services now seem to impact your local rankings. But what is new and scary is that Google is making up pricing for your services, pricing that is almost always incorrect and sometimes damaging for those businesses.

Carrie Hill and Sukhjit S Matharu spotted this and posted a couple of examples, one from a client and one from a random business. In both cases, the pricing Google listed is incorrect. Carrie said her client is not offering these services for free, despite what Google says. And her client’s competitors are not offering bed bug inspections for only $1 and $100.

Here is what Sukhjit S Matharu shared on Twitter, saying “when looking into a client’s services in their GBP, we noticed that some of the predefined services had a “free” label which we nor the client added.”


Here is what Carrie Hill shared on Twitter saying “Here’s another where pricing is arbitrarily added in – not from the client… certainly not correct!”


Joy Hawkins, a local SEO, also confirmed this is new.

I wonder if this is easy for the business to fix by going into their Google Business Profile and editing their services. But I suspect most of these businesses have no clue Google added these prices to their services, and it might lead to some bad reviews if a customer is charged or quoted more than what is listed in Google Search.
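For businesses that want to check this programmatically, here is a hedged sketch using the Google Business Profile Business Information API to list a location’s services and any attached prices. The token, location ID, and exact field handling are assumptions based on the v1 API surface, not official tooling:

```python
import requests

ACCESS_TOKEN = "ya29.example-token"  # OAuth 2.0 token with the business.manage scope (placeholder)
LOCATION = "locations/1234567890"    # hypothetical location resource name

# Fetch only the serviceItems field of the location.
resp = requests.get(
    f"https://mybusinessbusinessinformation.googleapis.com/v1/{LOCATION}",
    params={"readMask": "serviceItems"},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Print each service alongside whatever price Google has stored for it,
# so the business can spot prices it never set.
for item in resp.json().get("serviceItems", []):
    service = item.get("structuredServiceItem") or item.get("freeFormServiceItem", {})
    price = item.get("price")  # a Money object (currencyCode, units, nanos) or absent
    print(service, "->", price or "no price set")
```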

This reminds me of when Google Local Services Ads estimated pricing, which upset many businesses.

Forum discussion at Twitter.






Google’s 15MB Googlebot Limit Is For Each Individual Subresource


Google has updated the Googlebot help document on crawling to clarify that the 15MB fetch size limit applies to each fetch of the individual subresources referenced in the HTML as well, such as JavaScript and CSS files.

Google initially added information about this 15MB limit several months ago, which caused a lot of concern in the SEO industry, so make sure to read that story.

The help document now reads:

Googlebot can crawl the first 15MB of an HTML file or supported text-based file. Each resource referenced in the HTML such as CSS and JavaScript is fetched separately, and each fetch is bound by the same file size limit. After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of the file for indexing. The file size limit is applied on the uncompressed data. Other Google crawlers, for example Googlebot Video and Googlebot Image, may have different limits.

It previously read:

Googlebot can crawl the first 15MB of an HTML file or supported text-based file. Any resources referenced in the HTML such as images, videos, CSS, and JavaScript are fetched separately. After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of the file for indexing. The file size limit is applied on the uncompressed data. Other Google crawlers may have different limits.

Gary Illyes added on LinkedIn, “PSA from my inbox: The 15MB resource fetch threshold applies on the JavaScript resources, too, so if your JavaScript files are larger than that, your site might have a bad time in Google Search. See googlebot.com for information about the fetch features and limitations. Yes, there are JavaScript resources out there that are larger than 15MB. No, it’s not a good idea to have JavaScript files that are that large.”
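If you want a rough sense of whether your own pages run into this limit, here is a quick Python sketch that fetches a page’s scripts and stylesheets and flags any whose uncompressed size exceeds 15MB. The URL is a placeholder, and the parser only catches straightforward script and link tags:

```python
import gzip
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

LIMIT = 15 * 1024 * 1024  # Googlebot's documented 15MB per-fetch limit

class SubresourceParser(HTMLParser):
    """Collect script src and stylesheet href URLs from the HTML."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("src"):
            self.urls.append(attrs["src"])
        elif tag == "link" and "stylesheet" in (attrs.get("rel") or ""):
            if attrs.get("href"):
                self.urls.append(attrs["href"])

def uncompressed_size(url):
    """Fetch a resource and return its uncompressed byte count,
    since the limit is applied to the uncompressed data."""
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        if resp.headers.get("Content-Encoding") == "gzip":
            body = gzip.decompress(body)
    return len(body)

page = "https://www.example.com/"  # placeholder: the page you want to audit
html = urllib.request.urlopen(page).read().decode("utf-8", errors="replace")

parser = SubresourceParser()
parser.feed(html)

# Check the page itself plus every referenced script and stylesheet.
for url in [page] + [urljoin(page, u) for u in parser.urls]:
    size = uncompressed_size(url)
    flag = "OVER 15MB LIMIT" if size > LIMIT else "ok"
    print(f"{size:>12,} bytes  {flag}  {url}")
```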

There was also some early feedback from the SEO community on this in the LinkedIn thread.

Forum discussion at LinkedIn.




