Daily Search Forum Recap: October 31, 2022
Here is a recap of what happened in the search forums today, through the eyes of the Search Engine Roundtable and other search forums on the web.
A new Google local ranking study shows that keywords in reviews do not improve local rankings in Google Search. Want to see a site that got hit hard by the Google spam update? We've got one. Google is showing on-time delivery and order accuracy data in search listings. Google Search Console’s geographic setting is still available, but it likely does not work. Google has new fall decorations for some search results. And a new vlog is out today, this one with Rick Mariano.
Search Engine Roundtable Stories:
Other Great Search Threads:
- The Google cache is not a diagnostics tool, John Mueller on Twitter
- October 2022 Google Search Observations, WebmasterWorld
- All of the new tricks and treats in today’s #GoogleDoodle are ghastly good—in fact, it’s the stalk of the town! See if you can keep your ghoul and avoid getting lost in the new corn maze game arena, Google Doodles on Twitter
- Worried about many duplicate URLs indexed, but blocked by robots.txt? Via @johnmu: If blocked, Google doesn’t know what’s on the page content-wise. If you have to dig for those via advanced site queries, then users typically w…, Glenn Gabe on Twitter
- If you have SEO / Search questions, please drop them into our new SEO office hours form at https://t.co/KeLoq7p10V — we’ll be using this as a basis for the next office hours episodes. Thanks!, John Mueller on Twitter
- It’s up to you to convince users & search engines. Go for it, make something awesome, John Mueller on Twitter
Search Engine Land Stories:
Other Great Search Stories:
Analytics
Industry & Business
Links & Content Marketing
Local & Maps
Mobile & Voice
SEO
PPC
Other Search
Feedback:
Have feedback on this daily recap? Let me know on Twitter @rustybrick or @seroundtable. You can follow us on Facebook, and make sure to subscribe to the YouTube channel, Apple Podcasts, Spotify, Google Podcasts, or just contact us the old-fashioned way.
Source: www.seroundtable.com
Can Bing Chat Access Content Behind Paywalls?

There is some concern and speculation on the internet that Microsoft Bing is feeding content that sits behind paywalls into Bing Chat and using that content to provide answers. I asked Bing Chat if it can give answers based on content behind paywalls, and it said no, it cannot.
But I am not sure if this answer is 100% true:
Here is one thread about Bing Chat referencing and citing content behind a paywall to provide an answer:
It’s a tricky minefield. If proven that these generative AIs are trained on proprietary and/or paywalled content, it opens the door to, shall we say, interesting litigation.
— Barry Adams 📰 (@badams) March 19, 2023
Now, is this possible? Well, there are a few plausible explanations for how Bing was able to access this content:
(1) Maybe the content was open for a period of time when it was not behind a paywall, and Bing indexed it then.
(2) Maybe the content provider is giving this paywalled content to Bingbot without a paywall. There are approved ways to give paywalled content to search engines, like the old first click free policy and the flexible sampling approach that replaced it.
So technically, the content might now be behind a paywall for users but not for search engines: Bingbot doesn't see the paywall, but users do. That is a possible technical explanation.
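To make option (2) concrete, here is a minimal sketch, assuming a Flask app and a naive User-Agent check, of how a publisher could serve the full article to search crawlers while everyone else gets a teaser and a paywall. The route, the is_search_crawler helper, and the strings are all hypothetical illustrations, not anything Bing or Google publishes; a real flexible sampling setup should also verify crawler IPs via reverse DNS and mark the gated text with paywalled-content structured data so it is not treated as cloaking.

```python
# Minimal sketch of flexible-sampling-style serving: full content to search
# crawlers, a teaser plus paywall to everyone else. All names and strings
# here are hypothetical examples.
from flask import Flask, request

app = Flask(__name__)

FULL_ARTICLE = "The complete article text..."
TEASER = "The first couple of paragraphs..."

# Naive allow-list for illustration only. Anyone can fake a User-Agent,
# so real deployments verify crawler IPs via reverse DNS.
CRAWLER_TOKENS = ("bingbot", "googlebot")


def is_search_crawler(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)


@app.route("/article")
def article():
    if is_search_crawler(request.headers.get("User-Agent", "")):
        # The crawler never sees a paywall, so the full text gets indexed
        # and can surface in search results and, potentially, chat answers.
        return FULL_ARTICLE
    # Regular visitors hit the paywall after the teaser.
    return TEASER + "\n[Subscribe to keep reading]"
```

If a publisher is set up this way, Bing Chat quoting the full text is arguably working as configured, even though a user who clicks through still hits the paywall.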
Forum discussion at Twitter.
Google Business Profile Services Showing Incorrect Pricing

Google Business Profiles lets businesses list the services they offer; this feature has been around for a while, and those services now seem to impact local rankings. But what is new and scary is that Google is making up pricing for those services, pricing that is almost always incorrect and sometimes dangerous for those businesses.
Carrie Hill and Sukhjit S Matharu spotted this and posted a couple of examples, one from a client and one from a random business. In both cases, the pricing Google listed is incorrect. Carrie said her client is not offering these services for free, despite what Google says, and her client's competitors are not offering bed bug inspections for only $1 and $100.
Here is what Sukhjit S Matharu shared on Twitter, saying “when looking into a client’s services in their GBP, we noticed that some of the predefined services had a ‘free’ label which we nor the client added.”
Here is what Carrie Hill shared on Twitter saying “Here’s another where pricing is arbitrarily added in – not from the client… certainly not correct!”
Joy Hawkins, a local SEO, also confirmed this is new.
I wonder if this is easy for businesses to fix by going into their Google Business Profile and editing their services. But I suspect most of these businesses have no clue Google added these prices to their services, and it might lead to some bad reviews if a customer is charged or quoted more than what is listed in Google Search.
This reminds me of when Google Local Services Ads estimated pricing, which upset many businesses.
Forum discussion at Twitter.
Google’s 15MB Googlebot Limit Applies To Each Individual Subresource

Google has updated the Googlebot help document on crawling to clarify that the 15MB fetch size limit applies to each fetch of the individual subresources referenced in the HTML as well, such as JavaScript and CSS files.
Google initially added information about this 15MB limit several months ago, which caused a lot of concern in the SEO industry, so make sure to read that story.
The help document now reads:
Googlebot can crawl the first 15MB of an HTML file or supported text-based file. Each resource referenced in the HTML such as CSS and JavaScript is fetched separately, and each fetch is bound by the same file size limit. After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of the file for indexing. The file size limit is applied on the uncompressed data. Other Google crawlers, for example Googlebot Video and Googlebot Image, may have different limits.
It previously read:
Googlebot can crawl the first 15MB of an HTML file or supported text-based file. Any resources referenced in the HTML such as images, videos, CSS, and JavaScript are fetched separately. After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of the file for indexing. The file size limit is applied on the uncompressed data. Other Google crawlers may have different limits.
Gary added on LinkedIn, “PSA from my inbox: The 15MB resource fetch threshold applies on the JavaScript resources, too, so if your JavaScript files are larger than that, your site might have a bad time in Google Search. See googlebot.com for information about the fetch features and limitations. Yes, there are JavaScript resources out there that are larger than 15MB. No, it’s not a good idea to have JavaScript files that are that large.”
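As a rough way to audit this on your own site, here is a minimal sketch, assuming the third-party requests and beautifulsoup4 packages and a hypothetical check_page helper, that fetches a page's HTML and each referenced JavaScript and CSS file and flags anything over 15MB uncompressed. It only loosely approximates Googlebot, which crawls and renders differently, but it is enough to spot an oversized bundle.

```python
# Rough audit sketch: compare a page's HTML and each JS/CSS subresource
# against Googlebot's 15MB per-fetch limit. check_page is a hypothetical
# helper, not anything Google provides.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

LIMIT = 15 * 1024 * 1024  # the limit applies per fetch, on uncompressed data


def check_page(url: str) -> None:
    response = requests.get(url, timeout=10)
    # requests transparently decompresses gzip/deflate, so len() here
    # approximates the uncompressed size the limit is measured against.
    print(f"{url}: {len(response.content):,} bytes (HTML)")

    soup = BeautifulSoup(response.text, "html.parser")
    refs = [tag["src"] for tag in soup.find_all("script", src=True)]
    refs += [tag["href"] for tag in soup.find_all("link", rel="stylesheet", href=True)]

    # Each referenced JS/CSS file is fetched separately, so each one gets
    # its own 15MB budget instead of sharing the HTML file's budget.
    for ref in refs:
        resource_url = urljoin(url, ref)
        body = requests.get(resource_url, timeout=10).content
        status = "OVER 15MB" if len(body) > LIMIT else "ok"
        print(f"  {resource_url}: {len(body):,} bytes [{status}]")


check_page("https://example.com/")
```

Run against a page with a bloated bundle, this makes Gary's point obvious: a single JavaScript file can blow its own 15MB budget even when the HTML itself is tiny.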
Here is some early feedback from the SEO community on this:
Are you running into 15MB DOM size? That’s quite a bit.
— johnmu is not a chatbot yet 🐀 (@JohnMu) March 19, 2023
The 15mb is just about fetching, it’s totally separate from the indexing side.
— johnmu is not a chatbot yet 🐀 (@JohnMu) March 19, 2023
Forum discussion at LinkedIn.