
Google Considers Reducing Webpage Crawl Rate


Google may reduce how frequently it crawls webpages as it grows more conscious of the sustainability of crawling and indexing.

This topic is discussed by Google’s Search Relations team, which is made up of John Mueller, Martin Splitt, and Gary Illyes.

Together, in the latest episode of the Search Off the Record podcast, they discuss what to expect from Google in 2022 and beyond.

Among the topics they address is crawling and indexing, which SEO professionals and website owners say they’ve seen less of over the past year.

That’s going to be a key focus for Google this year as it aims to make crawling more sustainable by conserving computing resources.


Here’s what that will mean for your website and its performance in search results.

Sustainability Of Crawling & Indexing

Since Googlebot's crawling and indexing happen virtually, you may not think of them as something that has an impact on the environment.

Illyes brings this issue to attention when he says computing isn’t sustainable in general:

“… what I mean is that computing, in general, is not really sustainable. And if you think of Bitcoin, for example, Bitcoin mining has real impact on the environment that you can actually measure, especially if the electricity is coming from coal plants or other less sustainable plants.

We are carbon-free, since I don’t even know, 2007 or something, 2009, but it doesn’t mean that we can’t reduce even more our footprint on the environment. And crawling is one of those things that early on, we could chop off some low-hanging fruits.”

The low-hanging fruit, in this instance, refers to unnecessary web crawling, such as crawling webpages that haven't had any recent updates.

How Will Google Make Crawling More Sustainable?

Illyes goes on to explain that web crawling can be made more sustainable by cutting down on refresh crawls.


There are two types of Googlebot crawling: crawling to discover new content and crawling to refresh existing content.

Google is considering scaling back on crawling to refresh content.

Illyes continues:

“… one thing that we do, and we might not need to do that much, is refresh crawls. Which means that once we discovered a document, a URL, then we go, we crawl it, and then, eventually, we are going to go back and revisit that URL. That is a refresh crawl.

And then every single time we go back to that one URL, that will always be a refresh crawl. Now, how often do we need to go back to that URL?”
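
Illyes frames the problem as a question: how often does Googlebot actually need to return to a given URL? If you want to see how that plays out on your own site, one low-effort check is to count Googlebot revisits per URL in your server access logs. The sketch below is an assumption-laden illustration: it assumes a combined-format log at a hypothetical path, access.log, and simply trusts the user-agent string, whereas a real audit should verify Googlebot via reverse DNS.

```python
# Rough sketch: count how often Googlebot revisits each URL, using a
# combined-format access log. The path "access.log" is an assumption,
# and trusting the user-agent string is a shortcut -- verify Googlebot
# via reverse DNS for a real audit.
import re
from collections import Counter

REQUEST = re.compile(r'"(?:GET|HEAD) (?P<url>\S+) HTTP/[^"]*"')

revisits = Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = REQUEST.search(line)
        if match:
            revisits[match.group("url")] += 1

# The URLs Googlebot refreshes most often -- compare these against how
# often the pages actually change.
for url, hits in revisits.most_common(20):
    print(f"{hits:6d}  {url}")
```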

He goes on to give an example of certain websites that warrant a significant number of refresh crawls for some parts of the site but not others.

A website like the Wall Street Journal is constantly updating its homepage with new content, so it warrants a lot of refresh crawls.


However, WSJ is unlikely to update its About page as frequently, so Google doesn't need to keep doing refresh crawls on those types of pages.

“So you don’t have to go back there that much. And often, we can’t estimate this well, and we definitely have room for improvement there on refresh crawls. Because sometimes it just seems wasteful that we are hitting the same URL over and over again.

Sometimes we are hitting 404 pages, for example, for no good reason or no apparent reason. And all these things are basically stuff that we could improve on and then reduce our footprint even more.”
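
There is also a site-owner-side counterpart to the "hitting the same URL over and over again" problem: if a refresh crawl arrives and the page hasn't changed, answering with a standard HTTP 304 Not Modified response keeps that visit cheap for both the server and the crawler. The following is a minimal sketch, assuming Flask and a made-up ARTICLES store in place of a real CMS; it illustrates the conditional-request mechanism, not anything Google has prescribed.

```python
# Minimal sketch: answer refresh crawls with 304 Not Modified when a page
# hasn't changed. Flask and the in-memory ARTICLES store are assumptions
# standing in for a real framework and CMS.
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

from flask import Flask, Response, request

app = Flask(__name__)

ARTICLES = {
    "example-post": {
        "body": "<h1>Example post</h1>",
        "updated": datetime(2022, 1, 10, tzinfo=timezone.utc),
    }
}

@app.get("/articles/<slug>")
def article(slug):
    page = ARTICLES.get(slug)
    if page is None:
        return Response("Not found", status=404)

    since = request.headers.get("If-Modified-Since")
    if since:
        try:
            if page["updated"] <= parsedate_to_datetime(since):
                # Unchanged since the crawler's last visit: no body needed.
                return Response(status=304)
        except (TypeError, ValueError):
            pass  # Malformed header: fall through and send the full page.

    resp = Response(page["body"], mimetype="text/html")
    resp.headers["Last-Modified"] = format_datetime(page["updated"], usegmt=True)
    return resp
```

The same idea works with an ETag/If-None-Match pair; either way, the crawler still decides when to revisit, the site just makes unchanged revisits inexpensive.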

If Google were to cut down on refresh crawls, which is not 100% confirmed, here’s the impact that could have on your website.

What Does A Reduction In Crawl Rate Mean For Your Website?

There’s a belief out there that a high crawl rate is a positive SEO signal, even if you’re not updating your content as often as Google is crawling it.

That’s a misconception, Illyes says, as content will not necessarily rank better because it gets crawled more.

Mueller:


“So I guess that’s kind of also a misconception that people have in that they think if a page gets crawled more, it’ll get ranked more. Is that correct that that’s a misconception, or is that actually true?”

Illyes:

“It’s a misconception.”

Mueller:

“OK, so no need to try to force something to be re-crawled if it doesn’t actually change. It’s not going to rank better.”

Again, it’s not confirmed that Google will reduce refresh crawls, but it’s an idea the team is actively considering.

If Google follows through on this idea, it won't be a bad thing for your website. More crawling does not mean better rankings.

Moreover, the idea is to learn which pages need refresh crawls and which pages do not. That means the pages you change more often will continue to be refreshed and updated in search results.
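
If Google does get better at estimating which pages warrant refresh crawls, accurate change signals from your side can only help. The standard mechanism is an XML sitemap with truthful lastmod dates, which Google's documentation says it can use when the dates are consistently accurate. Here is a minimal sketch that writes such a sitemap; the URLs and timestamps are placeholders.

```python
# Minimal sketch: write a sitemap with truthful <lastmod> dates so crawlers
# can tell which pages actually changed. URLs and dates are placeholders.
from datetime import date
from xml.etree import ElementTree as ET

PAGES = [
    ("https://www.example.com/", date(2022, 1, 14)),      # homepage: changes daily
    ("https://www.example.com/about", date(2019, 6, 3)),  # rarely changes
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, last_modified in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = last_modified.isoformat()

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

The point isn't that lastmod forces or blocks a crawl; it just makes it easier for a crawler to skip pages that genuinely haven't changed.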

For more details on how Google plans to pull this off, listen to the full discussion in the Search Off the Record podcast episode (starting at the 2:40 mark).

Featured Image: Alena Veasey/Shutterstock





Google Limits News Links In California Over Proposed ‘Link Tax’ Law


Google announced that it plans to reduce access to California news websites for a portion of users in the state.

The decision comes as Google prepares for the potential passage of the California Journalism Preservation Act (CJPA), a bill requiring online platforms like Google to pay news publishers for linking to their content.

What Is The California Journalism Preservation Act?

The CJPA, introduced in the California State Legislature, aims to support local journalism by creating what Google refers to as a “link tax.”

If passed, the Act would force companies like Google to pay media outlets when sending readers to news articles.

However, Google believes this approach is flawed and could harm rather than help the news industry.


Jaffer Zaidi, Google’s VP of Global News Partnerships, stated in a blog post:

“It would favor media conglomerates and hedge funds—who’ve been lobbying for this bill—and could use funds from CJPA to continue to buy up local California newspapers, strip them of journalists, and create more ghost papers that operate with a skeleton crew to produce only low-cost, and often low-quality, content.”

Google’s Response

To assess the potential impact of the CJPA on its services, Google is running a test with a percentage of California users.

During this test, Google will remove links to California news websites that the proposed legislation could cover.

Zaidi states:

“To prepare for possible CJPA implications, we are beginning a short-term test for a small percentage of California users. The testing process involves removing links to California news websites, potentially covered by CJPA, to measure the impact of the legislation on our product experience.”

Google Claims Only 2% of Search Queries Are News-Related

Zaidi highlighted people's changing news consumption habits and their effect on Google search queries:

“It’s well known that people are getting news from sources like short-form videos, topical newsletters, social media, and curated podcasts, and many are avoiding the news entirely. In line with those trends, just 2% of queries on Google Search are news-related.”

Despite the low percentage of news queries, Google wants to continue helping news publishers gain visibility on its platforms.


However, the “CJPA as currently constructed would end these investments,” Zaidi says.

A Call For A Different Approach

In its current form, Google maintains that the CJPA undermines news in California and could leave all parties worse off.

The company urges lawmakers to consider alternative approaches that support the news industry without harming smaller local outlets.

Google argues that, over the past two decades, it’s done plenty to help news publishers innovate:

“We’ve rolled out Google News Showcase, which operates in 26 countries, including the U.S., and has more than 2,500 participating publications. Through the Google News Initiative we’ve partnered with more than 7,000 news publishers around the world, including 200 news organizations and 6,000 journalists in California alone.”

Zaidi suggested that a healthy news industry in California requires support from the state government and a broad base of private companies.

As the legislative process continues, Google is willing to cooperate with California publishers and lawmakers to explore alternative paths that would allow it to continue linking to news.


Featured Image: Ismael Juan/Shutterstock


The Best of Ahrefs’ Digest: March 2024


Every week, we share hot SEO news, interesting reads, and new posts in our newsletter, Ahrefs’ Digest.

If you’re not one of our 280,000 subscribers, you’ve missed out on some great reads!

Here’s a quick summary of my personal favorites from the last month:

Best of March 2024

How 16 Companies are Dominating the World’s Google Search Results

Author: Glen Allsopp

tl;dr

Glen’s research reveals that just 16 companies representing 588 brands get 3.5 billion (yes, billion!) monthly clicks from Google.

My takeaway

Glen pointed out some really actionable ideas in this report, such as the fact that many of the brands dominating search are adding mini-author bios.

Example of mini-author bios on The Verge

This idea makes so much sense in terms of both UX and E-E-A-T. I’ve already pitched it to the team and we’re going to implement it on our blog.

How Google is Killing Independent Sites Like Ours

Authors: Gisele Navarro, Danny Ashton

tl;dr

Big publications have gotten into the affiliate game, publishing “best of” lists about everything under the sun. And despite often not testing products thoroughly, they’re dominating Google rankings. The result, Gisele and Danny argue, is that genuine review sites suffer and Google is fast losing content diversity.

My takeaway

I have a lot of sympathy for independent sites. Some of them are trying their best, but unfortunately, they’re lumped in with thousands of others who are more than happy to spam.

Estimated search traffic to Danny and Gisele’s site fell off a cliff after Google’s March updates 🙁

I know it’s hard to hear, but the truth is Google benefits more from having big sites in the SERPs than from having diversity. That’s because results from big brands are likely what users actually want. By and large, people would rather shop at Walmart or ALDI than at a local store or farmer’s market.

That said, I agree with most people that Forbes (with its dubious contributor model contributing to scams and poor journalism) should not be rewarded so handsomely.

The Discussion Forums Dominating 10,000 Product Review Search Results

Author: Glen Allsopp

tl;dr

Glen analyzed 10,000 “product review” keywords and found that:


My takeaway

After Google's heavy promotion of Reddit following last year's Core Update, unscrupulous SEOs and marketers have, to no one's surprise, already started spamming Reddit. And as you may know, Reddit's moderation is done by volunteers, who obviously can't keep up.

I’m not sure how this second-order effect completely escaped the smart minds at Google, but from the outside, it feels like Google has capitulated to some extent.

John Mueller seemingly having too much faith in Reddit...

I’m not one to make predictions and I have no idea what will happen next, but I agree with Glen: Google’s results are the worst I’ve seen them. We can only hope Google sorts itself out.

Who Sends Traffic on the Web and How Much? New Research from Datos & SparkToro

Author: Rand Fishkin

tl;dr

63.41% of all U.S. web traffic referrals from the top 170 sites are initiated on Google.com.

Data from SparkToro

My takeaway

Despite all of our complaints, Google is still the main platform to acquire traffic from. That’s why we all want Google to sort itself out and do well.

But it would also be a mistake to look at this post and think Google is the only channel you should drive traffic from. As Rand’s later blog post clarifies, “be careful not to ascribe attribution or credit to Google when other investments drove the real value.”

I think many affiliate marketers learned this lesson well from the past few Core Updates: Relying on one single channel to drive all of your traffic is not a good idea. You should be using other platforms to build brand awareness, interest, and demand.

Want more?

Each week, our team handpicks the best SEO and marketing content from around the web for our newsletter. Sign up to get them directly in your inbox.


Google Unplugs “Notes on Search” Experiment


Google is shutting down its Notes Search Labs experiment, which allowed users to see and leave notes on Google's search results, and many in the search community aren't too surprised.

Google Search Notes

Availability of the feature was limited to Android and Apple devices, and the Notes experiment never had a clearly defined practical purpose. Search marketers' reaction throughout was consistently that it would become a spam magnet.

The Search Labs page for the experiment touts it as a mode of self-expression, a way to help other users, and a way for users to collect their own notes within their Google profiles.

The official Notes page in Search Labs has a simple notice:

Notes on Search Ends May 2024

That’s it.


Reaction From Search Community

Kevin Indig tweeted his view that anything Google builds with a user-generated content component is doomed to attract spam.

He tweeted:

“I’m gonna assume Google retires notes because of spam.

It’s crazy how spammy the web has become. Google can’t launch anything UGC without being bombarded.”

Cindy Krum (@Suzzicks) tweeted that it was author Purna Virji who predicted it would be shut down once Google received enough data.

She shared:

Advertisement

“It was actually @purnavirji who predicted it when we were at @BarbadosSeo – while I was talking. Everyone agreed that it would be spammed, but she said it would just be a test to collect a certain type of information until they got what they needed, and then it would be retired.”

Purna herself responded with a tweet:

“My personal (non-employer) opinion is that everyone wants all the UGC to train the AI models. Eg Reddit deal also could potentially help with that.”

Google's Notes for Search seemed destined never to take off. It was met with skepticism and a shrug when it came out, and nobody's really mourning that it's on the way out, either.

Featured Image by Shutterstock/Jamesbin


