
NEWS

Data Seemingly Proves Googlebot Crawling Has Slowed via @sejournal, @martinibuster


There has been ongoing discussion across social media over the past few weeks that Googlebot has dramatically reduced its crawling. For example, the founder of a web crawl analysis service tweeted a graph showing how Google’s crawl activity has declined since November 11, 2021.

Although the indexing slowdown doesn’t affect all sites, many on Twitter and Reddit agree that something changed at Google with respect to indexing, and they back it up with screenshots of Googlebot activity.

Evidence of Reduced Crawling

Anecdotal evidence of Google crawling anomalies has been stacking up on social media. The problem with social media is that one can make virtually any observation about Google and be nearly guaranteed to receive agreement.

Anecdotal evidence is interesting, but not as good as data-backed observations, which is what appeared recently on Twitter.

A founder of the crawler and log analysis service Seolyzer (@Seolyzer_io) posted a graph of Google crawling behavior that showed a dramatic drop-off in crawling activity beginning on November 11th.


He posted:

“Googlebot is on strike! Googlebot has drastically reduced its crawl activity on many large sites since November 11 at 6PM (GMT).”

304 Server Response Code and Googlebot Crawling

Some have noted a pattern in which Googlebot suddenly stopped crawling pages that serve a 304 server response code.

A 304 response code means “Not Modified.”

That response code is generated by a server when a browser (or Googlebot) makes a conditional request for a page.

That means the browser (or Googlebot) tells the server it already has the web page saved in its cache, so the server shouldn’t bother sending it again unless the page has been updated (modified).

Here is a definition of the 304 (Not Modified) server response code from the HTTP Working Group:

“The 304 (Not Modified) status code indicates that a conditional GET or HEAD request has been received and would have resulted in a 200 (OK) response if it were not for the fact that the condition evaluated to false.

In other words, there is no need for the server to transfer a representation of the target resource because the request indicates that the client, which made the request conditional, already has a valid representation; the server is therefore redirecting the client to make use of that stored representation as if it were the payload of a 200 (OK) response.”
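
To make the mechanics concrete, here is a minimal sketch of a conditional request in Python using the requests library. The URL is a placeholder, and the validators shown (ETag/If-None-Match and Last-Modified/If-Modified-Since) are the standard HTTP ones; this illustrates the protocol itself, not how Googlebot is implemented.

```python
import requests

url = "https://www.example.com/some-page"  # placeholder URL

# First request: fetch the page and note its cache validators.
first = requests.get(url)
etag = first.headers.get("ETag")
last_modified = first.headers.get("Last-Modified")

# Second request: make it conditional by echoing the validators back.
headers = {}
if etag:
    headers["If-None-Match"] = etag
if last_modified:
    headers["If-Modified-Since"] = last_modified

second = requests.get(url, headers=headers)

if second.status_code == 304:
    # Page unchanged: the server sent no body, so the cached copy is reused.
    print("304 Not Modified - using cached content")
else:
    # Page changed (or the server ignored the validators): a fresh body was returned.
    print(f"{second.status_code} - content re-fetched")
```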


304 Response Causes Less Googlebot Crawling?

One person tweeted confirmation (in French) that several AMP sites he monitors experienced a drop in crawling on pages that responded with a 304.

The person who posted the original tweet responded with a graph showing how Google nearly stopped crawling pages that responded with a 304 server response code.

Others noticed a similar issue, with pages serving a 304 response showing drastically lower crawl rates.

Another person noticed reduced crawls on travel pages but a crawl increase on ecommerce pages.

Many others shared analytics and Search Console screenshots with additional data.

304 Response Code Should Not Alter Crawling

Google’s official developer documentation on Googlebot crawling states that a 304 response code should have no effect on indexing.

Here’s what Google’s official documentation advises:

“Googlebot signals the indexing pipeline that the content is the same as last time it was crawled.

The indexing pipeline may recalculate signals for the URL, but otherwise the status code has no effect on indexing.”


Is it possible that Google’s behavior has changed (permanently or temporarily) and that the developer page is outdated?

Cookie Consent Theory

The 304 server response theory is one of many theories and proposed solutions offered to explain why Googlebot might not index a web page.

One person tweeted that Google increased indexing after removing a cookie consent bar.

Why would a cookie consent bar cause Google to not index a web page? Could the cookie consent bar have triggered a 304 response, causing Google to skip indexing the page?

Reduced Googlebot Crawls Discussed at Reddit

The phenomenon of reduced Googlebot crawls was also discussed on Reddit.

A Redditor described how, in the past, articles from their successful site were indexed within 10 minutes of being submitted via Google Search Console.

They related that recently only about half of their new articles were being indexed.


According to the Reddit post, the change came in November:

“For whatever reason now less than half of our new articles are indexing, even with me manually submitting them all right after publishing.”

Other redditors shared similar experiences:

“A lot of people are experiencing similar right now… Something seems to be going on with Google.”

“Something is up with Google indexing new posts…”

“My website is 17 years old… suddenly, the latest article took weeks to get indexed.”

Google Says Nothing is Broken

Google’s John Mueller responded to the questions on Reddit:

“I don’t see anything broken in the way Google indexes stuff at the moment. I do see us being critical about what we pick up for indexing though, as any search engine should.”

Is Google Testing New Crawling Patterns?

In October, Bing announced an open-source indexing protocol called IndexNow. Its goal is to reduce how often crawlers need to crawl web pages, cutting the energy used by data centers for crawling and by servers for serving those pages. The protocol also benefits publishers because it speeds up the process of notifying search engines when pages are updated or created, resulting in faster indexing of quality web pages.
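
For readers unfamiliar with how the protocol is used, here is a minimal sketch of a single IndexNow submission based on the publicly documented api.indexnow.org endpoint. The host, key, key location, and URL are placeholders; a real submission requires hosting the key file at the stated location so the search engine can verify ownership of the site.

```python
import requests

# Placeholders: replace with your own host, key, and changed URLs.
payload = {
    "host": "www.example.com",
    "key": "your-indexnow-key",  # the same key must be served at keyLocation
    "keyLocation": "https://www.example.com/your-indexnow-key.txt",
    "urlList": [
        "https://www.example.com/updated-article",
    ],
}

# api.indexnow.org shares the submission with participating search engines.
response = requests.post(
    "https://api.indexnow.org/indexnow",
    json=payload,
    headers={"Content-Type": "application/json; charset=utf-8"},
)

# A 200/202 response indicates the submission was accepted;
# a 4xx response usually points to a key or payload problem.
print(response.status_code)
```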


In November, Google announced that it would test the IndexNow protocol to see whether it offers benefits.

Saving energy and reducing carbon footprints are among the most important issues of our time. Could it be that Google is finding ways to reduce crawling without radically switching to a new protocol?

Has Google Reduced Web Page Crawling?

There are some claims that Google has stopped indexing altogether, but that is incorrect. There is, however, significant discussion on social media, backed by data, suggesting that Googlebot crawling and indexing patterns have changed.

Source: SearchEngineJournal.com


NEWS

OpenAI Introduces Fine-Tuning for GPT-4, Enabling Customized AI Models


OpenAI today announced the release of fine-tuning capabilities for its flagship GPT-4 large language model, marking a significant milestone in the AI landscape. This new functionality empowers developers to create tailored versions of GPT-4 for specialized use cases, enhancing the model’s utility across various industries.

Fine-tuning has long been a desired feature for developers who require more control over AI behavior, and with this update, OpenAI delivers on that demand. The ability to fine-tune GPT-4 allows businesses and developers to refine the model’s responses to better align with specific requirements, whether for customer service, content generation, technical support, or other unique applications.

Why Fine-Tuning Matters

GPT-4 is a very flexible model that can handle many different tasks. However, some businesses and developers need more specialized AI that matches their specific language, style, and needs. Fine-tuning helps with this by letting them adjust GPT-4 using custom data. For example, companies can train a fine-tuned model to keep a consistent brand tone or focus on industry-specific language.

Fine-tuning also offers improvements in areas like response accuracy and context comprehension. For use cases where nuanced understanding or specialized knowledge is crucial, this can be a game-changer. Models can be taught to better grasp intricate details, improving their effectiveness in sectors such as legal analysis, medical advice, or technical writing.

Key Features of GPT-4 Fine-Tuning

The fine-tuning process leverages OpenAI’s established tools, but now it is optimized for GPT-4’s advanced architecture. Notable features include:

  • Enhanced Customization: Developers can precisely influence the model’s behavior and knowledge base.
  • Consistency in Output: Fine-tuned models can be made to maintain consistent formatting, tone, or responses, essential for professional applications.
  • Higher Efficiency: Compared to training models from scratch, fine-tuning GPT-4 allows organizations to deploy sophisticated AI with reduced time and computational cost.

Additionally, OpenAI has emphasized ease of use with this feature. The fine-tuning workflow is designed to be accessible even to teams with limited AI experience, reducing barriers to customization. For more advanced users, OpenAI provides granular control options to achieve highly specialized outputs.
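
As a rough illustration of what that workflow looks like with the OpenAI Python SDK, the sketch below uploads a JSONL file of chat-formatted examples and starts a fine-tuning job. The file name is hypothetical, and the model identifier is only an example; the exact models available for fine-tuning depend on your account and OpenAI’s current lineup.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Each line of the JSONL file is one training example in chat format, e.g.:
# {"messages": [{"role": "system", "content": "..."},
#               {"role": "user", "content": "..."},
#               {"role": "assistant", "content": "..."}]}
training_file = client.files.create(
    file=open("brand_tone_examples.jsonl", "rb"),  # hypothetical file name
    purpose="fine-tune",
)

# Start the fine-tuning job; the model identifier is a placeholder.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-2024-08-06",
)

# Poll the job status; once it finishes, the returned fine-tuned model name
# can be used in chat completion requests like any base model.
print(client.fine_tuning.jobs.retrieve(job.id).status)
```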

Implications for the Future

The launch of fine-tuning capabilities for GPT-4 signals a broader shift toward more user-centric AI development. As businesses increasingly adopt AI, the demand for models that can cater to specific business needs, without compromising on performance, will continue to grow. OpenAI’s move positions GPT-4 as a flexible and adaptable tool that can be refined to deliver optimal value in any given scenario.

By offering fine-tuning, OpenAI not only enhances GPT-4’s appeal but also reinforces the model’s role as a leading AI solution across diverse sectors. From startups seeking to automate niche tasks to large enterprises looking to scale intelligent systems, GPT-4’s fine-tuning capability provides a powerful resource for driving innovation.

OpenAI announced that fine-tuning GPT-4o will cost $25 for every million tokens used during training. Once a fine-tuned model is in use, it will cost $3.75 per million input tokens and $15 per million output tokens. To help developers get started, OpenAI is offering 1 million free training tokens per day for GPT-4o and 2 million free training tokens per day for GPT-4o mini until September 23, making it easier for developers to try out the fine-tuning service.
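
Using only the prices quoted above, here is a back-of-the-envelope calculation for a hypothetical workload of 2 million training tokens followed by 10 million input and 2 million output tokens of usage. Real bills depend on details not covered by these figures, such as the number of training epochs.

```python
# Prices quoted in the announcement (per million tokens)
TRAINING_PER_M = 25.00
INPUT_PER_M = 3.75
OUTPUT_PER_M = 15.00

# Hypothetical workload, in millions of tokens
training_tokens_m = 2
input_tokens_m = 10
output_tokens_m = 2

training_cost = training_tokens_m * TRAINING_PER_M  # $50.00
usage_cost = (input_tokens_m * INPUT_PER_M          # $37.50
              + output_tokens_m * OUTPUT_PER_M)     # + $30.00

print(f"Training: ${training_cost:.2f}, Usage: ${usage_cost:.2f}, "
      f"Total: ${training_cost + usage_cost:.2f}")  # Total: $117.50
```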

As AI continues to evolve, OpenAI’s focus on customization and adaptability with GPT-4 represents a critical step in making advanced AI accessible, scalable, and more aligned with real-world applications. This new capability is expected to accelerate the adoption of AI across industries, creating a new wave of AI-driven solutions tailored to specific challenges and opportunities.


GOOGLE

This Week in Search News: Simple and Easy-to-Read Update


Here’s what happened in the world of Google and search engines this week:

1. Google’s June 2024 Spam Update

Google finished rolling out its June 2024 spam update over a period of seven days. This update aims to reduce spammy content in search results.

2. Changes to Google Search Interface

Google has removed the continuous scroll feature for search results. Instead, it’s back to the old system of pages.

3. New Features and Tests

  • Link Cards: Google is testing link cards at the top of AI-generated overviews.
  • Health Overviews: There are more AI-generated health overviews showing up in search results.
  • Local Panels: Google is testing AI overviews in local information panels.

4. Search Rankings and Quality

  • Improving Rankings: Google said it can improve its search ranking systems, but only makes such changes at scale rather than for individual sites.
  • Measuring Quality: Google’s Elizabeth Tucker shared how they measure search quality.

5. Advice for Content Creators

  • Brand Names in Reviews: Google advises that you should not avoid mentioning brand names in review content.
  • Fixing 404 Pages: Google explained when it’s important to fix 404 error pages.

6. New Search Features in Google Chrome

Google Chrome for mobile devices has added several new search features to enhance user experience.

7. New Tests and Features in Google Search

  • Credit Card Widget: Google is testing a new widget for credit card information in search results.
  • Sliding Search Results: When making a new search query, the results might slide to the right.

8. Bing’s New Feature

Bing is now using AI to write “People Also Ask” questions in search results.

9. Local Search Ranking Factors

Menu items and popular times might be factors that influence local search rankings on Google.

10. Google Ads Updates

  • Query Matching and Brand Controls: Google Ads updated its query matching and brand controls, and advertisers are happy with these changes.
  • Lead Credits: Google will automate lead credits for Local Service Ads. Google says this is a good change, but some advertisers are worried.
  • tROAS Insights Box: Google Ads is testing a new insights box for tROAS (Target Return on Ad Spend) in Performance Max and Standard Shopping campaigns.
  • WordPress Tag Code: There is a new conversion code for Google Ads on WordPress sites.

These updates highlight how Google and other search engines are continuously evolving to improve user experience and provide better advertising tools.


FACEBOOK

Facebook Faces Yet Another Outage: Platform Encounters Technical Issues Again


Updated: It seems that today’s issues with Facebook haven’t affected as many users as last time. A smaller group of people appears to be impacted this time around, which is a relief compared to the larger incident before. Nevertheless, it’s still frustrating for those affected, and hopefully the Facebook team will resolve the issues soon.

Facebook had another problem today (March 20, 2024). According to Downdetector, a website that shows when other websites are not working, many people had trouble using Facebook.

This isn’t the first time Facebook has had issues. Just a little while ago, there was another problem that stopped people from using the site. Today, when people tried to use Facebook, it didn’t work as it should. People couldn’t see their friends’ posts, and sometimes the website wouldn’t even load.

Downdetector, which watches out for problems on websites, showed that lots of people were having trouble with Facebook. People from all over the world said they couldn’t use the site, and they were not happy about it.

When websites like Facebook have problems, it affects a lot of people. It’s not just about not being able to see posts or chat with friends. It can also impact businesses that use Facebook to reach customers.

Since Facebook owns Messenger and Instagram, the problems with Facebook also meant that people had trouble using these apps. It made the situation even more frustrating for many users, who rely on these apps to stay connected with others.

This recent problem makes one thing obvious: the internet is always changing, and even big websites like Facebook can have problems. While people wait for Facebook to fix the issue, it shows how easily things online can go wrong. It’s a good reminder that we should have backup plans for staying connected online, just in case something like this happens again.
