NEWS

SiteGround Hosting Outage Over – What Happened via @sejournal, @martinibuster

SiteGround web hosting suffered a significant four-day outage beginning Monday, November 8, 2021. It wasn’t until November 12th that the company tweeted that it had resolved the problem. Many customers lost rankings in Google and a significant amount of website traffic as the holiday shopping season fast approached.

Many SiteGround publishers remain upset, largely because of the perceived slow pace of recovery from lost Google Search traffic.

What Caused the SiteGround Problem?

SiteGround provided a statement revealing that the problem was isolated to an issue between Amazon’s Global Accelerator and Google’s crawler.

According to SiteGround:

“On Friday we managed to isolate the Google bot crawling issue to a networking problem that was specific only to Amazon’s Global Accelerator and Google’s crawler bot subnet.

We implemented a fix that bypasses that problem and we’re happy to say that our clients’ sites now get properly crawled and most of them have already returned their rankings.

We are still working closely with both Amazon and Google on finding the cause of the problem.

Based on the latest updates, we suspect it’s a routing issue and Amazon is in contact with Google trying to narrow it down. From Amazon we know there are other clients of theirs that have been affected as well.”


What is Amazon Global Accelerator?

Amazon Global Accelerator is a service that routes user traffic over AWS’s global network to avoid Internet congestion and speed up websites.

This is how Amazon describes the Global Accelerator:

“AWS Global Accelerator is a networking service that improves the performance of your users’ traffic by up to 60% using Amazon Web Services’ global network infrastructure. When the internet is congested, AWS Global Accelerator optimizes the path to your application to keep packet loss, jitter, and latency consistently low.”

SiteGround Solves Issue

SiteGround tweeted on Friday, November 12th, that they had identified the issue and fixed it.

“Status Update: We are glad to inform you that we have implemented a fix for the Google bot crawling issue experienced by some sites. Websites are already being crawled successfully. Please allow a few hours for the DNS changes to take effect. Thank you for your patience!”

Nearly a week after the hosting outage began, SiteGround publicly announced the cause of the problem on Monday, November 15th.


SiteGround tweeted:

“Status Update: On Friday we managed to isolate the Google bot crawling issue to a networking problem that was specific only to Amazon’s Global Accelerator and Google’s crawler bot subnet. We implemented a fix that bypasses that problem.”

SiteGround followed up with a tweet expressing their relief:

“We’re happy to say that the majority of our clients’ sites are being crawled now and most of them have already returned their rankings.”

That followed another tweet reporting on the success of the fix:

“All websites hosted on our end were fully operational, and there were no DNS resolution issues with the requests submitted from any other service.”

SiteGround Customers Still Upset

SiteGround implemented a fix. But many customers remained upset over the weekend as their sites still appeared to be affected.

That may not have been a problem at SiteGround itself, but rather the result of delays in DNS propagation across the Internet, which can take a few days.

Positive Reports from SiteGround Clients

Some customers reported that their sites had recovered.

Many Negative Tweets About SiteGround

Some publishers continued to tweet over the weekend about their ongoing problems, which might have been related to DNS propagation or to Google needing to re-crawl the sites.

Nevertheless, customers were still tweeting about the slow pace of website traffic recovery.

SiteGround responded to those tweets on Monday November 15th by reiterating that the problem should already be resolved:


“We’ve deployed a fix and the Google bot can crawl the sites we host.

The issue should be resolved across our platform by now, many clients have confirmed.

Please DM us with some URL examples and additional info so we can help further.”

Why do SiteGround Customers Continue to Suffer?

Although SiteGround announced that the problem had been fixed, some customers continued to suffer a loss of traffic.

That is not unexpected, and SiteGround could perhaps have helped customers by making sure they understood what to expect next as Google re-indexed their websites.

Basically, when a site goes missing for an extended period of time Google will begin removing the missing site from its index. That’s what SiteGround customers experienced over the weekend.

However, a site never really disappears from Google for good. Google’s crawler, Googlebot, continues to return to the missing website to check whether it has come back.

Once the site returns after an extended absence, it can take anywhere from a few days to as many as ten days to fully recover, depending on how many web pages need to be re-crawled.


My own experience a couple of years ago, with hundreds of thousands of web pages that temporarily disappeared, was that it took about ten days to recover after a two-week outage.

But for most publishers with smaller sites the recovery may be significantly faster.

Google Offers Insights Into Site Recovery

Google’s John Mueller tweeted and retweeted some helpful information about what SiteGround customers should expect in the coming days in terms of Google re-indexing their websites.

Google’s John Mueller tweeted insights about this process in October:

“If you’re curious about what happens in Google Search with an outage like Facebook recently had, it’s generally a 2-part reaction: when we can’t reach a site for network / DNS reasons, we see it like a 5xx HTTP server error. This means we reduce crawling:”

Then:

“The URLs remain indexed as they are, the site continues to rank as it used to. This is a temporary state though. If it becomes a persistent error (if it lasts more than 1-2 days), we will start dropping those URLs from indexing.”
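Mueller’s point that network and DNS failures look like 5xx server errors to Google is also why the standard advice for planned downtime is to answer crawlers with a 503 status and a Retry-After header, signaling a temporary condition rather than removed content. This is a minimal, hypothetical sketch using only Python’s standard library; the port and message are made up for illustration:

```python
# Minimal sketch: a maintenance page that answers every request with a 503
# (Service Unavailable) plus a Retry-After header, which crawlers treat as
# a temporary error rather than a reason to drop URLs immediately.
import http.server
import threading
import urllib.error
import urllib.request

class MaintenanceHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(503)  # temporary, unlike a 404/410
        self.send_header("Retry-After", "3600")  # suggest retrying in an hour
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Down for maintenance")

    def log_message(self, *args):
        pass  # suppress request logging

# Run the server on a free local port and fetch a page from it.
server = http.server.HTTPServer(("127.0.0.1", 0), MaintenanceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

try:
    urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/")
except urllib.error.HTTPError as e:
    status, retry_after = e.code, e.headers["Retry-After"]

server.shutdown()
print(status, retry_after)  # 503 3600
```

A crawler seeing this response knows the outage is temporary, which matches the “reduce crawling, keep URLs indexed for a while” behavior Mueller describes.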


Mueller tweeted that an outage will not cause Google to change the rankings of the site after it comes back.

“There’s no direct ranking change when this happens – we don’t see the site as being low-quality or similar, but if those URLs aren’t indexed, they can’t rank either.”

Once the outage is fixed, Google will re-crawl the websites. It’s important to note that a site cannot resume ranking where it used to until Google has finished re-crawling it.

Mueller tweeted:

“When the site comes back, if we have dropped URLs from indexing, those will generally pop back in once we can successfully crawl them again. Crawling will also speed up again if we can tell that the server’s fine.”

John Mueller repeated this reassurance on November 12, 2021 in a series of tweets:

“Once it’s resolved, Googlebot crawling & indexing picks up automatically again. The crawl rate goes up over time as the errors disappear, the dropped URLs will get recrawled over time and make it back into the index. The visibility will stabilize again.”


He reaffirmed that there are no lasting effects from an outage.

John Mueller tweeted:

“There are generally no lasting effects from temporary outages like these. Technical issues come & go, we have to do our best to make sure users can find their way to your wonderful sites through search results.”

Mueller tweeted this tip for fast re-indexing:

“If you have important pages that you need reprocessed quickly, I’d use “Inspect URL” in Search Console to resubmit them. Within a website, using internal linking to highlight & link to what you really care about is also good.”

Most replies to Mueller’s tweets were positive, but not enough to blunt some publishers’ lingering frustration.

Websites Recover From Temporary Outages

What happened last week was an unprecedented event. SiteGround is widely viewed in the industry as a reliable web host, which is why it is so popular.


Websites should be back to ranking where they formerly did within days as Google continues to crawl and re-index the sites that temporarily dropped out.

Searchenginejournal.com


NEWS

OpenAI Introduces Fine-Tuning for GPT-4, Enabling Customized AI Models


OpenAI today announced the release of fine-tuning capabilities for its flagship GPT-4 large language model, marking a significant milestone in the AI landscape. This new functionality empowers developers to create tailored versions of GPT-4 for specialized use cases, enhancing the model’s utility across various industries.

Fine-tuning has long been a desired feature for developers who require more control over AI behavior, and with this update, OpenAI delivers on that demand. The ability to fine-tune GPT-4 allows businesses and developers to refine the model’s responses to better align with specific requirements, whether for customer service, content generation, technical support, or other unique applications.

Why Fine-Tuning Matters

GPT-4 is a very flexible model that can handle many different tasks. However, some businesses and developers need more specialized AI that matches their specific language, style, and needs. Fine-tuning helps with this by letting them adjust GPT-4 using custom data. For example, companies can train a fine-tuned model to keep a consistent brand tone or focus on industry-specific language.

Fine-tuning also offers improvements in areas like response accuracy and context comprehension. For use cases where nuanced understanding or specialized knowledge is crucial, this can be a game-changer. Models can be taught to better grasp intricate details, improving their effectiveness in sectors such as legal analysis, medical advice, or technical writing.

Key Features of GPT-4 Fine-Tuning

The fine-tuning process leverages OpenAI’s established tools, but now it is optimized for GPT-4’s advanced architecture. Notable features include:

  • Enhanced Customization: Developers can precisely influence the model’s behavior and knowledge base.
  • Consistency in Output: Fine-tuned models can be made to maintain consistent formatting, tone, or responses, essential for professional applications.
  • Higher Efficiency: Compared to training models from scratch, fine-tuning GPT-4 allows organizations to deploy sophisticated AI with reduced time and computational cost.

Additionally, OpenAI has emphasized ease of use with this feature. The fine-tuning workflow is designed to be accessible even to teams with limited AI experience, reducing barriers to customization. For more advanced users, OpenAI provides granular control options to achieve highly specialized outputs.
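To make the “adjust GPT-4 using custom data” idea concrete, OpenAI’s fine-tuning endpoints accept training examples as JSON Lines of chat conversations. The brand voice, file name, and example content below are hypothetical; this sketch only prepares and validates such a file:

```python
# Hypothetical sketch: preparing a fine-tuning dataset in the chat-style
# JSON Lines format used by OpenAI's fine-tuning API. Each line is one
# self-contained training conversation.
import json

examples = [
    {
        "messages": [
            {"role": "system", "content": "You are Acme Corp's support assistant. Be concise and friendly."},
            {"role": "user", "content": "How do I reset my password?"},
            {"role": "assistant", "content": "Click 'Forgot password' on the sign-in page and follow the emailed link."},
        ]
    },
    # ...in practice, dozens to thousands of examples like the one above
]

# Write one JSON object per line.
with open("training_data.jsonl", "w") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")

# Sanity-check that every line parses as standalone JSON.
with open("training_data.jsonl") as f:
    conversations = [json.loads(line) for line in f]
print(len(conversations))  # one parsed conversation per line
```

The resulting file would then be uploaded through the API and referenced when creating a fine-tuning job; the consistent assistant replies are what teach the model the desired tone.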

Implications for the Future

The launch of fine-tuning capabilities for GPT-4 signals a broader shift toward more user-centric AI development. As businesses increasingly adopt AI, the demand for models that can cater to specific business needs, without compromising on performance, will continue to grow. OpenAI’s move positions GPT-4 as a flexible and adaptable tool that can be refined to deliver optimal value in any given scenario.

By offering fine-tuning, OpenAI not only enhances GPT-4’s appeal but also reinforces the model’s role as a leading AI solution across diverse sectors. From startups seeking to automate niche tasks to large enterprises looking to scale intelligent systems, GPT-4’s fine-tuning capability provides a powerful resource for driving innovation.

OpenAI announced that fine-tuning GPT-4o will cost $25 for every million tokens used during training. After the model is set up, it will cost $3.75 per million input tokens and $15 per million output tokens. To help developers get started, OpenAI is offering 1 million free training tokens per day for GPT-4o and 2 million free tokens per day for GPT-4o mini until September 23. This makes it easier for developers to try out the fine-tuning service.
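Using only the rates quoted above, the cost of a fine-tuning run plus subsequent usage reduces to simple per-million-token arithmetic. This sketch encodes just those published rates; the token counts in the example are made up:

```python
# Back-of-the-envelope cost estimate from the GPT-4o fine-tuning prices
# quoted above: $25 per 1M training tokens, then $3.75 per 1M input tokens
# and $15 per 1M output tokens when using the tuned model.
TRAIN_PER_M = 25.00
INPUT_PER_M = 3.75
OUTPUT_PER_M = 15.00

def finetune_cost(training_tokens: int, input_tokens: int, output_tokens: int) -> float:
    """Return the total estimated cost in US dollars."""
    return (
        training_tokens / 1_000_000 * TRAIN_PER_M
        + input_tokens / 1_000_000 * INPUT_PER_M
        + output_tokens / 1_000_000 * OUTPUT_PER_M
    )

# Example: a 2M-token training run, then 400k input / 100k output tokens of use.
cost = finetune_cost(2_000_000, 400_000, 100_000)
print(f"${cost:.2f}")  # $50.00 training + $1.50 input + $1.50 output = $53.00
```

Note that the daily free training tokens mentioned above would offset the training portion of such an estimate until September 23.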

As AI continues to evolve, OpenAI’s focus on customization and adaptability with GPT-4 represents a critical step in making advanced AI accessible, scalable, and more aligned with real-world applications. This new capability is expected to accelerate the adoption of AI across industries, creating a new wave of AI-driven solutions tailored to specific challenges and opportunities.


GOOGLE

This Week in Search News: Simple and Easy-to-Read Update


Here’s what happened in the world of Google and search engines this week:

1. Google’s June 2024 Spam Update

Google finished rolling out its June 2024 spam update over a period of seven days. This update aims to reduce spammy content in search results.

2. Changes to Google Search Interface

Google has removed the continuous scroll feature for search results. Instead, it’s back to the old system of pages.

3. New Features and Tests

  • Link Cards: Google is testing link cards at the top of AI-generated overviews.
  • Health Overviews: There are more AI-generated health overviews showing up in search results.
  • Local Panels: Google is testing AI overviews in local information panels.

4. Search Rankings and Quality

  • Improving Rankings: Google said it can improve its search ranking system but will only do so on a large scale.
  • Measuring Quality: Google’s Elizabeth Tucker shared how they measure search quality.

5. Advice for Content Creators

  • Brand Names in Reviews: Google advises not to avoid mentioning brand names in review content.
  • Fixing 404 Pages: Google explained when it’s important to fix 404 error pages.

6. New Search Features in Google Chrome

Google Chrome for mobile devices has added several new search features to enhance user experience.

7. New Tests and Features in Google Search

  • Credit Card Widget: Google is testing a new widget for credit card information in search results.
  • Sliding Search Results: When making a new search query, the results might slide to the right.

8. Bing’s New Feature

Bing is now using AI to write “People Also Ask” questions in search results.

9. Local Search Ranking Factors

Menu items and popular times might be factors that influence local search rankings on Google.

10. Google Ads Updates

  • Query Matching and Brand Controls: Google Ads updated its query matching and brand controls, and advertisers are happy with these changes.
  • Lead Credits: Google will automate lead credits for Local Service Ads. Google says this is a good change, but some advertisers are worried.
  • tROAS Insights Box: Google Ads is testing a new insights box for tROAS (Target Return on Ad Spend) in Performance Max and Standard Shopping campaigns.
  • WordPress Tag Code: There is a new conversion code for Google Ads on WordPress sites.

These updates highlight how Google and other search engines are continuously evolving to improve user experience and provide better advertising tools.


FACEBOOK

Facebook Faces Yet Another Outage: Platform Encounters Technical Issues Again


Updated: It seems that today’s issues with Facebook haven’t affected as many users as last time. A smaller group of people appears to be impacted this time around, which is a relief compared to the larger incident before. Nevertheless, it’s still frustrating for those affected, and hopefully the issues will be resolved soon by the Facebook team.

Facebook had another problem today (March 20, 2024). According to Downdetector, a website that shows when other websites are not working, many people had trouble using Facebook.

This isn’t the first time Facebook has had issues. Just a little while ago, there was another problem that stopped people from using the site. Today, when people tried to use Facebook, it didn’t work like it should. People couldn’t see their friends’ posts, and sometimes the website wouldn’t even load.

Downdetector, which watches out for problems on websites, showed that lots of people were having trouble with Facebook. People from all over the world said they couldn’t use the site, and they were not happy about it.

When websites like Facebook have problems, it affects a lot of people. It’s not just about not being able to see posts or chat with friends. It can also impact businesses that use Facebook to reach customers.

Since Facebook owns Messenger and Instagram, the problems with Facebook also meant that people had trouble using these apps. It made the situation even more frustrating for many users, who rely on these apps to stay connected with others.

During this recent problem, one thing is obvious: the internet is always changing, and even big websites like Facebook can have problems. While people wait for Facebook to fix the issue, it shows us how easily things online can go wrong. It’s a good reminder that we should have backup plans for staying connected online, just in case something like this happens again.
