
SiteGround Hosting Outage Over – What Happened via @sejournal, @martinibuster



SiteGround web hosting suffered a four-day outage beginning Monday, November 8, 2021. It wasn’t until November 12th that the company tweeted that the problem had been resolved. Many customers lost rankings in Google and a significant amount of website traffic just as the holiday shopping season approached.

Many SiteGround publishers remain upset, largely because of the perceived slow pace of recovery from lost Google Search traffic.

What Caused the SiteGround Problem?

SiteGround provided a statement revealing that the problem was isolated to an issue between Amazon’s Global Accelerator and Google’s crawler.

According to SiteGround:

“On Friday we managed to isolate the Google bot crawling issue to a networking problem that was specific only to Amazon’s Global Accelerator and Google’s crawler bot subnet.

We implemented a fix that bypasses that problem and we’re happy to say that our clients’ sites now get properly crawled and most of them have already returned their rankings.

We are still working closely with both Amazon and Google on finding the cause of the problem.

Based on the latest updates, we suspect it’s a routing issue and Amazon is in contact with Google trying to narrow it down. From Amazon we know there are other clients of theirs that have been affected as well.”



What is Amazon Global Accelerator?

Amazon Global Accelerator is a service that routes traffic around network congestion on the Internet in order to speed up websites.

This is how Amazon describes the Global Accelerator:

“AWS Global Accelerator is a networking service that improves the performance of your users’ traffic by up to 60% using Amazon Web Services’ global network infrastructure. When the internet is congested, AWS Global Accelerator optimizes the path to your application to keep packet loss, jitter, and latency consistently low.”

SiteGround Solves Issue

SiteGround tweeted on Friday November 12th that they had identified the issue and fixed it.

“Status Update: We are glad to inform you that we have implemented a fix for the Google bot crawling issue experienced by some sites. Websites are already being crawled successfully. Please allow a few hours for the DNS changes to take effect. Thank you for your patience!”

Nearly a week after the hosting outage began, SiteGround publicly announced the cause of the problem on Monday, November 15th.



SiteGround tweeted:

“Status Update: On Friday we managed to isolate the Google bot crawling issue to a networking problem that was specific only to Amazon’s Global Accelerator and Google’s crawler bot subnet. We implemented a fix that bypasses that problem.”

SiteGround followed up with a tweet sharing the good news:

“We’re happy to say that the majority of our clients’ sites are being crawled now and most of them have already returned their rankings.”

That was followed by another tweet reporting on the success of the fix:

“All websites hosted on our end were fully operational, and there were no DNS resolution issues with the requests submitted from any other service.”

SiteGround Customers Still Upset

SiteGround implemented a fix. But many customers remained upset over the weekend as their sites still appeared to be affected.

That may not have been a problem at SiteGround but rather the result of delays in the DNS system as the updated records propagated across the Internet, which can take a few days.
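DNS propagation can be spot-checked with a few lines of code. The sketch below is a minimal illustration, not a diagnostic tool: it only asks the local system resolver whether a hostname currently resolves, so results can differ from machine to machine until propagation completes (the hostname is a placeholder).

```python
import socket

def resolves(hostname: str) -> bool:
    """Return True if the local system resolver can resolve hostname right now."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

# "example.com" is a placeholder; substitute the affected site's hostname.
if resolves("example.com"):
    print("resolves from this machine")
else:
    print("not resolving yet; DNS changes may still be propagating")
```

Running the same check from machines in different regions (or against different resolvers) is what distinguishes a lingering propagation delay from an actual outage.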

Positive Reports from SiteGround Clients

Some customers reported that their sites have recovered:

Many Negative Tweets About SiteGround

Some publishers continued to tweet over the weekend about their ongoing problems, which might have been related to DNS propagation or to Google having to re-crawl the sites.

Nevertheless, customers were still tweeting about the slow pace of website traffic recovery.

SiteGround responded to those tweets on Monday November 15th by reiterating that the problem should already be resolved:



“We’ve deployed a fix and the Google bot can crawl the sites we host.

The issue should be resolved across our platform by now, many clients have confirmed.

Please DM us with some URL examples and additional info so we can help further.”

Why do SiteGround Customers Continue to Suffer?

Although SiteGround announced that the problem had been fixed, some customers continued to suffer a loss of traffic.

That is not unexpected, and perhaps SiteGround could have helped customers by making sure they understood what to expect next in terms of Google re-indexing their websites.

Basically, when a site goes missing for an extended period of time, Google begins removing the missing site’s pages from its index. That’s what SiteGround customers experienced over the weekend.

However, Google never really goes away. Google’s crawler, Googlebot, continues to return to the missing website to check whether it has come back.

Once the site returns after an extended absence, it can take anywhere from a few days up to ten days to fully recover, depending on how many web pages need to be re-crawled.



My own experience a couple of years ago, with hundreds of thousands of web pages that temporarily disappeared, was that it took about ten days to recover after a two-week outage.

But for most publishers with smaller sites the recovery may be significantly faster.

Google Offers Insights Into Site Recovery

Google’s John Mueller tweeted and retweeted some helpful information about what SiteGround customers should expect in the coming days in terms of Google re-indexing their websites.

Mueller tweeted insights about this process in October:

“If you’re curious about what happens in Google Search with an outage like Facebook recently had, it’s generally a 2-part reaction: when we can’t reach a site for network / DNS reasons, we see it like a 5xx HTTP server error. This means we reduce crawling:”


“The URLs remain indexed as they are, the site continues to rank as it used to. This is a temporary state though. If it becomes a persistent error (if it lasts more than 1-2 days), we will start dropping those URLs from indexing.”



Mueller tweeted that an outage will not cause Google to change the rankings of the site after it comes back.

“There’s no direct ranking change when this happens – we don’t see the site as being low-quality or similar, but if those URLs aren’t indexed, they can’t rank either.”

Once the outage is fixed, Google will re-crawl the websites. It is important to note that a site cannot resume ranking where it used to until Google has finished re-crawling it.

Mueller tweeted:

“When the site comes back, if we have dropped URLs from indexing, those will generally pop back in once we can successfully crawl them again. Crawling will also speed up again if we can tell that the server’s fine.”

John Mueller repeated this reassurance on November 12, 2021 in a series of tweets:

“Once it’s resolved, Googlebot crawling & indexing picks up automatically again. The crawl rate goes up over time as the errors disappear, the dropped URLs will get recrawled over time and make it back into the index. The visibility will stabilize again.”
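Mueller’s two-part description amounts to a simple feedback loop: persistent fetch errors push the crawl rate down and, after a day or two, push URLs out of the index; a healthy server reverses both. The toy simulation below only illustrates that dynamic; every rate and threshold in it is an invented assumption, not a number Google has published.

```python
def simulate(days_down: int, days_up_after: int,
             start_rate: float = 1.0, indexed: int = 1000) -> tuple[float, int]:
    """Toy model of crawl rate and indexed-URL count around an outage.

    All constants are illustrative assumptions, not Google's actual behavior.
    """
    rate, urls = start_rate, indexed
    for day in range(days_down):
        rate = max(rate * 0.5, 0.05)               # back off while errors persist
        if day >= 2:                               # "more than 1-2 days": start dropping URLs
            urls = int(urls * 0.8)
    for _ in range(days_up_after):
        rate = min(rate * 2, start_rate)           # crawling speeds up once the server is fine
        urls = min(int(urls * 1.25) + 1, indexed)  # dropped URLs "pop back in" over time
    return rate, urls

# Four days down: crawl rate collapses and some URLs drop out of the index.
print(simulate(4, 0))    # (0.0625, 640)
# Ten days after recovery: back to the starting rate and the full index.
print(simulate(4, 10))   # (1.0, 1000)
```

The point of the sketch is the shape of the curve, not the numbers: recovery is gradual because re-indexing depends on Googlebot re-crawling each dropped URL, which matches the multi-day recovery times publishers reported.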



He reaffirmed that there are no lasting effects from an outage.

Mueller tweeted:

“There are generally no lasting effects from temporary outages like these. Technical issues come & go, we have to do our best to make sure users can find their way to your wonderful sites through search results.”

Mueller tweeted this tip for fast re-indexing:

“If you have important pages that you need reprocessed quickly, I’d use “Inspect URL” in Search Console to resubmit them. Within a website, using internal linking to highlight & link to what you really care about is also good.”

Most replies to Mueller’s tweets were positive, but that was not enough to blunt some customers’ lingering frustration.

Websites Recover From Temporary Outages

What happened last week was an unprecedented event. SiteGround is widely viewed in the industry as a reliable web host, which is why it is so popular.



Websites should be back to ranking where they formerly did within days as Google continues to crawl and re-index the sites that temporarily dropped out.


What can ChatGPT do?



ChatGPT Explained

ChatGPT is a large language model developed by OpenAI that is trained on a massive amount of text data. It is capable of generating human-like text and has been used in a variety of applications, such as chatbots, language translation, and text summarization.

One of the key features of ChatGPT is its ability to generate text that is similar to human writing. This is achieved through the use of a transformer architecture, which allows the model to understand the context and relationships between words in a sentence. The transformer architecture is a type of neural network that is designed to process sequential data, such as natural language.

Another important aspect of ChatGPT is its ability to generate text that is contextually relevant. This means that the model is able to understand the context of a conversation and generate responses that are appropriate to the conversation. This is accomplished by the use of a technique called “masked language modeling,” which allows the model to predict the next word in a sentence based on the context of the previous words.

One of the most popular applications of ChatGPT is in the creation of chatbots. Chatbots are computer programs that simulate human conversation and can be used in customer service, sales, and other applications. ChatGPT is particularly well-suited for this task because of its ability to generate human-like text and understand context.

Another application of ChatGPT is language translation. By training the model on a large amount of text data in multiple languages, it can be used to translate text from one language to another. The model is able to understand the meaning of the text and generate a translation that is grammatically correct and semantically equivalent.

In addition to chatbots and language translation, ChatGPT can also be used for text summarization. This is the process of taking a large amount of text and condensing it into a shorter, more concise version. ChatGPT is able to understand the main ideas of the text and generate a summary that captures the most important information.

Despite its many capabilities and applications, ChatGPT is not without its limitations. One of the main challenges with using language models like ChatGPT is the risk of generating text that is biased or offensive. This can occur when the model is trained on text data that contains biases or stereotypes. To address this, OpenAI has implemented a number of techniques to reduce bias in the training data and in the model itself.

In conclusion, ChatGPT is a powerful language model that is capable of generating human-like text and understanding context. It has a wide range of applications, including chatbots, language translation, and text summarization. While there are limitations to its use, ongoing research and development is aimed at improving the model’s performance and reducing the risk of bias.

** The above article was written 100% by ChatGPT. It is an example of what can be done with AI and was included to show the kind of advanced text an automated AI can produce.



Google December Product Reviews Update Affects More Than English Language Sites? via @sejournal, @martinibuster



Google’s Product Reviews update was announced as rolling out to English language pages, with no mention of if or when it would roll out to other languages. Mueller answered a question about whether it is rolling out to other languages.

Google December 2021 Product Reviews Update

On December 1, 2021, Google announced on Twitter that a Product Reviews update would be rolling out, focusing on English language web pages.

The focus of the update was improving the quality of reviews shown in Google search, specifically targeting review sites.

A Googler tweeted a description of the kinds of sites that would be targeted for demotion in the search rankings:

“Mainly relevant to sites that post articles reviewing products.

Think of sites like “best TVs under $200”.com.

Goal is to improve the quality and usefulness of reviews we show users.”



Google also published a blog post with more guidance on the product review update that introduced two new best practices that Google’s algorithm would be looking for.

The first best practice was a requirement of evidence that a product was actually handled and reviewed.

The second best practice was to provide links to more than one place that a user could purchase the product.

The Twitter announcement stated that it was rolling out to English language websites. The blog post did not mention what languages it was rolling out to nor did the blog post specify that the product review update was limited to the English language.

Google’s Mueller Thinking About Product Reviews Update

Screenshot of Google's John Mueller trying to recall if December Product Review Update affects more than the English language


Product Review Update Targets More Languages?

The person asking the question was rightly under the impression that the product review update only affected English language search results.



But he asserted that he was seeing search volatility in the German language that appeared to be related to Google’s December 2021 Product Reviews Update.

This is his question:

“I was seeing some movements in German search as well.

So I was wondering if there could also be an effect on websites in other languages by this product reviews update… because we had lots of movement and volatility in the last weeks.

…My question is, is it possible that the product reviews update affects other sites as well?”

John Mueller answered:

“I don’t know… like other languages?

“My assumption was this was global and across all languages.

But I don’t know what we announced in the blog post specifically.

But usually we try to push the engineering team to make a decision on that so that we can document it properly in the blog post.

I don’t know if that happened with the product reviews update. I don’t recall the complete blog post.

But it’s… from my point of view it seems like something that we could be doing in multiple languages and wouldn’t be tied to English.

And even if it were English initially, it feels like something that is relevant across the board, and we should try to find ways to roll that out to other languages over time as well.

So I’m not particularly surprised that you see changes in Germany.

But I also don’t know what we actually announced with regards to the locations and languages that are involved.”

Does Product Reviews Update Affect More Languages?

While the tweeted announcement specified that the product reviews update was limited to the English language, the official blog post did not mention any such limitation.

Google’s John Mueller offered his opinion that the product reviews update is something that Google could do in multiple languages.

One must wonder if the tweet was meant to communicate that the update was rolling out first in English and subsequently to other languages.

It’s unclear if the product reviews update was rolled out globally to more languages. Hopefully Google will clarify this soon.


Google Blog Post About Product Reviews Update

Product reviews update and your site

Google’s New Product Reviews Guidelines

Write high quality product reviews

John Mueller Discusses If Product Reviews Update Is Global

Watch Mueller answer the question at the 14:00 minute mark.




Survey says: Amazon, Google more trusted with your personal data than Apple is




MacRumors reveals that more people feel comfortable with their personal data in the hands of Amazon and Google than in Apple’s. Companies that the public really doesn’t trust with their personal data include Facebook, TikTok, and Instagram.

The survey asked over 1,000 internet users in the U.S. how much they trusted certain companies such as Facebook, TikTok, Instagram, WhatsApp, YouTube, Google, Microsoft, Apple, and Amazon to handle their user data and browsing activity responsibly.

Amazon and Google are considered by survey respondents to be more trustworthy than Apple

Those surveyed were asked whether they trusted these firms with their personal data “a great deal,” “a good amount,” “not much,” or “not at all.” Respondents could also answer that they had no opinion about a particular company. 18% of those polled said that they trust Apple “a great deal” which topped the 14% received by Google and Amazon.

However, 39% said that they trust Amazon “a good amount,” with Google picking up 34% in that same category. Only 26% of those answering said that they trust Apple “a good amount.” The first two responses, “a great deal” and “a good amount,” are considered positive replies for a company. “Not much” and “not at all” are considered negative responses.

By adding up the scores in the positive categories, Apple tallied 44% (18% said they trusted Apple with their personal data “a great deal” while 26% said they trusted Apple “a good amount”). But that placed the tech giant third, after Amazon’s 53% and Google’s 48%. After Apple, Microsoft finished fourth with 43%, YouTube (which is owned by Google) was fifth with 35%, and Facebook was sixth at 20%.
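The positive totals follow directly from the two quoted response categories. A quick sanity check of the arithmetic for the three leaders, using only percentages quoted in the article:

```python
# ("a great deal", "a good amount") percentages as quoted in the article.
positive = {
    "Amazon": (14, 39),
    "Google": (14, 34),
    "Apple":  (18, 26),
}

# Combined positive score per company, then rank from most to least trusted.
totals = {company: a + b for company, (a, b) in positive.items()}
ranking = sorted(totals, key=totals.get, reverse=True)

print(totals)   # {'Amazon': 53, 'Google': 48, 'Apple': 44}
print(ranking)  # ['Amazon', 'Google', 'Apple']
```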

Rounding out the remainder of the nine firms in the survey, Instagram placed seventh with a positive score of 19%, WhatsApp was eighth with a score of 15%, and TikTok was last at 12%.

Looking at the scoring for the two negative responses (“not much” or “not at all”), Facebook had a combined negative score of 72%, making it the least trusted company in the survey. TikTok was next at 63%, with Instagram following at 60%. WhatsApp and YouTube were both in the middle of the pack at 53%, followed by Google at 47% and Microsoft at 42%. Apple and Amazon had the lowest combined negative scores, at 40% each.

74% of those surveyed called targeted online ads invasive

The survey also found that a whopping 82% of respondents found targeted online ads annoying, and 74% called them invasive. Just 27% found such ads helpful. This response doesn’t exactly track with the 62% of iOS users who have used Apple’s App Tracking Transparency feature to opt out of being tracked while browsing websites and using apps. The tracking allows third-party firms to send users targeted ads online, which is something they cannot do to users who have opted out.

The 38% of iOS users who decided not to opt out of being tracked might have done so because they find it convenient to receive targeted ads about a certain product that they looked up online. But is ATT actually doing anything?

Marketing strategy consultant Eric Seufert said last summer, “Anyone opting out of tracking right now is basically having the same level of data collected as they were before. Apple hasn’t actually deterred the behavior that they have called out as being so reprehensible, so they are kind of complicit in it happening.”

The Financial Times says that iPhone users are being lumped together by certain behaviors instead of unique ID numbers in order to send targeted ads. Facebook chief operating officer Sheryl Sandberg says that the company is working to rebuild its ad infrastructure “using more aggregate or anonymized data.”

Aggregated data is a collection of individual data that is used to create high-level data. Anonymized data is data that removes any information that can be used to identify the people in a group.

When consumers were asked how often they think their phones or other tech devices are listening in on them in ways they didn’t agree to, 72% answered “very often” or “somewhat often,” while 28% said “rarely” or “never.”
