NEWS
Google Update November 2019
There is substantial evidence that an unannounced Google Update is underway. This update has been affecting sites across a wide range of niches. Most of the feedback is negative although there are winners mixed in, including winners in the spam community.
Impact Felt Across Many Industries
Forensic SEO expert Alan Bleiweiss (@AlanBleiweiss) noted in a tweet that this update affected a wide range of industries. He tweeted:
“I monitor 47 sites
- Mental Health Knowledge base & Directory – up 20%
- Travel booking – up 14%
- Travel booking – up 25%
- Recipes – up 12%
- B to C eCom – up 20%
- Tech news – down 20%
- Skin care affiliate – down 48%
- Alternative health – not impacted
- Other Recipe sites – not impacted”
Note: The double listing for travel sites is not a typo. It’s a reference to two travel sites.
Recipe Bloggers Report Update Effects
Google does not target specific niches. But the recipe blogger niche is a highly organized community, so when something big happens, the community’s voice is amplified.
Thus it was that the recipe blogger community noticed this update. As of earlier today, a growing list of 47 recipe blogs had reported losses from this update.
Casey Markee of MediaWyse (@MediaWyse) who specializes in food blog SEO said this about those 47 sites that were suffering from Google’s November 2019 update:
“All of them, big sites, small sites, medium sites, all are showing like 30%+ drops. I know it’s tough on bloggers to see drops and think “OMG I need to make some dramatic changes.”
But that’s like throwing darts at a board, blindfolded, in the dark, while underwater. Definitely need to WAIT for more data and until this “update” (or whatever it is) has fully rolled-out.”
I agree that it’s best to wait and hear what Google says about this update. I took a quick look at two of the recipe sites that had lost traffic, and both of them had heavy keyword densities.
For example, I extracted the heading outlines (H1, H2) of both sites, and both were hitting their target keywords hard in every single heading. A keyword density report on two sample pages revealed that the target keywords accounted for as much as 5% of all words on the page.
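For readers who want to run a similar check on their own pages, here is a minimal sketch of that kind of keyword density calculation in Python. The page text and target phrase below are hypothetical examples, not the actual sites reviewed here.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the share of words on the page accounted for by the keyword phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    keyword_words = keyword.lower().split()
    n = len(keyword_words)
    if not words or n == 0:
        return 0.0
    # Count occurrences of the full keyword phrase in the word stream.
    hits = sum(
        1 for i in range(len(words) - n + 1) if words[i:i + n] == keyword_words
    )
    # Density: words consumed by the keyword phrase vs. all words on the page.
    return (hits * n) / len(words) * 100

# Hypothetical page text and target keyword, for illustration only.
page_text = (
    "Easy banana bread recipe. This banana bread recipe is the best banana bread "
    "you will ever bake, and it comes together in under an hour."
)
print(f"Keyword density: {keyword_density(page_text, 'banana bread'):.1f}%")
```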
Travel Bloggers
Leslie Harvey of Trips with Tykes (@TripsWithTykes) tweeted:
“Seeing it reported heavily in the travel blog world too – dozens of high traffic travel blogs are significantly down from anecdotes I’m hearing.”
Many First Time Google Update Casualties
A curious note about this update is that many publishers report never having been hit by a previous update and are surprised to finally be hit by the November 2019 Google update. As a representative example, one publisher tweeted:
“I’m down 25% organic in the last two days. I’ve seen positive impacts from every previous algo update.”
Jim @ UncoveringPA (@UncoveringPA) tweeted:
“Definitely frustrating. First time I’ve ever gotten a big hit from a Google update. Down 20-25%. For my local site, it seems mostly that more official sites got boosted over mine, even though mine has better and more in depth content.”
Facebook-Hat Observations
A member of the private SEO Signals Lab Facebook group had a similar experience.
As an example, a Facebook member stated that his seven-year-old site, which had ranked in the top three for years, suddenly dropped to pages two and three.
Perhaps of interest, perhaps not: this member reported having used content analysis software to analyze the SERPs to understand what kinds of words are ranking for queries. Whether or not that’s a clue, this is another of the many people who reported never having suffered from a previous Google update and are now having a difficult time with this one.
Another Facebook member reported a 20% change in traffic but with little change in keyword positions. That kind of result is sometimes caused by an increase in People Also Ask, Featured Snippets, Carousels, Top Stories and other “helpful” Google features that tend to push organic results down.
According to Moz’s MozCast, the biggest changes have been in Top Stories, with a spike beginning on November 5th, 2019 that peaked on Thursday, November 7th, when evidence of an update was beginning to surge.
Could that be causing some of the traffic declines? It’s hard to say; these are the first days of the update and information is still trickling in.
In general, more people are reporting ranking losses than gains in the SEO Signals Lab group. And on Twitter, those reporting losses far outnumber the winners.
Gray Hat Update Experience
Members of the self-described “gray hat” Proper SEO Facebook group are generally positive. This group is focused on Private Blog Network (PBN) links. The Admin of the group posted an Accuranker graph showing that all his keywords were on a steep upward trajectory, with one member saying his graph resembled a hockey stick.
Another member remarked that he was seeing week-over-week improvements in sales on the order of 30%, with Friday tracking at a 70% improvement.
Several members also noted huge gains in Google Local rankings.
Overall, the gray hats are reporting more positive results than negative ones.
Black Hat Update Reaction
Over at the Black Hat World forums members are discussing dramatic changes to their rankings. Some are reporting losses as high as 40%.
A few however are reporting improvements from Google’s November update.
According to one member:
“There must be something going on, I’ve seen 2 people on amazon affiliate forum stating that their traffic took a huge nose dive. Mine is up 30% yesterday and today.”
While in another discussion a black hat member commented:
“Just today I saw massive improvements finally… “
No-Hat SEO Observations on Google Update
Over on the WebmasterWorld forums, members started noticing changes on Wednesday, November 6th. One member noted on Thursday that they hadn’t seen conversions in 24 hours. Another member noted:
“I’m seeing serps filled with malware and sneaky redirects from .cf .tk .ml .gq .ga domains.
Google can’t tell apart anymore a legitimate site and a spam site. This is serious. The anti-spam team lost the battle.”
That observation seems to affirm the positive reports seen in black hat and gray hat communities.
Member Paperchaser said:
“Entertainment industry here, I see lot of movement on my end, lot of sites losing their ranking heavily and couple hours later everything gets back to where they were.”
Fishing Hat SEO Reaction to Update
Someone asked me if I’m a white hat SEO. My response was that I’m more of a camo fishing hat type of SEO. In other words, I color within the lines while doing what needs to be done to get results.
That means understanding what Google is trying to accomplish and working within those parameters. It’s not about “tricking” Google. It’s about knowing the lines and coloring within those lines.
It’s also about being critical of SEO information that is unsupported by any kind of Google research or patents. For example, for the past year there has been a line of thought that it’s important to add author bios to websites. That’s been debunked by Googlers like John Mueller.
There have been numerous responses to recent updates that have proven false because they were based on poor sources of information or poor reasoning. Hence many of today’s poor recommendations, such as:
- The advice to build an author page
- The advice to display expert accreditation on the site
- The advice to improve E-A-T (Google has confirmed there is no E-A-T ranking factor)
All of that advice has been debunked by Gary Illyes and John Mueller.
That poor advice came from using Google’s Search Quality Raters Guidelines as a way to understand Google’s algorithm.
The Search Quality Raters Guidelines can be used to set goals for what a website can be and from those goals one can create strategies to achieve those high standards.
But that document was never a road map of Google’s algorithm. It was a road map for how to rate a website. That’s all. Those are two different things. The document is useless for trying to understand why a site is no longer ranking, because there are no ranking secrets in the Search Quality Raters Guidelines.
Google Does Not Target Individual Niches
Google rarely targets an industry. In the past some SEOs promoted the idea that Google was targeting medical sites. They were wrong and now we are stuck with the ineptly named Medic Update to remind the SEO industry that Google does not target specific industries.
Google is Not Targeting Recipe Blogs
Anyone who says that Google is targeting the recipe blogs is wrong. This update has affected publishers from all niches. It’s a broad update that affects a wide range of site topics.
Takeaway About Google’s November Update
The point then is to wait until we hear from Google.
Most of the past updates have focused on relevance through understanding user queries better, understanding what web pages are about, and understanding the link signal better.
It could be that Google is rolling out a combination of changes related to content, queries and backlinks. That’s a safe bet, but at this point we simply don’t know. To say that Google is targeting specific kinds of sites or types of content is to skate on thin ice.
NEWS
OpenAI Introduces Fine-Tuning for GPT-4, Enabling Customized AI Models
OpenAI today announced the release of fine-tuning capabilities for its flagship GPT-4 large language model, marking a significant milestone in the AI landscape. This new functionality empowers developers to create tailored versions of GPT-4 to suit specialized use cases, enhancing the model’s utility across various industries.
Fine-tuning has long been a desired feature for developers who require more control over AI behavior, and with this update, OpenAI delivers on that demand. The ability to fine-tune GPT-4 allows businesses and developers to refine the model’s responses to better align with specific requirements, whether for customer service, content generation, technical support, or other unique applications.
Why Fine-Tuning Matters
GPT-4 is a very flexible model that can handle many different tasks. However, some businesses and developers need more specialized AI that matches their specific language, style, and needs. Fine-tuning helps with this by letting them adjust GPT-4 using custom data. For example, companies can train a fine-tuned model to keep a consistent brand tone or focus on industry-specific language.
Fine-tuning also offers improvements in areas like response accuracy and context comprehension. For use cases where nuanced understanding or specialized knowledge is crucial, this can be a game-changer. Models can be taught to better grasp intricate details, improving their effectiveness in sectors such as legal analysis, medical advice, or technical writing.
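As an illustration of what that custom data can look like, here is a minimal sketch that writes a couple of hypothetical brand-tone examples in the chat-style JSONL format used for fine-tuning uploads. The brand, file name, and conversations are invented for illustration.

```python
import json

# Hypothetical training examples: each record is one conversation showing the
# tone and answers the fine-tuned model should reproduce.
examples = [
    {"messages": [
        {"role": "system", "content": "You are the support assistant for Acme Coffee. Keep a warm, concise tone."},
        {"role": "user", "content": "My grinder arrived with a cracked hopper."},
        {"role": "assistant", "content": "Oh no, that's not the unboxing we want for you! Reply with your order number and we'll ship a replacement hopper right away."},
    ]},
    {"messages": [
        {"role": "system", "content": "You are the support assistant for Acme Coffee. Keep a warm, concise tone."},
        {"role": "user", "content": "Do you ship to Canada?"},
        {"role": "assistant", "content": "We sure do! Standard shipping to Canada takes 5 to 7 business days."},
    ]},
]

# One JSON object per line, as expected for fine-tuning training files.
with open("brand_tone_examples.jsonl", "w", encoding="utf-8") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```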
Key Features of GPT-4 Fine-Tuning
The fine-tuning process leverages OpenAI’s established tools, but now it is optimized for GPT-4’s advanced architecture. Notable features include:
- Enhanced Customization: Developers can precisely influence the model’s behavior and knowledge base.
- Consistency in Output: Fine-tuned models can be made to maintain consistent formatting, tone, or responses, essential for professional applications.
- Higher Efficiency: Compared to training models from scratch, fine-tuning GPT-4 allows organizations to deploy sophisticated AI with reduced time and computational cost.
Additionally, OpenAI has emphasized ease of use with this feature. The fine-tuning workflow is designed to be accessible even to teams with limited AI experience, reducing barriers to customization. For more advanced users, OpenAI provides granular control options to achieve highly specialized outputs.
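Below is a minimal sketch of what that workflow could look like with the OpenAI Python SDK: uploading a training file (such as the one sketched earlier) and starting a fine-tuning job. The file name and model identifier are placeholder assumptions; check OpenAI’s documentation for the exact model names available to your account.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# 1. Upload the chat-format JSONL training data.
training_file = client.files.create(
    file=open("brand_tone_examples.jsonl", "rb"),
    purpose="fine-tune",
)

# 2. Start a fine-tuning job against the base model.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-2024-08-06",  # placeholder; substitute a model you have fine-tuning access to
)

# 3. Check the job status; the fine-tuned model name is returned when it succeeds.
print(client.fine_tuning.jobs.retrieve(job.id).status)
```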
Implications for the Future
The launch of fine-tuning capabilities for GPT-4 signals a broader shift toward more user-centric AI development. As businesses increasingly adopt AI, the demand for models that can cater to specific business needs, without compromising on performance, will continue to grow. OpenAI’s move positions GPT-4 as a flexible and adaptable tool that can be refined to deliver optimal value in any given scenario.
By offering fine-tuning, OpenAI not only enhances GPT-4’s appeal but also reinforces the model’s role as a leading AI solution across diverse sectors. From startups seeking to automate niche tasks to large enterprises looking to scale intelligent systems, GPT-4’s fine-tuning capability provides a powerful resource for driving innovation.
OpenAI announced that fine-tuning GPT-4o will cost $25 per million tokens used during training. Once a fine-tuned model is deployed, it will cost $3.75 per million input tokens and $15 per million output tokens. To help developers get started, OpenAI is offering 1 million free training tokens per day for GPT-4o and 2 million free training tokens per day for GPT-4o mini until September 23. This makes it easier for developers to try out the fine-tuning service.
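To put those prices in context, here is a rough, hypothetical cost estimate. The per-token prices come from the announcement above; the token counts are illustrative assumptions, not figures from OpenAI.

```python
# Prices quoted in the announcement (USD per 1M tokens).
TRAINING_PER_M = 25.00
INPUT_PER_M = 3.75
OUTPUT_PER_M = 15.00

# Hypothetical usage for illustration only.
training_tokens = 2_000_000   # size of the training set
monthly_input = 10_000_000    # prompt tokens per month
monthly_output = 2_000_000    # completion tokens per month

training_cost = training_tokens / 1_000_000 * TRAINING_PER_M
monthly_cost = (monthly_input / 1_000_000 * INPUT_PER_M
                + monthly_output / 1_000_000 * OUTPUT_PER_M)

print(f"One-time training cost: ${training_cost:.2f}")   # $50.00
print(f"Monthly inference cost: ${monthly_cost:.2f}")    # $67.50
```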
As AI continues to evolve, OpenAI’s focus on customization and adaptability with GPT-4 represents a critical step in making advanced AI accessible, scalable, and more aligned with real-world applications. This new capability is expected to accelerate the adoption of AI across industries, creating a new wave of AI-driven solutions tailored to specific challenges and opportunities.
This Week in Search News: Simple and Easy-to-Read Update
Here’s what happened in the world of Google and search engines this week:
1. Google’s June 2024 Spam Update
Google finished rolling out its June 2024 spam update over a period of seven days. This update aims to reduce spammy content in search results.
2. Changes to Google Search Interface
Google has removed the continuous scroll feature for search results. Instead, it’s back to the old system of pages.
3. New Features and Tests
- Link Cards: Google is testing link cards at the top of AI-generated overviews.
- Health Overviews: There are more AI-generated health overviews showing up in search results.
- Local Panels: Google is testing AI overviews in local information panels.
4. Search Rankings and Quality
- Improving Rankings: Google said it can improve its search ranking system but will only do so on a large scale.
- Measuring Quality: Google’s Elizabeth Tucker shared how they measure search quality.
5. Advice for Content Creators
- Brand Names in Reviews: Google advises not to avoid mentioning brand names in review content.
- Fixing 404 Pages: Google explained when it’s important to fix 404 error pages.
6. New Search Features in Google Chrome
Google Chrome for mobile devices has added several new search features to enhance user experience.
7. New Tests and Features in Google Search
- Credit Card Widget: Google is testing a new widget for credit card information in search results.
- Sliding Search Results: When making a new search query, the results might slide to the right.
8. Bing’s New Feature
Bing is now using AI to write “People Also Ask” questions in search results.
9. Local Search Ranking Factors
Menu items and popular times might be factors that influence local search rankings on Google.
10. Google Ads Updates
- Query Matching and Brand Controls: Google Ads updated its query matching and brand controls, and advertisers are happy with these changes.
- Lead Credits: Google will automate lead credits for Local Service Ads. Google says this is a good change, but some advertisers are worried.
- tROAS Insights Box: Google Ads is testing a new insights box for tROAS (Target Return on Ad Spend) in Performance Max and Standard Shopping campaigns.
- WordPress Tag Code: There is a new conversion code for Google Ads on WordPress sites.
These updates highlight how Google and other search engines are continuously evolving to improve user experience and provide better advertising tools.
Facebook Faces Yet Another Outage: Platform Encounters Technical Issues Again
Updated: It seems that today’s issues with Facebook haven’t affected as many users as last time. A smaller group of people appears to be impacted this time around, which is a relief compared to the larger incident before. Nevertheless, it’s still frustrating for those affected, and hopefully the issues will be resolved soon by the Facebook team.
Facebook had another problem today (March 20, 2024). According to Downdetector, a website that shows when other websites are not working, many people had trouble using Facebook.
This isn’t the first time Facebook has had issues. Just a little while ago, there was another problem that stopped people from using the site. Today, when people tried to use Facebook, it didn’t work like it should. People couldn’t see their friends’ posts, and sometimes the website wouldn’t even load.
Downdetector, which watches out for problems on websites, showed that lots of people were having trouble with Facebook. People from all over the world said they couldn’t use the site, and they were not happy about it.
When websites like Facebook have problems, it affects a lot of people. It’s not just about not being able to see posts or chat with friends. It can also impact businesses that use Facebook to reach customers.
Since Facebook owns Messenger and Instagram, the problems with Facebook also meant that people had trouble using these apps. It made the situation even more frustrating for many users, who rely on these apps to stay connected with others.
During this recent problem, one thing is obvious: the internet is always changing, and even big websites like Facebook can have problems. While people wait for Facebook to fix the issue, it shows us how easily things online can go wrong. It’s a good reminder that we should have backup plans for staying connected online, just in case something like this happens again.