SEO
Google Penalties: The Newbie-Friendly Guide
A penalty from Google is every webmaster’s worst nightmare. But even if you get one, all is not lost.
In this post, you will learn the following:
Thankfully, we have never dealt with penalties on Ahrefs’ blog, so I reached out to SEO experts who deal with Google penalties in their professional work:
Let’s rock!
In simple terms, a penalty is a “punishment” manually imposed on a website by Google’s webspam team.
This generally happens when a website violates Google’s quality guidelines. The penalty results in a dramatic drop in rankings and a loss of organic traffic. It’s worth noting that the negative effects of Google’s algorithm updates should not be mistaken for a penalty.
Google does not use the term “penalties” in its documentation. Instead, it calls them manual actions and algorithmic actions.
If your organic rankings and traffic suddenly drop while your website has no downtimes or technical SEO issues, you may be facing one of two things:
- Your website was reviewed by a human and got a manual action.
- A Google update resulted in an algorithmic action demoting your website.
Google’s mission is to present the best results to its users. Its quality guidelines explain what it expects from websites. Any attempt to manipulate its ranking factors or abuse its quality guidelines is called spam.
Every day, Google discovers around 40 billion spammy pages. Here’s how it fights spam.
Google has systems that can detect spam at the “crawling” stage. As a result, a large number of pages considered spammy won’t even make it to Google’s index.
The content included in its index is double-checked for spam. Such low-quality content may be shown in the search results, but its search rankings will be low.
As explained by Google, these AI-aided automated systems provide 99% protection against spam. Manual actions come into play for the remaining 1%.
Google has a vast army of human reviewers who can impose manual actions on websites.
In 2020, they sent 2.9 million messages to site owners in Search Console related to manual webspam actions.
Sidenote.
Manual actions are imposed and lifted by an in-house webspam team at Google. Don’t confuse this team with the quality raters, who only help Google evaluate changes to its algorithms and are not privy to any insights regarding the inner workings of Google Search.
Whenever a website gets a manual action, the issue will always be visible under Security &amp; Manual Actions in Google Search Console. It looks something like this.
With such a message, you know the problem and can address it directly.
Manual actions imposed on a website will result in pages being ranked lower or omitted from the search results completely.
But diagnosing the effect of an algorithmic adjustment is much more challenging, as you will get zero notifications from Google.
The best way to identify an algorithmic action is to look at your Google organic traffic and see if you have a drop that coincides with a known or suspected algorithm update.
You can search through webmaster forums or Twitter to see if other webmasters are facing similar issues. Google Search Central Help Community, for example, is an excellent place to start.
With Ahrefs, you can check if other websites in your industry are losing their rankings too.
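The check described above can also be sketched programmatically. This is a minimal sketch that assumes you’ve exported daily organic traffic as (date, sessions) pairs; the update dates, threshold, and window are illustrative values, not anything official from Google.

```python
from datetime import date

# Known or suspected algorithm update dates (illustrative examples)
UPDATE_DATES = [date(2020, 12, 3), date(2021, 6, 2)]

def drops_near_updates(traffic, threshold=0.20, window_days=7):
    """Flag days where traffic fell by more than `threshold` versus the
    previous data point, within `window_days` of a known update date.

    `traffic` is a list of (date, sessions) tuples sorted by date.
    Returns a list of (date, previous_sessions, current_sessions) tuples.
    """
    flagged = []
    for (_, prev), (day, curr) in zip(traffic, traffic[1:]):
        if prev and (prev - curr) / prev > threshold:
            if any(abs((day - update).days) <= window_days for update in UPDATE_DATES):
                flagged.append((day, prev, curr))
    return flagged
```

A drop that lines up with an update date is only a hint, of course — you’d still cross-check it against webmaster forums and your own site changes.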
Let’s use our blog as an example:
On Dec. 3, 2020, a new core algorithm update was rolled out. Notably, Ahrefs’ blog has a strong focus on content quality; we have never bought a single backlink. Yet this update was a slap in the face for us.
You should also note that algorithm updates don’t only demote low-quality or spammy websites; they also promote high-quality sites. So even if there’s nothing “wrong” with your website, you can find other sites outranking you after the next core update.
If you’re ‘hit’ by a core update, you shouldn’t think of it as a penalty. You might not be doing anything wrong at all, it might just be that someone is doing something better.
There are two types of actions that can be displayed on the “Manual actions” page. These are:
- Manual actions that affect the entire site.
- Manual actions that only affect a specific URL or section of a site.
Manual actions can be anything from quite broad to quite specific, and very fluid in between. The goal is always to neutralize the issue, and sometimes that’s easy to isolate & do, other times it’s much harder, so we end up taking broader action.
Every “manual action” notification is accompanied by “Reason” and “Effects” information.
Today, the list of manual actions includes the following:
- Site abused with third-party spam
- User-generated spam
- Spammy free host
- Structured data issue
- Unnatural links to your site
- Unnatural links from your site
- Thin content with little or no added value
- Cloaking and/or sneaky redirects
- Pure spam
- Cloaked images
- Hidden text and/or keyword stuffing
- AMP content mismatch
- Sneaky mobile redirects
- News and Discover policy violations
Whenever Google rolls out a new spam update, it’s basically targeting the same issues.
The core updates address the relevance and quality of content.
Also, you should note that manual actions can cover different Google products. But if you somehow get a manual action related to Google Discover, your regular rankings should not be affected.
News and Discover are entirely separate products.
News includes newsy content, not just content from news publishers.
Discover includes content from across the web, not just news content.
Manual actions can happen for either; if issued, they affect only the product for which they were issued.
You can read more about the Manual Actions report in this Search Console Help article.
Unfortunately, there’s no data on the share of different penalties from Google, so I searched through recent threads on different webmaster forums and Twitter.
The most common manual actions that webmasters care to resolve these days are related to attempts to manipulate Google’s search results. This may be a coincidence, but these penalties even sit next to each other on the list of manual actions. They are:
- Unnatural links to your site.
- Unnatural links from your site.
- Thin content with little or no added value.
I would say that almost all Google penalties now are given because the site owner was trying too hard to manipulate Google. Five or six years ago I did see a lot of penalties that came as a result of good, honest business owners hiring poor SEO companies who built unnatural links. But, now, most of that type of link is just ignored by Penguin. As such, if you get a link penalty, you usually know that you deserved it.
So stop doing shady stuff, folks! 🙂
Links are still one of the most important ranking factors, so “unnatural links” are the reason behind a large number of penalties these days.
Now, let’s be honest. If you see this notification in Search Console, you most likely participated in a link scheme and know what you’re hit for.
But if you purchased a website or are working on someone else’s website, you’ll need to do a comprehensive link audit. Here are a few quick things that you can work on.
Check your anchor texts
Pay attention to whether irrelevant or over-optimized anchor texts from poor-quality websites dominate your backlink profile.
The first step you should take is to plug your website into Ahrefs’ Site Explorer (or another backlink checker you like) and navigate to the Anchors report. We’ve recently enhanced ours with various helpful filters. Now it’s even easier to diagnose issues related to anchor texts.
With a natural backlink profile, most links are URL-anchored or brand-anchored. But if a site has been heavily involved in manipulative link building, you’ll usually see a lot of keyword-stuffed anchors from multiple referring domains:
If the Anchors report looks something like this, that’s a bad sign.
As for “Unnatural links from your site,” you should look at the anchor texts of the outgoing links. Ahrefs’ Site Explorer has a special report for that.
Check your referring domains
The Referring domains report in Site Explorer can help you identify websites built explicitly for selling links. The main attribute of such websites is the large number of sites they link to (relative to the size of the website).
Besides, these websites usually get little to no traffic from Google, although they may have a pretty high Domain Rating score.
This is why we added “Dofollow linked domains” and “Traffic” columns straight into the Referring domains report.
For example, here at Ahrefs’ blog, we’ve never been greedy for external links. If a page or resource deserves a reference, we will link to it.
Our blog has approximately 2K pages indexed in Google.
And Ahrefs’ Linked Domains report will show you that we link to around 1.5K websites.
So the ratio is close to 1:1. I’m not saying it’s the standard for every industry, but I hope you get the idea. A website with 500 pages linking to 5K different websites should raise red flags.
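That back-of-the-envelope check can be expressed as a tiny heuristic. The cutoff ratio below is a hypothetical illustration for this article, not an official threshold of any kind:

```python
def looks_like_link_seller(indexed_pages, dofollow_linked_domains, max_ratio=3.0):
    """Flag sites whose outgoing dofollow-linked domains far outnumber
    their indexed pages (a hypothetical heuristic, not a Google rule)."""
    return dofollow_linked_domains / indexed_pages > max_ratio

# Ahrefs' blog from the example: ~2K pages linking to ~1.5K domains -> fine.
# The red-flag case: 500 pages linking to 5K different websites -> flagged.
```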
You should note that a low DR and poor organic traffic do not always indicate a website’s low quality. It could be that the website is new and could grow into a strong one over time.
But when a linking website has low DR, ugly design and UI, low-quality content, and no organic traffic, there’s no way a link from it will be considered natural.
Google seems to have a well-maintained list of websites that sell links. And it may be using link-selling emails to expand that list. Check out this article from Barry Schwartz for more information.
To get a good sense of how serious the link problem is for your site, you can access Marie Haynes’ Disavow Blacklist Bulk Upload tool. Then export the list of your referring domains from Ahrefs and see how many of them are on Marie’s blacklist.
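If you have both lists as plain domain names, the comparison itself is a simple set intersection. A minimal sketch — the exact column names and file format will depend on your export:

```python
def blacklisted_domains(referring, blacklist):
    """Return the exported referring domains that also appear on a
    disavow blacklist. Both inputs are iterables of bare domain names,
    e.g. "example.com"."""
    def norm(domain):
        domain = domain.strip().lower()
        # Drop a leading "www." so "www.example.com" matches "example.com"
        return domain[4:] if domain.startswith("www.") else domain

    bad = {norm(d) for d in blacklist}
    return sorted({norm(d) for d in referring} & bad)
```

Running your Ahrefs export through something like this gives you a quick count of how many referring domains overlap with the blacklist.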
You should also evaluate the linking pages visually.
Websites created for the sake of linking out are easily noticeable most of the time. Content on their pages won’t make much sense, and images will be a total mess.
Recommended reading: How to Deal With Unnatural Links & Google Manual Actions
To lift a manual action from your website, you must take action to rectify the problems specified in the GSC Manual Actions message(s) and select “Request Review” for the particular issue.
“Unnatural links to your site” is the only penalty that has its roots outside your website.
To rehabilitate, you must get rid of those links.
To do that, you need to clean up your link profile as much as possible. Simply disavowing these links may not suffice.
Here’s what Google’s document says:
You should make a good-faith effort to remove backlinks before using the disavow tool. It can also be helpful to document the effort involved in removing those links. Simply disavowing all backlinks without attempting to remove them might lead to rejection of your request.
If you have control over the link, remove it.
Then you should send link removal requests to webmasters. But most likely, this is going to be tough. It will often take you more than one attempt, as you’ll need to look for different contacts before finding the right person.
And even if you reach them, some simply won’t care. And sometimes, they’ll even ask you to pay to remove links.
If you fail to remove a link to your website, disavow the linking page or domain. But do explain why you did so in your reconsideration request.
Make sure to document every step you take and every message you send to fix the issues. This log will make your reconsideration request more convincing.
If the penalty results from thin content, provide evidence of your improvements. Show what content you took down and what you added.
Request a review only when you have fixed the issues on all affected pages, as the manual action can’t be lifted partially.
It is essential to demonstrate your effort to address all the issues on your website, as well as any results, when you send a reconsideration request. Remember that your website will be reconsidered by humans, not machine algorithms.
There is no limit on the number of times you can apply for reconsideration.
With a manual action, you want to show to a knowledgable person that you understand what the problem was, that you’ve taken all of the necessary steps to resolve it and ideally that you won’t do it again. Just disavowing some links someone else placed doesn’t seem complete to me.
— 🐐 John 🐐 (@JohnMu) June 30, 2021
Algorithmic actions do not involve reconsideration requests. All you can do is fix the issues that may be demoting your website and wait for Google to recrawl and reindex your site. Only then will you find out whether your site is still being demoted.
If you fail to lift your penalties on the first try, repeat the above process by being more thorough this time. If you’re lost, find an SEO professional who can assess the damage and find the solution.
Is there a penalty for duplicate content?
25%–30% of the web is duplicate content.
Google demystified the “duplicate content penalty” back in 2008. Still, people keep asking this question.
The short answer to this question is “no.” There’s no penalty for duplicate content.
If you scrape or steal content from other websites, Google will simply rank the original.
As for the duplicate pages on your website, Google will try to determine the primary URL as the canonical version. Use the rel=“canonical” link element to help Google choose the most appropriate page.
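For instance, a duplicate URL can declare its preferred version with a canonical link element in its `<head>` (the URLs below are illustrative):

```html
<!-- On https://example.com/product?ref=newsletter (the duplicate) -->
<head>
  <link rel="canonical" href="https://example.com/product" />
</head>
```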
Recommended reading: Duplicate Content: Why It Happens and How to Fix It
Can a website get a penalty for bad or outdated design?
Sometimes minimal, old, simplified, or even ugly pages rank well, sometimes that also changes over time (it feels like you’d be leaving open room for competition by being suboptimal).
Can a spam report from my competitor result in a penalty for my website?
Before 2020, Google’s documentation indicated that spam reports could be used to take manual actions against websites.
Today, these reports are used only to improve Google’s spam detection algorithms. According to Google:
While Google does not use these reports to take direct action against violations, these reports still play a significant role in helping us understand how to improve our spam detection systems that protect our search results.
Should I disavow any low-quality links to prevent a manual action?
If you haven’t been building links, you probably have next to nothing to worry about.
Random links collected over the years aren’t necessarily harmful, we’ve seen them for a long time too and can ignore all of those weird pieces of web-graffiti from long ago. Disavow links that were really paid for (or otherwise actively unnaturally placed), don’t fret the cruft.
How long does it take to get a response to the reconsideration request?
There’s no exact answer to this question. Obviously, it depends on the severity of the issue and the current load on the webspam team.
Here’s what Google says:
Most reconsideration reviews can take several days or weeks, although in some cases, such as link-related reconsideration requests, it may take longer than usual to review your request.
In rare cases, it can take months.
In the past 18 months, I’ve experienced anywhere between 3 weeks and in one extreme case 5 months.
I believe that Google processes reconsideration requests in batches versus a conveyor belt method, and depending on the severity of the penalty/offence and its potential impact on user welfare (of lifting the penalty), they are treated with different levels (and batched differently) of importance. Which given how Google weights resources in favour of the user majority, this makes sense.
Can negative SEO attacks lead to manual actions?
It’s highly unlikely the spam links built as a negative SEO attack by your competitors will demote your site. Google says it’s smart enough to ignore those links.
But if you’re worried, simply disavow them.
In my opinion, it is rare that negative SEO attempts will lead to a manual action. If a site does get a manual action after a wave of negative SEO links are pointed at them, it almost always turns out that the site had also been involved in a lot of their own link building as well.
So if you do some black-hat SEO, negative SEO from your black-hat competition may bring unnecessary attention to your website. Isn’t that another signal to quit black hat?
Can a penalty expire?
To my surprise, manual actions do expire. Algorithmic actions don’t, and they’re only getting better.
Yes, manual actions expire after time. Often things change over the years, so what might have required manual intervention to solve / improve back then, might be handled better algorithmically nowadays.
— 🐐 John 🐐 (@JohnMu) September 7, 2018
Will my website performance on search become normal once the manual action is lifted?
The general answer is “yes.” Google does not hold a grudge against websites with lifted manual actions.
Site being ‘tainted’ after a manual action is a SEO myth. You request a RR and if successful, you’re off the hook.
John Mueller even recorded a dedicated video on this topic.
But things may be a bit more complicated than that.
Over the years I’ve worked with a number of websites that have had manual actions, link penalties, and once lifted the website has been able to rebuild and progress. When lifting a link penalty, there is oftentimes a misconception that ‘performance will return to normal,’ but people forget that they will have had some benefit from the backlinks originally, impacting performance — otherwise why would Google see them as manipulations if they didn’t work?
A website usually gets a manual action because the techniques it was using were actually working. So even after the manual action is lifted, the site only returns to where it was before those black hat/gray hat techniques started working.
I did some research and came across John’s AMA session on Reddit, where he took the time to answer tons of interesting questions:
There’s no ‘reset button’ for a domain, we don’t even have that internally, so a manual review wouldn’t change anything there. If there’s a lot of bad history associated with that, you either have to live with it, clean it up as much as possible, or move to a different domain. I realize that can be a hassle, but it’s the same with any kind of business, cleaning up a bad name/reputation can be a lot of work, and it’s hard to say ahead of time if it’ll be worth it in the end.
Final thoughts
As you saw from this post, websites mostly suffer from Google’s penalties because of low-quality content and shady SEO techniques.
If you want to make your website bulletproof, make sure it meets Google’s quality guidelines, follows the E-A-T principles, and has a natural link profile.
Apart from that, monitor your site for hacking and remove hacked content as soon as possible to prevent user-generated spam on your site.
Have more thoughts to share? Got questions? Ping me on Twitter.
brightonSEO Live Blog
Hello everyone. It’s April again, so I’m back in Brighton for another two days of sun, sea, and SEO! Being the introvert I am, my idea of fun isn’t hanging around our booth all day explaining we’ve run out of t-shirts (seriously, you need to be fast if you want swag!). So I decided to do something useful and live-blog the event instead.
Follow below for talk takeaways and (very) mildly humorous commentary.
Google Further Postpones Third-Party Cookie Deprecation In Chrome
Google has again delayed its plan to phase out third-party cookies in the Chrome web browser. The latest postponement comes after ongoing challenges in reconciling feedback from industry stakeholders and regulators.
The announcement was made in Google and the UK’s Competition and Markets Authority (CMA) joint quarterly report on the Privacy Sandbox initiative, scheduled for release on April 26.
Chrome’s Third-Party Cookie Phaseout Pushed To 2025
Google states it “will not complete third-party cookie deprecation during the second half of Q4” this year as planned.
Instead, the tech giant aims to begin deprecating third-party cookies in Chrome “starting early next year,” assuming an agreement can be reached with the CMA and the UK’s Information Commissioner’s Office (ICO).
The statement reads:
“We recognize that there are ongoing challenges related to reconciling divergent feedback from the industry, regulators and developers, and will continue to engage closely with the entire ecosystem. It’s also critical that the CMA has sufficient time to review all evidence, including results from industry tests, which the CMA has asked market participants to provide by the end of June.”
Continued Engagement With Regulators
Google reiterated its commitment to “engaging closely with the CMA and ICO” throughout the process and hopes to conclude discussions this year.
This marks the third delay to Google’s plan to deprecate third-party cookies, initially aiming for a Q3 2023 phaseout before pushing it back to late 2024.
The postponements reflect the challenges in transitioning away from cross-site user tracking while balancing privacy and advertiser interests.
Transition Period & Impact
In January, Chrome began restricting third-party cookie access for 1% of users globally. This percentage was expected to gradually increase until 100% of users were covered by Q3 2024.
However, the latest delay gives websites and services more time to migrate away from third-party cookie dependencies through Google’s limited “deprecation trials” program.
The trials offer temporary cookie access extensions until December 27, 2024, for non-advertising use cases that can demonstrate direct user impact and functional breakage.
While easing the transition, the trials have strict eligibility rules. Advertising-related services are ineligible, and origins matching known ad-related domains are rejected.
Google states the program aims to address functional issues rather than relieve general data collection inconveniences.
Publisher & Advertiser Implications
The repeated delays highlight the potential disruption for digital publishers and advertisers relying on third-party cookie tracking.
Industry groups have raised concerns that restricting cross-site tracking could push websites toward more opaque privacy-invasive practices.
However, privacy advocates view the phaseout as crucial in preventing covert user profiling across the web.
With the latest postponement, all parties have more time to prepare for the eventual loss of third-party cookies and adopt Google’s proposed Privacy Sandbox APIs as replacements.
Featured Image: Novikov Aleksey/Shutterstock
How To Write ChatGPT Prompts To Get The Best Results
ChatGPT is a game changer in the field of SEO. This powerful language model can generate human-like content, making it an invaluable tool for SEO professionals.
However, the prompts you provide largely determine the quality of the output.
To unlock the full potential of ChatGPT and create content that resonates with your audience and search engines, writing effective prompts is crucial.
In this comprehensive guide, we’ll explore the art of writing prompts for ChatGPT, covering everything from basic techniques to advanced strategies for layering prompts and generating high-quality, SEO-friendly content.
Writing Prompts For ChatGPT
What Is A ChatGPT Prompt?
A ChatGPT prompt is an instruction or discussion topic a user provides for the ChatGPT AI model to respond to.
The prompt can be a question, statement, or any other stimulus to spark creativity, reflection, or engagement.
Users can use the prompt to generate ideas, share their thoughts, or start a conversation.
ChatGPT prompts are designed to be open-ended and can be customized based on the user’s preferences and interests.
How To Write Prompts For ChatGPT
Start by giving ChatGPT a writing prompt, such as, “Write a short story about a person who discovers they have a superpower.”
ChatGPT will then generate a response based on your prompt. Depending on the prompt’s complexity and the level of detail you requested, the answer may be a few sentences or several paragraphs long.
Use the ChatGPT-generated response as a starting point for your writing. You can take the ideas and concepts presented in the answer and expand upon them, adding your own unique spin to the story.
If you want to generate additional ideas, try asking ChatGPT follow-up questions related to your original prompt.
For example, you could ask, “What challenges might the person face in exploring their newfound superpower?” Or, “How might the person’s relationships with others be affected by their superpower?”
Remember that ChatGPT’s answers are generated by artificial intelligence and may not always be perfect or exactly what you want.
However, they can still be a great source of inspiration and help you start writing.
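The prompt-then-follow-up flow described above is essentially a growing list of messages. A minimal sketch — the role/content dictionaries mirror the shape the ChatGPT API expects, and in a real session the assistant’s replies would be interleaved between your turns:

```python
def build_conversation(initial_prompt, follow_ups):
    """Assemble a ChatGPT-style message list: the initial writing prompt
    followed by layered follow-up questions."""
    messages = [{"role": "user", "content": initial_prompt}]
    for question in follow_ups:
        messages.append({"role": "user", "content": question})
    return messages

convo = build_conversation(
    "Write a short story about a person who discovers they have a superpower.",
    ["What challenges might the person face in exploring their newfound superpower?"],
)
```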
Must-Have GPTs Assistant
I recommend installing the WebBrowser Assistant created by the OpenAI Team. This tool allows you to add relevant Bing results to your ChatGPT prompts.
This assistant adds the first web results to your ChatGPT prompts for more accurate and up-to-date conversations.
It is very easy to install in only two clicks. (Click on Start Chat.)
For example, if I ask, “Who is Vincent Terrasi?” ChatGPT has no answer.
With the WebBrowser Assistant, the assistant creates a new prompt with the first Bing results, and now ChatGPT knows who Vincent Terrasi is.
You can test other GPT assistants available in the GPTs search engine if you want to use Google results.
Master Reverse Prompt Engineering
ChatGPT can be an excellent tool for reverse engineering prompts because it generates natural and engaging responses to any given input.
By analyzing the prompts generated by ChatGPT, it is possible to gain insight into the model’s underlying thought processes and decision-making strategies.
One key benefit of using ChatGPT to reverse engineer prompts is that the model is highly transparent in its decision-making.
This means that the reasoning and logic behind each response can be traced, making it easier to understand how the model arrives at its conclusions.
Once you’ve done this a few times for different types of content, you’ll gain insight into crafting more effective prompts.
Prepare Your ChatGPT For Generating Prompts
First, activate the reverse prompt engineering.
- Type the following prompt: “Enable Reverse Prompt Engineering? By Reverse Prompt Engineering I mean creating a prompt from a given text.”
ChatGPT is now ready to generate your prompt. You can test the product description in a new chatbot session and evaluate the generated prompt.
- Type: “Create a very technical reverse prompt engineering template for a product description about iPhone 11.”
The result is amazing. You can test with a full text that you want to reproduce. Here is an example of a prompt for selling a Kindle on Amazon.
- Type: “Reverse Prompt Engineer the following {product}, capture the writing style and the length of the text:
product =”
I tested it on an SEJ blog post. Enjoy the analysis – it is excellent.
- Type: “Reverse Prompt Engineer the following {text}, capture the tone and writing style of the {text} to include in the prompt:
text = all text coming from https://www.searchenginejournal.com/google-bard-training-data/478941/”
But be careful not to use ChatGPT to generate your texts. It is just a personal assistant.
Go Deeper
Prompts and examples for SEO:
- Keyword research and content ideas prompt: “Provide a list of 20 long-tail keyword ideas related to ‘local SEO strategies’ along with brief content topic descriptions for each keyword.”
- Optimizing content for featured snippets prompt: “Write a 40-50 word paragraph optimized for the query ‘what is the featured snippet in Google search’ that could potentially earn the featured snippet.”
- Creating meta descriptions prompt: “Draft a compelling meta description for the following blog post title: ’10 Technical SEO Factors You Can’t Ignore in 2024′.”
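Prompts like these are easy to template so you can swap in your own keywords and titles. The helper below is a hypothetical illustration for reuse in your own scripts, not part of any ChatGPT API:

```python
PROMPT_TEMPLATES = {
    "keyword_ideas": (
        "Provide a list of {n} long-tail keyword ideas related to '{topic}' "
        "along with brief content topic descriptions for each keyword."
    ),
    "meta_description": (
        "Draft a compelling meta description for the following blog post "
        "title: '{title}'."
    ),
}

def build_prompt(kind, **kwargs):
    """Fill one of the templates above; raises KeyError for unknown kinds."""
    return PROMPT_TEMPLATES[kind].format(**kwargs)
```

For example, `build_prompt("keyword_ideas", n=20, topic="local SEO strategies")` reproduces the first prompt in the list above.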
Important Considerations:
- Always Fact-Check: While ChatGPT can be a helpful tool, it’s crucial to remember that it may generate inaccurate or fabricated information. Always verify any facts, statistics, or quotes generated by ChatGPT before incorporating them into your content.
- Maintain Control and Creativity: Use ChatGPT as a tool to assist your writing, not replace it. Don’t rely on it to do your thinking or create content from scratch. Your unique perspective and creativity are essential for producing high-quality, engaging content.
- Iteration is Key: Refine and revise the outputs generated by ChatGPT to ensure they align with your voice, style, and intended message.
Additional Prompts for Rewording and SEO:
- Rewrite this sentence to be more concise and impactful.
- Suggest alternative phrasing for this section to improve clarity.
- Identify opportunities to incorporate relevant internal and external links.
- Analyze the keyword density and suggest improvements for better SEO.
Remember, while ChatGPT can be a valuable tool, it’s essential to use it responsibly and maintain control over your content creation process.
Experiment And Refine Your Prompting Techniques
Writing effective prompts for ChatGPT is an essential skill for any SEO professional who wants to harness the power of AI-generated content.
Hopefully, the insights and examples shared in this article can inspire you and help guide you to crafting stronger prompts that yield high-quality content.
Remember to experiment with layering prompts, iterating on the output, and continually refining your prompting techniques.
This will help you stay ahead of the curve in the ever-changing world of SEO.
Featured Image: Tapati Rinchumrus/Shutterstock