SEO
Rich Snippets: What Are They & How Do You Get Them?
Rich snippets aren’t a Google ranking factor, but they can make your website’s search results stand out from the crowd.
So what exactly are rich snippets, how are they different from other SERP features, and how can you get them to show for your site?
Rich snippets, rich results, and SERP features are sometimes used interchangeably by SEOs, which can cause confusion.
So what are the differences?
- Rich snippets – Google’s glossary states that rich snippets are now known as rich results.
- Rich results – Google says rich results can include carousels, images, or other non-textual elements and that they are experiences that go beyond the standard blue link.
- SERP features – Provide additional information related to the search query. Examples include the local pack, videos, and the knowledge panel.
Google supports different types of rich results within its search results. Let’s take a look at some of the most popular types.
Review
One of the most prominent examples of rich snippets is the Review snippet, which adds a yellow star rating to the search results with additional information about the reviews.
Here’s an example of what a Review snippet can look like, with the snippets highlighted.
Review snippets can appear for the following content types:
- Book
- Course
- Event
- How-to
- Local business (for sites that capture reviews about other local businesses)
- Movie
- Product
- Recipe
- Software app
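For context, those stars come from review markup in the page’s structured data. Here’s a minimal, hypothetical sketch in JSON-LD (the product name and ratings are invented for illustration):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Executive Anvil",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  }
}
```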
Product
Product rich snippets are useful if you have an e-commerce website. They provide more information to your potential customers about your products—like whether the product is currently in stock, its shipping information, and its price.
Here’s an example of what a Product snippet result can look like in the search results, with the snippets highlighted.
Recipe
Recipe rich snippets give more information about the recipe on the page, such as how long it takes to prepare, its ingredients, and reviews.
Here’s an example of what a recipe result can look like in Google in the Recipes carousel.
Event
Event snippets highlight the date and location of your events. They’re useful if you have ticketed events like concerts or shows.
Here’s an example of an Event snippet.
Sidenote.
FAQ and HowTo results are not included in this list: on August 8, 2023, Google announced it was reducing their visibility to provide a “cleaner and more consistent” search experience.
To be eligible for rich snippets, you’ll need to add schema markup to your pages and ensure you follow Google’s structured data guidelines.
But before attempting to add the code, check whether your CMS has added it already.
To do this, head to a page where you think there should be markup, open up Ahrefs’ SEO Toolbar, and go to the “Structured data” tab.
If there’s no structured data on the page, you’ll get a message that looks like the one below.
You can double-check this by running a page through the Rich Results Test tool.
If no markup is present on the page, the rich results test will display the message “No items detected.”
Assuming no structured data is detected, you’re safe to add the code.
Here’s how you do it.
1. Generate the code
If you use a popular content management system (CMS) like WordPress, adding schema to your website is as easy as installing a schema plugin.
If you already use a plugin like Rank Math, you can use its guide to generate and customize your schema.
If you don’t use one of the more popular CMSes, you may have to generate the code yourself.
Tip
If you are not confident with code, it’s worth talking to a developer or SEO consultant to help you implement these changes.
I’m using Merkle’s Schema Markup Generator to generate Product schema markup. But you can use Google’s Structured Data Markup Helper or even ChatGPT as well.
To generate the code, simply fill out the prompts from the tool.
Once you’ve finished, copy the JSON-LD code; this is the code format Google recommends for schema markup.
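For reference, here’s roughly what generated Product markup looks like in JSON-LD (a minimal sketch with made-up values; your generator will fill in your own product’s details):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Wireless Headphones",
  "image": "https://www.example.com/photos/headphones.jpg",
  "description": "Over-ear wireless headphones with 30-hour battery life.",
  "offers": {
    "@type": "Offer",
    "price": "89.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
```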
Sidenote.
Remember to only add code for content that’s visible to users and adheres to Google’s guidelines for the selected schema type.
2. Check and validate the markup
Once you’ve generated the code, it’s just a matter of checking if it’s valid. If it’s not valid, your page won’t be eligible for rich results.
If you generated your code with a plugin or through your CMS, you can check it by:
- Opening the SEO Toolbar on the page you want to check.
- Going to the Structured data tab.
- Clicking on Validate and then the Rich Results Test.
Clicking this will take you to Google’s Rich Results Test. If it’s valid, you’ll see a green tick.
Once you’ve confirmed it’s present and valid, you can skip to step #3 below.
If you’ve manually added your schema code, you’ll need to make two checks:
- Check the code is valid before you implement it
- Check the code is valid after it’s added to your website
To see if your code snippet is valid, select “Code” on the Rich Results Test and paste your code snippet in.
If it’s valid, you’ll see a green tick appear under the subheading “Detected items.”
Once you’ve validated your code, you can upload it to your website. Add it to the <head> or <body> of your website; Google has confirmed either is fine.
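In practice, that means pasting the markup inside a script tag with the type application/ld+json, for example (reusing the hypothetical product from step 1):

```html
<!-- Works in either <head> or <body> -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Wireless Headphones",
  "offers": {
    "@type": "Offer",
    "price": "89.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```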
Once the code is added, you can run the page URL through the Rich Results Test to double-check it’s valid on-site.
This time, select “URL,” and enter a URL you want to test.
If it’s valid, you’ll see a green tick.
3. Monitor marked-up pages for performance and errors using Ahrefs
There are two reasons monitoring your marked-up pages is important:
- Websites break easily – Even if your code is valid on day #1, it can break later on. There may be code on other pages that isn’t valid as well.
- Existing code may be invalid – Old schema markup may be invalid and need fixing.
The best way to run a check is by using Ahrefs’ Site Audit—you can access this for free using Ahrefs Webmaster Tools.
Here’s how to check your website.
Once you’ve run your audit, head to the All issues report in Site Audit. If there are structured data issues, you’ll see a message like the one below.
Clicking on this issue will show all structured data issues on your website. There are 1,332 results in this example. I prioritize fixes for pages by sorting “Organic traffic” from high to low.
To do this, click on the “Organic traffic” header, then click “View issues” in the “Structured data issues” column to get more details about it.
Although you can check rich results status using Google Search Console (GSC), the advantage of using Site Audit is that you can schedule regular crawls to find and diagnose invalid schema code before Google picks it up.
That way, when you go to GSC, you’ll see nothing but green “Valid items” that are eligible for Google’s rich results, as you’ve already fixed any invalid code.
Final thoughts
Rich snippets often get more clicks than traditional “blue link” results. But whether they’re worth implementing for your website depends on the type of content you have.
You don’t need to be a coding expert to get rich snippets for your website—but it takes some work to get started. Even once everything is set up, there’s no guarantee they’ll show. Tools like Ahrefs’ Site Audit are helpful here, as they can help you validate and monitor your code.
SEO
The 25 Biggest Traffic Losers in SaaS
We analyzed the organic traffic growth of 1,600 SaaS companies to discover the SEO strategies that work best in 2024…
…and those that work the worst.
In this article, we’re looking at the companies that lost the greatest amount of estimated organic traffic, year over year.
- We analyzed 1,600 SaaS companies and used the Ahrefs API to pull estimated monthly organic traffic data for August 2023 and August 2024.
- Companies were ranked by estimated monthly organic traffic loss as a percentage of their starting traffic (see the sketch after this list).
- We’ve filtered out traffic loss caused by website migrations and URL redirects and set a minimum starting traffic threshold of 10,000 monthly organic pageviews.
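Here’s a rough sketch of that ranking logic in Python (not our actual pipeline; the fetching step via the Ahrefs API is omitted, and the sample figures are the real ones from the table below):

```python
# Estimated monthly organic traffic: (August 2023, August 2024)
traffic = {
    "Causal": (307_158, 1_485),
    "Contently": (276_885, 7_866),
    "Datanyze": (486_626, 22_077),
}

MIN_STARTING_TRAFFIC = 10_000  # minimum starting-traffic threshold from the methodology

losses = []
for company, (before, after) in traffic.items():
    if before < MIN_STARTING_TRAFFIC:
        continue  # skip sites below the threshold
    pct_change = (after - before) / before * 100  # negative = traffic loss
    losses.append((company, pct_change, after - before))

# Rank by percentage loss, biggest loss first
losses.sort(key=lambda row: row[1])
for rank, (company, pct, absolute) in enumerate(losses, start=1):
    print(f"{rank}. {company}: {pct:.2f}% ({absolute:+,})")
```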
This is a list of the SaaS companies that had the greatest estimated monthly organic traffic loss from August 2023 to August 2024.
Sidenote.
Our organic traffic metrics are estimates, and not necessarily reflective of the company’s actual traffic (only they know that). Traffic loss is not always bad, and there are plenty of reasons why companies may choose to delete pages and sacrifice keyword rankings.
Rank | Company | Change | Monthly Organic Traffic 2023 | Monthly Organic Traffic 2024 | Traffic Loss |
---|---|---|---|---|---|
1 | Causal | -99.52% | 307,158 | 1,485 | -305,673 |
2 | Contently | -97.16% | 276,885 | 7,866 | -269,019 |
3 | Datanyze | -95.46% | 486,626 | 22,077 | -464,549 |
4 | BetterCloud | -94.14% | 42,468 | 2,489 | -39,979 |
5 | Ricotta Trivia | -91.46% | 193,713 | 16,551 | -177,162 |
6 | Colourbox | -85.43% | 67,883 | 9,888 | -57,995 |
7 | Tabnine | -84.32% | 160,328 | 25,142 | -135,186 |
8 | AppFollow | -83.72% | 35,329 | 5,753 | -29,576 |
9 | Serverless | -80.61% | 37,896 | 7,348 | -30,548 |
10 | UserGuiding | -80.50% | 115,067 | 22,435 | -92,632 |
11 | Hopin | -79.25% | 19,581 | 4,064 | -15,517 |
12 | Writer | -78.32% | 2,460,359 | 533,288 | -1,927,071 |
13 | NeverBounce by ZoomInfo | -77.91% | 552,780 | 122,082 | -430,698 |
14 | ZoomInfo | -76.11% | 5,192,624 | 1,240,481 | -3,952,143 |
15 | Sakari | -73.76% | 27,084 | 7,106 | -19,978 |
16 | Frase | -71.39% | 83,569 | 23,907 | -59,662 |
17 | LiveAgent | -70.03% | 322,613 | 96,700 | -225,913 |
18 | Scoro | -70.01% | 51,701 | 15,505 | -36,196 |
19 | accessiBe | -69.45% | 111,877 | 34,177 | -77,700 |
20 | Olist | -67.51% | 204,298 | 66,386 | -137,912 |
21 | Hevo Data | -66.96% | 235,427 | 77,781 | -157,646 |
22 | TextGears | -66.68% | 19,679 | 6,558 | -13,121 |
23 | Unbabel | -66.40% | 45,987 | 15,450 | -30,537 |
24 | Courier | -66.03% | 35,300 | 11,992 | -23,308 |
25 | G2 | -65.74% | 4,397,226 | 1,506,545 | -2,890,681 |
For each of the top five companies, I ran a five-minute analysis using Ahrefs Site Explorer to understand what may have caused their traffic decline.
Possible explanations include Google penalties, programmatic SEO, and AI content.
Causal | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 307,158 | 1,485 | -305,673 | -99.52% |
Organic pages | 5,868 | 547 | -5,321 | -90.68% |
Organic keywords | 222,777 | 4,023 | -218,754 | -98.19% |
Keywords in top 3 | 8,969 | 26 | -8,943 | -99.71% |
Causal is a finance platform for startups. They lost an estimated 99.52% of their organic traffic as a result of a Google manual penalty:
This story might sound familiar. Causal became internet-famous for an “SEO heist” that saw them clone a competitor’s sitemap and use generative AI to publish 1,800 low-quality articles like this:
Google caught wind and promptly issued a manual penalty. Causal lost hundreds of rankings and hundreds of thousands of pageviews, virtually overnight:
As the Ahrefs SEO Toolbar shows, the offending blog posts are now 301 redirected to the company’s (now much better, much more human-looking) blog homepage:
Contently | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 276,885 | 7,866 | -269,019 | -97.16% |
Organic pages | 32,752 | 1,121 | -31,631 | -96.58% |
Organic keywords | 94,706 | 12,000 | -82,706 | -87.33% |
Keywords in top 3 | 1,874 | 68 | -1,806 | -96.37% |
Contently is a content marketing platform. They lost 97% of their estimated organic traffic by removing thousands of user-generated pages.
Almost all of the website’s traffic loss seems to stem from deindexing the subdomains used to host their members’ writing portfolios:
A quick Google search for “contently writer portfolios” suggests that the company made the deliberate decision to deindex all writer portfolios by default, and only relist them once they’ve been manually vetted and approved:
We can see that these portfolio subdomains are now 302 redirected back to Contently’s homepage:
And looking at the keyword rankings Contently lost in the process, it’s easy to guess why this change was necessary. It looks like the free portfolio subdomains were being abused to promote CBD gummies and pirated movies:
Datanyze | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 486,626 | 22,077 | -464,549 | -95.46% |
Organic pages | 1,168,889 | 377,142 | -791,747 | -67.74% |
Organic keywords | 2,565,527 | 712,270 | -1,853,257 | -72.24% |
Keywords in top 3 | 7,475 | 177 | -7,298 | -97.63% |
Datanyze provides contact data for sales prospecting. They lost 96% of their estimated organic traffic, possibly as a result of programmatic content that Google has since deemed too low quality to rank.
Looking at the Site Structure report in Ahrefs, we can see over 80% of the website’s organic traffic loss is isolated to the /companies and /people subfolders:
Looking at some of the pages in these subfolders, it appears Datanyze built thousands of programmatic landing pages promoting the people and companies it offers data on:
As a result, the majority of Datanyze’s dropped keyword rankings are names of people and companies:
Many of these pages still return 200 HTTP status codes, and a Google site search still shows hundreds of indexed pages:
In this case, not all of the programmatic pages have been deleted—instead, it’s possible that Google has decided to rerank these pages into much lower positions and drop them from most SERPs.
BetterCloud | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 42,468 | 2,489 | -39,979 | -94.14% |
Organic pages | 1,643 | 504 | -1,139 | -69.32% |
Organic keywords | 107,817 | 5,806 | -102,011 | -94.61% |
Keywords in top 3 | 1,550 | 32 | -1,518 | -97.94% |
BetterCloud is a SaaS spend management platform. They lost 94% of their estimated organic traffic around the time of Google’s November core update:
Looking at the Top Pages report for BetterCloud, most of the traffic loss can be traced back to a now-deleted /academy subfolder:
The pages in the subfolder are now deleted, but by using Ahrefs’ Page Inspect feature, it’s possible to look at a snapshot of some of the pages’ HTML content.
This short, extremely generic article on “How to Delete an Unwanted Page in Google Docs” looks a lot like basic AI-generated content:
This is the type of content that Google has been keen to demote from the SERPs.
Given the timing of the website’s traffic drop (a small decline after the October core update, and a precipitous decline after the November core update), it’s possible that Google demoted the site after an AI content generation experiment.
Ricotta Trivia | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 193,713 | 16,551 | -177,162 | -91.46% |
Organic pages | 218 | 231 | +13 | +5.96% |
Organic keywords | 83,988 | 37,640 | -46,348 | -55.18% |
Keywords in top 3 | 3,124 | 275 | -2,849 | -91.20% |
Ricotta Trivia is a Slack add-on that offers icebreakers and team-building games. They lost an estimated 91% of their monthly organic traffic, possibly because of thin content and poor on-page experience on their blog.
Looking at the Site Structure report, 99.7% of the company’s traffic loss is isolated to the /blog subfolder:
Digging into the Organic keywords report, we can see that the website has lost hundreds of first-page rankings for high-volume keywords like “get to know you questions,” “funny team names,” and “question of the day”:
While these keywords seem strongly related to the company’s core business, the article content itself seems very thin—and the page is covered with intrusive advertising banners and pop-ups (a common hypothesis for why some sites were negatively impacted by recent Google updates):
The site seems to show a small recovery on the back of the August 2024 core update—so there may be hope yet.
Final thoughts
All of the data for this article comes from Ahrefs. Want to research your competitors in the same way? Check out Site Explorer.
SEO
Mediavine Bans Publisher For Overuse Of AI-Generated Content
According to details surfacing online, ad management firm Mediavine is terminating publishers’ accounts for overusing AI.
Mediavine is a leading ad management company providing products and services to help website publishers monetize their content.
The company holds elite status as a Google Certified Publishing Partner, which indicates that it meets Google’s highest standards and requirements for ad networks and exchanges.
AI Content Triggers Account Terminations
The terminations came to light in a post on the Reddit forum r/Blogging, where a user shared an email they received from Mediavine citing “overuse of artificially created content.”
Trista Jensen, Mediavine’s Director of Ad Operations & Market Quality, states in the email:
“Our third party content quality tools have flagged your sites for overuse of artificially created content. Further internal investigation has confirmed those findings.”
Jensen stated that due to the overuse of AI content, “our top partners will stop spending on your sites, which will negatively affect future monetization efforts.”
Consequently, Mediavine terminated the publisher’s account “effective immediately.”
The Risks Of Low-Quality AI Content
This strict enforcement aligns with Mediavine’s publicly stated policy prohibiting websites from using “low-quality, mass-produced, unedited or undisclosed AI content that is scraped from other websites.”
In a March 7 blog post titled “AI and Our Commitment to a Creator-First Future,” the company declared opposition to low-value AI content that could “devalue the contributions of legitimate content creators.”
Mediavine warned in the post:
“Without publishers, there is no open web. There is no content to train the models that power AI. There is no internet.”
The company says it’s using its platform to “advocate for publishers” and uphold quality standards in the face of AI’s disruptive potential.
Mediavine states:
“We’re also developing faster, automated tools to help us identify low-quality, mass-produced AI content across the web.”
Targeting ‘AI Clickbait Kingpin’ Tactics
While the Reddit user’s identity wasn’t disclosed, the incident has drawn connections to the tactics of Nebojša Vujinović Vujo, who was dubbed an “AI Clickbait Kingpin” in a recent Wired exposé.
According to Wired, Vujo acquired over 2,000 dormant domains and populated them with AI-generated, search-optimized content designed purely to capture ad revenue.
His strategies represent the low-quality, artificial content Mediavine has vowed to prohibit.
Potential Implications
Lost Revenue
Mediavine’s terminations highlight potential implications for publishers that rely on artificial intelligence to generate website content at scale.
Perhaps the most immediate and tangible implication is the risk of losing ad revenue.
For publishers that depend heavily on programmatic advertising or sponsored content deals as key revenue drivers, being blocked from major ad networks could devastate their business models.
Devalued Domains
Another potential impact is the devaluation of domains and websites built primarily on AI-generated content.
If this pattern of AI content overuse triggers account terminations from companies like Mediavine, it could drastically diminish the value proposition of scooping up these domains.
Damaged Reputations & Brands
Beyond the lost monetization opportunities, publishers leaning too heavily into automated AI content also risk permanent reputational damage to their brands.
Once an industry authority like Mediavine flags a website for AI overuse, it could impact how that site is perceived by readers, industry partners, and search engines.
In Summary
AI has value as an assistive tool for publishers, but relying heavily on automated content creation poses significant risks.
These include monetization challenges, potential reputation damage, and increasing regulatory scrutiny. Mediavine’s strict policy illustrates the possible consequences for publishers.
It’s important to note that Mediavine’s move to terminate publisher accounts over AI content overuse represents an independent policy stance taken by the ad management firm itself.
The action doesn’t directly reflect the content policies or enforcement positions of Google, whose publishing partner program Mediavine is certified under.
We have reached out to Mediavine requesting a comment on this story. We’ll update this article with more information when it’s provided.
Featured Image: Simple Line/Shutterstock
SEO
Google’s Guidance About The Recent Ranking Update
Google’s Danny Sullivan explained the recent update, addressing site recoveries and cautioning against making radical changes to improve rankings. He also offered advice for publishers whose rankings didn’t improve after the last update.
Google’s Still Improving The Algorithm
Danny said that Google is still working on its ranking algorithm, indicating that more positive changes are likely on the way. The main idea he was getting across is that Google is still trying to fill the gaps in surfacing high-quality content from independent sites, which is good, because big-brand sites don’t necessarily have the best answers.
He wrote:
“…the work to connect people with “a range of high quality sites, including small or independent sites that are creating useful, original content” is not done with this latest update. We’re continuing to look at this area and how to improve further with future updates.”
A Message To Those Who Were Left Behind
There was a message to those publishers whose work failed to recover with the latest update, to let them know that Google is still working to surface more of the independent content and that there may be relief on the next go.
Danny advised:
“…if you’re feeling confused about what to do in terms of rankings…if you know you’re producing great content for your readers…If you know you’re producing it, keep doing that…it’s to us to keep working on our systems to better reward it.”
Google Cautions Against “Improving” Sites
Something really interesting he mentioned was a caution against tweaking pages that already rank on page one in an attempt to push them even higher. Tweaking a site to move from position six or so to something higher has always been risky, for many reasons I won’t elaborate on here. But Danny’s warning increases the pressure to not just think twice before optimizing a page for search engines but to think three times and then some.
Danny cautioned that sites that make it to the top of the SERPs should consider that a win and to let it ride instead of making changes right now in order to improve their rankings. The reason for that caution is that the search results continue to change and the implication is that changing a site now may negatively impact the rankings in a newly updated search index.
He wrote:
“If you’re showing in the top results for queries, that’s generally a sign that we really view your content well. Sometimes people then wonder how to move up a place or two. Rankings can and do change naturally over time. We recommend against making radical changes to try and move up a spot or two”
How Google Handled Feedback
There was also some light shed on what Google did with all the feedback it received from publishers who lost rankings. Danny wrote that the feedback and site examples he received were summarized and sent to the search engineers for review. They continue to use that feedback for the next round of improvements.
He explained:
“I went through it all, by hand, to ensure all the sites who submitted were indeed heard. You were, and you continue to be. …I summarized all that feedback, pulling out some of the compelling examples of where our systems could do a better job, especially in terms of rewarding open web creators. Our search engineers have reviewed it and continue to review it, along with other feedback we receive, to see how we can make search better for everyone, including creators.”
Feedback Itself Didn’t Lead To Recovery
Danny also pointed out that sites that recovered their rankings did not do so because they submitted feedback to Google. Danny wasn’t specific about this point, but it’s consistent with previous statements that Google implements fixes at scale. So instead of saying, “Hey, let’s fix the rankings of this one site,” it’s more about figuring out whether the problem is symptomatic of something widescale and how to change things for everybody with the same problem.
Danny wrote:
“No one who submitted, by the way, got some type of recovery in Search because they submitted. Our systems don’t work that way.”
That the feedback didn’t lead to recovery but was used as data shouldn’t be surprising. Even as far back as the 2004 Florida update, Matt Cutts collected feedback from people, including myself, and I didn’t see a recovery from a false positive until everyone else got their rankings back.
Takeaways
Google’s work on their algorithm is ongoing:
Google is continuing to tune its algorithms to improve its ability to rank high quality content, especially from smaller publishers. Danny Sullivan emphasized that this is an ongoing process.
What content creators should focus on:
Danny’s statement encouraged publishers to focus on consistently creating high quality content and not to focus on optimizing for algorithms. Focusing on quality should be the priority.
What should publishers do if their high-quality content isn’t yet rewarded with better rankings?
Publishers who are certain of the quality of their content are encouraged to hold steady and keep it coming because Google’s algorithms are still being refined.
Featured Image by Shutterstock/Cast Of Thousands