SEO
The Three Pillars Of SEO: Authority, Relevance, And Experience
What do you need to compete in SEO?
Some say more inbound links, others better content, while some might emphasize a technically healthy site.
Experienced SEOs know that the sites most successful in organic search have the right mix of high-level fundamentals.
In recent years, there has been a lot of attention around E-A-T (Expertise, Authority, and Trustworthiness) as mentioned in Google’s Search Quality Rater Guidelines.
Some have come to think of these as the most fundamental aspects of SEO.
However, as important as E-A-T may be for some sites, it only addresses one aspect: content.
A holistic SEO program needs to include more.
Over the years, I’ve come to think that SEO can be reduced at its most fundamental level to building three things into a site and its pages:
- Authority.
- Relevance.
- Experience (of the users and bots visiting the site).
The sites that pay attention to all three of these are more likely to be valued by both search engines and users, and attract more organic traffic over time.
Notice that one of my categories, Authority, overlaps with an E-A-T category.
That’s because I believe at the highest level of SEO, expertise and trustworthiness are really parts of what makes a site or page authoritative.
Let’s dive into each of these A-R-E categories to see how they should be incorporated into a holistic SEO program.
Authority: Do You Matter?
In SEO, authority refers to the importance or weight given to a page relative to a given search query.
Modern search engines such as Google use many factors (or signals) when evaluating the authority of a webpage.
Why does Google care about assessing the authority of a page?
For most queries, there are thousands or even millions of pages available that could be ranked.
Google wants to bring to the top the ones that are most likely to satisfy the user with accurate, reliable information that fully answers the intent of the query.
Google cares about serving users the most authoritative pages for their queries because satisfied users are more likely to return to Google, and thus see more of Google’s ads, the primary source of its revenue.
Authority Came First
Assessing the authority of webpages was the first fundamental problem search engines had to solve.
Some of the earliest search engines relied on human evaluators, but as the world wide web exploded, that quickly became impossible to scale.
Google overtook all its rivals because its creators, Larry Page and Sergey Brin, developed the idea of PageRank, using links from other pages on the web as weighted citations to assess the authoritativeness of a page.
Page and Brin realized that links were an already-existing system of constantly evolving polling where other authoritative sites “voted” for pages they saw as reliable and relevant to their users.
Search engines treat links much like scholarly citations: the more relevant papers that cite a source document, the more important that document is considered.
The relative authority and trustworthiness of each citing source comes into play as well.
So, of our three fundamental categories, authority came first because it was the easiest of the three to crack given the ubiquity of hyperlinks on the web.
The other two, relevance and user experience, would be tackled later, as machine learning/AI-driven algorithms developed.
Links Still Primary For Authority
The big innovation that made Google the dominant search engine in a short period was that it used an analysis of links on the web as a ranking factor.
This started with a paper written by Larry Page and Sergey Brin called The Anatomy of a Large-Scale Hypertextual Web Search Engine.
The essential insight behind this paper was that the web is built on the notion of documents inter-connected with each other via links.
Since putting a link on your site to a third-party site might cause a user to leave your site, there was little incentive for a publisher to link to another site, unless it was really good and of great value to their site’s users.
In other words, linking to a third-party site acts a bit like a “vote” for it: an endorsement of the linked page as one of the best resources on the web for a given topic.
Then, in principle, the more votes you get, the better and the more authoritative a search engine would consider you to be, and you should therefore rank higher.
Passing PageRank
A significant piece of the initial Google algorithm was based on the concept of PageRank, a system for evaluating which pages are the most important based on scoring the links they receive.
So, a page that has large quantities of valuable links pointing to it will have a higher PageRank, and in principle will be likely to rank higher in the search results than other pages without as high a PageRank score.
When a page links to another page, it passes a portion of its PageRank to the page it links to.
Thus, pages accumulate more PageRank based on the number and quality of links they receive.
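The idea of pages passing and accumulating PageRank can be sketched as a simple iterative calculation. The graph, damping factor, and iteration count below are illustrative assumptions based on the original PageRank paper, not Google’s actual implementation:

```python
# A minimal sketch of the PageRank idea: each page splits its score
# among the pages it links to, repeated until scores stabilize.
links = {
    "A": ["B", "C"],   # page A links to B and C
    "B": ["C"],
    "C": ["A"],
}
pages = list(links)
d = 0.85                                    # damping factor from the original paper
rank = {p: 1 / len(pages) for p in pages}   # start with equal scores

for _ in range(50):  # power iteration
    new_rank = {p: (1 - d) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = d * rank[page] / len(outlinks)  # rank is split among outlinks
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

# C receives links from both A and B, so it ends up with the highest score.
print(max(rank, key=rank.get))
```

Note how page C ranks highest purely because it receives the most “votes,” which is the intuition behind links as an authority signal.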
Not All Links Are Created Equal
So more votes are better, right?
Well, that’s true in theory, but it’s a lot more complicated than that.
PageRank scores range from a base value of one to values that likely exceed trillions.
Higher PageRank pages can have a lot more PageRank to pass than lower PageRank pages. In fact, a link from one page can easily be worth more than one million times a link from another page.
Let’s use our intuition for a moment.
Imagine you have a page that’s selling a book, and it gets two links. One is from Joe’s Book Store, and the other one is from Amazon.
It’s pretty obvious which one you would value more as a user, right? As users, we recognize that Amazon has more authority on this topic.
As it turns out, the web has recognized this as well, and Amazon has a much more powerful link profile (and higher PageRank) than any other site involved in selling books.
As a result, it has a much higher PageRank, and can pass more PageRank to the pages that it links to.
It’s important to note that Google’s algorithms have evolved a long way from the original PageRank thesis.
The way that links are evaluated has changed in significant ways – some of which we know, and some of which we don’t.
What About Trust?
You may hear many people talk about the role of trust in search rankings and in evaluating link quality.
For the record, Google says they don’t have a concept of trust they apply to links (or ranking), so you should take those discussions with many grains of salt.
These discussions began because of a Yahoo patent on the concept of TrustRank.
The idea was that if you started with a seed set of hand-picked, highly trusted sites, and you then counted the number of clicks it took you to go from those sites to yours, the fewer clicks the more trusted your site was.
Google has long said they don’t use this type of metric.
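The click-distance idea behind TrustRank can be sketched as a breadth-first search outward from a seed set of trusted sites. The link graph and domain names below are made up for illustration; this is a toy model of the patent’s concept, not a metric any search engine is confirmed to use:

```python
from collections import deque

# A toy sketch of TrustRank-style click distance: how many link hops
# separate a site from a hand-picked seed set of trusted sites?
link_graph = {
    "trusted-news.example": ["blog-a.example"],
    "blog-a.example": ["blog-b.example"],
    "blog-b.example": ["your-site.example"],
    "your-site.example": [],
}
seeds = {"trusted-news.example"}

def click_distance(graph, seeds, target):
    """Minimum number of link hops from any seed site to the target."""
    queue = deque((s, 0) for s in seeds)
    seen = set(seeds)
    while queue:
        site, hops = queue.popleft()
        if site == target:
            return hops
        for neighbor in graph.get(site, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, hops + 1))
    return None  # unreachable: no trust path to this site

# your-site.example is three clicks from the seed set.
print(click_distance(link_graph, seeds, "your-site.example"))
```

Under this model, the fewer hops from the seed set, the more trusted the site.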
However, in April 2018, Google was granted a patent related to evaluating the trustworthiness of links. But the existence of a granted patent does not mean it’s used in practice.
For your own purposes, however, if you want to assess the trustworthiness of a site as a source of a link, using the concept of trusted links is not a bad idea.
If a site does any of the following, it probably isn’t a good source for a link:
- Sell links to others.
- Have less than great content.
- Otherwise don’t appear reputable.
Google may not be calculating trust the way you do in your analysis, but chances are good that some other aspect of their system will devalue that link anyway.
Fundamentals Of Earning & Attracting Links
Now that you know that obtaining links to your site is critical to SEO success, it’s time to start putting together a plan to get some.
The key to success is understanding that Google wants this entire process to be holistic.
Google actively discourages, and in some cases punishes, schemes to get links in an artificial way. This means certain practices are seen as bad, such as:
- Buying links for SEO purposes.
- Going to forums and blogs and adding comments with links back to your site.
- Hacking people’s sites and injecting links into their content.
- Distributing poor-quality infographics or widgets that include links back to your pages.
- Offering discount codes or affiliate programs as a way to get links.
- And many other schemes where the resulting links are artificial in nature.
What Google really wants is for you to make a fantastic website, and promote it effectively, with the result that you earn or attract links.
So, how do you do that?
Who Links?
The first key insight is to understand who it is that might link to the content that you create.
Here is a chart that profiles the major groups of people in any given market space:
Who do you think might actually add links to your content?
It’s certainly not the laggards, and it’s also not the early or late majority.
It’s the innovators and early adopters. These are the people who write on media sites, or have blogs, and who might add links to your site.
There are also other sources of links, such as locally oriented sites like the local chamber of commerce or local newspapers.
You might also find some opportunities with colleges and universities if they have pages that relate to some of the things you’re doing in your market space.
Relevance: Will Users Swipe Right On Your Page?
You have to be relevant to a given topic.
Think of every visit to a page as an encounter on a dating app. Will users “swipe right” (thinking, “this looks like a good match!”)?
If you have a page about Tupperware, it doesn’t matter how many links you get – you’ll never rank for queries related to used cars.
This defines a limitation on the power of links as a ranking factor, and it shows how relevance also impacts the value of a link.
Consider a page on a site that is selling a used Ford Mustang. Imagine that it gets a link from Car and Driver magazine. That link is highly relevant.
Also, think of this intuitively. Is it likely that Car and Driver magazine has some expertise related to Ford Mustangs? Of course, they do.
In contrast, imagine a link to that Ford Mustang from a site that usually writes about sports. Is the link still helpful?
Probably, but not as helpful, because there is less evidence to Google that the sports site has a lot of knowledge about used Ford Mustangs.
In short, the relevance of the linking page, and the linking site, impacts how valuable a link might be considered.
What are some ways that Google evaluates relevance?
The Role Of Anchor Text
Anchor text is another aspect of links that matters to Google.
The anchor text helps Google confirm what the content on the page receiving the link is about.
For example, if the anchor text is the phrase “iron bathtubs” and the page has content on that topic, the anchor text, plus the link, acts as further confirmation that the page is about that topic.
Thus, the links act to evaluate both the relevance and authority of the page.
Be careful, though, as you don’t want to go aggressively obtaining links to your page that all use your main key phrase as the anchor text.
Google also looks for signs that you are manually manipulating links for SEO purposes.
One of the simplest indicators is if your anchor text looks manually manipulated.
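One simple way to see whether an anchor text profile looks manipulated is to check how concentrated it is in a single exact-match phrase. The anchors and the 30% threshold below are illustrative assumptions, not a known Google rule:

```python
from collections import Counter

# A toy sketch of spotting an unnatural anchor text profile: natural
# link profiles mix branded, generic, and partial-match anchors.
anchors = [
    "iron bathtubs", "iron bathtubs", "iron bathtubs", "iron bathtubs",
    "example.com", "click here", "this guide", "iron bathtubs",
    "bathtub restoration", "homepage",
]

counts = Counter(anchors)
total = len(anchors)
for anchor, n in counts.most_common(3):
    share = n / total
    flag = "  <-- suspiciously concentrated" if share > 0.3 else ""
    print(f"{anchor!r}: {share:.0%}{flag}")
```

Here half of all anchors are the exact-match phrase “iron bathtubs,” which is the kind of pattern that looks manually manipulated.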
Internal Linking
There is growing evidence that Google uses internal linking to evaluate how relevant a site is to a topic.
Properly structured internal links connecting related content are a way of showing Google that you have the topic well-covered, with pages about many different aspects.
By the way, anchor text is as important when creating internal links as it is for external, inbound links.
Related to internal linking is your overall site structure.
Think strategically about where your pages fall in your site hierarchy. If it makes sense for users it will probably be useful to search engines.
The Content Itself
Of course, the most important indicator of the relevance of a page has to be the content on that page.
Most SEOs are aware that assessing the relevance of content to a query has become way more sophisticated than merely having the keywords a user is searching for.
Due to advances in natural language processing and machine learning, search engines like Google have vastly increased their competence in being able to assess the content on a page.
What are some things Google likely looks for in determining what queries a page should be relevant for?
- Keywords: While the days of keyword stuffing as an effective SEO tactic are (thankfully) way behind us, having certain words on a page still matters. My company has numerous case studies showing that merely adding key terms that are common among top-ranking pages for a topic is often enough to increase organic traffic to a page.
- Depth: The top-ranking pages for a topic usually cover the topic at the right depth. That is, they have enough content to cover the topic to satisfy searchers for a query, and/or are linked to/from pages that help flesh out the topic.
- Structure: Structural elements like H1…H2…H3, bolded topic headings, and schema structured data may help Google better understand the relevance and coverage of a page.
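The structural signals in that last point can be made concrete: a page’s heading outline is machine-readable, and a quick pass over it shows how well the topic is covered. The HTML snippet below is a made-up example page; this sketch uses Python’s standard-library parser:

```python
from html.parser import HTMLParser

# Extract the H1-H3 outline of a page, which is one structural signal
# a search engine can use to understand topic coverage.
class HeadingParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.headings = []
        self._current = None

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self._current = tag

    def handle_data(self, data):
        if self._current and data.strip():
            self.headings.append((self._current, data.strip()))

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

html = """
<h1>Iron Bathtubs</h1>
<h2>How to Restore an Iron Bathtub</h2>
<h2>Iron vs. Acrylic Bathtubs</h2>
"""
parser = HeadingParser()
parser.feed(html)
for level, text in parser.headings:
    print(level, text)
```

A clean outline like this, with one H1 and descriptive H2s covering subtopics, is exactly the kind of structure the bullet above describes.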
What About E-A-T?
Of course, Google encourages all site owners to create content that makes visitors feel it is authoritative and trustworthy, written by someone with expertise appropriate to the topic.
But how much they do or are able to evaluate those categories is still a topic of debate.
The main thing to keep in mind is that the more YMYL (Your Money or Your Life) your site is, the more you should pay attention to E-A-T.
YMYL sites are those whose main content addresses things that might have an effect on people’s well-being or finances.
If your site is YMYL, you should go the extra mile in ensuring the accuracy of your content, and displaying that you have qualified experts writing it.
Building A Content Marketing Plan
Last, but certainly not least, create a real plan for your content marketing.
Don’t just suddenly start doing a lot of random stuff.
Take the time to study what your competitors are doing so you can invest your content marketing efforts in a way that’s likely to provide a solid ROI.
One approach to doing that is to pull their backlink profiles using tools that can do that.
With this information, you can see what types of links they’ve been getting and then based on that figure out what links you need to get to beat them.
Take the time to do this exercise and also to map which links are going to which pages on the competitors’ sites, as well as what each of those pages rank for.
Building out this kind of detailed view will help you scope out your plan of attack and give you some understanding of what keywords you might be able to rank for.
It’s well worth the effort!
In addition, study the competitors’ content plans.
Learn what they are doing and carefully consider what you can do that’s different.
Focus on developing a very clear differentiation in your content for topics that are in high demand with your potential customers.
This is another investment of time that will be very well spent.
Experience
As we traced above, Google started by focusing on ranking pages by authority, then found ways to assess relevance.
The third evolution of search was the evaluation of user experience.
In fact, many SEOs (and I’m among them) prefer to speak of SEO not as Search Engine Optimization, but as Search Experience Optimization.
Google realized that authoritativeness and relevancy, as important as they are, were not the only things users were looking for when searching.
Users also want a good experience on the pages and sites Google sends them to.
What is a “good user experience”? It includes at least the following:
- The page the searcher lands on is what they would expect to see given their query. No bait and switch.
- The content on the landing page is highly relevant to the user’s query.
- The content is sufficient to answer the intent of the user’s query but also links to other relevant sources and related topics.
- The page loads quickly, the relevant content is immediately apparent, and page elements settle into place quickly (all aspects of Google’s Page Experience Update).
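The load-speed and layout-stability points in that last bullet map onto Google’s published Core Web Vitals thresholds. The measured values below are hypothetical; the thresholds are Google’s documented “good” limits:

```python
# Judge a page against the Core Web Vitals "good" thresholds that
# underpin Google's Page Experience Update.
THRESHOLDS = {
    "LCP": 2.5,   # Largest Contentful Paint, seconds
    "INP": 0.2,   # Interaction to Next Paint, seconds
    "CLS": 0.1,   # Cumulative Layout Shift, unitless
}
measured = {"LCP": 1.9, "INP": 0.15, "CLS": 0.24}  # hypothetical page

for metric, limit in THRESHOLDS.items():
    verdict = "good" if measured[metric] <= limit else "needs improvement"
    print(f"{metric}: {measured[metric]} (limit {limit}) -> {verdict}")
```

In this example the page loads and responds quickly, but its high CLS means elements shift around after load, which is exactly the “settle into place” problem described above.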
In addition, many of the suggestions made above about creating better content apply to user experience as well.
In summary, Google wants to rank pages that satisfy the query and make it as easy as possible for the searcher to identify and understand what they were searching for.
Putting It All Together
Search engines want happy users who will come back to them again and again when they have a question or need.
The way they create and sustain that happiness is by providing the best possible results that satisfy that question or need.
To keep their users happy, search engines must be able to understand and measure the relative authority of webpages for the topics they cover.
When you create content that is highly useful (or engaging or entertaining) to visitors – and when those visitors find your content reliable enough that they would willingly return again to your site, or even seek you out above others – you’ve gained authority.
The search engines work hard at continually improving their ability to match that human quest for trustworthy authority.
As we explained above, that same kind of quality content is key to earning the kinds of links that assure the search engines you should rank highly for relevant searches.
That can be either content on your site that others want to link to or content that other quality, relevant sites want to publish, with appropriate links back to your site.
Focusing on these three pillars of SEO – authority, relevance, and experience – will increase the opportunities for your content and make link-earning easier.
You now have everything you need to know for SEO success, so get to work!
Featured Image: Paulo Bobita/Search Engine Journal
Google Ads To Phase Out Enhanced CPC Bidding Strategy
Google has announced plans to discontinue its Enhanced Cost-Per-Click (eCPC) bidding strategy for search and display ad campaigns.
This change, set to roll out in stages over the coming months, marks the end of an era for one of Google’s earliest smart bidding options.
Dates & Changes
Starting October 2024, new search and display ad campaigns will no longer be able to select Enhanced CPC as a bidding strategy.
However, existing eCPC campaigns will continue to function normally until March 2025.
From March 2025, all remaining search and display ad campaigns using Enhanced CPC will be automatically migrated to manual CPC bidding.
Advertisers who prefer not to change their campaigns before this date will see their bidding strategy default to manual CPC.
Impact On Display Campaigns
No immediate action is required for advertisers running display campaigns with the Maximize Clicks strategy and Enhanced CPC enabled.
These campaigns will automatically transition to the Maximize Clicks bidding strategy in March 2025.
Rationale Behind The Change
Google introduced Enhanced CPC over a decade ago as its first Smart Bidding strategy. The company has since developed more advanced machine learning-driven bidding options, such as Maximize Conversions with an optional target CPA and Maximize Conversion Value with an optional target ROAS.
In an email to affected advertisers, Google stated:
“These strategies have the potential to deliver comparable or superior outcomes. As we transition to these improved strategies, search and display ads campaigns will phase out Enhanced CPC.”
What This Means for Advertisers
This update signals Google’s continued push towards more sophisticated, AI-driven bidding strategies.
In the coming months, advertisers currently relying on Enhanced CPC will need to evaluate their options and potentially adapt their campaign management approaches.
While the change may require some initial adjustments, it also allows advertisers to explore and leverage Google’s more advanced bidding strategies, potentially improving campaign performance and efficiency.
FAQ
What change is Google implementing for Enhanced CPC bidding?
Google will discontinue the Enhanced Cost-Per-Click (eCPC) bidding strategy for search and display ad campaigns.
- New search and display ad campaigns can’t select eCPC starting October 2024.
- Existing campaigns will function with eCPC until March 2025.
- From March 2025, remaining eCPC campaigns will switch to manual CPC bidding.
How will this update impact existing campaigns using Enhanced CPC?
Campaigns using Enhanced CPC will continue as usual until March 2025. After that:
- Search and display ad campaigns employing eCPC will automatically migrate to manual CPC bidding.
- Display campaigns with Maximize Clicks and eCPC enabled will transition to the Maximize Clicks strategy in March 2025.
What are the recommended alternatives to Enhanced CPC?
Google suggests using its more advanced, AI-driven bidding strategies:
- Maximize Conversions – Can include an optional target CPA (Cost Per Acquisition).
- Maximize Conversion Value – Can include an optional target ROAS (Return on Ad Spend).
These strategies are expected to deliver comparable or superior outcomes compared to Enhanced CPC.
What should advertisers do in preparation for this change?
Advertisers need to evaluate their current reliance on Enhanced CPC and explore alternatives:
- Assess how newer AI-driven bidding strategies can be integrated into their campaigns.
- Consider transitioning some campaigns earlier to adapt to the new strategies gradually.
- Leverage tools and resources provided by Google to maximize performance and efficiency.
This proactive approach will help manage changes smoothly and explore potential performance improvements.
Featured Image: Vladimka production/Shutterstock
The 25 Biggest Traffic Losers in SaaS
We analyzed the organic traffic growth of 1,600 SaaS companies to discover the SEO strategies that work best in 2024…
…and those that work the worst.
In this article, we’re looking at the companies that lost the greatest amount of estimated organic traffic, year over year.
- We analyzed 1,600 SaaS companies and used the Ahrefs API to pull estimated monthly organic traffic data for August 2023 and August 2024.
- Companies were ranked by estimated monthly organic traffic loss as a percentage of their starting traffic.
- We’ve filtered out traffic loss caused by website migrations and URL redirects and set a minimum starting traffic threshold of 10,000 monthly organic pageviews.
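The ranking and filtering steps above can be sketched as follows. The company names and traffic numbers are made-up stand-ins for real Ahrefs API responses:

```python
# A sketch of the methodology: compute year-over-year percentage change,
# filter out small sites, and rank by biggest percentage loss.
companies = [
    {"name": "ExampleSoft", "traffic_2023": 250_000, "traffic_2024": 40_000},
    {"name": "DemoCloud",   "traffic_2023": 8_000,   "traffic_2024": 1_000},
    {"name": "SampleApp",   "traffic_2023": 120_000, "traffic_2024": 90_000},
]

MIN_STARTING_TRAFFIC = 10_000  # threshold used in the methodology

losers = []
for c in companies:
    if c["traffic_2023"] < MIN_STARTING_TRAFFIC:
        continue  # too small to rank meaningfully
    change = c["traffic_2024"] - c["traffic_2023"]
    pct = change / c["traffic_2023"] * 100
    losers.append({**c, "change_pct": round(pct, 2), "loss": change})

losers.sort(key=lambda c: c["change_pct"])  # biggest percentage losers first
for c in losers:
    print(f"{c['name']}: {c['change_pct']}% ({c['loss']:,})")
```

DemoCloud is dropped by the 10,000-pageview threshold, and ExampleSoft tops the list despite SampleApp losing traffic too, because the ranking is by percentage rather than absolute loss.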
This is a list of the SaaS companies that had the greatest estimated monthly organic traffic loss from August 2023 to August 2024.
Sidenote.
Our organic traffic metrics are estimates, and not necessarily reflective of the company’s actual traffic (only they know that). Traffic loss is not always bad, and there are plenty of reasons why companies may choose to delete pages and sacrifice keyword rankings.
Rank | Company | Change | Monthly Organic Traffic 2023 | Monthly Organic Traffic 2024 | Traffic Loss |
---|---|---|---|---|---|
1 | Causal | -99.52% | 307,158 | 1,485 | -305,673 |
2 | Contently | -97.16% | 276,885 | 7,866 | -269,019 |
3 | Datanyze | -95.46% | 486,626 | 22,077 | -464,549 |
4 | BetterCloud | -94.14% | 42,468 | 2,489 | -39,979 |
5 | Ricotta Trivia | -91.46% | 193,713 | 16,551 | -177,162 |
6 | Colourbox | -85.43% | 67,883 | 9,888 | -57,995 |
7 | Tabnine | -84.32% | 160,328 | 25,142 | -135,186 |
8 | AppFollow | -83.72% | 35,329 | 5,753 | -29,576 |
9 | Serverless | -80.61% | 37,896 | 7,348 | -30,548 |
10 | UserGuiding | -80.50% | 115,067 | 22,435 | -92,632 |
11 | Hopin | -79.25% | 19,581 | 4,064 | -15,517 |
12 | Writer | -78.32% | 2,460,359 | 533,288 | -1,927,071 |
13 | NeverBounce by ZoomInfo | -77.91% | 552,780 | 122,082 | -430,698 |
14 | ZoomInfo | -76.11% | 5,192,624 | 1,240,481 | -3,952,143 |
15 | Sakari | -73.76% | 27,084 | 7,106 | -19,978 |
16 | Frase | -71.39% | 83,569 | 23,907 | -59,662 |
17 | LiveAgent | -70.03% | 322,613 | 96,700 | -225,913 |
18 | Scoro | -70.01% | 51,701 | 15,505 | -36,196 |
19 | accessiBe | -69.45% | 111,877 | 34,177 | -77,700 |
20 | Olist | -67.51% | 204,298 | 66,386 | -137,912 |
21 | Hevo Data | -66.96% | 235,427 | 77,781 | -157,646 |
22 | TextGears | -66.68% | 19,679 | 6,558 | -13,121 |
23 | Unbabel | -66.40% | 45,987 | 15,450 | -30,537 |
24 | Courier | -66.03% | 35,300 | 11,992 | -23,308 |
25 | G2 | -65.74% | 4,397,226 | 1,506,545 | -2,890,681 |
For each of the top five companies, I ran a five-minute analysis using Ahrefs Site Explorer to understand what may have caused their traffic decline.
Possible explanations include Google penalties, programmatic SEO, and AI content.
Causal | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 307,158 | 1,485 | -305,673 | -99.52% |
Organic pages | 5,868 | 547 | -5,321 | -90.68% |
Organic keywords | 222,777 | 4,023 | -218,754 | -98.19% |
Keywords in top 3 | 8,969 | 26 | -8,943 | -99.71%
Causal is a finance platform for startups. They lost an estimated 99.52% of their organic traffic as a result of a Google manual penalty:
This story might sound familiar. Causal became internet-famous for an “SEO heist” that saw them clone a competitor’s sitemap and use generative AI to publish 1,800 low-quality articles like this:
Google caught wind and promptly issued a manual penalty. Causal lost hundreds of rankings and hundreds of thousands of pageviews, virtually overnight:
As the Ahrefs SEO Toolbar shows, the offending blog posts are now 301 redirected to the company’s (now much better, much more human-looking) blog homepage:
Contently | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 276,885 | 7,866 | -269,019 | -97.16% |
Organic pages | 32,752 | 1,121 | -31,631 | -96.58% |
Organic keywords | 94,706 | 12,000 | -82,706 | -87.33% |
Keywords in top 3 | 1,874 | 68 | -1,806 | -96.37% |
Contently is a content marketing platform. They lost 97% of their estimated organic traffic by removing thousands of user-generated pages.
Almost all of the website’s traffic loss seems to stem from deindexing the subdomains used to host their members’ writing portfolios:
A quick Google search for “contently writer portfolios” suggests that the company made the deliberate decision to deindex all writer portfolios by default, and only relist them once they’ve been manually vetted and approved:
We can see that these portfolio subdomains are now 302 redirected back to Contently’s homepage:
And looking at the keyword rankings Contently lost in the process, it’s easy to guess why this change was necessary. It looks like the free portfolio subdomains were being abused to promote CBD gummies and pirated movies:
Datanyze | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 486,626 | 22,077 | -464,549 | -95.46% |
Organic pages | 1,168,889 | 377,142 | -791,747 | -67.74% |
Organic keywords | 2,565,527 | 712,270 | -1,853,257 | -72.24% |
Keywords in top 3 | 7,475 | 177 | -7,298 | -97.63% |
Datanyze provides contact data for sales prospecting. They lost 96% of their estimated organic traffic, possibly as a result of programmatic content that Google has since deemed too low quality to rank.
Looking at the Site Structure report in Ahrefs, we can see over 80% of the website’s organic traffic loss is isolated to the /companies and /people subfolders:
Looking at some of the pages in these subfolders, it looks like Datanyze built thousands of programmatic landing pages to help promote the people and companies the company offers data for:
As a result, the majority of Datanyze’s dropped keyword rankings are names of people and companies:
Many of these pages still return 200 HTTP status codes, and a Google site search still shows hundreds of indexed pages:
In this case, not all of the programmatic pages have been deleted—instead, it’s possible that Google has decided to rerank these pages into much lower positions and drop them from most SERPs.
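The status codes seen across these case studies (301s at Causal, 302s at Contently, 200s at Datanyze) tell different stories about what happened to each site’s pages. The URLs and codes below are hypothetical sample data based on the patterns described above, not live responses:

```python
# Interpret the HTTP status codes discussed in the case studies:
# was a page removed, redirected permanently, redirected temporarily,
# or left live and indexable?
def classify(status):
    if status == 200:
        return "live (still indexable)"
    if status == 301:
        return "permanent redirect (signals the move is final)"
    if status == 302:
        return "temporary redirect (original URL may return)"
    if status in (404, 410):
        return "removed"
    return "other"

sample = {
    "portfolio.contently.example/writer": 302,  # hypothetical, per the article
    "causal.example/old-ai-post": 301,
    "datanyze.example/people/some-name": 200,
}
for url, code in sample.items():
    print(f"{code} {url}: {classify(code)}")
```

The distinction matters: a 301 tells search engines the content is gone for good, a 302 leaves the door open for the URL to return (as with Contently’s vetted portfolios), and a 200 means the page is still indexable even if Google has chosen to rank it poorly.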
BetterCloud | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 42,468 | 2,489 | -39,979 | -94.14% |
Organic pages | 1,643 | 504 | -1,139 | -69.32% |
Organic keywords | 107,817 | 5,806 | -102,011 | -94.61% |
Keywords in top 3 | 1,550 | 32 | -1,518 | -97.94% |
BetterCloud is a SaaS spend management platform. They lost 94% of their estimated organic traffic around the time of Google’s November Core Update:
Looking at the Top Pages report for BetterCloud, most of the traffic loss can be traced back to a now-deleted /academy subfolder:
The pages in the subfolder are now deleted, but by using Ahrefs’ Page Inspect feature, it’s possible to look at a snapshot of some of the pages’ HTML content.
This short, extremely generic article on “How to Delete an Unwanted Page in Google Docs” looks a lot like basic AI-generated content:
This is the type of content that Google has been keen to demote from the SERPs.
Given the timing of the website’s traffic drop (a small decline after the October core update, and a precipitous decline after the November core update), it’s possible that Google demoted the site after an AI content generation experiment.
Ricotta Trivia | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 193,713 | 16,551 | -177,162 | -91.46% |
Organic pages | 218 | 231 | +13 | +5.96%
Organic keywords | 83,988 | 37,640 | -46,348 | -55.18% |
Keywords in top 3 | 3,124 | 275 | -2,849 | -91.20% |
Ricotta Trivia is a Slack add-on that offers icebreakers and team-building games. They lost an estimated 91% of their monthly organic traffic, possibly because of thin content and poor on-page experience on their blog.
Looking at the Site Structure report, 99.7% of the company’s traffic loss is isolated to the /blog subfolder:
Digging into the Organic keywords report, we can see that the website has lost hundreds of first-page rankings for high-volume keywords like get to know you questions, funny team names, and question of the day:
While these keywords seem strongly related to the company’s core business, the article content itself seems very thin—and the page is covered with intrusive advertising banners and pop-ups (a common hypothesis for why some sites were negatively impacted by recent Google updates):
The site seems to show a small recovery on the back of the August 2024 core update—so there may be hope yet.
Final thoughts
All of the data for this article comes from Ahrefs. Want to research your competitors in the same way? Check out Site Explorer.
Mediavine Bans Publisher For Overuse Of AI-Generated Content
According to details surfacing online, ad management firm Mediavine is terminating publishers’ accounts for overusing AI.
Mediavine is a leading ad management company providing products and services to help website publishers monetize their content.
The company holds elite status as a Google Certified Publishing Partner, which indicates that it meets Google’s highest standards and requirements for ad networks and exchanges.
AI Content Triggers Account Terminations
The terminations came to light in a post on the Reddit forum r/Blogging, where a user shared an email they received from Mediavine citing “overuse of artificially created content.”
Trista Jensen, Mediavine’s Director of Ad Operations & Market Quality, states in the email:
“Our third party content quality tools have flagged your sites for overuse of artificially created content. Further internal investigation has confirmed those findings.”
Jensen stated that due to the overuse of AI content, “our top partners will stop spending on your sites, which will negatively affect future monetization efforts.”
Consequently, Mediavine terminated the publisher’s account “effective immediately.”
The Risks Of Low-Quality AI Content
This strict enforcement aligns with Mediavine’s publicly stated policy prohibiting websites from using “low-quality, mass-produced, unedited or undisclosed AI content that is scraped from other websites.”
In a March 7 blog post titled “AI and Our Commitment to a Creator-First Future,” the company declared opposition to low-value AI content that could “devalue the contributions of legitimate content creators.”
Mediavine warned in the post:
“Without publishers, there is no open web. There is no content to train the models that power AI. There is no internet.”
The company says it’s using its platform to “advocate for publishers” and uphold quality standards in the face of AI’s disruptive potential.
Mediavine states:
“We’re also developing faster, automated tools to help us identify low-quality, mass-produced AI content across the web.”
Targeting ‘AI Clickbait Kingpin’ Tactics
While the Reddit user’s identity wasn’t disclosed, the incident has drawn connections to the tactics of Nebojša Vujinović Vujo, who was dubbed an “AI Clickbait Kingpin” in a recent Wired exposé.
According to Wired, Vujo acquired over 2,000 dormant domains and populated them with AI-generated, search-optimized content designed purely to capture ad revenue.
His strategies represent the low-quality, artificial content Mediavine has vowed to prohibit.
Potential Implications
Lost Revenue
Mediavine’s terminations highlight potential implications for publishers that rely on artificial intelligence to generate website content at scale.
Perhaps the most immediate and tangible implication is the risk of losing ad revenue.
For publishers that depend heavily on programmatic advertising or sponsored content deals as key revenue drivers, being blocked from major ad networks could devastate their business models.
Devalued Domains
Another potential impact is the devaluation of domains and websites built primarily on AI-generated content.
If this pattern of AI content overuse triggers account terminations from companies like Mediavine, it could drastically diminish the value proposition of scooping up these domains.
Damaged Reputations & Brands
Beyond the lost monetization opportunities, publishers leaning too heavily into automated AI content also risk permanent reputational damage to their brands.
Once a company like Mediavine flags a website for AI overuse, it could impact how that site is perceived by readers, other industry partners, and search engines.
In Summary
AI has value as an assistive tool for publishers, but relying heavily on automated content creation poses significant risks.
These include monetization challenges, potential reputation damage, and increasing regulatory scrutiny. Mediavine’s strict policy illustrates the possible consequences for publishers.
It’s important to note that Mediavine’s move to terminate publisher accounts over AI content overuse represents an independent policy stance taken by the ad management firm itself.
The action doesn’t directly reflect the content policies or enforcement positions of Google, whose publishing partner program Mediavine is certified under.
We have reached out to Mediavine requesting a comment on this story. We’ll update this article with more information when it’s provided.
Featured Image: Simple Line/Shutterstock