How to Market When Information is Dirt Cheap
Much of marketing today is built on the idea that education is enough. Provide a searcher with functional, half-decent information, and we’re halfway down the path to winning hearts and minds.
In the pre-AI world, that was true. Accurate, relevant information was rare. Curating the web’s content into more accessible formats was a real value-add. Education was the engine that fuelled growth.
But today, information has become impossibly cheap. Any brand can become a generalist publisher, churning out thousands of search-optimized how-to guides on virtually any topic. It’s becoming easier and easier to find customized, personalized answers to even the weirdest of long-tail queries.
The internet has transformed from a place of information scarcity to one of information abundance, and with it, the value of “education” as a marketing strategy has fallen off a cliff.
In my early career (some thirteen years ago), many of my published articles were the only pieces of “educational” SEO content written on a given topic.
That may sound appealing today, but at the time, it was a problem. Many SERPs were a mish-mash of different content types and search intents. The onus was on the searcher to piece together their answer from a platter of partially-helpful sources.
As I’ve written before:
There was a time when a Google search would yield a page of only vaguely relevant search results; finding an article that addressed your specific question was rare, and incredibly welcome.
The information searchers wanted usually existed, but it was locked up in hard-to-access places: obscure forum posts, esoteric PDFs, hard-to-find personal blogs.
My value-add was simply finding and repurposing this information into a more accessible format—something that would appear when people searched for it. (This was the great benefit of the skyscraper approach: centralizing disparate information into one place.)
This is arbitrage: taking advantage of a temporary information asymmetry to turn a profit. The information I shared already existed on the internet, but it was difficult to find—I made it easy, made it more specific, and tailored it to whatever language searchers actually used. My content was rewarded with traffic growth.
Looking back, we can refer to this time as Google’s era of information scarcity:
- Specific, hyper-relevant information was hard to find.
- Content was costly to create.
- Simple information arbitrage was useful and appreciated.
- There was little competition; companies in every industry could become first-movers.
- The source of information mattered little; you would take information from wherever you could get it.
- It was easy to separate good content from bad.
The era of information scarcity was characterized by a hunt for signal amongst noise. You had a specific problem; search engines helped you trawl through semi-relevant information in the hope of an answer.
But the internet is a different place today.
SEO is a table-stakes strategy, used by everyone from solopreneurs to multinational enterprises. It’s too late for the first-mover advantage to apply: simple arbitrage doesn’t have the same impact, because the chances are high that another brand (or a dozen brands) has already beaten you to the punch.
It’s also the easiest and cheapest it’s ever been to make content. The marginal cost of content creation has plummeted to virtually zero; brands can publish fifty articles a day and have change left over from a hundred-dollar bill. The amount of “educational” SEO content is growing exponentially as more brands become generalist publishers.
Even the most niche, long-tail, ultra-specific queries can benefit from extremely relevant answers because LLMs can generate them on the fly, pulling from disparate sources and changing the context to make it fit the query. Thanks to AI Overviews, Google can even do this for you.
This AI content is at least as good as average human content (which is to say, not very good—but that has always been true of SEO content). Most questions on most topics can receive a passable answer.
… or something that looks like it. The hallucinatory nature of LLMs means that generated content can have the look and feel of something polished and professional—while containing garbled, nonsensical information. Bad content looks increasingly like good content. It’s hard to tell the difference without deeper inspection.
We have entered Google’s era of information abundance:
- Specific, relevant information for most queries is virtually guaranteed.
- Content is cheap to create; there is no barrier to entry.
- Simple information arbitrage has become almost worthless.
- There is high competition; companies in every industry are very likely to be second-movers.
- The source of information is everything. Searchers will seek out trusted brands and people for their information.
- It’s much harder to separate good content from bad.
The era of information abundance is characterized by the hunt for signal amongst… signal. There are dozens, even hundreds of competing sources claiming the correct answer (including Google itself). Much of this is AI slop, LLM output regurgitating LLM output, with ever-worsening resolution.
This single change—information becoming impossibly cheap and plentiful—has changed how marketing functions.
The simple act of sharing educational content used to be enough to win the hearts and minds of your audience. In the era of AI, where educational content has become impossibly cheap and ubiquitous, we need to do more.
But how?
Offer new flavors of information
Assume you are limited to publishing the same information as your competitors. Can you find a way to differentiate?
Yes: by offering a unique “flavor” of that information.
For example: there are a hundred different ways to consume the news. There’s news for positive people. News for people with overt political leanings. News for financiers and economists. News for nerds. News for local communities.
The core body of information—things happening around the world—is largely the same, but the curation, presentation and experience of that information is radically different.
We can do the same for the information we share. That “ultimate guide to link building” can become “the SaaS founder’s guide to link-building”, or “how to build your first 10 high-quality links,” or a content series following you as you actually build links.
The core information contained in each “flavor” of link-building guide will be largely the same, but the experience of consuming it will be radically different.
There is a trade-off here: the more specific your focus, the smaller the total addressable market. But search is becoming increasingly zero-sum. For many brands, it will be better to own a low-volume topic than try to contest a highly competitive high-volume one.
Create new information
Thankfully, we aren’t constrained to publishing the same information as everyone else. We can create new information, and expand the pool of available data.
Very few topics have a completely fixed body of knowledge. By running simple experiments, trying to solve hard problems, or exploring weird edge cases, you can probably find a way to breathe new, useful information into existence—something that can’t immediately be found on a competitor’s website or in an LLM’s output.
This is generally more difficult and expensive to do, but it offers longer-lasting benefits. I wrote more about the practicalities of doing this here: How To Stand Out in an Ocean of AI Content.
Move past rote information
Finally: assume that education is table stakes, that every brand offers an exhaustive resource center and certification program covering the core topics of their industry. How would you attract attention?
Entertainment is one obvious answer. The majority of the media most people consume each day is not overtly educational—it’s entertaining. Big brands like Paddle and HubSpot, and small brands like Wistia and AudiencePlus, recognize this reality and are willing to make big bets on entertainment strategies with no easily calculable pay-off.
Entertainment is extremely hard, but it brings many benefits:
- Larger TAM. The strategies mentioned above work because they home in on specific audiences, creating ultra-specific content that resonates with a small audience. Media is the opposite, widening your total addressable market to the largest possible size.
- A moat. There’s a reason most companies haven’t built out media brands already: entertainment is hard. It requires a far greater understanding of your target audience than simple educational content; it’s subjective, unfamiliar, and risky. This makes it harder to execute well, but infinitely more valuable should you succeed.
- Quicker time-to-value. As I’ve shared before, “Content marketing allows companies to deliver value to consumers at an earlier stage of the buying process than they would otherwise be able to; but as content marketing becomes more commonplace, media enables this to happen at an earlier stage still.”
Source: Media Strategies Aren’t as Crazy as They Seem
As HubSpot’s Kieran Flanagan put it, “the challenge with education is it’s only relevant when you need it.” Entertainment-as-marketing-strategy allows you to reach your audience at the earliest possible stage of awareness—before they are even problem-aware. There are virtually no competitors at this stage of the buying cycle.
Final thoughts
Today, most digital marketing is fuelled by “educational” content: simple, utilitarian information, created by jack-of-all-trades generalists and bylined by faceless brand accounts.
We’ve at least progressed to a level of sophistication where most brands publish fairly accurate, fairly helpful information; but very few brands have progressed measurably past the stage of simple information arbitrage. Most marketing content is a rehash of someone else’s work.
In the era of information abundance, this kind of arbitrage is worthless. Large, established brands will use their brand awareness and domain authority to eke out a few more years of benefit from these strategies; but smaller brands looking to carve out market share will need to do something radically different.
Mediavine Bans Publisher For Overuse Of AI-Generated Content
According to details surfacing online, ad management firm Mediavine is terminating publishers’ accounts for overusing AI.
Mediavine is a leading ad management company providing products and services to help website publishers monetize their content.
The company holds elite status as a Google Certified Publishing Partner, which indicates that it meets Google’s highest standards and requirements for ad networks and exchanges.
AI Content Triggers Account Terminations
The terminations came to light in a post on the Reddit forum r/Blogging, where a user shared an email they received from Mediavine citing “overuse of artificially created content.”
Trista Jensen, Mediavine’s Director of Ad Operations & Market Quality, states in the email:
“Our third party content quality tools have flagged your sites for overuse of artificially created content. Further internal investigation has confirmed those findings.”
Jensen stated that due to the overuse of AI content, “our top partners will stop spending on your sites, which will negatively affect future monetization efforts.”
Consequently, Mediavine terminated the publisher’s account “effective immediately.”
The Risks Of Low-Quality AI Content
This strict enforcement aligns with Mediavine’s publicly stated policy prohibiting websites from using “low-quality, mass-produced, unedited or undisclosed AI content that is scraped from other websites.”
In a March 7 blog post titled “AI and Our Commitment to a Creator-First Future,” the company declared opposition to low-value AI content that could “devalue the contributions of legitimate content creators.”
Mediavine warned in the post:
“Without publishers, there is no open web. There is no content to train the models that power AI. There is no internet.”
The company says it’s using its platform to “advocate for publishers” and uphold quality standards in the face of AI’s disruptive potential.
Mediavine states:
“We’re also developing faster, automated tools to help us identify low-quality, mass-produced AI content across the web.”
Targeting ‘AI Clickbait Kingpin’ Tactics
While the Reddit user’s identity wasn’t disclosed, the incident has drawn connections to the tactics of Nebojša Vujinović Vujo, who was dubbed an “AI Clickbait Kingpin” in a recent Wired exposé.
According to Wired, Vujo acquired over 2,000 dormant domains and populated them with AI-generated, search-optimized content designed purely to capture ad revenue.
His strategies represent the low-quality, artificial content Mediavine has vowed to prohibit.
Potential Implications
Lost Revenue
Mediavine’s terminations highlight potential implications for publishers that rely on artificial intelligence to generate website content at scale.
Perhaps the most immediate and tangible implication is the risk of losing ad revenue.
For publishers that depend heavily on programmatic advertising or sponsored content deals as key revenue drivers, being blocked from major ad networks could devastate their business models.
Devalued Domains
Another potential impact is the devaluation of domains and websites built primarily on AI-generated content.
If this pattern of AI content overuse triggers account terminations from companies like Mediavine, it could drastically diminish the value proposition of scooping up these domains.
Damaged Reputations & Brands
Beyond the lost monetization opportunities, publishers leaning too heavily into automated AI content also risk permanent reputational damage to their brands.
Once an authority like Mediavine flags a website for AI overuse, it could impact how that site is perceived by readers, other industry partners, and search engines.
In Summary
AI has value as an assistive tool for publishers, but relying heavily on automated content creation poses significant risks.
These include monetization challenges, potential reputation damage, and increasing regulatory scrutiny. Mediavine’s strict policy illustrates the possible consequences for publishers.
It’s important to note that Mediavine’s move to terminate publisher accounts over AI content overuse represents an independent policy stance taken by the ad management firm itself.
The action doesn’t directly reflect the content policies or enforcement positions of Google, whose publishing partner program Mediavine is certified under.
We have reached out to Mediavine requesting a comment on this story. We’ll update this article with more information when it’s provided.
Featured Image: Simple Line/Shutterstock
Google’s Guidance About The Recent Ranking Update
Google’s Danny Sullivan explained the recent update, addressing site recoveries and cautioning against making radical changes to improve rankings. He also offered advice for publishers whose rankings didn’t improve after the last update.
Google’s Still Improving The Algorithm
Danny said that Google is still working on its ranking algorithm, indicating that more changes (for the positive) are likely on the way. The main idea he was getting across is that Google is still trying to fill the gaps in surfacing high-quality content from independent sites. That’s a good thing, because big-brand sites don’t necessarily have the best answers.
He wrote:
“…the work to connect people with “a range of high quality sites, including small or independent sites that are creating useful, original content” is not done with this latest update. We’re continuing to look at this area and how to improve further with future updates.”
A Message To Those Who Were Left Behind
Sullivan also had a message for publishers whose work failed to recover with the latest update: Google is still working to surface more independent content, and there may be relief on the next go.
Danny advised:
“…if you’re feeling confused about what to do in terms of rankings…if you know you’re producing great content for your readers…If you know you’re producing it, keep doing that…it’s to us to keep working on our systems to better reward it.”
Google Cautions Against “Improving” Sites
Something really interesting he mentioned was a caution against trying to improve the rankings of pages that are already on page one. Tweaking a site to climb from, say, position six to something higher has always been risky, for reasons I won’t elaborate on here. But Danny’s warning raises the stakes: don’t just think twice before optimizing a page for search engines, think three times and then some.
Danny cautioned that sites that make it to the top of the SERPs should consider that a win and let it ride, rather than making changes right now in an attempt to improve their rankings. The reason for the caution is that the search results are still changing, and the implication is that changing a site now may hurt its rankings in a newly updated search index.
He wrote:
“If you’re showing in the top results for queries, that’s generally a sign that we really view your content well. Sometimes people then wonder how to move up a place or two. Rankings can and do change naturally over time. We recommend against making radical changes to try and move up a spot or two”
How Google Handled Feedback
There was also some light shed on what Google did with all the feedback it received from publishers who lost rankings. Danny wrote that the feedback and site examples he received were summarized and sent to the search engineers for review, and that they continue to use that feedback for the next round of improvements.
He explained:
“I went through it all, by hand, to ensure all the sites who submitted were indeed heard. You were, and you continue to be. …I summarized all that feedback, pulling out some of the compelling examples of where our systems could do a better job, especially in terms of rewarding open web creators. Our search engineers have reviewed it and continue to review it, along with other feedback we receive, to see how we can make search better for everyone, including creators.”
Feedback Itself Didn’t Lead To Recovery
Danny also pointed out that sites that recovered their rankings did not do so because they submitted feedback to Google. He wasn’t specific about this point, but it conforms with previous statements about Google’s algorithms: fixes are implemented at scale. So instead of saying, “Hey, let’s fix the rankings of this one site,” it’s more about figuring out whether the problem is symptomatic of something widespread and how to change things for everybody with the same problem.
Danny wrote:
“No one who submitted, by the way, got some type of recovery in Search because they submitted. Our systems don’t work that way.”
That the feedback didn’t lead to recoveries but was used as data shouldn’t be surprising. Even as far back as the 2003 Florida update, Matt Cutts collected feedback from people, including myself, and I didn’t see a recovery for a false positive until everyone else got their rankings back, too.
Takeaways
Google’s work on their algorithm is ongoing:
Google is continuing to tune its algorithms to improve its ability to rank high quality content, especially from smaller publishers. Danny Sullivan emphasized that this is an ongoing process.
What content creators should focus on:
Danny’s statement encouraged publishers to focus on consistently creating high quality content and not to focus on optimizing for algorithms. Focusing on quality should be the priority.
What should publishers do if their high-quality content isn’t yet rewarded with better rankings?
Publishers who are certain of the quality of their content are encouraged to hold steady and keep it coming because Google’s algorithms are still being refined.
Featured Image by Shutterstock/Cast Of Thousands
Plot Up To Five Metrics At Once
Google has rolled out changes to Analytics, adding features to help you make more sense of your data.
The update brings several key improvements:
- You can now compare up to five different metrics side by side.
- A new tool automatically spots unusual trends in your data.
- A more detailed report on transactions gives a closer look at revenue.
- The acquisition reports now separate user and session data more clearly.
- It’s easier to understand what each report does with new descriptions.
Here’s an overview of these new features, why they matter, and how they might help improve your data analysis and decision-making.
In a post on X on September 5, 2024, Google Analytics (@googleanalytics) announced:
“We’ve introduced plot rows in detailed reports. You can now visualize up to 5 rows of data directly within your detailed reports to measure their changes over time. We’ve also launched these new report features: anomaly detection to flag unusual data fluctuations…”
Plot Rows: Enhanced Data Visualization
The most prominent addition is the “Plot Rows” feature.
You can now visualize up to five rows of data simultaneously within your reports, allowing for quick comparisons and trend analysis.
This feature is accessible by selecting the desired rows and clicking the “Plot Rows” option.
Anomaly Detection: Spotting Unusual Patterns
Google Analytics has implemented an anomaly detection system to help you identify potential issues or opportunities.
This new tool automatically flags unusual data fluctuations, making it easier to spot unexpected traffic spikes, sudden drops, or other noteworthy trends.
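Google doesn’t document how its anomaly detection works under the hood, but a rough mental model helps when interpreting the alerts. The sketch below is purely illustrative (it is not Google’s method): it flags days whose session counts deviate sharply from a trailing average, using a hypothetical `DailyMetric` shape and arbitrary thresholds.

```typescript
// Illustrative sketch only: flag days whose sessions deviate sharply
// from the trailing mean. Not how GA4's anomaly detection actually works.
interface DailyMetric {
  date: string;     // e.g. "2024-09-01"
  sessions: number; // hypothetical daily session count
}

function flagAnomalies(
  series: DailyMetric[],
  windowSize = 14, // trailing days used as the baseline
  threshold = 3    // standard deviations from the baseline to count as "unusual"
): DailyMetric[] {
  const anomalies: DailyMetric[] = [];

  for (let i = windowSize; i < series.length; i++) {
    const trailing = series.slice(i - windowSize, i).map((d) => d.sessions);
    const mean = trailing.reduce((sum, v) => sum + v, 0) / windowSize;
    const variance = trailing.reduce((sum, v) => sum + (v - mean) ** 2, 0) / windowSize;
    const stdDev = Math.sqrt(variance);

    // Flag the day if it sits more than `threshold` standard deviations from the baseline.
    if (stdDev > 0 && Math.abs(series[i].sessions - mean) / stdDev > threshold) {
      anomalies.push(series[i]);
    }
  }
  return anomalies;
}
```

However the flag is generated, the advice later in this article still applies: treat an alert as a prompt to investigate context, not as a conclusion in itself.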
Improved Report Navigation & Understanding
Google Analytics has added hover-over descriptions for report titles.
These brief explanations provide context and include links to more detailed information about each report’s purpose and metrics.
Key Event Marking In Events Report
The Events report allows you to mark significant events for easy reference.
This feature, accessed through a three-dot menu at the end of each event row, helps you prioritize and track important data points.
New Transactions Report For Revenue Insights
For ecommerce businesses, the new Transactions report offers granular insights into revenue streams.
This feature provides information about each transaction, utilizing the transaction_id parameter to give you a comprehensive view of sales data.
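For context, transaction_id is a standard parameter on GA4 purchase events, which is what lets the report group revenue by individual transaction. Here’s a minimal gtag.js sketch of such an event; every value and item below is a placeholder, and it assumes the usual gtag.js snippet is already installed on the page.

```typescript
// Minimal sketch of a GA4 purchase event carrying transaction_id via gtag.js.
// Assumes the standard gtag.js snippet is already loaded; all values are placeholders.
declare function gtag(
  command: 'event',
  eventName: string,
  params: Record<string, unknown>
): void;

gtag('event', 'purchase', {
  transaction_id: 'T_12345', // the identifier the Transactions report keys on
  value: 59.99,
  currency: 'USD',
  items: [
    { item_id: 'SKU_001', item_name: 'Example product', quantity: 1, price: 59.99 },
  ],
});
```

If purchase events are sent without a transaction_id, the report has nothing to group on, so consistent tagging is the prerequisite for getting value out of this feature.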
Scope Changes In Acquisition Reports
Google has refined its acquisition reports to offer more targeted metrics.
The User Acquisition report now includes user-related metrics such as Total Users, New Users, and Returning Users.
Meanwhile, the Traffic Acquisition report focuses on session-related metrics like Sessions, Engaged Sessions, and Sessions per Event.
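If you prefer to pull these scoped metrics programmatically, the GA4 Data API exposes a similar split. The sketch below uses the official @google-analytics/data Node client; the property ID is a placeholder, and it assumes credentials are configured via Application Default Credentials.

```typescript
// Sketch: pulling user-scoped and session-scoped metrics with the GA4 Data API.
// Placeholder property ID; credentials via Application Default Credentials.
import { BetaAnalyticsDataClient } from '@google-analytics/data';

const client = new BetaAnalyticsDataClient();

async function compareAcquisitionScopes(): Promise<void> {
  const [response] = await client.runReport({
    property: 'properties/123456789', // placeholder GA4 property ID
    dateRanges: [{ startDate: '28daysAgo', endDate: 'yesterday' }],
    dimensions: [{ name: 'firstUserDefaultChannelGroup' }], // user-scoped acquisition dimension
    metrics: [
      { name: 'totalUsers' },      // user-scoped
      { name: 'newUsers' },        // user-scoped
      { name: 'sessions' },        // session-scoped
      { name: 'engagedSessions' }, // session-scoped
    ],
  });

  for (const row of response.rows ?? []) {
    const channel = row.dimensionValues?.[0]?.value;
    const values = row.metricValues?.map((v) => v.value);
    console.log(channel, values);
  }
}

compareAcquisitionScopes().catch(console.error);
```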
What To Do Next
As you explore these new features, keep in mind:
- Familiarize yourself with the new Plot Rows function to make the most of comparative data analysis.
- Pay attention to the anomaly detection alerts, but always investigate the context behind flagged data points.
- Take advantage of the more detailed Transactions report to understand your revenue patterns better.
- Experiment with the refined acquisition reports to see which metrics are most valuable for your needs.
As with any new tool, there will likely be a learning curve as you incorporate these features into your workflow.
FAQ
What is the “Plot Rows” feature in Google Analytics?
The “Plot Rows” feature allows you to visualize up to five rows of data at the same time. This makes it easier to compare different metrics side by side within your reports, facilitating quick comparisons and trend analysis. To use this feature, select the desired rows and click the “Plot Rows” option.
How does the new anomaly detection system work in Google Analytics?
Google Analytics’ new anomaly detection system automatically flags unusual data patterns. This tool helps identify potential issues or opportunities by spotting unexpected traffic spikes, sudden drops, or other notable trends, making it easier for users to focus on significant data fluctuations.
What improvements have been made to the Transactions report in Google Analytics?
The enhanced Transactions report provides detailed insights into revenue for ecommerce businesses. It utilizes the transaction_id parameter to offer granular information about each transaction, helping businesses get a better understanding of their revenue streams.
Featured Image: Vladimka production/Shutterstock