SEO
Why Prediction Of 25% Search Volume Drop Due to Chatbots Fails Scrutiny
Gartner’s prediction that AI chatbots are the future and will account for a 25% drop in traditional search volume got a lot of attention. What didn’t get attention is that the claim overlooks seven facts that call its accuracy into question and show that it simply does not hold up to scrutiny.
1. AI Search Engines Don’t Actually Exist
The problem with AI technology is that it’s currently impossible to use AI infrastructure to create a constantly updated search index of web content, on top of the billions of pages of news and social media posts generated in real time. Attempts to create a real-time AI search index fail because the nature of the technology requires retraining the entire language model to update it with new information. That’s why language models like GPT-4 don’t have access to current information.
So-called AI search engines aren’t really AI search engines. In practice, they’re chatbots inserted between the searcher and a traditional search engine. When a user asks a question, a traditional search engine finds the answers, and the AI chatbot picks the best ones and summarizes them in a natural language response.
So, when you use a chatbot AI search engine, what’s essentially happening is that you’re asking a chatbot to Google/Bing it for you. This is true for Bing Copilot, Google SGE, and Perplexity. It’s an interesting way to search, but it’s not an actual AI-based search engine; there’s still a traditional search engine behind the chatbot.
The time to panic is when transformer technology changes significantly enough to handle a continuously updated search index (or another technology replaces it). But that time is not here yet, which makes the prediction of a 25% drop in search demand by 2026 look premature.
2. Generative AI Is Not Ready For Widescale Use
The recent fiasco with Gemini’s image generation underscores the fact that generative AI as a technology is still in its infancy. Microsoft Copilot went completely off the rails in March 2024 by assuming a godlike persona, calling itself “SupremacyAGI,” and demanding to be worshipped under threat of imprisoning users of the service.
This is the technology that Gartner predicts will take away 25% of market share? Really?
Generative AI is not yet safe: despite attempts to add guardrails, the technology still veers into harmful responses. It is still in its infancy. To assert that it will be ready for widescale use within two years is excessively optimistic about its rate of progress.
3. True AI Search Engines Are Not Economically Viable
AI search engines are vastly more expensive to run than traditional search engines. It currently costs $20/month to subscribe to a generative AI chatbot, and even that comes with limits like 40 queries every three hours, because generating AI answers costs far more than serving traditional search results.
Google admitted last year that an AI chat answer is ten times more expensive than a regular search engine query. Microsoft’s GitHub Copilot is reported to lose an average of $20 per user every month. The economic realities of AI technology at this time essentially rule out an AI search engine as a replacement for traditional search engines.
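To get a feel for why those economics matter at search scale, here’s a rough back-of-the-envelope sketch. Every figure in it is a hypothetical placeholder chosen for illustration (only the 10x multiplier comes from the paragraph above); none of these numbers are reported data from Google, Microsoft, or Gartner.

```python
# Back-of-the-envelope cost comparison. All figures are hypothetical placeholders,
# except the 10x multiplier mentioned in the paragraph above.
cost_per_traditional_query = 0.002   # assumed cost of a classic search query, in USD
ai_multiplier = 10                   # "ten times more expensive" per the paragraph above
daily_queries = 8_000_000_000        # assumed daily query volume for a large search engine

cost_per_ai_answer = cost_per_traditional_query * ai_multiplier
extra_cost_per_day = (cost_per_ai_answer - cost_per_traditional_query) * daily_queries

print(f"Extra cost per AI answer: ${cost_per_ai_answer - cost_per_traditional_query:.3f}")
print(f"Extra cost per day if every query were answered by AI: ${extra_cost_per_day:,.0f}")
```

Even with generous placeholder numbers, the gap compounds into hundreds of millions of dollars per day at search-engine scale, which is the crux of the economic argument.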
4. Gartner’s Prediction Of 25% Decrease Assumes Search Engines Will Remain Unchanged
Gartner predicts a 25% decrease in traditional search query volume by 2026, but that prediction assumes traditional search engines will remain the same. The analysis fails to account for the fact that search engines evolve not just year to year but month to month.
Search engines already integrate AI technologies that increase search relevance in ways that reshape the entire search paradigm. For example, Google makes images tappable so that users can launch an image-based search for answers about the subject in the image.
That’s multimodal search: a way to search using sound and vision in addition to traditional text. The Gartner analysis makes no mention of multimodality in traditional search, a technology that shows how traditional search engines evolve to meet users’ needs.
So-called AI chatbot search engines are in their infancy and offer zero multimodality. How can a technology so comparatively primitive even be considered competitive with traditional search?
5. Why The Claim That AI Chatbots Will Steal Market Share Is Unrealistic
The Gartner report assumes that AI chatbots and virtual agents will become more popular, but that fails to consider Gartner’s own research from June 2023, which shows that users distrust AI chatbots.
Gartner’s own report states:
“Only 8% of customers used a chatbot during their most recent customer service experience, according to a survey by Gartner, Inc. Of those, just 25% said they would use that chatbot again in the future.”
Customers’ lack of trust is especially noticeable in Your Money Or Your Life (YMYL) tasks that involve money.
Gartner reported:
“Just 17% of billing disputes are resolved by customers who used a chatbot at some stage in their journey…”
Gartner’s enthusiastic assumption that users will trust AI chatbots may be unfounded: according to Gartner’s own research data, users do not trust chatbots for important YMYL search queries.
6. Gartner’s Advice Is To Rethink What?
Gartner’s advice to search marketers is to incorporate more experience, expertise, authoritativeness, and trustworthiness into their content, which betrays a misunderstanding of what EEAT actually is. For example, trustworthiness is not something added to content like a feature; it is the sum of the experience, expertise, and authoritativeness that the author brings to an article.
Secondly, EEAT is a concept describing what Google aspires to rank; its components are not actual ranking factors, just concepts.
Third, marketers are already furiously incorporating the concept of EEAT into their search marketing strategy. So the advice to incorporate EEAT as part of the future marketing strategy is itself too late and a bit bereft of unique insight.
The advice also fails to acknowledge that user interactions and user engagement not only play a role in search engine success today but will likely grow in importance as search engines incorporate AI to improve their relevance and usefulness to users.
That means that traditional search marketing will remain effective and in demand for creating awareness and demand.
7. Why Watermarking May Not Have An Impact
Gartner suggests that watermarking and authentication will become increasingly common due to government regulation. But that prediction fails to account for the supporting role that AI can play in content creation.
For example, there are workflows where a human reviews a product, scores it, provides a sentiment score and insights about which users may enjoy the product and then submits the review data to an AI to write the article based on the human insights. Should that be watermarked?
Another way that content creators use AI is to dictate their thoughts into a recording, then hand it to an AI with the instruction to polish it up and turn it into a professional article. Should that be watermarked as AI-generated?
AI’s ability to analyze vast amounts of data complements the content production workflow: it can pick out key qualities of the data, such as key concepts and conclusions, which humans can then use to create a document filled with their own insights, bringing their expertise to bear on interpreting the data. Now, what if that human then uses an AI to polish the document and make it read professionally? Should that be watermarked?
Gartner’s prediction about watermarking AI content fails to take into account how AI is actually used by many publishers to create well-written content with human-first insights, which complicates the use of watermarking and calls its long-term adoption into question, let alone its adoption by 2026.
Gartner Predictions Don’t Hold Up To Scrutiny
The Gartner predictions cite actual facts from the real world. But they fail to consider real-world factors that make AI technology an impotent threat to traditional search engines. For example, there is no consideration of the inability of AI to create a fresh search index, or of the fact that AI chatbot search engines aren’t even actual AI search engines.
It is remarkable that the analysis failed to cite the fact that Bing Chat has seen no significant increase in users and has failed to peel away search volume from Google. These omissions cast serious doubt on the accuracy of the prediction that search volume will decrease by 25%.
Read Gartner’s press release here:
Featured Image by Shutterstock/Renovacio
SEO
Holistic Marketing Strategies That Drive Revenue [SaaS Case Study]
Brands are seeing success driving quality pipeline and revenue growth. It’s all about building an intentional customer journey, aligning sales + marketing, plus measuring ROI.
Check out this executive panel on-demand, as we show you how we do it.
With Ryann Hogan, senior demand generation manager at CallRail, and our very own Heather Campbell and Jessica Cromwell, we chatted about driving demand, lead gen, revenue, and proper attribution.
This B2B leadership forum provided insights you can use in your strategy tomorrow, like:
- The importance of the customer journey, and the keys to matching content to your ideal personas.
- How to align marketing and sales efforts to guide leads through an effective journey to conversion.
- Methods to measure ROI and determine if your strategies are delivering results.
While the case study is SaaS, these strategies are for any brand.
Watch on-demand and be part of the conversation.
Join Us For Our Next Webinar!
Navigating SERP Complexity: How to Leverage Search Intent for SEO
Join us live as we break down all of these complexities and reveal how to identify valuable opportunities in your space. We’ll show you how to tap into the searcher’s motivation behind each query (and how Google responds to it in kind).
SEO
What Marketers Need to Learn From Hunter S. Thompson
We’ve passed the high-water mark of content marketing—at least, content marketing in its current form.
After thirteen years in content marketing, I think it’s fair to say that most of the content on company blogs was created by people with zero firsthand experience of their subject matter. We have built a profession of armchair commentators, a class of marketers who exist almost entirely in a world of theory and abstraction.
I count myself among their number. I have hundreds of bylines about subfloor moisture management, information security, SaaS pricing models, agency resource management. I am an expert in none of these topics.
This has been the happy reality of content marketing for over a decade, a natural consequence of the incentives created by early Google Search. Historically, being a great content marketer required precisely no subject matter expertise. It was enough to read widely and write quickly.
Mountains of organic traffic have been built on the backs of armchair commentators like myself. Time spent doing deep, detailed research was, generally speaking, wasted, because 80% of the returns came from simply shuffling other people’s ideas around and slapping a few keyword-targeted H2s in the right places.
But this doesn’t work today.
For all of its flaws, generative AI is an excellent, truly world-class armchair commentator. If the job-to-be-done is reading a dozen articles and how-to’s and turning them into something semi-original and fairly coherent, AI really is the best tool for the job. Humans cannot out-copycat generative AI.
Put another way, the role of the content marketer as a curator has been rendered obsolete. So where do we go from here?
Hunter S. Thompson popularised the idea of gonzo journalism, “a style of journalism that is written without claims of objectivity, often including the reporter as part of the story using a first-person narrative.”
In other words, Hunter was the story.
When asked to cover the rising phenomenon of the Hell’s Angels, he became a Hell’s Angel. During his coverage of the ‘72 presidential campaign, he openly supported his preferred candidate, George McGovern, and actively disparaged Richard Nixon. His chronicle of the Kentucky Derby focused almost entirely on his own debauchery and chaos-making—a story that has outlasted any factual account of the race itself.
In the same vein, content marketers today need to become their stories.
It’s a content marketing truism that it’s unreasonable to expect writers to become experts. There’s a superficial level of truth to that claim—no content marketer can acquire a decade’s worth of experience in a few days or weeks—but there are great benefits awaiting any company willing to challenge that truism very, very seriously.
As Thompson proved, short, intense periods of firsthand experience can yield incredible insights and stories. So what would happen if you radically reduced your content output and dedicated half of your content team’s time to research and experimentation? If their job was doing things worth writing about, instead of just writing? If skin-in-the-game, no matter how small, was a prerequisite of the role?
We’re already seeing this shift.
Every week, I see more companies hiring marketers who are true, bonafide subject matter experts (I include the Ahrefs content team here—for the majority of our team, “writing” is a skill secondary to a decade of hands-on search and marketing experience). They are expensive, hard to find, and in the era of AI, worth every cent.
I see a growing expectation that marketers will document their experiences and experiments on social media, creating meta-content that often outperforms the “real” content. I see more companies willing to share subjective experiences and stories, and avoid competing solely on the sharing of objective, factual information. I see companies spending money to promote the personal brands of in-house creators, actively encouraging parasocial relationships as their corporate brand accounts lay dormant.
These are ideas that made no sense in the old model of content marketing, but they make much more sense today. This level of effort is fast becoming the only way to gain any kind of moat, creating material that doesn’t already exist on a dozen other company blogs.
In the era of information abundance, our need for information is relatively easy to sate; but we have a near-limitless hunger for entertainment, and personal interaction, and weird, pattern-interrupting experiences.
Gonzo content marketing can deliver.
SEO
I Got 129.7% More Traffic With Related Keywords
A few weeks ago, I optimized one of my blog posts for related keywords. Today, it gets an estimated 2,300 more monthly organic visits:
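As a quick sanity check on those headline figures (treating both as describing the same traffic estimate), the percentage and the absolute increase imply a baseline of roughly 1,800 monthly visits:

```python
# Back out the implied baseline from the headline numbers
# (assumes both figures describe the same traffic estimate).
increase_pct = 129.7   # reported traffic increase
extra_visits = 2_300   # reported extra monthly organic visits

baseline = extra_visits / (increase_pct / 100)
print(f"Implied baseline: ~{baseline:,.0f} visits/month")      # ~1,773
print(f"Implied new total: ~{baseline + extra_visits:,.0f}")   # ~4,073
```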
In this post, I’ll show you how I found and optimized my post for these related keywords.
Related keywords are words and phrases closely linked to your main keyword. There are many ways to find them. You can even just ask ChatGPT.
But here’s the thing: These keywords aren’t useful for optimizing content.
If more traffic is your goal, you need to find keywords that represent subtopics—not just any related ones.
Think of it like this: you improve a recipe by adding the right ingredients, not everything in your fridge!
Below are two methods for finding the right related keywords (including the one I used):
Method 1. Use content optimization tools
Content optimization tools look for keywords that appear on other top-ranking pages but not on yours. They usually then recommend adding these keywords to your content a certain number of times.
These tools can be useful if you take their recommendations with a pinch of salt, as some of them can lead you astray.
For example, this tool recommends that I add six mentions of the phrase “favorite features” to our keyword research guide.
Does that seem like an important related keyword to you? It certainly doesn’t to me!
They also usually have a content score that increases as you add the recommended related keywords. This can trick you into believing that something is important when it probably isn’t—especially as content scores have a weak correlation with rankings.
My advice? If you’re going to use these tools, apply common sense and look for recommendations that seem to represent important subtopics.
For example, when I analyze our content audit guide, it suggests adding quite a few keywords related to content quality.
It doesn’t take a genius to work out that this is an extremely important consideration for a content audit—yet our guide mentions nothing about it.
This is a huge oversight and definitely a batch of related keywords worth optimizing for.
Try the beta version of our new AI Content Helper!
Instead of counting terms that you need to include in your content, Content Helper uses AI to identify the core topics for your target keywords and scores your content (as well as your competitors) against those topics as you write it. In effect, it groups related keywords by subtopic, making it easier to optimize for the broader picture.
For example, it looks like my post doesn’t cover Google Business Profile optimization too well. This is something it might be worth going into more detail about.
Method 2. Do a keyword gap analysis (this is the method I used!)
Keyword gaps are when competitors rank for keywords you don’t. If you do this analysis at the page level, it’ll uncover related keywords—some of which will usually represent subtopics.
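Conceptually, a page-level keyword gap is just a comparison of two ranking lists: keywords where a competing page ranks well but yours doesn’t show up at all. Here’s a minimal sketch of that idea with made-up keywords and positions; in practice, Ahrefs’ Content Gap tool does this for you.

```python
# Minimal sketch of a page-level keyword gap. Keywords and positions are made up.
competitor_rankings = {   # keyword -> competitor page's position
    "what is local seo": 3,
    "local seo strategy": 5,
    "local seo tips": 8,
    "local seo": 2,
}
your_rankings = {         # keyword -> your page's position
    "local seo": 10,
    "local seo tips": 9,
}

# Keywords where the competitor ranks in the top 10 but your page doesn't rank at all.
gap = {kw: pos for kw, pos in competitor_rankings.items()
       if pos <= 10 and kw not in your_rankings}

print(gap)  # {'what is local seo': 3, 'local seo strategy': 5}
```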
If possible, I recommend doing this for pages that already rank on the first page for their main target keyword. These pages are doing well already and likely just need a bit of a push to rank higher and to rank for more related keywords. You can find these in Site Explorer:
- Enter your domain
- Go to the Organic Keywords report
- Filter for positions 2-10
- Look for the main keywords you’re targeting
Once you have a few contenders, here’s how to do a keyword gap analysis:
a) Find competitors who are beating you
In the Organic Keywords report, hit the SERP dropdown next to the keyword to see the current top-ranking pages. Look for similar pages that are getting more traffic than yours and have fewer referring domains.
For example, our page ranks #10 for “local SEO,” has 909 referring domains, and gets an estimated 813 monthly visits:
All of these competing pages get more traffic with fewer backlinks:
Sidenote.
I’m going to exclude the page from Moz going forward as it’s a blog category page. That’s very different to ours so it’s probably not worth including in our analysis.
b) Send them to the content gap tool
Hit the check boxes next to your competitors, then click “Open In” and choose Content gap.
By default, this will show you keywords where one or more competitors rank in the top 10, but you don’t rank anywhere in the top 100.
I recommend changing this so it shows all keywords competitors rank for, even if you also rank for them. This is because you may still be able to better optimize for related keywords you already rank for.
I also recommend turning the “Main results only” filter on to exclude rankings in sitelinks and other SERP features:
c) Look for related keywords worth optimizing for
This is where common sense comes into play. Your task is to scan the list for related keywords that could represent important subtopics.
For example, keywords like these aren’t particularly useful because they’re just different ways of searching for the main topic of local SEO:
But a related keyword like “what is local SEO” is useful because it represents a subtopic searchers are looking for:
If this process feels too much like trying to find a needle in a haystack, try exporting the full list of keywords, pasting them into Keywords Explorer, and going to the “Cluster by terms” report. As the name suggests, this groups keywords into clusters by common terms:
This is useful because it can highlight common themes among related keywords and helps you to spot broader gaps.
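If you’re curious what “clustering by terms” means under the hood, it’s essentially grouping keywords by the words they share. Here’s a rough sketch with made-up keywords; the real report in Keywords Explorer is more sophisticated, but the idea is the same.

```python
from collections import defaultdict

# Rough sketch of clustering keywords by shared terms. The keyword list is made up.
keywords = [
    "seo cost per month",
    "how much is seo per month",
    "average seo cost per month",
    "local seo pricing",
    "local seo cost",
]

clusters = defaultdict(list)
for kw in keywords:
    for term in set(kw.split()):
        clusters[term].append(kw)

# Print the largest clusters first; big clusters hint at subtopics worth covering.
for term, kws in sorted(clusters.items(), key=lambda item: -len(item[1])):
    if len(kws) > 1:
        print(f"{term} ({len(kws)}): {kws}")
```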
For example, when I was looking for related keywords for our SEO pricing guide (more on this later!), I saw 17 related keywords containing the term “month”:
Upon checking the keywords, I noticed that they’re all ways of searching for how much SEO costs per month:
This is an easy batch of related keywords to optimize for. All I need to do is answer that question in the post.
If you’re still struggling to spot good related keywords, look for ones sending competing pages way more traffic than you. This usually happens because competitors’ pages are better optimized for those terms.
You can spot these in the content gap report by comparing the traffic columns.
For example, every competing page is getting more traffic than us for the keyword “how much does SEO cost”—and Forbes is getting over 300 more visits!
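Under the hood, this is just a per-keyword traffic comparison. Here’s a tiny sketch of the kind of check I mean, with made-up traffic numbers; in the Content Gap report you’d simply eyeball the traffic columns instead.

```python
# Flag keywords where the best competing page gets far more estimated traffic.
# All numbers below are made up for illustration.
rows = [
    # (keyword, your_monthly_traffic, best_competitor_monthly_traffic)
    ("how much does seo cost", 10, 330),
    ("seo pricing", 140, 150),
    ("seo cost per month", 5, 80),
]

THRESHOLD = 50  # arbitrary cut-off for "way more traffic"

for keyword, yours, theirs in rows:
    if theirs - yours >= THRESHOLD:
        print(f"'{keyword}': competitor gets ~{theirs - yours} more visits/month")
```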
Now that you have a bunch of related keywords, what should you do with them?
This is a nuanced process, so I’m going to show you exactly how I did it for our local SEO guide. Its estimated organic traffic grew by 135% after my optimizations for related keywords:
Sidenote.
Google kindly rolled out a Core update the day after I did these optimizations, so there’s always a chance the traffic increase is unrelated. That said, traffic to our blog as a whole stayed pretty consistent after the update, while this post’s traffic grew massively. I’m pretty sure the related keyword optimization is what caused this.
Here are the related keywords I optimized it for and how:
Related keyword 1: “What is local SEO”
Every competing page was getting significantly more traffic than us for this keyword (and ranking significantly higher). One page was even getting an estimated 457 more visits than ours per month:
People were also searching for this in a bunch of different ways:
My theory on why we weren’t performing well for this? Although we did have a definition on the page, it wasn’t great. It was also buried under an H3 with a lot of fluff to read before you get to it.
I tried to solve this by getting rid of the fluff, improving the definition (with a little help from ChatGPT), and moving it under a H2.
Result? The page jumped multiple positions for the keyword “what is local SEO” and a few other similar related keywords:
Related keyword 2: Local SEO strategy
Once again, all competing pages were getting more traffic than ours from this keyword.
I feel like the issue here may be that there’s no mention of “strategy” in our post, whereas competitors mention it multiple times.
To solve this, I added a short section about local SEO strategy.
I also asked ChatGPT to add “strategy” to the definition of local SEO. (I’m probably clutching at straws with this one, but it reads nicely with the addition, so… why not?)
Result? The page jumped seven positions from the bottom of page two to page one for the related keyword:
Related keyword 3: “How to do local SEO”
Most of the competing pages were getting more traffic than us for this keyword—albeit not a lot.
However, I also noticed Google shows this keyword in the “things to know” section when you search for local SEO—so it seems pretty important.
I’d also imagine that anyone searching for local SEO wants to know how to do it.
Unfortunately, although our guide does show you how to do local SEO, it’s kind of buried in a bunch of uninspiring chapters. There’s no obvious “how to do it” subheading for readers (or Google) to skim, so you have to read between the lines to figure out the “how.”
In an attempt to solve this, I restructured the content into steps and put it under a new H2 titled “How to do local SEO”:
Result? Position #7 → #4
This doesn’t always work, though. Nothing in SEO is guaranteed, and this is no different.
In fact, I optimized our SEO pricing guide for related keywords on the same day, and—although traffic did improve—it only improved by around 23%:
Sidenote.
You might have noticed the results were a bit delayed here. I think this is because the keywords the post ranks for aren’t so popular, so they’re not updated as often in Ahrefs.
For full transparency, here’s every related keyword I optimized the post for and the results:
Related keyword 1: “How much does SEO cost”
Each competing page got more traffic than ours from this keyword, with one getting an estimated 317 more monthly visits:
When I clustered the keywords by terms in Keywords Explorer, I also saw ~70 keywords containing the word “much” (this was around 19% of all keywords in the Content Gap report!):
These were all different ways of searching for how much SEO costs:
The issue here appears to be that although we do answer the question on the page, it’s quite buried. There’s no obvious subheading with the answer below it, making it hard for searchers (and possibly Google) to skim and find what they’re looking for:
To solve this, I added an H2 titled “How much does SEO cost?” with a direct answer below it.
Result? No change in rankings for the related keyword itself, but the page did win a few snippets for longer-tail variations thanks to the copy I added:
Related keyword 2: “SEO cost per month”
Nearly all competing pages were getting more traffic than us for this keyword, with one getting an estimated 72 more monthly visits than us.
The term clustering report in Keywords Explorer also showed that people are searching for the monthly cost of SEO in different ways:
This is not the case for hourly or retainer pricing; there are virtually no searches for this.
I think we’re not ranking for this because we haven’t prioritized this information on the page. The first subheading is all about hourly pricing, which nobody cares about. Monthly pricing data is buried below that.
To fix this, I moved the data on monthly pricing further up the page and wrote a more descriptive subheading (“Monthly retainer pricing” → “Monthly retainer pricing: How much does SEO cost per month?”).
I also changed the key takeaways in the intro to focus more on monthly pricing, as this is clearly what people care about. Plus, I simplified it and made it more prominent so searchers can find the information they’re actually looking for faster.
Result? The page won the featured snippet for this related keyword and a few other variations:
Related keyword 3: “Local SEO pricing”
I found this one in the term clustering report in Keywords Explorer, as 16 keywords contained the term “local.”
Upon further inspection, I realized these were all different ways of searching for the cost of local SEO services.
I think the problem here is although our post has some data on local SEO pricing, it doesn’t have the snappy figure searchers are likely looking for. Plus, even the information we did have was buried deep on the page.
So… I actually pulled new statistics from the data we collected for the post, then put them under a new H3 titled “How much does local SEO cost?”
Result? Small but notable improvements for this keyword and a few other variations:
Related keyword 4: “How much does SEO cost for a small business”
I saw that one competing page was getting an estimated 105 more monthly organic visits than us from this term.
When clustering by terms in Keywords Explorer, I also saw a cluster of nine keywords containing the word “small.” These were all different ways of searching for small business SEO pricing:
Once again, the issue here is clear: the information people are looking for isn’t on the page. There’s not even a mention of small businesses.
This is good as it means the solution is simple: add an answer to the page. I did this and put it under a new H3 titled “How much does SEO cost for small businesses?”
Result? #15 → #5 for this related keyword, and notable improvements for a few other variations:
Related keyword 5: “SEO pricing models”
This related keyword probably isn’t that important, but I spotted it looking through the Content gap report and thought it’d be pretty easy to optimize for.
All I did was create a new H2 titled “SEO pricing models: a deeper breakdown of costs.” I then briefly explained the three common pricing models under this and re-jigged and nested the rest of the content from the page under there.
Result? #5 → #1:
Final thoughts
Related keyword optimization isn’t about shoehorning a bunch of keyword variations into your content. Google is smart enough to know that things like “SEO” and “search engine optimization” mean the same thing.
Instead, look for keywords that represent subtopics and make sure you’re covering them well. This might involve adding a new section or reformatting an existing section for more clarity.
This is easy to do. It took me around 2-3 hours per page.