
SEO

How “Deep Content” Will Protect Your SEO in the AI Era


SEOs are panicking that AI is taking over and their jobs are done. But I’m uncharacteristically optimistic this isn’t the case.

Yes, AI has changed SEO forever (and will continue to do so), but it’s not going to kill it. We just need to adapt. How? By prioritizing “deep content” about topics that can’t be answered quickly or easily.

Here are three reasons why I think this is the future of SEO.

If you ask ChatGPT how to reinstall macOS, you can easily follow its instructions to get the job done. It clearly explains what you need to do and how to do it.


This is the kind of “shallow” topic that AI is coming for.

Once Google rolls out SGE to the masses (which will probably happen soon), searchers won’t need to click for answers to these kinds of questions. They’ll be right there in the search results courtesy of AI. The robots will steal the traffic.

My prediction is that clicks to shallow topics will go to AI in the future. The only way to future-proof your SEO is to tackle deep topics.

Compare this to ChatGPT’s answer for a topic like how to run a content audit:

ChatGPT explaining how to run a content audit. The result leaves a lot to be desired.

Even with half the answer cut off, the issue is clear: it tells us what to do but not how to do it.

  • How do we choose the right objective?
  • How do we pull data from Google Analytics, our CMS, and website crawlers to compile a content inventory?
  • Which are the “appropriate tools” that will help with the audit?

Because this is a deeper topic, it needs a deeper answer. You can’t get that from AI. You need to click a result to find a tutorial from someone with actual experience performing content audits.

These are the kinds of topics you should be prioritizing in an AI world.

How can you find “deep” topics?

There’s no exact science. It’s largely about knowing your industry well and applying common sense. However, if you’re doing keyword research, you can narrow things down by excluding keywords that trigger featured snippets. After all, if Google thinks a query can already be answered well with a featured snippet, an AI answer will definitely do the job.

You can do this in Keywords Explorer with the SERP features filter:

Filtering for keywords without featured snippets in Ahrefs' Keywords Explorer

You can also use the “Identify intent” button in Keywords Explorer to learn more about what searchers are looking for. If it seems like something that couldn’t be answered quickly or easily, it’s probably a “deep” topic.

For example, it tells us that many of the people searching for “content audit” want “a detailed process for conducting a content audit, including templates”:

Using AI to identify keyword intent in Ahrefs' Keywords Explorer

It’s going to be virtually impossible for generative AI to give searchers this—especially the templates.

If you’re still not sure whether a topic is “deep” enough for AI to fail, paste your topic into ChatGPT or Gemini and see what it generates. If it leaves much to be desired and only tells you the what, not the how, then it’s probably a deep topic.

Backlinks are still a ranking factor. AI hasn’t changed that. You need to earn them if you want to rank for anything competitive, and the best way to do that is to showcase unique experiences, expertise, and data in your content.

For example, we got a link from adobe.com (DR 96) when they cited a statistic from our study on how many pages get no search traffic:

Example of a backlink earned with data

And we got a link from hubspot.com (DR 93) when they cited the SEO report template we made:

Example of a backlink earned with a template

But here’s the problem:

It’s hard to do this for shallow topics because there’s not much you can add. 


For example, take the topic of how to reinstall macOS. What exactly can you write here beyond the same basic instructions featured in every other post? Nothing. It’s virtually impossible to make “linkable” content about this topic. It’s too shallow.

It’s hard to rank for shallow topics because your content is never unique. It’s just a bunch of words that have already been said a million times. There’s nothing underneath the surface for people to cite and link to.

It’s what’s below the surface of your content that earns backlinks!

It’s hard to make shallow topics interesting. I think that’s why there’s so much dull “SEO content” out there. You know the kind of stuff I’m talking about: no personality, just vague answers to boring questions nested under keyword-rich H2s.

Now, I know what you’re probably thinking:

“But Josh, this is what works! We’re only doing it because it’s what Google wants!”

That might be true for shallow topics, but we already discussed how AI will steal the traffic from these in the not-so-distant future. For deep topics that require more explanation, your content needs to be engaging and interesting.

There are (at least) two reasons for this.


Interesting content = “information gain”

Even if bringing something new and interesting to the table doesn’t earn you more backlinks, it may still help you rank higher in Google.

That’s because Google cares about the originality of content and almost certainly has mechanisms in place to identify and reward it. They even patented a mechanism for scoring “information gain” back in 2022.
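Nobody outside Google knows how that scoring actually works, but a toy proxy illustrates the idea: measure how much of a new document's phrasing is absent from an already-published corpus. The sketch below counts unseen word n-grams — a conceptual illustration only, not Google's patented mechanism.

```python
def novelty_score(new_text: str, corpus: list[str], n: int = 3) -> float:
    """Fraction of the new document's word n-grams not present anywhere
    in the corpus. 1.0 means entirely novel phrasing; 0.0 means the
    document adds nothing the corpus didn't already say.
    """
    def ngrams(text: str) -> set[tuple[str, ...]]:
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    new = ngrams(new_text)
    if not new:
        return 0.0
    seen = set().union(*(ngrams(doc) for doc in corpus)) if corpus else set()
    return len(new - seen) / len(new)
```

A post that merely restates what every other article says would score near zero here — the "shallow content" problem in miniature.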

Engaging content = better user signals

If you want to spark debate among SEOs, steer the conversation towards user signals.

Many in the industry have been convinced for years that user signals like click-through rate are ranking signals. You might even be familiar with Rand Fishkin’s infamous mini-experiment from 2014 where he asked his Twitter followers to click on a search result en masse, leading to a #1 ranking that evening.

Google says signals like these aren’t ranking factors because they’re too noisy.


If you think about it, clicks in general are incredibly noisy. People do weird things on the search result pages. They click around like crazy, and in general it’s really, really hard to clean up that data.

Gary Illyes

But… they also say this on their “How Search Works” page:

We also use aggregated and anonymised interaction data to assess whether Search results are relevant to queries. We transform that data into signals that help our machine-learned systems better estimate relevance.

Tomayto, tomahto. Directly or indirectly, user data influences Google’s algorithms. If all you’re publishing is dull, uninspiring content that AI could write in a heartbeat, nobody is going to read or engage with it. This might negatively impact your ability to rank.

Final thoughts

If you want the ChatGPT-esque summary of this post, it’s this: prioritize deep topics that AI will struggle to answer and create interesting and engaging content about them. That’s how you build an SEO moat in an AI world.

I know that can seem like a waste of time, especially in a world of ever-decreasing attention spans. It’s easy to convince yourself that dull AI answers are what people actually want.

This is exactly how I felt a few months ago… before Tim kindly set me straight 😅

Tim's comment to me a few months ago. Very wise, if you ask me :)

Maybe it’s just because he’s my boss and I respect him, but I think there’s some real truth to this comment. People still want to read content, but only if it’s engaging and deep.

So, let’s make it our collective mission to pick wiser battles and craft interesting content that stands the test of time.






SEO

Google Limits News Links In California Over Proposed ‘Link Tax’ Law


Google announced that it plans to reduce access to California news websites for a portion of users in the state.

The decision comes as Google prepares for the potential passage of the California Journalism Preservation Act (CJPA), a bill requiring online platforms like Google to pay news publishers for linking to their content.

What Is The California Journalism Preservation Act?

The CJPA, introduced in the California State Legislature, aims to support local journalism by creating what Google refers to as a “link tax.”

If passed, the Act would force companies like Google to pay media outlets when sending readers to news articles.

However, Google believes this approach is misguided and could harm rather than help the news industry.


Jaffer Zaidi, Google’s VP of Global News Partnerships, stated in a blog post:

“It would favor media conglomerates and hedge funds—who’ve been lobbying for this bill—and could use funds from CJPA to continue to buy up local California newspapers, strip them of journalists, and create more ghost papers that operate with a skeleton crew to produce only low-cost, and often low-quality, content.”

Google’s Response

To assess the potential impact of the CJPA on its services, Google is running a test with a percentage of California users.

During this test, Google will remove links to California news websites that the proposed legislation could cover.

Zaidi states:

“To prepare for possible CJPA implications, we are beginning a short-term test for a small percentage of California users. The testing process involves removing links to California news websites, potentially covered by CJPA, to measure the impact of the legislation on our product experience.”

Google Claims Only 2% of Search Queries Are News-Related

Zaidi highlighted people’s changing news consumption habits and their effect on Google search queries (emphasis mine):

“It’s well known that people are getting news from sources like short-form videos, topical newsletters, social media, and curated podcasts, and many are avoiding the news entirely. In line with those trends, just 2% of queries on Google Search are news-related.”

Despite the low percentage of news queries, Google wants to continue helping news publishers gain visibility on its platforms.


However, the “CJPA as currently constructed would end these investments,” Zaidi says.

A Call For A Different Approach

In its current form, Google maintains that the CJPA undermines news in California and could leave all parties worse off.

The company urges lawmakers to consider alternative approaches supporting the news industry without harming smaller local outlets.

Google argues that, over the past two decades, it’s done plenty to help news publishers innovate:

“We’ve rolled out Google News Showcase, which operates in 26 countries, including the U.S., and has more than 2,500 participating publications. Through the Google News Initiative we’ve partnered with more than 7,000 news publishers around the world, including 200 news organizations and 6,000 journalists in California alone.”

Zaidi suggested that a healthy news industry in California requires support from the state government and a broad base of private companies.

As the legislative process continues, Google is willing to cooperate with California publishers and lawmakers to explore alternative paths that would allow it to continue linking to news.


Featured Image: Ismael Juan/Shutterstock



SEO

The Best of Ahrefs’ Digest: March 2024


Every week, we share hot SEO news, interesting reads, and new posts in our newsletter, Ahrefs’ Digest.

If you’re not one of our 280,000 subscribers, you’ve missed out on some great reads!

Here’s a quick summary of my personal favorites from the last month:

Best of March 2024

How 16 Companies are Dominating the World’s Google Search Results

Author: Glen Allsopp

tl;dr

Glen’s research reveals that just 16 companies representing 588 brands get 3.5 billion (yes, billion!) monthly clicks from Google.

My takeaway

Glen pointed out some really actionable ideas in this report, such as the fact that many of the brands dominating search are adding mini-author bios.

Example of mini-author bios on The Verge

This idea makes so much sense in terms of both UX and E-E-A-T. I’ve already pitched it to the team and we’re going to implement it on our blog.

How Google is Killing Independent Sites Like Ours

Authors: Gisele Navarro, Danny Ashton

tl;dr

Big publications have gotten into the affiliate game, publishing “best of” lists about everything under the sun. And despite often not testing products thoroughly, they’re dominating Google rankings. The result, Gisele and Danny argue, is that genuine review sites suffer and Google is fast losing content diversity.

My takeaway

I have a lot of sympathy for independent sites. Some of them are trying their best, but unfortunately, they’re lumped in with thousands of others who are more than happy to spam.

Estimated search traffic to Danny and Gisele’s site fell off a cliff after Google’s March updates 🙁

I know it’s hard to hear, but the truth is Google benefits more from having big sites in the SERPs than from having diversity. That’s because results from big brands are likely what users actually want. By and large, people would rather shop at Walmart or ALDI than at a local store or farmer’s market.

That said, I agree with most people that Forbes (with its dubious contributor model contributing to scams and poor journalism) should not be rewarded so handsomely.

The Discussion Forums Dominating 10,000 Product Review Search Results

Author: Glen Allsopp

tl;dr

Glen analyzed 10,000 “product review” keywords and found discussion forums dominating many of the results.


My takeaway

After Google’s heavy promotion of Reddit since last year’s core updates, unscrupulous SEOs and marketers have, to no one’s surprise, already started spamming it. And as you may know, Reddit’s moderation is done by volunteers, who obviously can’t keep up.

I’m not sure how this second-order effect completely escaped the smart minds at Google, but from the outside, it feels like Google has capitulated to some extent.

John Mueller seemingly having too much faith in Reddit...

I’m not one to make predictions and I have no idea what will happen next, but I agree with Glen: Google’s results are the worst I’ve seen them. We can only hope Google sorts itself out.

Who Sends Traffic on the Web and How Much? New Research from Datos & SparkToro

Author: Rand Fishkin

tl;dr

63.41% of all U.S. web traffic referrals from the top 170 sites are initiated on Google.com.

Data from SparkToro

My takeaway

Despite all of our complaints, Google is still the main platform to acquire traffic from. That’s why we all want Google to sort itself out and do well.

But it would also be a mistake to look at this post and think Google is the only channel you should drive traffic from. As Rand’s later blog post clarifies, “be careful not to ascribe attribution or credit to Google when other investments drove the real value.”

I think many affiliate marketers learned this lesson well from the past few Core Updates: Relying on one single channel to drive all of your traffic is not a good idea. You should be using other platforms to build brand awareness, interest, and demand.

Want more?

Each week, our team handpicks the best SEO and marketing content from around the web for our newsletter. Sign up to get them directly in your inbox.





SEO

Google Unplugs “Notes on Search” Experiment


Google is shutting down its “Notes on Search” experiment in Search Labs, which allowed users to see and leave notes on Google’s search results — and many in the search community aren’t too surprised.

Google Search Notes

The feature was only available on Android and Apple devices, and the Notes experiment never had a clearly defined practical purpose. Search marketers’ consistent reaction was that it would become a spam magnet.

The Search Labs page for the experiment touts it as a mode of self-expression, a way to help other users, and a way for users to collect their own notes within their Google profiles.

The official Notes page in Search Labs has a simple notice:

Notes on Search Ends May 2024

That’s it.



Reaction From Search Community

Kevin Indig tweeted that anything Google launches with a user-generated content component is doomed to attract spam.

He tweeted:

“I’m gonna assume Google retires notes because of spam.

It’s crazy how spammy the web has become. Google can’t launch anything UGC without being bombarded.”

Cindy Krum (@Suzzicks) tweeted that it was author Purna Virji who predicted the feature would be shut down once Google had received enough data.

She shared:


“It was actually @purnavirji who predicted it when we were at @BarbadosSeo – while I was talking. Everyone agreed that it would be spammed, but she said it would just be a test to collect a certain type of information until they got what they needed, and then it would be retired.”

Purna herself responded with a tweet:

“My personal (non-employer) opinion is that everyone wants all the UGC to train the AI models. Eg Reddit deal also could potentially help with that.”

Google’s Notes on Search never seemed destined to take off. It was met with skepticism and a shrug when it launched, and nobody is really mourning that it’s on the way out, either.

Featured Image by Shutterstock/Jamesbin


