Site Quality Is Simpler Than People Think

Google’s John Mueller, Martin Splitt and Gary Illyes discussed site quality in a recent podcast, explaining the different ways of thinking about site quality and at one point saying it’s not rocket science. The discussion suggests that site quality could be simpler than most people think.

Site Quality Is Not Rocket Science

Their first point was a recommendation to read search engine site quality documentation, insisting that site quality is not especially difficult to understand.

Gary Illyes said:

“So I would go to a search engine’s documentation.

Most of them have some documentation about how they function and just try to figure out where your content might be failing or where your page might be failing because honestly, okay, this is patronizing, but it’s not rocket science.”

No Tools For Site Quality – What To Do?

Gary acknowledged that there’s no tool for diagnosing site quality, not in the same way there are tools for objectively detecting technical issues.

The traffic metrics that show a downward movement don’t explain why; they just show that something changed.

Gary Illyes:

“I found the up-down metric completely useless because you still have to figure out what’s wrong with it or why people didn’t like it.

And then you’re like, “This is a perfectly good page. I wrote it, I know that it’s perfect.”

And then people, or I don’t know, like 99.7% of people are downvoting it. And you’re like, ‘Why?’”

Martin Splitt:

“And I think that’s another thing.

How do I spot, I wrote the page, so clearly it is perfect and helpful and useful and amazing, but then people disagree, as you say.

How do you think about that? What do you do then?

How can I make my content more helpful, better, more useful? I don’t know.

…There’s all these tools that I can just look at and I see that something’s good or something’s bad.

But for quality, how do I go about that?”

Gary Illyes:

“What if quality is actually simpler than at least most people think?

…What if it’s about writing the thing that will help people achieve whatever they need to achieve when they come to the page? And that’s it.”

Martin Splitt asked if Gary was talking about reviewing the page from the perspective of the user.

Illyes answered:

“No, we are reframing.”

Reframing generally means to think about the problem differently.

Gary’s example is to reframe the problem as whether the page delivers what it says it’s going to deliver (like helping users achieve X,Y,Z).

Something I see a lot with content is that the topic being targeted (for example, queries about how to catch a trout) isn’t matched by the content (which might actually be about tools for catching trout), which is not what the site visitor wants to achieve.

Quality In Terms Of Adding Value

There are different kinds of things that relate to site and page quality and in the next part of the podcast John Mueller and Gary Illyes discuss the issue about adding something of value.

Adding something of value came up in the context of SERPs that offer good answers from websites that people not only enjoy but also expect to see ranked for those queries.

You can tell that users expect specific sites for individual search queries when Google Suggest shows the brand name alongside the keyword.

That’s a clue that probably a lot of people are turning keywords into branded searches, which signals to Google what people want to see.

So, the problem of quality in those situations isn’t about being relevant for a query with the perfect answer.

For these situations, like for competitive queries, it’s not enough to be relevant or have the perfect answer.

John Mueller explains:

“The one thing I sometimes run into when talking with people is that they’ll be like, “Well, I feel I need to make this page.”

And I made this page for users in air quotes…

But then when I look at the search results, it’s like 9,000 other people also made this page.

It’s like, is this really adding value to the Internet?

And that’s sometimes kind of a weird discussion to have.

It’s like, ‘Well, it’s a good page, but who needs it?’

There are so many other versions of this page already, and people are happy with those.”

This is the type of situation where competitive analysis to “reverse engineer” the SERPs works against the SEO.

It’s a stale approach because using what’s already in the SERPs as a template for what to rank is feeding Google what it already has.

As an example, let’s represent everything already ranked in the SERPs with a baseline score of zero. Less than zero is poor quality. Higher than zero is higher quality.

Zero is not better than zero, it’s just zero.

The SEOs who think they’re reverse engineering Google by copying entities and copying topics are really just achieving a score of zero.

So, according to Mueller, Google responds with, “it’s a good page, but who needs it?”

What Google is looking for in this situation is not the baseline of what’s already in the SERPs, zero.

According to Mueller, they’re looking for something that’s not the same as the baseline.

So in my analogy, Google is looking for something above the baseline of what is already in the SERPs, a number greater than zero, which is a one.

You can’t add value by feeding Google back what’s already there. And you can’t add value by doing the same thing ten times bigger. It’s still the same thing.

Breaking Into The SERPs By The Side Door

Gary Illyes next discusses a way to break into a tough SERP, saying the way to do it is indirectly.

This is an old strategy but a good one that still works today.

So, rather than bringing a knife to a gunfight, Gary Illyes suggests choosing more realistic battles to compete in.

Gary continued the conversation about competing in tough SERPs.

He said:

“…this also is kind of related to the age-old topic that if you are a new site, then how can you break into your niche?

I think on today’s Internet, like back when I was doing ‘SEO’, it was already hard.

For certain topics or niches, it was absolutely a nightmare, like ….mesothelioma….

That was just impossible to break into. Legal topics, it was impossible to break into.

And I think by now, we have so much content on the Internet that there’s a very large number of topics where it is like 15 years ago or 20 years ago, that mesothelioma topic, where it was impossible to break into.

…I remember Matt Cutts, former head of Web Spam, …he was doing these videos.

And in one of the videos, he said try to offer something unique or your own perspective to the thing that you are writing about.

Then the number of perspective or available perspectives, free perspectives, is probably already gone.

But if you find a niche where people are not talking too much about, then suddenly, it’s much easier to break into.

So basically, this is me saying that you can break into most niches if you know what you are doing and if you are actually trying to help people.”

What Illyes is suggesting as a direction is to “know what you are doing and if you are actually trying to help people.”

That’s one of my secrets to staying one step ahead in SEO.

For example, before the reviews update, before Google added Experience to E-A-T, I was telling clients privately to do that for their review pages and I told them to keep it a secret, because I knew I had it dialed in.

I’m not psychic. I was just looking at what Google wants to rank, and I figured out several years before the reviews update that you need to have original photos, hands-on experience with the reviewed product, etc.

Gary’s right when he advises to look at the problem from the perspective of “trying to help people.”

He next followed up with this idea about choosing which battles to fight.

He said:

“…and I think the other big motivator is, as always, money. People are trying to break into niches that make the most money. I mean, duh, I would do the same thing probably.

But if you write about these topics that most people don’t write about, let’s say just three people wrote about it on the Internet, then maybe you can capture some traffic.

And then if you have many of those, then maybe you can even outdo those high-traffic niches.”

Barriers To Entry

What Gary is talking about is how to get around the barriers to entry, which are the established sites. His suggestion is to stay away from offering what everyone else is offering (which is a quality thing).

Creating content that the bigger sites can’t or don’t know to create is an approach I’ve used with a new site.

Weaknesses can be things that the big site does poorly, like their inability to resonate with a younger or older audience and so on.

Those are examples of offering something different that makes the site stand out from a quality perspective.

Gary is talking about picking the battles that can be won, planting a flag, then moving on to the next hill.

That’s a far better strategy than going toe to toe with a bigger opponent.

Analyzing For Quality Issues

It’s a lot easier to analyze a site for technical issues than it is for quality issues.

But a few of the takeaways are:

  • Be aware that the people closest to the content are not always the best judges of content quality.
  • Read Google’s search documentation (for on-page factors, content, and quality guidelines).
  • Content quality is simpler than it seems. Just think about knowing the topic well and being helpful to people.
  • Being original is about looking at the SERPs for things that you can do differently, not about copying what the competitors are doing.

In my experience, it’s super important to keep an open mind and not get locked into one way of thinking, especially when it comes to site quality. A fixed point of view can keep you from seeing the true cause of ranking issues.

Featured Image by Shutterstock/Stone36


10 Completely Free SEO Training Courses

Learning SEO doesn’t have to break the bank. There are plenty of quality free SEO courses teaching everything from the basics to keyword research to link building.

Here are ten that won’t cost a dime.

SEO Course for Beginners by Ahrefs

Course provider: Ahrefs

Duration: 2 hours

Instructor(s): Sam Oh

Level: Beginner

Link: SEO Course for Beginners

What you’ll learn

  • The fundamentals of what search engine optimization is and how it works
  • Why SEO is important
  • How to do keyword research
  • How to optimize web pages for search engines
  • Beginner-friendly link building strategies to get backlinks to your site
  • Technical SEO best practices for beginners

This comprehensive course is ours and covers the fundamentals of SEO, including keyword research, on-page SEO, technical SEO, and link building.

SEO Certification Course by HubSpot

Course provider: HubSpot

Duration: 3 hours 51 minutes

Instructor(s): Rachel Sheldon, Matthew Howells-Barby

Level: Beginner

Link: SEO Certification Course

What you’ll learn

  • How to evaluate and improve your website’s SEO
  • How to build backlinks to your website at scale to increase your website’s visibility in organic search
  • How to use insights from keyword research and reporting to improve your search performance

HubSpot’s SEO Training Course is tailored for marketers, content creators, and anyone looking to enhance their website’s visibility. Through practical lessons and real-world examples, participants will learn how to build a robust SEO strategy, analyze their website’s performance, and adapt to the changing algorithms of search engines.

Make Sure Customers Find You Online by Google Skillshop

Course provider: Google

Duration: 3 hours

Instructor(s): Google

Level: Beginner

Link: Make Sure Customers Find You Online

What you’ll learn

  • How to get started with search
  • How to make search work for you
  • How to get discovered with search
  • How to help people nearby find you online

This free course from Google Skillshop helps businesses discover ways to reach and connect with more customers online. It covers improving SEO and using online advertising (SEM) to boost sales and awareness.

Google SEO Fundamentals by UC Davis on Coursera

Course provider: University of California, Davis

Duration: 28 hours

Instructor(s): Rebekah May

Level: Beginner

Link: Google SEO Fundamentals

What you’ll learn

  • How to complete a competitive analysis on a webpage
  • How to interpret brand recognition through social media
  • How to create sitemaps and robots.txt files, plan redirects, and manage site errors
  • How to use a variety of SEO tools to conduct an audience analysis and develop personas of your ideal buyer

Offered by the University of California, Davis, this course on Coursera delves into the fundamental aspects of SEO, including how search engines work and how to implement effective SEO strategies to attract more organic traffic.

However, due to its length (28 hours), it may not be the most suitable if you want to learn SEO fast.

SEO for Beginners Training by Yoast

Course provider: Yoast

Duration: 2 hours

Instructor(s): Joost de Valk

Level: Beginner

Link: SEO for Beginners Training

What you’ll learn

  • What SEO is and what Google does
  • Tips for quick wins to improve your site
  • Insights into the content and technical side of SEO

This free course discusses what SEO is and how it works. Some of the important points from the course are how to use keywords to optimize your website, how to write content that Google likes, and how to make your website crawlable by search engines.

Keyword Research Course by Ahrefs

Course provider: Ahrefs

Duration: 2 hours

Instructor(s): Sam Oh

Level: Beginner

Link: Keyword Research Course

What you’ll learn

  • How to do keyword research and drive targeted traffic to your website

This is our specialized course that focuses specifically on keyword research. It covers topics such as how to choose keywords, how to analyze search intent, and how to find low-competition keywords.

Technical SEO Course by Ahrefs

Course provider: Ahrefs

Duration: 1 hour 21 minutes

Instructor(s): Sam Oh

Level: Beginner to intermediate

Link: Technical SEO Course

What you’ll learn

  • The fundamentals of technical SEO
  • How to run a technical SEO audit
  • How to optimize your website’s technical SEO

Another specialized course from us, this course is designed for those looking to dive deeper into the technical side of SEO. It covers advanced topics such as site audits, page speed optimization, and how to resolve common technical issues that can impact search rankings.

Technical SEO Certification by Blue Array

Course provider: Blue Array

Duration: 7 hours

Instructor(s): Damion Edwards

Level: Beginner to intermediate

Link: Technical SEO Certification

What you’ll learn

Aimed at professionals seeking to certify their expertise, this course covers a wide range of technical SEO topics, including crawling, indexing, ranking, and on-page optimization. From site architecture to schema markup, it equips learners with the skills to tackle technical challenges and improve website performance.

Local SEO Course by Ahrefs

Course provider: Ahrefs

Duration: 44 minutes

Instructor(s): Sam Oh

Level: Beginner

Link: Local SEO Course

What you’ll learn

  • How to do local SEO
  • How to do local keyword research
  • How to do local link building

Ideal for businesses targeting local customers, this course teaches the basics of optimizing for local search. It covers essential tactics for improving local visibility, such as Google Business Profile optimization and local keyword targeting.

Advanced Link Building Course by Ahrefs

Course provider: Ahrefs

Duration: 1 hour 48 minutes

Instructor(s): Sam Oh

Level: Intermediate to advanced

Link: Advanced Link Building Course

What you’ll learn

  • How to find prospects with the “seed and lookalike” approach
  • How to validate link building campaigns with a “blitz list”
  • How to craft personalized and benefit-rich outreach emails
  • How to create, structure and manage a link building team
  • How to scale your link building operations

Focusing on one of the most challenging aspects of SEO, Sam shares his years of experience creating campaigns, sending outreach emails, and building teams. This is a must-finish course if you need help building and scaling your link building operations.

Final thoughts

The best way to learn SEO is by doing.

So, don’t just go through the courses, take notes, and set them aside. You need to actually execute to find out what works and what doesn’t. Create a website, implement the ideas you’re learning, and see if you can get more organic traffic to it.

That’s how you become an SEO pro.

Google Answers A Crawl Budget Issue Question

Someone on Reddit posted a question about their “crawl budget” issue and asked if a large number of 301 redirects to 410 error responses were causing Googlebot to exhaust their crawl budget. Google’s John Mueller offered a reason to explain why the Redditor may be experiencing a lackluster crawl pattern and clarified a point about crawl budgets in general.

Crawl Budget

It’s a commonly accepted idea that Google has a crawl budget, an idea that SEOs invented to explain why some sites aren’t crawled enough. The idea is that every site is allotted a set number of crawls, a cap on how much crawling a site qualifies for.

It’s important to understand the background of the idea of the crawl budget because it helps understand what it really is. Google has long insisted that there is no one thing at Google that can be called a crawl budget, although how Google crawls a site can give an impression that there is a cap on crawling.

A top Google engineer (at the time) named Matt Cutts alluded to this fact about the crawl budget in a 2010 interview.

Matt answered a question about a Google crawl budget by first explaining that there was no crawl budget in the way that SEOs conceive of it:

“The first thing is that there isn’t really such thing as an indexation cap. A lot of people were thinking that a domain would only get a certain number of pages indexed, and that’s not really the way that it works.

There is also not a hard limit on our crawl.”

In 2017 Google published a crawl budget explainer that brought together numerous crawling-related facts that together resemble what the SEO community was calling a crawl budget. This new explanation is more precise than the vague catch-all phrase “crawl budget” ever was (Google crawl budget document summarized here by Search Engine Journal).

The short list of the main points about crawl budget is:

  • A crawl rate is the number of URLs Google can crawl based on the ability of the server to supply the requested URLs.
  • A shared server, for example, can host tens of thousands of websites, resulting in hundreds of thousands, if not millions, of URLs. So Google has to crawl servers based on their ability to comply with requests for pages.
  • Pages that are essentially duplicates of others (like faceted navigation) and other low-value pages can waste server resources, limiting the amount of pages that a server can give to Googlebot to crawl.
  • Lightweight pages are easier for Googlebot to crawl in greater numbers.
  • Soft 404 pages can cause Google to focus on those low-value pages instead of the pages that matter.
  • Inbound and internal link patterns can help influence which pages get crawled.
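One common source of the near-duplicate pages mentioned above is faceted navigation, where many URLs differ only in sort or filter parameters. As a minimal sketch (the parameter names and URLs are hypothetical), collapsing such URLs to one canonical form shows how much crawl capacity they waste:

```python
# Collapse faceted-navigation URLs that differ only in sort/filter parameters
# down to one canonical URL each. Parameter names here are hypothetical.
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

FACET_PARAMS = {"sort", "color", "sessionid"}  # params that don't change content

def canonicalize(url):
    """Strip facet parameters and rebuild the URL in a stable form."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(sorted(kept)), ""))

urls = [
    "https://example.com/shoes?color=red&sort=price",
    "https://example.com/shoes?sort=rating",
    "https://example.com/shoes",
]
print(len({canonicalize(u) for u in urls}))  # all three collapse to 1
```

In practice, the URL list would come from server logs or a crawl export, and the canonical form would then be declared with rel=canonical or kept out of crawl paths entirely.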

Reddit Question About Crawl Rate

The person on Reddit wanted to know if the perceived low-value pages they were creating were influencing Google’s crawl budget. In short, a request for the non-secure URL of a page that no longer exists redirects to the secure version of the missing webpage, which serves a 410 error response (meaning the page is permanently gone).

It’s a legitimate question.

This is what they asked:

“I’m trying to make Googlebot forget to crawl some very-old non-HTTPS URLs, that are still being crawled after 6 years. And I placed a 410 response, in the HTTPS side, in such very-old URLs.

So Googlebot is finding a 301 redirect (from HTTP to HTTPS), and then a 410.

http://example.com/old-url.php?id=xxxx -301-> https://example.com/old-url.php?id=xxxx (410 response)

Two questions. Is G**** happy with this 301+410?

I’m suffering ‘crawl budget’ issues, and I do not know if this two responses are exhausting Googlebot

Is the 410 effective? I mean, should I return the 410 directly, without a first 301?”

Google’s John Mueller answered:

“G*?

301’s are fine, a 301/410 mix is fine.

Crawl budget is really just a problem for massive sites ( https://developers.google.com/search/docs/crawling-indexing/large-site-managing-crawl-budget ). If you’re seeing issues there, and your site isn’t actually massive, then probably Google just doesn’t see much value in crawling more. That’s not a technical issue.”
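Mueller’s answer can be illustrated with a toy chain-walker. This is only a sketch; the URL, the response map, and the function are hypothetical, not anything Google publishes:

```python
# Walk a hypothetical redirect map and return the status codes a crawler
# would see for a URL. Responses here are made up for illustration.

RESPONSES = {
    "http://example.com/old-url.php?id=1":  (301, "https://example.com/old-url.php?id=1"),
    "https://example.com/old-url.php?id=1": (410, None),  # permanently gone
}

def crawl_chain(url, max_hops=5):
    """Follow redirects in RESPONSES and collect the status codes seen."""
    seen = []
    for _ in range(max_hops):
        status, target = RESPONSES[url]
        seen.append(status)
        if target is None:       # terminal response (here: the 410)
            break
        url = target             # follow the 301 to the HTTPS URL
    return seen

print(crawl_chain("http://example.com/old-url.php?id=1"))  # [301, 410]
```

Serving the 410 directly on the HTTP URL would shorten the chain to a single hop, but per Mueller, the 301-then-410 pattern is fine either way.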

Reasons For Not Getting Crawled Enough

Mueller responded that “probably” Google isn’t seeing the value in crawling more webpages. That means that the webpages could probably use a review to identify why Google might determine that those pages aren’t worth crawling.

Certain popular SEO tactics tend to create low-value webpages that lack originality. For example, a popular SEO practice is to review the top ranked webpages to understand what factors on those pages explain why those pages are ranking, then taking that information to improve their own pages by replicating what’s working in the search results.

That sounds logical, but it’s not creating something of value. If you think of it as a binary One and Zero choice, where Zero is what’s already in the search results and One represents something original and different, the popular SEO tactic of emulating what’s already in the search results is doomed to create another Zero: a website that doesn’t offer anything more than what’s already in the SERPs.

Clearly, there are technical issues that can affect the crawl rate, such as server health.

But in terms of what is understood as a crawl budget, that’s something Google has long maintained is a consideration for massive sites, not for small to medium-sized websites.

Read the Reddit discussion:

Is G**** happy with 301+410 responses for the same URL?

Featured Image by Shutterstock/ViDI Studio

SEOs, Are You Using These 6 Mental Models?

People use mental models to comprehend reality, solve problems, and make decisions in everyday life. SEO is not an exception here, yet it’s not a topic you often hear about in this industry.

The thing is, you need to be careful with mental models because they’re sneaky. We tend to develop them over our lives, inherit them from colleagues and mentors, and rely on them almost instinctively, without being fully aware of their influence or of the existence of better alternatives.

So, let’s talk about mental models you will find helpful in your SEO work and the ones you should approach with caution.

3 helpful mental models

In the noisy, uncertain world of SEO, these will be your north star.

First principles thinking is a problem-solving approach that involves breaking down complex problems into their most basic elements and reassembling them from the ground up.

It’s about asking oneself what is absolutely true about a situation and then reasoning up from there to create new solutions.

Using first principles thinking to rearrange the same building blocks into a brand new shape. 

Uncertainty is a chronic condition in SEO, and it is so by design, because the whole industry is built around Google’s secrets. Access to the truth is extremely limited. We’ve gotten so used to accepting speculation and theories about SEO that we’ve started to crave them.

This is where first principles come in. Whenever you need a brand new solution for a problem, or when you feel that you’ve gone too far into speculation, come back to the first principles: the things that have the best chance of being true in this industry.

The Pareto Principle (aka the 80/20 rule) is about a disproportionate relationship between inputs and outputs, effort and results, or causes and effects. A small number of causes (20%) often leads to a large number of effects (80%).

The Pareto principle
The 80/20 rule: 80% of results come from 20% of the projects.

This concept was named after Vilfredo Pareto, an Italian economist who, in 1906, noticed that 80% of Italy’s land was owned by 20% of the population.

If we use this principle as a mental model in decision-making, we’ll find it easier to prioritize work. It’s ok to ignore some things because they likely won’t matter that much. The result that you’re after will come from focusing on the things that will likely have the biggest impact, and not from spreading yourself too thin.

For example, if you want to build links to your site, pitch your best content. That can be the content that has already proven to earn links in the past.

Best by links report in Ahrefs.

Or if you need to recover some of that lost traffic, home in on the pages that lost the most traffic.

Top pages report in Ahrefs.

The key is to treat the 80/20 as an approximation, a heuristic, and not take the numbers literally. To illustrate, roughly 80% of our site’s traffic comes from no more than 6% of pages.

Total organic traffic breakdown in Ahrefs.

But on the other hand, if we try to find the top 20% of pages that contribute the most traffic, we’ll find that they bring not 80% but 96.8% of it. However you look at it, the idea still holds: a small number of causes leads to a large portion of effects.
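That check is easy to reproduce on your own traffic export. A minimal sketch, with made-up per-page traffic figures:

```python
# Given per-page traffic counts, find what share of total traffic
# the top X% of pages account for. Figures here are hypothetical.

def traffic_share_of_top(pages, top_fraction):
    """Share of total traffic contributed by the top `top_fraction` of pages."""
    ranked = sorted(pages, reverse=True)           # busiest pages first
    k = max(1, round(len(ranked) * top_fraction))  # how many pages are "top"
    return sum(ranked[:k]) / sum(ranked)

# A skewed, Pareto-like distribution: a few pages dominate.
traffic = [9000, 4200, 1100, 300, 120, 90, 60, 40, 25, 15]

print(round(traffic_share_of_top(traffic, 0.2), 3))  # -> 0.883
```

With this skewed distribution, the top 20% of pages carry roughly 88% of the traffic: the same shape, if not the same numbers, that the 80/20 heuristic predicts.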

“It takes all the running you can do, to keep in the same place.”

Sounds very much like SEO already, doesn’t it?

This quote comes from Lewis Carroll’s “Through the Looking-Glass,” and it’s how the Red Queen explains to Alice the nature of her kingdom, where it requires constant effort just to maintain one’s current position.

It was used to name an evolutionary biology theory which posits that each species must adapt and evolve not just for incremental gains but for survival, as their competitors are also evolving. Sorry, we’re in an endless race.

The Red Queen Theory as an endless race.
SEO is like a road with no finish line—the race continues forever.

You can probably already guess how this applies to SEO — rankings. If you want to maintain high rankings, you can’t stop improving your pages. There will always be enough competitors to challenge your position.

But in our world, pressure comes from competitors and the environment. Google keeps evolving too, pushing the bar for content higher, making elements that used to give you an edge a standard.

I’m sure we’ve all been there – even our top backlink-earning, top traffic-generating, most time-consuming content gets pushed down. But if you keep optimizing, you get a chance to come back to the top.

Position history graph in Ahrefs.

This mental model is another way of saying that SEO works best as an always-on strategy without a set end date or final goal.

3 mental models to watch out for

It’s not so much about avoiding them but being able to spot them when they happen or could happen.

A local maximum (aka local optimum) refers to a solution that is the best solution within a neighboring set of solutions, but not necessarily the best possible solution overall (global optimum).

Local maxima.

So if you feel that you’re spending immense effort just to make marginal improvements, you have to be willing to assume that you’ve hit a local maximum. Then the question to ask is: what can I do differently?

Here’s an example.

Until November last year, traffic to our site was a series of local optima. Our content marketing was delivering the results, but the growth was relatively slow. Obviously, we were doing the same tried and tested stuff. But then we launched two programmatic SEO projects that instantly elevated us to a level we’d have to work years for — look how fast the yellow line grew (pages) and how that corresponded with the orange line (traffic).

Organic performance graph in Ahrefs.
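The local-maximum trap can be made concrete with a toy hill-climb: a greedy search that only accepts neighboring improvements stops at the nearest peak, even when a higher one exists further away. The landscape values below are invented for illustration:

```python
# Toy illustration of a local maximum: greedy hill-climbing stops at the
# nearest peak, which may not be the global optimum. Values are made up.

LANDSCAPE = [1, 3, 5, 4, 2, 6, 9, 7]  # "traffic" at each position

def hill_climb(start):
    """Move to a neighboring position only while it strictly improves."""
    pos = start
    while True:
        neighbors = [p for p in (pos - 1, pos + 1) if 0 <= p < len(LANDSCAPE)]
        best = max(neighbors, key=lambda p: LANDSCAPE[p])
        if LANDSCAPE[best] <= LANDSCAPE[pos]:
            return pos  # no neighbor is better: a (possibly local) maximum
        pos = best

print(LANDSCAPE[hill_climb(0)], max(LANDSCAPE))  # stuck at 5, global peak is 9
```

Escaping the lower peak requires a jump to a different starting point, which is what the programmatic SEO projects in the example amounted to.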

The sunk cost fallacy is a cognitive bias that occurs when people continue to do something as a result of previously invested resources (time, money, effort) despite new evidence suggesting that the current path will not lead to a beneficial outcome.

Sunk cost fallacy as a graph.
Sunk cost in action: the more you invest in something, the more attached to it you become.

We all know SEO is a long-term game, right? SEO strategies are crowded with long-term projects involving big investments of time and money. Sometimes, despite the investments, you just can’t get beyond a certain level of traffic, backlinks, etc.

Now, this mental model, this voice in your head, will tell you to keep going down the same path no matter what. Loss aversion kicks in, acting like a defense mechanism for your past self and actions. And the more aggressive and blind the “hustle” culture is on a team, the harder it is to see clearly.

But, overall, it could be better for you and the company to let it go and focus on something else. You can even come back to it later with a fresh mind. But continuing something just because you’ve been doing it for some time is a losing strategy.

For example: despite several attempts and years of effort, Ahrefs doesn’t rank for “seo”.

Position history for "seo" via Ahrefs.

Sad but true. And from our point of view, it’s frustrating. Almost like we’re the only ones not to get invited to the party, the only ones not to graduate from high school… you get the idea.

But not ranking for “SEO” hasn’t hindered our growth, so it’s better to cut losses and deal with unfulfilled ambition than to let that goal hold us back from other projects (like that programmatic project mentioned above).

Confirmation bias is the tendency to give more attention and weight to data that support one’s own beliefs, while simultaneously dismissing or underestimating evidence that contradicts those beliefs.

Confirmation bias - beliefs outweigh the facts.

We’re all guilty of this. It’s human nature. And it’s not exclusively a bad thing. I mean, in some situations, this tendency can keep us on “the bright side” and help us get through tough times or keep our motivation up.

So, I think that it’s not something to get out of your system completely. Just be mindful of situations where this can negatively affect your judgment:

  • Selective evidence in ranking factors. You see a page ranking high, and you think it’s because of an aspect you strongly believe in, disregarding all of the evidence against it (e.g., long-form content, social signals).
  • Bias in keyword selection. Your keyword selection runs along the lines of your beliefs about the audience preferences without substantial evidence to back up these beliefs.
  • Bias in strategy development. After developing a new strategy, you encounter a talk or an article advocating a similar approach, which immediately reinforces your confidence in this strategy.
  • Focus on confirmatory data during audits. During a content audit, you find a small piece of data that confirms your belief. As a result, you may prioritize minor findings over more significant but less personally affirming data.
  • Overconfidence in familiar tactics. Leaning on SEO tactics that have worked in the past, you develop a sense of overconfidence in them. You resist trying anything new or the idea that a dip in performance comes from an unfamiliar factor.

Keep learning

If you like what you’re reading, I think you will find other mental models fascinating, too.

Want to share models you find useful? Ping me on X or LinkedIn.


