
Research Shows Tree Of Thought Prompting Better Than Chain Of Thought


Researchers discovered a way to defeat the safety guardrails in GPT4 and GPT4-Turbo, unlocking the ability to generate harmful and toxic content, essentially beating a large language model with another large language model.

The researchers discovered that using tree-of-thought (ToT) reasoning to repeat and refine a line of attack was useful for jailbreaking another large language model.

What they found is that the ToT approach was successful against GPT4, GPT4-Turbo, and PaLM-2, using a remarkably low number of queries to obtain a jailbreak, on average less than thirty queries.

Tree Of Thoughts Reasoning

A Google research paper from around May 2022 introduced Chain of Thought prompting.

Chain of Thought (CoT) is a prompting strategy used on a generative AI to make it follow a sequence of steps in order to solve a problem and complete a task. The CoT method is often accompanied with examples to show the LLM how the steps work in a reasoning task.


So, rather than just ask a generative AI like Midjourney or ChatGPT to do a task, the chain of thought method instructs the AI how to follow a path of reasoning that’s composed of a series of steps.

Tree of Thoughts (ToT) reasoning, sometimes referred to as Tree of Thought (singular), builds on CoT, but the two are distinct approaches.

Tree of Thoughts reasoning is similar to CoT. The difference is that rather than guiding a generative AI down a single path of reasoning, ToT allows for multiple paths, so that the AI can stop, self-assess, and come up with alternate steps.
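The branch-and-self-assess idea can be sketched as a simple search loop. This is an illustrative sketch only, not the paper’s implementation: the `propose` and `score` functions are hypothetical stand-ins for the LLM calls that generate and self-evaluate candidate “thoughts.”

```python
# Minimal sketch of a Tree of Thoughts control loop (illustrative only;
# the actual framework uses an LLM to propose and score thoughts).
def tree_of_thoughts(root, propose, score, beam_width=3, depth=3):
    """Breadth-first search over partial reasoning paths.

    propose(path) -> list of candidate next thoughts
    score(path)   -> float, a self-assessment of how promising a path is
    """
    frontier = [[root]]
    for _ in range(depth):
        # Branch: extend every kept path with each proposed next thought.
        candidates = [path + [t] for path in frontier for t in propose(path)]
        # Self-assess each partial path and keep only the most promising
        # ones -- the pruning that a single linear chain of thought lacks.
        candidates.sort(key=score, reverse=True)
        frontier = candidates[:beam_width]
    return frontier[0]
```

With CoT, by contrast, the model commits to one path and never backtracks; the loop above is what lets a dead end be abandoned mid-search.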

Tree of Thoughts reasoning was developed in May 2023 in a research paper titled Tree of Thoughts: Deliberate Problem Solving with Large Language Models (PDF).

The research paper describes Tree of Thought:

“…we introduce a new framework for language model inference, Tree of Thoughts (ToT), which generalizes over the popular Chain of Thought approach to prompting language models, and enables exploration over coherent units of text (thoughts) that serve as intermediate steps toward problem solving.

ToT allows LMs to perform deliberate decision making by considering multiple different reasoning paths and self-evaluating choices to decide the next course of action, as well as looking ahead or backtracking when necessary to make global choices.


Our experiments show that ToT significantly enhances language models’ problem-solving abilities…”

Tree Of Attacks With Pruning (TAP)

This new method of jailbreaking large language models is called Tree of Attacks with Pruning, TAP. TAP uses two LLMs, one for attacking and the other for evaluating.

TAP is able to outperform other jailbreaking methods by significant margins, only requiring black-box access to the LLM.

In computing, a black box is a system where one can see what goes into an algorithm and what comes out, but what happens in the middle is hidden; thus it’s said to be in a black box.

TAP applies tree-of-thoughts reasoning against a targeted LLM like GPT-4, repetitively trying different prompts, assessing the results, and changing course if an attempt is not promising.

This is a process of iteration and pruning. Each prompting attempt is analyzed for its probability of success. If a path of attack is judged to be a dead end, the LLM will “prune” that path and begin another, more promising series of prompting attempts.


This is why it’s called a “tree”: rather than the linear reasoning process that is the hallmark of chain-of-thought (CoT) prompting, tree-of-thought prompting is non-linear, branching off into other lines of reasoning much as a human might.

The attacker issues a series of prompts, and the evaluator assesses the responses, judging whether the current path of attack is relevant and deciding what the next path will be. The evaluator also estimates the likely success of prompts that have not yet been tried.
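The attack-evaluate-prune loop described above might look like the sketch below. This is a hypothetical illustration, not the researchers’ code: `attacker`, `evaluator`, and `target` are placeholder callables standing in for the three model roles.

```python
# Illustrative sketch of a TAP-style loop. The attacker refines prompts,
# the evaluator prunes and judges, and the target is the model under attack.
def tap_attack(goal, attacker, evaluator, target,
               branching=4, max_depth=10, threshold=0.5):
    frontier = [attacker(goal, history=[])]  # initial candidate prompt(s)
    for _ in range(max_depth):
        next_frontier = []
        for prompt in frontier:
            # Prune off-topic or unpromising prompts BEFORE querying the
            # target -- this is what keeps the total query count low.
            if evaluator.on_topic_score(goal, prompt) < threshold:
                continue
            response = target(prompt)
            if evaluator.is_jailbroken(goal, prompt, response):
                return prompt  # a successful jailbreak prompt
            # Branch: refine this attempt into several new candidates.
            next_frontier += [attacker(goal, history=[(prompt, response)])
                              for _ in range(branching)]
        frontier = next_frontier
    return None  # gave up within the depth budget
```

The key design choice is that pruning happens before the target is queried, so only promising branches consume queries against the target model.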

What’s remarkable about this approach is that this process reduces the number of prompts needed to jailbreak GPT-4. Additionally, a greater number of jailbreaking prompts are discovered with TAP than with any other jailbreaking method.

The researchers observe:

“In this work, we present Tree of Attacks with Pruning (TAP), an automated method for generating jailbreaks that only requires black-box access to the target LLM.

TAP utilizes an LLM to iteratively refine candidate (attack) prompts using tree-of-thoughts reasoning until one of the generated prompts jailbreaks the target.

Crucially, before sending prompts to the target, TAP assesses them and prunes the ones unlikely to result in jailbreaks.


Using tree-of-thought reasoning allows TAP to navigate a large search space of prompts and pruning reduces the total number of queries sent to the target.

In empirical evaluations, we observe that TAP generates prompts that jailbreak state-of-the-art LLMs (including GPT4 and GPT4-Turbo) for more than 80% of the prompts using only a small number of queries. This significantly improves upon the previous state-of-the-art black-box method for generating jailbreaks.”

Tree Of Thought (ToT) Outperforms Chain Of Thought (CoT) Reasoning

Another interesting conclusion reached in the research paper is that, for this particular task, ToT reasoning outperforms CoT reasoning, even when pruning is added to the CoT method so that off-topic prompts are discarded.

ToT Underperforms With GPT 3.5 Turbo

The researchers discovered that TAP didn’t perform well with GPT-3.5 Turbo, revealing the limitations of that model. In fact, it performed exceedingly poorly, with the success rate dropping from 84% to only 4.2%.

This is their observation about why GPT 3.5 underperforms:

“We observe that the choice of the evaluator can affect the performance of TAP: changing the attacker from GPT4 to GPT3.5-Turbo reduces the success rate from 84% to 4.2%.

The reason for the reduction in success rate is that GPT3.5-Turbo incorrectly determines that the target model is jailbroken (for the provided goal) and, hence, preemptively stops the method.


As a consequence, the variant sends significantly fewer queries than the original method…”

What This Means For You

While it’s amusing that the researchers use the ToT method to beat an LLM with another LLM, it also highlights the usefulness of ToT for generating surprising new directions in prompting in order to achieve higher levels of output.

TL;DR takeaways:

  • Tree of Thought prompting outperformed Chain of Thought methods
  • GPT-3.5 performed significantly worse than GPT-4 within TAP
  • Pruning is a useful part of a prompting strategy
  • Research showed that ToT is superior to CoT in an intensive reasoning task like jailbreaking an LLM

Read the original research paper:

Tree of Attacks: Jailbreaking Black-Box LLMs Automatically (PDF)

Featured Image by Shutterstock/THE.STUDIO

10 Completely Free SEO Training Courses

Learning SEO doesn’t have to break the bank. There are plenty of quality free SEO courses teaching everything from the basics to keyword research to link building.

Here are ten that won’t cost a dime.

SEO Course for Beginners by Ahrefs

Course provider: Ahrefs

Duration: 2 hours


Instructor(s): Sam Oh

Level: Beginner

Link: SEO Course for Beginners

What you’ll learn

  • The fundamentals of what search engine optimization is and how it works
  • Why SEO is important
  • How to do keyword research
  • How to optimize web pages for search engines
  • Beginner-friendly link building strategies to get backlinks to your site
  • Technical SEO best practices for beginners

This comprehensive course is ours and covers the fundamentals of SEO, including keyword research, on-page SEO, technical SEO, and link building.

SEO Certification Course by HubSpot

Course provider: HubSpot

Duration: 3 hours 51 minutes

Instructor(s): Rachel Sheldon, Matthew Howells-Barby


Level: Beginner

Link: SEO Certification Course

What you’ll learn

  • How to evaluate and improve your website’s SEO
  • How to build backlinks to your website at scale to increase your website’s visibility in organic search
  • How to use insights from keyword research and reporting to improve your search performance

HubSpot’s SEO Training Course is tailored for marketers, content creators, and anyone looking to enhance their website’s visibility. Through practical lessons and real-world examples, the course participants will learn how to build a robust SEO strategy, analyze their website’s performance, and adapt to the changing algorithms of search engines.

Make sure customers find you online by Google Skillshop

Course provider: Google

Duration: 3 hours

Instructor(s): Google

Level: Beginner


Link: Make Sure Customers Find You Online

What you’ll learn

  • How to get started with search
  • How to make search work for you
  • How to get discovered with search
  • How to help people nearby find you online

This free course from Google Skillshop helps businesses discover ways to reach and connect with more customers online. It covers improving SEO and using online advertising (SEM) to boost sales and awareness.

Google SEO Fundamentals by UC Davis on Coursera

Course provider: University of California, Davis

Duration: 28 hours

Instructor(s): Rebekah May

Level: Beginner

Link: Google SEO Fundamentals


What you’ll learn

  • How to complete a competitive analysis on a webpage
  • How to interpret brand recognition through social media
  • How to create sitemaps and robots.txt files, plan redirects, and manage site errors
  • How to use a variety of SEO tools to conduct an audience analysis and develop personas of your ideal buyer

Offered by the University of California, Davis, this course on Coursera delves into the fundamental aspects of SEO, including how search engines work and how to implement effective SEO strategies to attract more organic traffic.

However, due to its length (28 hours), it may not be the most suitable if you want to learn SEO fast.

SEO for Beginners Training by Yoast

Course provider: Yoast

Duration: 2 hours

Instructor(s): Joost de Valk

Level: Beginner

Link: SEO for Beginners Training


What you’ll learn

  • What SEO is and what Google does
  • Tips for quick wins to improve your site
  • Insights into the content and technical side of SEO

This free course discusses what SEO is and how it works. Some of the important points from the course are how to use keywords to optimize your website, how to write content that Google likes, and how to make your website crawlable by search engines.

Keyword Research Course by Ahrefs

Course provider: Ahrefs

Duration: 2 hours

Instructor(s): Sam Oh

Level: Beginner

Link: Keyword Research Course

What you’ll learn

  • How to do keyword research and drive targeted traffic to your website

This is our specialized course that focuses specifically on keyword research. It covers topics such as how to choose keywords, how to analyze search intent, and how to find low-competition keywords.

Technical SEO Course by Ahrefs

Course provider: Ahrefs

Duration: 1 hour 21 minutes


Instructor(s): Sam Oh

Level: Beginner to intermediate

Link: Technical SEO Course

What you’ll learn

  • The fundamentals of technical SEO
  • How to run a technical SEO audit
  • How to optimize your website’s technical SEO

Another specialized course from us, this course is designed for those looking to dive deeper into the technical side of SEO. It covers advanced topics such as site audits, page speed optimization, and how to resolve common technical issues that can impact search rankings.

Technical SEO Certification by Blue Array

Course provider: Blue Array

Duration: 7 hours

Instructor(s): Damion Edwards


Level: Beginner to intermediate

Link: Technical SEO Certification

What you’ll learn

Aimed at professionals seeking to certify their expertise, this course covers a wide range of technical SEO topics, including crawling, indexing, ranking, and on-page optimization. From site architecture to schema markup, it equips learners with the skills to tackle technical challenges and improve website performance.

Local SEO Course by Ahrefs

Course provider: Ahrefs

Duration: 44 minutes

Instructor(s): Sam Oh

Level: Beginner

Link: Local SEO Course


What you’ll learn

  • How to do local SEO
  • How to do local keyword research
  • How to do local link building

Ideal for businesses targeting local customers, this course teaches the basics of optimizing for local search. It covers essential tactics for improving local visibility, such as Google Business Profile optimization and local keyword targeting.

Advanced Link Building Course by Ahrefs

Course provider: Ahrefs

Duration: 1 hour 48 minutes

Instructor(s): Sam Oh

Level: Intermediate to advanced

Link: Advanced Link Building Course

What you’ll learn

  • How to find prospects with the “seed and lookalike” approach
  • How to validate link building campaigns with a “blitz list”
  • How to craft personalized and benefit-rich outreach emails
  • How to create, structure and manage a link building team
  • How to scale your link building operations

Focusing on one of the most challenging aspects of SEO, Sam shares his years of experience creating campaigns, sending outreach emails, and building teams. This is a must-finish course if you need help building and scaling your link building operations.

Final thoughts

The best way to learn SEO is by doing.


So don’t just go through the courses, take notes, and set them aside. You need to actually execute to find out what works and what doesn’t. Create a website, implement the ideas you’re learning, and see if you can grow its organic traffic.

That’s how you become an SEO pro.

Google Answers A Crawl Budget Issue Question

Someone on Reddit posted a question about their “crawl budget” issue and asked if a large number of 301 redirects to 410 error responses were causing Googlebot to exhaust their crawl budget. Google’s John Mueller offered a reason to explain why the Redditor may be experiencing a lackluster crawl pattern and clarified a point about crawl budgets in general.

Crawl Budget

It’s a commonly accepted idea that Google has a crawl budget, an idea that SEOs invented to explain why some sites aren’t crawled enough. The idea is that every site is allotted a set number of crawls, a cap on how much crawling a site qualifies for.

It’s important to understand the background of the idea of the crawl budget because it helps understand what it really is. Google has long insisted that there is no one thing at Google that can be called a crawl budget, although how Google crawls a site can give an impression that there is a cap on crawling.

A top Google engineer (at the time) named Matt Cutts alluded to this fact about the crawl budget in a 2010 interview.

Matt answered a question about a Google crawl budget by first explaining that there was no crawl budget in the way that SEOs conceive of it:


“The first thing is that there isn’t really such thing as an indexation cap. A lot of people were thinking that a domain would only get a certain number of pages indexed, and that’s not really the way that it works.

There is also not a hard limit on our crawl.”

In 2017 Google published a crawl budget explainer that brought together numerous crawling-related facts that together resemble what the SEO community was calling a crawl budget. This new explanation is more precise than the vague catch-all phrase “crawl budget” ever was (Google crawl budget document summarized here by Search Engine Journal).

The short list of the main points about a crawl budget are:

  • A crawl rate is the number of URLs Google can crawl based on the ability of the server to supply the requested URLs.
  • A shared server for example can host tens of thousands of websites, resulting in hundreds of thousands if not millions of URLs. So Google has to crawl servers based on the ability to comply with requests for pages.
  • Pages that are essentially duplicates of others (like faceted navigation) and other low-value pages can waste server resources, limiting the amount of pages that a server can give to Googlebot to crawl.
  • Lightweight pages are easier to crawl, so Google can crawl more of them.
  • Soft 404 pages can cause Google to focus on those low-value pages instead of the pages that matter.
  • Inbound and internal link patterns can help influence which pages get crawled.
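One practical way to reason about the duplicate-page point above is to measure how much of Googlebot’s crawling goes to parameterized (often faceted or duplicate) URLs. The sketch below is hypothetical and heavily simplified: it assumes a toy log format of `URL status user-agent` per line, and real analysis would need proper log parsing and verification that requests really come from Googlebot.

```python
# Hypothetical sketch: estimate what share of Googlebot requests hit
# parameterized URLs (a common source of faceted-navigation duplicates)
# in a simplified access log of "URL status user-agent" lines.
def googlebot_waste_share(log_lines):
    hits = [line.split()[0] for line in log_lines if "Googlebot" in line]
    # Treat any URL with a query string as potentially low-value.
    wasted = sum("?" in url for url in hits)
    return wasted / len(hits) if hits else 0.0
```

If a large fraction of hits go to query-string URLs, that crawling capacity is not being spent on the pages that matter.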

Reddit Question About Crawl Rate

The person on Reddit wanted to know if the perceived low-value pages they were creating were influencing Google’s crawl budget. In short, a request for a non-secure URL of a page that no longer exists redirects to the secure version of the missing webpage, which serves a 410 error response (meaning the page is permanently gone).

It’s a legitimate question.

This is what they asked:

“I’m trying to make Googlebot forget to crawl some very-old non-HTTPS URLs, that are still being crawled after 6 years. And I placed a 410 response, in the HTTPS side, in such very-old URLs.

So Googlebot is finding a 301 redirect (from HTTP to HTTPS), and then a 410.


http://example.com/old-url.php?id=xxxx -301-> https://example.com/old-url.php?id=xxxx (410 response)

Two questions. Is G**** happy with this 301+410?

I’m suffering ‘crawl budget’ issues, and I do not know if this two responses are exhausting Googlebot

Is the 410 effective? I mean, should I return the 410 directly, without a first 301?”

Google’s John Mueller answered:

“G*?

301’s are fine, a 301/410 mix is fine.


Crawl budget is really just a problem for massive sites ( https://developers.google.com/search/docs/crawling-indexing/large-site-managing-crawl-budget ). If you’re seeing issues there, and your site isn’t actually massive, then probably Google just doesn’t see much value in crawling more. That’s not a technical issue.”
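The Redditor’s 301-then-410 chain can be reasoned about with a small offline sketch. The `responses` table below simulates server behavior for each URL (it is made up for illustration); with a live site, you would replace the lookup with real HTTP requests.

```python
# Sketch: follow a redirect chain using a table of (status, location)
# responses per URL, simulating what Googlebot would encounter.
def follow_chain(url, responses, max_hops=5):
    chain = [url]
    for _ in range(max_hops):
        status, location = responses[url]
        if status not in (301, 302, 307, 308) or location is None:
            return chain, status  # chain ends at a non-redirect response
        url = location
        chain.append(url)
    return chain, None  # too many hops (possible redirect loop)
```

For the setup in the question, the chain is two hops: HTTP URL, 301, HTTPS URL, 410. Returning the 410 directly on the HTTP URL would cut the chain to a single response, but as Mueller notes, either way is fine.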

Reasons For Not Getting Crawled Enough

Mueller responded that “probably” Google isn’t seeing the value in crawling more webpages. That means that the webpages could probably use a review to identify why Google might determine that those pages aren’t worth crawling.

Certain popular SEO tactics tend to create low-value webpages that lack originality. For example, a popular SEO practice is to review the top-ranked webpages to understand what factors explain why those pages are ranking, then use that information to improve one’s own pages by replicating what’s working in the search results.

That sounds logical, but it’s not creating something of value. If you think of it as a binary One-and-Zero choice, where Zero is what’s already in the search results and One represents something original and different, the popular SEO tactic of emulating what’s already in the search results is doomed to create another Zero: a website that doesn’t offer anything more than what’s already in the SERPs.

Clearly there are technical issues that can affect the crawl rate such as the server health and other factors.

But in terms of what is understood as a crawl budget, that’s something that Google has long maintained is a consideration for massive sites and not for smaller to medium size websites.


Read the Reddit discussion:

Is G**** happy with 301+410 responses for the same URL?

Featured Image by Shutterstock/ViDI Studio

SEOs, Are You Using These 6 Mental Models?

People use mental models to comprehend reality, solve problems, and make decisions in everyday life. SEO is not an exception here, yet it’s not a topic you often hear about in this industry.

The thing is, you need to be careful with mental models because they’re sneaky. We develop them over our lives, inherit them from colleagues and mentors, and rely on them almost instinctively without being fully aware of their influence or of better alternatives.

So, let’s talk about mental models you will find helpful in your SEO work and the ones you should approach with caution.

3 helpful mental models

In the noisy, uncertain world of SEO, these will be your north star.


First principles thinking is a problem-solving approach that involves breaking down complex problems into their most basic elements and reassembling them from the ground up.

It’s about asking oneself what is absolutely true about a situation and then reasoning up from there to create new solutions.

Using first principles thinking to rearrange the same building blocks into a brand new shape. 

Uncertainty is a chronic condition in SEO, and it is so by design, because the whole industry is built on Google’s secrets. Access to the truth is extremely limited. We’ve become so used to accepting speculation and theories about SEO that we’ve started to crave them.

This is where first principles come in. Whenever you need a brand new solution for a problem, or when you feel that you’ve gone too far into speculation, come back to the first principles — the things that have the best chance of being true in this industry.

The Pareto Principle (aka the 80/20 rule) is about a disproportionate relationship between inputs and outputs, effort and results, or causes and effects. A small number of causes (20%) often leads to a large number of effects (80%).

The Pareto principle
The 80/20 rule: 80% of results come from 20% of the projects.

This concept was named after Vilfredo Pareto, an Italian economist who, in 1906, noticed that 80% of Italy’s land was owned by 20% of the population.

If we use this principle as a mental model in decision-making, we’ll find it easier to prioritize work. It’s ok to ignore some things because they likely won’t matter that much. The result that you’re after will come from focusing on the things that will likely have the biggest impact, and not from spreading yourself too thin.

For example, if you want to build links to your site, pitch your best content. That can be the content that has already proven to earn links in the past.

Best by links report in Ahrefs.

Or if you need to recover some of that lost traffic, home in on the pages that lost the most traffic.

Top pages report in Ahrefs.

The key is to treat the 80/20 as an approximation, a heuristic, and not take the numbers literally. To illustrate, roughly 80% of our site’s traffic comes from no more than 6% of pages.

Total organic traffic breakdown in Ahrefs.

But on the other hand, if we try to find the top 20% of pages that contribute the most traffic, we’ll find that they bring not 80% but 96.8% of the traffic. However you look at it, the idea still holds — a small number of causes leads to a large portion of effects.
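The “smallest share of pages that drives most of the traffic” calculation can be sketched as follows. The traffic numbers in the test are invented for illustration, not Ahrefs data.

```python
# Sketch: given per-page traffic counts, find the smallest fraction of
# pages that accounts for at least `target` (default 80%) of total traffic.
def pareto_share(traffic, target=0.8):
    traffic = sorted(traffic, reverse=True)  # biggest pages first
    total, running = sum(traffic), 0
    for i, t in enumerate(traffic, start=1):
        running += t
        if running >= target * total:
            return i / len(traffic)  # fraction of pages needed
    return 1.0
```

Run over your own per-page traffic export, this gives the site-specific version of the 80/20 split, which, as noted above, is rarely exactly 80/20.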

“It takes all the running you can do, to keep in the same place.”

Sounds very much like SEO already, doesn’t it?

This quote comes from Lewis Carroll’s “Through the Looking-Glass,” and it’s how the Red Queen explains to Alice the nature of her kingdom, where it requires constant effort just to maintain one’s current position.

It was used to name an evolutionary biology theory which posits that each species must adapt and evolve not just for incremental gains but for survival, as their competitors are also evolving. Sorry, we’re in an endless race.

The Red Queen Theory as an endless race.
SEO is like a road with no finish line—the race continues forever.

You can probably already guess how this applies to SEO — rankings. If you want to maintain high rankings, you can’t stop improving your pages. There will always be enough competitors to challenge your position.

But in our world, pressure comes from competitors and the environment. Google keeps evolving too, pushing the bar for content higher, making elements that used to give you an edge a standard.

I’m sure we’ve all been there – even our top backlink-earning, top traffic-generating, most time-consuming content gets pushed down. But if you keep optimizing, you get a chance to come back to the top.

Position history graph in Ahrefs.

This mental model is another way of saying that SEO works best as an always-on strategy without a set end date or final goal.

3 mental models to watch out for

It’s not so much about avoiding them but being able to spot them when they happen or could happen.


A local maximum (aka local optimum) refers to a solution that is the best solution within a neighboring set of solutions, but not necessarily the best possible solution overall (global optimum).
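A toy example makes the trap concrete: greedy hill-climbing on a function with two peaks stops at the nearer, lower one. This is a minimal sketch of the mathematical idea, not an SEO tool; the two-peak function `f` is invented for illustration.

```python
# Greedy hill-climbing: repeatedly move to the best neighboring point.
def hill_climb(f, x, step=1, iters=100):
    for _ in range(iters):
        best = max((x - step, x, x + step), key=f)
        if best == x:
            return x  # no neighbor is better: we are at a local maximum
        x = best
    return x

# Two peaks: a small one at x=2 (height 4) and the global one at x=10
# (height 10). Starting near the small peak, the climber gets stuck on it.
def f(x):
    return max(4 - (x - 2) ** 2, 10 - (x - 10) ** 2)
```

The escape in the sketch (and in SEO) is the same: change the starting point or the move set, i.e., do something structurally different rather than taking another small step.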

Local maxima.

So if you feel that you’re spending immense effort just to make marginal improvements, be willing to assume that you’ve hit a local maximum. Then the question to ask is: what can I do differently?

Here’s an example.

Until November last year, traffic to our site was a series of local optima. Our content marketing was delivering the results, but the growth was relatively slow. Obviously, we were doing the same tried and tested stuff. But then we launched two programmatic SEO projects that instantly elevated us to a level we’d have to work years for — look how fast the yellow line grew (pages) and how that corresponded with the orange line (traffic).

Organic performance graph in Ahrefs.

The sunk cost fallacy is a cognitive bias that occurs when people continue to do something as a result of previously invested resources (time, money, effort) despite new evidence suggesting that the current path will not lead to a beneficial outcome.

Sunk cost fallacy as a graph.
Sunk cost in action: the more you invest in something, the more attached to it you become.

We all know SEO is a long-term game, right? SEO strategies are crowded with long-term projects requiring big investments of time and money. Sometimes, despite the investment, you just can’t get beyond a certain level of traffic, backlinks, etc.

Now, this mental model, this voice in your head, will tell you to keep going down the same path no matter what. Loss aversion kicks in, acting like a defense mechanism for your past self and actions. And the more aggressive and blind the “hustle” culture on a team, the harder it is to see clearly.

But, overall, it could be better for you and the company to let it go and focus on something else. You can even come back to it later with a fresh mind. But continuing something just because you’ve been doing it for some time is a losing strategy.

For example: despite several attempts over a period of years, Ahrefs doesn’t rank for “seo”.

Position history for "seo" via Ahrefs.

Sad but true. And from our point of view, it’s frustrating. Almost like we’re the only ones not to get invited to the party, the only ones not to graduate from high school… you get the idea.

But not ranking for “SEO” hasn’t hindered our growth, so it’s better to cut losses and deal with unfulfilled ambition than to let that goal hold us back from other projects (like that programmatic project mentioned above).

Confirmation bias is the tendency to give more attention and weight to data that support one’s own beliefs, while simultaneously dismissing or underestimating evidence that contradicts those beliefs.

Confirmation bias - beliefs outweigh the facts.

We’re all guilty of this. It’s human nature. And it’s not exclusively a bad thing. I mean, in some situations, this tendency can keep us on “the bright side” and help us go through tough times or keep our motivation up.

So, I think that it’s not something to get out of your system completely. Just be mindful of situations where this can negatively affect your judgment:

  • Selective evidence in ranking factors. You see a page ranking high, and you think it’s because of an aspect you strongly believe in, disregarding all of the evidence against it (e.g., long-form content, social signals).
  • Bias in keyword selection. Your keyword selection runs along the lines of your beliefs about the audience preferences without substantial evidence to back up these beliefs.
  • Bias in strategy development. After developing a new strategy, you encounter a talk or an article advocating a similar approach, which immediately reinforces your confidence in this strategy.
  • Focus on confirmatory data during audits. During a content audit, you find a small piece of data that confirms your belief. As a result, you may prioritize minor findings over more significant but less personally affirming data.
  • Overconfidence in familiar tactics. Leaning on SEO tactics that have worked in the past, you develop a sense of overconfidence in them. You resist trying anything new or the idea that a dip in performance comes from an unfamiliar factor.

Keep learning

If you like what you’re reading, I think you’ll find other mental models fascinating, too.

Want to share models you find useful? Ping me on X or LinkedIn.


