How to Spot SEO Myths: 20 Common SEO Myths, Debunked

There’s a lot of advice going around about SEO.

Some of it is helpful, but some of it will lead you astray if acted on.

The difficulty is knowing which is which.

It can be hard to identify what advice is accurate and based on fact, and what is just regurgitated from misquoted articles or poorly understood Google statements.

SEO myths abound.

You’ll hear them in the strangest places.

A client will tell you with confidence how they are suffering from a duplicate content penalty.

Your boss will chastise you for not keeping your page titles to 60 characters.

Sometimes the myths are obviously fake. Other times they can be harder to detect.

The Dangers of SEO Myths

The issue is that we simply don’t know exactly how the search engines work.

Due to this, a lot of what we do as SEOs ends up being trial and error and educated guesswork.

When you are learning about SEO, it can be difficult to test out all of the claims you are hearing.

That’s when the SEO myths begin to take hold.

Before you know it, you’re proudly telling your line manager that you’re planning to “BERT optimize” your website copy.

SEO myths can be busted a lot of the time with a pause and some consideration.

How, exactly, would Google be able to measure that?

Would that actually benefit the end-user in any way?

There is a danger in SEO of considering the search engines to be omniscient, and because of this, wild myths about how they understand and measure our websites start to grow.

What Is An SEO Myth?

Before we debunk some common SEO myths, we should first understand what forms they take.

Untested Wisdom

Myths in SEO tend to take the form of handed-down wisdom that isn’t tested.

As a result, something that might well have no impact on driving qualified organic traffic to a site gets treated like it matters.

Minor Factors Blown out of Proportion

SEO myths might also be something that has a small impact on organic rankings or conversion but is given too much importance.

This might be a “tick box” exercise that is hailed as a critical factor in SEO success, or an activity that might only edge your site ahead if everything else between you and your competition were truly equal.

Outdated Advice

Myths can arise simply because what used to be effective in helping sites rank and convert well no longer does, yet is still being advised.

It might be that something used to work really well.

Over time the algorithms have grown smarter.

The public is more averse to being marketed to.

Simply put, what was once good advice is now defunct.

Google Being Misunderstood

Many times the start of a myth is Google itself.

Unfortunately, a slightly obscure or less-than-straightforward piece of advice from a Google representative gets misunderstood and taken too far.

Before we know it, a new optimization service is being sold off the back of a flippant comment a Googler made in jest.

Some SEO myths are based in fact; perhaps those are more accurately called SEO legends.

In the case of Google-born myths, it tends to be that the fact has been so distorted by the SEO industry’s interpretation of the statement that it no longer resembles useful information.

When Can Something Appear to Be a Myth?

Sometimes an SEO technique can be written off as a myth by others purely because they have not experienced success from carrying out this activity for their own site.

It is important to remember that every website has its own industry, set of competitors, technology stack, and other factors that make it unique.

Blanket application of techniques to every website and expecting them to have the same outcome is naive.

Someone may not have had success with a technique when they tried it in their highly competitive vertical.

That doesn’t mean it won’t bring success to someone in a less competitive industry.

Causation & Correlation Being Confused

Sometimes SEO myths arise because of an inappropriate connection between an activity that was carried out and a rise in organic search performance.

If an SEO has seen a benefit from something they did, then it is natural that they would advise others to try the same.

Unfortunately, we’re not always great at separating causation and correlation.

Just because rankings or click-through rate increased at around the same time as you implemented a new tactic doesn’t mean the tactic caused the increase.

There could be other factors at play.

Soon an SEO myth arises from an overeager SEO wanting to share what they incorrectly believe to be a golden ticket.

Steering Clear of SEO Myths

Learning to spot SEO myths and act accordingly can save you headaches, lost revenue, and a whole lot of wasted time.

Test

The key to not falling for SEO myths is making sure you can test advice whenever possible.

If you have been given the advice that structuring your page titles a certain way will help your pages rank better for their chosen keywords, then try it with one or two pages first.

This can help you to measure whether making a change across many pages will be worth the time before you commit to doing so.
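
As a rough illustration, if you export clicks and impressions for the affected pages from Google Search Console before and after the change, a few lines of Python can compare click-through rates. This is a minimal sketch; the file name and column layout are assumptions for illustration, not a standard export format.

```python
# Minimal sketch: compare CTR before vs. after a title change, assuming a
# hypothetical CSV (gsc_export.csv) with columns: date, clicks, impressions,
# period (where period is "before" or "after" the change).
import csv
from collections import defaultdict

totals = defaultdict(lambda: {"clicks": 0, "impressions": 0})

with open("gsc_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        totals[row["period"]]["clicks"] += int(row["clicks"])
        totals[row["period"]]["impressions"] += int(row["impressions"])

for period in ("before", "after"):
    t = totals[period]
    ctr = t["clicks"] / t["impressions"] if t["impressions"] else 0.0
    print(f"{period}: {t['clicks']} clicks / {t['impressions']} impressions = {ctr:.2%}")
```

Keep in mind that seasonality, SERP layout changes, and algorithm updates can move these numbers too, so treat the comparison as a clue rather than proof.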

Is Google Just Testing?

Sometimes there will be a big uproar in the SEO community because of changes in the way Google displays or orders search results.

These changes are often tested in the wild before they are rolled out to more search results.

Once a big change has been spotted by one or two SEOs, advice on how to optimize for it begins to spread.

Remember the favicons in the desktop search results?

The upset it caused the SEO industry (and Google users in general) was vast.

Suddenly articles sprang up about the importance of favicons in attracting users to your search result.

Whether favicons would impact click-through rate that much barely had time to be studied.

Because just like that, Google changed it back.

Before you jump on the latest SEO advice being spread around Twitter as a result of a change by Google, wait to see if it is going to hold.

It could be that the advice that appears sound now will quickly become a myth if Google rolls back changes.

20 Common SEO Myths

So now that we know what causes and perpetuates SEO myths, let’s find out the truth behind some of the more common ones.

1. The Google Sandbox

It is a belief held by some SEOs that Google will automatically suppress new websites in the organic search results for a period of time before they are able to rank more freely.

It’s something that many SEOs will argue simply is not the case.

So who is right?

SEOs who have been around for many years will give you anecdotal evidence that both supports and contradicts the idea of a sandbox.

The only guidance Google has given on this appears to be in the form of tweets.

As already discussed, Google’s social media responses can often be misinterpreted.

[Image: tweet about the Google sandbox myth]

Verdict: Officially? It’s a myth.

Unofficially – there does seem to be a period of time whilst Google tries to understand and rank the pages belonging to a new site.

This might mimic a sandbox.

2. Duplicate Content Penalty

This is a myth that I hear a lot. The idea is that if you have content on your website that is duplicated elsewhere on the web, Google will penalize you for it.

The key to understanding what is really going on here is knowing the difference between algorithmic suppression and manual action.

A manual action, which can result in webpages being removed from Google’s index, is applied by a human reviewer at Google.

The website owner will be notified through Google Search Console.

An algorithmic suppression occurs when your page cannot rank well due to it being caught by a filter from an algorithm.

Chuck Price does a great job of explaining the difference between the two in this article that lays out all of the different manual actions available from Google.

Essentially, having copy that is taken from another webpage might mean you can’t outrank that other page.

The search engines may determine the original host of the copy is more relevant to the search query than yours.

As there is no benefit to having both in the search results, yours gets suppressed. This is not a penalty. This is the algorithm doing its job.

There are some content-related manual actions, as covered in Price’s article, but essentially copying one or two pages of someone else’s content is not going to trigger them.

It is, however, potentially going to land you in other trouble if you have no legal right to use that content. It also can detract from the value your website brings to the user.

Verdict: SEO myth

3. PPC Advertising Helps Rankings

This is a common myth. It’s also quite quick to debunk.

The idea is that Google will favor, in the organic search results, websites that spend money with it through pay-per-click advertising.

This is simply false.

Google’s algorithm for ranking organic search results is completely separate from the one used to determine PPC ad placements.

Running a paid search advertising campaign through Google at the same time as carrying out SEO might benefit your site for other reasons, but it won’t directly benefit your ranking.

Verdict: SEO myth

4. Domain Age Is a Ranking Factor

This claim finds itself seated firmly in the “confusing causation and correlation” camp.

Because a website has been around for a long time and is ranking well, age must be a ranking factor.

Google has debunked this myth itself many times.

In fact, as recently as July 2019, Google Webmaster Trends Analyst John Mueller replied to a tweet suggesting that domain age was one of “200 signals of ranking” by saying, “No, domain age helps nothing.”

[Image: tweet from John Mueller answering the domain age question]

The truth behind this myth is that an older website has had more time to do things well.

For instance, a website that has been live and active for 10 years may well have acquired a high volume of relevant backlinks to its key pages.

A website that has been running for less than six months will be unlikely to compete with that.

The older website appears to be ranking better, and the conclusion is that age must be the determining factor.

Verdict: SEO myth

5. Tabbed Content Affects Rankings

This idea is one that has roots going back a long way.

The premise is that Google will not assign as much value to the content that is sitting behind a tab or accordion.

For example, text that is not viewable on the first load of a page.

Google has again debunked this myth as recently as March 31, 2020, but it has been a contentious idea amongst many SEOs for years.

In September 2018, Gary Illyes, Webmaster Trends Analyst at Google, answered a tweet thread about using tabs to display content.

His response:

“AFAIK, nothing’s changed here, Bill: we index the content, its weight is fully considered for ranking, but it might not get bolded in the snippets. It’s another, more technical question how that content is surfaced by the site. Indexing does have limitations.”

If the content is visible in the HTML, there is no reason to assume that it is being devalued just because it is not apparent to the user on the first load of the page.

This is not an example of cloaking, and Google can easily fetch the content.

As long as there is nothing else stopping the text from being viewed by Google, it should be weighted the same as copy that isn’t in tabs.
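
As a minimal illustration of the pattern being discussed, the sketch below keeps both panels in the source HTML and only toggles visibility client-side, so all of the text remains fetchable. The markup structure and function name are hypothetical.

```html
<!-- Both panels ship in the source HTML; the script only toggles the
     "hidden" attribute, so crawlers can fetch all of the text. -->
<div class="tabs">
  <button onclick="showTab('specs')">Specs</button>
  <button onclick="showTab('reviews')">Reviews</button>
  <div class="tab-panel" id="specs">Widget dimensions, materials, and care instructions…</div>
  <div class="tab-panel" id="reviews" hidden>Customer reviews of the purple widget…</div>
</div>
<script>
  function showTab(id) {
    document.querySelectorAll('.tab-panel').forEach(function (panel) {
      panel.hidden = (panel.id !== id);
    });
  }
</script>
```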

Want more clarification on this?

Then check out Roger Montti’s post that puts this myth to bed.

Verdict: SEO myth

6. Google Uses Google Analytics Data in Rankings

This is a common fear amongst business owners.

They study their Google Analytics reports.

They feel their average sitewide bounce rate is too high, or their time on page is too low.

So they worry that Google will perceive their site to be low quality because of that.

They fear they won’t rank well because of it.

The myth is that Google uses the data in your Google Analytics account as part of its ranking algorithm.

It’s a myth that has been around for a long time.

Google’s Gary Illyes has again debunked this idea simply with, “We don’t use *anything* from Google analytics [sic] in the ‘algo’.”

[Image: tweet about Google using Analytics data in the ranking algorithm]

If we think about this logically, using Google Analytics data as a ranking factor would be really hard to police.

For instance, using filters could manipulate data to make it seem like the site was performing in a way that it really isn’t.

What is good performance anyway?

High “time on page” might be good for some long-form content.

Low “time on page” could be understandable for shorter content.

Is either right or wrong?

Google would also need to understand the intricate ways in which each Google Analytics account had been configured.

Some might be excluding all known bots, and others might not.

Some might use custom dimensions and channel groupings, and others might not have configured anything.

Using this data reliably would be extremely complicated to do.

Consider the hundreds of thousands of websites that use other analytics programs.

How would Google treat them?

Verdict: SEO myth

This myth is another case of “correlation, not causation.”

A high sitewide bounce rate might be indicative of a quality problem, or it might not be.

Low time on page could mean your site isn’t engaging, or it could mean your content is quickly digestible.

These metrics give you clues as to why you might not be ranking well; they aren’t the cause of it.

7. Google Cares About Domain Authority

PageRank is a link analysis algorithm used by Google to measure the importance of a webpage.

Google used to display a page’s PageRank score, a number up to 10, on its toolbar.

Google stopped updating the PageRank displayed in toolbars in 2013. In 2016 Google confirmed that the PageRank toolbar metric was not going to be used going forward.

In the absence of PageRank, many other third-party authority scores have been developed.

Commonly known ones are:

  • Moz’s Domain Authority and Page Authority scores.
  • Majestic’s Trust Flow and Citation Flow.
  • Ahrefs’ Domain Rating and URL Rating.

These scores are used by some SEOs to determine the “value” of a page.

That calculation can never be an entirely accurate reflection of how a search engine values a page, however.

SEOs will commonly refer to the ranking power of a website, often in conjunction with its backlink profile.

This too is known as the domain’s authority.

You can see where the confusion lies.

Google representatives have dispelled the notion of a domain authority metric used by them.

Gary Illyes once again debunked the myth with, “we don’t really have ‘overall domain authority’.”

[Image: tweet confirming the overall domain authority myth]

Verdict: SEO myth

8. Longer Content Is Better

You will have definitely heard it said before that longer content ranks better.

More words on a page automatically make yours more rank-worthy than your competitor’s.

This is “wisdom” that is often shared around SEO forums with little evidence to substantiate it.

There are a lot of studies that have been released over the years that state facts about the top-ranking webpages, such as “on average pages in the top 10 positions in the SERPs have over 1,450 words on them.”

It would be quite easy for someone to take this information in isolation and assume it means that pages need approximately 1,500 words to rank on Page 1. That isn’t what the study is saying, however.

Unfortunately, this is an example of correlation, not necessarily causation.

Just because the top-ranking pages in a particular study happened to have more words on them than the pages ranking 11th and lower does not make word count a ranking factor.

John Mueller of Google recently dispelled this myth:

[Image: tweet from John Mueller on the content length myth]

Verdict: SEO myth

9. LSI Keywords Will Help You Rank

What exactly are LSI keywords?

LSI stands for “latent semantic indexing.”

It is a technique used in information retrieval that allows concepts within the text to be analyzed and relationships between them identified.

Words have nuances dependent on their context. The word “right” has a different connotation when paired with “left” than when it is paired with “wrong.”

Humans can quickly gauge concepts in text. It is harder for machines to do so.

The ability for machines to understand the context and linking between entities is fundamental to their understanding of concepts.

LSI is a huge step forward for a machine’s ability to understand text.

What it isn’t is synonyms.

Unfortunately, the concept of LSI has been reduced by the SEO community to the idea that using words that are similar or linked thematically will boost rankings for words that aren’t expressly mentioned in the text.

It’s simply not true. Google has gone far beyond LSI in its understanding of text, for instance, the introduction of BERT.

For more about what LSI is, and more importantly, what it isn’t, take a look at Clark Boyd’s article.

Verdict: SEO myth

10. SEO Takes 3 Months

It helps us get out of sticky conversations with our bosses or clients.

It leaves a lot of wiggle room if you aren’t getting the results you promised.

“SEO takes at least 3 months to have an effect.”

It is fair to say that there are some changes that will take time for the search engine bots to process.

There is then, of course, some time to see if those changes are having a positive or negative effect. Then more time might be needed to refine and tweak your work.

That doesn’t mean that any activity you carry out in the name of SEO is going to have no effect for three months. Day 90 of your work will not be when the ranking changes kick in.

There is a lot more to it.

If you are in a very low competition market, targeting niche terms, you might see ranking changes as soon as Google recrawls your page.

A competitive term could take much longer to see changes in rank.

A study by Ahrefs suggested that of the 2 million keywords they analyzed, the average age of pages ranking in position 10 of Google was 650 days. This study indicates that newer pages struggle to rank high.

However, there is more to SEO than ranking in the top 10 of Google.

For instance, a well-positioned Google My Business listing with great reviews can pay dividends for a company.

It might also be easier for your brand to conquer the SERPs in Bing, Yandex, or Baidu.

A small tweak to a page title could see an improvement in click-through rates. That could be the same day if the search engine were to recrawl the page quickly.

Although it can take a long time to see first page rankings in Google, it is naïve of us to reduce SEO success just down to that.

Therefore, “SEO takes 3 months” simply isn’t accurate.

Verdict: SEO myth

11. Bounce Rate Is a Ranking Factor

Bounce rate is the percentage of visits to your website that result in no interactions beyond landing on the page. It is typically measured by a website’s analytics program such as Google Analytics.

Some SEOs have argued that bounce rate is a ranking factor because it is a measure of quality.

Unfortunately, it is not a good measure of quality.

There are many reasons why a visitor might land on a webpage and leave again without interacting further with the site. They may well have read all the information they needed to on that page and left the site to call the company and book an appointment. In that instance, the visitor bouncing has resulted in a lead for the company.

Although a visitor leaving a page having landed on it could be an indicator of poor quality content, it isn’t always. It, therefore, wouldn’t be reliable enough for a search engine to use as a measure of quality.

“Pogo-sticking,” or a visitor clicking on a search result and then returning to the SERPs, would be a more reliable indicator of the quality of the landing page. It would suggest that the content of the page was not what the user was after, so much so that they have returned to the search results to find another page or re-search.

John Mueller cleared this up in a Google Webmaster Hangout in July 2018 with:

“We try not to use signals like that when it comes to search. So that’s something where there are lots of reasons why users might go back and forth, or look at different things in the search results, or stay just briefly on a page and move back again. I think that’s really hard to refine and say, ‘well, we could turn this into a ranking factor.’”

Verdict: SEO myth

12. It’s All About Backlinks

Backlinks are important; that much sees little contention within the SEO community. However, exactly how important is still debated.

Some SEOs will tell you that backlinks are one of the many tactics that will influence rankings and not the most important. Others will tell you it’s the only real game-changer.

What we do know is that the effectiveness of links has changed over time. Back in the wild pre-Jagger days, link-building consisted of adding a link to your website wherever you could.

Forum comments, spun articles, and irrelevant directories were all good sources of links.

It was easy to build effective links.

It’s not so easy now. Google has continued to make changes to its algorithms that reward higher quality, more relevant links, and disregard or penalize “spammy” links.

However, the power of links to affect rankings is still great.

There will be some industries that are so immature in SEO that a site can rank well without investing in link-building, purely through the strength of their content and technical efficiency.

That’s not the case with most industries.

Relevant backlinks will, of course, help with ranking, but they need to go hand-in-hand with other optimizations.

Your website still needs to have relevant copy, and it must be crawlable.

Google’s John Mueller recently stated, “links are definitely not the most important SEO factor.”

[Image: tweet from John Mueller on links not being the most important SEO factor]

If you want your traffic to actually do something when they hit your website, it’s definitely not all about backlinks.

Ranking is only one part of getting converting visitors to your site. The content and usability of the site are extremely important in user engagement.

Verdict: SEO myth

13. Keywords in URLs Are Very Important

Cram your URLs full of keywords. It’ll help.

Unfortunately, it’s not quite as powerful as that.

[Image: tweet from John Mueller advising to make URLs for users]

John Mueller has said several times that keywords in a URL are a very minor, lightweight ranking signal.

If you are looking to rewrite your URLs to include more keywords, you are likely to do more damage than good.

Redirecting URLs en masse should only be undertaken when necessary, as there is always a risk when restructuring a site.

For the sake of adding keywords to a URL? Not worth it.

Verdict: SEO myth

14. Website Migrations Are All About Redirects

It is something that is heard too often by SEOs. If you are migrating a website, all you need to do is remember to redirect any URLs that are changing.

If only this one was true.

In actuality, website migration is one of the most fraught and complicated procedures in SEO.

A website changing its layout, CMS, domain, and/or content can all be considered a website migration.

In each of those examples, there are several aspects that could affect how the search engines perceive the quality and relevance of the pages to their targeted keywords.

As a result of this, there are numerous checks and configurations that need to occur if the site is going to maintain its rankings and organic traffic.

Ensuring tracking hasn’t been lost. Maintaining the same content targeting. Making sure the search engines’ bots can still access the right pages.

All of this needs to be considered when a website is significantly changing.

Redirecting URLs that are changing is a very important part of website migration. It is in no way the only thing to be concerned about.

Verdict: SEO myth

15. Well-Known Websites Will Always Outrank Unknown Websites

It stands to reason that a larger brand will have resources that smaller brands do not. As a result, more can be invested in SEO.

More exciting content pieces can be created, leading to a higher volume of backlinks acquired. The brand name alone can lend more credence to outreach attempts.

The real question is, does Google algorithmically or manually boost big brands because of their fame?

This one is a bit contentious.

Some people say that Google favors big brands. Google says otherwise.

In 2009, Google released an algorithm update named “Vince.” This update had a huge impact on how brands were treated in the SERPs.

Brands that were well-known offline saw ranking increases for broad competitive keywords.

It’s not necessarily time for smaller brands to throw in the towel.

The Vince update falls very much in line with other Google moves towards valuing authority and quality.

Big brands are often more authoritative on broad-level keywords than smaller contenders.

However, small brands can still win.

Long-tail keyword targeting, niche product lines, and local presence can all make smaller brands more relevant to a search result than established brands.

Yes, the odds are stacked in favor of big brands, but it’s not impossible to outrank them.

Verdict: Not entirely truth or myth

16. Your Page Needs to Include ‘Near Me’ to Rank Well for Local SEO

It’s understandable that this myth is still prevalent.

There is still a lot of focus on keyword search volumes in the SEO industry. Sometimes at the expense of considering user intent and how the search engines understand it.

When a searcher is looking for something with “local intent,” i.e., a place or service relevant to a physical location, the search engines will take this into consideration when returning results.

With Google, you will likely see the Google Maps results as well as the standard organic listings.

The Maps results are clearly centered around the location searched. However, so are the standard organic listings when the search query denotes local intent.

So why do “near me” searches confuse some?

A typical keyword research exercise might yield something like the following:

  • pizza restaurant manhattan – 110 searches per month
  • pizza restaurants in manhattan – 110 searches per month
  • best pizza restaurant manhattan – 90 searches per month
  • best pizza restaurants in manhattan – 90 searches per month
  • best pizza restaurant in manhattan – 90 searches per month
  • pizza restaurants near me – 90,500 searches per month

With search volume like that, you would think “pizza restaurants near me” would be the one to rank for, right?

It is likely, however, that people searching for “pizza restaurant manhattan” are in the Manhattan area or planning to travel there for pizza.

“Pizza restaurants near me” has 90,500 searches per month across the USA. The likelihood is that the vast majority of those searchers are not looking for Manhattan pizzas.

Google knows this and, therefore, will use location detection and serve pizza restaurant results relevant to the searcher’s location.

Therefore, the “near me” element of the search becomes less about the keyword and more about the intent behind the keyword. Google will just consider it to be the location the searcher is in.

So, do you need to include “near me” in your content to rank for those “near me” searches?

No, you need to be relevant to the location the searcher is in.

Verdict: SEO myth

17. Better Content Equals Better Rankings

It’s prevalent in SEO forums and Twitter threads. The common complaint, “my competitor is ranking above me, but I have amazing content, and theirs is terrible.”

The cry is one of indignation. After all, shouldn’t the search engines be rewarding their site for their “amazing” content?

This is both a myth and, sometimes, a delusion.

The quality of content is a subjective consideration. If it is your own content, it’s harder still to be objective.

Perhaps in Google’s eyes, your content isn’t better than your competitors’ for the search terms you are looking to rank for.

Perhaps you don’t meet searcher intent as well as they do.

Maybe you have “over-optimized” your content and reduced its quality.

In some instances, better content will equal better rankings. In others, the technical performance of the site or its lack of local relevance may cause it to rank lower.

Content is one factor within the ranking algorithms.

Verdict: SEO myth

18. You Need to Blog Every Day

This is a frustrating myth because it is one that seems to have spread outside of the SEO industry.

The claim: Google loves frequent content, so you should add new content or tweak existing content every day for “freshness.”

Where did this idea come from?

Google had an algorithm update in 2011 that rewarded fresher results in the SERPs.

This is because, for some queries, the fresher the results, the better likelihood of accuracy.

For instance, search for “royal baby” in the UK in 2013, and you would be served news articles about Prince George. Search it again in 2015, and you would see pages about Princess Charlotte.

In 2018, you would see reports about Prince Louis at the top of the Google SERPs, and in 2019 it would be baby Archie.

If you were to search “royal baby” in 2019, shortly after the birth of Archie, then seeing news articles on Prince George would likely be unhelpful.

In this instance, Google discerns the user’s search intent and decides showing articles related to the newest UK royal baby would be better than showing an article that is arguably more rank-worthy due to authority, etc.

What this algorithm update doesn’t mean is that newer content will always outrank older content. Google decides if the “query deserves freshness” or not.

If it does, then the age of content becomes a more important ranking factor.

This means that if you are creating content purely to make sure it is newer than competitors’ content, you are not necessarily going to benefit.

If the query you are looking to rank for does not deserve freshness, e.g., “Who is Prince William’s second child?”, a fact that will not change, then the age of content will not play a significant part in rankings.

If you are writing content every day thinking it is keeping your website fresh and, therefore, more rank-worthy, then you are likely wasting time.

It would be better to write well-considered, researched, and useful content pieces less frequently, and reserve your resources for making those highly authoritative and shareable.

Verdict: SEO myth

19. You Can Optimize Copy Once & Then It’s Done

The phrase “SEO-optimized copy” is a common one in agency-land.

It’s used as a way to explain the process of creating copy that will be relevant to frequently searched queries.

The trouble with this is that it suggests that once you have written that copy and ensured it adequately answers searchers’ queries, you can move on.

Unfortunately, over time, how searchers look for content might change. The keywords they use and the type of content they want could alter.

The search engines, too, may change what they feel is the most relevant answer to the query. Perhaps the intent behind the keyword is perceived differently.

The layout of the SERPs might alter, meaning videos are being shown at the top of the search results where previously it was just web page results.

If you optimize a page only once and then don’t continue to update it and evolve it with user needs, you risk falling behind.

Verdict: SEO myth

20. There Is a Right Way to Do SEO

This one is probably a myth in many industries, but it seems prevalent in the SEO one. There is a lot of gatekeeping in SEO social media, forums, and chats.

Unfortunately, it’s not that simple.

There are some core tenets that we know about SEO.

Usually, these are things stated by a search engine representative that have been dissected, tested, and ultimately declared true.

The rest is a result of personal and collective trial and error, testing, and experience.

Processes are extremely valuable within SEO business functions, but they have to evolve and be applied appropriately.

Different websites within different industries will respond to changes in ways others would not. Altering a meta title so it is under 60 characters long might help the click-through rate for one page, and not for another.

Ultimately, we have to hold any SEO advice we’re given lightly before deciding whether it is right for the website we are working on.

Verdict: SEO myth

Conclusion

Some myths have their roots in logic, and others have no sense to them.

Now you know what to do when you hear an idea that you can’t say for certain is fact or myth.


Featured Image Credit: Paulo Bobita


The Expert SEO Guide To URL Parameter Handling


In the world of SEO, URL parameters pose a significant problem.

While developers and data analysts may appreciate their utility, these query strings are an SEO headache.

Countless parameter combinations can split a single user intent across thousands of URL variations. This can cause complications for crawling, indexing, and visibility and, ultimately, lead to lower traffic.

The issue is we can’t simply wish them away, which means it’s crucial to master how to manage URL parameters in an SEO-friendly way.

To do so, we will explore what URL parameters are, the SEO issues they cause, how to assess the extent of your parameter problem, and the solutions available to tame them.

What Are URL Parameters?

[Image created by author]

URL parameters, also known as query strings or URI variables, are the portion of a URL that follows the ‘?’ symbol. Each consists of a key and a value pair, separated by an ‘=’ sign. Multiple parameters can be added to a single URL when separated by an ‘&’.
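
As a quick illustration, Python’s standard library can split an example parameterized URL (a made-up address) into its key/value pairs:

```python
# Dissect an example parameterized URL into its key/value pairs.
from urllib.parse import parse_qs, urlparse

url = "https://www.example.com/widgets?colour=purple&sort=price-asc&page=2"
query = urlparse(url).query  # "colour=purple&sort=price-asc&page=2"
print(parse_qs(query))
# {'colour': ['purple'], 'sort': ['price-asc'], 'page': ['2']}
```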

The most common use cases for parameters are:

  • Tracking – For example ?utm_medium=social, ?sessionid=123 or ?affiliateid=abc
  • Reordering – For example ?sort=lowest-price, ?order=highest-rated or ?so=latest
  • Filtering – For example ?type=widget, ?colour=purple or ?price-range=20-50
  • Identifying – For example ?product=small-purple-widget, ?categoryid=124 or ?itemid=24AU
  • Paginating – For example ?page=2, ?p=2 or ?viewItems=10-30
  • Searching – For example ?query=users-query, ?q=users-query or ?search=drop-down-option
  • Translating – For example ?lang=fr or ?language=de

SEO Issues With URL Parameters

1. Parameters Create Duplicate Content

Often, URL parameters make no significant change to the content of a page.

A re-ordered version of the page is often not so different from the original. A page URL with tracking tags or a session ID is identical to the original.

For example, the following URLs would all return a collection of widgets.

  • Static URL: https://www.example.com/widgets
  • Tracking parameter: https://www.example.com/widgets?sessionID=32764
  • Reordering parameter: https://www.example.com/widgets?sort=latest
  • Identifying parameter: https://www.example.com?category=widgets
  • Searching parameter: https://www.example.com/products?search=widget

That’s quite a few URLs for what is effectively the same content – now imagine this over every category on your site. It can really add up.

The challenge is that search engines treat every parameter-based URL as a new page. So, they see multiple variations of the same page, all serving duplicate content and all targeting the same search intent or semantic topic.

While such duplication is unlikely to cause a website to be completely filtered out of the search results, it does lead to keyword cannibalization and could downgrade Google’s view of your overall site quality, as these additional URLs add no real value.

2. Parameters Reduce Crawl Efficacy

Crawling redundant parameter pages distracts Googlebot, reducing your site’s ability to index SEO-relevant pages and increasing server load.

Google sums up this point perfectly.

“Overly complex URLs, especially those containing multiple parameters, can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site.

As a result, Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all the content on your site.”

3. Parameters Split Page Ranking Signals

If you have multiple permutations of the same page content, links and social shares may be coming in on various versions.

This dilutes your ranking signals. When you confuse a crawler, it becomes unsure which of the competing pages to index for the search query.

4. Parameters Make URLs Less Clickable

[Image: parameter-based URL clickability, created by author]

Let’s face it: parameter URLs are unsightly. They’re hard to read. They don’t seem as trustworthy. As such, they are slightly less likely to be clicked.

This may impact page performance. Not only because CTR influences rankings, but also because it’s less clickable in AI chatbots, social media, in emails, when copy-pasted into forums, or anywhere else the full URL may be displayed.

While this may only have a fractional impact on a single page’s amplification, every tweet, like, share, email, link, and mention matters for the domain.

Poor URL readability could contribute to a decrease in brand engagement.

Assess The Extent Of Your Parameter Problem

It’s important to know every parameter used on your website. But chances are your developers don’t keep an up-to-date list.

So how do you find all the parameters that need handling? Or understand how search engines crawl and index such pages? Know the value they bring to users?

Follow these five steps (a short script to tally what your crawler finds follows the list):

  • Run a crawler: With a tool like Screaming Frog, you can search for “?” in the URL.
  • Review your log files: See if Googlebot is crawling parameter-based URLs.
  • Look in the Google Search Console page indexing report: In the samples of indexed pages and relevant non-indexed exclusions, search for ‘?’ in the URL.
  • Search with site: inurl: advanced operators: Know how Google is indexing the parameters you found by putting the key in a site:example.com inurl:key combination query.
  • Look in the Google Analytics All Pages report: Search for “?” to see how each of the parameters you found is used by users. Be sure to check that URL query parameters have not been excluded in the view settings.
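
Once you have a list of URLs from the crawler, a short script can tally which parameter keys appear and how often. This is a rough sketch, assuming a plain-text export with one URL per line (the file name is an assumption):

```python
# Rough sketch: count how often each parameter key appears across crawled URLs,
# assuming a plain-text export (crawled_urls.txt) with one URL per line.
from collections import Counter
from urllib.parse import parse_qsl, urlparse

counts = Counter()
with open("crawled_urls.txt") as f:
    for line in f:
        for key, _value in parse_qsl(urlparse(line.strip()).query):
            counts[key] += 1

for key, n in counts.most_common():
    print(f"{key}: found on {n} URLs")
```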

Armed with this data, you can now decide how to best handle each of your website’s parameters.

SEO Solutions To Tame URL Parameters

You have five tools in your SEO arsenal to deal with URL parameters on a strategic level.

Limit Parameter-based URLs

A simple review of how and why parameters are generated can provide an SEO quick win.

You will often find ways to reduce the number of parameter URLs and thus minimize the negative SEO impact. There are four common issues to begin your review.

1. Eliminate Unnecessary Parameters

[Image: remove unnecessary parameters, created by author]

Ask your developer for a list of all of the website’s parameters and their functions. Chances are, you will discover parameters that no longer perform a valuable function.

For example, users can be better identified by cookies than sessionIDs. Yet the sessionID parameter may still exist on your website as it was used historically.

Or you may discover that a filter in your faceted navigation is rarely applied by your users.

Any parameters caused by technical debt should be eliminated immediately.

2. Prevent Empty Values

[Image: no empty parameter values, created by author]

URL parameters should be added to a URL only when they have a function. Don’t permit parameter keys to be added if the value is blank.

In the above example, key2 and key3 add no value, both literally and figuratively.

3. Use Keys Only Once

[Image: single key usage, created by author]

Avoid applying multiple parameters with the same parameter name and a different value.

For multi-select options, it is better to combine the values after a single key.
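
A minimal sketch of that normalization in Python. Note that the comma separator is a site-level convention you would choose, not a standard:

```python
# Sketch: collapse repeated keys into one comma-separated value, e.g.
# "colour=purple&colour=blue" -> "colour=purple,blue".
from urllib.parse import parse_qsl, urlencode

def combine_repeated_keys(query):
    merged = {}
    for key, value in parse_qsl(query):
        merged.setdefault(key, []).append(value)
    # safe="," keeps the comma readable instead of percent-encoding it.
    return urlencode({k: ",".join(v) for k, v in merged.items()}, safe=",")

print(combine_repeated_keys("colour=purple&colour=blue&sort=latest"))
# -> colour=purple,blue&sort=latest
```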

4. Order URL Parameters

[Image: order URL parameters, created by author]

If the same URL parameters are rearranged, the pages are interpreted by search engines as equal.

As such, parameter order doesn’t matter from a duplicate content perspective. But each of those combinations burns crawl budget and splits ranking signals.

Avoid these issues by asking your developer to write a script to always place parameters in a consistent order, regardless of how the user selected them.

In my opinion, you should start with any translating parameters, followed by identifying, then pagination, then layering on filtering and reordering or search parameters, and finally tracking.
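
Here is a minimal sketch of such a script in Python. The priority buckets follow the order suggested above; the key names are examples only, not a prescribed set:

```python
# Sketch: rewrite a query string into one agreed parameter order so that
# equivalent pages always share a single URL.
from urllib.parse import parse_qsl, urlencode

# Priority buckets: translating, identifying, paginating,
# filtering/reordering/searching, then tracking; unknown keys go last.
PRIORITY = {
    "lang": 0,
    "category": 1, "product": 1,
    "page": 2,
    "colour": 3, "sort": 3, "query": 3,
    "utm_medium": 4, "sessionid": 4,
}

def normalize_query(query):
    pairs = parse_qsl(query)
    pairs.sort(key=lambda kv: (PRIORITY.get(kv[0], 99), kv[0]))
    return urlencode(pairs)

print(normalize_query("utm_medium=social&colour=purple&lang=fr&page=2"))
# -> lang=fr&page=2&colour=purple&utm_medium=social
```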

Pros:

  • Ensures more efficient crawling.
  • Reduces duplicate content issues.
  • Consolidates ranking signals to fewer pages.
  • Suitable for all parameter types.

Cons:

  • Moderate technical implementation time.

Rel=”Canonical” Link Attribute

[Image: rel=canonical for parameter handling, created by author]

The rel=”canonical” link attribute calls out that a page has identical or similar content to another. This encourages search engines to consolidate the ranking signals to the URL specified as canonical.

You can rel=canonical your parameter-based URLs to your SEO-friendly URL for tracking, identifying, or reordering parameters.
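
For example, a reordered version of the widget listing from earlier could reference the clean URL like so:

```html
<!-- Placed in the <head> of https://www.example.com/widgets?sort=latest -->
<link rel="canonical" href="https://www.example.com/widgets" />
```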

But this tactic is not suitable when the parameter page content is not close enough to the canonical, such as pagination, searching, translating, or some filtering parameters.

Pros:

  • Relatively easy technical implementation.
  • Very likely to safeguard against duplicate content issues.
  • Consolidates ranking signals to the canonical URL.

Cons:

  • Wastes crawling on parameter pages.
  • Not suitable for all parameter types.
  • Interpreted by search engines as a strong hint, not a directive.

Meta Robots Noindex Tag

[Image: meta robots noindex tag for parameter handling, created by author]

Set a noindex directive for any parameter-based page that doesn’t add SEO value. This tag will prevent search engines from indexing the page.
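
The tag itself is a single line in the page’s <head>, for example:

```html
<!-- In the <head> of a parameter page that adds no SEO value -->
<meta name="robots" content="noindex" />
```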

URLs with a “noindex” tag are also likely to be crawled less frequently, and if the tag is present for a long time, it will eventually lead Google to nofollow the page’s links.

Pros:

  • Relatively easy technical implementation.
  • Very likely to safeguard against duplicate content issues.
  • Suitable for all parameter types you do not wish to be indexed.
  • Removes existing parameter-based URLs from the index.

Cons:

  • Won’t prevent search engines from crawling URLs, but will encourage them to do so less frequently.
  • Doesn’t consolidate ranking signals.
  • Interpreted by search engines as a strong hint, not a directive.

Robots.txt Disallow

[Image: robots.txt disallow for parameter handling, created by author]

The robots.txt file is what search engines look at first before crawling your site. If they see something is disallowed, they won’t even go there.

You can use this file to block crawler access to every parameter-based URL (with Disallow: /*?*) or only to specific query strings you don’t want to be indexed.
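
For instance, a robots.txt could block every query string while letting a pagination parameter through. This is a sketch to adapt to your own parameters; Google applies the most specific matching rule, so the longer Allow pattern wins for ?page= URLs:

```
User-agent: *
# Block crawling of every parameterized URL...
Disallow: /*?*
# ...but allow pagination URLs such as /widgets?page=2 to be crawled.
Allow: /*?page=
```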

Pros:

  • Simple technical implementation.
  • Allows more efficient crawling.
  • Avoids duplicate content issues.
  • Suitable for all parameter types you do not wish to be crawled.

Cons:

  • Doesn’t consolidate ranking signals.
  • Doesn’t remove existing URLs from the index.

Move From Dynamic To Static URLs

Many people think the optimal way to handle URL parameters is to simply avoid them in the first place.

After all, subfolders surpass parameters in helping Google understand site structure, and static, keyword-based URLs have always been a cornerstone of on-page SEO.

To achieve this, you can use server-side URL rewrites to convert parameters into subfolder URLs.

For example, the URL:

www.example.com/view-product?id=482794

Would become:

www.example.com/widgets/purple
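
One way the server-side piece might look, sketched here as a small Python/Flask handler that permanently redirects the legacy parameter URL to its static path. The route, product ID, and lookup table are illustrative assumptions, not a prescribed implementation:

```python
# Hypothetical Flask sketch of the rewrite: 301-redirect the legacy parameter
# URL to its static equivalent.
from flask import Flask, abort, redirect, request

app = Flask(__name__)

# Assumed mapping from product IDs to static, keyword-based paths.
PRODUCT_PATHS = {"482794": "/widgets/purple"}

@app.route("/view-product")
def view_product():
    path = PRODUCT_PATHS.get(request.args.get("id", ""))
    if path is None:
        abort(404)  # no static equivalent for this ID
    return redirect(path, code=301)  # permanent redirect consolidates signals
```

A real implementation would source the mapping from your product database and ship a 301 for every legacy URL.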

This approach works well for descriptive keyword-based parameters, such as those that identify categories, products, or filters for search engine-relevant attributes. It is also effective for translated content.

But it becomes problematic for non-keyword-relevant elements of faceted navigation, such as an exact price. Having such a filter as a static, indexable URL offers no SEO value.

It’s also an issue for searching parameters, as every user-generated query would create a static page that vies for ranking against the canonical, or, worse, presents low-quality content pages to crawlers whenever a user has searched for an item you don’t offer.

It’s somewhat odd when applied to pagination (although not uncommon due to WordPress), which would give a URL such as

www.example.com/widgets/purple/page2

Very odd for reordering, which would give a URL such as

www.example.com/widgets/purple/lowest-price

And is often not a viable option for tracking. Google Analytics will not acknowledge a static version of the UTM parameter.

More to the point: Replacing dynamic parameters with static URLs for things like pagination, on-site search box results, or sorting does not address duplicate content, crawl budget, or internal link equity dilution.

Having all the combinations of filters from your faceted navigation as indexable URLs often results in thin content issues. Especially if you offer multi-select filters.

Many SEO pros argue it’s possible to provide the same user experience without impacting the URL. For example, by using POST rather than GET requests to modify the page content. Thus, preserving the user experience and avoiding SEO problems.

But stripping out parameters in this manner would remove the possibility for your audience to bookmark or share a link to that specific page, and is obviously not feasible for tracking parameters and not optimal for pagination.

The crux of the matter is that for many websites, completely avoiding parameters is simply not possible if you want to provide the ideal user experience. Nor would it be best practice SEO.

So we are left with this. For parameters that you don’t want to be indexed in search results (paginating, reordering, tracking, etc.), implement them as query strings. For parameters that you do want to be indexed, use static URL paths.

Pros:

  • Shifts crawler focus from parameter-based URLs to static URLs, which have a higher likelihood of ranking.

Cons:

  • Significant investment of development time for URL rewrites and 301 redirects.
  • Doesn’t prevent duplicate content issues.
  • Doesn’t consolidate ranking signals.
  • Not suitable for all parameter types.
  • May lead to thin content issues.
  • Doesn’t always provide a linkable or bookmarkable URL.

Best Practices For URL Parameter Handling For SEO

So which of these five SEO tactics should you implement?

The answer can’t be all of them.

Not only would that create unnecessary complexity, but often, the SEO solutions actively conflict with one another.

For example, if you implement robots.txt disallow, Google would not be able to see any meta noindex tags. You also shouldn’t combine a meta noindex tag with a rel=canonical link attribute.

Google’s John Mueller, Gary Illyes, and Lizzi Sassman couldn’t even decide on an approach. In a Search Off The Record episode, they discussed the challenges that parameters present for crawling.

They even suggested bringing back a parameter handling tool in Google Search Console. Google, if you are reading this, please do bring it back!

What becomes clear is there isn’t one perfect solution. There are occasions when crawling efficiency is more important than consolidating authority signals.

Ultimately, what’s right for your website will depend on your priorities.

[Image: URL parameter handling options, pros and cons, created by author]

Personally, I take the following plan of attack for SEO-friendly parameter handling:

  • Research user intents to understand what parameters should be search engine friendly, static URLs.
  • Implement effective pagination handling using a ?page= parameter.
  • For all remaining parameter-based URLs, block crawling with a robots.txt disallow and add a noindex tag as backup.
  • Double-check that no parameter-based URLs are being submitted in the XML sitemap.

No matter what parameter handling strategy you choose to implement, be sure to document the impact of your efforts on KPIs.


Featured Image: BestForBest/Shutterstock


SEO Experts Gather for a Candid Chat About Search [Podcast]


Wix just celebrated their 100th podcast episode! Congrats, Wix. To quote Mordy Oberstein, Head of SEO Brand at Wix: “we talk a lot.”

You sure do! It’s a good thing you have a lot of interesting stuff to say.

The 100th episode of “SERPs Up” was full of awesome guests. Here’s a summary of the action.

Apart from the usual faces, Oberstein and Crystal Carter, Head of SEO Communications, it was a powerhouse guest list:

  • Chima Mmeje.
  • Darren Shaw.
  • Joy Hawkins.
  • Eli Schwartz.
  • Kevin Indig.
  • Barry Schwartz.

Just How Broken Are The SERPs?

The first guest was Chima Mmeje from Moz. She dove into the frustrations that many SEOs have been feeling and spoke plainly about the flaws in Google’s updates.

Mordy Oberstein: “Is the SERP broken?”

Chima Mmeje: “The helpful content update, and I’m saying this here, live, is a farce. There was nothing helpful about that update. … Yes, the SERP is 1,000% broken. … How does anybody even use Google in the U.S.? … I don’t think they are going to release any update that will fix these issues.”

Mordy Oberstein: “There’s no update. … Plopping Reddit all over the SERP was because they saw the content trends … and they said ‘we don’t have any so we’re just going to throw Reddit there’.”

Chima Mmeje: “It was lazy to have Reddit there … Nobody uses their real names. Anybody can go on Reddit and answer questions and then you see these answers populating in People Also Ask, populating in featured snippets, populating all over the SERPs as correct information. It is dangerous, at worst.”

Crystal Carter: “Do you think that one of the reasons why we’ve seen so much upheaval and so much volatility in the SERPs, which I certainly agree with in the last year … is lots and lots of variables, like lots of new features coming in, so the alignment with Reddit, the AI overviews, the SGE … Do you think it is just too many things being thrown in at the same time and it messing up lots of SERPs as a result? Or do you think it’s something else?”

Chima Mmeje: “… releasing too many features that they did not test properly. Features that were rushed. SGE [testing] did not even last a year, and now they brought in Google AI Overviews. I still don’t understand why we have AI Overviews and featured snippets on the same SERP. I feel like it’s like pick one, make a choice.”

Mordy Oberstein’s next question was about what we can do. “As an SEO, how are you supposed to do this? I’ve heard things from people … Yeah, I don’t know what to do. I can’t produce the kind of results that I’ve always wanted to. Can you still be effective as an SEO in an environment like this?”

Chima Mmeje: “I’m going to be honest, we are suffering … It feels like we are trying our best with what we are seeing … because there is no clear guidance. And to be honest, a lot of us are playing a guessing game right now and that is the best that we can do. It’s all a guessing game based on what we’ve seen one or two variables work. And this is not a long-term strategy. If we’re going to be realistic, it’s not going to work in the long-term. I honestly, I don’t know what the answer is … you’re fighting against Reddit. How do you compete against Reddit? Nobody has figured that out yet.”

Crystal Carter: “Thanks for saying it out loud, Chima.” Crystal was reflecting the sentiment of the commenters, who appreciated her candor and willingness to say: we don’t know, but we’re trying our best.

Mordy Oberstein: “The most honest take I’ve heard on that in quite a long time.”

Mmeje also recounted examples of small website owners and small businesses that have had to shut down. She also talked about the pervasive feeling in the SEO community that there is no rhyme or reason to how the algorithms handle websites and content.

What’s Going On In Local SEO?

The next guests were Darren Shaw from Whitespark and Joy Hawkins, owner of Sterling Sky for a segment called “It’s New.” They talked about new developments in local SEO.

Hawkins talked about a new feature in Google Business Profile.

Joy Hawkins: “… There’s a little services section inside the Google business profile dashboard that’s easy to miss, but you can add anything you want in there. … We’ve done a lot of testing on it and they do impact ranking, but I should clarify, it’s like a small impact. So usually we see it for longer-tailed queries that maybe don’t match a category or things that are not super competitive. … So it is a small ranking factor, but still one that is worth filling out.”

Darren Shaw: “… this is the question that a lot of people ask. We know that if you go into the services section of your Google business profile, Google will suggest predefined services … And so Joy’s original research was focused on those predefined ones and it definitely identified that when you do put those on your profile, you now rank better for those terms depending on how competitive they are, as Joy had mentioned. … There is a place where you can add your own custom services. Have you done any testing around that? Will you rank better with the custom services?”

Joy Hawkins: “Yes. They both work. In custom services … I’m trying to remember the keyword that Colin tested it on. It was something super niche like vampire facials. I was Googling, what the hell is that? … Really, really niche … But he just wanted to know if there was any impact whatsoever and there was. [Custom services fields are a] good way to go after longer tail keywords that don’t have crazy high search volume or aren’t super competitive.”

Darren Shaw: “You want to make sure that you’re telling Google what you do … that’s basically what the services section provides. And it’s not a huge ranking factor, but it’s just another step in the local optimization process. … a tip for custom services because custom services often get pulled into the local results as justifications. It’ll say this business provides vampire facials, right? Well, did you know there’s a vampire emoji? So if you put the vampire emoji in the title … Then in the local results you’ll see a whole panel of businesses that all provide that service, but yours has that little vampire emoji which will draw people in.”

There was tons more in this section, including questions from the audience and some great jokes.

The Obligatory AI Section

Eli Schwartz and Kevin Indig were next up to talk about AI. Oberstein, professional rabble-rouser, tried to get them to argue, but despite their very different posting habits, they found a lot to agree on about AI.

Mordy Oberstein: “It wouldn’t be an SEO podcast if we didn’t talk about AI. Where do we currently stand with AI? What can it do? What can’t it do?”

Kevin Indig: “… We’re at a stage where AI basically has the capability to create content, analyze some basic data. It still hallucinates here and there and it still makes mistakes. … If you compare that to when this AI hype started in November, 2022, so it’s almost two years now and we’ve come a really long way, these models are getting exponentially better. … It means different things based on whether you look at it as a tool for yourself to make your work more efficient. And of course, what does it mean from an SEO perspective? How does it change search, not just Google, but also how people search. And I think these are all different questions that are exciting to dive into. … So there is a lot of objective data that indicates efficiencies and benefits from AI. There’s also a lot of hype that promises a little too much about what AI can do. And so I’m generally AI bullish, but I’m not in the camp of AI is going to replace us all the next two years.”

Mordy Oberstein: “I’m setting the stage here a little bit because while your LinkedIn posts are generally pro-AI, a lot of Eli’s posts are a little more skeptical about AI. So Eli, what do you think about what Kevin just said? By the way, I’m like, for those who are listening or watching this, I’m pitting them against each other. They’re friends and they do a podcast together. So it’s cool.”

Eli Schwartz: “I think AI is great. I think that there’s a lot of great things you can get out of AI. You can, again, like Kevin said, it can be your thought partner. … I’m anti-AI in the way people are using it. And I don’t think people have necessarily changed their behaviors because before … they outsourced [content] on Fiverr and Upwork and they bought very cheap content, and now they’re getting very free content. So then that’s coming from AI. That behavior hasn’t really changed. The challenge is that now there are more people that think they can copy them.

So I talk to CMOs all the time who are like, well, I just let go of my SEO team. A big company reached out to me recently. They wanted to gut check themselves after they already fired their SEO team. So I can’t really help there, but they’re like, AI can do everything. … Well, I’ll see them in a year from now when they have whatever sort of penalty. AI is a very powerful tool. Any tool, like a drill, is a very powerful tool. But if you just hold it in the air and just let it go, it’s going to make holes. But if you use it appropriately, it does the thing it’s supposed to do. … We’re humans and we buy stuff, and it has to come to a point where humans are talking to humans.”

Crystal Carter: “… Most of the gains are coming from productivity. The stuff like Kevin was talking about with being able to write product descriptions more quickly, being able to write lots of posts more quickly and being able to finish your things more quickly, brainstorm, et cetera, in terms of the quality, the quality is still not there. It’s getting there rapidly, but it’s still not there.”

There was lots more AI talk, so you should listen to the whole episode if you want to hear the full range of opinions.

Snappy News About The Google August Update

“The Snappy News” segment featured Barry Schwartz, Contributing Editor to Search Engine Land. It also featured the dreaded SEO phrase “it depends.”

Mordy Oberstein: “So the article of the day is from Search Engine Land, basically written by Barry, that the core update, the August 2024 core update, is done. It is complete. … The issue is with Google folks who are trying to figure out, will they see a reversal of their fortunes from the 2023 helpful content update, the September 2023 helpful content update. It’s a mouthful, to be honest with you. And my question for you, since you’re here, did that happen? Was the August update a reversal?”

Barry Schwartz: “It depends on the site. I think the number, I don’t have the exact data, obviously I don’t think anybody does, but I’ve seen examples of some very few sites see complete reversals. … There are a number of sites that saw maybe a 20% bump, a 30% bump, maybe a 5% bump. But very few sites saw a complete reversal, if you want to even call it that. … I’ve been through a lot of Google updates over the years, and it’s sometimes sad to see the stories, but at the same time, if you keep at it and you are true to the content, your audience, generally, you’ll do well in the long run. Not every site, there’s plenty of sites that have been hit, went out of business, and they couldn’t come back. That’s business in general. And things change, like seasonalities and times change. You’re writing about the railroad business a hundred years ago and you keep writing about it today. There’s not many people investing a lot of money in railroads these days. So I dunno, it’s, it’s hard to read those stories, but not everybody deserves to go back to where they were. And then at the same time, Google’s not perfect either, which is why they keep on releasing new updates.”

That’s a wrap!

If you haven’t experienced a SERPs Up episode before, you should absolutely take a listen to experience the full effect of Mordy and Crystal’s banter.

The SERPs Up podcast is brought to you by Wix Studio.


OpenAI Claims New “o1” Model Can Reason Like A Human


OpenAI has unveiled its latest language model, “o1,” touting advancements in complex reasoning capabilities.

In an announcement, the company claimed its new o1 model can match human performance on math, programming, and scientific knowledge tests.

However, the true impact remains speculative.

Extraordinary Claims

According to OpenAI, o1 can score in the 89th percentile on competitive programming challenges hosted by Codeforces.

The company insists its model can perform at a level that would place it among the top 500 students nationally on the elite American Invitational Mathematics Examination (AIME).

Further, OpenAI states that o1 exceeds the average performance of human subject matter experts holding PhD credentials on a combined physics, chemistry, and biology benchmark exam.

These are extraordinary claims, and it’s important to remain skeptical until we see open scrutiny and real-world testing.

Reinforcement Learning

The purported breakthrough is o1’s reinforcement learning process, designed to teach the model to break down complex problems using an approach called the “chain of thought.”

By simulating human-like step-by-step logic, correcting mistakes, and adjusting strategies before outputting a final answer, OpenAI contends that o1 has developed superior reasoning skills compared to standard language models.

Implications

It’s unclear how o1’s claimed reasoning could enhance understanding of queries—or generation of responses—across math, coding, science, and other technical topics.

From an SEO perspective, anything that improves content interpretation and the ability to answer queries directly could be impactful. However, it’s wise to be cautious until we see objective third-party testing.

OpenAI must move beyond benchmark browbeating and provide objective, reproducible evidence to support its claims. Adding o1’s capabilities to ChatGPT in planned real-world pilots should help showcase realistic use cases.


Featured Image: JarTee/Shutterstock

