Google Ranking Factors: The 3 That Really Matter
If only SEO were as simple as having a list of ranking factors that Google applies to its algorithm.
If only SEO were as simple as Google having one algorithm.
If only every niche and location were ranked in the same way.
Long gone are the days when search engines operated on a basic level, when keyword stuffing and sheer link volume were direct factors that obviously impacted ranking. Oh, and there was only one algorithm to worry about.
Over the last 25 years, SEO has become an increasingly complex and nuanced discipline.
Ranking factors differ by vertical and keyword. Your Money or Your Life (YMYL) queries are ranked differently from ecommerce transactional queries, and local search is different again.
There is only one certainty with SEO: the more you know, the more you realize you don’t know.
There isn’t an official Google ranking factors blueprint or checklist that you can follow. But what we do know is that there are some factors or signals that Google considers important for ranking pages.
Download our Ranking Factors for 2023 ebook here.
The “Google 200 Ranking Factors” Myth
Before we list the important factors and signals for ranking, we need to talk about the mythical list of 200 ranking factors that Google allegedly uses.
Do a Google search for “ranking factor,” and you will see in the search engine results pages (SERP) plenty of titles that mention “200 ranking factors” from some well-known blogs.
Most likely, the number 200 originated as a PR attempt by Google to portray its algorithm as complex and multi-factored. And then it stuck. The only known citation of “200” is from a speech by Matt Cutts at PubCon in 2009.
As we said above, Google and its ranking systems have evolved exponentially over the last 25 years, to the point where there are hundreds (maybe thousands) of factors and machine learning overlays.
What Yandex Revealed About Ranking Factors
The Yandex ranking factors leak of January 2023 revealed that Yandex uses roughly 690 ranking factors.
At the time, this was an insight into how a major search engine applied factors and signals for ranking.
In a direct conversation, Dan Taylor, an expert on Russian search engines, said both Yandex and Google share a number of similarities in how they try to index and rank websites: “They both have the same data points to work with; on-page content, links, meta-data, mobile-friendliness, and user interactions such as SERP clicks and user behaviour.”
He went on to say: “Both search engines also make use of AI for parts of their ranking systems (such as Vega), but have differences in how they weight certain signals, such as backlinks and users clicking on results in the SERPs, and some of these are more easily manipulated than others in comparison to Google.”
Taylor thinks, in theory, that pages can be optimized for both search engines in the same way without compromising on performance. That would mean the Yandex leak could offer insights into ranking on Google.
Factors, Systems, And Signals
Whenever Google documentation is updated – or Gary Illyes, John Mueller, or Danny Sullivan make a comment – SEO professionals obsess over the meaning.
This is an issue for Google and for the SEO industry at large, because SEO pros are often looking way too deeply at the wrong thing and losing focus on what really matters. Nothing seems to be held to more scrutiny than ranking factors.
SEO professionals are becoming fixated on the semantic differences between factors, systems, and signals.
When documentation was updated to remove page experience from the Systems page, Google was forced to put out this statement on X (Twitter): “Ranking *systems* are different than ranking *signals* (systems typically make use of signals). We had some things listed on that page relating to page experience as “systems” that were actually signals. They shouldn’t have been on the page about systems. Taking them off didn’t mean we no longer consider aspects of page experience. It just meant these weren’t ranking *systems* but instead signals used by other systems.”
As it turns out, page experience is still a ranking factor (see below).
If we dig into the semantics, Google has two official pages that relate to ranking factors:
A guide to Google Search ranking systems:
“Google uses automated ranking systems that look at many factors and signals about hundreds of billions of web pages and other content in our Search index to present the most relevant, useful results, all in a fraction of a second.”
How Search Works:
“To give you the most useful information, Search algorithms look at many factors and signals, including the words of your query, relevance and usability of pages, expertise of sources, and your location and settings. The weight applied to each factor varies depending on the nature of your query.”
Gary Illyes covered the differences between factors, signals, and systems during an Ask Me Anything session at PubCon (September 2023), where he said, “The main difference is just language.”
The easiest way to distinguish between a system and a signal is this: Google’s ranking systems can be thought of as the machine learning layers applied to refine search results, while ranking signals are the inputs that influence those systems and the rankings they produce.
In a direct message, SEO expert Ammon Johns clarified this by saying: “Not all things that are classed as signals will be used in any one system. Many things that Google classifies as signals may not be applicable to a particular query, or may be weighted differently to that of another query. For example, even Google’s most famous of all signals, PageRank, isn’t used in Local Search at all.”
The Google “How Search Works” page talks about “key factors that help determine which results are returned for your query.”
On this page, the main factors are summarized as:
- Meaning.
- Relevance.
- Quality.
- Usability.
- Context.
If you understand the fundamental approach that Google takes, then the semantics become an unimportant distraction. Following a common-sense approach focused on the end goal for the end user is a much more effective long-term strategy.
Basically, Google is driven by wanting to provide the best search results it can so that it has a market-leading product. It’s a business. Once you understand this, you understand the fundamental concept of SEO.
With all that said, here are the fundamental ranking factors that should all be considered for SERP visibility.
The 3 Ranking Factors That Every SEO Pro Should Focus On
1. High-Quality Content
The first stage of ranking is to understand the user’s query.
The second stage is to match the query to the content on a page.
From How Search Works: “Our systems analyze the content to assess whether it contains information that might be relevant to what you are looking for.”
As long as your site is technically sound enough to be crawled and rendered, quality content continues to be the number one ranking factor.
Content is key not just for ranking, but also for user experience and conversion.
Gary Illyes from Google summarizes this by saying: “Without content it literally is not possible to rank. If you don’t have words on page you’re not going to rank for it. Every site will have something different as the top 2 or 3 ranking factors.”
The internet is literally built from pages of content.
But what is high-quality content? In short, it can best be defined as content that follows E-E-A-T signals and demonstrates:
- Experience.
- Expertise.
- Authoritativeness.
- Trustworthiness.
Read more about E-E-A-T below.
An integral part of content is the keywords and words on the page. There are theories circulating that keywords are now obsolete and no longer needed to rank. But, on a fundamental level, keywords do still matter.
As Google says, “The most basic signal that information is relevant is when content contains the same keywords as your search query. For example, with web pages, if those keywords appear on the page, or if they appear in the headings or body of the text, the information might be more relevant.”
A page must clearly identify what it is about, avoiding any ambiguity, in order to be ranked.
Pedro Dias, a former Googler, explained in a direct conversation: “It’s not that original ranking factors like keywords are obsolete, they are the cornerstone on which we build. It’s just as important as always that these fundamentals are applied and done well.”
Pedro went on to say: “Google has introduced machine learning that is applied on top of the foundations so that they can provide results that take into account far more nuanced intents for queries.”
Google is striving to always surface the best results, so machine learning systems have been developed as part of the move towards parsing natural language queries. Google can now understand the difference between “cheat” as a disingenuous person and “cheat” as a way to game a system (as in a cheat code) – an example Pedro highlighted that Gary Illyes once used.
We can’t mention content and keywords without talking about entities, which Google is using to better understand topics. This article explains in depth why it’s essential to understand entities in SEO.
As explained by Ammon Johns: “Search engines have placed more emphasis on semantic search and entities. For the simplest kind of example, search for ‘History of Munchen’ and not only will Google understand the misspelling of MÜNCHEN, but it will almost certainly mostly show results with the more popular ‘Munich’ keyword in the titles and snippets.”
The systems that have the most impact on ranking content are as follows:
Helpful Content System
Launched in 2022, Google’s helpful content system is focused on providing the best content to the user.
Google’s motivation is for content to demonstrate real-world experience, which circles back to providing the best experience for the reader: “more content by people, for people.”
The system is being updated constantly, and in 2023, we have been through several iterations of updates.
Google states: “The helpful content system aims to better reward content where visitors feel they’ve had a satisfying experience, while content that doesn’t meet a visitor’s expectations won’t perform as well.”
A few of the guidelines for helpful content, which all underline E-E-A-T, include:
- Don’t stray from your main topic.
- Demonstrate first-hand experience.
- Don’t combine multiple topics on one site.
RankBrain
Launched in 2015, RankBrain is one of Google’s machine learning systems that can connect words to concepts and helps Google understand the intent of a search query.
This is part of the rank refining where Google will try to return the most relevant results to a query. It also allows Google to return results for queries with no previous record of searches.
Before RankBrain, Google didn’t understand synonyms and would return literal interpretations of a word. From Google: “…before we had advanced AI, our systems simply looked for matching words. For example, if you searched for “pziza” – unless there was a page with that particular misspelling, you’d likely have to redo the search with the correct spelling to find a slice near you…Now, with advanced machine learning, our systems can more intuitively recognize if a word doesn’t look right and suggest a possible correction.”
BERT
In 2019, BERT created waves in the SEO industry as a significant update for Google that was reported to impact about 10% of search queries at the time.
The system understands how combinations of words can have different meanings, especially stop words. This makes even so-called stop words relevant in search when they contribute to the meaning of a query.
From Google: “BERT was a huge step change in natural language understanding, helping us understand how combinations of words express different meanings and intents.”
Multitask Unified Model (MUM)
In 2021, at Google IO, MUM was announced as a system to take things a step further by being multimodal, which allows it to take information from text, images, and possibly video.
MUM is not applied as a ranking system across all verticals, as Google said: “While we’re still in the early days of tapping into MUM’s potential, we’ve already used it to improve searches for COVID-19 vaccine information.”
It would appear that the main application is going to be for search that can contain text and images in Google Lens.
Google states: “As we introduce more MUM-powered experiences to Search, we’ll begin to shift from advanced language understanding to a more nuanced understanding of information about the world… MUM is capable of both understanding and generating language.”
Content Freshness
Caffeine was introduced in 2010 and was a move away from refreshing the entire index every few weeks. Google’s stated purpose for Caffeine was to “analyze the web in small portions and update our search index on a continuous basis, globally.”
As the internet was rapidly expanding, in 2011, Google built on top of Caffeine and introduced “Freshness” by announcing: “…today we’re making a significant improvement to our ranking algorithm that impacts roughly 35 percent of searches and better determines when to give you more up-to-date relevant results for these varying degrees of freshness.”
Content freshness is not applied across all searches. It’s query-dependent and more critical for some niches and queries. For example, think breaking news results, weather, or stock prices.
Most content will see some level of decay over time in search results if it isn’t updated. Ideas, concepts, products, and information are all constantly evolving, and users’ changing expectations are aligned with that.
Personalization & Locality
Although not concerned with quality of content, it’s worth mentioning here that on top of all the other rank refining is a layer of personalization, which takes into consideration user search history and user location.
For example, queries such as “best coffee shop” are considered location-dependent and will deliver a map of results based on your location. Some product queries are served by location to surface local suppliers.
Results for the same query can differ on each device, and knowing the motivation a user might have at a certain stage in their journey makes a difference in what results should be served in the SERPs.
As an example, the query “London Zoo” serves desktop results with an emphasis on research with video and image carousels, while the mobile SERP has a focus on tickets, directions, and location.
As John Mueller said: “If you’re searching on your phone then maybe you want more local information because you’re on the go. Whereas if you’re searching on a desktop maybe you want more images or more videos shown in the search results.”
When you do keyword research and create content, it is important to understand how personalization and locality will impact ranking and take this into consideration in your strategy.
E-E-A-T Is Not A Ranking Factor, But Is Important
Again, not a direct system for ranking, but Experience, Expertise, Authoritativeness, and Trustworthiness – E-E-A-T – is a critically important SEO concept that all content creators must take into account.
Google’s Search Quality Raters Guidelines used to be a closely guarded document at Google that was eventually leaked online. Google now openly publishes the document as an example of what its Quality Raters are looking for when they manually review websites.
E-E-A-T is part of the Google Search Quality Raters Guidelines – not so much a ranking factor, but a guideline.
E-E-A-T is made up of a series of refining signals that underline everything that Google has been trying to achieve with better user experience and fighting misinformation.
The concept is important for all niches, but especially for anyone in YMYL niches, such as finance or health, where the results can really impact the user’s life in a significant way.
As mentioned above, quality content is a critical ranking factor, and there is no better blueprint to tell you how to achieve that than the E-E-A-T guidelines. Building a credible reputation as an expert within a field supports Google’s aim and provides a good user experience.
2. Page Experience
Page experience caused ripples in the community when it was removed from Google’s ranking systems page, which forced the Search Liaison team to say: “…As our guidance on page experience says in the first sentence: ‘Google’s core ranking systems look to reward content that provides a good page experience.’”
Page experience rolled out in 2021. Prior to this, Core Web Vitals (CWVs) had been emphasized as an important ranking factor.
CWVs then became part of something bigger: a collective group of ‘signals’ that make up page experience – essentially still a ranking factor, but now part of a group of factors known as ‘Page Experience.’
To understand why this matters is to understand everything that Google wants to achieve.
Google wants to deliver a good user experience. It does not reflect well on its product if it serves pages that take too long to load, don’t load well on certain devices, or are obscured by large ads that obstruct users from getting to the page.
Google says: “Google’s core ranking systems look to reward content that provides a good page experience.”
Page Experience is focused on four main signals:
- HTTPS.
- Page Speed.
- Mobile Friendliness.
- Core Web Vitals.
Page experience is important, but not the most critical factor. In some circumstances, it’s not applied to ranking but is more critical when there are two pages vying for position.
As John Mueller explains: “If all of the content is very similar in the search results page, then probably using Page Experience helps a little bit to understand which of these are fast pages or reasonable pages with regards to the user experience and which of these are kind of the less reasonable pages to show in the search results.”
Google wants to deliver the best product on the market, and this is a critical part of SEO that has been overlooked. Focusing on Google’s motivation and working with this will get you better results for ranking than anything else.
3. Links
Ranking factors and links go hand in hand.
Since Google first launched, SEO professionals have been using links to manipulate rankings. And Google has been fighting link spam to try and improve its results.
Many SEO professionals think that links are being deprecated as a ranking factor. In a 2022 poll by Marie Haynes, 44% of SEOs pros who responded thought that link building was less effective now compared to a few years ago.
If we start by looking at why links have been important historically: in Google founders Sergey Brin and Lawrence Page’s famous Stanford paper, links were given prominence as one of the main ranking factors, in a system that echoed the citations given to academic papers.
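For reference, the simplified PageRank formula from that paper defines a page’s score recursively: PR(A) = (1 − d) + d × (PR(T1)/C(T1) + … + PR(Tn)/C(Tn)), where T1…Tn are the pages linking to page A, C(T) is the count of outbound links on page T, and d is a damping factor (typically set to 0.85 in the paper). In plain terms, a link passes more value when it comes from a page that is itself well linked-to and that has few other outbound links.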
In the early days of Google, links quickly became the most leveraged spam technique for ranking. It took Google until 2012 and the legendary Penguin update to wipe out low-quality links, and it has been trying to downgrade the importance of links since this time.
Yet, the first time a Google representative said online that links were a ranking factor was in 2016. In a Q&A with Ammon Johns and others, Google Search Quality Senior Strategist Andrey Lipattsev said the top 3 ranking signals are “Content, Links, RankBrain.”
Skip forward to 2023: in an AMA at PubCon, Gary Illyes contradicted this, saying that links are not a “top 3” ranking signal and haven’t been “for some time…there really isn’t a universal top 3.” Illyes went on to say, “It’s absolutely possible to rank without links.”
It’s worth considering that there are many reasons why Google would downplay the importance of links, such as to reduce link spam. Google is not going to outright claim that links are a surefire ranking factor if they can be so easily manipulated. Yes, it might be technically possible to rank without links, but more often, links do improve ranking.
In a direct message conversation, Ammon said in response to his 2016 video: “When Andrey Lipattsev responded with ‘Content, Links and RankBrain’, he was saying what matters is on-page, off-page, and how Google processes a query – which is something anyone should have already known. On that basis, no matter what Gary Illyes has said since, those are indeed the three essential factors still today.”
Apart from the flow of PageRank, one of the reasons that links are important is that Google typically finds pages by crawling, and it traverses pages via links.
This is why a page with no inbound or internal links can be difficult to rank: it can’t be found by Google via links in order to be crawled and indexed. The potential absence of links highlights the importance of submitting a sitemap, which tells Google which pages you want indexed.
Internal linking not only helps Google crawl and index all linked pages on your site – it also helps to interlink topic clusters, which is a valuable SEO content strategy.
What is important is that not all links are equal, and Google focuses on the quality of an individual link, not the volume of links.
John Mueller said: “The number of links may have been an important factor during the early days of PageRank, but Google prioritizes more helpful metrics to evaluate links today.”
Links do not have the same impact as they did in the early days, when it was possible to rank with a high volume of low-quality inbound links. Today, the relevance and quality of a link matter.
Good quality links do still have an influence on ranking, and a lot of SEO professionals would say they do still count.
At this point, we can confidently say that internal links and inbound links are still considered a ranking factor.
Google Ranking Factors Takeaway
The main thing to take away from this article is that ranking and SERP visibility are not a straightforward matter of applying a list of known ranking factors.
It’s one of the reasons why this industry is such an exciting and challenging space to work in.
All that said, although there is no clear set of Google ranking factors that you can follow, there are a number of factors and signals that are important to get right to achieve the best ranking you can.
Start by really understanding Google’s motivation and how it works. Then, you can start to understand how to shape your approach to content and SEO strategy in order to rank.
If you want to read more about ranking factors, with a focus on facts rather than speculation, then download a copy of the Ranking Factors 2023 ebook.
In researching this article, the author spoke directly to Pedro Dias (former Google employee), Ammon Johns (SEO Pioneer), and Dan Taylor (Russian search engine and technical SEO expert). Many thanks to them for their input and expertise.
The Expert SEO Guide To URL Parameter Handling
In the world of SEO, URL parameters pose a significant problem.
While developers and data analysts may appreciate their utility, these query strings are an SEO headache.
Countless parameter combinations can split a single user intent across thousands of URL variations. This can cause complications for crawling, indexing, and visibility and, ultimately, lead to lower traffic.
The issue is we can’t simply wish them away, which means it’s crucial to master how to manage URL parameters in an SEO-friendly way.
What Are URL Parameters?
URL parameters, also known as query strings or URI variables, are the portion of a URL that follows the ‘?’ symbol. They are comprised of a key and a value pair, separated by an ‘=’ sign. Multiple parameters can be added to a single page when separated by an ‘&’.
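To make that anatomy concrete, here is a minimal Python sketch (using only the standard library; the example URL and its parameter keys are hypothetical) that splits a URL into its key and value pairs:

```python
from urllib.parse import urlparse, parse_qs

# A hypothetical URL with three parameters, separated by "&".
url = "https://www.example.com/widgets?colour=purple&sort=lowest-price&page=2"

query = urlparse(url).query  # "colour=purple&sort=lowest-price&page=2"
params = parse_qs(query)     # each key maps to a list of its values

print(params)
# {'colour': ['purple'], 'sort': ['lowest-price'], 'page': ['2']}
```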
The most common use cases for parameters are:
- Tracking – For example, ?utm_medium=social, ?sessionid=123 or ?affiliateid=abc
- Reordering – For example, ?sort=lowest-price, ?order=highest-rated or ?so=latest
- Filtering – For example, ?type=widget, ?colour=purple or ?price-range=20-50
- Identifying – For example, ?product=small-purple-widget, ?categoryid=124 or ?itemid=24AU
- Paginating – For example, ?page=2, ?p=2 or ?viewItems=10-30
- Searching – For example, ?query=users-query, ?q=users-query or ?search=drop-down-option
- Translating – For example, ?lang=fr or ?language=de
SEO Issues With URL Parameters
1. Parameters Create Duplicate Content
Often, URL parameters make no significant change to the content of a page.
A re-ordered version of the page is often not so different from the original. A page URL with tracking tags or a session ID is identical to the original.
For example, the following URLs would all return a collection of widgets.
- Static URL: https://www.example.com/widgets
- Tracking parameter: https://www.example.com/widgets?sessionID=32764
- Reordering parameter: https://www.example.com/widgets?sort=latest
- Identifying parameter: https://www.example.com?category=widgets
- Searching parameter: https://www.example.com/products?search=widget
That’s quite a few URLs for what is effectively the same content – now imagine this over every category on your site. It can really add up.
The challenge is that search engines treat every parameter-based URL as a new page. So, they see multiple variations of the same page, all serving duplicate content and all targeting the same search intent or semantic topic.
While such duplication is unlikely to cause a website to be completely filtered out of the search results, it does lead to keyword cannibalization and could downgrade Google’s view of your overall site quality, as these additional URLs add no real value.
2. Parameters Reduce Crawl Efficacy
Crawling redundant parameter pages distracts Googlebot, reducing your site’s ability to index SEO-relevant pages and increasing server load.
Google sums up this point perfectly.
“Overly complex URLs, especially those containing multiple parameters, can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site.
As a result, Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all the content on your site.”
3. Parameters Split Page Ranking Signals
If you have multiple permutations of the same page content, links and social shares may be coming in on various versions.
This dilutes your ranking signals. When you confuse a crawler, it becomes unsure which of the competing pages to index for the search query.
4. Parameters Make URLs Less Clickable
Let’s face it: parameter URLs are unsightly. They’re hard to read. They don’t seem as trustworthy. As such, they are slightly less likely to be clicked.
This may impact page performance. Not only because CTR influences rankings, but also because it’s less clickable in AI chatbots, social media, in emails, when copy-pasted into forums, or anywhere else the full URL may be displayed.
While this may only have a fractional impact on a single page’s amplification, every tweet, like, share, email, link, and mention matters for the domain.
Poor URL readability could contribute to a decrease in brand engagement.
Assess The Extent Of Your Parameter Problem
It’s important to know every parameter used on your website. But chances are your developers don’t keep an up-to-date list.
So how do you find all the parameters that need handling? Or understand how search engines crawl and index such pages? Know the value they bring to users?
Follow these five steps:
- Run a crawler: With a tool like Screaming Frog, you can search for “?” in the URL (the script after this list shows the same check applied to an exported URL list).
- Review your log files: See if Googlebot is crawling parameter-based URLs.
- Look in the Google Search Console page indexing report: In the samples of index and relevant non-indexed exclusions, search for ‘?’ in the URL.
- Search with site: and inurl: advanced operators: Know how Google is indexing the parameters you found by putting the key in a site:example.com inurl:key combination query.
- Look in the Google Analytics All Pages report: Search for “?” to see how each of the parameters you found is used by users. Be sure to check that URL query parameters have not been excluded in the view settings.
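As a rough sketch of the first two steps, the following Python script (assuming a plain-text export named urls.txt with one URL per line, taken from a crawl or access log) tallies how often each parameter key appears:

```python
from collections import Counter
from urllib.parse import urlparse, parse_qsl

# Hypothetical input: one URL per line, exported from a crawler or access log.
key_counts = Counter()

with open("urls.txt") as f:
    for line in f:
        query = urlparse(line.strip()).query
        if query:
            key_counts.update(key for key, _ in parse_qsl(query, keep_blank_values=True))

# Most frequently seen parameter keys first - the main candidates for handling.
for key, count in key_counts.most_common():
    print(f"{key}\t{count}")
```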
Armed with this data, you can now decide how to best handle each of your website’s parameters.
SEO Solutions To Tame URL Parameters
You have five tools in your SEO arsenal to deal with URL parameters on a strategic level.
Limit Parameter-based URLs
A simple review of how and why parameters are generated can provide an SEO quick win.
You will often find ways to reduce the number of parameter URLs and thus minimize the negative SEO impact. There are four common issues to begin your review.
1. Eliminate Unnecessary Parameters
Ask your developer for a list of all of your website’s parameters and their functions. Chances are, you will discover parameters that no longer perform a valuable function.
For example, users can be better identified by cookies than sessionIDs. Yet the sessionID parameter may still exist on your website as it was used historically.
Or you may discover that a filter in your faceted navigation is rarely applied by your users.
Any parameters caused by technical debt should be eliminated immediately.
2. Prevent Empty Values
URL parameters should be added to a URL only when they have a function. Don’t permit parameter keys to be added if the value is blank.
For example, in a URL like https://www.example.com?key1=value1&key2=&key3=, the keys key2 and key3 add no value, both literally and figuratively.
3. Use Keys Only Once
Avoid applying multiple parameters with the same parameter name and a different value.
For multi-select options, it is better to combine the values after a single key – for example, ?colour=purple,pink rather than ?colour=purple&colour=pink.
4. Order URL Parameters
If the same URL parameters are rearranged, the pages are interpreted by search engines as equal.
As such, parameter order doesn’t matter from a duplicate content perspective. But each of those combinations burns crawl budget and splits ranking signals.
Avoid these issues by asking your developer to write a script to always place parameters in a consistent order, regardless of how the user selected them.
In my opinion, you should start with any translating parameters, followed by identifying, then pagination, then layering on filtering and reordering or search parameters, and finally tracking.
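As a minimal sketch of such a script in Python (the precedence buckets and parameter names below are hypothetical; a real implementation would live server-side wherever your URLs are generated):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical precedence buckets: translating first, then identifying,
# paginating, filtering/reordering/searching, and tracking last.
# Unknown keys fall into a final bucket and sort alphabetically.
PRECEDENCE = {"lang": 0, "category": 1, "product": 1, "page": 2,
              "colour": 3, "sort": 3, "query": 3, "utm_medium": 4}

def normalize(url: str) -> str:
    parts = urlsplit(url)
    pairs = parse_qsl(parts.query, keep_blank_values=True)
    pairs.sort(key=lambda kv: (PRECEDENCE.get(kv[0], 9), kv[0]))
    return urlunsplit(parts._replace(query=urlencode(pairs)))

print(normalize("https://www.example.com/widgets?sort=latest&page=2&colour=purple"))
# https://www.example.com/widgets?page=2&colour=purple&sort=latest
```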
Pros:
- Ensures more efficient crawling.
- Reduces duplicate content issues.
- Consolidates ranking signals to fewer pages.
- Suitable for all parameter types.
Cons:
- Moderate technical implementation time.
Rel=”Canonical” Link Attribute
The rel=”canonical” link attribute calls out that a page has identical or similar content to another. This encourages search engines to consolidate the ranking signals to the URL specified as canonical.
You can rel=canonical your parameter-based URLs to your SEO-friendly URL for tracking, identifying, or reordering parameters.
But this tactic is not suitable when the parameter page content is not close enough to the canonical, such as pagination, searching, translating, or some filtering parameters.
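As a sketch, the tracking URL from the earlier example (https://www.example.com/widgets?sessionID=32764) would carry this element in its <head>, pointing search engines at the clean version:

```html
<link rel="canonical" href="https://www.example.com/widgets" />
```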
Pros:
- Relatively easy technical implementation.
- Very likely to safeguard against duplicate content issues.
- Consolidates ranking signals to the canonical URL.
Cons:
- Wastes crawling on parameter pages.
- Not suitable for all parameter types.
- Interpreted by search engines as a strong hint, not a directive.
Meta Robots Noindex Tag
Set a noindex directive for any parameter-based page that doesn’t add SEO value. This tag will prevent search engines from indexing the page.
URLs with a “noindex” tag are also likely to be crawled less frequently, and if the tag is present for a long time, it will eventually lead Google to nofollow the page’s links.
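On a parameter-based page you don’t want indexed, the tag sits in the <head> – a minimal example:

```html
<meta name="robots" content="noindex">
```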
Pros:
- Relatively easy technical implementation.
- Very likely to safeguard against duplicate content issues.
- Suitable for all parameter types you do not wish to be indexed.
- Removes existing parameter-based URLs from the index.
Cons:
- Won’t prevent search engines from crawling URLs, but will encourage them to do so less frequently.
- Doesn’t consolidate ranking signals.
- Interpreted by search engines as a strong hint, not a directive.
Robots.txt Disallow
The robots.txt file is what search engines look at first before crawling your site. If they see something is disallowed, they won’t even go there.
You can use this file to block crawler access to every parameter-based URL (with Disallow: /*?*) or only to specific query strings you don’t want to be crawled.
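A minimal sketch of a robots.txt using that pattern (the commented-out lines show the narrower per-key alternative; the keys themselves are hypothetical):

```
User-agent: *
# Block every URL that contains a query string.
Disallow: /*?*

# Or, instead, target only specific parameter keys:
# Disallow: /*?*sessionid=
# Disallow: /*?*sort=
```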
Pros:
- Simple technical implementation.
- Allows more efficient crawling.
- Avoids duplicate content issues.
- Suitable for all parameter types you do not wish to be crawled.
Cons:
- Doesn’t consolidate ranking signals.
- Doesn’t remove existing URLs from the index.
Move From Dynamic To Static URLs
Many people think the optimal way to handle URL parameters is to simply avoid them in the first place.
After all, subfolders surpass parameters in helping Google understand site structure, and static, keyword-based URLs have always been a cornerstone of on-page SEO.
To achieve this, you can use server-side URL rewrites to convert parameters into subfolder URLs.
For example, the URL:
www.example.com/view-product?id=482794
Would become:
www.example.com/widgets/purple
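A sketch of how that single mapping might look as an Apache rewrite rule (the paths are hypothetical, and a real implementation would look slugs up dynamically rather than hard-coding each product):

```apache
RewriteEngine On
# 301 the parameter URL to its static equivalent; the trailing "?"
# drops the query string from the rewritten URL.
RewriteCond %{QUERY_STRING} ^id=482794$
RewriteRule ^view-product$ /widgets/purple? [R=301,L]
```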
This approach works well for descriptive keyword-based parameters, such as those that identify categories, products, or filters for search engine-relevant attributes. It is also effective for translated content.
But it becomes problematic for non-keyword-relevant elements of faceted navigation, such as an exact price. Having such a filter as a static, indexable URL offers no SEO value.
It’s also an issue for searching parameters, as every user-generated query would create a static page that vies for ranking against the canonical – or, worse, presents crawlers with low-quality content pages whenever a user searches for an item you don’t offer.
It’s somewhat odd when applied to pagination (although not uncommon due to WordPress), which would give a URL such as
www.example.com/widgets/purple/page2
Very odd for reordering, which would give a URL such as
www.example.com/widgets/purple/lowest-price
And is often not a viable option for tracking. Google Analytics will not acknowledge a static version of the UTM parameter.
More to the point: Replacing dynamic parameters with static URLs for things like pagination, on-site search box results, or sorting does not address duplicate content, crawl budget, or internal link equity dilution.
Having all the combinations of filters from your faceted navigation as indexable URLs often results in thin content issues. Especially if you offer multi-select filters.
Many SEO pros argue it’s possible to provide the same user experience without impacting the URL – for example, by using POST rather than GET requests to modify the page content, thus preserving the user experience and avoiding SEO problems.
But stripping out parameters in this manner would remove the possibility for your audience to bookmark or share a link to that specific page – and is obviously not feasible for tracking parameters and not optimal for pagination.
The crux of the matter is that for many websites, completely avoiding parameters is simply not possible if you want to provide the ideal user experience. Nor would it be best practice SEO.
So we are left with this: For parameters that you don’t want to be indexed in search results (paginating, reordering, tracking, etc.), implement them as query strings. For parameters that you do want to be indexed, use static URL paths.
Pros:
- Shifts crawler focus from parameter-based URLs to static URLs, which have a higher likelihood of ranking.
Cons:
- Significant investment of development time for URL rewrites and 301 redirects.
- Doesn’t prevent duplicate content issues.
- Doesn’t consolidate ranking signals.
- Not suitable for all parameter types.
- May lead to thin content issues.
- Doesn’t always provide a linkable or bookmarkable URL.
Best Practices For URL Parameter Handling For SEO
So which of these five SEO tactics should you implement?
The answer can’t be all of them.
Not only would that create unnecessary complexity, but often, the SEO solutions actively conflict with one another.
For example, if you implement robots.txt disallow, Google would not be able to see any meta noindex tags. You also shouldn’t combine a meta noindex tag with a rel=canonical link attribute.
Google’s John Mueller, Gary Illyes, and Lizzi Sassman couldn’t even decide on an approach. In a Search Off The Record episode, they discussed the challenges that parameters present for crawling.
They even suggested bringing back a parameter handling tool in Google Search Console. Google, if you are reading this, please do bring it back!
What becomes clear is there isn’t one perfect solution. There are occasions when crawling efficiency is more important than consolidating authority signals.
Ultimately, what’s right for your website will depend on your priorities.
Personally, I take the following plan of attack for SEO-friendly parameter handling:
- Research user intents to understand what parameters should be search engine friendly, static URLs.
- Implement effective pagination handling using a ?page= parameter.
- For all remaining parameter-based URLs, block crawling with a robots.txt disallow and add a noindex tag as backup.
- Double-check that no parameter-based URLs are being submitted in the XML sitemap (a quick check for this is sketched below).
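For that last step, here is a small Python sketch (assuming a standard XML sitemap at a hypothetical URL) that flags any parameterized entries:

```python
import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical sitemap location.
SITEMAP = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

tree = ET.parse(urllib.request.urlopen(SITEMAP))
for loc in tree.findall(".//sm:loc", NS):
    if "?" in (loc.text or ""):
        print("Parameterized URL in sitemap:", loc.text)
```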
No matter what parameter handling strategy you choose to implement, be sure to document the impact of your efforts on KPIs.
SEO Experts Gather for a Candid Chat About Search [Podcast]
Wix just celebrated their 100th podcast episode! Congrats, Wix. To quote Mordy Oberstein, Head of SEO Brand at Wix: “we talk a lot.”
You sure do! It’s a good thing you have a lot of interesting stuff to say.
The 100th episode of “SERPs Up” was full of awesome guests. Here’s a summary of the action.
Apart from the usual faces – Oberstein and Crystal Carter, Head of SEO Communications – it was a powerhouse guest list:
- Chima Mmeje.
- Darren Shaw.
- Joy Hawkins.
- Eli Schwartz.
- Kevin Indig.
- Barry Schwartz.
Just How Broken Are The SERPs?
The first guest was Chima Mmeje from Moz. She dove into the frustrations that many SEOs have been feeling and spoke plainly about the flaws in Google’s updates.
Mordy Oberstein: “Is the SERP broken?”
Chima Mmeje: “The helpful content update, and I’m saying this here, live, is a farce. There was nothing helpful about that update. … Yes, the SERP is 1,000% broken. … How does anybody even use Google in the U.S.? … I don’t think they are going to release any update that will fix these issues.”
Mordy Oberstein: “There’s no update. … Plopping Reddit all over the SERP was because they saw the content trends … and they said ‘we don’t have any so we’re just going to throw Reddit there’.”
Chima Mmeje: “It was lazy to have Reddit there … Nobody uses their real names. Anybody can go on Reddit and answer questions and then you see these answers populating in People Also Ask, populating in featured snippets, populating all over the SERPs as correct information. It is dangerous, at worst.”
Crystal Carter: “Do you think that one of the reasons why we’ve seen so much upheaval and so much volatility in the SERPs, which I certainly agree with in the last year … is lots and lots of variables, like lots of new features coming in, so the alignment with Reddit, the AI overviews, the SGE … Do you think it is just too many things being thrown in at the same time and it messing up lots of SERPs as a result? Or do you think it’s something else?”
Chima Mmeje: ” … releasing too many features that they did not test properly. Features that were rushed SGE [testing] did not even last a year and now they brought in Google AI Overviews. I still don’t understand why we have AI Overviews and featured snippets on the same SERP. I feel like it’s like pick one, make a choice.”
Mordy Oberstein’s next question was about what we can do. “As an SEO, how are you supposed to do this? I’ve heard things from people … Yeah, I don’t know what to do. I can’t produce the kind of results that I’ve always wanted to. Can you still be effective as an SEO in an environment like this?”
Chima Mmeje: “I’m going to be honest, we are suffering … It feels like we are trying our best with what we are seeing … because there is no clear guidance. And to be honest, a lot of us are playing a guessing game right now and that is the best that we can do. It’s all a guessing game based on what we’ve seen one or two variables work. And this is not a long-term strategy. If we’re going to be realistic, it’s not going to work in the long-term. I honestly, I don’t know what the answer is … you’re fighting against Reddit. How do you compete against Reddit? Nobody has figured that out yet.”
Crystal Carter: “Thanks for saying it out loud, Chima.” Crystal was reflecting the sentiment of the commenters, who appreciated her candor and willingness to say: we don’t know, but we’re trying our best.
Mordy Oberstein: “The most honest take I’ve heard on that in quite a long time.”
Mmeje also recounted examples of small website owners and small businesses that have had to shut down. She also talked about the pervasive feeling in the SEO community that there is no rhyme or reason to how the algorithms handle websites and content.
What’s Going On In Local SEO?
The next guests were Darren Shaw from Whitespark and Joy Hawkins, owner of Sterling Sky for a segment called “It’s New.” They talked about new developments in local SEO.
Hawkins talked about a new feature in Google Business Profile.
Joy Hawkins: “… There’s a little services section inside the Google business profile dashboard that’s easy to miss, but you can add anything you want in there. … We’ve done a lot of testing on it and they do impact ranking, but I should clarify, it’s like a small impact. So usually we see it for longer-tailed queries that maybe don’t match a category or things that are not super competitive. … So it is a small ranking factor, but still one that is worth filling out.”
Darren Shaw: “… this is the question that a lot of people ask. We know that if you go into the services section of your Google business profile, Google will suggest predefined services … And so Joy’s original research was focused on those predefined ones and it definitely identified that when you do put those on your profile, you now rank better for those terms depending on how competitive they are, as Joy had mentioned. … There is a place where you can add your own custom services. Have you done any testing around that? Will you rank better with the custom services?”
Joy Hawkins: “Yes. They both work. In custom services … I’m trying to remember the keyword that Colin tested it on. It was something super niche like vampire facials. I was Googling, what the hell is that? … Really, really niche … But he just wanted to know if there was any impact whatsoever and there was. [Custom services fields are a] good way to go after longer tail keywords that don’t have crazy high search volume or aren’t super competitive.”
Darren Shaw: “You want to make sure that you’re telling Google what you do … that’s basically what the services section provides. And it’s not a huge ranking factor, but it’s just another step in the local optimization process. … a tip for custom services because custom services often get pulled into the local results as justifications. It’ll say this business provides vampire facials, right? Well, did you know there’s a vampire emoji? So if you put the vampire emoji in the title … Then in the local results you’ll see a whole panel of businesses that all provide that service, but yours has that little vampire emoji which will draw people in.”
There was tons more in this section, including questions from the audience and some great jokes.
The Obligatory AI Section
Eli Schwartz and Kevin Indig were next up to talk about AI. Oberstein, professional rabble-rouser, tried to get them to argue, but despite their very different posting habits, they found a lot to agree on about AI.
Mordy Oberstein: “It wouldn’t be an SEO podcast if we didn’t talk about AI. Where do we currently stand with AI? What can it do? What can’t it do?”
Kevin Indig: “… We’re at a stage where AI basically has the capability to create content, analyze some basic data. It still hallucinates here and there and it still makes mistakes. … If you compare that to when this AI hype started in November, 2022, so it’s almost two years now and we’ve come a really long way, these models are getting exponentially better. … It means different things based on whether you look at it as a tool for yourself to make your work more efficient. And of course, what does it mean from an SEO perspective? How does it change search, not just Google, but also how people search. And I think these are all different questions that are exciting to dive into. … So there is a lot of objective data that indicates efficiencies and benefits from AI. There’s also a lot of hype that promises a little too much about what AI can do. And so I’m generally AI bullish, but I’m not in the camp of AI is going to replace us all the next two years.”
Mordy Oberstein: “I’m setting the stage here a little bit because while your LinkedIn posts are generally, like, pro-AI, a lot of Eli’s posts are a little more skeptical about AI. So Eli, what do you think about what Kevin just said? By the way, for those who are listening or watching this, I’m pitting them against each other. They’re friends and they do a podcast together. So it’s cool.”
Eli Schwartz: “I think AI is great. I think that there’s a lot of great things you can get out of AI. You can, again, like Kevin said, it can be your thought partner. … I’m anti-AI in the way people are using it. And I don’t think people have necessarily changed their behaviors because before … they outsourced [content] on Fiverr and Upwork and they bought very cheap content, and now they’re getting very free content. So then that’s coming from AI. That behavior hasn’t really changed. The challenge is that now there are more people that think they can copy them.
So I talk to CMOs all the time who are like, well, I just let go of my SEO team. A big company reached out to me recently. They wanted to gut check themselves after they already fired their SEO team. So I can’t really help there, but they’re like, AI can do everything. … Well, I’ll see them in a year from now when they have whatever sort of penalty. AI is a very powerful tool. Any tool – a drill is a very powerful tool. But if you just hold it in the air and just let it go, it’s going to make holes. But if you use it appropriately, it does the thing it’s supposed to do. … We’re humans and we buy stuff and it has to come to a point where humans are talking to humans.”
Crystal Carter: “… Most of the gains are coming from productivity. The stuff like Kevin was talking about with being able to write product descriptions more quickly, being able to write lots of posts more quickly and being able to finish your things more quickly, brainstorm, et cetera, in terms of the quality, the quality is still not there. It’s getting there rapidly, but it’s still not there.”
There was lots more AI talk, so you should listen to the whole episode if you want to hear the full range of opinions.
Snappy News About The Google August Update
“The Snappy News” segment featured Barry Schwartz, Contributing Editor to Search Engine Land. It also featured the dreaded SEO phrase “it depends.”
Mordy Oberstein: “So the article of the day is from Search Engine Land, basically written by Barry, that the core update, the August 2024 core update, is done. It is complete. … The issue with Google folks who are trying to figure out, will they see a reversal of their fortunes from the 2023 helpful content update, the September 2023 helpful content update. It’s a mouthful, to be honest with you. And my question for you, since you’re here, did that happen? Was the August update a reversal?”
Barry Schwartz: “It depends on the site. I think the number, I don’t have the exact data, obviously I don’t think anybody does, but I’ve seen examples of some very few sites see complete reversals. … There are a number of sites that saw maybe a 20% bump, a 30% bump, maybe a 5% bump. But very few sites saw a complete reversal, if you want to even call it that. … I’ve been through a lot of Google updates over the years, and it’s sometimes sad to see the stories, but at the same time, if you keep at it and you are true to the content, your audience, generally, you’ll do well in the long run. Not every site, there’s plenty of sites that have been hit, went out of business, and they couldn’t come back. That’s business in general. And things change, like seasonalities and times change. You’re writing about the railroad business a hundred years ago and you keep writing about it today. There’s not many people investing a lot of money in railroads these days. So I dunno, it’s, it’s hard to read those stories, but not everybody deserves to go back to where they were. And then at the same time, Google’s not perfect either, which is why they keep on releasing new updates.”
That’s a wrap!
If you haven’t experienced a SERPs Up episode before, you should absolutely take a listen to experience the full effect of Mordy and Crystal’s banter.
The SERPs Up podcast is brought to you by Wix Studio.
OpenAI Claims New “o1” Model Can Reason Like A Human
OpenAI has unveiled its latest language model, “o1,” touting advancements in complex reasoning capabilities.
In an announcement, the company claimed its new o1 model can match human performance on math, programming, and scientific knowledge tests.
However, the true impact remains speculative.
Extraordinary Claims
According to OpenAI, o1 can score in the 89th percentile on competitive programming challenges hosted by Codeforces.
The company insists its model can perform at a level that would place it among the top 500 students nationally on the elite American Invitational Mathematics Examination (AIME).
Further, OpenAI states that o1 exceeds the average performance of human subject matter experts holding PhD credentials on a combined physics, chemistry, and biology benchmark exam.
These are extraordinary claims, and it’s important to remain skeptical until we see open scrutiny and real-world testing.
Reinforcement Learning
The purported breakthrough is o1’s reinforcement learning process, designed to teach the model to break down complex problems using an approach called the “chain of thought.”
By simulating human-like step-by-step logic, correcting mistakes, and adjusting strategies before outputting a final answer, OpenAI contends that o1 has developed superior reasoning skills compared to standard language models.
Implications
It’s unclear how o1’s claimed reasoning could enhance understanding of queries—or generation of responses—across math, coding, science, and other technical topics.
From an SEO perspective, anything that improves content interpretation and the ability to answer queries directly could be impactful. However, it’s wise to be cautious until we see objective third-party testing.
OpenAI must move beyond benchmark browbeating and provide objective, reproducible evidence to support its claims. Adding o1’s capabilities to ChatGPT in planned real-world pilots should help showcase realistic use cases.