24-Point Enterprise SEO Audit For Large Sites & Organizations
If your website is struggling to rank in search engine results pages, an enterprise SEO audit can help you identify why.
For any SEO provider or in-house marketer who wants to audit an enterprise website, these 24 items should be on your checklist before moving forward with any SEO campaign.
What Is An Enterprise SEO Audit?
An SEO audit is an evaluation of a website to identify issues preventing it from ranking in search engine results.
Enterprise SEO audits are focused primarily on large, enterprise websites, meaning those with hundreds to thousands of landing pages.
Why Perform An SEO Audit?
Auditing your website is the first step in developing a successful SEO strategy.
Why?
Understanding the strengths and weaknesses of a website can help you tailor your SEO campaigns accordingly.
Performing an audit also helps your team direct your time, resources, and budget to the optimizations that will have the greatest impact.
What To Include In Your Enterprise SEO Audit
Auditing a large website can be very demanding, with more than 200 ranking factors in Google’s algorithm.
So to run a more efficient, manageable audit, separate it into five parts: content, backlinks, technical SEO, page experience, and industry-specific standards.
You will need to rely on SEO software tools to run your audit successfully.
SEO platforms like SearchAtlas, Ahrefs, Screaming Frog, and others are a must for auditing any large website.
Content
1. Are You Targeting The Right Keywords?
The foundation of all successful SEO is strategic keyword targeting.
Not only do your target keywords need to be relevant to your products and services, but they also need to be realistic goals for your website.
So before you analyze whether your enterprise website is properly optimized, make sure your keyword goals are indeed reachable.
It’s possible that your target keywords are too competitive, or that you’re not targeting keywords with high enough search volume or conversion potential.
You can utilize a keyword tracking tool to see what search terms your enterprise website is already ranking for and earning organic traffic from.
Then, perform any necessary keyword research to find keywords that may be a better fit for your website.
Once you’ve ensured that improper keyword targeting is not the source of your poor SEO performance, you can move on through the remainder of your audit.
2. Do You Have SEO-Friendly URLs?
The URLs of your enterprise web pages should be unique, descriptive, short, and keyword-rich.
URLs are visible at the top of search results and can influence whether or not a user clicks through to the page.
Use hyphens between words to keep the URL paths readable and omit any unnecessary numbers.
3. Are Your Meta Tags Properly Optimized?
Google crawlers look to the title tags and meta descriptions to understand what your web content is about and its relevance to specific keywords.
Like URLs, these tags are also visible in the SERPs and influence whether or not a searcher clicks.
Your audit should include checking that your web pages’ meta tags are unique and follow SEO best practices.
Use a site auditor tool to identify the web pages to address to speed up the process.
Make sure to check for:
- Original and unique titles and meta descriptions for each web page.
- Proper character length: 50-60 for title tags and 150-160 for meta descriptions.
- Target keyword (or a variation of it) included in both the title and meta description.
Google sometimes rewrites page titles and meta descriptions, but this only happens a small portion of the time.
Optimizing these meta tags is still an essential step in on-page SEO.
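At enterprise scale, spot-checking these tags by hand is impractical. As a minimal sketch of automating the checks above – assuming the requests and beautifulsoup4 packages, with a placeholder URL list standing in for your own crawl export – you could flag out-of-range tags like this:

```python
# Minimal sketch of a title/meta description length check.
# URLS is a placeholder; feed it your own crawl export.
import requests
from bs4 import BeautifulSoup

URLS = ["https://www.example.com/", "https://www.example.com/widgets"]

for url in URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = (desc_tag.get("content") or "").strip() if desc_tag else ""

    if not 50 <= len(title) <= 60:
        print(f"{url}: title is {len(title)} characters")
    if not 150 <= len(desc) <= 160:
        print(f"{url}: meta description is {len(desc)} characters")
```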
4. Is Your Content High-Quality And In-Depth?
Although content length is not a ranking factor, in-depth content often displays characteristics that Google likes, such as original insight, reporting, in-depth analysis, and comprehensive exploration of the topic.
There is no magic number when it comes to content length. Google’s Quality Rater Guidelines state that web pages should have a “satisfying” amount of content.
They also state:
“The amount of content necessary for the page to be satisfying depends on the topic and purpose of the page. A high quality page on a broad topic with a lot of available information will have more content than a high quality page on a narrower topic.”
Still not sure whether your content is long enough?
Look to your enterprise competitors who are already ranking and measure how long their content is compared to yours.
Then, aim to have content equal to or more in-depth than theirs.
5. Are Your Landing Pages Internally Linking To Each Other?
Internal links help Google find and index your pages. They also communicate website hierarchy, relevance signals, topical breadth, and spread around your PageRank.
Make sure you evaluate whether or not your pages are leveraging an internal linking strategy. Also, take a close look at the anchor text used to link to other pages of your website.
Each page should link out to other pages of your site, and each page should also receive internal links in return.
Otherwise, you will have orphan pages, meaning pages that Google cannot find and index because there is no linking pathway to them.
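Dedicated crawlers detect orphan pages at scale, but the underlying logic is simple: compare the pages you expect to exist (e.g., from your sitemap) against the pages reachable by following internal links. A rough sketch, with placeholder URLs standing in for your homepage and sitemap export:

```python
# Rough orphan-page sketch: crawl internal links from the homepage and
# report known pages that were never reached. Not a production crawler.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urldefrag, urlparse

START = "https://www.example.com/"  # placeholder homepage
KNOWN_PAGES = {                     # placeholders, e.g. from your sitemap
    "https://www.example.com/",
    "https://www.example.com/widgets",
    "https://www.example.com/old-landing-page",
}

seen, queue = set(), [START]
while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urldefrag(urljoin(url, a["href"]))[0]
        if urlparse(link).netloc == urlparse(START).netloc:
            queue.append(link)

print("Orphan pages:", KNOWN_PAGES - seen)
```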
6. Does Your Content Show Expert Sourcing With External Links To Relevant Content?
Google also looks to external links to understand website content and the authority of websites.
External links should point only to relevant, authoritative sources – ideally sites with Domain Authority scores equal to or higher than your own.
Otherwise, Google is less likely to trust your enterprise website, as you will appear to be keeping low-quality websites in your link neighborhood.
7. Are You Using Rich Media And Interactive Elements?
Google likes to see images, videos, and interactive elements like jump links on the page. These elements make content more engaging and easier to navigate.
However, if these elements slow down your page load times, they are counterproductive to your SEO efforts. That will be addressed in a later part of your audit.
8. Do Your Images Include Keyword-rich Alt Text?
Enterprise websites – particularly ecommerce sites – may feature thousands of images.
But because Google cannot see images, it relies on alt text to understand how those images relate to your web page content.
Your audit should include confirming that image alt text is not only present but descriptive and keyword-rich.
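A site auditor will surface missing alt text in bulk; for a single page, the check boils down to a few lines (the URL is a placeholder):

```python
# Sketch: list images with missing or empty alt text on one page.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/widgets"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    if not (img.get("alt") or "").strip():
        print("Missing alt text:", img.get("src"))
```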
9. Are Your Pages Suffering From Keyword Cannibalization?
With hundreds to thousands of landing pages, there may be times when your landing pages are not only competing against your competitors but other landing pages on your website.
This is called keyword cannibalization and it happens when Google crawlers aren’t sure which page on your enterprise site is the most relevant.
Some tips for resolving keyword cannibalization:
- Find another keyword and re-optimize one of the competing pages.
- Consolidate the competing pages into one longer, in-depth page.
- Use a 301 redirect to point to the higher-performing or higher-converting page.
Backlinks
10. Do You Have Fewer Backlinks Than Your Competitors?
Google’s #1 ranking factor remains the same: backlinks.
If your enterprise website is competing against well-known incumbents in your industry, it’s likely they have a robust backlink profile, making it difficult for your website to compete in the SERPs.
You can use a backlink tool like Ahrefs to identify your competitor’s total backlinks and unique referring domains.
If there is a significant gap in backlinks or referring domains, this is likely a source for fewer keyword rankings or lower-ranking positions.
Dedicate a significant portion of your SEO campaign to link building and digital PR if you want to outrank your competition.
11. Does Your Website Have Toxic Backlinks?
Although backlinks are important for improving site authority, the wrong type of backlinks can also harm a website.
If your website has toxic backlinks from spammy, low-quality websites, Google may suspect your enterprise website to be guilty of backlink manipulation.
Google has gotten better at recognizing low-quality links, and since its 2021 Link Spam Update, Google claims to nullify spammy links rather than count them against websites.
However, there may still be moments when toxic backlinks should be disavowed.
You will want to focus on identifying where those toxic links are coming from and take the necessary steps to create and submit a disavow file.
Some SEO software tools can identify toxic links and create disavow text for you.
Disavowing the wrong way can actually harm your SEO performance, so if you are unfamiliar with this Google tool, make sure you seek the assistance of an SEO provider.
12. Is Your Anchor Text Distribution Diverse?
Google also looks to the anchor text of your backlinks to understand relevance and authority. Relevant anchor text is important, but not all webmasters will link to your pages in the same way.
If the majority of your anchor text is branded, that’s okay.
Look for too much exact-match anchor text or high CPC anchor text that Google crawlers may flag for suspected backlink manipulation.
If your anchor text does not display a healthy level of diversity, design a link building campaign around earning links with anchor text that improves diversity and signals healthy backlink practices to Google.
Technical SEO
13. Have You Submitted An XML Sitemap, And Does It Include The Right Pages?
Because enterprise websites have thousands of landing pages, one of the most common issues uncovered in enterprise SEO audits is related to search engine indexing.
That’s why generating and submitting an XML sitemap is important. It communicates your website hierarchy to search engine crawlers.
It also tells them which pages of your website are the most important to crawl regularly and index.
If your enterprise website adds new product pages or content to your website, you can also use your sitemap to show Google the new pages rather than wait for crawlers to discover them through internal links.
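Most enterprise CMSs and SEO platforms generate sitemaps automatically, but as a sketch of what the output involves, here is a minimal generator using Python’s standard library (the URL list is a placeholder for pages pulled from your database):

```python
# Minimal XML sitemap generator; urls is a placeholder list.
import xml.etree.ElementTree as ET

urls = ["https://www.example.com/", "https://www.example.com/widgets"]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc in urls:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = loc

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```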
14. Have You Maxed Out Your Crawl Budget?
Google’s web crawlers will only spend so much time crawling your web pages, meaning your enterprise website may have pages that don’t end up in Google’s index.
Although improving your page speed and your site authority can lead to Google increasing your crawl budget, that takes time. So in the meantime, focus on making sure you’re using your crawl budget wisely.
If your audit uncovers essential pages that are not being indexed, your enterprise website may benefit from crawl budget optimization. You want your highest value, highest converting pages to end up in Google’s index.
15. Is Your Schema Markup Properly Set Up?
A very powerful optimization that your enterprise website can utilize is schema markup.
If your enterprise website already includes schema on some of your pages, you will want to confirm that your schema is validated and eligible for rich results.
You can use Google’s Rich Results Test to confirm that pages with schema markup validate properly. To be even more efficient, use your favorite site crawler to test all of your pages at once.
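The Rich Results Test is the authoritative validator, but a lightweight pre-check can catch broken markup early. A sketch that pulls a page’s JSON-LD blocks and confirms they parse and declare a type (the URL is a placeholder):

```python
# Sketch: extract JSON-LD structured data and sanity-check it.
import json
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/widgets"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
        # Real markup may also be a list of objects; this handles one object.
        print("Schema type:", data.get("@type", "missing @type"))
    except json.JSONDecodeError as err:
        print("Invalid JSON-LD:", err)
```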
16. Do You Have Excessive Broken Links Or Redirects?
Over time, links naturally break as websites update their content or delete old pages.
It’s important to check your enterprise website to confirm your external and internal links are pointing to live pages.
Otherwise, it will appear to Google that your website is not well-maintained, and Google will be less likely to promote your web pages in the SERPs.
17. Do Similar Pages Include Canonical Tags?
Enterprise websites (particularly ecommerce sites) may have duplicate content that targets different regions or is programmatically built out.
If those pages don’t have canonical tags, they will look like duplicate content to Google.
It’s important to confirm that the best, most in-depth version of the page has a self-referential canonical tag, and that all of the similar pages include canonical tags identifying that master version of the page.
A site auditor tool like SearchAtlas can confirm whether your canonical URLs are properly implemented and if Google crawlers understand which page to promote in the SERPs.
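The core of that check is mechanical: read each page’s canonical tag and compare it to the page’s own URL. A minimal sketch, with placeholder URLs standing in for your crawl:

```python
# Sketch: report each page's canonical URL and whether it is self-referential.
import requests
from bs4 import BeautifulSoup

pages = ["https://www.example.com/widgets",
         "https://www.example.com/widgets?sort=latest"]  # placeholders

for url in pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", rel="canonical")
    canonical = tag.get("href") if tag else None
    if canonical is None:
        print(f"{url}: no canonical tag")
    elif canonical == url:
        print(f"{url}: self-referential canonical")
    else:
        print(f"{url}: canonicalized to {canonical}")
```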
18. Does Your Multilingual Content Leverage Hreflang Tags?
For multilingual enterprise websites, hreflang tags can help you show the right language content to the right searchers.
This improves your relevance signals and can have a huge impact on your conversion rates.
However, it’s easy to make mistakes when implementing hreflang and canonical tags.
As a general rule, only add hreflang tags to your web pages with self-referencing canonicals – not duplicate copies of the page.
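As a reminder of what a correct implementation looks like, here is a sketch that prints the full hreflang tag set for a page available in three languages (the URL pattern and language set are assumptions). Every language version of the page should carry the same complete set:

```python
# Sketch of hreflang tag generation; URLs and languages are illustrative.
variants = {
    "en": "https://www.example.com/widgets",
    "fr": "https://www.example.com/fr/widgets",
    "de": "https://www.example.com/de/widgets",
}

for lang, href in variants.items():
    print(f'<link rel="alternate" hreflang="{lang}" href="{href}" />')

# An x-default tag tells Google which version to show users who match
# none of the listed languages.
print(f'<link rel="alternate" hreflang="x-default" href="{variants["en"]}" />')
```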
Page Experience
19. Do Your Pages Meet Google’s Core Web Vitals Standards?
If your pages do not meet Google’s Core Web Vitals standards, they are less likely to rank well.
Google knows that load times, responsiveness, and visual stability impact the quality of a user’s experience, and thus the quality of a web page and whether or not it’s rank-worthy.
You can see your Core Web Vitals metrics in your Google Search Console account.
You can also use the free platform to validate any fixes and see whether or not they improve your CWV metrics.
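If you need to check many URLs programmatically, the public PageSpeed Insights API exposes the same Chrome UX Report field data. A sketch (field metrics only appear for URLs with sufficient real-user data, and an API key is recommended for heavier use):

```python
# Sketch: pull Core Web Vitals field data from the PageSpeed Insights API.
import requests

resp = requests.get(
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
    params={"url": "https://www.example.com/", "strategy": "MOBILE"},
    timeout=60,
).json()

for name, data in resp.get("loadingExperience", {}).get("metrics", {}).items():
    print(name, data.get("percentile"), data.get("category"))
```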
20. Do Your Web Pages Use The HTTPS Protocol?
A secure website is also essential to the quality of the user’s experience.
If your web pages are not utilizing HTTPS protocols, you are not providing users with a secure browsing experience.
As a result, Google is less likely to promote your pages.
21. Are Your Mobile Pages Responsive And High-Performing?
The majority of searches now happen on mobile devices.
Also, with mobile-first indexing, Google predominantly uses mobile pages in its index.
It is also more likely to use your mobile pages when determining where to rank your pages compared to your competitors.
Some common mobile mistakes that occur include:
- Unresponsive design.
- Intrusive pop-ups.
- Bad UI/UX elements like button size.
- Unplayable or missing content.
Industry
22. Are You Considered A Your Money Or Your Life (YMYL) Website?
If your enterprise website is considered a health, financial, legal, or other YMYL website, Google has extremely high standards for the content that it will promote to searchers.
Although this does not impact all enterprise websites, it’s important to know whether your website falls under this banner to make sure you meet Google’s specific standards for your YMYL industry.
23. Does Your Website Show High Levels of E-A-T?
Google wants to see that your content is relevant to users’ keywords.
It also wants to see that your website, as a whole, displays industry expertise.
E-A-T stands for expertise, authoritativeness, and trustworthiness. It’s hard to quantify, but some more tangible factors include:
- In-depth, well-researched content (e.g. blogs, ebooks, long-form articles).
- Expert authorship and sourcing (e.g. an author byline that shows industry-specific expertise and credentials).
- A clear purpose and focus for each page.
- Off-site reputation signals (e.g. an article in a reputable online publication that mentions or links to your website).
If you’re still not sure what E-A-T looks like in your industry, look to the top-ranking content of your competitors to see the topical depth, expertise, and sourcing, and model your content accordingly.
24. Does Your Website Have Strong Reputation Signals?
If your goal is to rank for branded searches, other authoritative websites may feature content about your brand competing with yours in the SERPs.
If your enterprise has a Wikipedia page or press in online publications with high Domain Authority, those digital locations may rank higher than your domain.
If this is the case, you may need to take a more unique approach to link building to improve the branded signals of your content.
Optimizations like schema can also help ensure that information about your brand is featured at the top of the SERP, particularly if those websites that mention your enterprise brand do so negatively.
Final Thoughts On Conducting Your Enterprise Audit
Sitewide audits can be daunting, but they are worth the time and effort to craft a tailored, custom SEO campaign strategy.
Make sure to leverage the best SEO software tools throughout your audit to speed up the process and ensure the most accurate evaluation of your website.
Once your audit is complete, you can easily prioritize those optimizations that will be the most impactful.
Featured Image: Yuriy K/Shutterstock
The Expert SEO Guide To URL Parameter Handling
In the world of SEO, URL parameters pose a significant problem.
While developers and data analysts may appreciate their utility, these query strings are an SEO headache.
Countless parameter combinations can split a single user intent across thousands of URL variations. This can cause complications for crawling, indexing, and visibility and, ultimately, lead to lower traffic.
The issue is we can’t simply wish them away, which means it’s crucial to master how to manage URL parameters in an SEO-friendly way.
To do so, we will explore what URL parameters are, the SEO issues they cause, how to assess the extent of your parameter problem, and the solutions available to tame them.
What Are URL Parameters?
URL parameters, also known as query strings or URI variables, are the portion of a URL that follows the ‘?’ symbol. They consist of key and value pairs, separated by an ‘=’ sign. Multiple parameters can be added to a single page when separated by an ‘&’.
The most common use cases for parameters are:
- Tracking – For example ?utm_medium=social, ?sessionid=123 or ?affiliateid=abc
- Reordering – For example ?sort=lowest-price, ?order=highest-rated or ?so=latest
- Filtering – For example ?type=widget, ?colour=purple or ?price-range=20-50
- Identifying – For example ?product=small-purple-widget, ?categoryid=124 or ?itemid=24AU
- Paginating – For example, ?page=2, ?p=2 or ?viewItems=10-30
- Searching – For example, ?query=users-query, ?q=users-query or ?search=drop-down-option
- Translating – For example, ?lang=fr or ?language=de
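To see how these decompose in practice, Python’s standard library can split any query string into its key-value pairs (the example URL is illustrative):

```python
# Sketch: decompose a parameterized URL into its key-value pairs.
from urllib.parse import urlparse, parse_qs

url = "https://www.example.com/widgets?colour=purple&sort=lowest-price&page=2"
print(parse_qs(urlparse(url).query))
# {'colour': ['purple'], 'sort': ['lowest-price'], 'page': ['2']}
```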
SEO Issues With URL Parameters
1. Parameters Create Duplicate Content
Often, URL parameters make no significant change to the content of a page.
A re-ordered version of the page is often not so different from the original. A page URL with tracking tags or a session ID is identical to the original.
For example, the following URLs would all return a collection of widgets.
- Static URL: https://www.example.com/widgets
- Tracking parameter: https://www.example.com/widgets?sessionID=32764
- Reordering parameter: https://www.example.com/widgets?sort=latest
- Identifying parameter: https://www.example.com?category=widgets
- Searching parameter: https://www.example.com/products?search=widget
That’s quite a few URLs for what is effectively the same content – now imagine this over every category on your site. It can really add up.
The challenge is that search engines treat every parameter-based URL as a new page. So, they see multiple variations of the same page, all serving duplicate content and all targeting the same search intent or semantic topic.
While such duplication is unlikely to cause a website to be completely filtered out of the search results, it does lead to keyword cannibalization and could downgrade Google’s view of your overall site quality, as these additional URLs add no real value.
2. Parameters Reduce Crawl Efficacy
Crawling redundant parameter pages distracts Googlebot, reducing your site’s ability to index SEO-relevant pages and increasing server load.
Google sums up this point perfectly.
“Overly complex URLs, especially those containing multiple parameters, can cause problems for crawlers by creating unnecessarily high numbers of URLs that point to identical or similar content on your site.
As a result, Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all the content on your site.”
3. Parameters Split Page Ranking Signals
If you have multiple permutations of the same page content, links and social shares may be coming in on various versions.
This dilutes your ranking signals. When you confuse a crawler, it becomes unsure which of the competing pages to index for the search query.
4. Parameters Make URLs Less Clickable
Let’s face it: parameter URLs are unsightly. They’re hard to read. They don’t seem as trustworthy. As such, they are slightly less likely to be clicked.
This may impact page performance. Not only because CTR influences rankings, but also because it’s less clickable in AI chatbots, social media, in emails, when copy-pasted into forums, or anywhere else the full URL may be displayed.
While this may only have a fractional impact on a single page’s amplification, every tweet, like, share, email, link, and mention matters for the domain.
Poor URL readability could contribute to a decrease in brand engagement.
Assess The Extent Of Your Parameter Problem
It’s important to know every parameter used on your website. But chances are your developers don’t keep an up-to-date list.
So how do you find all the parameters that need handling? Or understand how search engines crawl and index such pages? Know the value they bring to users?
Follow these five steps:
- Run a crawler: With a tool like Screaming Frog, you can search for “?” in the URL.
- Review your log files: See if Googlebot is crawling parameter-based URLs.
- Look in the Google Search Console page indexing report: In the samples of index and relevant non-indexed exclusions, search for ‘?’ in the URL.
- Search with site: and inurl: advanced operators: Learn how Google is indexing the parameters you found by putting the key in a site:example.com inurl:key combination query.
- Look in Google Analytics all pages report: Search for “?” to see how each of the parameters you found are used by users. Be sure to check that URL query parameters have not been excluded in the view setting.
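To consolidate what these steps surface, a small script can inventory every parameter key across your crawled URLs. A sketch, assuming a one-URL-per-line export named crawled_urls.txt:

```python
# Sketch: count how many crawled URLs use each parameter key.
from collections import Counter
from urllib.parse import urlparse, parse_qs

counts = Counter()
with open("crawled_urls.txt") as f:  # assumed crawler export
    for line in f:
        for key in parse_qs(urlparse(line.strip()).query):
            counts[key] += 1

for key, n in counts.most_common():
    print(f"{key}: {n} URLs")
```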
Armed with this data, you can now decide how to best handle each of your website’s parameters.
SEO Solutions To Tame URL Parameters
You have six tools in your SEO arsenal to deal with URL parameters on a strategic level.
Limit Parameter-based URLs
A simple review of how and why parameters are generated can provide an SEO quick win.
You will often find ways to reduce the number of parameter URLs and thus minimize the negative SEO impact. There are four common issues to begin your review.
1. Eliminate Unnecessary Parameters
Ask your developer for a list of all of the website’s parameters and their functions. Chances are, you will discover parameters that no longer perform a valuable function.
For example, users can be better identified by cookies than sessionIDs. Yet the sessionID parameter may still exist on your website as it was used historically.
Or you may discover that a filter in your faceted navigation is rarely applied by your users.
Any parameters caused by technical debt should be eliminated immediately.
2. Prevent Empty Values
URL parameters should be added to a URL only when they have a function. Don’t permit parameter keys to be added if the value is blank.
For example, in https://www.example.com?key1=value1&key2=&key3=, the keys key2 and key3 add no value, both literally and figuratively.
3. Use Keys Only Once
Avoid applying multiple parameters with the same parameter name and a different value.
For multi-select options, it is better to combine the values after a single key – for example, ?colour=purple,pink rather than ?colour=purple&colour=pink.
4. Order URL Parameters
If the same URL parameters are rearranged, the pages are interpreted by search engines as equal.
As such, parameter order doesn’t matter from a duplicate content perspective. But each of those combinations burns crawl budget and splits ranking signals.
Avoid these issues by asking your developer to write a script to always place parameters in a consistent order, regardless of how the user selected them.
In my opinion, you should start with any translating parameters, followed by identifying, then pagination, then layering on filtering and reordering or search parameters, and finally tracking.
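Purely as an illustration of what to ask your developer for, here is a sketch of such a script. It also covers the two previous points by dropping empty values and merging repeated keys; the key names and priority order are assumptions to adapt to your own site:

```python
# Sketch: rebuild a URL with empty values dropped, repeated keys merged,
# and keys sorted into a fixed order. PRIORITY keys are illustrative.
from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

PRIORITY = ["lang", "category", "page", "colour", "sort", "utm_medium"]

def canonicalize(url):
    parts = urlparse(url)
    params = parse_qs(parts.query, keep_blank_values=True)
    rank = lambda k: PRIORITY.index(k) if k in PRIORITY else len(PRIORITY)
    ordered = []
    for key in sorted(params, key=rank):
        values = [v for v in params[key] if v]       # drop empty values
        if values:
            ordered.append((key, ",".join(values)))  # merge repeated keys
    return urlunparse(parts._replace(query=urlencode(ordered, safe=",")))

print(canonicalize("https://www.example.com/widgets?sort=latest&colour=purple&colour=pink&page=2&empty="))
# https://www.example.com/widgets?page=2&colour=purple,pink&sort=latest
```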
Pros:
- Ensures more efficient crawling.
- Reduces duplicate content issues.
- Consolidates ranking signals to fewer pages.
- Suitable for all parameter types.
Cons:
- Moderate technical implementation time.
Rel=”Canonical” Link Attribute
The rel=”canonical” link attribute calls out that a page has identical or similar content to another. This encourages search engines to consolidate the ranking signals to the URL specified as canonical.
You can rel=canonical your parameter-based URLs to your SEO-friendly URL for tracking, identifying, or reordering parameters.
But this tactic is not suitable when the parameter page content is not close enough to the canonical, such as pagination, searching, translating, or some filtering parameters.
Pros:
- Relatively easy technical implementation.
- Very likely to safeguard against duplicate content issues.
- Consolidates ranking signals to the canonical URL.
Cons:
- Wastes crawling on parameter pages.
- Not suitable for all parameter types.
- Interpreted by search engines as a strong hint, not a directive.
Meta Robots Noindex Tag
Set a noindex directive for any parameter-based page that doesn’t add SEO value. This tag will prevent search engines from indexing the page.
URLs with a “noindex” tag are also likely to be crawled less frequently, and if the tag is present for a long time, it will eventually lead Google to nofollow the page’s links.
Pros:
- Relatively easy technical implementation.
- Very likely to safeguard against duplicate content issues.
- Suitable for all parameter types you do not wish to be indexed.
- Removes existing parameter-based URLs from the index.
Cons:
- Won’t prevent search engines from crawling URLs, but will encourage them to do so less frequently.
- Doesn’t consolidate ranking signals.
- Interpreted by search engines as a strong hint, not a directive.
Robots.txt Disallow
The robots.txt file is what search engines look at first before crawling your site. If they see something is disallowed, they won’t even go there.
You can use this file to block crawler access to every parameter-based URL (with Disallow: /*?*) or only to specific query strings you don’t want to be indexed.
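You can sanity-check rules locally before deploying them. One caveat: Python’s standard-library parser implements plain prefix matching and does not understand the * wildcard that Google supports, so the sketch below tests a specific query-string rule instead (URLs are illustrative):

```python
# Sketch: test robots.txt rules locally with the standard library.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /widgets?sort=",  # block sorted variants of /widgets
])

for url in ["https://www.example.com/widgets",               # allowed
            "https://www.example.com/widgets?sort=latest"]:  # blocked
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")
```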
Pros:
- Simple technical implementation.
- Allows more efficient crawling.
- Avoids duplicate content issues.
- Suitable for all parameter types you do not wish to be crawled.
Cons:
- Doesn’t consolidate ranking signals.
- Doesn’t remove existing URLs from the index.
Move From Dynamic To Static URLs
Many people think the optimal way to handle URL parameters is to simply avoid them in the first place.
After all, subfolders surpass parameters in helping Google understand site structure, and static, keyword-based URLs have always been a cornerstone of on-page SEO.
To achieve this, you can use server-side URL rewrites to convert parameters into subfolder URLs.
For example, the URL:
www.example.com/view-product?id=482794
Would become:
www.example.com/widgets/purple
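In practice, the rewrite lives in your web server or application router rather than in standalone code, but purely to illustrate the logic, here is a sketch of the mapping such a rule relies on (the product data and URL pattern are assumptions):

```python
# Toy sketch: resolve a parameter-based URL to the static URL the server
# should 301-redirect to. The product table is illustrative.
PRODUCTS = {"482794": "widgets/purple"}  # id -> static slug, e.g. from a database

def rewrite_target(path, params):
    """Return the static URL to 301 to, or None if no rewrite applies."""
    if path == "/view-product" and params.get("id") in PRODUCTS:
        return "https://www.example.com/" + PRODUCTS[params["id"]]
    return None

print(rewrite_target("/view-product", {"id": "482794"}))
# https://www.example.com/widgets/purple
```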
This approach works well for descriptive keyword-based parameters, such as those that identify categories, products, or filters for search engine-relevant attributes. It is also effective for translated content.
But it becomes problematic for non-keyword-relevant elements of faceted navigation, such as an exact price. Having such a filter as a static, indexable URL offers no SEO value.
It’s also an issue for searching parameters, as every user-generated query would create a static page that vies for ranking against the canonical – or, worse, presents low-quality content pages to crawlers whenever a user has searched for an item you don’t offer.
It’s somewhat odd when applied to pagination (although not uncommon due to WordPress), which would give a URL such as
www.example.com/widgets/purple/page2
Very odd for reordering, which would give a URL such as
www.example.com/widgets/purple/lowest-price
And is often not a viable option for tracking. Google Analytics will not acknowledge a static version of the UTM parameter.
More to the point: Replacing dynamic parameters with static URLs for things like pagination, on-site search box results, or sorting does not address duplicate content, crawl budget, or internal link equity dilution.
Having all the combinations of filters from your faceted navigation as indexable URLs often results in thin content issues. Especially if you offer multi-select filters.
Many SEO pros argue it’s possible to provide the same user experience without impacting the URL – for example, by using POST rather than GET requests to modify the page content – thus preserving the user experience and avoiding SEO problems.
But stripping out parameters in this manner would remove the possibility for your audience to bookmark or share a link to that specific page – and is obviously not feasible for tracking parameters and not optimal for pagination.
The crux of the matter is that for many websites, completely avoiding parameters is simply not possible if you want to provide the ideal user experience. Nor would it be best practice SEO.
So we are left with this: For parameters that you don’t want to be indexed in search results (paginating, reordering, tracking, etc.), implement them as query strings. For parameters that you do want to be indexed, use static URL paths.
Pros:
- Shifts crawler focus from parameter-based to static URLs which have a higher likelihood to rank.
Cons:
- Significant investment of development time for URL rewrites and 301 redirects.
- Doesn’t prevent duplicate content issues.
- Doesn’t consolidate ranking signals.
- Not suitable for all parameter types.
- May lead to thin content issues.
- Doesn’t always provide a linkable or bookmarkable URL.
Best Practices For URL Parameter Handling For SEO
So which of these six SEO tactics should you implement?
The answer can’t be all of them.
Not only would that create unnecessary complexity, but often, the SEO solutions actively conflict with one another.
For example, if you implement robots.txt disallow, Google would not be able to see any meta noindex tags. You also shouldn’t combine a meta noindex tag with a rel=canonical link attribute.
Google’s John Mueller, Gary Illyes, and Lizzi Sassman couldn’t even decide on an approach. In a Search Off The Record episode, they discussed the challenges that parameters present for crawling.
They even suggest bringing back a parameter handling tool in Google Search Console. Google, if you are reading this, please do bring it back!
What becomes clear is there isn’t one perfect solution. There are occasions when crawling efficiency is more important than consolidating authority signals.
Ultimately, what’s right for your website will depend on your priorities.
Personally, I take the following plan of attack for SEO-friendly parameter handling:
- Research user intents to understand what parameters should be search engine friendly, static URLs.
- Implement effective pagination handling using a ?page= parameter.
- For all remaining parameter-based URLs, block crawling with a robots.txt disallow and add a noindex tag as backup.
- Double-check that no parameter-based URLs are being submitted in the XML sitemap (see the sketch below).
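A sketch of that last check, assuming a standard urlset sitemap at a placeholder location:

```python
# Sketch: flag any parameter-based URLs that slipped into the XML sitemap.
import requests
import xml.etree.ElementTree as ET

SITEMAP = "https://www.example.com/sitemap.xml"  # placeholder
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
for loc in root.findall(".//sm:loc", ns):
    if "?" in (loc.text or ""):
        print("Parameter URL in sitemap:", loc.text)
```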
No matter what parameter handling strategy you choose to implement, be sure to document the impact of your efforts on KPIs.
Featured Image: BestForBest/Shutterstock
SEO Experts Gather for a Candid Chat About Search [Podcast]
Wix just celebrated their 100th podcast episode! Congrats, Wix. To quote Mordy Oberstein, Head of SEO Brand at Wix: “we talk a lot.”
You sure do! It’s a good thing you have a lot of interesting stuff to say.
The 100th episode of “SERPs Up” was full of awesome guests. Here’s a summary of the action.
Apart from the usual faces – Oberstein and Crystal Carter, Head of SEO Communications – it was a powerhouse guest list:
- Chima Mmeje.
- Darren Shaw.
- Joy Hawkins.
- Eli Schwartz.
- Kevin Indig.
- Barry Schwartz.
Just How Broken Are The SERPs?
The first guest was Chima Mmeje from Moz. She dove into the frustrations that many SEOs have been feeling and spoke plainly about the flaws in Google’s updates.
Mordy Oberstein: “Is the SERP broken?”
Chima Mmeje: “The helpful content update, and I’m saying this here, live, is a farce. There was nothing helpful about that update. … Yes, the SERP is 1,000% broken. … How does anybody even use Google in the U.S.? … I don’t think they are going to release any update that will fix these issues.”
Mordy Oberstein: “There’s no update. … Plopping Reddit all over the SERP was because they saw the content trends … and they said ‘we don’t have any so we’re just going to throw Reddit there’.”
Chima Mmeje: “It was lazy to have Reddit there … Nobody uses their real names. Anybody can go on Reddit and answer questions and then you see these answers populating in People Also Ask, populating in featured snippets, populating all over the SERPs as correct information. It is dangerous, at worst.”
Crystal Carter: “Do you think that one of the reasons why we’ve seen so much upheaval and so much volatility in the SERPs, which I certainly agree with in the last year … is lots and lots of variables, like lots of new features coming in, so the alignment with Reddit, the AI overviews, the SGE … Do you think it is just too many things being thrown in at the same time and it messing up lots of SERPs as a result? Or do you think it’s something else?”
Chima Mmeje: “… releasing too many features that they did not test properly. Features that were rushed. SGE [testing] did not even last a year and now they brought in Google AI Overviews. I still don’t understand why we have AI Overviews and featured snippets on the same SERP. I feel like it’s like pick one, make a choice.”
Mordy Oberstein’s next question was about what we can do. “As an SEO, how are you supposed to do this? I’ve heard things from people … Yeah, I don’t know what to do. I can’t produce the kind of results that I’ve always wanted to. Can you still be effective as an SEO in an environment like this?”
Chima Mmeje: “I’m going to be honest, we are suffering … It feels like we are trying our best with what we are seeing … because there is no clear guidance. And to be honest, a lot of us are playing a guessing game right now and that is the best that we can do. It’s all a guessing game based on what we’ve seen one or two variables work. And this is not a long-term strategy. If we’re going to be realistic, it’s not going to work in the long-term. I honestly, I don’t know what the answer is … you’re fighting against Reddit. How do you compete against Reddit? Nobody has figured that out yet.”
Crystal Carter: “Thanks for saying it out loud, Chima.” Crystal was reflecting the sentiment of the commenters, who appreciated her candor and willingness to say: we don’t know, but we’re trying our best.
Mordy Oberstein: “The most honest take I’ve heard on that in quite a long time.”
Mmeje also recounted examples of small website owners and small businesses that have had to shut down. She also talked about the pervasive feeling in the SEO community that there is no rhyme or reason to how the algorithms handle websites and content.
What’s Going On In Local SEO?
The next guests were Darren Shaw from Whitespark and Joy Hawkins, owner of Sterling Sky, for a segment called “It’s New.” They talked about new developments in local SEO.
Hawkins talked about a new feature in Google Business Profile.
Joy Hawkins: “… There’s a little services section inside the Google business profile dashboard that’s easy to miss, but you can add anything you want in there. … We’ve done a lot of testing on it and they do impact ranking, but I should clarify, it’s like a small impact. So usually we see it for longer-tailed queries that maybe don’t match a category or things that are not super competitive. … So it is a small ranking factor, but still one that is worth filling out.”
Darren Shaw: “… this is the question that a lot of people ask. We know that if you go into the services section of your Google business profile, Google will suggest predefined services … And so Joy’s original research was focused on those predefined ones and it definitely identified that when you do put those on your profile, you now rank better for those terms depending on how competitive they are, as Joy had mentioned. … There is a place where you can add your own custom services. Have you done any testing around that? Will you rank better with the custom services?”
Joy Hawkins: “Yes. They both work. In custom services … I’m trying to remember the keyword that Colin tested it on. It was something super niche like vampire facials. I was Googling, what the hell is that? … Really, really niche … But he just wanted to know if there was any impact whatsoever and there was. [Custom services fields are a] good way to go after longer tail keywords that don’t have crazy high search volume or aren’t super competitive.”
Darren Shaw: “You want to make sure that you’re telling Google what you do … that’s basically what the services section provides. And it’s not a huge ranking factor, but it’s just another step in the local optimization process. … a tip for custom services because custom services often get pulled into the local results as justifications. It’ll say this business provides vampire facials, right? Well, did you know there’s a vampire emoji? So if you put the vampire emoji in the title … Then in the local results you’ll see a whole panel of businesses that all provide that service, but yours has that little vampire emoji which will draw people in.”
There was tons more in this section, including questions from the audience and some great jokes.
The Obligatory AI Section
Eli Schwartz and Kevin Indig were next up to talk about AI. Oberstein, professional rabble-rouser, tried to get them to argue, but despite their very different posting habits, they found a lot to agree on about AI.
Mordy Oberstein: “It wouldn’t be an SEO podcast if we didn’t talk about AI. Where do we currently stand with AI? What can it do? What can’t it do?”
Kevin Indig: “… We’re at a stage where AI basically has the capability to create content, analyze some basic data. It still hallucinates here and there and it still makes mistakes. … If you compare that to when this AI hype started in November, 2022, so it’s almost two years now and we’ve come a really long way, these models are getting exponentially better. … It means different things based on whether you look at it as a tool for yourself to make your work more efficient. And of course, what does it mean from an SEO perspective? How does it change search, not just Google, but also how people search. And I think these are all different questions that are exciting to dive into. … So there is a lot of objective data that indicates efficiencies and benefits from AI. There’s also a lot of hype that promises a little too much about what AI can do. And so I’m generally AI bullish, but I’m not in the camp of AI is going to replace us all the next two years.”
Mordy Oberstein: “I’m setting the stage here a little bit because while your LinkedIn posts are generally, like, pro-AI, a lot of Eli’s posts are a little more skeptical about AI. So Eli, what do you think about what Kevin just said? By the way, for those who are listening or watching this, I’m pitting them against each other. They’re friends and they do a podcast together. So it’s cool.”
Eli Schwartz: “I think AI is great. I think that there’s a lot of great things you can get out of AI. You can, again, like Kevin said, it can be your thought partner. … I’m anti-AI in the way people are using it. And I don’t think people have necessarily changed their behaviors because before … they outsourced [content] on Fiverr and Upwork and they bought very cheap content and now they’re getting very free content. So then that’s coming from AI. That behavior hasn’t really changed. The challenge is that now there are more people that think they can copy them.
“So I talk to CMOs all the time who are like, well, I just let go of my SEO team. A big company reached out to me recently. They wanted to gut check themselves after they already fired their SEO team. So I can’t really help there, but they’re like, AI can do everything. … Well, I’ll see them in a year from now when they have whatever sort of penalty. AI is a very powerful tool. Any tool we have – a drill is a very powerful tool. But if you just hold it in the air and just let it go, it’s going to make holes. But if you use it appropriately, it does the thing it’s supposed to do. … We’re humans and we buy stuff and it has to come to a point where humans are talking to humans.”
Crystal Carter: “… Most of the gains are coming from productivity. The stuff like Kevin was talking about with being able to write product descriptions more quickly, being able to write lots of posts more quickly and being able to finish your things more quickly, brainstorm, et cetera, in terms of the quality, the quality is still not there. It’s getting there rapidly, but it’s still not there.”
There was lots more AI talk, so you should listen to the whole episode if you want to hear the full range of opinions.
Snappy News About The Google August Update
“The Snappy News” segment featured Barry Schwartz, Contributing Editor to Search Engine Land. It also featured the dreaded SEO phrase “it depends.”
Mordy Oberstein: “So the article of the day is from Search Engine Land, basically written by Barry, that the core update, the August 2024 core update, is done. It is complete. … The issue with Google folks who are trying to figure out, will they see a reversal of their fortunes from the 2023 helpful content update, the September 2023 helpful content update. It’s a mouthful, to be honest with you. And my question for you, since you’re here, did that happen? Was the August update a reversal?”
Barry Schwartz: “It depends on the site. I think the number, I don’t have the exact data, obviously I don’t think anybody does, but I’ve seen examples of some very few sites see complete reversals. … There are a number of sites that saw maybe a 20% bump, a 30% bump, maybe a 5% bump. But very few sites saw a complete reversal, if you want to even call it that. … I’ve been through a lot of Google updates over the years, and it’s sometimes sad to see the stories, but at the same time, if you keep at it and you are true to the content, your audience, generally, you’ll do well in the long run. Not every site, there’s plenty of sites that have been hit, went out of business, and they couldn’t come back. That’s business in general. And things change, like seasonalities and times change. You’re writing about the railroad business a hundred years ago and you keep writing about it today. There’s not many people investing a lot of money in railroads these days. So I dunno, it’s, it’s hard to read those stories, but not everybody deserves to go back to where they were. And then at the same time, Google’s not perfect either, which is why they keep on releasing new updates.”
That’s a wrap!
If you haven’t experienced a SERPs Up episode before, you should absolutely take a listen to experience the full effect of Mordy and Crystal’s banter.
The SERPs Up podcast is brought to you by Wix Studio.
OpenAI Claims New “o1” Model Can Reason Like A Human
OpenAI has unveiled its latest language model, “o1,” touting advancements in complex reasoning capabilities.
In an announcement, the company claimed its new o1 model can match human performance on math, programming, and scientific knowledge tests.
However, the true impact remains speculative.
Extraordinary Claims
According to OpenAI, o1 can score in the 89th percentile on competitive programming challenges hosted by Codeforces.
The company insists its model can perform at a level that would place it among the top 500 students nationally on the elite American Invitational Mathematics Examination (AIME).
Further, OpenAI states that o1 exceeds the average performance of human subject matter experts holding PhD credentials on a combined physics, chemistry, and biology benchmark exam.
These are extraordinary claims, and it’s important to remain skeptical until we see open scrutiny and real-world testing.
Reinforcement Learning
The purported breakthrough is o1’s reinforcement learning process, designed to teach the model to break down complex problems using an approach called the “chain of thought.”
By simulating human-like step-by-step logic, correcting mistakes, and adjusting strategies before outputting a final answer, OpenAI contends that o1 has developed superior reasoning skills compared to standard language models.
Implications
It’s unclear how o1’s claimed reasoning could enhance understanding of queries—or generation of responses—across math, coding, science, and other technical topics.
From an SEO perspective, anything that improves content interpretation and the ability to answer queries directly could be impactful. However, it’s wise to be cautious until we see objective third-party testing.
OpenAI must move beyond benchmark browbeating and provide objective, reproducible evidence to support its claims. Adding o1’s capabilities to ChatGPT in planned real-world pilots should help showcase realistic use cases.
Featured Image: JarTee/Shutterstock