11 SEO Tips & Tricks To Improve Search Indexation
The SEO game has so many moving parts that it often seems like, as soon as we’re done optimizing one part of a website, we have to move back to the part we were just working on.
Once you’re out of the “I’m new here” stage and feel that you have some real SEO experience under your belt, you might start to feel that there are some things you can devote less time to correcting.
Indexability and crawl budgets could be two of those things, but forgetting about them would be a mistake.
I always like to say that a website with indexability issues is a site that’s in its own way; that website is inadvertently telling Google not to rank its pages because they don’t load correctly or they redirect too many times.
If you think you can’t or shouldn’t be devoting time to the decidedly not-so-glamorous task of fixing your site’s indexability, think again.
Indexability problems can cause your rankings to plummet and your site traffic to dry up quickly.
So, your crawl budget has to be top of mind.
In this post, I’ll present you with 11 tips to consider as you go about improving your website’s indexability.
1. Track Crawl Status With Google Search Console
Errors in your crawl status could be indicative of a deeper issue on your site.
Checking your crawl status every 30-60 days is important to identify potential errors that are impacting your site’s overall marketing performance.
It’s the first step of SEO; without it, every other optimization effort is built on shaky ground.
In the Search Console sidebar, you’ll be able to check your crawl status under the Index tab.
Now, if you want to remove a certain webpage from Google’s index, you can request its removal directly in Search Console. This is useful if a page is temporarily redirected or returns a 404 error.
A 410 (Gone) status code will permanently remove a page from the index, so beware of using the nuclear option.
Common Crawl Errors & Solutions
If your website is unfortunate enough to be experiencing a crawl error, it may require an easy solution or be indicative of a much larger technical problem on your site.
The most common crawl errors I see are:
- DNS errors.
- Server errors.
- Robots.txt errors.
To diagnose some of these errors, you can leverage the URL Inspection tool to see how Google views your site.
Failure to properly fetch and render a page could be indicative of a deeper DNS error that will need to be resolved by your DNS provider.
Resolving a server error requires diagnosing a specific error. The most common errors include:
- Timeout.
- Connection refused.
- Connect failed.
- Connect timeout.
- No response.
Most server errors are temporary, although a persistent problem could require you to contact your hosting provider directly.
Robots.txt errors, on the other hand, could be more problematic for your site. If your robots.txt file returns a server error rather than a 200 or a 404, search engines can’t tell what they’re allowed to crawl and may pause crawling your site entirely until the file is reachable again.
You can reference your sitemap within your robots.txt file, or skip robots.txt rules altogether and manually noindex pages that could be problematic for your crawl.
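For reference, a minimal robots.txt that points crawlers to your sitemap can be as short as the sketch below (the domain and paths are placeholders, not recommendations for your specific site):

```
# Hypothetical robots.txt, served at https://www.example.com/robots.txt
User-agent: *
Allow: /

# Point crawlers at the XML sitemap so new URLs are discovered faster
Sitemap: https://www.example.com/sitemap.xml
```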
Resolving these errors quickly will ensure that all of your target pages are crawled and indexed the next time search engines crawl your site.
2. Create Mobile-Friendly Webpages
With the arrival of the mobile-first index, we must also optimize our pages to display mobile-friendly copies on the mobile index.
The good news is that a desktop copy will still be indexed and displayed under the mobile index if a mobile-friendly copy does not exist. The bad news is that your rankings may suffer as a result.
There are many technical tweaks that can instantly make your website more mobile-friendly including:
- Implementing responsive web design.
- Inserting the viewport meta tag in the page’s head (see the snippet after this list).
- Minifying on-page resources (CSS and JS).
- Creating AMP versions of pages so they can be served from the Google AMP cache.
- Optimizing and compressing images for faster load times.
- Reducing the size of on-page UI elements.
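The viewport meta tag mentioned in the list above is a single line placed in the page’s head; the widely used default form looks like this:

```
<!-- Tells mobile browsers to match the device width instead of rendering a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```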
Be sure to test your website on a mobile platform and run it through Google PageSpeed Insights. Page speed is an important ranking factor and can affect the speed at which search engines can crawl your site.
3. Update Content Regularly
Search engines will crawl your site more regularly if you produce new content on a regular basis.
This is especially useful for publishers who need new stories published and indexed on a regular basis.
Producing content on a regular basis signals to search engines that your site is constantly improving and publishing new content and, therefore, needs to be crawled more often to reach its intended audience.
4. Submit A Sitemap To Each Search Engine
One of the best indexation tips to this day remains submitting a sitemap to Google Search Console and Bing Webmaster Tools.
You can create an XML version using a sitemap generator or your CMS; just make sure it lists only the canonical version of each page so you aren’t pointing search engines at duplicate content.
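If you ever need to build the file by hand rather than with a generator, a bare-bones sitemap following the sitemaps.org protocol looks like the sketch below (the URL and date are placeholders):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- List only the canonical version of each page -->
    <loc>https://www.example.com/services/</loc>
    <lastmod>2024-09-01</lastmod>
  </url>
</urlset>
```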
5. Optimize Your Interlinking Scheme
Establishing a consistent information architecture is crucial to ensuring that your website is not only properly indexed, but also properly organized.
Creating main service categories where related webpages can sit further helps search engines properly index webpage content under certain categories when the intent might not otherwise be clear.
6. Deep Link To Isolated Webpages
If a webpage on your site or a subdomain is created in isolation, or an error is preventing it from being crawled, you can get it indexed by acquiring a link on an external domain.
This is an especially useful strategy for promoting new pieces of content on your website and getting them indexed more quickly.
Beware of syndicating content to accomplish this, as search engines may ignore syndicated pages, and it could create duplicate content errors if the copies are not properly canonicalized.
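If you do syndicate content, one common safeguard is a cross-domain canonical tag on the syndicated copy that points back to the original; a minimal sketch (with a placeholder URL) looks like this:

```
<!-- Placed in the <head> of the syndicated copy, pointing to the original article -->
<link rel="canonical" href="https://www.example.com/original-article/">
```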
7. Minify On-Page Resources & Increase Load Times
Forcing search engines to crawl large and unoptimized images will eat up your crawl budget and prevent your site from being indexed as often.
Search engines also have difficulty crawling certain backend elements of your website. For example, Google has historically struggled to crawl JavaScript.
Even certain resources like Flash and CSS can perform poorly on mobile devices and eat up your crawl budget.
In a sense, it’s a lose-lose scenario where page speed and crawl budget are sacrificed for obtrusive on-page elements.
Be sure to optimize your webpage for speed, especially on mobile, by minifying on-page resources, such as CSS. You can also enable caching and compression to help spiders crawl your site faster.
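Exactly how you enable caching and compression depends on your server or CDN. As one illustration, a site running on Apache (assuming mod_deflate and mod_expires are enabled) might use an .htaccess sketch like this:

```
# Compress text-based resources before sending them to browsers and crawlers
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache static assets so repeat visits load faster
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
  ExpiresByType image/webp "access plus 6 months"
</IfModule>
```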
8. Fix Pages With Noindex Tags
Over the course of your website’s development, it may make sense to implement a noindex tag on pages that may be duplicated or only meant for users who take a certain action.
Regardless, you can identify webpages with noindex tags that are keeping them out of the index by crawling your site with a tool like Screaming Frog.
The Yoast plugin for WordPress allows you to easily switch a page from index to noindex. You could also do this manually in the backend of pages on your site.
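Whether you use a plugin or edit templates by hand, the tag itself is a single line in the page’s head; for non-HTML files such as PDFs, the equivalent is an X-Robots-Tag: noindex HTTP response header:

```
<!-- Keeps the page out of search results while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```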
9. Set A Custom Crawl Rate
In the legacy version of Google Search Console, you could slow down or customize the speed of your crawl rate if Google’s spiders were negatively impacting your site.
This also gives your website time to make necessary changes if it is going through a significant redesign or migration.
10. Eliminate Duplicate Content
Having massive amounts of duplicate content can significantly slow down your crawl rate and eat up your crawl budget.
You can eliminate these problems by either blocking duplicate pages from being indexed or placing a canonical tag on the duplicates that points to the page you wish to be indexed.
Along the same lines, it pays to optimize the meta tags of each individual page to prevent search engines from mistaking similar pages for duplicate content in their crawl.
11. Block Pages You Don’t Want Spiders To Crawl
There may be instances where you want to keep a specific page out of search engines’ crawls or indexes. You can accomplish this using the following methods:
- Placing a noindex tag.
- Disallowing the URL in your robots.txt file (see the example below).
- Deleting the page altogether.
This can also help your crawls run more efficiently, instead of forcing search engines to pore over duplicate content.
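For the robots.txt method, a hypothetical rule set blocking a duplicate-prone section might look like the sketch below. Keep in mind that Disallow stops crawling rather than indexing, and Google no longer supports a noindex directive inside robots.txt, so use a meta noindex tag or page removal when a URL needs to disappear from results entirely:

```
# Hypothetical robots.txt rules; adjust the paths to your own site
User-agent: *
Disallow: /internal-search/
Disallow: /print/
```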
Conclusion
The state of your website’s crawlability will more or less depend on how well you’ve been staying current with your own SEO.
If you’re tinkering in the back end all the time, you may have identified these issues before they got out of hand and started affecting your rankings.
If you’re not sure, though, run a quick scan in Google Search Console to see how you’re doing.
The results can really be educational!
Featured Image: Ernie Janes/Shutterstock
Google Ads To Phase Out Enhanced CPC Bidding Strategy
Google has announced plans to discontinue its Enhanced Cost-Per-Click (eCPC) bidding strategy for search and display ad campaigns.
This change, set to roll out in stages over the coming months, marks the end of an era for one of Google’s earliest smart bidding options.
Dates & Changes
Starting October 2024, new search and display ad campaigns will no longer be able to select Enhanced CPC as a bidding strategy.
However, existing eCPC campaigns will continue to function normally until March 2025.
From March 2025, all remaining search and display ad campaigns using Enhanced CPC will be automatically migrated to manual CPC bidding.
Advertisers who prefer not to change their campaigns before this date will see their bidding strategy default to manual CPC.
Impact On Display Campaigns
No immediate action is required for advertisers running display campaigns with the Maximize Clicks strategy and Enhanced CPC enabled.
These campaigns will automatically transition to the Maximize Clicks bidding strategy in March 2025.
Rationale Behind The Change
Google introduced Enhanced CPC over a decade ago as its first Smart Bidding strategy. The company has since developed more advanced machine learning-driven bidding options, such as Maximize Conversions with an optional target CPA and Maximize Conversion Value with an optional target ROAS.
In an email to affected advertisers, Google stated:
“These strategies have the potential to deliver comparable or superior outcomes. As we transition to these improved strategies, search and display ads campaigns will phase out Enhanced CPC.”
What This Means For Advertisers
This update signals Google’s continued push towards more sophisticated, AI-driven bidding strategies.
In the coming months, advertisers currently relying on Enhanced CPC will need to evaluate their options and potentially adapt their campaign management approaches.
While the change may require some initial adjustments, it also allows advertisers to explore and leverage Google’s more advanced bidding strategies, potentially improving campaign performance and efficiency.
FAQ
What change is Google implementing for Enhanced CPC bidding?
Google will discontinue the Enhanced Cost-Per-Click (eCPC) bidding strategy for search and display ad campaigns.
- New search and display ad campaigns can’t select eCPC starting October 2024.
- Existing campaigns will function with eCPC until March 2025.
- From March 2025, remaining eCPC campaigns will switch to manual CPC bidding.
How will this update impact existing campaigns using Enhanced CPC?
Campaigns using Enhanced CPC will continue as usual until March 2025. After that:
- Search and display ad campaigns employing eCPC will automatically migrate to manual CPC bidding.
- Display campaigns with Maximize Clicks and eCPC enabled will transition to the Maximize Clicks strategy in March 2025.
What are the recommended alternatives to Enhanced CPC?
Google suggests using its more advanced, AI-driven bidding strategies:
- Maximize Conversions – Can include an optional target CPA (Cost Per Acquisition).
- Maximize Conversion Value – Can include an optional target ROAS (Return on Ad Spend).
Google expects these strategies to deliver comparable or superior outcomes to Enhanced CPC.
What should advertisers do in preparation for this change?
Advertisers need to evaluate their current reliance on Enhanced CPC and explore alternatives:
- Assess how newer AI-driven bidding strategies can be integrated into their campaigns.
- Consider transitioning some campaigns earlier to adapt to the new strategies gradually.
- Leverage tools and resources provided by Google to maximize performance and efficiency.
This proactive approach will help manage changes smoothly and explore potential performance improvements.
Featured Image: Vladimka production/Shutterstock
The 25 Biggest Traffic Losers in SaaS
We analyzed the organic traffic growth of 1,600 SaaS companies to discover the SEO strategies that work best in 2024…
…and those that work the worst.
In this article, we’re looking at the companies that lost the greatest amount of estimated organic traffic, year over year.
- We analyzed 1,600 SaaS companies and used the Ahrefs API to pull estimated monthly organic traffic data for August 2023 and August 2024.
- Companies were ranked by estimated monthly organic traffic loss as a percentage of their starting traffic.
- We’ve filtered out traffic loss caused by website migrations and URL redirects and set a minimum starting traffic threshold of 10,000 monthly organic pageviews.
This is a list of the SaaS companies that had the greatest estimated monthly organic traffic loss from August 2023 to August 2024.
Sidenote.
Our organic traffic metrics are estimates, and not necessarily reflective of the company’s actual traffic (only they know that). Traffic loss is not always bad, and there are plenty of reasons why companies may choose to delete pages and sacrifice keyword rankings.
Rank | Company | Change | Monthly Organic Traffic 2023 | Monthly Organic Traffic 2024 | Traffic Loss |
---|---|---|---|---|---|
1 | Causal | -99.52% | 307,158 | 1,485 | -305,673 |
2 | Contently | -97.16% | 276,885 | 7,866 | -269,019 |
3 | Datanyze | -95.46% | 486,626 | 22,077 | -464,549 |
4 | BetterCloud | -94.14% | 42,468 | 2,489 | -39,979 |
5 | Ricotta Trivia | -91.46% | 193,713 | 16,551 | -177,162 |
6 | Colourbox | -85.43% | 67,883 | 9,888 | -57,995 |
7 | Tabnine | -84.32% | 160,328 | 25,142 | -135,186 |
8 | AppFollow | -83.72% | 35,329 | 5,753 | -29,576 |
9 | Serverless | -80.61% | 37,896 | 7,348 | -30,548 |
10 | UserGuiding | -80.50% | 115,067 | 22,435 | -92,632 |
11 | Hopin | -79.25% | 19,581 | 4,064 | -15,517 |
12 | Writer | -78.32% | 2,460,359 | 533,288 | -1,927,071 |
13 | NeverBounce by ZoomInfo | -77.91% | 552,780 | 122,082 | -430,698 |
14 | ZoomInfo | -76.11% | 5,192,624 | 1,240,481 | -3,952,143 |
15 | Sakari | -73.76% | 27,084 | 7,106 | -19,978 |
16 | Frase | -71.39% | 83,569 | 23,907 | -59,662 |
17 | LiveAgent | -70.03% | 322,613 | 96,700 | -225,913 |
18 | Scoro | -70.01% | 51,701 | 15,505 | -36,196 |
19 | accessiBe | -69.45% | 111,877 | 34,177 | -77,700 |
20 | Olist | -67.51% | 204,298 | 66,386 | -137,912 |
21 | Hevo Data | -66.96% | 235,427 | 77,781 | -157,646 |
22 | TextGears | -66.68% | 19,679 | 6,558 | -13,121 |
23 | Unbabel | -66.40% | 45,987 | 15,450 | -30,537 |
24 | Courier | -66.03% | 35,300 | 11,992 | -23,308 |
25 | G2 | -65.74% | 4,397,226 | 1,506,545 | -2,890,681 |
For each of the top five companies, I ran a five-minute analysis using Ahrefs Site Explorer to understand what may have caused their traffic decline.
Possible explanations include Google penalties, programmatic SEO, and AI content.
Causal | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 307,158 | 1,485 | -305,673 | -99.52% |
Organic pages | 5,868 | 547 | -5,321 | -90.68% |
Organic keywords | 222,777 | 4,023 | -218,754 | -98.19% |
Keywords in top 3 | 8,969 | 26 | -8,943 | -99.71% |
Causal is a finance platform for startups. They lost an estimated 99.52% of their organic traffic as a result of a Google manual penalty:
This story might sound familiar. Causal became internet-famous for an “SEO heist” that saw them clone a competitor’s sitemap and use generative AI to publish 1,800 low-quality articles like this:
Google caught wind and promptly issued a manual penalty. Causal lost hundreds of rankings and hundreds of thousands of pageviews, virtually overnight:
As the Ahrefs SEO Toolbar shows, the offending blog posts are now 301 redirected to the company’s (now much better, much more human-looking) blog homepage:
Contently | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 276,885 | 7,866 | -269,019 | -97.16% |
Organic pages | 32,752 | 1,121 | -31,631 | -96.58% |
Organic keywords | 94,706 | 12,000 | -82,706 | -87.33% |
Keywords in top 3 | 1,874 | 68 | -1,806 | -96.37% |
Contently is a content marketing platform. They lost 97% of their estimated organic traffic by removing thousands of user-generated pages.
Almost all of the website’s traffic loss seems to stem from deindexing the subdomains used to host their members’ writing portfolios:
A quick Google search for “contently writer portfolios” suggests that the company made the deliberate decision to deindex all writer portfolios by default, and only relist them once they’ve been manually vetted and approved:
We can see that these portfolio subdomains are now 302 redirected back to Contently’s homepage:
And looking at the keyword rankings Contently lost in the process, it’s easy to guess why this change was necessary. It looks like the free portfolio subdomains were being abused to promote CBD gummies and pirated movies:
Datanyze | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 486,626 | 22,077 | -464,549 | -95.46% |
Organic pages | 1,168,889 | 377,142 | -791,747 | -67.74% |
Organic keywords | 2,565,527 | 712,270 | -1,853,257 | -72.24% |
Keywords in top 3 | 7,475 | 177 | -7,298 | -97.63% |
Datanyze provides contact data for sales prospecting. They lost 96% of their estimated organic traffic, possibly as a result of programmatic content that Google has since deemed too low quality to rank.
Looking at the Site Structure report in Ahrefs, we can see over 80% of the website’s organic traffic loss is isolated to the /companies and /people subfolders:
Looking at some of the pages in these subfolders, it looks like Datanyze built thousands of programmatic landing pages to help promote the people and companies the company offers data for:
As a result, the majority of Datanyze’s dropped keyword rankings are names of people and companies:
Many of these pages still return 200 HTTP status codes, and a Google site search still shows hundreds of indexed pages:
In this case, not all of the programmatic pages have been deleted—instead, it’s possible that Google has decided to rerank these pages into much lower positions and drop them from most SERPs.
BetterCloud | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 42,468 | 2,489 | -39,979 | -94.14% |
Organic pages | 1,643 | 504 | -1,139 | -69.32% |
Organic keywords | 107,817 | 5,806 | -102,011 | -94.61% |
Keywords in top 3 | 1,550 | 32 | -1,518 | -97.94% |
BetterCloud is a SaaS spend management platform. They lost 94% of their estimated organic traffic around the time of Google’s November Core Update:
Looking at the Top Pages report for BetterCloud, most of the traffic loss can be traced back to a now-deleted /academy subfolder:
The pages in the subfolder are now deleted, but by using Ahrefs’ Page Inspect feature, it’s possible to look at a snapshot of some of the pages’ HTML content.
This short, extremely generic article on “How to Delete an Unwanted Page in Google Docs” looks a lot like basic AI-generated content:
This is the type of content that Google has been keen to demote from the SERPs.
Given the timing of the website’s traffic drop (a small decline after the October core update, and a precipitous decline after the November core update), it’s possible that Google demoted the site after an AI content generation experiment.
Ricotta Trivia | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 193,713 | 16,551 | -177,162 | -91.46% |
Organic pages | 218 | 231 | 13 | 5.96% |
Organic keywords | 83,988 | 37,640 | -46,348 | -55.18% |
Keywords in top 3 | 3,124 | 275 | -2,849 | -91.20% |
Ricotta Trivia is a Slack add-on that offers icebreakers and team-building games. They lost an estimated 91% of their monthly organic traffic, possibly because of thin content and poor on-page experience on their blog.
Looking at the Site Structure report, 99.7% of the company’s traffic loss is isolated to the /blog subfolder:
Digging into the Organic keywords report, we can see that the website has lost hundreds of first-page rankings for high-volume keywords like get to know you questions, funny team names, and question of the day:
While these keywords seem strongly related to the company’s core business, the article content itself seems very thin—and the page is covered with intrusive advertising banners and pop-ups (a common hypothesis for why some sites were negatively impacted by recent Google updates):
The site seems to show a small recovery on the back of the August 2024 core update—so there may be hope yet.
Final thoughts
All of the data for this article comes from Ahrefs. Want to research your competitors in the same way? Check out Site Explorer.
Mediavine Bans Publisher For Overuse Of AI-Generated Content
According to details surfacing online, ad management firm Mediavine is terminating publishers’ accounts for overusing AI.
Mediavine is a leading ad management company providing products and services to help website publishers monetize their content.
The company holds elite status as a Google Certified Publishing Partner, which indicates that it meets Google’s highest standards and requirements for ad networks and exchanges.
AI Content Triggers Account Terminations
The terminations came to light in a post on the Reddit forum r/Blogging, where a user shared an email they received from Mediavine citing “overuse of artificially created content.”
Trista Jensen, Mediavine’s Director of Ad Operations & Market Quality, states in the email:
“Our third party content quality tools have flagged your sites for overuse of artificially created content. Further internal investigation has confirmed those findings.”
Jensen stated that due to the overuse of AI content, “our top partners will stop spending on your sites, which will negatively affect future monetization efforts.”
Consequently, Mediavine terminated the publisher’s account “effective immediately.”
The Risks Of Low-Quality AI Content
This strict enforcement aligns with Mediavine’s publicly stated policy prohibiting websites from using “low-quality, mass-produced, unedited or undisclosed AI content that is scraped from other websites.”
In a March 7 blog post titled “AI and Our Commitment to a Creator-First Future,” the company declared opposition to low-value AI content that could “devalue the contributions of legitimate content creators.”
Mediavine warned in the post:
“Without publishers, there is no open web. There is no content to train the models that power AI. There is no internet.”
The company says it’s using its platform to “advocate for publishers” and uphold quality standards in the face of AI’s disruptive potential.
Mediavine states:
“We’re also developing faster, automated tools to help us identify low-quality, mass-produced AI content across the web.”
Targeting ‘AI Clickbait Kingpin’ Tactics
While the Reddit user’s identity wasn’t disclosed, the incident has drawn connections to the tactics of Nebojša Vujinović Vujo, who was dubbed an “AI Clickbait Kingpin” in a recent Wired exposé.
According to Wired, Vujo acquired over 2,000 dormant domains and populated them with AI-generated, search-optimized content designed purely to capture ad revenue.
His strategies represent the low-quality, artificial content Mediavine has vowed to prohibit.
Potential Implications
Lost Revenue
Mediavine’s terminations highlight potential implications for publishers that rely on artificial intelligence to generate website content at scale.
Perhaps the most immediate and tangible implication is the risk of losing ad revenue.
For publishers that depend heavily on programmatic advertising or sponsored content deals as key revenue drivers, being blocked from major ad networks could devastate their business models.
Devalued Domains
Another potential impact is the devaluation of domains and websites built primarily on AI-generated content.
If this pattern of AI content overuse triggers account terminations from companies like Mediavine, it could drastically diminish the value proposition of scooping up these domains.
Damaged Reputations & Brands
Beyond the lost monetization opportunities, publishers leaning too heavily into automated AI content also risk permanent reputational damage to their brands.
Once a platform like Mediavine flags a website for AI overuse, it could impact how that site is perceived by readers, other industry partners, and search engines.
In Summary
AI has value as an assistive tool for publishers, but relying heavily on automated content creation poses significant risks.
These include monetization challenges, potential reputation damage, and increasing regulatory scrutiny. Mediavine’s strict policy illustrates the possible consequences for publishers.
It’s important to note that Mediavine’s move to terminate publisher accounts over AI content overuse represents an independent policy stance taken by the ad management firm itself.
The action doesn’t directly reflect the content policies or enforcement positions of Google, whose publishing partner program Mediavine is certified under.
We have reached out to Mediavine requesting a comment on this story. We’ll update this article with more information when it’s provided.
Featured Image: Simple Line/Shutterstock