How to Create SEO SOPs to Scale Organic Traffic
Standard operating procedures make your work faster, easier, and more scalable. This is particularly true in the case of SEO, where a lot of the tasks are simple and repeatable.
Things like adding meta tags, maintaining a proper URL structure, adding internal links, and optimizing your images are all easy to do… but also easy to forget.
Recording SEO tasks in SOP documents means they will never be forgotten—and, once documented, you can easily delegate these simple tasks to others in your team.
If you want to grow your organic traffic while working less, you want SEO SOPs.
Let’s dive into it.
A standard operating procedure (SOP) is a document that outlines how a task is done step by step. It often includes screenshots to visually show what you’re explaining, but the best SOPs include videos as well.
I typically make my SOPs in a Google Doc with screenshots. Then I use Loom to record my screen and create video explanations.
SEO SOPs allow you to:
- Never forget important SEO steps when publishing content.
- Hire an affordable assistant to have simple tasks taken off your plate.
- Scale your business in ways that aren’t possible without standardization.
McDonald’s was one of the first businesses to master standardization. And its burger line was much more complex than simply creating a document. You, too, can harness the power of a standardized process to increase output and efficiency.
There are a lot of processes in business that can benefit from SOPs. However, the five most critical SEO tasks that should have SOPs include:
- Content creation and on-page SEO.
- Internal linking procedures.
- Image optimization.
- Email outreach for link building.
- Tracking your rankings and making updates.
Below, I break down each one and give you example templates you can copy and mold to fit your business.
1. Content creation and on-page SEO
Content is the backbone of any good SEO strategy. However, creating great content takes time and effort. If you want to cut down on how long it takes, an SOP will do that.
At Ahrefs, our SOP for content includes these eight steps:
- Create a home for the post in the shared drive
- Fill in/update the Notion card
- Assign the header illustration
- Write the content outline
- Write the draft
- Edit the draft
- Assign custom images and screenshot annotations
- Prepare for editing + upload
Of course, these eight steps are specific to our systems and styles. We use Google Drive and Notion to keep track of everything, have a header illustration for each article, and have a rigorous editing process to ensure high-quality content.
Here’s what that document looks like:
As you can see, it is a well-documented and easy-to-follow process with examples and links.
Within this document, we also link to our writing guidelines SOP with even more detailed information on our actual writing process.
This includes our style and writing guidelines with actual examples…
… as well as screenshots to show, not just tell.
And lastly, we have SOPs in place to cover on-page SEO, which includes how to properly name images, add image alt text, and set the metadata for each page.
By creating these simple documents, you can make content creation much easier and quicker.
2. Internal linking procedures
Internal linking is crucial for ranking highly on Google. Every single page on your site, aside from perhaps landing pages, should have internal links. This is especially true for blog content.
Your SOP can look something like this:
- You should aim to include a relevant internal link anytime it can benefit the reader.
- Internal link anchor text should be relevant to the content you’re writing about and the content you’re linking to. For example, DO link to “RV accessories” from “RV water pump buyers guide.” DO NOT link to “van life builds” from “how to keep your RV cool in the summer.” Keep it related.
- Use Ahrefs’ Internal Link Opportunities tool in Site Audit to do the work. Alternatively, you can find internal link opportunities by performing a Google search for
site:[yoursitehere.com] “related keyword”
to see all the content that contains that related keyword or phrase.
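If you’d rather script that second approach, here’s a minimal sketch that approximates the site: search by scanning your own sitemap for pages that already mention the phrase you want to link from. The sitemap URL, phrase, and helper names are placeholders; adjust them to your site.

```python
# Minimal sketch: find internal link opportunities by scanning your own
# sitemap for pages whose text mentions a target phrase. The sitemap URL
# and phrase below are placeholders.
import re
import xml.etree.ElementTree as ET

import requests

SITEMAP_URL = "https://yoursitehere.com/sitemap.xml"  # placeholder
PHRASE = "rv accessories"                              # placeholder

def sitemap_urls(sitemap_url: str) -> list[str]:
    """Return every <loc> URL listed in a standard XML sitemap."""
    xml = requests.get(sitemap_url, timeout=30).text
    root = ET.fromstring(xml)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text for loc in root.findall(".//sm:loc", ns)]

def mentions_phrase(url: str, phrase: str) -> bool:
    """Fetch a page, crudely strip the HTML tags, and look for the phrase."""
    html = requests.get(url, timeout=30).text
    text = re.sub(r"<[^>]+>", " ", html).lower()
    return phrase.lower() in text

if __name__ == "__main__":
    for url in sitemap_urls(SITEMAP_URL):
        if mentions_phrase(url, PHRASE):
            print(f"Possible internal link source: {url}")
```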
3. Image optimization
Optimizing your images for Google (and for users) is often overlooked. However, it’s easy to do and also important if you want to reach that coveted first page.
There are three steps to ensuring good image optimization:
- Having a proper title that describes the image (without keyword stuffing)
- Adding alt text that describes the image in a bit more detail for those who can’t download and view the image
- Using the proper file format and reducing the overall data size of the image for faster load speeds
Follow our guide to image SEO for more information.
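If you want to make the compression step repeatable, here’s a minimal sketch using the Pillow library. The filenames and quality setting are examples only, not recommendations from the guide.

```python
# Minimal sketch: re-encode an image as WebP to cut its file size.
# Requires the Pillow library (pip install Pillow); values are examples.
from pathlib import Path

from PIL import Image

def compress_to_webp(src: str, quality: int = 80) -> Path:
    """Save a WebP copy of an image and report the size saving."""
    src_path = Path(src)
    dst_path = src_path.with_suffix(".webp")
    with Image.open(src_path) as img:
        img.save(dst_path, "WEBP", quality=quality)
    before = src_path.stat().st_size
    after = dst_path.stat().st_size
    print(f"{src_path.name}: {before:,} bytes -> {after:,} bytes "
          f"({100 * (before - after) / before:.0f}% smaller)")
    return dst_path

if __name__ == "__main__":
    # A descriptive, non-keyword-stuffed filename doubles as good input,
    # e.g. "rv-water-pump-installation.jpg" (placeholder).
    compress_to_webp("rv-water-pump-installation.jpg")
```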
4. Email outreach for link building
Link building is key to a good SEO strategy—links are one of the most important Google ranking factors.
While building backlinks isn’t exactly easy, it does involve a lot of repeatable steps. This makes it a perfect SOP task. In fact, you’ll probably want multiple SOPs: one for each link building strategy you use.
Each of these procedures is different. So either turn your current process into a document or follow one of our link building guides and create a document for it.
5. Tracking your rankings and making updates
Finally, we have something almost all SEOs love: watching your rankings go up (hopefully).
While it’s easy enough to spontaneously check your Ahrefs or Google Search Console account to see whether your efforts are moving your rankings, there’s a better way.
It still involves checking your Ahrefs account. But instead of doing it with random excitement like a kid in a candy store, you use a methodical approach that tracks your changes and their effects. After all, a lot of SEO is trial and error.
First, if you haven’t already, sign up for Ahrefs’ Rank Tracker. Our reports will show you your ranking changes over time with visualized data through charts and competitor reports.
You can track specific keywords and pages over time as well:
Once you have your account, you can make an SOP to check these on a daily, weekly, or monthly basis. Keep track of any changes you make to your pages, such as metadata, adding or changing content, improving your internal links, etc. Then document how these changes impact your rankings over time.
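One low-tech way to make the change log part of the SOP is a tiny script anyone on the team can run. The sketch below just appends each change to a CSV file; the file name, columns, and example values are my own, so adapt them to whatever your team already uses.

```python
# Minimal sketch: append each page change to a CSV log so it can later be
# lined up against the ranking history you export from Rank Tracker.
# The file name and column names are illustrative, not a required format.
import csv
from datetime import date
from pathlib import Path

LOG_FILE = Path("seo-change-log.csv")  # hypothetical location
FIELDS = ["date", "url", "change_type", "notes"]

def log_change(url: str, change_type: str, notes: str = "") -> None:
    """Append one row describing a change made to a page."""
    new_file = not LOG_FILE.exists()
    with LOG_FILE.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "url": url,
            "change_type": change_type,
            "notes": notes,
        })

if __name__ == "__main__":
    log_change(
        "https://yoursitehere.com/blog/example-post",  # placeholder URL
        "metadata",
        "Rewrote title tag and meta description",
    )
```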
By making a habit of performing SEO tests like these and tracking the effects, you can see what works and what doesn’t—then scale up what works and stop wasting time on what doesn’t.
Final thoughts
By documenting your SEO processes in the form of SOPs, you can more easily scale up your business and hire others to perform the easier tasks.
If you want your business (and your organic traffic) to grow, having SOPs is one of the surest ways to do it. Learn what works, document the process, and scale it up. These five SEO tasks are just the beginning—you can have an SOP for every repeatable task in your business.
The 25 Biggest Traffic Losers in SaaS
We analyzed the organic traffic growth of 1,600 SaaS companies to discover the SEO strategies that work best in 2024…
…and those that work the worst.
In this article, we’re looking at the companies that lost the greatest amount of estimated organic traffic, year over year.
- We analyzed 1,600 SaaS companies and used the Ahrefs API to pull estimated monthly organic traffic data for August 2023 and August 2024.
- Companies were ranked by estimated monthly organic traffic loss as a percentage of their starting traffic.
- We’ve filtered out traffic loss caused by website migrations and URL redirects and set a minimum starting traffic threshold of 10,000 monthly organic pageviews.
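For the curious, here’s a rough sketch of that ranking logic. The real traffic numbers came from the Ahrefs API; the dictionaries, field names, and example figures below are stand-ins for illustration only.

```python
# Rank companies by estimated traffic loss as a percentage of starting
# traffic, after filtering out anything below the 10,000-pageview floor.
# The company data here is made up; real data came from the Ahrefs API.
MIN_STARTING_TRAFFIC = 10_000

companies = [
    {"name": "Example SaaS A", "traffic_2023": 120_000, "traffic_2024": 30_000},
    {"name": "Example SaaS B", "traffic_2023": 95_000, "traffic_2024": 80_000},
    {"name": "Example SaaS C", "traffic_2023": 8_000, "traffic_2024": 2_000},
]

def percent_change(before: int, after: int) -> float:
    """Traffic change as a percentage of starting traffic (negative = loss)."""
    return (after - before) / before * 100

ranked = sorted(
    (c for c in companies if c["traffic_2023"] >= MIN_STARTING_TRAFFIC),
    key=lambda c: percent_change(c["traffic_2023"], c["traffic_2024"]),
)

for rank, c in enumerate(ranked, start=1):
    change = percent_change(c["traffic_2023"], c["traffic_2024"])
    loss = c["traffic_2024"] - c["traffic_2023"]
    print(f"{rank}. {c['name']}: {change:.2f}% ({loss:,})")
```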
This is a list of the SaaS companies that had the greatest estimated monthly organic traffic loss from August 2023 to August 2024.
Sidenote.
Our organic traffic metrics are estimates, and not necessarily reflective of the company’s actual traffic (only they know that). Traffic loss is not always bad, and there are plenty of reasons why companies may choose to delete pages and sacrifice keyword rankings.
Rank | Company | Change (%) | Monthly Organic Traffic 2023 | Monthly Organic Traffic 2024 | Traffic Loss
---|---|---|---|---|---
1 | Causal | -99.52% | 307,158 | 1,485 | -305,673 |
2 | Contently | -97.16% | 276,885 | 7,866 | -269,019 |
3 | Datanyze | -95.46% | 486,626 | 22,077 | -464,549 |
4 | BetterCloud | -94.14% | 42,468 | 2,489 | -39,979 |
5 | Ricotta Trivia | -91.46% | 193,713 | 16,551 | -177,162 |
6 | Colourbox | -85.43% | 67,883 | 9,888 | -57,995 |
7 | Tabnine | -84.32% | 160,328 | 25,142 | -135,186 |
8 | AppFollow | -83.72% | 35,329 | 5,753 | -29,576 |
9 | Serverless | -80.61% | 37,896 | 7,348 | -30,548 |
10 | UserGuiding | -80.50% | 115,067 | 22,435 | -92,632 |
11 | Hopin | -79.25% | 19,581 | 4,064 | -15,517 |
12 | Writer | -78.32% | 2,460,359 | 533,288 | -1,927,071 |
13 | NeverBounce by ZoomInfo | -77.91% | 552,780 | 122,082 | -430,698 |
14 | ZoomInfo | -76.11% | 5,192,624 | 1,240,481 | -3,952,143 |
15 | Sakari | -73.76% | 27,084 | 7,106 | -19,978 |
16 | Frase | -71.39% | 83,569 | 23,907 | -59,662 |
17 | LiveAgent | -70.03% | 322,613 | 96,700 | -225,913 |
18 | Scoro | -70.01% | 51,701 | 15,505 | -36,196 |
19 | accessiBe | -69.45% | 111,877 | 34,177 | -77,700 |
20 | Olist | -67.51% | 204,298 | 66,386 | -137,912 |
21 | Hevo Data | -66.96% | 235,427 | 77,781 | -157,646 |
22 | TextGears | -66.68% | 19,679 | 6,558 | -13,121 |
23 | Unbabel | -66.40% | 45,987 | 15,450 | -30,537 |
24 | Courier | -66.03% | 35,300 | 11,992 | -23,308 |
25 | G2 | -65.74% | 4,397,226 | 1,506,545 | -2,890,681 |
For each of the top five companies, I ran a five-minute analysis using Ahrefs Site Explorer to understand what may have caused their traffic decline.
Possible explanations include Google penalties, programmatic SEO, and AI content.
Causal | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 307,158 | 1,485 | -305,673 | -99.52% |
Organic pages | 5,868 | 547 | -5,321 | -90.68% |
Organic keywords | 222,777 | 4,023 | -218,754 | -98.19% |
Keywords in top 3 | 8,969 | 26 | -8,943 | -99.71%
Causal is a finance platform for startups. They lost an estimated 99.52% of their organic traffic as a result of a Google manual penalty:
This story might sound familiar. Causal became internet-famous for an “SEO heist” that saw them clone a competitor’s sitemap and use generative AI to publish 1,800 low-quality articles like this:
Google caught wind and promptly issued a manual penalty. Causal lost hundreds of rankings and hundreds of thousands of pageviews, virtually overnight:
As the Ahrefs SEO Toolbar shows, the offending blog posts are now 301 redirected to the company’s (now much better, much more human-looking) blog homepage:
Contently | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 276,885 | 7,866 | -269,019 | -97.16% |
Organic pages | 32,752 | 1,121 | -31,631 | -96.58% |
Organic keywords | 94,706 | 12,000 | -82,706 | -87.33% |
Keywords in top 3 | 1,874 | 68 | -1,806 | -96.37% |
Contently is a content marketing platform. They lost 97% of their estimated organic traffic by removing thousands of user-generated pages.
Almost all of the website’s traffic loss seems to stem from deindexing the subdomains used to host their members’ writing portfolios:
A quick Google search for “contently writer portfolios” suggests that the company made the deliberate decision to deindex all writer portfolios by default, and only relist them once they’ve been manually vetted and approved:
We can see that these portfolio subdomains are now 302 redirected back to Contently’s homepage:
And looking at the keyword rankings Contently lost in the process, it’s easy to guess why this change was necessary. It looks like the free portfolio subdomains were being abused to promote CBD gummies and pirated movies:
Datanyze | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 486,626 | 22,077 | -464,549 | -95.46% |
Organic pages | 1,168,889 | 377,142 | -791,747 | -67.74% |
Organic keywords | 2,565,527 | 712,270 | -1,853,257 | -72.24% |
Keywords in top 3 | 7,475 | 177 | -7,298 | -97.63% |
Datanyze provides contact data for sales prospecting. They lost 96% of their estimated organic traffic, possibly as a result of programmatic content that Google has since deemed too low quality to rank.
Looking at the Site Structure report in Ahrefs, we can see over 80% of the website’s organic traffic loss is isolated to the /companies and /people subfolders:
Looking at some of the pages in these subfolders, it looks like Datanyze built thousands of programmatic landing pages to help promote the people and companies the company offers data for:
As a result, the majority of Datanyze’s dropped keyword rankings are names of people and companies:
Many of these pages still return 200 HTTP status codes, and a Google site search still shows hundreds of indexed pages:
In this case, not all of the programmatic pages have been deleted—instead, it’s possible that Google has decided to rerank these pages into much lower positions and drop them from most SERPs.
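If you want to run the same status-code spot check on pages from your own site, a few lines of Python will do it. The URLs below are placeholders, not actual Datanyze pages.

```python
# Minimal sketch: request each URL (without following redirects) and print
# its HTTP status code. Replace the placeholder URLs with your own pages.
import requests

urls = [
    "https://example.com/companies/some-company",
    "https://example.com/people/some-person",
]

for url in urls:
    try:
        resp = requests.get(url, allow_redirects=False, timeout=30)
        print(f"{resp.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"ERR  {url}  ({exc})")
```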
BetterCloud | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 42,468 | 2,489 | -39,979 | -94.14% |
Organic pages | 1,643 | 504 | -1,139 | -69.32% |
Organic keywords | 107,817 | 5,806 | -102,011 | -94.61% |
Keywords in top 3 | 1,550 | 32 | -1,518 | -97.94% |
BetterCloud is a SaaS spend management platform. They lost 94% of their estimated organic traffic around the time of Google’s November core update:
Looking at the Top Pages report for BetterCloud, most of the traffic loss can be traced back to a now-deleted /academy subfolder:
The pages in the subfolder are now deleted, but by using Ahrefs’ Page Inspect feature, it’s possible to look at a snapshot of some of the pages’ HTML content.
This short, extremely generic article on “How to Delete an Unwanted Page in Google Docs” looks a lot like basic AI-generated content:
This is the type of content that Google has been keen to demote from the SERPs.
Given the timing of the website’s traffic drop (a small decline after the October core update, and a precipitous decline after the November core update), it’s possible that Google demoted the site after an AI content generation experiment.
Ricotta Trivia | 2023 | 2024 | Absolute change | Percent change |
---|---|---|---|---|
Organic traffic | 193,713 | 16,551 | -177,162 | -91.46% |
Organic pages | 218 | 231 | 13 | 5.96% |
Organic keywords | 83,988 | 37,640 | -46,348 | -55.18% |
Keywords in top 3 | 3,124 | 275 | -2,849 | -91.20% |
Ricotta Trivia is a Slack add-on that offers icebreakers and team-building games. They lost an estimated 91% of their monthly organic traffic, possibly because of thin content and poor on-page experience on their blog.
Looking at the Site Structure report, 99.7% of the company’s traffic loss is isolated to the /blog subfolder:
Digging into the Organic keywords report, we can see that the website has lost hundreds of first-page rankings for high-volume keywords like get to know you questions, funny team names, and question of the day:
While these keywords seem strongly related to the company’s core business, the article content itself seems very thin—and the page is covered with intrusive advertising banners and pop-ups (a common hypothesis for why some sites were negatively impacted by recent Google updates):
The site seems to show a small recovery on the back of the August 2024 core update—so there may be hope yet.
Final thoughts
All of the data for this article comes from Ahrefs. Want to research your competitors in the same way? Check out Site Explorer.
Mediavine Bans Publisher For Overuse Of AI-Generated Content
According to details surfacing online, ad management firm Mediavine is terminating publishers’ accounts for overusing AI.
Mediavine is a leading ad management company providing products and services to help website publishers monetize their content.
The company holds elite status as a Google Certified Publishing Partner, which indicates that it meets Google’s highest standards and requirements for ad networks and exchanges.
AI Content Triggers Account Terminations
The terminations came to light in a post on the Reddit forum r/Blogging, where a user shared an email they received from Mediavine citing “overuse of artificially created content.”
Trista Jensen, Mediavine’s Director of Ad Operations & Market Quality, states in the email:
“Our third party content quality tools have flagged your sites for overuse of artificially created content. Further internal investigation has confirmed those findings.”
Jensen stated that due to the overuse of AI content, “our top partners will stop spending on your sites, which will negatively affect future monetization efforts.”
Consequently, Mediavine terminated the publisher’s account “effective immediately.”
The Risks Of Low-Quality AI Content
This strict enforcement aligns with Mediavine’s publicly stated policy prohibiting websites from using “low-quality, mass-produced, unedited or undisclosed AI content that is scraped from other websites.”
In a March 7 blog post titled “AI and Our Commitment to a Creator-First Future,” the company declared opposition to low-value AI content that could “devalue the contributions of legitimate content creators.”
Mediavine warned in the post:
“Without publishers, there is no open web. There is no content to train the models that power AI. There is no internet.”
The company says it’s using its platform to “advocate for publishers” and uphold quality standards in the face of AI’s disruptive potential.
Mediavine states:
“We’re also developing faster, automated tools to help us identify low-quality, mass-produced AI content across the web.”
Targeting ‘AI Clickbait Kingpin’ Tactics
While the Reddit user’s identity wasn’t disclosed, the incident has drawn connections to the tactics of Nebojša Vujinović Vujo, who was dubbed an “AI Clickbait Kingpin” in a recent Wired exposé.
According to Wired, Vujo acquired over 2,000 dormant domains and populated them with AI-generated, search-optimized content designed purely to capture ad revenue.
His strategies represent the low-quality, artificial content Mediavine has vowed to prohibit.
Potential Implications
Lost Revenue
Mediavine’s terminations highlight potential implications for publishers that rely on artificial intelligence to generate website content at scale.
Perhaps the most immediate and tangible implication is the risk of losing ad revenue.
For publishers that depend heavily on programmatic advertising or sponsored content deals as key revenue drivers, being blocked from major ad networks could devastate their business models.
Devalued Domains
Another potential impact is the devaluation of domains and websites built primarily on AI-generated content.
If this pattern of AI content overuse triggers account terminations from companies like Mediavine, it could drastically diminish the value proposition of scooping up these domains.
Damaged Reputations & Brands
Beyond the lost monetization opportunities, publishers leaning too heavily into automated AI content also risk permanent reputational damage to their brands.
Once a determining authority flags a website for AI overuse, it could impact how that site is perceived by readers, other industry partners, and search engines.
In Summary
AI has value as an assistive tool for publishers, but relying heavily on automated content creation poses significant risks.
These include monetization challenges, potential reputation damage, and increasing regulatory scrutiny. Mediavine’s strict policy illustrates the possible consequences for publishers.
It’s important to note that Mediavine’s move to terminate publisher accounts over AI content overuse represents an independent policy stance taken by the ad management firm itself.
The action doesn’t directly reflect the content policies or enforcement positions of Google, whose publishing partner program Mediavine is certified under.
We have reached out to Mediavine requesting a comment on this story. We’ll update this article with more information when it’s provided.
Featured Image: Simple Line/Shutterstock
Google’s Guidance About The Recent Ranking Update
Google’s Danny Sullivan explained the recent update, addressing site recoveries and cautioning against making radical changes to improve rankings. He also offered advice for publishers whose rankings didn’t improve after the last update.
Google’s Still Improving The Algorithm
Danny said that Google is still working on its ranking algorithm, indicating that more changes (for the positive) are likely on the way. The main idea he was getting across is that they’re still trying to fill the gaps in surfacing high-quality content from independent sites, which is good because big-brand sites don’t necessarily have the best answers.
He wrote:
“…the work to connect people with “a range of high quality sites, including small or independent sites that are creating useful, original content” is not done with this latest update. We’re continuing to look at this area and how to improve further with future updates.”
A Message To Those Who Were Left Behind
There was a message to those publishers whose work failed to recover with the latest update, to let them know that Google is still working to surface more of the independent content and that there may be relief on the next go.
Danny advised:
“…if you’re feeling confused about what to do in terms of rankings…if you know you’re producing great content for your readers…If you know you’re producing it, keep doing that…it’s to us to keep working on our systems to better reward it.”
Google Cautions Against “Improving” Sites
Something really interesting he mentioned was a caution against trying to improve the rankings of something that’s already on page one in order to rank even higher. Tweaking a site to move from, say, position six to something higher has always been risky, for many reasons I won’t elaborate on here. But Danny’s warning increases the pressure to not just think twice before trying to optimize a page for search engines but to think three times and then some more.
Danny cautioned that sites that make it to the top of the SERPs should consider that a win and let it ride instead of making changes right now to improve their rankings. The reason for that caution is that the search results continue to change, and the implication is that changing a site now may negatively impact its rankings in a newly updated search index.
He wrote:
“If you’re showing in the top results for queries, that’s generally a sign that we really view your content well. Sometimes people then wonder how to move up a place or two. Rankings can and do change naturally over time. We recommend against making radical changes to try and move up a spot or two”
How Google Handled Feedback
There was also some light shed on what Google did with all the feedback it received from publishers who lost rankings. Danny wrote that the feedback and site examples he received were summarized and sent to the search engineers for review. They continue to use that feedback for the next round of improvements.
He explained:
“I went through it all, by hand, to ensure all the sites who submitted were indeed heard. You were, and you continue to be. …I summarized all that feedback, pulling out some of the compelling examples of where our systems could do a better job, especially in terms of rewarding open web creators. Our search engineers have reviewed it and continue to review it, along with other feedback we receive, to see how we can make search better for everyone, including creators.”
Feedback Itself Didn’t Lead To Recovery
Danny also pointed out that sites that recovered their rankings did not do so because they submitted feedback to Google. Danny wasn’t specific about this point, but it’s consistent with previous statements about Google’s algorithms: fixes are implemented at scale. So instead of saying, “Hey, let’s fix the rankings of this one site,” it’s more about figuring out whether the problem is symptomatic of something widespread and how to change things for everybody with the same problem.
Danny wrote:
“No one who submitted, by the way, got some type of recovery in Search because they submitted. Our systems don’t work that way.”
That the feedback didn’t lead to recovery but was used as data shouldn’t be surprising. Even as far back as the 2004 Florida update, Matt Cutts collected feedback from people, including me, and I didn’t see a recovery for a false positive until everyone else also got their rankings back.
Takeaways
Google’s work on their algorithm is ongoing:
Google is continuing to tune its algorithms to improve its ability to rank high quality content, especially from smaller publishers. Danny Sullivan emphasized that this is an ongoing process.
What content creators should focus on:
Danny’s statement encouraged publishers to focus on consistently creating high quality content and not to focus on optimizing for algorithms. Focusing on quality should be the priority.
What should publishers do if their high-quality content isn’t yet rewarded with better rankings?
Publishers who are certain of the quality of their content are encouraged to hold steady and keep it coming because Google’s algorithms are still being refined.
Featured Image by Shutterstock/Cast Of Thousands