How & Why To Prevent Bots From Crawling Your Site

For the most part, bots and spiders are relatively harmless.

You want Google’s bot, for example, to crawl and index your website.

However, bots and spiders can sometimes be a problem and provide unwanted traffic.

This kind of unwanted traffic can result in:

  • Obfuscation of where the traffic is coming from.
  • Confusing, hard-to-understand reports.
  • Misattribution in Google Analytics.
  • Increased bandwidth costs.
  • Other nuisances.

There are good bots and bad bots.

Good bots run in the background and rarely interfere with other users or websites.

Bad bots break the security behind a website or are conscripted into large-scale botnets to deliver DDoS attacks against large organizations (something a single machine cannot take down).

Here’s what you should know about bots and how to prevent the bad ones from crawling your site.

What Is A Bot?

Looking at exactly what a bot is can help identify why we need to block it and keep it from crawling our site.

A bot, short for “robot,” is a software application designed to perform a specific task repeatedly.

For many SEO professionals, utilizing bots goes along with scaling an SEO campaign.

“Scaling” means you automate as much work as possible to get better results faster.

Common Misconceptions About Bots

You may have run into the misconception that all bots are evil and must be banned unequivocally from your site.

But this could not be further from the truth.

Google is a bot.

If you block Google, can you guess what will happen to your search engine rankings?

Some bots can be malicious, designed to create fake content or pose as legitimate websites to steal your data.

However, bots are not always malicious scripts run by bad actors.

Some can be great tools that help make work easier for SEO professionals, such as automating common repetitive tasks or scraping useful information from search engines.

Some common bots SEO professionals use are Semrush and Ahrefs.

These bots scrape useful data from the search engines, help SEO pros automate and complete tasks, and can help make your job easier when it comes to SEO tasks.

Why Would You Need to Block Bots From Crawling Your Site?

While there are many good bots, there are also bad bots.

Bad bots can help steal your private data or take down an otherwise operating website.

We want to block any bad bots we can uncover.

It’s not easy to discover every bot that may crawl your site, but with a little digging, you can find malicious ones you don’t want visiting your site anymore.

So why would you need to block bots from crawling your website?

Some common reasons why you may want to block bots from crawling your site could include:

Protecting Your Valuable Data

Perhaps you found that a plugin is attracting a number of malicious bots that want to steal your valuable consumer data.

Or, you found that a bot took advantage of a security vulnerability to add bad links all over your site.

Or, someone keeps trying to spam your contact form with a bot.

This is where you need to take certain steps to protect your valuable data from getting compromised by a bot.

Bandwidth Overages

If you get an influx of bot traffic, chances are your bandwidth will skyrocket as well, leading to unforeseen overages and charges you would rather not have.

You absolutely want to block the offending bots from crawling your site in these cases.

You don’t want a situation where you’re paying thousands of dollars for bandwidth consumed by bots you never invited.

What’s bandwidth?

Bandwidth is the transfer of data from your server to the client-side (web browser).

Every time data is sent over a connection, you use bandwidth.

When bots access your site and you waste bandwidth, you could incur overage charges from exceeding your monthly allotted bandwidth.

Your host should have given you detailed bandwidth information when you signed up for your hosting package.

Limiting Bad Behavior

If a malicious bot somehow started targeting your site, it would be appropriate to take steps to control this.

For example, you would want to ensure that this bot cannot access your contact forms, and ideally that it cannot reach your site at all.

Do this before the bot can compromise your most critical files.

By ensuring your site is properly locked down and secure, it is possible to block these bots so they don’t cause too much damage.

How To Block Bots From Your Site Effectively

You can use two methods to block bots from your site effectively.

The first is through robots.txt.

This is a file that sits at the root of your web server. You may not have one by default, so you would have to create one.

These are a few highly useful robots.txt directives you can use to block most spiders and bots from your site:

Disallow Googlebot From Your Server

If, for some reason, you want to stop Googlebot from crawling your server at all, the following is the code you would use:

User-agent: Googlebot
Disallow: /

You only want to use this code if you are sure you want to keep your site from being crawled at all. (Note that a URL disallowed in robots.txt can still be indexed, without its content, if other sites link to it.)

Don’t use this on a whim!

Have a specific reason for making sure you don’t want bots crawling your site at all.

For example, a common issue is wanting to keep your staging site out of the index.

You don’t want Google crawling both the staging site and your real site, because that doubles up your content and creates duplicate content issues.

Disallowing All Bots From Your Server

If you want to keep all bots from crawling your site, the following is the code you will want to use:

User-agent: *
Disallow: /

This is the code to disallow all bots. Remember our staging site example from above?

Perhaps you want to exclude the staging site from all bots before fully deploying your site to all of them.

Or perhaps you want to keep your site private for a time before launching it to the world.

Either way, this will keep your site hidden from compliant crawlers, though it won’t stop bad actors that ignore robots.txt.

Keeping Bots From Crawling a Specific Folder

If, for some reason, you want to keep bots from crawling a specific folder, you can do that too.

The following is the code you would use:

User-agent: *
Disallow: /folder-name/

There are many reasons someone would want to exclude bots from a folder. Perhaps you want to ensure that certain content on your site isn’t indexed.

Or maybe that particular folder will cause certain types of duplicate content issues, and you want to exclude it from crawling entirely.

Either way, this will help you do that.
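
As a refinement, Google and Bing also honor an Allow directive, which lets you block a folder while still permitting one file inside it. A minimal sketch, using hypothetical folder and file names:

User-agent: *
Disallow: /private-folder/
Allow: /private-folder/public-page.html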

Common Mistakes With Robots.txt

There are several mistakes that SEO professionals make with robots.txt. The most common include:

  • Using both disallow in robots.txt and noindex.
  • Using the forward slash / (all folders down from root), when you really mean a specific URL.
  • Not including the correct path.
  • Not testing your robots.txt file.
  • Not knowing the correct name of the user-agent you want to block.

Using Both Disallow In Robots.txt And Noindex On The Page

Google’s John Mueller has stated you should not be using both disallow in robots.txt and noindex on the page itself.

If you do both, Google cannot crawl the page to see the noindex, so it could potentially still index the page anyway.

This is why you should only use one or the other, and not both.
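
To make the conflict concrete, here’s a sketch of the combination to avoid, using a hypothetical /example-page/ URL. In robots.txt:

User-agent: *
Disallow: /example-page/

And in the page’s HTML head:

<meta name="robots" content="noindex">

Because the Disallow stops crawlers from fetching the page, the noindex tag is never seen, and the URL can still end up indexed from external links.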

Using The Forward Slash When You Really Mean A Specific URL

The forward slash after Disallow means “from this root folder on down, completely and entirely for eternity.”

Every page on your site will be blocked until you change it.

One of the most common issues I find in website audits is that someone accidentally added a forward slash to “Disallow:” and blocked Google from crawling their entire site.
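
For example, if you only meant to block a single page (hypothetical path below), the difference looks like this:

# Blocks the entire site:
User-agent: *
Disallow: /

# Blocks only one page:
User-agent: *
Disallow: /landing-page.html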

Not Including The Correct Path

We understand. Sometimes coding robots.txt can be a tough job.

You couldn’t remember the exact correct path initially, so you went through the file winging it.

The problem is that these similar-looking paths all point to 404s because they are one character off.

This is why it’s important always to double-check the paths you use on specific URLs.

You don’t want to run the risk of adding a rule to robots.txt that doesn’t match any real URL on your site.
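
Here’s a sketch of how a single character breaks a rule, assuming the real folder is the hypothetical /blog/:

# Real folder: /blog/
Disallow: /blogs/
Disallow: /blog/

The first line, with its extra “s,” matches nothing on the site, so the folder you meant to block stays wide open.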

Not Knowing The Correct Name Of The User-Agent

If you want to block a particular user-agent but you don’t know the name of that user-agent, that’s a problem.

Rather than using the name you think you remember, do some research and figure out the exact name of the user-agent that you need.

If you are trying to block specific bots, then that name becomes extremely important in your efforts.
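
For example, Bing publishes its crawler token as “bingbot,” so a correct block looks like this (always confirm the token against the vendor’s own documentation, since names occasionally change):

User-agent: bingbot
Disallow: /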

Why Else Would You Block Bots And Spiders?

There are other reasons SEO pros would want to block bots from crawling their site.

Perhaps they are deep into gray hat (or black hat) PBNs, and they want to hide their private blog network from prying eyes (especially their competitors).

They can do this by utilizing robots.txt to block common bots that SEO professionals use to assess their competition.

For example, Semrush and Ahrefs.

If you wanted to block Ahrefs, this is the code to do so:

User-agent: AhrefsBot
Disallow: /

This will block AhrefsBot from crawling your entire site.

If you want to block Semrush, it’s more involved, since Semrush runs several different crawlers for its various tools.

There are a lot of lines of code to add, so be careful when adding these:

To block SemrushBot from crawling your site for different SEO and technical issues:

User-agent: SiteAuditBot
Disallow: /

To block SemrushBot from crawling your site for Backlink Audit tool:

User-agent: SemrushBot-BA
Disallow: /

To block SemrushBot from crawling your site for On Page SEO Checker tool and similar tools:

User-agent: SemrushBot-SI
Disallow: /

To block SemrushBot from checking URLs on your site for SWA tool:

User-agent: SemrushBot-SWA
Disallow: /

To block SemrushBot from crawling your site for Content Analyzer and Post Tracking tools:

User-agent: SemrushBot-CT
Disallow: /

To block SemrushBot from crawling your site for Brand Monitoring:

User-agent: SemrushBot-BM
Disallow: /

To block SplitSignalBot from crawling your site for SplitSignal tool:

User-agent: SplitSignalBot
Disallow: /

To block SemrushBot-COUB from crawling your site for Content Outline Builder tool:

User-agent: SemrushBot-COUB
Disallow: /
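
One note: the robots.txt format allows several User-agent lines to share a single group of rules, so if you want the same Disallow to apply to multiple Semrush crawlers, you can combine them, like so:

User-agent: SemrushBot-BA
User-agent: SemrushBot-SI
Disallow: /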

Using Your HTACCESS File To Block Bots

If you are on an Apache web server, you can use your site’s .htaccess file to block specific bots.

For example, here is how you would use code in .htaccess to block AhrefsBot by its known IP addresses.

Please note: be careful with this code.

If you don’t know what you are doing, you could bring down your server.

We only provide this code here for example purposes.

Make sure you do your research and practice on your own before adding it to a production server.

# Apache 2.2-style access control. On Apache 2.4, these directives require
# mod_access_compat, or can be replaced with equivalent Require directives.
Order Allow,Deny
Deny from 51.222.152.133
Deny from 54.36.148.1
Deny from 195.154.122
Allow from all

For this to work properly, make sure you block all the IP ranges listed in this article on the Ahrefs blog.
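
Because bot IP addresses can change over time, another common approach (a sketch, not taken from the Ahrefs documentation) is to match on the bot’s user-agent string instead, using mod_rewrite, which must be enabled on your server:

RewriteEngine On
# Return 403 Forbidden to any client whose user-agent contains "AhrefsBot"
RewriteCond %{HTTP_USER_AGENT} AhrefsBot [NC]
RewriteRule .* - [F,L]

Keep in mind that user-agent strings are trivial to spoof, so this only stops bots that identify themselves honestly.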

If you want a comprehensive introduction to .htaccess, look no further than this tutorial on Apache.org.

If you need help using your .htaccess file to block specific types of bots, there are plenty of step-by-step tutorials available online.

Blocking Bots and Spiders Can Require Some Work

But it’s well worth it in the end.

By blocking bots and spiders from crawling your site, you avoid the traps that catch other site owners.

You can rest easier knowing your site is better protected against unwanted automated traffic.

When you can control these particular bots, it makes things that much better for you, the SEO professional.

When you need to, make sure you block the offending bots and spiders from crawling your site.

This will result in enhanced security, a better overall online reputation, and a much better site that will be there in the years to come.

Featured Image: Roman Samborskyi/Shutterstock


GA4 Update Brings Alignment With Google Ads Targeting

Google announced an update to the advertising section within Google Analytics 4 (GA4).

The enhancement aims to clarify and align user counts eligible for remarketing and ad personalization.

Under the change, advertisers can now quickly view the size of their “Advertising Segments” within GA4’s interface.

These segments represent the pool of users whose data can be leveraged for remarketing campaigns and personalized ad targeting through products like Google Ads.

Improved Synchronization For Unified Insights

Previously, there could be discrepancies between the user counts shown as eligible for advertising use cases in GA4 and the Google Ads Audience Manager.

With this update, Google says the numbers will be fully aligned, allowing marketers to confidently make data-driven advertising decisions.

Expanding Advertising Segment Visibility

Along with the alignment fix, the update expands visibility into advertising segment sizes within the GA4 interface.

A new “Advertising segments” panel under the “Advertising” section reports the number of users GA4 collects and sends to ad products for personalization.

An “advertising segment” is a list of GA4 users synchronized with Google advertising products for remarketing and personalized ad targeting purposes.

Segment sizes can vary based on targeting requirements for different ad networks.

Why SEJ Cares

This update from Google addresses a key pain point for advertisers utilizing GA4 and Google Ads.

Full alignment between advertising audience sizes across products eliminates confusion and enables more data-driven strategies.

The added transparency into advertising segment sizes directly in GA4 is also a welcome upgrade.

How This Can Help You

With aligned user counts, advertisers can plan and forecast remarketing campaigns with greater precision using GA4 data.

This unified view means you can make media investment decisions based on accurate reach projections.

Additionally, the new advertising segments panel provides extra context about the scope of your audiences for ad personalization.

This visibility allows for more informed strategies tailored to your specific segment sizes.


Featured Image: Lightspring/Shutterstock




OpenAI’s Rockset Acquisition And How It May Impact Digital Marketing

OpenAI has acquired Rockset, whose technology enables real-time data analysis, recommendation systems, and the creation of new products. The acquisition may signal a new phase for OpenAI, one that could change the face of search marketing in the near future.

What Is Rockset And Why It’s Important

Rockset describes its technology as hybrid search, a multi-faceted approach (integrating vector search, text search, and metadata filtering) to retrieving documents that can augment the generation process in RAG systems. RAG is a technique that combines search with generative AI to create more factually accurate and contextually relevant results. It’s a technology that plays a role in Bing’s AI search and Google’s AI Overviews.

Rockset’s research paper about the Rockset Hybrid Search Architecture notes:

“All vector search is becoming hybrid search as it drives the most relevant, real-time application experiences. Hybrid search involves incorporating vector search and text search as well as metadata filtering, all in a single query. Hybrid search is used in search, recommendations and retrieval augmented generation (RAG) applications.

…Rockset is designed and optimized to ingest data in real time, index different data types and run retrieval and ranking algorithms.”

What makes Rockset’s hybrid search important is that it allows the indexing and use of multiple data types (vectors, text, and geospatial data about objects and events), including real-time data. That powerful flexibility lets the technology power in-house and consumer-facing applications: contextually relevant product recommendations, customer segmentation and analysis for targeted marketing campaigns, personalization, personalized content aggregation, location-based recommendations (restaurants, services, etc.), and applications that increase user engagement (Rockset lists numerous case studies of how its technology is used).

OpenAI’s announcement explained:

“AI has the opportunity to transform how people and organizations leverage their own data. That’s why we’ve acquired Rockset, a leading real-time analytics database that provides world-class data indexing and querying capabilities.

Rockset enables users, developers, and enterprises to better leverage their own data and access real-time information as they use AI products and build more intelligent applications.

…Rockset’s infrastructure empowers companies to transform their data into actionable intelligence. We’re excited to bring these benefits to our customers…”

OpenAI’s announcement also explains that they intend to integrate Rockset’s technology into their own retrieval infrastructure.

At this point, we know the transformative potential of hybrid search, but OpenAI is so far offering only general ideas of how this will translate into APIs and products that companies and individuals can create and use.

The official announcement of the acquisition from Rockset, penned by one of the cofounders, offered these clues:

“We are thrilled to join the OpenAI team and bring our technology and expertise to building safe and beneficial AGI.

…Advanced retrieval infrastructure like Rockset will make AI apps more powerful and useful. With this acquisition, what we’ve developed over the years will help make AI accessible to all in a safe and beneficial way.

Rockset will become part of OpenAI and power the retrieval infrastructure backing OpenAI’s product suite. We’ll be helping OpenAI solve the hard database problems that AI apps face at massive scale.”

What Exactly Does The Acquisition Mean?

Duane Forrester, formerly of Bing Search and Yext (LinkedIn profile), shared his thoughts:

“Sam Altman has stated openly a couple times that they’re not chasing Google. I get the impression he’s not really keen on being seen as a search engine. More like they want to redefine the meaning of the phrase “search engine”. Reinvent the category and outpace Google that way. And Rockset could be a useful piece in that approach.

Add in Apple is about to make “ChatGPT” a mainstream thing with consumers when they launch the updated Siri this Fall, and we could very easily see query starts migrate away from traditional search engine boxes. Started with TikTok/social, now moving to ai-assistants.”

Another approach, which could impact SEO, is that OpenAI could create a product based on an API that companies can use to power in-house and consumer-facing applications. With that approach, OpenAI provides the infrastructure (as it currently does with ChatGPT and foundation models) and lets the world innovate on top of it, with OpenAI at the center.

I asked Duane about that scenario and he agreed but also remained open to an even wider range of possibilities:

“Absolutely, a definite possibility. As I’ve been approaching this topic, I’ve had to go up a level. Or conceptually switch my thinking. Search is, at its heart, information retrieval. So if I go down the IR path, how could one reinvent “search” with today’s systems and structures that redefine how information retrieval happens?

This is also – it should be noted – a description for the next-gen advanced site search. They could literally take over site search across a wide range of mid-to-enterprise level companies. It’s easily as advanced as the currently most advanced site-search systems. Likely more advanced if they launch it. So ultimately, this could herald a change to consumer search (IR) and site-search-based systems.

Expanding from that, apps, as they allude to. So I can see their direction here.”

Deedy Das of Menlo Ventures (Poshmark, Roku, Uber) speculated on Twitter about how this acquisition may transform OpenAI:

“This is speculation but I imagine Rockset will power all their enterprise search offerings to compete with Glean and / or a consumer search offering to compete with Perplexity / Google. Permissioning capabilities of Rockset make me think more the former than latter”

Others on Twitter offered their take on how this will affect the future of AI:

“I doubt OpenAI will jump into the enterprise search fray. It’s just far too challenging and something that Microsoft and Google are best positioned to go after.

This is a play to accelerate agentic behaviors and make deep experts within the enterprise. You might argue it’s the same thing as enterprise search, but taking an agent-first approach is much more in line with the OpenAI mission.”

A Consequential Development For OpenAI And Beyond

The acquisition of Rockset may prove to be the foundation of one of the most consequential changes to how businesses use and deploy AI, which in turn, like many other technological developments, could also have an effect on the business of digital marketing.

Read how Rockset customers power recommendation systems, real-time personalization, real-time analytics, and other applications:

Featured Case Studies

Read the official Rockset announcement:

OpenAI Acquires Rockset

Read the official OpenAI announcement:

OpenAI acquires Rockset
Enhancing our retrieval infrastructure to make AI more helpful

Read the original Rockset research paper:

Rockset Hybrid Search Architecture (PDF)

Featured Image by Shutterstock/Iconic Bestiary


Results-Driven SEO Project Management: From Chaos to Cash

Making a profit in SEO relies on good project management. That means doing things that get results rather than just drowning yourself in endless tasks.

Below, I’ll walk you through a 7-step process to do exactly that.

Having clear goals keeps your team unified and moving in a specific direction. For example, if your boss allocates $5,000/month for the SEO project, you need to translate this into meaningful results and milestones you can report on.

A goal you can easily set is to increase the website’s organic traffic value. This is a metric unique to Ahrefs that estimates a dollar value of SEO traffic. 

If you invest $5,000/month in SEO for six months, you could aim to increase your website’s organic traffic value by $30,000 ($5,000 ✕ 6 months). 

This isn’t the most accurate method because traffic value doesn’t necessarily correlate with real-world revenues, but it works as an easy starting point for setting targets. 

A better solution is to use conversion data and average order or deal value to set goals around delivering a return on investment. You can find these metrics in your analytics tool, like Google Analytics, if conversion tracking is set up: 

Tracking conversion rate and average order value in Google Analytics.

Sidenote.

If you don’t have access to conversion metrics like this, to be conservative, use 1% as a ballpark conversion rate and the cheapest product or service price for the average order value.

Using these metrics, you can calculate the number of sales needed to break even on the SEO campaign.

# of monthly sales to break even = monthly SEO cost ÷ average order value

Since this project’s monthly SEO cost is $5,000, we’ll need to grow sales from organic traffic by 32.25 per month for each month of the project’s duration.

Here’s the formula to discover roughly how much traffic or projected organic sessions you’ll need: 

projected organic sessions = transactions needed to break even ÷ conversion rate

So in this example, we divide 32.25 transactions by the conversion rate of 0.86% to learn that we need at least 3,750 organic monthly sessions to break even. Of course, not all traffic is created equal, so keep that in mind going forward. 
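
Putting both formulas together with this example’s numbers (the average order value of roughly $155 is implied by these figures rather than stated):

# of monthly sales to break even = $5,000 ÷ $155 ≈ 32.25
projected organic sessions = 32.25 ÷ 0.0086 ≈ 3,750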

So far, so good! (Save this number, we’ll need it in a moment). 

In many cases, the timeline will be decided for you by your boss or client. For instance, if you take on a client with a six-month contract, that’s the timeframe in which you generally have to deliver results. 

The question at this stage is whether it’s possible to reach your performance goal in that time. 

Truthfully, there’s no way to know for sure, but you can look to your competitors for an idea. 

Sure, you have no idea what their SEO budgets are (they could be spending 10x what you are), but if you see multiple competitors of a similar caliber getting similar results over a similar timeframe, that’s a good sign. 

For instance, in its first six months of SEO, Webflow reached just shy of 24,000 monthly organic visits, with a traffic value of $76,510 (according to Ahrefs).

Webflow's SEO performance in the first 6 months.

By comparison, Duda’s performance over its first six months is also fairly close to Webflow’s.

Duda's SEO performance in the first 6 months.

So, if these are your competitors and your target is to reach a traffic value of $30,000 in six months or to increase monthly traffic by 3,750 sessions, it certainly seems achievable. 

If you don’t see competitors hitting your target in your timeframe like this, you’ll need to rethink your goals and communicate them to key stakeholders. Communication is critical for setting the right expectations with your boss or clients. 

Now that you’ve set an achievable goal for the project timeline, the next step is to plan what tasks actually need to be done to get you there. 

You’ll need to spend some time on strategic tasks to help you determine the correct implementations for the project. 

Don’t be tempted to skip this part!

If you don’t spend enough time on strategic tasks like competitor analysis, keyword research, and auditing the current website, no matter how much action you take, it’ll be useless if you’re heading in the wrong direction. 

But don’t overdo it, either. You need to balance strategy with implementation to get results. 

For example, there’s generally a notable difference in performance between a project that spends one month on strategy and publishes content ASAP compared to a project that front-loads strategic tasks and implements content a few months later. 

SEO project management strategy differences

I recommend spending ⅙ of the project timeline on strategy and ⅚ on implementation for the best balance. 

As for what specific tasks you can plan, there are many things you could focus on here. The right things for your website will vary depending on your available skills and resources, plus what’s working best in your industry… but here’s where I’d start, given that the target is to increase traffic. 

a) Fill content gaps

Start by finding pages that deliver traffic to competitors that your website doesn’t have. 

Using Ahrefs’ Competitive Analysis tool, make sure you select the “keywords” tab and then enter your website along with a handful of your top competitors, like so: 

Adding competitors to Ahrefs' Competitive Analysis report.

Then check out the results to find topics your competitors have written about that you haven’t. Make sure you qualify the topics according to what has business value for you. 

For instance, let’s look at design-related keywords that Wix or Squarespace rank for but Webflow doesn’t.

Finding keywords competitors rank for that your website doesn't.

Many of these keywords hold very little business value for a company like Webflow, like any related to logo makers and generators. However, keywords related to design trends and principles might be topics Webflow can consider for its blog since designers are a staple part of its audience demographic. 

For topics that have business value, create new content targeting these keywords. 

There can be a lot of data to sift through here, so I recommend my content gap analysis template for a faster and smoother process 😉 

b) Boost authority of top pages

This task is about identifying which of your content is already performing well and sending more internal links and backlinks to those pages. 

You can find the best pages to promote by using the Top Pages report in Site Explorer. Here you’ll see which pages on your site get the most traffic: 

Using Ahrefs' Top Pages report to quickly identify pages with the most traffic on your website.

Then, navigate to the Internal Link Opportunities report in Site Audit. You can set an advanced filter to narrow down the opportunities to the pages you care most about. Check out the suggested anchor text and keyword contexts and implement all the internal links that make sense in your content. 

Ahrefs' Internal Link Opportunities report.

You should also build backlinks to these pages. You can use the Competitive Analysis report again, but this time, set it to referring domains or pages. 

Sidenote.

Setting it to referring domains will give you a list of websites you can add to an outreach list. Setting it to referring pages will give you the exact URLs where the links to your competitors are. These links can be included in outreach messages to make them more customized.

Also, instead of using the homepage, add the exact page you want to link to and compare it to your competitors’ pages on the same topic. Make sure you set all pages to “exact URL” to get the page-level (instead of website-level) backlink data. 

Using Ahrefs' Competitive Analysis report to find backlink gaps.

There are many different backlinking techniques you can consider implementing. Check out our video on how to get your first 100 links if you’re unsure where to start: 

c) Update content with low-hanging fruit opportunities

For an established website with a decent amount of existing content, you can also look for opportunities to quickly update existing content and boost performance with little effort. 

In Ahrefs’ Site Explorer, check out your pages that are already ranking in positions 4-15 by using Opportunities > Low-Hanging Fruit Keywords: 

Finding low-hanging fruit keywords in Ahrefs.

Find pages with many keywords in this range and try to close topic gaps on those pages. For example, let’s take our post on affiliate marketing and look at its low-hanging fruit opportunities. 

We could isolate similar keywords that don’t already have a dedicated section in our article, like the following about becoming an affiliate. 

Example of low hanging fruit keywords to add to an article.

These are already hovering around the middle of page one on Google. With a small, dedicated section about this topic, we can likely improve rankings for these keywords with minimal effort. 

Most tasks aren’t a one-time thing. For example, you’ll probably create or update multiple pieces of content during an SEO project. 

So, the next step is to create a library of repeatable task templates that you can duplicate in your project. 

Example of SEO task templates using ClickUp.
Source: Screenshot taken in ClickUp

If you don’t do this and just assume your team knows what to do, it can cause chaos, and there’s a high chance your project won’t succeed. 

Here’s what you should add to each task template: 

  • Who → assignees, reviewers, watchers, key stakeholders
  • What → what’s the goal of the task + what exactly needs to be done
  • When → dates to start and finish a task, estimated hours to complete
  • Where → what tools should be used, where should deliverables be added, where can templates/relevant info be found
  • Why → connect the task to a strategic objective
  • How → SOP or process outlined in a clear and detailed brief

Obviously, the exact details for some of these will need to be filled in on a task-by-task basis as you duplicate them into your project. For example, instead of assigning the template tasks to a specific person, indicate the role that is responsible for the task until you’re ready to assign it to someone. 

Likewise, with the due dates. In the template, instead of adding exact due dates, indicate an estimated length of time each task should take and a general rule for when the task is due after it’s been assigned. 

Not every project will need every task, so the idea is to pull in what’s required as needed and have the bulk of the info pre-filled to reduce the time it takes to brief the task. 

With your tasks set and templates created, it’s now time to start doing.

This is where things can often fall apart unless you distribute responsibility and ownership of tasks and processes throughout your team. 

SEO project management doesn’t fail because there aren’t SOPs and processes in place. It fails because the people executing the processes aren’t given ownership of them.

Mads Singers

Here are 3 reasons why this can happen: 

  1. Without clear ownership, all team members rely on you for approvals before they can complete a task or start another. It slows everything down, and very little gets done efficiently.
  2. A “not my job” mentality can take root in your team. Unless team members take ownership of their tasks, you will be responsible for micromanaging everything to ensure your team is doing what it’s supposed to be.
  3. The people best placed to decide upon and update processes aren’t the ones doing so. They’re just doing whatever “management” tells them to do even if they see a better way.

You can solve the first two problems by clearly identifying who is responsible for specific tasks and processes and allowing them to get on with those tasks without having to run every tiny thing through you. 

You can solve the third problem by letting the people on the ground decide how their tasks are done and giving them responsibility for updating SOPs and relevant task templates. This again frees up your time and attention to focus on strategy, not micromanaging. 

PRO TIP

It also helps to break up bigger tasks into sub-components when multiple people are involved, like: 

  • Briefing → SEO Strategist or Account Manager
  • Implementation → often, a non-SEO professional like a writer, developer, or designer 
  • Review → Senior SEO
  • Final approval → Client
Example SEO project management timeline

For the love of all things good, please don’t manage SEO projects via email. It’s horrible. 

Invest in setting up a proper project management tool to scale with you. Consider your needs before you start planning all your tasks and projects. 

There’s no tool that’s best for everyone, but I recommend you check out Asana, ClickUp, or Monday to get you started. 

In any of these tools, you can easily set up separate projects and task templates. For example, here's a basic setup of the first month's tasks you can consider in ClickUp:

Example of Month 1 SEO tasks created in ClickUp.

Within each task, you can pre-fill certain fields and add a description, like so: 

Example of a task template for SEO project management.

This is where you can add your brief, relevant links, and the essential details needed to turn the task into a template. Of course, there are nuances of how this works between different project management tools, but the basic idea remains the same. 

It’s worth spending time setting up your tasks and templates correctly so you can save time down the track as your project or team grows. 

The last piece of this framework is tracking resources spent and results achieved. 

Tracking resources

The easiest way to track resources is to create custom fields in your project management tool that measure specific resources allocated for each task. Some tools also let you build out reports to see how your resource allocation is going across different time frames, teams, or projects. 

The types of resources you might consider tracking include: 

  • Cost of the task
  • Planned time allocated
  • Actual time spent
  • Cost of tools required to do the task
  • Credits the task is worth (if you use a credit system)
  • Sprint points (if you work in sprints)

For more in-depth insights on where your resources are going, consider tagging tasks according to whether they’re strategic, implementation, or administrative. This way, you can quickly and easily spot imbalances like investing too much in tasks that don’t contribute to results. 

Tracking results

Measuring your results requires going beyond your project management tool and using a combination of your analytics software and an SEO tool like Ahrefs. 

When you start working on a new campaign, make sure you record a benchmark of the existing performance of the website. Then, keep regular tabs on the metrics that matter for the project and performance milestones you’ve achieved along the way. 

For example, you can use Ahrefs Webmaster Tools to monitor performance across your entire portfolio for free. 

The dashboard allows you to quickly see how performance is trending for key SEO metrics across all projects you’ve added: 

Ahrefs' dashboard showing quick performance stats across multiple projects.

Key Takeaways

Results-driven SEO project management starts with the end in mind and works backward. It doesn’t assume you’ll see performance improvements just because you’re doing lots of stuff. 

Instead, it is very intentional about figuring out exactly what needs to be done and linking those actions to realistic and achievable outcomes. 

In the words of Mads Singers: 

The starting point is figuring out how to deliver a return on investment. This is the most important thing. Then it’s about giving your team ownership and control over the tasks related to their roles. 

Once these foundations are in place, only then is it about documenting processes. But it shouldn’t be a business owner or manager who does the documentation. Processes should be owned by the people doing the work and who can keep SOPs current.

Mads Singers

The process shared above allows you to do all of this and more. If you have any questions about your SEO project management goals or processes, feel free to contact me on LinkedIn anytime. 
