How to Use Ahrefs to Improve SEO


Whether you’re brand new to Ahrefs or a longtime user, you’re in the right place.

This tutorial will walk you through the most practical, repeatable, and actionable Ahrefs use cases from our six core tools that will help improve your SEO.

Site Explorer is our competitive research tool. With Site Explorer, you can see a website’s:

  • Backlinks
  • Keywords it ranks for in Google
  • Site structure in a tree format
  • Pages that are responsible for generating the most search traffic
  • Google Ads campaigns

And more.

Because there are so many things you can do in Site Explorer, we won’t be able to go through every use case. Instead, we’ll cover a few low-hanging fruits:

1. Restore lost link equity from broken backlinks

If there are broken pages with backlinks on your website, that link equity is wasted.

You can reclaim the value of the link equity by either restoring those pages or redirecting the broken URLs to relevant live pages.

Here’s how to find broken pages with backlinks on your website:

  1. Enter your domain 
  2. Go to the Best by links report
  3. Set the HTTP code filter to 404 

For example, we could redirect this blog post about asking for tweets to this one on blogger outreach to reclaim around 42 referring domains:

One of our dead blog posts with 42 referring domains
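On an Apache server, for instance, the redirect itself is a one-liner in .htaccess. Here's a minimal sketch, with hypothetical paths standing in for your own dead and live URLs:

```apache
# 301-redirect a dead post that still has backlinks
# to a relevant live page, passing along its link equity
Redirect 301 /blog/ask-for-tweets /blog/blogger-outreach
```

Other servers (nginx, Cloudflare rules, or your CMS's redirect plugin) have equivalent one-line rules.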

2. Find featured snippet opportunities

Featured snippets are full or partial answers to a query directly on the SERPs.

An example of a featured snippet

If you can grab the snippet, you can jump ahead of everyone else. That means more search traffic to your site.

Here’s how to find low-hanging featured snippet opportunities:

  1. Enter your domain 
  2. Go to the Organic keywords report
  3. Set a Positions filter from 1 – 10 (you need to be on the first page to win it) 
  4. Set a SERP features filter to “where target doesn’t rank” and check featured snippet 
Low-hanging featured snippet opportunities for our website, via Ahrefs' Site Explorer

You can now see thousands of keywords where you can try to optimize your pages to win the featured snippet.

Follow the tutorial below to learn how to capture featured snippets.

3. Reverse engineer a site’s structure

Investigating a site’s structure shows you which parts of the site attract the most search traffic. 

Here’s how to see a high-level overview of a website’s structure:

  1. Enter your competitor’s domain 
  2. Go to the Site structure report 
Mailchimp's site structure

For example, we can see that most of Mailchimp’s search traffic goes to its root domain. We can also click on the arrow to see a more detailed breakdown.

A breakdown of Mailchimp's subfolders

From here, we learn that Mailchimp has a subfolder called “marketing-glossary” that gets an estimated 600K monthly search visits. 

Mailchimp's marketing glossary subfolder gets around 600,000 monthly search visits

If you’re a competitor, creating a glossary could be a potential strategy you might want to replicate.

4. Replicate your competitors’ top pages

If competitors get lots of traffic to pages about certain topics, you probably can, too.

Here’s how to find your competitors’ top pages:

  1. Enter your competitor’s domain 
  2. Go to the Top pages report 
The pages that attract the most search traffic for Mailchimp

This report shows you the pages that attract the most search traffic for your competitor. 

For example, Mailchimp’s email marketing guide gets an estimated 53,000 monthly search visits. The keyword that sends them the most traffic is “email marketing,” which they rank #1 for in the US. 

Mailchimp ranks #1 for email marketing and gets an estimated total of 53,000 monthly search visits

If you’re a competitor, this might be a topic worth targeting.

5. Analyze your competitors’ backlinks for link-building opportunities

If you want to rank for anything remotely competitive, merely publishing content isn’t enough. You need backlinks.

One way to do this is to analyze your competitor’s backlink profile to see how they’ve been acquiring theirs. 

Here’s how to see your competitor’s backlinks:

  1. Enter your competitor’s domain 
  2. Go to the Backlinks report
The backlink profile for Mailchimp

You can see that Mailchimp has close to 800K backlinks. 

Here’s the thing: Your chances of finding anything useful by manually sifting through all 800K are slim. But if you know what you’re looking for, you can add the right filters and find the right link prospects.

For example, if you’re looking for resource page opportunities, you can add a “Ref. page URL” filter to search for terms like resources.html, resources.php, resources.asp, links.html, links.php, and links.asp.

Using the referring page URL filter in Site Explorer to find resource page link building opportunities

Apply the filters and hit show results, and you now have over 700 potential opportunities for resource page link building.

There are 761 resource page link building opportunities, discovered by analyzing Mailchimp's backlink profile

6. Analyze competing pages’ backlink profiles for link opportunities

You can also do the same backlink analysis as above but on a page level. 

For example, say we want to analyze the backlink profile of HubSpot’s blog post on email marketing statistics. We want to create a competing page targeting that topic, so we want to know how HubSpot got so many links to their page.

HubSpot's email marketing statistics blog post has around 13,000 backlinks

Manually sifting through roughly 13K backlinks is a huge waste of time, so we’ll go to the Anchors report to see the anchor texts of all backlinks pointing at HubSpot’s article.

Anchor texts of links pointing at HubSpot's article on email marketing statistics

Eyeballing the report tells us that most people are linking to HubSpot’s page because of some specific stats:

People are linking to HubSpot's page because of specific stats they featured

There are two actionable takeaways:

  1. We should include similar stats on our page, as these earn links. 
  2. We should replace outdated stats so we can use them in our outreach campaign. 

FYI, that’s exactly what we did for our SEO statistics post. Since then, we’ve earned thousands of backlinks and ranked #1 consistently.

Our SEO statistics post has 2,000 referring domains and ranks #1 on Google

Learn how we did that in our video series below.

7. Find broken link building opportunities

Broken link building is where you:

  1. Find a broken page that has backlinks 
  2. Create your own page on the topic 
  3. Reach out to those linking to the broken page to link to you instead 

Here’s how you can find broken link building opportunities:

  1. Enter your competitor’s domain 
  2. Go to the Best by links report 
  3. Set an HTTP code filter to 404 not found 
GetResponse's broken pages, which are potential broken link building opportunities for a competitor

For example, if you’re a competitor to GetResponse, this topic on “what are popups” might make sense for you to cover because 21 websites are linking to it.

GetResponse's article on what are popups, which is now dead

To see who’s linking to these pages, click on a caret beside the URL and go to the Backlinks report.

Clicking the Backlinks report to see the backlinks pointing at GetResponse's dead article
Backlinks report for GetResponse's dead article

From here, you can reach out to the people linking to these broken pages and ask them to link to your new guide on the topic. 

Keywords Explorer is our keyword research tool. 

Let’s look at a few ways to find good keywords to target, fast.

8. Find keywords by search intent

Search intent is the reason behind a searcher’s query. To rank high on Google, you’ll need to match search intent.

But analyzing the SERPs for thousands of keywords manually can be incredibly time-consuming. A quicker way is to use keyword modifiers like “best,” “how,” and “buy.”

Let’s say we have an ecommerce store that sells camping equipment. Here’s how we would find keywords by search intent:

  1. Enter a few broad seed keywords (e.g., camping, tent, sleeping bag, campfire) 
  2. Go to the Matching terms report
  3. Add an Include filter for these modifiers (how, what, when, where, why, tutorial, tips) 
Finding informational keywords using the Include filter in Keywords Explorer

This will show us a list of informational keywords we can create content for on our blog.

If we want to find commercial investigation keywords, we can simply add an Include filter for words like “best,” “vs,” and “review.”

Finding commercial keywords using the Include filter in Keywords Explorer

9. Find low-competition keywords

There are two ways to find low-competition keywords in Keywords Explorer. 

The first way is to set a Keyword difficulty (KD) filter. Set it to a low number, like 10, and you’ll see low-difficulty keywords you can target:

Applying a low KD filter to find low-difficulty keywords

The second way is to set a Domain Rating (DR) filter. DR is widely used in the SEO community to estimate a website’s authority. So, setting a DR filter can help you find keywords where non-authoritative sites rank high in the SERPs. 

Let’s set it to a low value like 30. This will show us keywords where at least one website with a DR of 30 or lower ranks in the top 5. 

Applying a lowest DR filter to find keywords where a low-authority site is ranking

If we expand one of the SERPs, we see a result with <DR30 and 0 backlinks!

A result ranking for best pop up tent with <DR30 and no backlinks

This certainly seems like an easy topic to rank for. 

10. Bulk analyze a list of keywords

You can enter up to 10,000 keywords at a time in Keywords Explorer, which allows you to analyze any custom list of keywords.

Once you’ve pasted your list, you’ll be able to view all their metrics.

You can bulk analyze up to 10,000 keywords in Keywords Explorer

You can also cluster them by terms or Parent Topics.

You can group your custom keyword list by parent topics or terms

11. See organic share of voice for your competitors

If you’ve entered your own list of keywords, you can go to the Traffic share by domains report to see sites that rank for your list of keywords, along with the traffic share they own.

See the organic share of voice for your competitors by going to the Traffic share by domains report

This tactic is great for keyword research. 

For example, we can click on the caret for any of the websites, choose Top pages, and we can see all the topics that are sending them the most search traffic.

Click on Top pages for any site in the Traffic share by domain to see their pages with the most search traffic

Site Audit lets you crawl your website to find and monitor for 100+ technical and on-page SEO issues.

To run a crawl, create a new project and either import your websites from Google Search Console (GSC) or add them manually.

Import or add your project into Site Audit

When your crawl is complete, you’ll see the Overview report, which will show you a high-level overview of all technical and on-page issues Ahrefs found on your site.

Overview report for Site Audit, after your crawl is done

If your main goal is to keep your site in good technical health, then all you need to do is work on fixing the issues we found when crawling your site.

To do that, head to the All issues report.

The All issues report in Site Audit, where you can see all your site's technical and on-page SEO issues

You can prioritize by working on the red ones first, which represent errors. Then, work on the yellow ones (warnings) and the blue ones (notices).

To see the affected URLs, click on the number in the corresponding row:

Click on the number in the Crawled column to see affected URLs

There’s more to Site Audit than maintaining your site’s technical health, though. Here are some more use cases (that are not technical SEO):

12. Find all affiliate links on a website

Let’s say you own a recipe blog that mostly makes money from Amazon affiliate links. However, you recently joined a new affiliate program with higher payouts than Amazon. Now, you need to swap out the Amazon links for your new affiliate ones. 

But rather than doing a sitewide change, you want to test the new affiliate links on a few pages to get a sample size for conversion rates.

Here’s how you can do this with Site Audit (after running a crawl):

  1. Go to Page Explorer
  2. Hit Advanced filter 
  3. Create a rule to find URLs that have an external link containing amzn.to (Amazon’s short link). 
  4. Set an Organic traffic filter to show pages that get at least 1,000 monthly organic visits 
Setting up advanced filters to find Amazon affiliate links

Hit ‘Apply,’ and you’ll see 51 pages that match these filters:

51 results match the filters we set in Page Explorer

You can pick from these pages to replace the Amazon links.

13. Find internal linking opportunities

The Internal Link Opportunities report shows you internal linking opportunities based on keywords your pages rank for. 

Specifically, it shows:

  • The page we recommend you link from 
  • The keyword that’s mentioned on the source page (also the keyword that the target page ranks for) 
  • The page we recommend you link to 
The internal link opportunities report for Ahrefs

For example, let’s say we want to add internal links to our blog post on keyword research. In the report, we can set a Target page filter and paste the URL to our keyword research guide.

Searching for a page to internally link to using the Target page filter

We now have over 200 potential pages we can link from.

Rank Tracker lets you monitor your Google rankings over time. You can track rankings from any country, city, zip, or postal code. On top of that, you can segment your keywords using tags and track your performance against your competitors.

In the Rank Tracker Overview, you can see charts that give you a nice visualization of various categories like share of voice, average position, traffic, SERP features, and positions. And these graphs are affected by the filters you set. 

Rank Tracker overview

Below these groups is the data table where you’ll see ranking, keyword, and traffic data for each tracked keyword.

Ranking position, search volume, traffic, keyword difficulty, SERP features, and more for your tracked keywords, via Rank Tracker

A cool feature in Rank Tracker is that we keep track of your competitors too:

14. Automatically track your competitors’ rankings

Go to the Competitors overview report and you’ll see the same data, plus how your competitors are performing for every keyword.

Competitors overview in Rank Tracker

Even if you didn’t add any tracked competitors to your project, you can still get competitor insights by going to the Competitors traffic share report.

Competitors traffic share report in Rank Tracker

This report shows you all your organic search competitors for your tracked keywords. If you look at the Pages tab, you can see the exact pages you’re competing with in organic search. If you go to the Domains tab, you’ll see all websites ranking for your tracked keywords.

See all websites ranking for your tracked keywords in the Domains tab

For example, both HubSpot and Shopify own a lot of traffic for our tracked keywords. So, they likely rank for a ton of topics that might be interesting for our own blog, which means we could dig deeper into them in Site Explorer.

Content Explorer is a search engine for marketers with billions of pages in its index. Search for any topic, and you’ll see all pages that match your query, along with their SEO and social metrics. 

The best part: You can apply any combination of filters to dig into the data.

Here are some use cases for Content Explorer:

15. Find low-competition topics with high search traffic potential

Here’s how:

  1. Enter a broad query (e.g., backpack) 
  2. Set a Referring domains filter <10 (to find low-competition topics) 
  3. Set a Page traffic filter >500 (to find topics with high traffic potential) 
  4. Set a Word count filter >500 (to find blog content) 
The filters for finding low-competition topics with high search traffic potential

For example, this seems like a great topic to cover for a website that sells backpacks:

A page on how to solo backpack that gets 770 monthly search visits with only 10 referring domains

Click the Page traffic box and you’ll see the exact keywords it ranks for, its ranking positions, and more:

The Page traffic tab shows you the exact keywords it ranks for, its ranking positions, and more

16. Find guest post opportunities

Here’s how:

  1. Enter a niche-related query (e.g., knitting) 
  2. Set a DR filter of 30-65 (to find low- to mid-authority websites) 
  3. Set a Website traffic filter of >5,000 (to find sites that get a good amount of search traffic) 
  4. Set a Word count filter of >500 (to narrow results to blog content) 
The filters for finding guest post opportunities

Head to the Websites tab to see sites that match our filters:

The Websites tab helps you identify guest post opportunities fast

Eyeball the report to find potential sites to pitch. For example, a site like Nimble Needles would make a good guest post target. 

Web Explorer allows you to search through all pages, domains, and links that are indexed by Yep, which is our search engine. This index is around 500 billion pages, ~36 times larger than Content Explorer’s index. 

Basically, you can search through almost anything and filter the results by SEO metrics.

Here are some use cases:

17. Find unlinked brand mentions

Unlinked mention link building is when you: 

  • Find pages that mention your brand but don’t link to you 
  • Reach out and ask them to link to you 

This tactic works well because the battle is halfway won: they already know who you are and probably like you, but they may have simply forgotten to link to you.

To find unlinked mentions, search for [your brand] -outlinkdomain:[yourwebsite].com -site:[yourwebsite].com.

The filters for finding unlinked brand mentions

You’ll see over 61 million pages that mention Ahrefs but don’t link to us. 

18. Search niche-relevant pages that link out to Amazon affiliate URLs

For example, let’s say you want to find pages on gardening that link out to Amazon affiliate URLs. The reason you might search this is to find potential acquisition targets, find websites that might be interested in joining your affiliate program, or find affiliate content ideas for your own gardening website.

To find these pages, run this search: “gardening” (outlinkdomain:amazon.com OR outlinkdomain:amzn.to)

Filters for finding pages that link to Amazon affiliate URLs

You’ll see 14 million pages that match this query. 

FYI, if you want to explore more use cases in Web Explorer, simply hit the Examples tab. 

Web Explorer has an Examples tab to help you find more use cases

Final thoughts

We’ve barely scratched the surface with all available use cases in Ahrefs. That’s why we’ve also created a 7-hour certification course that digs deep into how Ahrefs works. I highly recommend checking it out.

Even though we’ve covered our core tools, we have some other tools as well. 

Check out Competitive Analysis, which includes tools like Content Gap and Link Intersect.

Ahrefs' Competitive Analysis tool

Our Batch Analysis tool lets you get SEO metrics on up to 200 targets in seconds:

Ahrefs' Batch Analysis tool

And don’t forget to install our free SEO toolbar where you can get Ahrefs metrics laid over your SERPs and web pages. 

Ahrefs' SEO toolbar overlays SEO metrics on your SERPs

Any questions or comments? Let me know on X (Twitter).





8% Of Automattic Employees Choose To Resign


WordPress co-founder and Automattic CEO Matt Mullenweg announced today that he had offered Automattic employees the chance to resign with severance pay, and 8.4 percent of them accepted. Mullenweg offered $30,000 or six months of salary, whichever was higher, with a total of 159 people taking the offer.

Reactions Of Automattic Employees

Given the recent controversies created by Mullenweg, one might be tempted to view the walkout as a vote of no confidence in him. But that would be a mistake: some of the employees announcing their resignations either praised Mullenweg or simply announced their departure, while many others tweeted how happy they were to stay at Automattic.

One former employee tweeted that he was sad about recent developments but also praised Mullenweg and Automattic as an employer.

He shared:

“Today was my last day at Automattic. I spent the last 2 years building large scale ML and generative AI infra and products, and a lot of time on robotics at night and on weekends.

I’m going to spend the next month taking a break, getting married, and visiting family in Australia.

I have some really fun ideas of things to build that I’ve been storing up for a while. Now I get to build them. Get in touch if you’d like to build AI products together.”

Another former employee, Naoko Takano, a 14-year veteran of the company, an organizer of WordCamp conferences in Asia, a full-time WordPress contributor, and Open Source Project Manager at Automattic, announced on X (formerly Twitter) that today was her last day at Automattic, with no additional comment.

She tweeted:

“Today was my last day at Automattic.

I’m actively exploring new career opportunities. If you know of any positions that align with my skills and experience!”

Naoko’s role at WordPress was working with the global WordPress community to improve contributor experiences through the Five for the Future and Mentorship programs. Five for the Future is an important WordPress program that encourages organizations to donate 5% of their resources back into WordPress. It is also one of the issues Mullenweg raised against WP Engine, asserting that the company didn’t donate enough back into the community.

Mullenweg himself found it bittersweet to see those employees go, writing in a blog post:

“It was an emotional roller coaster of a week. The day you hire someone you aren’t expecting them to resign or be fired, you’re hoping for a long and mutually beneficial relationship. Every resignation stings a bit.

However now, I feel much lighter. I’m grateful and thankful for all the people who took the offer, and even more excited to work with those who turned down $126M to stay. As the kids say, LFG!”

Read the entire announcement on Mullenweg’s blog:

Automattic Alignment

Featured Image by Shutterstock/sdx15



YouTube Extends Shorts To 3 Minutes, Adds New Features


YouTube expands Shorts to 3 minutes, adds templates, AI tools, and the option to show fewer Shorts on the homepage.

  • YouTube Shorts will allow 3-minute videos.
  • New features include templates, enhanced remixing, and AI-generated video backgrounds.
  • YouTube is adding a Shorts trends page and comment previews.


How To Stop Filter Results From Eating Crawl Budget


Today’s Ask An SEO question comes from Michal in Bratislava, who asks:

“I have a client who has a website with filters based on a map locations. When the visitor makes a move on the map, a new URL with filters is created. They are not in the sitemap. However, there are over 700,000 URLs in the Search Console (not indexed) and eating crawl budget.

What would be the best way to get rid of these URLs? My idea is keep the base location ‘index, follow’ and newly created URLs of surrounded area with filters switch to ‘noindex, no follow’. Also mark surrounded areas with canonicals to the base location + disavow the unwanted links.”

Great question, Michal, and good news! The answer is an easy one to implement.

First, let’s look at what you’re trying to do and apply it to other situations, like ecommerce and publishers, so more people can benefit. Then, we’ll go into your proposed strategies and end with the solution.

What Crawl Budget Is And How Parameters Are Created That Waste It

If you’re not sure what Michal is referring to with crawl budget, this is a term some SEO pros use to explain that Google and other search engines will only crawl so many pages on your website before they stop.

If your crawl budget is used on low-value, thin, or non-indexable pages, your good pages and new pages may not be found in a crawl.

If they’re not found, they may not get indexed or refreshed. If they’re not indexed, they cannot bring you SEO traffic.

This is why optimizing a crawl budget for efficiency is important.

Michal shared an example of how “thin” URLs from an SEO point of view are created as customers use filters.

The experience for the user is value-adding, but from an SEO standpoint, a location-based page would be better. This applies to ecommerce and publishers, too.

Ecommerce stores will have searches for colors like red or green and products like t-shirts and potato chips.

These create URLs with parameters just like a filter search for locations. They could also be created by using filters for size, gender, color, price, variation, compatibility, etc. in the shopping process.

The filtered results help the end user but compete directly with the collection page, and the collection would be the “non-thin” version.

Publishers have the same. Someone might be on SEJ looking for SEO or PPC in the search box and get a filtered result. The filtered result will have articles, but the category of the publication is likely the best result for a search engine.

These filtered results can get indexed when they’re shared on social media, or when someone adds them as a comment on a blog or forum, creating a crawlable backlink. It might also be that an employee in customer service responded to a question with a link on the company blog, or any number of other ways.

The goal now is to make sure search engines don’t spend time crawling the “thin” versions so you can get the most from your crawl budget.
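To get a sense of the scale of the problem in your own crawl data, you can sort exported URLs by whether they carry filter parameters. Here's a minimal Python sketch; the URLs are hypothetical, and the only signal it checks is the presence of a query string:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical URLs from a crawl export (e.g., a Search Console CSV).
urls = [
    "https://example.com/locations/bratislava",
    "https://example.com/locations?lat=48.14&lng=17.10&zoom=12",
    "https://example.com/shop/t-shirts",
    "https://example.com/shop/t-shirts?color=red&size=m",
]

def is_filtered(url: str) -> bool:
    """Treat any URL carrying query parameters as a 'thin' filtered result."""
    return bool(parse_qs(urlparse(url).query))

thin = [u for u in urls if is_filtered(u)]
print(f"{len(thin)} of {len(urls)} URLs are parameterized filter results")
# prints: 2 of 4 URLs are parameterized filter results
```

A real site may also generate thin URLs without query strings (e.g., path-based filters), so adapt the check to however your filters build URLs.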

The Difference Between Indexing And Crawling

There’s one more thing to learn before we go into the proposed ideas and solutions – the difference between indexing and crawling.

  • Crawling is the discovery of new pages within a website.
  • Indexing is adding the pages that are worthy of showing to a person using the search engine to the database of pages.

Pages can get crawled but not indexed. Indexed pages have likely been crawled and will likely get crawled again to look for updates and server responses.

But not all indexed pages will bring in traffic or hit the first page because they may not be the best possible answer for queries being searched.

Now, let’s go into making efficient use of crawl budgets for these types of solutions.

Using Meta Robots Or X Robots

The first solution Michal pointed out was an “index,follow” directive. This tells a search engine to index the page and follow the links on it. This is a good idea, but only if the filtered result is the ideal experience.

From what I can see, this would not be the case, so I would recommend making it “noindex,follow.”

Noindex would say, “This is not an official page, but hey, keep crawling my site, you’ll find good pages in here.”

And if you have your main menu and navigational internal links done correctly, the spider will hopefully keep crawling them.
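In practice, the directive is a meta tag in the page’s head (or, for non-HTML responses, an X-Robots-Tag HTTP header). A sketch for a hypothetical filtered result page:

```html
<!-- On a filtered result like /locations?lat=48.14&lng=17.10 (hypothetical URL):
     keep the page out of the index, but let the spider follow its links -->
<meta name="robots" content="noindex,follow">
```

The header equivalent would be `X-Robots-Tag: noindex, follow` in the server response.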

Canonicals To Solve Wasted Crawl Budget

Canonical links are used to help search engines know what the official page to index is.

If a product exists in three categories on three separate URLs, only one should be “the official” version, so the two duplicates should have a canonical pointing to the official version. The official one should have a canonical link that points to itself. This applies to the filtered locations.

If the location search would result in multiple city or neighborhood pages, the result would likely be a duplicate of the official one you have in your sitemap.

Have the filtered results point a canonical back to the main page of filtering instead of being self-referencing if the content on the page stays the same as the original category.

If the content pulls in your localized page with the same locations, point the canonical to that page instead.

In most cases, the filtered version inherits the page you searched or filtered from, so that is where the canonical should point to.
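As a sketch, with hypothetical URLs, the filtered page’s head would carry:

```html
<!-- On /locations/bratislava?filter=nearby (hypothetical filtered URL),
     point the canonical back to the page it was filtered from -->
<link rel="canonical" href="https://example.com/locations/bratislava">
```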

If you do both noindex and have a self-referencing canonical, which is overkill, it becomes a conflicting signal.

The same applies to when someone searches for a product by name on your website. The search result may compete with the actual product or service page.

With this solution, you’re telling the spider not to index this page because it isn’t worth indexing, but it is also the official version. It doesn’t make sense to do this.

Instead, use a canonical link, as I mentioned above, or noindex the result and point the canonical to the official version.

Disavow To Increase Crawl Efficiency

Disavowing doesn’t have anything to do with crawl efficiency unless the search engine spiders are finding your “thin” pages through spammy backlinks.

The disavow tool from Google is a way to say, “Hey, these backlinks are spammy, and we don’t want them to hurt us. Please don’t count them towards our site’s authority.”

In most cases, it doesn’t matter, as Google is good at detecting spammy links and ignoring them.

You do not want to add your own site and your own URLs to the disavow tool. You’re telling Google your own site is spammy and not worth anything.

Plus, submitting backlinks to disavow won’t prevent a spider from seeing what you want and do not want to be crawled, as it is only for saying a link from another site is spammy.

Disavowing won’t help with crawl efficiency or saving crawl budget.

How To Make Crawl Budgets More Efficient

The answer is robots.txt. This is how you tell specific search engines and spiders what to crawl.

You can include the folders you want them to crawl by marking them as “allow,” and you can say “disallow” on filtered results by disallowing the “?” or “&” symbol, or whichever one you use.

If some of those parameters should be crawled, add the main word like “?filter=location” or a specific parameter.
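Putting that together, a robots.txt along these lines would block parameterized filter URLs while letting one useful parameter through (the paths and parameter name here are hypothetical, and wildcard support varies by crawler, though Google honors it):

```txt
# Hypothetical sketch: keep spiders out of parameterized filter URLs
User-agent: *
Disallow: /*?*
Disallow: /*&*
# Let one parameter that deserves crawling through; for Google,
# the longer (more specific) Allow rule wins over the Disallow
Allow: /*?filter=location
```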

Robots.txt is how you define crawl paths and work on crawl efficiency. Once you’ve optimized that, look at your internal links: the links from one page on your site to another.

These help spiders find your most important pages while learning what each is about.

Internal links include:

  • Breadcrumbs.
  • Menu navigation.
  • Links within content to other pages.
  • Sub-category menus.
  • Footer links.

You can also use a sitemap if you have a large site and the spiders are not finding the pages you want with priority.

I hope this helps answer your question. It is one I get a lot – you’re not the only one stuck in that situation.



Featured Image: Paulo Bobita/Search Engine Journal
