
How to Perform a Competitor Link Analysis in 3 Steps


A competitor link analysis is an assessment of your major competitors’ link profiles, where you examine their link building strategies for insights. Implementing some of the same strategies in your own link building can lead to higher search rankings and better business outcomes.

Performing this kind of analysis allows you to find sites that commonly link to websites similar to yours. If you look at well-ranking competitors and they have a lot of the same links, those links may be the ones helping them to rank.

Let’s take a look at how to perform a basic audit. Later, I’ll show you how to scale this process to get even more insights using bigger data. 

How to perform a competitor link analysis in three steps

This simple process can uncover a lot of link opportunities by looking at the common links that your immediate competitors already have.

1. Identify competitors

You may already know who your competitors are and can use that list later in the process. If you need some help, check out the Organic competitors report in Ahrefs’ Site Explorer to find a list of sites ranking for the same things you are.

Organic competitors report

If you don’t have a website yet, then you can search for a few main terms and you may be able to tell who the competitors will be. Another way of doing this is to simply add a few terms to Ahrefs’ Keywords Explorer and go to the Traffic share by domains report, which will show you which sites are getting the most traffic for those terms. 

Traffic share by domains report

2. Find common links 

Take your list of competitors and enter them in the Link Intersect report in Site Explorer. This is going to give you a list of all the referring domains that your competitors have and show how many different competitors have links from those sites. 

Link Intersect report

You can click the numbers under each domain to see the kind of links they have from each website.

Click to see the individual links

3. Identify competitor strategies

There are a couple of ways to use this data. You can look at the common links your competitors have that you’re missing and work your way through those, or you can look for patterns in the data to see the type of sites where competitors are getting links. You may see things like niche sites, local listings, or other local websites.

Congratulations, you’ve done a basic competitor link analysis! 

If you have a little more time, you can scale the link intersect process to gather even more data and opportunities. Keep reading to find out how.

Bonus tip: Check your competitors’ most linked content

If you look at the Best by links report for some of your competitors in Site Explorer, you may see some interesting content driving links to competitors that you otherwise may not find. 

Best by links report

I commonly see things like tools, studies, or even quotes that drive a lot of links to individual competitors. 

A lot of these links may be unique to a particular competitor, and you may miss them when looking at the overlap of competitor links. 

If you see a competitor successfully gaining links this way, you may be able to create similar tools and content to gain more links for your own site.

Scaled link intersect process

With a scaled process, you’re working with more data that can lead to additional insights and opportunities. You’ll spend more time up front going through this process, but you may be able to reuse the opportunities you find for other clients. I would expect it to take two to four hours for most people to follow this process.

First, we’ll look at your competitors’ links to find your niche-specific links. And later, we’ll look at other sites in your city to find local link opportunities.

Niche links

The easier process that we covered earlier just looks at direct competitors. If you have more than 10 direct competitors, you can still use a similar process to what I’m about to show to gain more insights. 

Just skip the section below and go directly to where you’re exporting referring domains of competitors. You can use your own list or get a larger list from the Competing Domains report in Site Explorer.

Local only

This process is a great fit for companies focused on local SEO, but it requires a little more work. 

By local SEO, I mean companies that mostly compete in one area like dentist offices, law firms, plumbing companies, etc. If you look at sites competing in other markets that may be stronger or more competitive than your own, you’ll find a lot of additional opportunities that no one in your market has taken advantage of.

First, create a copy of this Google sheet with the top 50 U.S. cities. Replace “service” with the name of your niche and copy the value down to create your keyword list. 

Copy "service" down

If you’re in a different country, you can create a similar sheet with popular cities in your country.

Copy the resulting list of terms from column C and paste them into Keywords Explorer, then click the search button.

Copy the keywords into Keywords Explorer
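If you’d rather script the keyword list than copy it down in a sheet, a few lines of Python do the same job. The short city list and the "plumber" niche below are placeholders; substitute the full top-50 list and your own niche.

```python
# Build "niche + city" keyword combinations, mirroring the Google Sheet.
# The short city list and "plumber" niche are placeholder assumptions --
# extend the list to the top 50 cities and swap in your own niche.
cities = ["new york", "los angeles", "chicago", "houston", "phoenix"]
niche = "plumber"

keywords = [f"{niche} {city}" for city in cities]

# One keyword per line, ready to paste into Keywords Explorer
with open("keywords.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(keywords))
```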

If you click the export option, you’ll see another menu with an option to include the SERPs in the export. This will include the sites in the top 10.

Export with SERPs

With the list of sites, delete anything that doesn’t look like a niche site.

You can set your own filters to make this easier. But as a general way to clean the data, I’d recommend:

  • Insert > Table
  • Filter > DR less than 50. Most local sites won’t have a DR that high, whereas directories and aggregators like yellowpages.com will. This is trying to get rid of all the directory/roundup/best-of-type sites.
  • Filter Type > Organic

Then manually delete anything that seems out of place. What you’ll be left with is the top sites in your niche from different cities.

Top niche sites
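Here’s one way to script that cleanup step, assuming the export has "DR" and "Type" columns (check the actual headers in your file and adjust):

```python
import csv

def keep_niche_sites(in_path, out_path, max_dr=50):
    """Filter a Keywords Explorer SERP export down to likely niche sites.

    Keeps organic results with a DR below max_dr, dropping high-DR
    directories and aggregators. The "DR" and "Type" column names are
    assumptions -- match them to your export's headers.
    """
    with open(in_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    kept = [
        r for r in rows
        if r.get("Type") == "Organic"
        and r.get("DR", "").strip()      # skip rows with no DR value
        and float(r["DR"]) < max_dr
    ]
    with open(out_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(kept)
    return len(kept)
```

You’d still want to eyeball the result and manually delete anything that looks out of place.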

All companies

The most time-consuming process is what’s next. For each site, you’ll need to export its referring domains using the Referring domains report in Site Explorer. I’d recommend saving them to a new folder to make the next step easier.

Export referring domains

Or if you have an enterprise plan that has API access and you want to do this quickly, you can iterate through the websites. We provide the needed request if you click the “{ } API” button.

API request for referring domains
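If you do go the API route, the loop itself is simple. A sketch of iterating over your site list is below; the endpoint and parameter names are illustrative assumptions, not the documented request, so copy the exact request from the "{ } API" button instead.

```python
import urllib.parse

# Illustrative sketch only: this endpoint URL and its parameters are
# assumptions. Copy the exact request shown by the "{ } API" button
# in the Referring domains report.
API_ENDPOINT = "https://api.ahrefs.com/v3/site-explorer/refdomains"

def build_requests(targets, token):
    """Build one request description per competitor site."""
    requests_list = []
    for target in targets:
        query = urllib.parse.urlencode({"target": target})
        requests_list.append({
            "url": f"{API_ENDPOINT}?{query}",
            "headers": {"Authorization": f"Bearer {token}"},
        })
    return requests_list
```

From there, fetch each URL with your HTTP client of choice and save each response as its own CSV in the folder.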

Now we need to combine all the files into one. I’ll show you how to do this on Windows and macOS. But if it’s easier for you, you may want to try one of the online tools that can combine CSV files.

I usually do this with Windows Command Prompt. Here’s the process:

  1. Save the files to a new folder
  2. Get the path either by using shift+right-click on the folder and “Copy as path,” or copy it from the address bar when viewing the contents of the folder
  3. Open Command Prompt, which you can find by searching “cmd”
  4. Type “cd,” press “Space,” right-click and paste, then press “Enter”
  5. Type “copy *.csv whatever-name.csv” and press “Enter”

On macOS, you’ll use Terminal instead of Command Prompt, and the command to combine the files in step #5 is “cat *.csv > whatever-name.csv”. Otherwise, the instructions are the same.
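If you’d rather avoid the shell entirely, a small Python script can merge the exports too. As a bonus, it writes the header row only once, whereas the `copy`/`cat` commands repeat each file’s header in the combined output.

```python
import csv
import glob
import os

def combine_csvs(folder, out_path):
    """Merge every CSV in `folder` into one file, writing the header once.

    Unlike `copy *.csv` / `cat *.csv`, this skips the repeated header
    row at the top of every export after the first.
    """
    with open(out_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        header_written = False
        for path in sorted(glob.glob(os.path.join(folder, "*.csv"))):
            with open(path, newline="", encoding="utf-8") as f:
                reader = csv.reader(f)
                header = next(reader, None)
                if header and not header_written:
                    writer.writerow(header)
                    header_written = True
                writer.writerows(reader)
```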

Open the new combined file. We’ll need to get a count of how many times each of the referring domains appears in the file. Here’s how to do that:

  • Insert > Table
  • Insert Column next to Referring Domain column and name it Count
  • In the Count Column, add the formula =COUNTIF(B:B,[@[Referring Domain]])
Count of referring domains

Copy the numbers in the Count column and paste in the same place as values. This makes it so the numbers aren’t lost when we remove duplicates. To remove duplicates:

  • Data > Remove Duplicates based on the Referring Domain
  • Data > Sort by Count > Largest to Smallest
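If you combined the files with a script, the COUNTIF and Remove Duplicates steps collapse to a few lines as well. This is a sketch assuming you’ve already read the Referring Domain column into a list:

```python
from collections import Counter

def top_referring_domains(domains):
    """Count how often each referring domain appears, sorted descending.

    `domains` is the Referring Domain column from the combined file --
    one entry per row, duplicates included.
    """
    counts = Counter(domains)
    return sorted(counts.items(), key=lambda item: item[1], reverse=True)
```

Feeding in that column gives you the same deduplicated, sorted list as the spreadsheet steps.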

This leaves you with the top referring sites in your niche. It should look something like this: 

Top referring sites

You’ll want to categorize them in ways that make sense to you. I’d highly recommend pulling out the niche-specific domains. Sometimes, you’ll see various organizations, trade shows, suppliers, vendors, niche-specific directories, etc.

Niche-specific links for accountants

Take that list and look at how they’re linking to these other sites. There are a number of ways to do this like checking the Linked Domains report for these sites to see how they link out to other sites.

You can even repeat this process with other similar niches where you may find additional opportunities that no one in your niche has taken advantage of.

Next, let’s look at how you can use a similar process to find local link opportunities.

Local links

If we look outside our niche to other sites in the same city, the overlap of their links can provide a lot of opportunities for local link building. Local websites tend to talk about and link to other local websites. If you’re an agency that has a lot of local SEO clients in the same area, you’ll definitely want to go through this process.

Make a copy of this Google sheet with some of the most popular niches for local businesses. Simply add the name of your city and copy the value down to create your keyword list.

Copy the city value down

The rest of the steps are the same as the niche example above where you copy the terms into Keywords Explorer, export the results with the SERPs, and then pull the referring domains for all of the sites.

When you get to the categorization part, look for any interesting patterns. Here are a few I found for my hometown of Raleigh:

  1. Colleges/universities – These include jobs, scholarships, club sponsorships, discounts, and alumni links.
  2. City-specific sites and directories
  3. Local news and magazines
  4. Sites about the state and surrounding areas
  5. Suppliers/affiliations/partners – Some of these were testimonials and case studies.
  6. Churches – They seem to link to a lot of local organizations and businesses of their members.
  7. Business groups – They link to those who are part of the group.
  8. Events – I saw a lot of links from meetup.com—most of which were from hosting or sponsoring.
  9. Sponsorships and charities – This is also great for supporting the local community.
  10. Podcasts
  11. Awards – Especially local favorites.
  12. Coupons
  13. Directories
  14. Job postings

You may find some other interesting patterns. For example, I saw that pretty much anyone involved in weddings like DJs, photographers, event planners, and caterers all seem to link to each other. I found another pattern where realtors, apartments, and HOA sites tend to link to things to do and places to eat. 

I also saw some well-done ego bait expert roundups. Many of these people have their own personal sites and blogs that link to this type of content.

There were also a lot of links from the local Reddit community, Facebook groups, and Nextdoor. These types of sites can also be valuable sources for referrals.

Final thoughts

Analyzing your competitors’ links is a great way to reverse engineer their strategies and find the links that may be helping your competitors the most. Scaling that process can lead to a lot of unique insights.

Message me on Twitter if you have any questions.



Google Cautions On Blocking GoogleOther Bot



Google’s Gary Illyes answered a question about the non-search features that the GoogleOther crawler supports, then added a caution about the consequences of blocking GoogleOther.

What Is GoogleOther?

GoogleOther is a generic crawler created by Google for purposes that fall outside those of its specialized crawlers for Search, Ads, Video, Images, News, Desktop, and Mobile. It can be used by internal teams at Google for research and development related to various products.

The official description of GoogleOther is:

“GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.”

Something that may be surprising is that there are actually three kinds of GoogleOther crawlers.

Three Kinds Of GoogleOther Crawlers

  1. GoogleOther
    Generic crawler for public URLs
  2. GoogleOther-Image
    Optimized to crawl public image URLs
  3. GoogleOther-Video
    Optimized to crawl public video URLs

All three GoogleOther crawlers can be used for research and development; that’s the only purpose Google publicly acknowledges for them.

What Non-Search Features Does GoogleOther Support?

Google doesn’t say which specific non-search features GoogleOther supports, probably because it doesn’t really “support” a specific feature. It exists for research and development crawling, which could be in support of a new product or an improvement to a current one; its purpose is deliberately open and generic.

This is the question Gary read out:

“What non-search features does GoogleOther crawling support?”

Gary Illyes answered:

“This is a very topical question, and I think it is a very good question. Besides what’s in the public I don’t have more to share.

GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.

Historically Googlebot was used for this, but that kind of makes things murky and less transparent, so we launched GoogleOther so you have better controls over what your site is crawled for.

That said GoogleOther is not tied to a single product, so opting out of GoogleOther crawling might affect a wide range of things across the Google universe; alas, not Search, search is only Googlebot.”

It Might Affect A Wide Range Of Things

Gary is clear that blocking GoogleOther wouldn’t have an effect on Google Search, because Googlebot is the crawler used for indexing content. So if a site owner wants to block any of the three versions of GoogleOther, they should be able to do so without a negative effect on search rankings.

But Gary also cautioned about the outcome of blocking GoogleOther, saying that it would have an effect on other products and services across Google. He didn’t state which products it could affect, nor did he elaborate on the pros or cons of blocking GoogleOther.

Pros And Cons Of Blocking GoogleOther

Whether or not to block GoogleOther doesn’t necessarily have a straightforward answer. There are several considerations to whether doing that makes sense.

Pros

Inclusion in research for a future Google product that’s related to search (maps, shopping, images, a new feature in search) could be useful. Your site might be one of the few chosen to test a feature that could ultimately increase its earnings.

Another consideration: blocking GoogleOther to save on server resources isn’t necessarily a valid reason, because GoogleOther doesn’t seem to crawl often enough to make a noticeable impact.

If blocking Google from using site content for AI is a concern then blocking GoogleOther will have no impact on that at all. GoogleOther has nothing to do with crawling for Google Gemini apps or Vertex AI, including any future products that will be used for training associated language models. The bot for that specific use case is Google-Extended.

Cons

On the other hand it might not be helpful to allow GoogleOther if it’s being used to test something related to fighting spam and there’s something the site has to hide.

It’s possible that a site owner might not want to participate if GoogleOther comes crawling for market research or for training machine learning models (for internal purposes) that are unrelated to public-facing products like Gemini and Vertex.

Allowing GoogleOther to crawl a site for unknown purposes is like giving Google a blank check to use your site data in any way they see fit outside of training public-facing LLMs or purposes related to named bots like GoogleBot.

Takeaway

Should you block GoogleOther? It’s a coin toss. There are potential benefits, but in general there isn’t enough information to make an informed decision.

Listen to the Google SEO Office Hours podcast at the 1:30 mark:

Featured Image by Shutterstock/Cast Of Thousands


AI Search Boosts User Satisfaction



A new study finds that despite concerns about AI in online services, users are more satisfied with search engines and social media platforms than before.

The American Customer Satisfaction Index (ACSI) conducted its annual survey of search and social media users, finding that satisfaction has either held steady or improved.

This comes at a time when major tech companies are heavily investing in AI to enhance their services.

Search Engine Satisfaction Holds Strong

Google, Bing, and other search engines have rapidly integrated AI features into their platforms over the past year. While critics have raised concerns about potential negative impacts, the ACSI study suggests users are responding positively.

Google maintains its position as the most satisfying search engine with an ACSI score of 81, up 1% from last year. Users particularly appreciate its AI-powered features.

Interestingly, Bing and Yahoo! have seen notable improvements in user satisfaction, notching 3% gains to reach scores of 77 and 76, respectively. These are their highest ACSI scores in over a decade, likely due to their AI enhancements launched in 2023.

The study hints at the potential of new AI-enabled search functionality to drive further improvements in the customer experience. Bing has seen its market share improve by small but notable margins, rising from 6.35% in the first quarter of 2023 to 7.87% in Q1 2024.

Customer Experience Improvements

The ACSI study shows improvements across nearly all benchmarks of the customer experience for search engines. Notable areas of improvement include:

  • Ease of navigation
  • Ease of using the site on different devices
  • Loading speed performance and reliability
  • Variety of services and information
  • Freshness of content

These improvements suggest that AI enhancements positively impact various aspects of the search experience.

Social Media Sees Modest Gains

For the third year in a row, user satisfaction with social media platforms is on the rise, increasing 1% to an ACSI score of 74.

TikTok has emerged as the new industry leader among major sites, edging past YouTube with a score of 78. This underscores the platform’s effective use of AI-driven content recommendations.

Meta’s Facebook and Instagram have also seen significant improvements in user satisfaction, showing 3-point gains. While Facebook remains near the bottom of the industry at 69, Instagram’s score of 76 puts it within striking distance of the leaders.

Challenges Remain

Despite improvements, the study highlights ongoing privacy and advertising challenges for search engines and social media platforms. Privacy ratings for search engines remain relatively low but steady at 79, while social media platforms score even lower at 73.

Advertising experiences emerge as a key differentiator between higher- and lower-satisfaction brands, particularly in social media. New ACSI benchmarks reveal user concerns about advertising content’s trustworthiness and personal relevance.

Why This Matters For SEO Professionals

This study provides an independent perspective on how users are responding to the AI push in online services. For SEO professionals, these findings suggest that:

  1. AI-enhanced search features resonate with users, potentially changing search behavior and expectations.
  2. The improving satisfaction with alternative search engines like Bing may lead to a more diverse search landscape.
  3. The continued importance of factors like content freshness and site performance in user satisfaction aligns with long-standing SEO best practices.

As AI becomes more integrated into our online experiences, SEO strategies may need to adapt to changing user preferences.


Featured Image: kate3155/Shutterstock


Google To Upgrade All Retailers To New Merchant Center By September



Google has announced plans to transition all retailers to its updated Merchant Center platform by September.

This move will affect e-commerce businesses globally and comes ahead of the holiday shopping season.

The Merchant Center is a tool for online retailers to manage how their products appear across Google’s shopping services.

Key Changes & Features

The new Merchant Center includes several significant updates.

Product Studio

An AI-powered tool for content creation. Google reports that 80% of current users view it as improving efficiency.

This feature allows retailers to generate tailored product assets, animate still images, and modify existing product images to match brand aesthetics.

It also simplifies tasks like background removal and image resolution enhancement.

Centralized Analytics

A new tab consolidating various business insights, including pricing data and competitive analysis tools.

Retailers can access pricing recommendations, competitive visibility reports, and retail-specific search trends, enabling them to make data-driven decisions and capitalize on popular product categories.

Redesigned Navigation

Google claims the new interface is more intuitive and cites increased setup success rates for new merchants.

The platform now offers simplified website verification processes and can pre-populate product information during setup.

Initial User Response

According to Google, early adopters have shown increased engagement with the platform.

The company reports a 25% increase in omnichannel merchants adding product offers in the new system. However, these figures have yet to be independently verified.

Jeff Harrell, Google’s Senior Director of Merchant Shopping, states in an announcement:

“We’ve seen a significant increase in retention and engagement among existing online merchants who have moved to the new Merchant Center.”

Potential Challenges and Support

While Google emphasizes the upgrade’s benefits, some retailers, particularly those comfortable with the current version, may face challenges adapting to the new system.

The upgrade’s mandatory nature could raise concerns among users who prefer the existing interface or have integrated workflows based on the current system.

To address these concerns, Google has stated that it will provide resources and support to help with the transition. This includes tutorial videos, detailed documentation, and access to customer support teams for troubleshooting.

Industry Context

This update comes as e-commerce platforms evolve, with major players like Amazon and Shopify enhancing their seller tools. Google’s move is part of broader efforts to maintain competitiveness in the e-commerce services sector.

The upgrade could impact consumers by improving product listings and providing more accurate information across Google’s shopping services.

For the e-commerce industry as a whole, it signals a continued push towards AI-driven tools and data-centric decision-making.

Transition Timeline

Google states that retailers who haven’t yet transitioned will be automatically upgraded by September.

The company advises users to familiarize themselves with the new features before the busy holiday shopping period.


Featured Image: BestForBest/Shutterstock
