SEO
8 Ahrefs API Use Cases For Agencies and Enterprises
It’s no secret that APIs are major time savers. They help automate many marketing tasks, from creating reports to forecasting SEO opportunities.
They can also improve operational efficiency and provide insights for executives to make better decisions, faster.
Here are the top 8 use cases of our API and how you can leverage actionable SEO and website data in a jiffy.
Reporting is by far the biggest use case of our API. It is ideal for:
- Building executive dashboards
- Creating visuals for internal reports
- Creating scorecards
- Monitoring your search visibility for key segments
- Monitoring website health over time
You can warehouse the data yourself and mix it with other sources, or you can visualize it with business intelligence tools like Tableau, Power BI, or even just Google Sheets.
For example, we use the API to pull referring domain data for our blog and aggregate by author. We have some nice little sparklines to visualize growth (or decline), too.
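As a sketch of that aggregation step, here is what it might look like in Python once you have pulled per-URL referring-domain counts from the API (the field names, URLs, and numbers below are illustrative sample data, not actual API output):

```python
from collections import defaultdict

# Sample rows standing in for per-URL referring-domain counts pulled
# from the API; the field names and numbers are illustrative.
rows = [
    {"url": "/blog/post-a", "author": "Sam", "ref_domains": 120},
    {"url": "/blog/post-b", "author": "Sam", "ref_domains": 85},
    {"url": "/blog/post-c", "author": "Patrick", "ref_domains": 210},
]

def aggregate_by_author(rows):
    """Sum referring domains per author for the internal report."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["author"]] += row["ref_domains"]
    return dict(totals)

print(aggregate_by_author(rows))  # {'Sam': 205, 'Patrick': 210}
```

From there, the per-author totals can feed a BI tool or a Google Sheet for the sparkline views.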
We also combine referring domain data with data from other sources like GSC.
For example, this view shows us actual traffic data from GSC alongside the number of followed DR40+ referring domains for each post:
No matter what reporting tools you use internally, we’ve made it easy to integrate many Ahrefs graphs and visuals directly into your dashboard so you can build similar reports.
Just use this nifty API button in any report with data that you’d like to pull into your internal dashboards:
The true power of using our API for reporting is how it helps you keep your finger on the pulse of every area of your business so you can make better decisions faster.
We recommend building reports to monitor the performance of the following:
- Each of your business units
- Different product lines and services
- Custom segmentation
- Individual authors and contributors
It’s the best way to identify underperforming or under-resourced products and teams. These dashboards can also help you get SEO buy-in from executives so they approve new projects or budget increases.
If you ever need to pull big SEO data and mix it however you like, our API is the tool for the job. It can help with:
- Large-scale SEO analysis
- Enterprise audits
- Data studies
- SEO testing
- Content inventory creation
- Building outreach lists
- And more
As with most APIs, the best part is that you can pull the data into almost any tool you’re already working with.
For example, say you have a massive list of websites for which you want to pull metrics like Domain Rating (DR). You can do this for up to 200 websites with Batch Analysis—but you can pull the metric for as many URLs as you like with the API.
Here’s a simple example in Google Sheets:
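In code, the batching logic is simple to sketch. The 200-target batches below mirror the Batch Analysis limit, but with the API you just loop over as many batches as you need; the actual request step is omitted here, since endpoints and parameters depend on your plan:

```python
def chunk(items, size):
    """Split a long list of targets into request-sized batches."""
    return [items[i:i + size] for i in range(0, len(items), size)]

domains = [f"example{i}.com" for i in range(450)]  # your full target list
batches = chunk(domains, 200)
print(len(batches))  # 3 batches: 200 + 200 + 50
# for batch in batches:
#     ...send the batch to the API and collect DR for each domain...
```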
Alternatively, say you want to enrich your content inventory by pulling the keyword data. Specifically, the keyword sending the most organic traffic to each URL, its ranking position, and the estimated traffic it sends. The API makes it possible to do this at the touch of a button.
These are overly simple examples. You can pull as much or as little data as you want to suit pretty much any requirements. You can even mix and match data from Site Explorer, Site Audit, Keywords Explorer, and Rank Tracker.
The true power of an API kicks in when you automate strategic tasks that can’t easily be scaled manually.
It’s the secret to taking your strategies to the next level, especially for enterprise SEO projects.
For instance, you can automate many link building workflows like triggering alerts and actions based on discovering new or lost links.
You can also automate technical workflows like finding pages to redirect. On large websites, this can be an overwhelming task to do manually. A simple workflow you can consider instead might look like:
Sidenote.
If this use case sounds interesting, feel free to check out this free redirect-matching script created by our technical SEO genius, Patrick Stox. Once configured, it automatically runs through the above process for you.
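Patrick’s script does the heavy lifting, but as a rough illustration of the matching idea, a fuzzy-match pass over URL paths might look like this (a simplified sketch, not the actual script; the URLs are made up):

```python
import difflib

def match_redirects(dead_urls, live_urls, cutoff=0.6):
    """Suggest a redirect target for each dead URL by path similarity.
    Unmatched URLs map to None so they can be reviewed manually."""
    mapping = {}
    for dead in dead_urls:
        best = difflib.get_close_matches(dead, live_urls, n=1, cutoff=cutoff)
        mapping[dead] = best[0] if best else None
    return mapping

dead = ["/blog/seo-tips-2019", "/old-pricing"]
live = ["/blog/seo-tips", "/pricing", "/about"]
print(match_redirects(dead, live))
# {'/blog/seo-tips-2019': '/blog/seo-tips', '/old-pricing': '/pricing'}
```

A production workflow would match on page titles and content as well as paths, but the shape is the same: compare every dead URL against the live inventory and flag the closest candidate.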
The opportunities for automated workflows that harness our SEO API really are endless. We’ve seen folks use our API to:
- Pull keywords into internal systems and tag them based on products, services, locations, or business units they relate to.
- Pull domain metrics for domain buying.
- Combine SEO data with Google Ads data to lower ad costs.
- And so much more.
Many agency sales teams, digital investors, and B2B business development managers use our API data to assist with things like:
- Lead scoring and enrichment
- Qualifying prospects
- Finding advertising partners
- Doing due diligence on companies
For example, let’s say you’re evaluating the following companies as prospects for a new marketing product or service you’re launching.
In this example, we’ve pulled the following website metrics to help score these prospects:
- Domain Rating (DR) can help determine the size and authority of a prospect’s company.
- Organic Cost can indicate a website’s size and visibility potential.
- Paid Cost can help indicate the current budget a company is investing in Google Ads.
Depending on what your ideal customer looks like, you can score these prospects in a few different ways using these three metrics alone.
For instance, you can favor indicators of underperformance if you sell a service that can help close a performance gap:
Or if you offer a high-ticket product or service, you can qualify prospects based on indicators of business size or the size of their budgets:
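A toy scoring function for the high-ticket case might look like the following. The thresholds and weights are purely illustrative assumptions and should be tuned to your own ideal customer profile (for the underperformance case, you would invert the logic and reward gaps instead):

```python
def score_prospect(dr, organic_cost, paid_cost):
    """Toy lead score favoring large, high-budget prospects.
    Thresholds and weights are illustrative only."""
    score = 0
    if dr >= 70:                 # established, authoritative site
        score += 2
    if organic_cost >= 100_000:  # sizable organic footprint
        score += 2
    if paid_cost >= 50_000:      # meaningful existing Google Ads budget
        score += 1
    return score

print(score_prospect(dr=85, organic_cost=250_000, paid_cost=60_000))  # 5
print(score_prospect(dr=35, organic_cost=20_000, paid_cost=0))        # 0
```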
No matter the case, you can use the data available in our API to draw conclusions like the following about any prospects you’re evaluating:
- Showit is the ideal candidate for us to work with. There’s a lot of room for growth and we can make a decent impact with our competitively priced marketing services.
- WordPress is a great candidate for us to pitch our PPC services since it has the smallest spend among website-building platforms of similar size.
- Webflow may be a great candidate for our non-search marketing services. They clearly have a marketing budget for PPC and SEO, and they may also be open to investing in other channels.
Bottom line? If website performance can be used as an indicator to segment your prospects or leads, our API can help enrich your sales processes big time.
While using SEO metrics to qualify leads is one potential use case for sales teams, another is to use these metrics to help close more deals by:
- Creating data-driven case studies
- Populating data into customized sales decks
- Sharing the performance of your entire client portfolio
For example, some forward-thinking agency sales teams use our API to pull organic data across their client portfolios. They build performance dashboards that they then send to prospective clients.
And sure, at a small scale you can simply use our Portfolios feature that allows you to track multiple websites as a cohort:
But with the API, you can aggregate more metrics and track more projects so you can display real-time results to prospective clients.
Ever wanted to say (and prove) to potential clients things like the following?
- “We’ve delivered over 10,000 position 1 rankings for our clients in the last 6 months.”
- “Six of our clients have achieved over 1 million organic visits after partnering with us.”
- “We’ve saved our clients an average of $100,000/month in ad spend.”
With our API, you can. It’s all about aggregating SEO performance metrics to help your proposals stand out from the crowd.
The global ecommerce market is forecast to hit $6.3 trillion in 2024, and with more people buying online now than ever before, digital performance data is vital for investors to be able to access in real time.
If you’re a venture capitalist, hedge fund manager, or private equity investor, you can use our API as an alternative data source to:
- Monitor online market movements
- Check your portfolio’s digital performance
- Track online performance of any company
- Be instantly notified of website traffic losses
- Inform your investment decisions
For instance, in this video, Sam looks at how the websites with the most visibility in search engines perform as a custom stock portfolio against some of the most popular assets in the world like the S&P 500, Nasdaq 100, real estate, gold, bonds, and Bitcoin.
For seasoned investors, the power of data available in our API can help take your investment decisions to the next level. You can integrate graphs from the Ahrefs dashboard directly into your tools (thanks to our nifty API button) or mix website traffic data with other data sets however you like.
For instance, let’s say you’re considering investing in a particular company. Everything looks good on paper, and you’ve been monitoring its growth over the last few months, including its website performance.
Had you not added a graph tracking their website performance in your dashboard (like the following), you may not have noticed this 25% loss of organic traffic early enough to take appropriate action:
In some industries, this may not matter for the stock value, since website visits don’t necessarily translate into purchases or company valuation. In others, it could be a deal breaker.
If multiple companies in the same vertical are experiencing similar losses in visibility, this could indicate a widespread market movement you need to know about. Traffic losses across multiple websites can also often indicate revenue losses across the industry.
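The alerting logic behind such a monitor is straightforward to sketch. The traffic numbers and the 25% threshold here are illustrative, standing in for data your dashboard would pull via the API:

```python
def traffic_drop_pct(previous, current):
    """Percent decline in organic traffic between two periods."""
    return (previous - current) / previous * 100

# Monthly organic visits, e.g. pulled into your dashboard's data store:
prev_month, this_month = 80_000, 60_000
drop = traffic_drop_pct(prev_month, this_month)
if drop >= 25:
    print(f"Alert: organic traffic down {drop:.0f}%")  # Alert: organic traffic down 25%
```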
For instance, this is an example of two market-leading companies in a specific vertical experiencing traffic losses at a similar time.
And here you can see their keyword ranking movements echo one another with similar rises and dips after January 2024:
Such patterns can indicate a bigger issue affecting the entire market, not just specific companies.
The data available in our API can help you monitor widespread market movements and changes in search behavior across any vertical you’re interested in, in real time.
While website performance data on its own is not enough to base investment decisions on, it is a vital alternative data source to help you beat the market and mitigate potential losses.
Competitive intelligence is what Ahrefs was built for. With analytics tools and Google Search Console, you can easily find performance data about your own website. But what about competitors?
Our tool allows you to compare apples to apples when looking at competitor data. In particular, our API can help you automate things like:
- Creating competitor scorecards
- Estimating resources needed to catch up to competitors
- Monitoring competitor movements
- Gathering historical insights
- Finding and predicting untapped opportunities
For example, Patrick recently created a handful of beginner-friendly competitor scorecards that you can also take for a spin.
To use these, you will need to first make a copy and add your Ahrefs API key. If you’re using the general scorecard, you’ll need to select a date (it must be the first day of a month to work). Then, add your domain and your competitors.
You won’t need to add a date with the MOM and YOY versions. Just add the API key and competitor URLs.
Here’s an example of what the output will look like:
If you find yourself running competitor gap analysis reports at scale, you may also benefit from using our API to automate competitor backlink analysis and closing content gaps against top competitors.
Making projections is a core staple of enterprise SEO. It’s how executive teams are able to approve projects and allocate funding appropriately.
It’s also how agency owners set their agencies apart from competitors by adding forecasts to their sales pitches.
With our API and these free templates that Patrick has pre-built, you can:
Check out Patrick’s detailed post on all things to do with SEO forecasting for more ideas and tips on how to use these free templates in your business today.
Final thoughts
With the power of seriously big data on your side, the possibilities for how you can automate SEO tasks, site audits, and reports are endless.
The Ahrefs API offers many data points that no other tool provides. We’ve designed it that way on purpose.
Feel free to book a demo with our enterprise team to see what our API can do for your business.
How To Stop Filter Results From Eating Crawl Budget
Today’s Ask An SEO question comes from Michal in Bratislava, who asks:
“I have a client who has a website with filters based on map locations. When the visitor makes a move on the map, a new URL with filters is created. They are not in the sitemap. However, there are over 700,000 URLs in Search Console (not indexed), and they are eating crawl budget.
What would be the best way to get rid of these URLs? My idea is to keep the base location ‘index, follow’ and switch the newly created URLs of the surrounding area with filters to ‘noindex, nofollow’. Also, mark surrounding areas with canonicals to the base location + disavow the unwanted links.”
Great question, Michal, and good news! The answer is an easy one to implement.
First, let’s look at what you’re trying to do and apply it to other situations, like ecommerce and publishers, so more people can benefit. Then we’ll go into your proposed strategies and end with the solution.
What Crawl Budget Is And How Parameters Are Created That Waste It
If you’re not sure what Michal is referring to with crawl budget, this is a term some SEO pros use to explain that Google and other search engines will only crawl so many pages on your website before it stops.
If your crawl budget is used on low-value, thin, or non-indexable pages, your good pages and new pages may not be found in a crawl.
If they’re not found, they may not get indexed or refreshed. If they’re not indexed, they cannot bring you SEO traffic.
This is why optimizing a crawl budget for efficiency is important.
Michal shared an example of how URLs that are “thin” from an SEO point of view get created as customers use filters.
The experience for the user is value-adding, but from an SEO standpoint, a location-based page would be better. This applies to ecommerce and publishers, too.
Ecommerce stores will have searches for colors like red or green and products like t-shirts and potato chips.
These create URLs with parameters just like a filter search for locations. They could also be created by using filters for size, gender, color, price, variation, compatibility, etc. in the shopping process.
The filtered results help the end user but compete directly with the collection page, and the collection would be the “non-thin” version.
Publishers have the same. Someone might be on SEJ looking for SEO or PPC in the search box and get a filtered result. The filtered result will have articles, but the category of the publication is likely the best result for a search engine.
These filtered results can get indexed because they get shared on social media, or because someone adds them as a comment on a blog or forum, creating a crawlable backlink. A customer service employee might also link to one when responding to a question on the company blog, among any number of other ways.
The goal now is to make sure search engines don’t spend time crawling the “thin” versions so you can get the most from your crawl budget.
The Difference Between Indexing And Crawling
There’s one more thing to learn before we go into the proposed ideas and solutions – the difference between indexing and crawling.
- Crawling is the discovery of new pages within a website.
- Indexing is adding a page to the search engine’s database of pages it considers worth showing to searchers.
Pages can get crawled but not indexed. Indexed pages have likely been crawled and will likely get crawled again to look for updates and server responses.
But not all indexed pages will bring in traffic or hit the first page because they may not be the best possible answer for queries being searched.
Now, let’s go into making efficient use of crawl budgets for these types of solutions.
Using Meta Robots Or X Robots
The first solution Michal pointed out was an “index,follow” directive. This tells a search engine to index the page and follow the links on it. This is a good idea, but only if the filtered result is the ideal experience.
From what I can see, this would not be the case, so I would recommend making it “noindex,follow.”
Noindex would say, “This is not an official page, but hey, keep crawling my site, you’ll find good pages in here.”
And if you have your main menu and navigational internal links done correctly, the spider will hopefully keep crawling them.
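In practice, the “noindex,follow” directive goes in the page’s HTML head (or, equivalently, in an `X-Robots-Tag: noindex, follow` HTTP response header for non-HTML resources):

```html
<!-- In the <head> of each filtered-result page: -->
<meta name="robots" content="noindex, follow">
```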
Canonicals To Solve Wasted Crawl Budget
Canonical links are used to help search engines know what the official page to index is.
If a product exists in three categories on three separate URLs, only one should be “the official” version, so the two duplicates should have a canonical pointing to the official version. The official one should have a canonical link that points to itself. This applies to the filtered locations.
If a location search results in multiple city or neighborhood pages, those results are likely duplicates of the official page you have in your sitemap.
Have the filtered results point a canonical back to the main page of filtering instead of being self-referencing if the content on the page stays the same as the original category.
If the content pulls in your localized page with the same locations, point the canonical to that page instead.
In most cases, the filtered version inherits the page you searched or filtered from, so that is where the canonical should point to.
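As a concrete example, on a filtered view the canonical link element points back to the base page it was filtered from (the URLs here are illustrative):

```html
<!-- On a filtered view such as /locations/bratislava?area=old-town,
     point the canonical at the base location page: -->
<link rel="canonical" href="https://example.com/locations/bratislava">
```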
Using noindex together with a self-referencing canonical is overkill, and it sends conflicting signals.
The same applies to when someone searches for a product by name on your website. The search result may compete with the actual product or service page.
With this solution, you’re telling the spider not to index this page because it isn’t worth indexing, but it is also the official version. It doesn’t make sense to do this.
Instead, use a canonical link, as I mentioned above, or noindex the result and point the canonical to the official version.
Disavow To Increase Crawl Efficiency
Disavowing doesn’t have anything to do with crawl efficiency unless the search engine spiders are finding your “thin” pages through spammy backlinks.
The disavow tool from Google is a way to say, “Hey, these backlinks are spammy, and we don’t want them to hurt us. Please don’t count them towards our site’s authority.”
In most cases, it doesn’t matter, as Google is good at detecting spammy links and ignoring them.
You do not want to add your own site and your own URLs to the disavow tool. You’re telling Google your own site is spammy and not worth anything.
Plus, submitting backlinks to disavow won’t prevent a spider from seeing what you want and do not want to be crawled, as it is only for saying a link from another site is spammy.
Disavowing won’t help with crawl efficiency or saving crawl budget.
How To Make Crawl Budgets More Efficient
The answer is robots.txt. This is how you tell specific search engines and spiders what to crawl.
You can include the folders you want them to crawl by marking them as “allow,” and you can say “disallow” on filtered results by disallowing the “?” or “&” symbol or whichever you use.
If some of those parameters should be crawled, add the main word like “?filter=location” or a specific parameter.
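Putting that together, a robots.txt along these lines would keep parameterized filter URLs out of the crawl (the paths and parameter names are illustrative; match them to your own URL structure):

```text
User-agent: *

# Block every URL containing a query string:
Disallow: /*?

# Or be more surgical and block only specific filter parameters:
# Disallow: /*?filter=
# Disallow: /*&filter=

# Folders you explicitly want crawled:
Allow: /blog/
```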
Robots.txt is how you define crawl paths and work on crawl efficiency. Once you’ve optimized that, look at your internal links: links from one page on your site to another.
These help spiders find your most important pages while learning what each is about.
Internal links include:
- Breadcrumbs.
- Menu navigation.
- Links within content to other pages.
- Sub-category menus.
- Footer links.
You can also use a sitemap if you have a large site, and the spiders are not finding the pages you want with priority.
I hope this helps answer your question. It is one I get a lot – you’re not the only one stuck in that situation.
More resources:
Featured Image: Paulo Bobita/Search Engine Journal
Ad Copy Tactics Backed By Study Of Over 1 Million Google Ads
Mastering effective ad copy is crucial for achieving success with Google Ads.
Yet, the PPC landscape can make it challenging to discern which optimization techniques truly yield results.
Although various perspectives exist on optimizing ads, few are substantiated by comprehensive data. A recent study from Optmyzr attempted to address this.
The goal isn’t to promote or dissuade any specific method but to provide a clearer understanding of how different creative decisions impact your campaigns.
Use the data to help you identify higher profit probability opportunities.
Methodology And Data Scope
The Optmyzr study analyzed data from over 22,000 Google Ads accounts that have been active for at least 90 days with a minimum monthly spend of $1,500.
Across more than a million ads, we assessed Responsive Search Ads (RSAs), Expanded Text Ads (ETAs), and Demand Gen campaigns. Due to API limitations, we could not retrieve asset-level data for Performance Max campaigns.
Additionally, all monetary figures were converted to USD to standardize comparisons.
Key Questions Explored
To provide actionable insights, we focused on addressing the following questions:
- Is there a correlation between Ad Strength and performance?
- How do pinning assets impact ad performance?
- Do ads written in title case or sentence case perform better?
- How does creative length affect ad performance?
- Can ETA strategies effectively translate to RSAs and Demand Gen ads?
As we evaluated the results, it’s important to note that our data set represents advanced marketers.
This means there may be selection bias, and these insights might differ in a broader advertiser pool with varying levels of experience.
The Relationship Between Ad Strength And Performance
Google explicitly states that Ad Strength is a tool designed to guide ad optimization rather than act as a ranking factor.
Despite this, marketers often hold mixed opinions about its usefulness, as its role in ad performance appears inconsistent.
Our data corroborates this skepticism. Ads labeled with an “average” Ad Strength score outperformed those with “good” or “excellent” scores in key metrics like CPA, conversion rate, and ROAS.
This disparity is particularly evident in RSAs, where the ROAS tends to decrease sharply when moving from “average” to “good,” with only a marginal increase when advancing to “excellent.”
Interestingly, Demand Gen ads also showed a stronger performance with an “average” Ad Strength, except for ROAS.
The metrics for conversion rates in Demand Gen and RSAs were notably similar, which is surprising since Demand Gen ads are typically designed for awareness, while RSAs focus on driving transactions.
Key Takeaways:
- Ad Strength doesn’t reliably correlate with performance, so it shouldn’t be a primary metric for assessing your ads.
- Most ads with “poor” or “average” Ad Strength labels perform well by standard advertising KPIs.
- “Good” or “excellent” Ad Strength labels do not guarantee better performance.
How Does Pinning Affect Ad Performance?
Pinning refers to locking specific assets like headlines or descriptions in fixed positions within the ad. This technique became common with RSAs, but there’s ongoing debate about its efficacy.
Some advertisers advocate for pinning all assets to replicate the control offered by ETAs, while others prefer to let Google optimize placements automatically.
Our data suggests that pinning some, but not all, assets offers the most balanced results in terms of CPA, ROAS, and CPC. However, ads where all assets are pinned achieve the highest relevance in terms of CTR.
Still, this marginally higher CTR doesn’t necessarily translate into better conversion metrics. Ads with unpinned or partially pinned assets generally perform better in terms of conversion rates and cost-based metrics.
Key Takeaways:
- Selective pinning is optimal, offering a good balance between creative control and automation.
- Fully pinned ads may increase CTR but tend to underperform in metrics like CPA and ROAS.
- Advertisers should embrace RSAs, as they consistently outperform ETAs – even with fully pinned assets.
Title Case Vs. Sentence Case: Which Performs Better?
The choice between title case (“This Is a Title Case Sentence”) and sentence case (“This is a sentence case sentence”) is often a point of contention among advertisers.
Our analysis revealed a clear trend: Ads using sentence case generally outperformed those in title case, particularly in RSAs and Demand Gen campaigns.
(RSA Data)
(ETA Data)
(Demand Gen)
ROAS, in particular, showed a marked preference for sentence case across these ad types, suggesting that a more natural, conversational tone may resonate better with users.
Interestingly, many advertisers still use a mix of title and sentence case within the same account, which counters the traditional approach of maintaining consistency throughout the ad copy.
Key Takeaways:
- Sentence case outperforms title case in RSAs and Demand Gen ads on most KPIs.
- Including sentence case ads in your testing can improve performance, as it aligns more closely with organic results, which users perceive as higher quality.
- Although ETAs perform slightly better with title case, sentence case is increasingly the preferred choice in modern ad formats.
The Impact Of Ad Length On Performance
Ad copy, particularly for Google Ads, requires brevity without sacrificing impact.
We analyzed the effects of character count on ad performance, grouping ads by the length of headlines and descriptions.
(RSA Data)
(ETA Data)
(Demand Gen Data)
Interestingly, shorter headlines tend to outperform longer ones in CTR and conversion rates, while descriptions benefit from moderate length.
Ads that tried to maximize character counts by using dynamic keyword insertion (DKI) or customizers often saw no significant performance improvement.
Moreover, applying ETA strategies to RSAs proved largely ineffective.
In almost all cases, advertisers who carried over ETA tactics to RSAs saw a decline in performance, likely because of how Google dynamically assembles ad components for display.
Key Takeaways:
- Shorter headlines lead to better performance, especially in RSAs.
- Focus on concise, impactful messaging instead of trying to fill every available character.
- ETA tactics do not translate well to RSAs, and attempting to replicate them can hurt performance.
Final Thoughts On Ad Optimizations
In summary, several key insights emerge from this analysis.
First, Ad Strength should not be your primary focus when assessing performance. Instead, concentrate on creating relevant, engaging ad copy tailored to your target audience.
Additionally, pinning assets should be a strategic, creative decision rather than a hard rule, and advertisers should incorporate sentence case into their testing for RSAs and Demand Gen ads.
Finally, focus on quality over quantity in ad copy length, as longer ads do not always equate to better results.
By refining these elements of your ads, you can drive better ROI and adapt to the evolving landscape of Google Ads.
Read the full Ad Strength & Creative Study from Optmyzr.
More resources:
Featured Image: Sammby/Shutterstock
Bing Expands Generative Search Capabilities For Complex Queries
Microsoft has announced an expansion of Bing’s generative search capabilities.
The update focuses on handling complex, informational queries.
Bing provides examples such as “how to effectively run a one-on-one” and “how can I remove background noise from my podcast recordings.”
Searchers in the United States can access the new features by typing “Bing generative search” into the search bar. This will present a carousel of sample queries.
A “Deep search” button on the results page activates the generative search function for other searches.
Beta Release and Potential Challenges
It’s important to note that this feature is in beta.
Bing acknowledges that you may experience longer loading times as the system works to ensure accuracy and relevance.
The announcement reads:
“While we’re excited to give you this opportunity to explore generative search firsthand, this experience is still being rolled out in beta. You may notice a bit of loading time as we work to ensure generative search results are shown when we’re confident in their accuracy and relevancy, and when it makes sense for the given query. You will generally see generative search results for informational and complex queries, and it will be indicated under the search box with the sentence “Results enhanced with Bing generative search” …”
This is the waiting screen you get after clicking on “Deep search.”
In practice, I found the wait was long and sometimes the searches would fail before completing.
The ideal way to utilize this search experience is to click on the suggestions provided after entering “Bing generative search” into the search bar.
Potential Impact
Bing’s generative search results include citations and links to original sources.
This approach is intended to drive traffic to publishers, but it remains to be seen how effective this will be in practice.
Bing encourages users to provide feedback on the new feature using thumbs up/down icons or the dedicated feedback button.
See also: Google AIO Is Ranking More Niche Specific Sites
Looking Ahead
This development comes as search engines increasingly use AI to enhance their capabilities.
As Bing rolls out this expanded generative search feature, remember the technology is still in beta, so performance and accuracy may vary.
Featured Image: JarTee/Shutterstock