

How To Improve Lead Quality Without Backend Data



Integrating backend data into digital marketing initiatives is a game-changer for performance.

But what can you do when backend conversion data is unavailable or unreliable?

How can marketers optimize for lead quality and higher margins without explicit insight into which keywords and audiences have the most value?

This article will walk you through some indirect ways in which you can optimize for lead quality in Google Ads, despite not having the ideal data passback with your digital advertising platforms.

First, let’s review why this is so important.

Why Optimize For Down-Funnel Events Or Margin?

Before we get into strategies, let’s first align on why this even matters. What’s the harm in just optimizing toward a lead event or transaction?

The visualization below is one our agency uses when communicating the need to blend backend data to inform campaign strategy and decision-making.

Note that while this example is specific to B2B, B2C advertisers generally will have a progression toward revenue, as well.


For example, B2C ecommerce could have a Click > Add To Cart > Start Checkout > Create Account > Complete Order flow.

 Image provided by Closed Loop, December 2021

As the above shows, the path to revenue has several milestone stages, including Clicks, Leads, MQLs, Opportunities, and Closed/Won.

Most advertisers these days are savvy enough to realize that optimizing for the lowest cost click/website visit will lead to low-quality, fat tail keywords that don’t produce impact down-funnel.


However, due to the technical hurdles required to blend backend and front-end data, I often find advertisers stopping at leads when optimizing.

Not a big deal? Think again.

Here’s an example that shows the impact that shifting your focus lower in the funnel can have:

Image provided by Closed Loop, December 2021

On the surface, Campaign A has far stronger performance when evaluating based on leads.

However, the gap widens lower in the funnel to the point where the cost per sale for Campaign A is over 5x that of Campaign B.
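To make that concrete, here is a small illustration with hypothetical numbers (not the figures from the table above) showing how the picture flips once you cost out each stage of the funnel:

```python
# Hypothetical illustration: two campaigns with identical spend, costed at each
# funnel stage. Campaign A "wins" on cost per lead but loses badly on cost per sale.
campaigns = {
    "Campaign A": {"spend": 10_000, "leads": 500, "mqls": 100, "opps": 10, "sales": 2},
    "Campaign B": {"spend": 10_000, "leads": 250, "mqls": 120, "opps": 30, "sales": 10},
}

for name, c in campaigns.items():
    print(
        f"{name}: CPL ${c['spend'] / c['leads']:.0f} | "
        f"cost/MQL ${c['spend'] / c['mqls']:.0f} | "
        f"cost/opp ${c['spend'] / c['opps']:.0f} | "
        f"cost/sale ${c['spend'] / c['sales']:.0f}"
    )
# Campaign A: CPL $20 | cost/MQL $100 | cost/opp $1000 | cost/sale $5000
# Campaign B: CPL $40 | cost/MQL $83 | cost/opp $333 | cost/sale $1000
```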

So, What Do I Do About It?

You should not wait until the ideal state solution is deployed before optimizing toward lead quality.

Offline data can take time to get integrated into Google Ads and other platforms.

However, you can still take meaningful steps to start moving the needle in the right direction while that integration is being worked on.

Before we get into those steps, let’s look at the ideal data state.

The Ideal State Of Backend Data

The ideal scenario for optimizing toward backend data includes:

  • Hidden fields set up on all of your website and native lead gen forms to pass the platform identifier (GCLID, FBCLID, etc.) into your CRM record.
  • Offline Conversion Tracking (OCT) fully integrated into Google Ads and other supporting channels (Microsoft Ads, Facebook, LinkedIn); a minimal upload sketch follows this list.
  • Values calculated and assigned to each conversion point.
  • Value-based bidding enabled in-platform.
  • Backend data blending across channels via a daily, automated CRM export (assuming not all ad platforms you are running on support data passback) to enable cross-channel, full-funnel reporting.
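To make the OCT piece concrete, here is a minimal sketch of an offline click conversion upload using the official google-ads Python client. The customer ID, conversion action resource name, GCLID, and value are placeholders you would pull from your own account and CRM, and exact field names can vary slightly by API version.

```python
# Minimal Offline Conversion Tracking (OCT) sketch with the google-ads Python client.
# All IDs and values below are placeholders.
from google.ads.googleads.client import GoogleAdsClient

client = GoogleAdsClient.load_from_storage("google-ads.yaml")  # your API credentials
customer_id = "1234567890"                                     # hypothetical account ID

click_conversion = client.get_type("ClickConversion")
click_conversion.gclid = "EAIaIQobChMI..."                     # GCLID stored in your CRM
click_conversion.conversion_action = (
    f"customers/{customer_id}/conversionActions/987654321"     # e.g., your "MQL" action
)
click_conversion.conversion_date_time = "2021-12-01 12:32:45-08:00"
click_conversion.conversion_value = 250.0                      # value assigned to an MQL
click_conversion.currency_code = "USD"

request = client.get_type("UploadClickConversionsRequest")
request.customer_id = customer_id
request.conversions.append(click_conversion)
request.partial_failure = True

service = client.get_service("ConversionUploadService")
response = service.upload_click_conversions(request=request)
print(response.results)
```

In practice, you would run something like this on a schedule against new CRM records that already have a GCLID captured by the hidden form field.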

While this may sound straightforward, my experience is that it takes many advertisers a long time to get to this point, given the need to involve stakeholders from multiple departments.

Here are tangible steps you can take to optimize for down-funnel events while the ideal state is being worked on.


Stage 1: The Low Hanging Fruit

Perceived Keyword Intent

When optimizing to lead, the search terms that trigger said lead will vary greatly in quality.

Assuming limited budgets and a desire to improve down-funnel results, you should evaluate keywords (and campaign budget allocation) based on perceived intent, as well as tangential signals that identify quality, such as engagement signals ported in from Google Analytics.

If share of voice (SOV) is lower-than-ideal for strongly performing, high perceived intent terms, consider decreasing exposures on terms with a low perceived intent or poor tangential signals.

Don’t let higher cost per lead numbers scare you.

If you identify keywords with higher perceived intent, despite higher cost per lead, consider adding an “intent multiplier” for leads triggered from certain keywords and audiences.

Pro-tip: Apply labels when making adjustments so that you can easily filter for changes made at specific points in time.

This will enable you to make updates quickly to that data set in the future (e.g., if your monthly budget increases and it makes sense to activate a tranche of keywords previously paused).


Pre-Qualify The Click

Marketing 101 tells us that the higher the CTR is, the more aligned your targeting and messaging are.

However, you should not blindly optimize just to maximize CTR.

Your ad copy is one of the easiest levers you have at your disposal when trying to improve the quality of your leads.

Think through the attributes that make up a high-quality lead, then tailor your ads to speak to those personas.

For example, if you are a B2B advertiser attracting enterprise IT prospects in the Retail vertical, call out things like “Enterprise IT Solution for Retail” in your headline.

Tailoring will decrease the ad’s relevancy for some searchers.

However, you’ll free up budget for better-aligned audiences.

By clearly identifying who your product or service is for in the copy, you’ll weed out those who aren’t good fits, such as SMBs and manufacturing companies.

Utilize Audience Layers

Google Ads has a wide range of affinity, in-market, detailed demographic, and custom audience options available to advertisers.


By applying audience layers to your campaigns, you can bid up or down (manual bidding) or include or exclude via RLSA campaigns.

Stage 2: Leverage First-Party Data For Audience Building

Regardless of whether your CRM is connected to your advertising platforms, it still holds customer and prospect records that you can extract and that are highly valuable to you as a marketer.

Here are three ways you can fully leverage that data.

Nurture Using CRM Data

You can improve down-funnel lead quality – especially in sales funnels that extend beyond a few days – via lead nurture initiatives across display/programmatic, YouTube, social, and search.

A marketer’s job does not stop at the lead stage.

An organization must stay top of mind throughout the entire buyer’s journey.

Marketers should be working with sales to evaluate (often via a lead scoring system) which leads in their system show promise.

Then ensure those leads, and the organization’s buying committee (via ABM), are being nurtured with both brand and thought leadership content to keep you top of mind and build more authority.



While Google does not offer ABM solutions, you can target specific companies and functions within said companies via the Microsoft Audience Network, LinkedIn, and other providers.

List Building Using CRM Data

Major ad platforms offer list upload options (via phone, e-mail, or mobile app ID) to seed Similar To/Lookalike audiences.

By thinking through your list upload segments, you can target people who have attributes similar to your most valuable lists (e.g. top customers).

On the flip side, you can upload lists for groups of low-quality prospects and customers, then exclude them from your targeting (all bid strategies) or bid down if using a manual bid strategy.

You don’t have to use that list for explicit targeting. It can also be used to glean insights into your customer base or as a seed list for Similar To or Lookalike audience creation (keep reading!).
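As a rough sketch of that list-building step (assuming your CRM export is a CSV, and noting that the column names below are hypothetical), the segmentation might look like this:

```python
# Sketch: split a CRM export into seed lists for Customer Match / Lookalike seeding
# and a low-quality list for exclusions. Column names ("email", "lifetime_value",
# "lead_score") are hypothetical - use whatever your CRM actually exports.
import pandas as pd

crm = pd.read_csv("crm_export.csv")

top_customers = crm[crm["lifetime_value"] >= crm["lifetime_value"].quantile(0.8)]
low_quality = crm[(crm["lead_score"] < 20) & (crm["lifetime_value"] == 0)]

top_customers[["email"]].to_csv("seed_top_customers.csv", index=False)  # lookalike seed
low_quality[["email"]].to_csv("exclude_low_quality.csv", index=False)   # exclusion list
```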


Target Modeling Using CRM Data

In addition to explicitly targeting and/or creating a lookalike-based audience using a list upload, both Google Ads and LinkedIn have audience insight tools that can help you identify additional audience segments that align with your best and worst customers.

In Google Ads, head over to your Shared Library > Audience Manager > Your Data Insights.

Here, you will be able to select an audience (upload, pixel, YouTube-based), then see how that audience indexes against a control group (e.g., US population; Bad lead list) across dimensions like age, gender, parental status, location, device and most importantly, Google affinity and in-market segments.

Here’s a look at that report, using a “Closed/Won” list:

Screenshot from Google Ads, December 2021

Once you have some insights, you can decide how best to apply the insights across your campaigns. This could be via bid adjustments, value rules, inclusions (RLSA), or exclusions.


TL;DR: Use Customer Match uploads to feed Google Ads (and beyond) your customer data, then utilize that data through inclusions, exclusions, and attribute modeling.

Pro-tip: While you can manually upload lists to platforms, consider tools like Zapier, Salesforce Advertising Studio, and LiveRamp to better automate this update process and improve match rates.
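If you do automate uploads yourself, keep in mind that Customer Match matches on normalized, SHA-256-hashed identifiers. A minimal sketch of that preprocessing step (file and column names are placeholders) looks like this:

```python
# Sketch: normalize and SHA-256 hash emails before a Customer Match upload.
# Matching happens on the hash, so "User@Example.com " and "user@example.com"
# must normalize to the same value. File and column names are placeholders.
import csv
import hashlib

def normalize_and_hash(email: str) -> str:
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

with open("seed_top_customers.csv") as src, open("hashed_upload.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    writer.writerow(["hashed_email"])
    for row in csv.DictReader(src):
        writer.writerow([normalize_and_hash(row["email"])])
```

Handling this normalization consistently is a big part of why the tools above tend to improve match rates.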

Stage 3: Use Conversion Values To Inform Bidding

The holy grail to strive for is OCT-based conversion points + value-based bidding. Even without OCT data, using value signals in your bidding decisions can still be a net win for performance.

Here are three steps to maximizing conversion value usage.

Step 1: Assign Values To Each Conversion Point

Don’t worry about providing the system with a perfect value when getting started. The goal is to establish values that will nudge the algorithms in the right direction.

Down the road, these values should be based on the eventual transaction value multiplied by the conversion rate from that action to the transaction.
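As a hypothetical example of that math: if the average Closed/Won deal in your CRM is worth $10,000, and roughly 2% of leads, 10% of MQLs, and 30% of opportunities eventually close, the derived conversion values look like this:

```python
# Hypothetical example: derive a value for each conversion point from the average
# deal value and each stage's stage-to-Closed/Won conversion rate.
avg_deal_value = 10_000  # average Closed/Won value from your CRM
close_rate = {"lead": 0.02, "mql": 0.10, "opportunity": 0.30}

stage_values = {stage: avg_deal_value * rate for stage, rate in close_rate.items()}
print(stage_values)  # {'lead': 200.0, 'mql': 1000.0, 'opportunity': 3000.0}
```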

Step 2: Test Into Value-based Bidding (Max Conversion Value/tROAS)


When first getting started, you should set your tROAS target so that it implies roughly the same efficiency as the CPA from your legacy bid strategy.

The goal here is to shift to value-based bidding without undue volatility, and then to start improving efficiency and/or scale by adjusting the tROAS target.
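One way to translate a legacy CPA target into a starting tROAS (a sketch with hypothetical numbers, not an official Google formula) is to divide the average conversion value you report by that CPA:

```python
# Hypothetical starting point: pick a tROAS that implies roughly the same efficiency
# as the legacy CPA target, so the switch itself doesn't introduce volatility.
avg_conversion_value = 200.0  # average value you report per converted lead
legacy_target_cpa = 80.0      # CPA you were hitting under the old bid strategy

starting_troas = avg_conversion_value / legacy_target_cpa
print(f"Starting tROAS target: {starting_troas:.0%}")  # 250%
```

From there, raising the target pushes the system toward efficiency, while lowering it pushes toward scale.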

Step 3: Use Value Rules

New to Google Ads in 2021, this feature allows you to add to, subtract from, or multiply any conversion value based on audience, device, or location.

For example, imagine I want to target enterprise IT decision-makers but don’t have OCT, so I lack visibility into what drives performance beyond the lead.

Even without backend data, I intuitively know I want the algorithms to:

  • Bid higher if the user is in a Similar To audience based on site engagement.
  • Bid higher if Google buckets them into the In-Market: Enterprise Software segment.
  • Bid higher for those who work at enterprise companies.
  • Bid higher for those located in San Francisco, CA.
  • Bid lower if they work at a small company.

Translating that into value rules looks something like this:

Screenshot from Google Ads, December 2021
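As a rough sketch of the logic only (the multipliers are hypothetical, and in practice these rules are configured inside Google Ads rather than in your own code), those adjustments amount to something like:

```python
# Illustrative only: how value rules adjust the value reported for a lead.
# The conditions and multipliers below are hypothetical.
BASE_LEAD_VALUE = 100.0

RULES = [
    {"condition": "similar_to_site_engagers_audience", "multiplier": 1.3},
    {"condition": "in_market_enterprise_software",     "multiplier": 1.4},
    {"condition": "works_at_enterprise_company",       "multiplier": 1.5},
    {"condition": "located_in_san_francisco",          "multiplier": 1.2},
    {"condition": "works_at_small_company",            "multiplier": 0.6},
]

def adjusted_value(user_signals: set) -> float:
    value = BASE_LEAD_VALUE
    for rule in RULES:
        if rule["condition"] in user_signals:
            value *= rule["multiplier"]
    return value

print(adjusted_value({"in_market_enterprise_software", "located_in_san_francisco"}))  # 168.0
```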

With value rules, you can use tools like Google Ads Data Insights along with findings pulled from LinkedIn Detailed Demographics reports, first-party personas, and customer data to tell the algorithms to serve ads to those prospects of higher quality.


Wrapping Things Up

Integrating first-party data to inform audience building and bidding should be at the top of your priority list.

Incorporating backend conversion data via OCT can be challenging, but it is a worthy endeavor to strive toward.

Use these methods to improve lead quality without OCT while you lay the groundwork for direct data passback.



Featured Image: Brovko Serhii/Shutterstock







Google’s Mueller Criticizes Negative SEO & Link Disavow Companies


John Mueller recently made strong statements against SEO companies that provide negative SEO and other agencies that provide link disavow services outside of the tool’s intended purpose, saying that they are “cashing in” on clients who don’t know better.

While many frequently say that Mueller and other Googlers are ambiguous, even on the topic of link disavows, the fact is that Mueller and other Googlers have consistently recommended against using the link disavow tool.

This may be the first time Mueller actually portrayed SEOs who liberally recommend link disavows in a negative light.

What Led to John Mueller’s Rebuke

The context of Mueller’s comments about negative SEO and link disavow companies started with a tweet by Ryan Jones (@RyanJones).

Ryan tweeted that he was shocked at how many SEOs regularly offer disavowing links.

He tweeted:

“I’m still shocked at how many seos regularly disavow links. Why? Unless you spammed them or have a manual action you’re probably doing more harm than good.”

The reason Ryan is shocked is that Google has consistently recommended the tool only for disavowing paid or spammy links that the site owners (or their SEOs) are responsible for.

And yet, here we are, eleven years later, and SEOs are still misusing the tool to remove other kinds of links.

Here’s the background information about that.

Link Disavow Tool

In the mid-2000s, prior to the Penguin Update in April 2012, there was a thriving open market for paid links. The commerce in paid links was staggering.

I knew of one publisher with around fifty websites who received a $30,000 check every month for hosting paid links on his site.

Even though I advised my clients against it, some of them still purchased links because they saw everyone else was buying them and getting away with it.

The Penguin Update caused the link-selling boom to collapse.

Thousands of websites lost rankings.

SEOs and affected websites strained under the burden of having to contact all the sites from which they purchased paid links to ask to have them removed.

So some in the SEO community asked Google for a more convenient way to disavow the links.

Months went by and after resisting the requests, Google relented and released a disavow tool.

Google cautioned from the very beginning to only use the tool for disavowing links that the site publishers (or their SEOs) are responsible for.

The first paragraph of Google’s October 2012 announcement of the link disavow tool leaves no doubt on when to use the tool:

“Today we’re introducing a tool that enables you to disavow links to your site.

If you’ve been notified of a manual spam action based on ‘unnatural links’ pointing to your site, this tool can help you address the issue.

If you haven’t gotten this notification, this tool generally isn’t something you need to worry about.”

The message couldn’t be clearer.

But at some point in time, link disavowing became a service applied to random and “spammy looking” links, which is not what the tool is for.
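For reference, the disavow file itself is just a plain text (.txt) file with one URL or domain: entry per line, plus optional comment lines starting with #. A small hypothetical example:

```
# Paid links flagged in a manual action notice that we could not get removed
http://spammy-directory.example/our-listing.html
http://paid-links.example/sponsor-page.html

# Disavow everything from an entire domain
domain:link-network.example
```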

Link Disavow Takes Months To Work

There are many anecdotes about link disavows that helped sites regain rankings.

The people sharing them aren’t lying; I know credible and honest people who have made this claim.

But here’s the thing, John Mueller has confirmed that the link disavow process takes months to work its way through Google’s algorithm.

But sometimes unrelated things happen around the same time; there is no correlation, it just looks that way.

John shared how long it takes for a link disavow to work in a Webmaster Hangout:

“With regards to this particular case, where you’re saying you submitted a disavow file and then the ranking dropped or the visibility dropped, especially a few days later, I would assume that that is not related.

So in particular with the disavow file, what happens is we take that file into account when we reprocess the links kind of pointing to your website.

And this is a process that happens incrementally over a period of time where I would expect it would have an effect over the course of… I don’t know… maybe three, four, five, six months …kind of step by step going in that direction.

So if you’re saying that you saw an effect within a couple of days and it was a really strong effect then I would assume that this effect is completely unrelated to the disavow file. …it sounds like you still haven’t figured out what might be causing this.”

John Mueller: Negative SEO and Link Disavow Companies are Making Stuff Up

Context is important to understand what was said.

So here’s the context for John Mueller’s remark.

An SEO responded to Ryan’s tweet about being shocked at how many SEOs regularly disavow links.

The person responding to Ryan tweeted that disavowing links was still important, that agencies provide negative SEO services to take down websites and that link disavow is a way to combat the negative links.

The SEO (SEOGuruJaipur) tweeted:

“Google still gives penalties for backlinks (for example, 14 Dec update), so disavowing links is still important.”

SEOGuruJaipur next began tweeting about negative SEO companies.

Negative SEO companies are those that will build spammy links to a client’s competitor in order to make the competitor’s rankings drop.

SEOGuruJaipur tweeted:

“There are so many agencies that provide services to down competitors; they create backlinks for competitors such as comments, bookmarking, directory, and article submission on low quality sites.”

SEOGuruJaipur continued discussing negative SEO link builders, saying that only high trust sites are immune to the negative SEO links.

He tweeted:

“Agencies know what kind of links hurt the website because they have been doing this for a long time.

It’s only hard to down for very trusted sites. Even some agencies provide a money back guarantee as well.

They will provide you examples as well with proper insights.”

John Mueller tweeted his response to the above tweets:

“That’s all made up & irrelevant.

These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.”

Then someone else joined the discussion, and Mueller tweeted a response:

“Don’t waste your time on it; do things that build up your site instead.”

Unambiguous Statement on Negative SEO and Link Disavow Services

A statement by John Mueller (or anyone) can appear to conflict with prior statements when taken out of context.

That’s why I not only placed his statements into their original context but also the history going back eleven years that is a part of that discussion.

It’s clear that John Mueller feels that those selling negative SEO services and those providing disavow services outside of the intended use are “making stuff up” and “cashing in” on clients who might not “know better.”

Featured image by Shutterstock/Asier Romero






Source Code Leak Shows New Ranking Factors to Consider


January 25, 2023 was the day that Yandex, Russia’s search engine, was hacked.

Its complete source code was leaked online. It might not be the first time we’ve seen hacking happen in this industry, but it is one of the most intriguing, groundbreaking events in years.

But Yandex isn’t Google, so why should we care? Here’s why we do: these two search engines are very similar in how they process technical elements of a website, and this leak just showed us the 1,922 ranking factors Yandex uses in its algorithm. 

Simply put, this information is something that we can use to our advantage to get more traffic from Google.

Yandex vs Google

As I said, a lot of these ranking factors are possibly quite similar to the signals that Google uses for search.

Yandex’s algorithm shows a RankBrain analog: MatrixNet. It also seems that they are using PageRank (almost the same way as Google does), and a lot of their text algorithms are the same. Interestingly, there are also a lot of ex-Googlers working at Yandex.

So, reviewing these factors and understanding how they play into search rankings and traffic will provide some very useful insights into how search engines like Google work. No doubt, this new trove of information will greatly influence the SEO market in the months to come. 

That said, Yandex isn’t Google. The chances of Google having the exact same list of ranking factors are low, and Google may not even give a given signal the same amount of weight that Yandex does.

Still, it’s information that potentially will be useful for driving traffic, so make sure to take a look at them here (before it’s scrubbed from the internet forever).

An early analysis of ranking factors

Many of their ranking factors are as expected. These include:

  • Many link-related factors (e.g., age, relevancy, etc.).
  • Content relevance, age, and freshness.
  • Host reliability.
  • End-user behavior signals.

Some sites also get preference (such as Wikipedia). FI_VISITS_FROM_WIKI even shows that sites that are referenced by Wikipedia get plus points. 

These are all things that we already know.

But something interesting: there were several factors that I and other SEOs found unusual, such as PageRank being the 17th highest weighted factor in Yandex, and the 19th highest weighted factor being query-document relevance (in other words, how close they match thematically). There’s also karma for likely spam hosts, based on Whois information.

Other interesting factors are the average domain ranking across queries, percent of organic traffic, and the number of unique visitors.

You can also use this Yandex Search Ranking Factor Explorer, created by Rob Ousbey, to search through the various ranking factors.

The possible negative ranking factors:

Here’s my thoughts on Yandex’s factors that I found interesting: 

FI_ADV: -0.2509284637 — this factor means having tons of adverts scattered around your page and buying PPC can affect rankings. 

FI_DATER_AGE: -0.2074373667 — this one evaluates content age, and whether your article is more than 10 years old, or if there’s no determinable date. Date metadata is important. 

FI_COMM_LINKS_SEO_HOSTS: -0.1809636391 — this can be a negative factor if you have too much commercial anchor text, particularly if the proportion of such links goes above 50%. Pay attention to anchor text distribution. I’ve written a guide on how to effectively use anchor texts if you need some help on this. 

FI_RANK_ARTROZ — outdated, poorly written text will bring your rankings down. Go through your site and give your content a refresh. FI_WORD_COUNT also shows that the number of words matter, so avoid having low-content pages.

FI_URL_HAS_NO_DIGITS, FI_NUM_SLASHES, FI_FULL_URL_FRACTION — URLs shouldn’t have digits or too many slashes (too much hierarchy), and of course should contain your targeted keyword.

FI_NUM_LINKS_FROM_MP — always interlink your main pages (such as your homepage or landing pages) to any other important content you want to rank. Otherwise, it can hurt your content.

FI_HOPS — reduce the crawl depth for any pages that matter to you. No important pages should be more than a few clicks away from your homepage. I recommend keeping it to two clicks, at most. 

FI_IS_UNREACHABLE — likewise, avoid making any important page an orphan page. If it’s unreachable from your homepage, it’s as good as dead in the eyes of the search engine.
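Neither FI_HOPS nor FI_IS_UNREACHABLE requires a special tool to audit. Here is a minimal sketch (assuming the requests and beautifulsoup4 packages, plus a hypothetical domain and page list) that measures click depth from the homepage and flags important pages that are orphaned or buried:

```python
# Sketch: breadth-first crawl of your own site to measure click depth from the
# homepage and flag unreachable (orphan) pages. Domain and page list are hypothetical.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

HOMEPAGE = "https://www.example.com/"
IMPORTANT_PAGES = {
    "https://www.example.com/services/",
    "https://www.example.com/blog/post-a/",
}
MAX_DEPTH = 3

def crawl_depths(start_url, max_depth):
    """Return {url: click depth} for internal pages reachable within max_depth."""
    domain = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])
    while queue:
        url = queue.popleft()
        if depths[url] >= max_depth:
            continue
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)
    return depths

depths = crawl_depths(HOMEPAGE, MAX_DEPTH)
for page in IMPORTANT_PAGES:
    if page not in depths:
        print(f"Orphaned or deeper than {MAX_DEPTH} clicks: {page}")
    elif depths[page] > 2:
        print(f"{page} is {depths[page]} clicks deep - consider linking it from a main page")
```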

The possible positive ranking factors:

FI_IS_COM: +0.2762504972 — .com domains get a boost in rankings.

FI_YABAR_HOST_VISITORS — the more traffic you get, the more ranking power your site has. The strategy of targeting smaller, easier keywords first to build up an audience before targeting harder keywords can help you build traffic.

FI_BEAST_HOST_MEAN_POS — the average position of the host for keywords affects your overall ranking. This factor and the previous one clearly show that being smart with your keyword and content planning matters. If you need help with that, check out these 5 ways to build a solid SEO strategy.

FI_YABAR_HOST_SEARCH_TRAFFIC — this might look bad but shows that having other traffic sources (such as social media, direct search, and PPC) is good for your site. Yandex uses this to determine if a real site is being run, not just some spammy SEO project.

The leak also includes a whole host of CTR-related factors.

CTR ranking factors from Yandex

It’s clear that having searchable and interesting titles that drive users to check your content out is something that positively affects your rankings.

Google is rewarding sites that help end a user’s search journey (as we know from the latest mobile search updates and even the Helpful Content update). Do what you can to answer the query early on in your article. The factor “FI_VISITORS_RETURN_MONTH_SHARE” also shows that it helps to encourage users to return to your site for more information on the topics they’re interested in. Email marketing is a handy tool here.

FI_GOOD_RATIO and FI_MANY_BAD — the percentage of “good” and “bad” backlinks on your site. Getting your backlinks from high-quality websites with traffic is important for your rankings. The factor FI_LINK_AGE also shows that adding a link-building strategy to your SEO as early as possible can help with your rankings.

FI_SOCIAL_URL_IS_VERIFIED — that little blue check has actual benefits now. Links from verified accounts have more weight.

Key Takeaway

Because Yandex and Google are so similar in theory, this data leak is something we must pay attention to.

Several of these factors may already be common knowledge amongst SEOs, but having them confirmed by another search engine reinforces how important they are for your strategy.

These initial findings, and understanding what it might mean for your website, can help you identify what to improve, what to scrap, and what to focus on when it comes to your SEO strategy. 




Top 7 SEO Keyword Research Tools For Agencies


All successful SEO campaigns rely on accurate, comprehensive data. And that process starts with the right keyword research tools.

Sure, you can get away with collecting keyword data manually on your own. But while you may be saving the cost of a premium tool, manual keyword research costs you in other ways:

  • Efficiency. Doing keyword research manually is time intensive. How much is an hour of your time worth?
  • Comprehensiveness. Historical and comprehensive data isn’t easy to get on your own. It’s too easy to miss out on vital information that will make your SEO strategy a success.
  • Competition. Keyword research tools allow you to understand not only what users are searching for but also what your competition focuses on. You can quickly identify gaps and find the best path to profitability and success.
  • Knowledge. Long-time SEO experts can craft their own keyword strategies with a careful analysis of the SERPs, but that requires years of practice, trial, and costly errors. Not everyone has that experience. And not everyone has made enough mistakes to avoid the pitfalls.

A good SEO keyword research tool eliminates much of the guesswork. Here are seven well-known and time-tested tools for SEO that will get you well on the way to dominating your market.

1. Google Keyword Planner

Screenshot from Google Keyword Planner, January 2023

Cost: Free.

Google Keyword Planner is a classic favorite.

It’s free, but because the information comes directly from the search engine, it’s reliable and trustworthy. It’s also flexible, allowing you to:

  • Identify new keywords.
  • Find related keywords.
  • Estimate the number of searches for each variation.
  • Estimate competition levels.

The tool is easy to access and available as a web application and via API, and it costs nothing; it just requires a Google Ads account.

You must also be aware of a few things when using this tool.

First, these are estimates based on historical data. That means if trends change, it won’t necessarily be reflected here.

Google Keyword Planner also can’t tell you much about the SERP itself, such as what features you can capitalize on and how the feature converts.

Because it’s part of Google Ads, PPC experience can help you gain more insights. You’ll find trends at a broad demographic level or at a granular level, like a region or major city.

Google Keyword Planner also tends to combine data for similar keywords. So, if you want to know if [keyword near me] is better than [keywords near me], you’ll need a different tool.

Lastly, the tool uses broad definitions of words like “competition,” which doesn’t tell you who is ranking for the term, how much they’re investing to hold that ranking, or how likely you are to unseat them from their coveted top 10 rankings.

That being said, it’s an excellent tool if you just want to get a quick look or fresh ideas, if you’d like to use an API and create your own tools, or simply prefer to do the other tasks yourself.
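If you do go the API route, a minimal keyword-idea pull with the official google-ads Python client looks roughly like the sketch below. The customer ID and the language/geo constant IDs are placeholders, and exact field names can vary slightly by API version.

```python
# Sketch: pull keyword ideas (Keyword Planner data) via the Google Ads API.
# The customer ID and the language/geo constants below are placeholders.
from google.ads.googleads.client import GoogleAdsClient

client = GoogleAdsClient.load_from_storage("google-ads.yaml")
service = client.get_service("KeywordPlanIdeaService")

request = client.get_type("GenerateKeywordIdeasRequest")
request.customer_id = "1234567890"
request.language = "languageConstants/1000"                      # English
request.geo_target_constants.append("geoTargetConstants/2840")   # United States
request.keyword_seed.keywords.extend(["enterprise it solutions", "retail it software"])

for idea in service.generate_keyword_ideas(request=request):
    metrics = idea.keyword_idea_metrics
    print(idea.text, metrics.avg_monthly_searches, metrics.competition.name)
```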

2. Keyword.io

Cost: Free, $29 per month, and $49 per month.

If Google’s Keyword Planner isn’t quite enough, but you’re on a tight budget, Keyword.io may be the alternative you need. It also has different features.

Keyword.io uses autocomplete APIs to pull basic data for several sites and search engines, including Google, Amazon, eBay, Bing, Wikipedia, Alibaba, YouTube, Yandex, Fiverr, and Fotolia. This is perfect for niche clients and meeting specific needs.

It also has a Question/Intent Generator, an interactive topic explorer, and a topical overview tool.

In its user interface (UI), you’ll find an easy-to-use filter system and a chart that includes the competition, search volume, CPC, and a few other details about your chosen keywords.

It does have some limits, however.

You can run up to 20,000 keywords per seed with a limit of 100 requests per day (five per minute) or 1,000 requests per day (10 per minute) on its paid plans.

Its API access, related keywords tool, Google Ad data, and other features are also limited to paid accounts.

3. Semrush

Screenshot from Semrush

Cost: $119.95 to $449.95 per month.

In its digital marketing suite, Semrush offers a collection of six keyword tools and four competitive analysis tools with a database of more than 21 billion keywords.

You can get a full overview of the keywords you’re watching, including paid and organic search volume, intent, competition, CPC, historical data, SERP analysis, and more.

You’ll get related keywords and questions, as well as a ton of guidance, ideas, and suggestions from the Semrush Keyword Magic Tool, Position Tracking, and Organic Traffic Insights tools.

The Keyword Planner, however, is where much of the magic happens.

The organic competitor tab makes it easy to spot content and keyword gaps. Expand them and develop clusters that will help you grab traffic and conversions.

You can also dig into long-tail keyword data and see what Page 1 holds regarding competition, difficulty, and opportunities at a broad or hyperlocal level.

The full suite of tools is a huge benefit: seamless integration of your data means teams can easily collaborate, share insights, and strategize.

And when you’re done, it can track everything you need for a successful digital marketing strategy.

Some of the tools they offer include:

  • On-page SEO tools.
  • Competitive analysis suite.
  • Log file analysis.
  • Site auditing.
  • Content marketing tools.
  • Marketing analysis.
  • Paid advertising tools.
  • Local SEO tools.
  • Rank tracking.
  • Social media management.
  • Link-building tools.
  • Amazon marketing tools.
  • Website monetization tools.

Semrush’s best features when it comes to keyword research are its historical information and PPC metrics.

You can deep dive into campaigns and keywords to unlock the secrets of the SERPs and provide agency or in-house teams with priceless information they don’t usually access.

4. Moz Keyword Explorer

Screenshot from Moz, January 2023

Cost: Free for 10 queries per month. $99-$599 per month.

With a database of more than 500 million keywords, Moz Keyword Explorer may be a great option if you’re looking to build a strategy rather than get a quick view of the data for a few keywords.

Moz has long been a leader in the SEO space.

Constantly updating and improving its Keyword Explorer Tool and its other core services, Moz keeps up with the trends and is well known for providing SEO professionals with the latest tools. And it has done so for more than a decade.

Like the Google Keyword Tool, Moz’s keyword planning tool provides information on the difficulty and monthly search volume for terms. It also lets you drill down geographically.

When you start, you’ll find the Keyword Overview, which provides monthly search volumes, ranking difficulty, organic click-through opportunities, and an estimated priority level.

You can also:

  • Find new relevant keywords you should be targeting but aren’t.
  • Learn how your site performs for keywords.
  • Find areas where you can improve your SEO (including quick wins and larger investments).
  • Prioritize keywords for efficient strategy creation.
  • Analyze the top SERPs and their features.
  • Analyze competitors.
  • See organic click-through rates.

Unlike the Google Keyword Tool, however, Moz supplies you with data beyond the basics. Think of it like keyword research and SERP analysis.

Moz does tend to have fewer keyword suggestions. And like Google’s Keyword Planner, it provides range estimates for search data rather than a specific number.

However, the database is updated frequently, so you can feel confident that you’re keeping up with the constant change in consumer search habits and rankings.

Plus, it’s easy to use, so teams can quickly take care of marketing tasks like finding opportunities, tracking performance, identifying problem areas, and gathering page-level details.

Moz also offers several other tools to help you get your site on track and ahead of the competition, but we really like it for its keyword research and flexibility.

5. Ahrefs Keyword Explorer

Cost: $99-$999 per month.

If I had to describe Ahrefs in one word, it would be power.

Enter a word into the search box, and you’re presented with multiple panels that can tell you everything you want to know about that keyword.

Total search volume, clicks, difficulty, the SERP features, and even a volume-difficulty distribution. And while it may look like a lot, all the information is well-organized and clearly presented.

Ahrefs presents keywords in a parent-child topic format that gives them context, so you can easily learn more about each term, such as intent, while identifying overlap and keeping everything easy to find and understand.

These topics appear when you search for a related term, including the term’s ranking on the SERP, SERP result type, first-page ranking difficulty scores, and a snapshot of the user-delivered SERP. You can stay broad or narrow it all down by city or language.

Ahrefs can get a bit expensive. Agencies may find it difficult to scale if they prefer several user or client accounts, but it’s still one of the best and most reliable keyword research tools on the market.

What I really like about Ahrefs is that it’s thorough. It has one of the largest databases of all the tools available (19.2 billion keywords, 10 search engines, and 242 countries at the time of writing), and it’s regularly updated.

It makes international SEO strategies a breeze and includes data for everything from Google and Bing to YouTube and Amazon.

Plus, they clearly explain their metrics and database. And that level of transparency means trust.

Other tools in the suite include:

  • Site Explorer.
  • Site auditing.
  • Rank tracking.
  • Content Explorer.

6. SERanking

Screenshot from SERanking, November 2022

Cost: $23.52-$239 per month, depending on ranking check frequency and payment frequency.

SERanking shines as a keyword research tool within an all-around SEO toolkit. SERanking helps you keep costs down while offering features that allow agencies to meet clients’ unique needs.

One of the first things you’ll notice when you log in is its intuitive user interface. But this tool isn’t just another pretty online tool.

Its database is robust.

SERanking’s U.S. database includes 887 million keywords, 327 million U.S. domains, and 3 trillion indexed backlinks. And this doesn’t include its expansive European and Asian databases.

The overview page provides a solid look at the data, which includes search volume, the CPC, and a difficulty score.

SERanking also provides lists of related and low-volume keywords if you need inspiration or suggestions, as well as long-tail keyword suggestions with information about SERP features, competition levels, search volume, and other details you need to know to identify new opportunities.

Of course, identifying keywords is only the start of the mystery. How do you turn keywords into conversions? SERanking provides keyword tools that help you answer this question.

You can find out who the competition is in the organic results and see who is buying search ads, as well as details like estimated traffic levels and copies of the ads they’re using.

This allows you to see what’s working, gain insights into the users searching for those terms, and generate new ideas to try.

SERanking offers agency features you’ll find helpful, such as white labeling, report builders, and a lead generator.

However, one of the features agencies might find most helpful in keyword research is SERanking’s bulk keyword analysis, which lets you run thousands of keywords and download full reports for all the terms that matter.

Other tools in the SERanking Suite include:

  • Keyword Rank Tracker.
  • Keyword Grouper.
  • Keyword Suggestions and Search Volume Checker.
  • Index Status checker.
  • Backlink Checker.
  • Backlink monitoring.
  • Competitive research tool.
  • Website auditing tool.
  • On-page SEO Checker.
  • Page Changes Monitor.
  • Social media analytics.
  • Traffic analysis.

SERanking is more affordable than some of the other tools out there, but it does come at a cost.

It isn’t as robust as some of its competitors and doesn’t get as granular in the same way, but it still provides the features and data you need to create a successful SEO strategy.

And with its flexible pricing, this tool is well worth considering.

7. BrightEdge Data Cube

Cost: Custom pricing model.

If you’re looking for an AI-powered digital marketing tool suite that includes a quality research tool, BrightEdge may be the right option for you.

Unlike other tools that focus on supplying you with data and ways to analyze that data, BrightEdge looks to do much of the time-consuming analysis for you.

Among its search, content, social, local, and mobile solutions, you’ll find Data Cube – an AI-backed content and keyword tool that uses natural language processing to find related topics and keywords.

You’ll also encounter DataMind, an AI that helps you find search trends, changes in consumer behaviors, and important competitor movements you need to know about.

The two together make it quick and easy to perform keyword research, build out topics, create content strategies, and strengthen your SEO plans.

Once you enter a topic or broad keyword, the tool will provide you with relevant keywords, the search volume, competition levels, keyword value, its universal listing, and the number of words in the phrase.

Filter the results by a custom set of criteria to narrow the list down and get the necessary information.

Once you have a list, select the ones you want to keep and download them or use them with BrightEdge’s other tools to create full strategies and gain more insights.

This could include competitor analysis, analyzing SERP features, intent, or other tasks.

For agencies that provide local SEO, BrightEdge also offers HyperLocal, which helps you find and track keywords and keyword performance at the local level.

When you’re done, give the Opportunity Forecasting and tracking tools a try to monitor your progress and provide clients with the information they care about.

Perhaps the nicest feature for agencies is its Storybuilder – a reporting tool that allows you to create rich reports that give clients targeted overviews and the data they’re most interested in.

If this sounds like the right tool for you, the company gives demos, but there are a few things you should consider.

First, it only updates once per month. And while the company keeps its pricing close to the chest, this digital marketing tool suite is a significant investment. It may not be the best choice if keyword research is the only thing you need.

Secondly, while the tools are highly sophisticated and refined, there is a learning curve to get started.

You’ll also discover that there are limits on features like keyword tracking, and it can be time-consuming to set up, with some adjustments requiring technical support.

Lastly, BrightEdge’s keyword research tool doesn’t let you get too far into the weeds and doesn’t include PPC traffic.

That aside, agencies and larger brands will find that it scales easily, has a beautifully designed UI, and makes you look great to clients.

The Best Agency SEO Keyword Research Tools

This list only contains seven of the many tools available today to help you get your keyword research done to an expert degree.

But no matter how many of these tools we share with you, or which ones, it’s important to understand that none of them are flawless.

Each tool has its own unique strengths and weaknesses, so selecting a platform is very much dependent on the types of clients that you typically work with and personal preference.

In reality, you’ll likely find that you prefer to work between a few tools to accomplish everything you’d like.

Google Keyword Planner and Keyword.io are top choices when you want a quick look at the data, or you’d like to export the data to work on elsewhere. You may even want to use this data with the other tools mentioned in this article.

Ahrefs, Moz, Semrush, and BrightEdge are far more robust and are better suited to agency SEO tasks.

While not free (although all except BrightEdge offer free plans or a trial period), they allow you to really dig into the search space, ultimately resulting in higher traffic, more conversions, and stronger SEO strategies. These benefits require more time and often come with a learning curve.

By far, the most important keyword research tool you have access to is you.

Keyword research is more than simply choosing the keywords with the biggest search volume or the phrases with the lowest cost per click (CPC).

It’s your expertise, experience, knowledge, and insights that transform data into digital marketing you can be proud of.


Featured Image: Paulo Bobita/Search Engine Journal





