

How To Use Chrome UX Report To Improve Your Site Performance




Measuring success in website performance is a tricky task for small business owners.

It can be easy to apply the same approach that works when evaluating viability throughout a company.

Black and white, hard and fast numbers. Results.

The truth of website performance is more of a multilayered exploration of RUM (Real User Measurements) within the context of the web at large.

Looking at RUM such as performance, page load, and page views gives a detailed picture of hard data.

However, RUM data is only part of a bigger equation to measure a website’s impact and success.

The CrUX Of The Matter: The State Of The Web As Experienced By Real Users

To be truly informative and deliver actionable data, you must balance users’ experiences on your site within the scope of user experience across the internet.

Stepping back lets businesses understand their site’s performance and know where those metrics land within a vast array of parameters.

This is where the Chrome UX Report enters the fray.

Making Web Performance Data On A Broad Scale Accessible To The Masses

First established by Google in 2017, the Chrome UX Report is a publicly available dataset of real user measurements.

Also known as the CrUX report, it gathers website performance data for Chrome users from millions of websites.

If that data were only compiled, it would be an incredible – but difficult to utilize – resource. When paired with the right program, however, the data is transformed.

When put to use correctly, the CrUX Report transforms an immense collection of web performance data into a clear and accessible resource.

To better understand the data compiled in the CrUX Report and how best to utilize it, we need to step back.

It’s time to review Chrome’s Core Web Vitals.
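For teams that want to pull CrUX field data programmatically, Google exposes the dataset through the CrUX API's `records:queryRecord` endpoint. The sketch below builds such a query in Python; the origin and metric choices are illustrative, and an actual request requires your own Google Cloud API key.

```python
import json
from urllib import request

# Public CrUX API endpoint (a real request needs your own API key).
CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"


def build_crux_query(origin, metrics=None, form_factor=None):
    """Build the JSON body for a CrUX records:queryRecord request."""
    body = {"origin": origin}
    if metrics:
        body["metrics"] = metrics
    if form_factor:
        body["formFactor"] = form_factor
    return body


def query_crux(api_key, origin, **kwargs):
    """POST the query and return the parsed response (requires a real key)."""
    data = json.dumps(build_crux_query(origin, **kwargs)).encode("utf-8")
    req = request.Request(
        f"{CRUX_ENDPOINT}?key={api_key}",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


# Example body for fetching LCP and CLS field data for one origin:
payload = build_crux_query(
    "https://example.com",
    metrics=["largest_contentful_paint", "cumulative_layout_shift"],
    form_factor="PHONE",
)
```

The response contains histograms and percentile values per metric, which is the same field data that powers the CrUX dashboards.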

Page Experience And User Experience Are A Direct Result Of The Health Of A Website

Google is always seeking innovative and cutting-edge ways to provide users with a smooth, crisp online experience.

A significant cornerstone of that effort is Google’s work empowering website owners to maximize their sites.

When website owners deliver user-friendly, beneficial sites effectively, everyone wins.

The only way to achieve success in an endeavor, though, is to know what the rules and standards are.

In May 2020, Google released a new set of metrics to help evaluate website performance as it impacted user experience.

These were its Core Web Vitals.

The goal was to clear away minor and arbitrary details muddying up the water.

To do so, Google narrowed down a website’s user experience score to three core measurements:

  • LCP (Largest Contentful Paint).
  • FID (First Input Delay).
  • CLS (Cumulative Layout Shift).

Core Web Vitals are anchored on the crucial role of page experience in the more extensive user experience.

How does Google define page experience?

For its purposes, page experience measures how users perceive their experience interacting with an individual web page.

Building off of that, they define CWV as:

“A set of real-world, user-centered metrics that quantify key aspects of the user experience. They measure dimensions of web usability such as load time, interactivity, and the stability of content as it loads.”

When merged, the three elements of Core Web Vitals – LCP, FID, and CLS – deliver powerful insight.

As a whole, CWV results provide a precise picture of a user’s page experience on an individual website.

This page experience ultimately defines their user experience as a whole when utilizing the Google search engine.

A year after the initial announcement, Google finalized the integration, permanently including Core Web Vitals in its ranking algorithm.

Core Web Vitals zero in on website speed, responsiveness, and visual stability. However, it should be noted that they are part of a vast picture.

The Google search algorithm is always evolving, a challenge that demands vigilance on the part of professional SEO operators.

Google has reportedly used more than 200 ranking factors.

Nowadays, some are weighted far more heavily than others and are closely studied throughout the industry.

These factors continue to evolve as Google adapts to new data and user behavior.

The best practice is to review its ranking factors on a year-to-year basis.

Breaking Down The Google Search Algorithm

Despite the blanket of lore that has covered it over the years, the Google search algorithm has a definitive foundation.

It is anchored by the company’s commitment to ensuring a smooth and efficient search experience for all users.

Google continuously molds its search algorithm year-round to best meet the present and future demands of the global population.

The Google search algorithm focuses on a litany of factors that perpetually fluctuate.

In response, the SEO industry has consistently homed in on the key elements to focus on every year.

Knowing where to prioritize your focus when designing your website empowers you to achieve an optimal presence in the rankings.

These can include, but are not limited to:

  • Core Web Vitals.
  • High-Quality Content.
  • On-Page Optimization.
  • User Engagement.
  • Link Building/Domain Authority.
  • Effective Keyword Strategy (Meta Title Tags).
  • Mobile-Friendliness.
    • Mobile versions given priority.
  • Search Intent.
  • Internal Links.
  • Video Search (including Clip Markup and Seek Markup).
  • HTTPS Site Security.
  • Featured Snippets.

A Deep Dive Into Core Web Vitals

Understanding the roles of page and user experience in forming the metrics that comprise CWV is a great start.

For those seeking to maximize the Chrome UX Report for the benefit of their companies, however, a comprehensive understanding of each individual element of CWV is crucial.


Largest Contentful Paint measures the loading performance of a page.

The LCP metric quantifies this by evaluating the amount of time it takes to load the largest piece of content (video, image, text block, etc.) from the moment a user requests the URL.

Google recommends that websites keep LCP under 2.5 seconds for 75% of their page loads.


First Input Delay measures the interactivity of a page.

More precisely, it is the time between a user’s action and the moment the browser responds to that action.

It spans from the moment a link, button, or other actionable element is clicked to the precise moment the page responds, transforming the webpage from static to interactive.

Google recommends that websites keep their FID under 100 milliseconds for 75% of page loads.


Cumulative Layout Shift measures every unexpected layout shift that occurs during a page’s lifespan.

The score begins at zero (no shifting) and increases in proportion to the total amount of shifting.

Factors that cause shifting include the late appearance of buttons, images that force a text block to move, and dropdown banner ads.

Google recommends websites strive for a CLS score of 0.1 or less.
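To make the three recommendations concrete, here is a minimal Python sketch that labels a site's 75th-percentile field measurements against the "good" cut-offs Google recommends; the sample values at the bottom are hypothetical.

```python
# "Good" thresholds recommended by Google, as cited above, each applied
# at the 75th percentile of page loads.
GOOD_THRESHOLDS = {
    "lcp": 2.5,   # seconds
    "fid": 100,   # milliseconds
    "cls": 0.1,   # unitless score
}


def assess_vitals(p75_values):
    """Label each metric's 75th-percentile value as 'good' or 'needs work'."""
    return {
        metric: "good" if value <= GOOD_THRESHOLDS[metric] else "needs work"
        for metric, value in p75_values.items()
    }


# A hypothetical site whose LCP is slow while FID and CLS pass:
report = assess_vitals({"lcp": 3.1, "fid": 80, "cls": 0.05})
```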

Boiled down to a central doctrine, one could argue simply: Speed wins.

In the context of the user experience, a website’s performance is integral; it is judged and measured with precision.

This is the crux of the SI (Speed Index) metric: How quickly the content of a page is visually displayed.

The speed of delivery and the performance of the information/content delivered have an extremely short window to win over users.

SI scores reward pages that visually display their content in the shortest amount of time, resulting in a better user experience.

Perception And The Unique Relationship Of LCP And CLS

Anyone who has ever interacted with a web page knows the pain of waiting for a website to load fully.

For seasoned internet veterans, the truly difficult memories rest in the dreaded dial-up era.

Those who survived those dark times could hear the America Online (AOL) internet connection sounds in their sleep.

America is experiencing the gradual rollout of 5G mobile networks and 5G wireless internet.

As a result, lightning-fast speed impacts every aspect of modern life; speed is the expectation.

When a visitor experiences significant lag time waiting for your website to load, this has severe consequences.

It negatively impacts their perception of your brand and often causes them to leave entirely.

In Google’s efforts to define page performance’s inner workings, the content load process carried significant influence.

LCP may sound intimidating, but it’s fairly straightforward.

At its core, it asks: How quickly is the most meaningful content of a web page loaded?

This content can include various elements, including images, image tags, video thumbnails, background images with CSS, and text.

LCP highlights the importance of user perception.

When does a web page first begin to matter to a visitor? When the most visible, meaningful element appears.

The gap between when a user first clicks to initiate a page and when they first see core content can feel like an eternity.

Given the immediate and critical nature of this first interaction, Google recommends an individual page’s content loads in 2.5 seconds or less.

Concentrated effort to minimize LCP allows visitors to see and experience a website faster!

And this is great.

It’s foundational to a quality user experience and rightly earns its place as one of the Core Web Vitals. But what then?

If the website that loads for the visitor is a frustrating interaction, the promise of that pristine LCP is negated.

Here, the user experience event measured as CLS makes itself known.

Similar to LCP, user perception is central to the importance of CLS.

CLS quantifies the perceived visual stability of a page, measuring how much a page unexpectedly shifts throughout the lifetime of the page.

The focus here is on the totality and extent of unexpected shifting a visitor experiences.

Regardless of how long a visitor spends on a page, they should expect a smooth experience without repetitive, jarring adjustments.

Whether it’s a brief snippet of news or an in-depth analysis that requires significant scrolling, the results should be the same.

Equally important, both kinds of pages deserve to be judged fairly.

To ensure this, Google created a way to highlight key layout shifts, developing a unique system.

This system involved, among many factors, session windows, session gaps, and layout shift amounts.
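Based on Google's published CLS definition, those session windows work roughly like this: shifts close together in time join one window (a window lasts at most five seconds, with a one-second maximum gap between shifts), and the page's CLS is the largest window total. A Python sketch of that aggregation, with illustrative shift data:

```python
def cls_from_shifts(shifts, max_gap=1.0, max_window=5.0):
    """
    Aggregate individual layout-shift scores into a CLS value using
    session windows: shifts less than `max_gap` seconds apart join the
    same window, a window spans at most `max_window` seconds, and the
    reported CLS is the largest window total.

    `shifts` is a list of (timestamp_seconds, shift_score) pairs.
    """
    windows = []          # summed score of each session window
    window_sum = 0.0
    window_start = None
    last_time = None
    for t, score in sorted(shifts):
        new_window = (
            last_time is None
            or t - last_time >= max_gap
            or t - window_start > max_window
        )
        if new_window:
            if window_start is not None:
                windows.append(window_sum)
            window_start, window_sum = t, 0.0
        window_sum += score
        last_time = t
    if window_start is not None:
        windows.append(window_sum)
    return max(windows, default=0.0)


# Two bursts of shifting separated by a long pause: CLS reports the
# worse burst, not the lifetime total.
score = cls_from_shifts([(0.0, 0.05), (0.4, 0.06), (9.0, 0.02)])
```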

The result of Google’s ongoing efforts is a metric that shines a much-needed light on what is a massive part of a visitor’s experience.

Image Optimization Is Central To LCP And CLS Strategies

Optimization efforts for LCP and CLS highlight the necessity of understanding and implementing Image Optimization.

When you review techniques that help improve results for either metric, a consistent pattern emerges.

To deliver a website with quality Core Web Vitals, owners must be purposeful in their approach.

They need to intentionally focus on managing the specific details of image and video files and the timing of their integration.

Largest Contentful Paint

  • Image optimization, including size, compression, format, and attributes.
  • Streamlined utilization of CSS and JavaScript.
  • Optimal ratio of server-side and client-side rendering.
  • Top-shelf server response time.

Cumulative Layout Shift

  • Specify width + height attributes for images and videos.
  • Properly integrate content (i.e., ads and embeds) with effectively timed JavaScript.
  • Understand best practices to navigate web fonts.
  • Navigate page changes with expected layout shifts.

The Importance Of Image Optimization For LCP

Image optimization and compression are particularly important factors impacting LCP.

This is especially true for website owners unable to adjust their server response time.

Image optimization focuses on utilizing the most effective attributes for images.

This includes optimal size and resolution and capitalizing on cutting-edge file formats.

The latest and most effective image format is Google’s WebP, which delivers optimal lossless and lossy compression for web images.

WebP image compression plugins and manual pre-upload converters are often free and easy to use.

Width And Height Attributes Are Critical Factors In Preventing Cumulative Layout Shift

Few things are as aggravating as content shifting unexpectedly when a visitor attempts to read the information on a website.

It often seems that the browser is waiting for the perfect moment to strike, right when they’ve become engrossed.

Suddenly, a jarring shift occurs, and they are left to figure out where their spot shifted.

This experience will consistently and negatively impact a user’s experience, leading to increasingly difficult odds of retaining them.

Setting proper attributes is a simple but extremely effective step a website owner can take to prevent CLS.

By setting Width and Height attributes for images and videos, they preemptively restrict the browser to specific guidelines.

Without declared Width and Height dimensions, the browser has little instruction to ensure the visual content doesn’t wreak havoc when loaded.

With responsive images, the impact of setting max-width with CSS is amplified.

This measure applies to ad images and videos, highlighting the importance of reserving sufficient space for ads in CSS.

Taking the time to constrain ad content effectively ensures it doesn’t shift a page that has already loaded.
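The reason declared dimensions work is that modern browsers use the width and height attributes to compute an intrinsic aspect ratio, so they can reserve a correctly sized slot before the file arrives. A small sketch of that computation, using illustrative numbers:

```python
def reserved_height(attr_width, attr_height, rendered_width):
    """
    Given the HTML width/height attributes of an image and the width it
    actually renders at (e.g., constrained by max-width: 100%), return
    the height the browser can reserve before the file loads, using the
    intrinsic aspect ratio implied by the attributes.
    """
    aspect_ratio = attr_width / attr_height
    return rendered_width / aspect_ratio


# A 1200x800 image squeezed into a 600px-wide column still gets a
# correctly proportioned 400px slot reserved, so nothing shifts when
# the image finally arrives.
h = reserved_height(1200, 800, 600)
```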

Optimizing CWV Performance To Maximize The CrUX Report

Business leaders don’t just want their websites to perform at peak page proficiency. They want to ensure it will attract and effectively hold onto visitors.

Few things turn off a potential site visitor faster than a poor experience, and winning them back afterward is far harder.

Today’s internet users are increasingly mobile-centric and expectant of lightning-fast speeds.

To not only survive in that world but to thrive, leaders need to understand Core Web Vitals.

They need to grasp the interlocking elements of Core Web Vitals and how they impact performance.

Once armed with that knowledge and understanding, leaders are equipped to maximize the benefits of the CrUX Report.

It’s easy to see the benefits of letting the CrUX Report inform and guide best practices.





Source Code Leak Shows New Ranking Factors to Consider




On January 25, 2023, Yandex, Russia’s search engine, was hacked.

Its complete source code was leaked online. It might not be the first time we’ve seen hacking happen in this industry, but it is one of the most intriguing, groundbreaking events in years.

But Yandex isn’t Google, so why should we care? Because these two search engines are very similar in how they process the technical elements of a website, and this leak just showed us the 1,922 ranking factors Yandex uses in its algorithm.

Simply put, this information is something that we can use to our advantage to get more traffic from Google.

Yandex vs Google

As I said, a lot of these ranking factors are possibly quite similar to the signals that Google uses for search.

Yandex’s algorithm shows a RankBrain analog: MatrixNet. It also seems that they are using PageRank (almost the same way Google does), and a lot of their text algorithms are the same. Interestingly, there are also a lot of ex-Googlers working at Yandex.

So, reviewing these factors and understanding how they play into search rankings and traffic will provide some very useful insights into how search engines like Google work. No doubt, this new trove of information will greatly influence the SEO market in the months to come. 

That said, Yandex isn’t Google. The chances of Google having the exact same list of ranking factors are low, and Google may not even give a given signal the same amount of weight that Yandex does.

Still, it’s information that potentially will be useful for driving traffic, so make sure to take a look at them here (before it’s scrubbed from the internet forever).

An early analysis of ranking factors

Many of their ranking factors are as expected. These include:

  • Many link-related factors (e.g., age, relevancy, etc.).
  • Content relevance, age, and freshness.
  • Host reliability.
  • End-user behavior signals.

Some sites also get preference (such as Wikipedia). FI_VISITS_FROM_WIKI even shows that sites that are referenced by Wikipedia get plus points. 

These are all things that we already know.

But something interesting: there were several factors that I and other SEOs found unusual, such as PageRank being the 17th highest weighted factor in Yandex, and the 19th highest weighted factor being query-document relevance (in other words, how close they match thematically). There’s also karma for likely spam hosts, based on Whois information.

Other interesting factors are the average domain ranking across queries, percent of organic traffic, and the number of unique visitors.

You can also use this Yandex Search Ranking Factor Explorer, created by Rob Ousbey, to search through the various ranking factors.

The possible negative ranking factors:

Here’s my thoughts on Yandex’s factors that I found interesting: 

FI_ADV: -0.2509284637 — this factor means having tons of adverts scattered around your page and buying PPC can affect rankings. 

FI_DATER_AGE: -0.2074373667 — this one evaluates content age, and whether your article is more than 10 years old, or if there’s no determinable date. Date metadata is important. 

FI_COMM_LINKS_SEO_HOSTS: -0.1809636391 — this can be a negative factor if you have too much commercial anchor text, particularly if the proportion of such links goes above 50%. Pay attention to anchor text distribution. I’ve written a guide on how to effectively use anchor texts if you need some help on this. 

FI_RANK_ARTROZ — outdated, poorly written text will bring your rankings down. Go through your site and give your content a refresh. FI_WORD_COUNT also shows that the number of words matters, so avoid having low-content pages.

FI_URL_HAS_NO_DIGITS, FI_NUM_SLASHES, FI_FULL_URL_FRACTION — URLs shouldn’t contain digits or too many slashes (too much hierarchy), and they should, of course, contain your targeted keyword.
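As a rough illustration of how you might audit URLs against those three factors, here is a hypothetical Python checker; the slash threshold and function name are my own assumptions, not values from the leak.

```python
import re
from urllib.parse import urlparse


def url_flags(url, keyword, max_slashes=3):
    """
    Flag a URL path against the three Yandex URL factors discussed
    above: digits in the path, excessive hierarchy depth, and presence
    of the target keyword. The slash threshold is illustrative only.
    """
    path = urlparse(url).path
    return {
        "has_digits": bool(re.search(r"\d", path)),
        "too_deep": path.count("/") > max_slashes,
        "has_keyword": keyword.lower().replace(" ", "-") in path.lower(),
    }


# A dated blog URL trips the digit and depth flags even though it
# contains the keyword:
flags = url_flags("https://example.com/blog/2021/05/seo-tips", "seo tips")
```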

FI_NUM_LINKS_FROM_MP — always interlink your main pages (such as your homepage or landing pages) to any other important content you want to rank. Otherwise, it can hurt your content.

FI_HOPS — reduce the crawl depth for any pages that matter to you. No important pages should be more than a few clicks away from your homepage. I recommend keeping it to two clicks, at most. 

FI_IS_UNREACHABLE — likewise, avoid making any important page an orphan page. If it’s unreachable from your homepage, it’s as good as dead in the eyes of the search engine.
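Crawl depth and orphan pages can both be checked with a breadth-first search over your internal-link graph. A minimal sketch, using a made-up site structure:

```python
from collections import deque


def crawl_depths(links, home="/"):
    """
    Compute click depth from the homepage over an internal-link graph
    (a dict mapping each page to the pages it links to) using a
    breadth-first search. Pages absent from the result are orphans:
    unreachable from the homepage.
    """
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths


site = {
    "/": ["/services", "/blog"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/blog/post-2"],
    # "/old-page" is linked from nowhere: an orphan.
}
depths = crawl_depths(site)
```

Any page deeper than two or three clicks, or missing from `depths` entirely, is a candidate for better internal linking.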

The possible positive ranking factors:

FI_IS_COM: +0.2762504972 — .com domains get a boost in rankings.

FI_YABAR_HOST_VISITORS — the more traffic you get, the more ranking power your site has. The strategy of targeting smaller, easier keywords first to build up an audience before targeting harder keywords can help you build traffic.

FI_BEAST_HOST_MEAN_POS — the average position of the host for keywords affects your overall ranking. This factor and the previous one clearly show that being smart with your keyword and content planning matters. If you need help with that, check out these 5 ways to build a solid SEO strategy.

FI_YABAR_HOST_SEARCH_TRAFFIC — this might look bad but shows that having other traffic sources (such as social media, direct search, and PPC) is good for your site. Yandex uses this to determine if a real site is being run, not just some spammy SEO project.

This one includes a whole host of CTR-related factors. 

CTR ranking factors from Yandex

It’s clear that having searchable and interesting titles that drive users to check your content out is something that positively affects your rankings.

Google is rewarding sites that help end a user’s search journey (as we know from the latest mobile search updates and even the Helpful Content update). Do what you can to answer the query early on in your article. The factor “FI_VISITORS_RETURN_MONTH_SHARE“ also shows that it helps to encourage users to return to your site for more information on the topics they’re interested in. Email marketing is a handy tool here.

FI_GOOD_RATIO and FI_MANY_BAD — the percentage of “good” and “bad” backlinks on your site. Getting your backlinks from high-quality websites with traffic is important for your rankings. The factor FI_LINK_AGE also shows that adding a link-building strategy to your SEO as early as possible can help with your rankings.

FI_SOCIAL_URL_IS_VERIFIED — that little blue check has actual benefits now. Links from verified accounts have more weight.

Key Takeaway

Because Yandex and Google are so similar in theory, this data leak is something we must pay attention to.

Several of these factors may already be common knowledge amongst SEOs, but having them confirmed by another search engine enforces how important they are for your strategy.

These initial findings, and understanding what it might mean for your website, can help you identify what to improve, what to scrap, and what to focus on when it comes to your SEO strategy. 




Top 7 SEO Keyword Research Tools For Agencies




All successful SEO campaigns rely on accurate, comprehensive data. And that process starts with the right keyword research tools.

Sure, you can get away with collecting keyword data manually on your own. But while you may be saving the cost of a premium tool, manual keyword research costs you in other ways:

  • Efficiency. Doing keyword research manually is time intensive. How much is an hour of your time worth?
  • Comprehensiveness. Historical and comprehensive data isn’t easy to get on your own. It’s too easy to miss out on vital information that will make your SEO strategy a success.
  • Competition. Keyword research tools allow you to understand not only what users are searching for but also what your competition focuses on. You can quickly identify gaps and find the best path to profitability and success.
  • Knowledge. Long-time SEO experts can craft their own keyword strategies with a careful analysis of the SERPs, but that requires years of practice, trial, and costly errors. Not everyone has that experience. And not everyone has made enough mistakes to avoid the pitfalls.

A good SEO keyword research tool eliminates much of the guesswork. Here are seven well-known and time-tested tools for SEO that will get you well on the way to dominating your market.

1. Google Keyword Planner

Screenshot from Google Keyword Planner, January 2023

Cost: Free.

Google Keyword Planner is a classic favorite.

It’s free, but because the information comes directly from the search engine, it’s reliable and trustworthy. It’s also flexible, allowing you to:

  • Identify new keywords.
  • Find related keywords.
  • Estimate the number of searches for each variation.
  • Estimate competition levels.

The tool is easy to access and available as a web application and via API, and it costs nothing; it just requires a Google Ads account.

You must also be aware of a few things when using this tool.

First, these are estimates based on historical data. That means if trends change, it won’t necessarily be reflected here.

Google Keyword Planner also can’t tell you much about the SERP itself, such as what features you can capitalize on and how the feature converts.

Because it’s part of Google Ads, PPC experience can help you gain more insights. You’ll find trends broadly across a demographic or at a granular level, like a city or region.

Google Keyword Planner also tends to combine data for similar keywords. So, if you want to know if [keyword near me] is better than [keywords near me], you’ll need a different tool.

Lastly, the tool uses broad definitions of words like “competition,” which doesn’t tell you who is ranking for the term, how much they’re investing to hold that ranking, or how likely you are to unseat them from their coveted top 10 rankings.

That being said, it’s an excellent tool if you just want to get a quick look or fresh ideas, if you’d like to use an API and create your own tools, or simply prefer to do the other tasks yourself.


Cost: Free, $29 per month, and $49 per month.

If Google’s Keyword Planner isn’t quite enough, but you’re on a tight budget, this tool may be the alternative you need. It also has different features: it uses autocomplete APIs to pull basic data for several sites and search engines, including Google, Amazon, eBay, Bing, Wikipedia, Alibaba, YouTube, Yandex, Fiverr, and Fotolia. This is perfect for niche clients and meeting specific needs.

It also has a Question/Intent Generator, an interactive topic explorer, and a topical overview tool.

In its user interface (UI), you’ll find an easy-to-use filter system and a chart that includes the competition, search volume, CPC, and a few other details about your chosen keywords.

It does have some limits, however.

You can run up to 20,000 keywords per seed with a limit of 100 requests per day (five per minute) or 1,000 requests per day (10 per minute) on its paid plans.

Its API access, related keywords tool, Google Ad data, and other features are also limited to paid accounts.

3. Semrush

Screenshot from Semrush’s keyword tool

Cost: $119.95 to $449.95 per month.

In its digital marketing suite, Semrush offers a collection of six keyword tools and four competitive analysis tools with a database of more than 21 billion keywords.

You can get a full overview of the keywords you’re watching, including paid and organic search volume, intent, competition, CPC, historical data, SERP analysis, and more.

You’ll get related keywords and questions, as well as a ton of guidance, ideas, and suggestions from the Semrush Magic, Position Tracking, and Organic Traffic Insights tools.

The Keyword Planner, however, is where much of the magic happens.

The organic competitor tab makes it easy to spot content and keyword gaps. Expand them and develop clusters that will help you grab traffic and conversions.

You can also see long-tail keyword data and other data to see what Page 1 holds regarding competition, difficulty, and opportunities at a broad or hyperlocal level.

The full suite of tools is a huge benefit. Seamless integration of your data means teams can easily collaborate, share insights, and strategize.

And when you’re done, it can track everything you need for a successful digital marketing strategy.

Some of the tools they offer include:

  • On-page SEO tools.
  • Competitive analysis suite.
  • Log file analysis.
  • Site auditing.
  • Content marketing tools.
  • Marketing analysis.
  • Paid advertising tools.
  • Local SEO tools.
  • Rank tracking.
  • Social media management.
  • Link-building tools.
  • Amazon marketing tools.
  • Website monetization tools.

Semrush’s best features when it comes to keyword research are its historical information and PPC metrics.

You can deep dive into campaigns and keywords to unlock the secrets of the SERPs and provide agency or in-house teams with priceless information they don’t usually access.

4. Moz Keyword Explorer

Screenshot from Moz Keyword Explorer, January 2023

Cost: Free for 10 queries per month. $99-$599 per month.

With a database of more than 500 million keywords, Moz Keyword Explorer may be a great option if you’re looking to build a strategy rather than get a quick view of the data for a few keywords.

Moz has long been a leader in the SEO space.

Constantly updating and improving its Keyword Explorer Tool and its other core services, Moz keeps up with the trends and is well known for providing SEO professionals with the latest tools. And it has done so for more than a decade.

Like the Google Keyword Tool, Moz’s keyword planning tool provides information on the difficulty and monthly search volume for terms. It also lets you drill down geographically.

When you start, you’ll find the Keyword Overview, which provides monthly search volumes, ranking difficulty, organic click-through opportunities, and an estimated priority level.

You can also:

  • Find new relevant keywords you should be targeting but aren’t.
  • Learn how your site performs for keywords.
  • Find areas where you can improve your SEO (including quick wins and larger investments).
  • Prioritize keywords for efficient strategy creation.
  • Analyze top SERPs and their features.
  • Analyze the competition.
  • See organic click-through rates.

Unlike the Google Keyword Tool, however, Moz supplies you with data beyond the basics. Think of it like keyword research and SERP analysis.

Moz does tend to have fewer keyword suggestions. And like Google’s Keyword Planner, it provides range estimates for search data rather than a specific number.

However, the database is updated frequently, so you can feel confident that you’re keeping up with the constant change in consumer search habits and rankings.

Plus, it’s easy to use, so teams can quickly take care of marketing tasks like finding opportunities, tracking performance, identifying problem areas, and gathering page-level details.

Moz also offers several other tools to help you get your site on track and ahead of the competition, but we really like it for its keyword research and flexibility.

5. Ahrefs Keyword Explorer

Cost: $99-$999 per month.

If I had to describe Ahrefs in one word, it would be power.

Enter a word into the search box, and you’re presented with multiple panels that can tell you everything you want to know about that keyword.

Total search volume, clicks, difficulty, the SERP features, and even a volume-difficulty distribution. And while it may look like a lot, all the information is well-organized and clearly presented.

Ahrefs provides terms in a parent-child topic format, providing the terms with context, so you can easily learn more about the terms, such as intent, while identifying overlap and keeping it all easy to find and understand.

These topics appear when you search for a related term, including the term’s ranking on the SERP, SERP result type, first-page ranking difficulty scores, and a snapshot of the user-delivered SERP. You can stay broad or narrow it all down by city or language.

Ahrefs can get a bit expensive. Agencies may find it difficult to scale if they prefer several user or client accounts, but it’s still one of the best and most reliable keyword research tools on the market.

What I really like about Ahrefs is that it’s thorough. It has one of the largest databases of all the tools available (19.2 billion keywords, 10 search engines, and 242 countries at the time of writing), and it’s regularly updated.

It makes international SEO strategies a breeze and includes data for everything from Google and Bing to YouTube and Amazon.

Plus, they clearly explain their metrics and database. And that level of transparency means trust.

Other tools in the suite include:

  • Site Explorer.
  • Site auditing.
  • Rank tracking.
  • Content Explorer.

6. SERanking

Screenshot from SERanking’s keyword research tool, November 2022

Cost: $23.52-$239 per month, depending on rank-check frequency and payment plan.

SERanking shines as a keyword research tool within an all-around SEO toolkit. SERanking helps you keep costs down while offering features that allow agencies to meet clients’ unique needs.

One of the first things you’ll notice when you log in is its intuitive user interface. But this tool isn’t just another pretty online tool.

Its database is robust.

SERanking’s U.S. database includes 887 million keywords, 327 million U.S. domains, and 3 trillion indexed backlinks. And this doesn’t include its expansive European and Asian databases.

The overview page provides a solid look at the data, which includes search volume, the CPC, and a difficulty score.

SERanking also provides lists of related and low-volume keywords if you need inspiration or suggestions. It offers long-tail keyword suggestions, too, with information about SERP features, competition levels, search volume, and the other details you need to identify new opportunities.

Of course, identifying keywords is only the start. How do you turn keywords into conversions? SERanking provides keyword tools that help you answer this question.

You can find out who the competition is in the organic results and see who is buying search ads, as well as details like estimated traffic levels and copies of the ads they’re using.

This allows you to see what’s working, gain insights into the users searching for those terms, and generate new ideas to try.

SERanking offers agency features, such as white labeling, report builders, and a lead generator, along with other features you’ll find helpful.

However, one of the features agencies might find most helpful in keyword research is SERanking’s bulk keyword analysis, which lets you run thousands of keywords and download full reports for all the terms that matter.
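Bulk runs like this generally start from a plain keyword list. As a generic illustration (not SERanking’s actual API, and the 500-term batch size is an arbitrary assumption), splitting thousands of terms into upload-sized chunks might look like:

```python
def batches(items, size=500):
    """Yield fixed-size chunks of a keyword list for bulk upload."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Stand-in keyword list; in practice this would come from a client export.
keywords = [f"keyword {n}" for n in range(1, 1201)]

chunk_sizes = [len(chunk) for chunk in batches(keywords)]
print(chunk_sizes)  # 1,200 terms split into chunks of up to 500
```

Each chunk can then be submitted to whatever bulk-analysis importer your tool provides, and the per-chunk reports merged afterward.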

Other tools in the SERanking Suite include:

  • Keyword Rank Tracker.
  • Keyword Grouper.
  • Keyword Suggestions and Search Volume Checker.
  • Index Status checker.
  • Backlink Checker.
  • Backlink monitoring.
  • Competitive research tool.
  • Website auditing tool.
  • On-page SEO Checker.
  • Page Changes Monitor.
  • Social media analytics.
  • Traffic analysis.

SERanking is more affordable than many of the other tools out there, but the lower price comes with trade-offs.

It isn’t as robust as some of its competitors and doesn’t get as granular, but it still provides the features and data you need to create a successful SEO strategy.

And with its flexible pricing, this tool is well worth considering.

7. BrightEdge Data Cube

Cost: Custom pricing model.

If you’re looking for an AI-powered digital marketing tool suite that includes a quality research tool, BrightEdge may be the right option for you.

Unlike other tools that focus on supplying you with data and ways to analyze that data, BrightEdge looks to do much of the time-consuming analysis for you.

Among its search, content, social, local, and mobile solutions, you’ll find Data Cube – an AI-backed content and keyword tool that uses natural language processing to find related topics and keywords.

You’ll also encounter DataMind, an AI that helps you find search trends, changes in consumer behaviors, and important competitor movements you need to know about.

The two together make it quick and easy to perform keyword research, build out topics, create content strategies, and strengthen your SEO plans.

Once you enter a topic or broad keyword, the tool will provide you with relevant keywords, the search volume, competition levels, keyword value, its universal listing, and the number of words in the phrase.

Filter the results by a custom set of criteria to narrow the list down and get the necessary information.

Once you have a list, select the ones you want to keep and download them or use them with BrightEdge’s other tools to create full strategies and gain more insights.

This could include competitor analysis, analyzing SERP features, intent, or other tasks.
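The filter-then-shortlist step described above is generic enough to sketch outside any particular tool. Assuming an exported keyword list with volume and difficulty columns (the field names and thresholds here are illustrative, not Data Cube’s actual schema), it amounts to:

```python
# Hypothetical exported keyword rows; field names are illustrative only.
keywords = [
    {"phrase": "seo tools", "volume": 40500, "difficulty": 78},
    {"phrase": "agency seo checklist", "volume": 880, "difficulty": 31},
    {"phrase": "local seo audit template", "volume": 590, "difficulty": 24},
]

def shortlist(rows, min_volume=500, max_difficulty=40):
    """Keep keywords with enough demand but manageable competition."""
    return [
        r for r in rows
        if r["volume"] >= min_volume and r["difficulty"] <= max_difficulty
    ]

for row in shortlist(keywords):
    print(row["phrase"])
```

Here the high-volume but highly competitive head term is filtered out, leaving the lower-difficulty phrases you might actually target first.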

For agencies that provide local SEO, BrightEdge also offers HyperLocal, which helps you find and track keywords and keyword performance at the local level.

When you’re done, give the Opportunity Forecasting and tracking tools a try to monitor your progress and provide clients with the information they care about.

Perhaps the nicest feature for agencies is its Storybuilder – a reporting tool that allows you to create rich client reports that provide clients with targeted overviews and the data they’re most interested in.

If this sounds like the right tool for you, the company gives demos, but there are a few things you should consider.

First, it only updates once per month. And while the company keeps its pricing close to the chest, this digital marketing tool suite is a significant investment. It may not be the best choice if keyword research is the only thing you need.

Secondly, while the tools are highly sophisticated and refined, there is a learning curve to get started.

You’ll also discover that there are limits on features like keyword tracking, and it can be time-consuming to set up, with some adjustments requiring technical support.

Lastly, BrightEdge’s keyword research tool doesn’t let you get too far into the weeds and doesn’t include PPC traffic.

That aside, agencies and larger brands will find that it scales easily, has a beautifully designed UI, and makes you look great to clients.

The Best Agency SEO Keyword Research Tools

This list contains only seven of the many tools available today to help you do expert-level keyword research.

But no matter how many tools we share, or which ones, it’s important to understand that none is flawless.

Each tool has its own unique strengths and weaknesses, so selecting a platform is very much dependent on the types of clients that you typically work with and personal preference.

In reality, you’ll likely find that you prefer to work between a few tools to accomplish everything you’d like.

Google Keyword Planner is a top choice when you want a quick look at the data, or you’d like to export the data to work on elsewhere. You may even want to use this data with the other tools mentioned in this chapter.

Ahrefs, Moz, Semrush, and BrightEdge are far more robust and are better suited to agency SEO tasks.

While not free (all except BrightEdge offer a free plan or trial period), they allow you to really dig into the search space, ultimately resulting in higher traffic, more conversions, and stronger SEO strategies. These benefits require more time and often come with a learning curve.

By far, the most important keyword research tool you have access to is you.

Keyword research is more than simply choosing the keywords with the biggest search volume or the phrase with the lowest Cost Per Click (CPC).

It’s your expertise, experience, knowledge, and insights that transform data into digital marketing you can be proud of.

Featured Image: Paulo Bobita/Search Engine Journal



Yelp Details Removal Of Paid Review Groups & Lead Generators



Yelp published its 2022 Trust and Safety Report detailing actions it took against lead generators, fake review groups and businesses incentivizing reviews.

Yelp Cracks Down on Paid Review Groups

The report details Yelp’s proactive approach to chasing down online review groups and breaking them up.

Among the tactics Yelp used are identifying IP addresses used for fake positive reviews and connecting users to groups that arrange paid reviews.

Yelp’s Trust and Safety report revealed that it fights online review exchanges by identifying the groups on social media and working together with Twitter, Facebook, Instagram and LinkedIn to break them up.

In a 2021 blog post about their recommendation software, Yelp wrote that they monitor online groups and even conduct sting operations to catch the fake review rings.

Yelp’s newly released Trust and Safety report explains:

“Yelp strictly prohibits offering incentives or other compensation in exchange for writing, changing or removing a review.

To combat this on and off our platform, our User Operations team did the following in 2022:

– Issued 415+ warnings to businesses for engaging in compensated or incentivized review behaviors.

– As part of our broader Consumer Alerts program, we placed 88 Compensated Activity Alerts on business pages after receiving evidence someone offered cash or other incentives in exchange for posting, updating or removing a review.

We also placed 405 Suspicious Review Activity Alerts after our systems detected a large number of positive reviews coming from a single IP address, or reviews from users who may be connected to a group that coordinates incentivized reviews.

– Made 1,100+ reports to third-party sites, such as Twitter (150 reports were made by Yelp), Facebook (130 reports), Instagram (110 reports) and LinkedIn (70 reports), to warn them of content from more than 900 suspicious groups, posts or individuals we found on their sites participating in online review exchanges.

Third-party platforms took action on content at issue in approximately 77% of our reports.”
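Yelp doesn’t publish its detection code, but the single-IP signal described in the report can be sketched generically: group positive reviews by source IP and flag any business that receives an unusual number from one address. A minimal illustration, where the threshold, field names, and toy data are all assumptions:

```python
from collections import Counter

# Toy review records; in practice these would come from server logs.
reviews = [
    {"business": "Acme Movers", "ip": "203.0.113.7", "stars": 5},
    {"business": "Acme Movers", "ip": "203.0.113.7", "stars": 5},
    {"business": "Acme Movers", "ip": "203.0.113.7", "stars": 5},
    {"business": "Acme Movers", "ip": "198.51.100.2", "stars": 4},
    {"business": "Best Diner", "ip": "192.0.2.1", "stars": 5},
]

def flag_suspicious(rows, threshold=3):
    """Flag businesses with >= threshold positive reviews from one IP."""
    counts = Counter((r["business"], r["ip"]) for r in rows if r["stars"] >= 4)
    return sorted({biz for (biz, ip), n in counts.items() if n >= threshold})

print(flag_suspicious(reviews))  # → ['Acme Movers']
```

A real system would layer many more signals on top (account age, review text, cross-platform group membership), but the grouping-and-threshold pattern is the core of this particular heuristic.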

Yelp Closes Thousands of Fraudulent Accounts

According to the report, Yelp closed over 77,000 user accounts for terms of service violations and suspected deceptive and abusive actions.

They also rejected over 32,800 potential new business pages for being associated with spammy activities that violated Yelp’s policies.

An interesting revelation is how they came down hard on lead generation businesses whose business model is to create fake business listings and then sell leads to local businesses.

Yelp writes:

“Nearly 2,000 business pages removed for being associated with lead generators, violating Yelp’s policies.

‘Lead generators’ create fake business pages then take the customer leads generated and auction them to other contractors.

This behavior tricks people into paying exorbitant costs for services, targeting vulnerable consumers who are often allowing service providers into their home (locksmiths, movers, home cleaning, etc.).”

Yelp User Operations Team Content Removals

The report notes that 2% of all Yelp contributions were removed by its User Operations team in 2022. That includes reviews, photos, review up-votes, and other forms of contributions.

Yelp Trust and Safety

The integrity of reviews is important to a recommender ecosystem like Yelp. Yelp uses recommendation software as its first line of defense against deceptive behavior. The software is dynamic, continually learning about users and businesses as they engage with the platform.

Yelp writes:

“The recommendation software is entirely automated and applies the same objective rules to every business. The reviews that are recommended for any business can change over time as Yelp’s software learns more about the reviewers and the business.”

Yelp also employs human moderators on its User Operations team to follow up and manually review reports generated by users, businesses, or its automated systems.

Read Yelp’s Trust and Safety Report for 2022

Featured image by Shutterstock/
