A Complete Guide To the Google Penguin Algorithm Update
Ten years have passed since Google introduced the Penguin algorithm and took a stronger stance on manipulative link-building practices.

The algorithm has been updated a number of times and has become a real-time part of the core Google algorithm. As a result, penalties have become less common, but they still exist in both partial and site-wide formats.


For the most part, Google claims to ignore a lot of poor-quality links online, but it remains alert to unnatural patterns such as link schemes, PBNs, link exchanges, and unnatural outbound linking.

The Introduction Of Penguin

In 2012, Google officially launched the “webspam algorithm update,” which specifically targeted link spam and manipulative link-building practices.

The webspam algorithm later became known (officially) as the Penguin algorithm update via a tweet from Matt Cutts, who was then head of the Google webspam team.

While Google officially named the algorithm Penguin, there is no official word on where this name came from.

The Panda algorithm name came from one of the key engineers involved with it, and it’s more than likely that Penguin originated from a similar source.

One of my favorite Penguin naming theories is that it pays homage to The Penguin, from DC’s Batman.

Prior to the Penguin algorithm, link volume played a larger part in determining a webpage’s scoring when crawled, indexed, and analyzed by Google.

This meant when it came to ranking websites by these scores for search results pages, some low-quality websites and pieces of content appeared in more prominent positions in the organic search results than they should have.

Why Google Penguin Was Needed

Google’s war on low-quality content started with the Panda algorithm, and Penguin was an extension and addition to the arsenal for fighting this war.

Penguin was Google’s response to the increasing practice of manipulating search results (and rankings) through black hat link building techniques.

Cutts, speaking at the SMX Advanced 2012 conference, said:

We look at it as something designed to tackle low-quality content. It started out with Panda, and then we noticed that there was still a lot of spam, and Penguin was designed to tackle that.

The algorithm’s objective was to gain greater control over, and reduce the effectiveness of, a number of black hat spamming techniques.

By better understanding and processing the types of links websites and webmasters were earning, Penguin worked toward ensuring that natural, authoritative, and relevant links rewarded the websites they pointed to, while manipulative and spammy links were downgraded.

Penguin only deals with a site’s incoming links. Google only looks at the links pointing to the site in question and does not look at the outgoing links at all from that site.

Initial Launch & Impact

When Penguin first launched in April 2012, it affected more than 3% of search results, according to Google’s own estimations.

Penguin 2.0, the fourth update (including the initial launch) to the algorithm, was released in May 2013 and affected roughly 2.3% of all queries.

At launch, Penguin targeted two manipulative practices in particular: link schemes and keyword stuffing.

“Link schemes” is the umbrella term for manipulative link building practices, such as link exchanges, paying for links, and other unnatural link practices outlined in Google’s link scheme documentation.

Penguin’s initial launch also took aim at keyword stuffing, which has since become associated with the Panda algorithm (which is thought of as more of a content and site quality algorithm).

Key Google Penguin Updates & Refreshes

There have been a number of updates and refreshes to the Penguin algorithm since it was launched in 2012, and possibly a number of other tweaks that have gone down in history as unknown algorithm updates.

Google Penguin 1.1: May 25, 2012

This wasn’t a change to the algorithm itself, but the first refresh of the data within it.

In this instance, websites that had initially been affected by the launch and had been proactive in clearing up their link profiles saw some recovery, while others who hadn’t been caught by Penguin the first time round saw an impact.

Google Penguin 1.2: October 5, 2012

This was another data refresh. It affected queries in the English language, as well as international queries.

Google Penguin 2.0: May 22, 2013

This was a more technically advanced version of the Penguin algorithm and changed how the algorithm impacted search results.

Penguin 2.0 impacted around 2.3% of English queries, as well as other languages proportionately.

This was also the first Penguin update to look deeper than the website’s homepage and top-level category pages for evidence of link spam being directed at the website.

Google Penguin 2.1: October 4, 2013

The only refresh to Penguin 2.0 (2.1) came on October 4 of the same year. It affected about 1% of queries.

While there was no official explanation from Google, data suggests that the 2.1 refresh also advanced how deep Penguin looked into a website, crawling deeper and conducting further analysis of whether spammy links were present.

Google Penguin 3.0: October 17, 2014

While this was named as a major update, it was, in fact, another data refresh, allowing those impacted by previous updates to emerge and recover, while many others who had continued to use spammy link practices and had escaped notice until then saw an impact.

Googler Pierre Far confirmed this through a post on his Google+ profile, noting that the update would take a “few weeks” to roll out fully.

Far also stated that this update affected less than 1% of English search queries.

Google Penguin 4.0: September 23, 2016

Almost two years after the 3.0 refresh, the final Penguin algorithm update was launched.

The biggest change with this iteration was that Penguin became a part of the core algorithm.

When an algorithm becomes part of the core, it doesn’t mean that the algorithm’s functionality has changed or may change dramatically again.

It means that Google’s perception of the algorithm has changed, not the algorithm itself.

Now running concurrently with the core, Penguin evaluates websites and links in real time. This means you can see (reasonably) instant impacts of your link building or remediation work.

The new Penguin also stopped handing out link-based penalties and instead devalued the spammy links themselves. This is a contrast to previous Penguin iterations, where the website itself was punished.

That being said, based on studies and personal experience, algorithmic penalties relating to backlinks still do exist.

Data released by SEO professionals (e.g., Michael Cottam), as well as algorithmic downgrades being lifted through disavow files after Penguin 4.0, reinforce this belief.

Google Penguin Algorithmic Downgrades

Soon after the Penguin algorithm was introduced, webmasters and brands who had used manipulative link building techniques or filled their backlink profiles with copious amounts of low-quality links began to see decreases in their organic traffic and rankings.

Not all Penguin downgrades were site-wide – some were partial and only affected certain keyword groups that had been heavily spammed and over-optimized, such as key products and in some cases even brands.

A website impacted by a Penguin penalty, which took 17 months to lift.

The impact of Penguin can also pass between domains, so changing domains and redirecting the old one to the new can cause more problems in the long run.

Experiments and research show that using a 301 or 302 redirect won’t remove the effect of Penguin, and in the Google Webmasters Forum, John Mueller confirmed that using a meta refresh from one domain to a new domain could also cause complications.

In general, we recommend not using meta-refresh type redirects, as this can cause confusion with users (and search engine crawlers, who might mistake that for an attempted redirect).

Google Penguin Recovery

The disavow tool has been an asset to SEO practitioners, and this hasn’t changed even now that Penguin exists as part of the core algorithm.

As you would expect, there have been studies and theories published claiming that disavowing links doesn’t, in fact, do anything to help with link-based algorithmic downgrades and manual actions, but this theory has been shot down by Google representatives publicly.

That being said, Google recommends that the disavow tool should only be used as a last resort when dealing with link spam, as disavowing a link is a lot easier (and quicker to take effect) than submitting reconsideration requests for good links.

What To Include In A Disavow File

A disavow file is a file you submit to Google that tells them to ignore all the links included in the file so that they will not have any impact on your site.

The result is that the negative links will no longer cause negative ranking issues with your site, such as with Penguin.

But, it does also mean that if you erroneously included high-quality links in your disavow file, those links will no longer help your ranking.

You do not need to include any notes in your disavow file unless they are strictly for your reference. It is fine just to include the links and nothing else.

Google does not read any of the notations you have made in your disavow file, as they process it automatically without a human ever reading it.

Some find it useful to add internal notations, such as the date a group of URLs was added to the disavow file or comments about their attempts to reach the webmaster about getting a link removed.
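Google’s documented format for the file is plain text (.txt): one URL or domain per line, with the domain: prefix covering an entire domain, and lines starting with # treated as comments and ignored. A minimal sketch of what one might look like (every domain and URL below is a hypothetical example):

    # Spammy domains found in the May link audit
    domain:spammy-directory.example
    domain:paid-link-network.example

    # Individual pages where only specific links are bad
    https://bigsite.example/sponsored-post/
    https://forum.example/widget-thread

Remember that each new upload replaces the previous file entirely, so a file like the one above would also need to carry over any lines from an earlier disavow.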

Once you have uploaded your disavow file, Google will send you a confirmation.

But while Google will process it immediately, it will not immediately discount those links. So, you will not instantly recover from submitting the disavow alone.

Google still needs to go out and crawl those individual links you included in the disavow file, but the disavow file itself will not prompt Google to crawl those pages specifically.

Also, there is no way to determine which links have been discounted and which ones have not been, as Google will still include both in your linking report in Google Search Console.

If you have previously submitted a disavow file to Google, they will replace that file with your new one, not add to it.

So, it is important to make sure that if you have previously disavowed links, you still include those links in your new disavow file.

You can always download a copy of the current disavow file in Google Search Console.

Disavowing Individual Links vs. Domains

It is recommended that you disavow links at the domain level instead of disavowing individual links.

There will be some cases where you will want to disavow specific individual links, such as on a major site that has a mix of quality and paid links.

But for the majority of links, you can do a domain-based disavow.

Google only needs to crawl one page on that site for that link to be discounted on your site.

Doing domain-based disavows also means that you do not have to worry about those links being indexed as www or non-www, as the domain-based disavow will take this into account.
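As a quick illustration of the difference (again with a hypothetical domain), one domain: line replaces any number of URL lines and covers the www and non-www variants automatically:

    # Domain-level: one line discounts links from every page on the domain
    domain:spammy-network.example

    # URL-level: each indexed variant would need its own line
    https://www.spammy-network.example/widgets/
    https://spammy-network.example/widgets/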

Finding Your Backlinks

If you suspect your site has been negatively impacted by Penguin, you need to do a link audit and remove or disavow the low-quality or spammy links.

Google Search Console includes a list of backlinks for site owners, but be aware that it also includes links that are already nofollowed.

If the link is nofollowed, it will not have any impact on your site. But keep in mind that the site could remove that nofollow in the future without warning.

There are also many third-party tools that will show links to your site, but because some websites block those third-party bots from crawling their site, they will not be able to show you every link pointing to your site.

And while some of the sites blocking these bots are high-quality, well-known sites that don’t want to waste bandwidth on those bots, the same blocking is also used by some spammy sites to hide their low-quality links from being reported.

Monitoring backlinks is also an essential task, as sometimes the industry we work in isn’t entirely honest and negative SEO attacks can happen. That’s when a competitor buys spammy links and points them to your site.

Many use “negative SEO” as an excuse when their site gets caught by Google for low-quality links.

However, Google has said they are pretty good about recognizing this when it happens, so it is not something most website owners need to worry about.

This also means that proactively using the disavow feature without a clear sign of an algorithmic penalty or a notification of a manual action is a good idea.

Interestingly, however, a poll conducted by SEJ in September 2017 found that 38% of SEOs never disavow backlinks.

Going through a backlink profile, and scrutinizing each linking domain as to whether it’s a link you want or not, is not a light task.

Link Removal Outreach

Google recommends that you first reach out to the websites and webmasters where the bad links originate and request their removal before you start disavowing them.

Some site owners demand a fee to remove a link.

Google recommends never paying for link removals. Just include those links in your disavow file instead and move on to the next link removal.

While outreach is an effective way to recover from a link-based penalty, it isn’t always necessary.

The Penguin algorithm also takes into account the link profile as a whole, and the volume of high-quality, natural links versus the number of spammy links.

In instances of a partial penalty (impacting over-optimized keywords), the algorithm may still affect you, but the essentials of backlink maintenance and monitoring should keep you covered.

Some webmasters even go as far as including “terms” within the terms and conditions of their website and actively outreaching to websites they don’t feel should be linking to them:

Website terms and conditions regarding linking to the website in question.

Assessing Link Quality

Many have trouble when assessing link quality.

Don’t assume that because a link comes from a .edu site that it is high-quality.

Plenty of students sell links from their personal websites on those .edu domains, and such links are extremely spammy and should be disavowed.

Likewise, there are plenty of hacked sites within .edu domains that have low-quality links.

Do not make judgments based strictly on the type of domain; just as you can’t make automatic assumptions about .edu domains, the same applies to all TLDs and ccTLDs.

Google has confirmed that just being on a specific TLD does not help or hurt the search rankings. But you do need to make individual assessments.

There is a long-running joke that there has never been a quality page on a .info domain because so many spammers used them. In fact, there are some great quality links coming from that TLD, which shows why individual assessment of links is so important.

Beware Of Links From Presumed High-Quality Sites

Don’t look at the list of links and automatically consider links from specific websites as being a great quality link, unless you know that very specific link is high quality.

Just because you have a link from a major website such as Huffington Post or the BBC does not make that an automatic high-quality link in the eyes of Google – if anything, you should question it more.

Many of those sites are also selling links, albeit some disguised as advertising or done by a rogue contributor selling links within their articles.

Many SEOs who have received link-based manual actions have seen links from these high-quality sites included in Google’s examples, confirming that such links can indeed be low-quality. And yes, they could likely be contributing to a Penguin issue.

As advertorial content increases, we are going to see more and more links like these get flagged as low-quality.

Always investigate links, especially if you are considering not removing any of them simply based on the site the link is from.

Promotional Links

As with advertorials, you need to think about any links that sites may have pointed to you that could be considered promotional links.

Paid links do not always mean money is exchanged for the links.

Examples of promotional links that are technically paid links in Google’s eyes are any links given in exchange for a free product for review or a discount on products.

While these types of links were fine years ago, they now need to be nofollowed.

You will still get value from the link, but through brand awareness and traffic rather than through rankings.

You may have links out there from a promotional campaign done years ago that are now negatively impacting a site.

For all these reasons, it is vitally important to individually assess every link. You want to remove the poor quality links because they are impacting Penguin or could cause a future manual action.

But, you do not want to remove the good links, because those are the links that are helping your rankings in the search results.

Promotional links that are not nofollowed can also trigger the manual action for outgoing links on the site that placed those links.

No Penguin Recovery In Sight?

Sometimes after webmasters have gone to great lengths to clean up their link profiles, they still don’t see an increase in traffic or rankings.

There are a number of possible reasons behind this, including:

  • The initial traffic and ranking boost seen prior to the algorithmic penalty was unjustified (and likely short-term) and came from the bad backlinks.
  • Once links were removed, no efforts were made to gain new backlinks of greater value.
  • Not all of the negative backlinks have been disavowed, or a high enough proportion of them removed.
  • The issue wasn’t link-based to begin with.

When you recover from Penguin, don’t expect your rankings to go back to where they used to be before Penguin, nor for the return to be immediate.

Far too many site owners are under the impression that they will immediately begin ranking at the top for their top search queries once Penguin is lifted.

First, some of the links that you disavowed were likely contributing to an artificially high ranking, so you cannot expect those rankings to be as high as they were before.

Second, because many site owners have trouble assessing the quality of the links, some high-quality links inevitably get disavowed in the process, links that were contributing to the higher rankings.

Add to the mix the fact that Google is constantly changing its ranking algorithm, so factors that benefited you previously might not have as big of an impact now, and vice versa.

Google Penguin Myths & Misconceptions

One of the great things about the SEO industry and those involved in it is that it’s a very active and vibrant community, and there are always new theories and experiment findings being published online daily.

Naturally, this has led to a number of myths and misconceptions being born about Google’s algorithms. Penguin is no different.

Here are a few myths and misconceptions about the Penguin algorithm we’ve seen over the years.

Myth: Penguin Is A Penalty

One of the biggest myths about the Penguin algorithm is that people call it a penalty (or what Google refers to as a manual action).

Penguin is strictly algorithmic in nature. It cannot be lifted by Google manually.

Despite the fact that an algorithmic change and a penalty can both cause a big downturn in website rankings, there are some pretty drastic differences between them.

A penalty (or manual action) happens when a member of Google’s webspam team has responded to a flag, investigated, and felt the need to enforce a penalty on the domain.

You will receive a notification through Google Search Console relating to this manual action.

When you get hit by a manual action, not only do you need to review your backlinks and submit a disavow for the spammy ones that go against Google’s guidelines, but you also need to submit a reconsideration request to the Google webspam team.

If successful, the penalty will be revoked; and if unsuccessful, it’s back to reviewing the backlink profile.

A Penguin downgrade happens without any involvement of a Google team member. It’s all done algorithmically.

Previously, you would have to wait for a refresh or algorithm update, but now, Penguin runs in real-time so recoveries can happen a lot faster (if enough remediation work has been done).

Myth: Google Will Notify You If Penguin Hits Your Site

Another myth about the Google Penguin algorithm is that you will be notified when it has been applied.

Unfortunately, this isn’t true. Search Console won’t notify you that your rankings have taken a dip because of Penguin.

Again, this shows the difference between an algorithm and a penalty – you would be notified if you were hit by a penalty.

However, the process of recovering from Penguin is remarkably similar to that of recovering from a penalty.

Myth: Disavowing Bad Links Is The Only Way To Reverse A Penguin Hit

While this tactic will remove a lot of the low-quality links, it is extremely time-consuming and a potential waste of resources.

Google Penguin looks at the percentage of good quality links compared to those of a spammy nature.

So, rather than focusing on manually removing those low-quality links, it may be worth focusing on increasing the number of quality links your website has.

This will have a better impact on the percentage Penguin takes into account.

Myth: You Can’t Recover From Penguin

Yes, you can recover from Penguin.

It is possible, but it will require some experience in dealing with the fickle nature of Google algorithms.

The best way to shake off the negative effects of Penguin is to forget about the existing links pointing to your website and begin earning original, editorially given links.

The more of these quality links you gain, the easier it will be to release your website from the grip of Penguin.


Featured Image: Paulo Bobita/Search Engine Journal




Why Building a Brand is Key to SEO

For better or worse, brands dominate Google search results. As more results are generated by AI and machines start to understand the offline and online world, big brands are only going to get more powerful. 

Watch on-demand as we tackle the challenge of competing with dominant brands in Google search results. We explained why big brands lead the rankings and how to measure your own brand’s impact against these competitors.

We even shared actionable strategies for improving your visibility by weaving your brand into your SEO.

You’ll learn:

  • Why brands dominate Google (and will continue to do so).
  • How to measure your brand’s impact on search, and what you should focus on.
  • Ways to weave your brand’s identity into your content.

With Dr. Pete Meyers, we explored why brand marketing is vital to search marketing, and how to incorporate your brand into your everyday content and SEO efforts.

If you’re looking to have your brand stand out in a sea of competition, you won’t want to miss this.

View the slides below, or check out the full presentation for all the details.

 

Join Us For Our Next Webinar!

Optimizing For Google’s New Landscape And The Future Of Search

Join us as we dive deep into the evolution reshaping Google’s search rankings in 2024 and beyond. We’ll show you actionable insights to help you navigate the disruption and emerge with a winning SEO strategy.

How SEO Can Capture Demand You Create Elsewhere

Generating demand is about making people want stuff they had no desire to buy before encountering your marketing. 

Sometimes, it’s a short-term play, like an ecommerce store creating buzz before launching a new product. Other times, like with B2B marketing, it’s a long-term play to engage out-of-market audiences.

In either situation, demand generation can quickly become an expensive marketing activity.

Here are some ways SEO can help you capture and retain the demand you’re generating so your marketing budget goes further.

How is demand typically generated? 

There’s no right or wrong way to generate demand. Any marketing activity that generates a desire to buy something (where there wasn’t such a desire before) can be considered demand generation.

Common examples include using:

  • Paid ads
  • Word of mouth
  • Social media
  • Video marketing
  • Email newsletters
  • Content marketing
  • Community marketing

For example, Pryshan is a small local brand in Australia that has created a new type of exfoliating stone from clay. They’ve been selling it offline since 2018, if not earlier.

It’s not a groundbreaking innovation, but it’s also not been done before.

To launch their product online, they started running a bunch of Facebook ads:

Because of their ads, this company is in the early stages of generating demand for its product. Sure, it’s not the type of marketing that will go viral, but it’s still a great example of demand gen.

Looking at search volume data, there are 40 searches per month for the keyword “clay stone exfoliator” in Australia and a handful of other related searches:

Ahrefs' keyword metrics for "clay stone exfoliator" and similar keywords indicating over 100 searches per month when aggregated.

However, these same keywords get hardly any searches in the US:

Search volumes for the clay stone exfoliator keywords in the US are all 0 to 10.

This never happens.

Australia has a much smaller population than the US. For non-localized searches, Australian search volume is usually about 6-10% of US search volume for the same keywords.

Take a look at the most popular searches as an example:

Side by side comparison of search volumes in the US compared to Australia for the keywords YouTube, Facebook, Wordle, Gmail and Google.

Pryshan’s advertising efforts on other platforms directly create the search demand for exfoliating clay stones.

It doesn’t matter where or how you educate people about the product you sell. What matters is shifting their perceptions from cognitive awareness to emotional desire.

Emotions trigger actions, and usually, the first action people take once they become aware of a cool new thing is to Google it.

If you’re not including SEO as part of your marketing efforts, here are three things you can do to:

  • minimize budget wastage
  • capture interest when people search
  • convert the audiences you’re already reaching

1. Make your product, service, or innovation searchable 

If you’re working hard to create demand for your product, make sure it’s easy for people to discover it when they search Google.

  • Give it a simple name that’s easy to remember
  • Label it according to how people naturally search
  • Avoid any terms that create ambiguities with an existing thing

For example, the concept of a clay exfoliating stone is easy for people to remember.

Even if they don’t remember what Pryshan calls their product, they’ll remember the videos and images they saw of the product being used to exfoliate people’s skin. They’ll remember it’s made from clay instead of a more common material like pumice.

It makes sense for Pryshan to call its product something similar to what people will be inclined to search for.

In this example, however, the context of exfoliation is important.

If Pryshan chooses to call its product “clay stones,” it will have a harder time disambiguating itself from gardening products in search results. It’s already the odd one out in SERPs for such keywords:

Pryshan's shop listing on Google for the keyword "clay stones" is among gardening products.

When you go through your branding exercises to decide what to call your product or innovation, it helps to search your ideas on Google.

This way, you’ll easily see what phrases to avoid so that your product isn’t being grouped with unrelated things.

2. Own as much real estate on search results as you can 

Imagine being part of a company that invested a lot of money in re-branding itself. New logo, new slogan, new marketing materials… the lot.

On the back of their new business cards, the designers thought inviting people to search for the new slogan on Google would be clever.

The only problem was that this company didn’t rank for the slogan.

They weren’t showing up at all! (Yes, it’s a true story; no, I can’t share the brand’s name.)

This tactic isn’t new. Many businesses leverage the fact that people will Google things to convert offline audiences into online audiences through their printed, radio, and TV ads.

Billboard that includes a Google search for "cheesesteaks nearby".

Don’t do this if you don’t already own the search results page.

It’s not only a very expensive mistake to make, but it gives the conversions you’ve worked hard for directly to your competitors.

Instead, use SEO to become the only brand people see when they search for your brand, product, or something that you’ve created.

SERP results that can capture demand.

Let’s use Pryshan as an example.

They’re the first brand to create exfoliating clay stones. Their audience has created a few new keywords to find Pryshan’s products on Google, with “clay stone exfoliator” being the most popular variation.

Yet even though it’s a product they’ve brought to market, competitors and retailers are already encroaching on their SERP real estate for this keyword:

Search results for the keyword "clay stone exfoliator" and where Pryshan shows up.

Sure, Pryshan holds four of the organic spots, but it’s not enough.

Many competitors are showing up in the paid product carousel before Pryshan’s website can be seen by searchers:

Sponsored product listings on Google.

They’re already paying for Facebook ads, why not consider some paid Google placements too?

Not to mention, stockists and competitors are ranking for three of the other organic positions.

Having stockists show up for your product may not seem so bad, but if you’re not careful, they may undercut your prices or completely edge you out of the SERPs.

This is also a common tactic used by affiliate marketers to earn commissions from brands that are not SEO-savvy.

In short, SEO can help you protect your brand presence on Google.

3. Use search data to measure demand gen success 

If you’re working hard to generate demand for a cool new thing that’s never been done before, it can be hard to know if it’s working.

Sure, you can measure sales. But a lot of the time, demand generation doesn’t turn into immediate sales.

B2B marketing is a prominent example. Educating and converting out-of-market audiences into in-market prospects can take a long time.

That’s where SEO data can help close the gap and give you data to get more buy-in from decision-makers.

Measure increases in branded searches

A natural byproduct of demand generation activities is that people search more for your brand (or they should if you’re doing it right).

Tracking if your branded keywords improve over time can help you gauge how your demand generation efforts are going.

In Ahrefs, you can use Rank Tracker to monitor how many people discover your website from your branded searches and whether these are trending up:

Example of Ahrefs' Rank Tracker dashboard.

If your brand is big enough and gets hundreds of searches a month, you can also check out this nifty graph that forecasts search potential in Keywords Explorer:

Example of Ahrefs' keyword metrics indicating monthly search volume and a graph of forecasted growth.

Discover and track new keywords about your products, services or innovations

If, as part of your demand generation strategy, you’re encouraging people to search for new keywords relating to your product, service, or innovation, set up alerts to monitor your presence for those terms.

This method will also help you uncover the keywords your audience naturally uses anyway.

Start by going to Ahrefs Alerts and setting up a new keyword alert.

How to set up Ahrefs' Alert feature.

Add your website.

Leave the volume setting untouched (you want to include low search volume keywords so you discover the new searches people make).

Set your preferred email frequency, and voila, you’re done.

Monitor visibility against competitors

If you’re worried other brands may steal your spotlight in Google’s search results, you can also use Ahrefs to monitor your share of the traffic compared to them.

I like to use the Share of Voice graph in Site Explorer to do this. It looks like this:

Using Ahrefs' Share of Voice graph to compare the traffic from multiple websites.

This graph is a great bird’s eye view of how you stack up against competitors and if you’re at risk of losing visibility to any of them.

Final thoughts

As SEO professionals, it’s easy to forget how hard some businesses work to generate demand for their products or services.

Demand always comes first, and it’s our job to capture it.

It’s not a chicken or egg scenario. The savviest marketers use this to their advantage by creating their own SEO opportunities long before competitors figure out what they’re doing.

If you’ve seen other great examples of how SEO and demand generation work together, share them with me on LinkedIn anytime.

 

Google Explains How Cumulative Layout Shift (CLS) Is Measured

Google’s Web Performance Developer Advocate, Barry Pollard, has clarified how Cumulative Layout Shift (CLS) is measured.

CLS quantifies how much unexpected layout shift occurs when a person browses your site.

This metric matters to SEO as it’s one of Google’s Core Web Vitals. Pages with low CLS scores provide a more stable experience, potentially leading to better search visibility.

How is it measured? Pollard addressed this question in a thread on X.

Understanding CLS Measurement

Pollard began by explaining the nature of CLS measurement:

“CLS is ‘unitless’ unlike LCP and INP which are measured in seconds/milliseconds.”

He further clarified:

“Each layout shift is calculated by multiplying two percentages or fractions together: what moved (impact fraction) and how much it moved (distance fraction).”

This calculation method helps quantify the severity of layout shifts.

As Pollard explained:

“The whole viewport moves all the way down – that’s worse than just half the viewport moving all the way down. The whole viewport moving down a little? That’s not as bad as the whole viewport moving down a lot.”
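As a rough sketch of that calculation in Python (the function name and the example values are illustrative, not any browser API), a shift’s score is simply the product of the two fractions:

    def layout_shift_score(impact_fraction: float, distance_fraction: float) -> float:
        # What moved (fraction of the viewport affected) multiplied by
        # how much it moved (distance as a fraction of the viewport).
        return impact_fraction * distance_fraction

    layout_shift_score(1.0, 1.0)  # whole viewport moves a full viewport down: 1.0
    layout_shift_score(0.5, 1.0)  # half the viewport moves all the way down: 0.5
    layout_shift_score(1.0, 0.1)  # whole viewport nudges down slightly: 0.1

This mirrors Pollard’s comparison: half the viewport moving all the way down (0.5) scores worse than the whole viewport moving down a little (0.1).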

Worst-Case Scenario

Pollard described the worst-case scenario for a single layout shift:

“The maximum layout shift is if 100% of the viewport (impact fraction = 1.0) is moved one full viewport down (distance fraction = 1.0).

This gives a layout shift score of 1.0 and is basically the worst type of shift.”

However, he reminds us of the cumulative nature of CLS:

“CLS is Cumulative Layout Shift, and that first word (cumulative) matters. We take all the individual shifts that happen within a short space of time (max 5 seconds) and sum them up to get the CLS score.”

Pollard explained the reasoning behind the 5-second measurement window:

“Originally we cumulated ALL the shifts, but that didn’t really measure the UX—especially for pages opened for a long time (think SPAs or email). Measuring all shifts meant, given enough time, even the best pages would fail!”

He also noted the theoretical maximum CLS score:

“Since each element can only shift when a frame is drawn and we have a 5 second cap and most devices run at 60fps, that gives a theoretical cap on CLS of 5 secs * 60 fps * 1.0 max shift = 300.”
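A toy version of the windowing logic in Python may make the “cumulative” part concrete. The 5-second window cap comes from the thread; the 1-second maximum gap between shifts within a window, and taking the largest window as the page’s CLS, come from the documented Core Web Vitals definition rather than from Pollard’s posts, and real browsers handle edge cases this sketch ignores:

    def cumulative_layout_shift(shifts: list[tuple[float, float]]) -> float:
        # shifts: (timestamp_seconds, shift_score) pairs, ordered by time.
        # Shifts in quick succession (gap <= 1s, window capped at 5s) are
        # summed, and the worst window is reported as the page's CLS.
        MAX_GAP, MAX_WINDOW = 1.0, 5.0
        best = current = 0.0
        window_start = prev_time = None
        for t, score in shifts:
            if (prev_time is None or t - prev_time > MAX_GAP
                    or t - window_start > MAX_WINDOW):
                window_start, current = t, 0.0  # start a new session window
            current += score
            best = max(best, current)
            prev_time = t
        return best

    # Three 0.05 shifts in one burst sum to 0.15, landing in the
    # "needs improvement" range even though each shift alone is
    # under the 0.1 "good" threshold:
    cumulative_layout_shift([(0.2, 0.05), (0.6, 0.05), (1.1, 0.05)])  # 0.15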

Interpreting CLS Scores

Pollard addressed how to interpret CLS scores:

“… it helps to think of CLS as a percentage of movement. The good threshold of 0.1 means the page moved about 10%—which could mean the whole page moved 10%, or half the page moved 20%, or lots of little movements were equivalent to either of those.”

Regarding the specific threshold values, Pollard explained:

“So why is 0.1 ‘good’ and 0.25 ‘poor’? That’s explained here as it was a combination of what we’d want (CLS = 0!) and what is achievable … 0.05 was actually achievable at the median, but for many sites it wouldn’t be, so we went slightly higher.”

See also: How You Can Measure Core Web Vitals

Why This Matters

Pollard’s insights provide web developers and SEO professionals with a clearer understanding of measuring and optimizing for CLS.

As you work with CLS, keep these points in mind:

  • CLS is unitless and calculated from impact and distance fractions.
  • It’s cumulative, measuring shifts over a 5-second window.
  • The “good” threshold of 0.1 roughly equates to 10% of viewport movement.
  • CLS scores can exceed 1.0 due to multiple shifts adding up.
  • The thresholds (0.1 for “good”, 0.25 for “poor”) balance ideal performance with achievable goals.

With this insight, you can make adjustments to achieve Google’s threshold.


Featured Image: Piscine26/Shutterstock


