A Complete Guide To the Google Penguin Algorithm Update

Ten years have passed since Google introduced the Penguin algorithm and took a stronger stance on manipulative link-building practices.

The algorithm has had a number of updates and has become a real-time part of the core Google algorithm. As a result, penalties have become less common, but they still exist in both partial and site-wide formats.

Unnatural links warning in Google Search Console (Screenshot by author, May 2022).

For the most part, Google claims to ignore a lot of poor-quality links online, but is still alert and monitoring for unnatural patterns such as link schemes, PBNs, link exchanges, and unnatural outbound linking patterns.

The Introduction Of Penguin

In 2012, Google officially launched the “webspam algorithm update,” which specifically targeted link spam and manipulative link-building practices.

The webspam algorithm later became known (officially) as the Penguin algorithm update via a tweet from Matt Cutts, who was then head of the Google webspam team.

While Google officially named the algorithm Penguin, there is no official word on where this name came from.

The Panda algorithm name came from one of the key engineers involved with it, and it’s more than likely that Penguin originated from a similar source.

One of my favorite Penguin naming theories is that it pays homage to The Penguin, from DC’s Batman.

Prior to the Penguin algorithm, link volume played a larger part in determining a webpage’s scoring when crawled, indexed, and analyzed by Google.

This meant when it came to ranking websites by these scores for search results pages, some low-quality websites and pieces of content appeared in more prominent positions in the organic search results than they should have.

Why Google Penguin Was Needed

Google’s war on low-quality content started with the Panda algorithm, and Penguin was an extension of, and addition to, the arsenal for fighting this war.

Penguin was Google’s response to the increasing practice of manipulating search results (and rankings) through black hat link building techniques.

Cutts, speaking at the SMX Advanced 2012 conference, said:

“We look at it as something designed to tackle low-quality content. It started out with Panda, and then we noticed that there was still a lot of spam and Penguin was designed to tackle that.”

The algorithm’s objective was to gain greater control over, and reduce the effectiveness of, a number of black hat spamming techniques.

By better understanding and processing the types of links websites and webmasters were earning, Penguin worked toward ensuring that natural, authoritative, and relevant links rewarded the websites they pointed to, while manipulative and spammy links were downgraded.

Penguin only deals with a site’s incoming links. Google only looks at the links pointing to the site in question and does not look at the outgoing links at all from that site.

Initial Launch & Impact

When Penguin first launched in April 2012, it affected more than 3% of search results, according to Google’s own estimations.

Penguin 2.0, the fourth update to the algorithm (including the initial launch), was released in May 2013 and affected roughly 2.3% of all queries.

On launch, Penguin was said to target two manipulative practices in particular: link schemes and keyword stuffing.

Link schemes are the umbrella term for manipulative link building practices, such as exchanges, paying for links, and other unnatural link practices outlined in Google’s link scheme documentation.

Penguin’s initial launch also took aim at keyword stuffing, which has since become associated with the Panda algorithm (which is thought of as more of a content and site quality algorithm).

Key Google Penguin Updates & Refreshes

There have been a number of updates and refreshes to the Penguin algorithm since it was launched in 2012, and possibly a number of other tweaks that have gone down in history as unknown algorithm updates.

Google Penguin 1.1: May 25, 2012

This wasn’t a change to the algorithm itself, but the first refresh of the data within it.

In this instance, websites that had initially been affected by the launch and had been proactive in clearing up their link profiles saw some recovery, while others who hadn’t been caught by Penguin the first time round saw an impact.

Google Penguin 1.2: October 5, 2012

This was another data refresh. It affected English-language queries as well as international queries.

Google Penguin 2.0: May 22, 2013

This was a more technically advanced version of the Penguin algorithm and changed how the algorithm impacted search results.

Penguin 2.0 impacted around 2.3% of English queries, as well as other languages proportionately.

This was also the first Penguin update to look deeper than the website’s homepage and top-level category pages for evidence of link spam being directed at the website.

Google Penguin 2.1: October 4, 2013

The only refresh to Penguin 2.0 (2.1) came on October 4 of the same year. It affected about 1% of queries.

While there was no official explanation from Google, data suggests that the 2.1 refresh also went further, crawling deeper into websites and conducting additional analysis of whether spammy links were present.

Google Penguin 3.0: October 17, 2014

While this was named as a major update, it was, in fact, another data refresh. It allowed those impacted by previous updates to recover, while many others who had continued to use spammy link practices, and had so far escaped the radar, saw an impact.

Googler Pierre Far confirmed this through a post on his Google+ profile, adding that the update would take a “few weeks” to roll out fully.

Far also stated that this update affected less than 1% of English search queries.

Google Penguin 4.0: September 23, 2016

Almost two years after the 3.0 refresh, the final Penguin algorithm update was launched.

The biggest change with this iteration was that Penguin became a part of the core algorithm.

When an algorithm becomes a part of the core, it doesn’t mean that the algorithm’s functionality has changed or that it may change dramatically again.

It means that Google’s perception of the algorithm has changed, not the algorithm itself.

Now running concurrently with the core, Penguin evaluates websites and links in real time. This means you can see (reasonably) instant impacts of your link building or remediation work.

The new Penguin also wasn’t closed-fisted in handing out link-based penalties but rather devalued the spammy links themselves. This is a contrast to the previous Penguin iterations, where the site carrying the negative links was punished.

That being said, studies and personal experience suggest that algorithmic penalties relating to backlinks still exist.

Data released by SEO professionals (e.g., Michael Cottam), as well as algorithmic downgrades being lifted through disavow files after Penguin 4.0, reinforces this belief.

Google Penguin Algorithmic Downgrades

Soon after the Penguin algorithm was introduced, webmasters and brands who had used manipulative link building techniques or filled their backlink profiles with copious amounts of low-quality links began to see decreases in their organic traffic and rankings.

Not all Penguin downgrades were site-wide – some were partial and only affected certain keyword groups that had been heavily spammed and over-optimized, such as key products and in some cases even brands.

A website impacted by a Penguin penalty, which took 17 months to lift.

The impact of Penguin can also pass between domains, so changing domains and redirecting the old one to the new can cause more problems in the long run.

Experiments and research show that using a 301 or 302 redirect won’t remove the effect of Penguin, and in the Google Webmasters Forum, John Mueller confirmed that using a meta refresh from one domain to a new domain could also cause complications.

“In general, we recommend not using meta-refresh-type redirects, as this can cause confusion with users (and search engine crawlers, who might mistake that for an attempted redirect).”

Google Penguin Recovery

The disavow tool has been an asset to SEO practitioners, and this hasn’t changed even now that Penguin exists as part of the core algorithm.

As you would expect, studies and theories have been published claiming that disavowing links doesn’t, in fact, do anything to help with link-based algorithmic downgrades and manual actions, but this theory has been shot down publicly by Google representatives.

That being said, Google recommends that the disavow tool should only be used as a last resort when dealing with link spam, as disavowing a link is a lot easier (and a quicker process in terms of its effect) than submitting reconsideration requests for good links.

What To Include In A Disavow File

A disavow file is a file you submit to Google that tells them to ignore all the links included in the file so that they will not have any impact on your site.

The result is that the negative links will no longer cause negative ranking issues with your site, such as with Penguin.

But, it does also mean that if you erroneously included high-quality links in your disavow file, those links will no longer help your ranking.

You do not need to include any notes in your disavow file unless they are strictly for your reference. It is fine just to include the links and nothing else.

Google does not read any of the notations you have made in your disavow file, as they process it automatically without a human ever reading it.

Some find it useful to add internal notations, such as the date a group of URLs was added to the disavow file or comments about their attempts to reach the webmaster about getting a link removed.
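For reference, a disavow file is a plain text (.txt) file with one entry per line: a full URL for an individual link, a line with the “domain:” prefix for an entire domain, and lines starting with “#” treated as comments that Google ignores. Here is a minimal illustrative example (the domains and URLs are placeholders, not real sites):

  # Spammy directory links - removal requested, no response
  domain:spammy-directory.example
  domain:cheap-links.example

  # Individual paid links on otherwise decent sites
  https://blog.example.net/sponsored-post-123/
  https://forum.example.org/thread/456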

Once you have uploaded your disavow file, Google will send you a confirmation.

But while Google will process it immediately, it will not immediately discount those links. So, you will not instantly recover from submitting the disavow alone.

Google still needs to go out and crawl those individual links you included in the disavow file, but the disavow file itself will not prompt Google to crawl those pages specifically.

Also, there is no way to determine which links have been discounted and which ones have not been, as Google will still include both in your linking report in Google Search Console.

If you have previously submitted a disavow file to Google, they will replace that file with your new one, not add to it.

So, it is important to make sure that if you have previously disavowed links, you still include those links in your new disavow file.

You can always download a copy of the current disavow file in Google Search Console.

Disavowing Individual Links vs. Domains

It is recommended that you choose to disavow links on a domain level instead of disavowing the individual links.

There will be some cases where you will want to disavow specific individual links, such as on a major site that has a mix of quality and paid links.

But for the majority of links, you can do a domain-based disavow.

Google only needs to crawl one page on that site for that link to be discounted on your site.

Doing domain-based disavows also means that you do not have to worry about those links being indexed as www or non-www, as the domain-based disavow will take this into account.
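If you have already collected a list of spammy URLs, a short script can collapse them into domain-level entries ready for your disavow file. The following is a minimal Python sketch; the URL list is hypothetical and only illustrates the approach:

  from urllib.parse import urlparse

  # Hypothetical list of spammy URLs gathered during a link audit.
  spammy_urls = [
      "https://www.cheap-links.example/widgets/page-1.html",
      "http://cheap-links.example/widgets/page-2.html",
      "https://blog.spam.example/post/123",
  ]

  domains = set()
  for url in spammy_urls:
      host = urlparse(url).netloc.lower()
      # Strip a leading "www." so www and non-www collapse into one entry.
      host = host[4:] if host.startswith("www.") else host
      domains.add(host)

  # Print domain-level disavow entries, ready to paste into the disavow file.
  for domain in sorted(domains):
      print(f"domain:{domain}")

Run against the sample list, this would output two lines: domain:blog.spam.example and domain:cheap-links.example.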

Finding Your Backlinks

If you suspect your site has been negatively impacted by Penguin, you need to do a link audit and remove or disavow the low-quality or spammy links.

Google Search Console includes a list of backlinks for site owners, but be aware that it also includes links that are already nofollowed.

If the link is nofollowed, it will not have any impact on your site. But keep in mind that the site could remove that nofollow in the future without warning.

There are also many third-party tools that will show links to your site, but because some websites block those third-party bots from crawling their site, they will not be able to show you every link pointing to your site.

And while some of the sites blocking these bots are high-quality, well-known sites that don’t want to waste bandwidth on those bots, the same blocking is also used by some spammy sites to hide their low-quality links from being reported.
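If you want a rough, tool-agnostic starting point for an audit, you can export your links (from Google Search Console or a third-party tool) and aggregate them by linking domain. The Python sketch below assumes a simple CSV export with the linking URL in the first column; the filename, header row, and column layout are assumptions you would adjust to match whichever export you use:

  import csv
  from collections import Counter
  from urllib.parse import urlparse

  link_counts = Counter()

  # "links.csv" and its layout are assumptions - match them to your own export.
  with open("links.csv", newline="", encoding="utf-8") as f:
      reader = csv.reader(f)
      next(reader, None)  # skip the header row if the export has one
      for row in reader:
          if not row:
              continue
          host = urlparse(row[0]).netloc.lower()
          if host:
              link_counts[host] += 1

  # Review the most frequent linking domains first - large volumes from a single
  # low-quality domain are often the quickest wins in a link audit.
  for domain, count in link_counts.most_common(25):
      print(f"{count:>6}  {domain}")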

Monitoring backlinks is also an essential task, as the industry we work in isn’t always entirely honest and negative SEO attacks can happen. That’s when a competitor buys spammy links and points them at your site.

Many use “negative SEO” as an excuse when their site gets caught by Google for low-quality links.

However, Google has said they are pretty good about recognizing this when it happens, so it is not something most website owners need to worry about.

This also means that proactively using the disavow feature without a clear sign of an algorithmic penalty or a notification of a manual action is a good idea.

Interestingly, however, a poll conducted by SEJ in September 2017 found that 38% of SEOs never disavow backlinks.

Going through a backlink profile, and scrutinizing each linking domain as to whether it’s a link you want or not, is not a light task.

Link Removal Outreach

Google recommends that you first reach out to the websites and webmasters where the bad links originate and request their removal before you start disavowing them.

Some site owners demand a fee to remove a link.

Google recommends never paying for link removals. Just include those links in your disavow file instead and move on to the next link removal.

While outreach is an effective way to recover from a link-based penalty, it isn’t always necessary.

The Penguin algorithm also takes into account the link profile as a whole, and the volume of high-quality, natural links versus the number of spammy links.

In instances of a partial penalty (impacting over-optimized keywords), the algorithm may still affect you, but the essentials of backlink maintenance and monitoring should keep you covered.

Some webmasters even go as far as including “terms” within the terms and conditions of their website and actively outreaching to websites they don’t feel should be linking to them:

Website terms and conditions regarding linking to the website in question.

Assessing Link Quality

Many have trouble when assessing link quality.

Don’t assume that because a link comes from a .edu site, it is high quality.

Plenty of students sell links from their personal websites on those .edu domains, and such links are extremely spammy and should be disavowed.

Likewise, there are plenty of hacked sites within .edu domains that have low-quality links.

Do not make judgments strictly based on the type of domain. Just as you can’t make automatic assumptions about .edu domains, the same applies to all TLDs and ccTLDs.

Google has confirmed that just being on a specific TLD does not help or hurt the search rankings. But you do need to make individual assessments.

There is a long-running joke about how there’s never been a quality page on a .info domain because so many spammers were using them, but in fact, there are some great quality links coming from that TLD, which shows why individual assessment of links is so important.

Beware Of Links From Presumed High-Quality Sites

Don’t look at the list of links and automatically consider links from specific websites as being a great quality link, unless you know that very specific link is high quality.

Just because you have a link from a major website such as Huffington Post or the BBC does not make that an automatic high-quality link in the eyes of Google – if anything, you should question it more.

Many of those sites are also selling links, albeit some disguised as advertising or done by a rogue contributor selling links within their articles.

That links from these high-quality sites can actually be low quality has been confirmed by many SEOs who have received link-related manual actions citing links from these sites in Google’s examples. And yes, they could likely be contributing to a Penguin issue.

As advertorial content increases, we are going to see more and more links like these get flagged as low-quality.

Always investigate links, especially if you are considering not removing any of them simply based on the site the link is from.

Promotional Links

As with advertorials, you need to think about any links that sites may have pointed to you that could be considered promotional links.

Paid links do not always mean money is exchanged for the links.

Examples of promotional links that are technically paid links in Google’s eyes are any links given in exchange for a free product for review or a discount on products.

While these types of links were fine years ago, they now need to be nofollowed.

You will still get value from the link, but instead of it helping rankings, that value comes through brand awareness and traffic.

You may have links out there from a promotional campaign done years ago that are now negatively impacting a site.

For all these reasons, it is vitally important to individually assess every link. You want to remove the poor quality links because they are impacting Penguin or could cause a future manual action.

But, you do not want to remove the good links, because those are the links that are helping your rankings in the search results.

Promotional links that are not nofollowed can also trigger the manual action for outgoing links on the site that placed those links.

No Penguin Recovery In Sight?

Sometimes after webmasters have gone to great lengths to clean up their link profiles, they still don’t see an increase in traffic or rankings.

There are a number of possible reasons behind this, including:

  • The initial traffic and ranking boost seen prior to the algorithmic penalty was unjustified (and likely short term), as it came from the bad backlinks.
  • Once the bad links were removed, no effort was made to gain new backlinks of greater value.
  • Not all of the negative backlinks were disavowed, or a high enough proportion of them was not removed.
  • The issue wasn’t link-based to begin with.

When you recover from Penguin, don’t expect your rankings to go back to where they used to be before Penguin, nor for the return to be immediate.

Far too many site owners are under the impression that they will immediately begin ranking at the top for their top search queries once Penguin is lifted.

First, some of the links that you disavowed were likely contributing to an artificially high ranking, so you cannot expect those rankings to be as high as they were before.

Second, because many site owners have trouble assessing the quality of the links, some high-quality links inevitably get disavowed in the process, links that were contributing to the higher rankings.

Add to the mix the fact that Google is constantly changing its ranking algorithm, so factors that benefited you previously might not have as big of an impact now, and vice versa.

Google Penguin Myths & Misconceptions

One of the great things about the SEO industry and those involved in it is that it’s a very active and vibrant community, and there are always new theories and experiment findings being published online daily.

Naturally, this has led to a number of myths and misconceptions being born about Google’s algorithms. Penguin is no different.

Here are a few myths and misconceptions about the Penguin algorithm we’ve seen over the years.

Myth: Penguin Is A Penalty

One of the biggest myths about the Penguin algorithm is that people call it a penalty (or what Google refers to as a manual action).

Penguin is strictly algorithmic in nature. It cannot be lifted by Google manually.

Despite the fact that an algorithmic change and a penalty can both cause a big downturn in website rankings, there are some pretty drastic differences between them.

A penalty (or manual action) happens when a member of Google’s webspam team has responded to a flag, investigated, and felt the need to enforce a penalty on the domain.

You will receive a notification through Google Search Console relating to this manual action.

When you get hit by a manual action, not only do you need to review your backlinks and submit a disavow for the spammy ones that go against Google’s guidelines, but you also need to submit a reconsideration request to the Google webspam team.

If successful, the penalty will be revoked; and if unsuccessful, it’s back to reviewing the backlink profile.

A Penguin downgrade happens without any involvement of a Google team member. It’s all done algorithmically.

Previously, you would have to wait for a refresh or algorithm update, but now, Penguin runs in real-time so recoveries can happen a lot faster (if enough remediation work has been done).

Myth: Google Will Notify You If Penguin Hits Your Site

Another myth about the Google Penguin algorithm is that you will be notified when it has been applied.

Unfortunately, this isn’t true. Google Search Console won’t notify you that your rankings have taken a dip because Penguin has been applied.

Again, this shows the difference between an algorithm and a penalty – you would be notified if you were hit by a penalty.

However, the process of recovering from Penguin is remarkably similar to that of recovering from a penalty.

Myth: Disavowing Bad Links Is The Only Way To Reverse A Penguin Hit

While this tactic will remove a lot of the low-quality links, it is extremely time-consuming and a potential waste of resources.

Google Penguin looks at the percentage of good quality links compared to those of a spammy nature.

So, rather than focusing on manually removing those low-quality links, it may be worth focusing on increasing the number of quality links your website has.

This will have a better impact on the percentage Penguin takes into account.

Myth: You Can’t Recover From Penguin

Yes, you can recover from Penguin.

It is possible, but it will require some experience in dealing with the fickle nature of Google algorithms.

The best way to shake off the negative effects of Penguin is to forget about your existing backlinks and begin to earn original, editorially given links.

The more of these quality links you gain, the easier it will be to release your website from the grip of Penguin.


Featured Image: Paulo Bobita/Search Engine Journal


What Are the Benefits of SEO? (And How to Get Started)

If you are a business owner, you may have heard about how vital search engine optimization (SEO) is. But how can it help fast-track your business’s growth, get more customers, and make a difference to your bottom line?

In simpler terms: Why do SEO?

1. It increases your organic share of voice

There are an estimated 3.5 billion searches on Google each day. To tap into this audience, you’ll need to do SEO. 

One of the key benefits of SEO is increasing your organic share of voice (SOV). More organic SOV means more traffic, leads, and revenue for your business.

It also means more market share in your industry. We can see from the graph below that there is a strong relationship between SOV and market share.

Relationship between SOV and market share graph

At Ahrefs, we calculate organic SOV by dividing the traffic to the site by the total search traffic for all keywords.

In other words, if you only track one keyword and the top 10 positions are occupied by pages of your website, your SOV is 100%.
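As a rough illustration of that calculation, here is a minimal Python sketch (the traffic figures are made up for the example):

  # Hypothetical monthly organic traffic estimates for the tracked keywords.
  your_traffic = 12_400    # estimated visits landing on your pages
  total_traffic = 96_000   # estimated visits across all ranking pages for those keywords

  sov = your_traffic / total_traffic * 100
  print(f"Organic share of voice: {sov:.1f}%")  # Organic share of voice: 12.9%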

So how can you measure organic SOV?

We first need to create a new project in Ahrefs’ Rank Tracker and add our keywords for the website we want to track.

How to add keywords in Ahrefs' Rank Tracker, via Ahrefs' Rank Tracker

Once you have added a new project and added your keywords, you can go to the dashboard and check your SOV.

It should look something like this:

SOV screenshot, via Ahrefs' Rank Tracker

If you click through the SOV on the dashboard, you can compare your competitors’ SOV to your own.

SOV vs. competitors screenshot, via Ahrefs' Rank Tracker

If no competitors are showing in your dashboard, you can change that by: 

  • Going into Settings.
  • Clicking on the Competitors tab.
  • Clicking on + Add competitor.

Enter your competitors manually or press the + to add them from the list below that shows the keyword intersection.

Adding competitors via settings menu, via Ahrefs' Rank Tracker

Once you are happy, click the Save button and return to the dashboard. You should now be able to compare your competitors’ SOV against your website’s SOV.

Recommendation

Check out Michal Pecánek’s article on SOV to get a detailed guide on how to measure SOV.

2. It’s less intrusive than other types of marketing

Intrusive marketing is annoying.

It may seem obvious. But when our lives are full of adverts, cold calls, and emails from random people trying to sell us their products all the time, it’s very advantageous to be an inbound marketing channel.

Intrusive outreach example, via Linkedin.com
An intrusive marketing example sent to me from LinkedIn.

SEO targets people who are actively searching for your services or products. 

Because of this, it’s excellent at converting—all you need to do is focus on what they are searching for.

You can find what your customers are searching for using Ahrefs’ Keywords Explorer.

Enter a relevant topic and go to the Matching terms report. Here, you’ll see many topics your customers are searching for, which you can target.

Matching terms report example, via Ahrefs' Keywords Explorer

You can search, discover, and analyze keywords using our very own Ahrefs’ Keywords Explorer.

SEO is not just for Christmas.

In all, 45.6% of SEOs say SEO takes about three to six months. It may seem like a long time for the channel to work, but SEO is a long-term marketing channel where growth typically compounds over time. In other words, if you put the time and effort into SEO, the results will likely be well worth it. 

Looking at Ahrefs’ organic traffic graph, most of our rapid growth occurred in the last year or so when our traffic started to compound. 

Ahrefs.com organic traffic performance

Google’s algorithm updates mean that your website’s traffic will fluctuate. But as you can see from the graph, it’s a positive trend in the long term.

Over time, you should be able to rely on SEO to bring in a constant stream of traffic to your site. 

When we asked ~4,300 SEOs how long SEO takes, they gave a variety of responses. But only 16.2% of SEOs said that SEO takes between one and three months.

Pie chart showing percentage breakdown of SEOs responses to how long SEO takes

I agree with the majority of SEOs here. But I would add a qualifier—that it depends on the type of website you are working on.

For example, if you set up a brand-new website, it will have little to no authority. This is because it will have no links pointing to it and will probably have minimal content on the site. 

These elements are just some indicators or ways in which Google judges your website’s authority and decides which top results should be on its SERPs.

Recommendation

If you are improving the SEO of a website that’s been around for a few years, then the SEO will likely take effect faster. This is based on my experience, but you may get different results with your website.

Unlike paying for PPC, organic search traffic is free. 

If Ahrefs used PPC to pay for its organic traffic, Ahrefs’ Site Explorer estimates it would cost an eye-watering $2.3 million per month or $27.6 million per year.

Monetary value of Ahrefs' organic traffic, via Ahrefs' Site Explorer

You can see from this example why improving SEO and increasing your organic traffic can be a highly valuable investment for your business.

Getting started with SEO doesn’t always mean you have to spend a lot, either. If you are willing to learn about SEO basics, you can do a lot of the work yourself to bring costs down.

Recommendation

When it comes to tools, you don’t have to spend a lot when you are starting out. You can use Ahrefs Webmaster Tools for auditing your site and our free SEO tools to optimize it further.

It’s always on—24/7

Unlike paid marketing, SEO is always on. It continues to work for you while you are asleep. 

You’ll get more overall value from SEO than other marketing channels. It doesn’t cost any extra to have it running all the time.

Another benefit is that once your website is established with good rankings, SEO will maintain them over time and drive consistent traffic to your website. 

The only thing that can stop this is if there is a serious technical issue with your site or if you have fallen foul of Google’s search guidelines.

Reduce your dependency on PPC 

It’s easy for a business to rely on pay-per-click (PPC) marketing, but it can be expensive to maintain this marketing strategy. 

SEO can help you change this.

Once you have some important keywords ranking #1 on Google, you can consider turning off some of your PPC marketing, which could result in significant savings for your business.

5. It improves user experience

Many people have high expectations for websites these days. They expect them to be clear, intuitive, and lightning-fast. 

When websites don’t work as people expect, they get frustrated. And if they have a bad experience, this can create a negative perception of the brand. 

To do well in SEO, you’ll need to provide your visitors with the best possible user experience. 

But how can you optimize for user experience in SEO?

SEOs typically divide user experience issues into three categories: 

  • Site speed
  • Core Web Vitals
  • On-site optimization

Let’s take a closer look.

Site speed

Site speed is one of the most critical factors for your visitors. If your website is slow to use, visitors will likely leave your site and probably not return.

A few years ago, Google tested 900,000 websites worldwide. It reported that 53% of people would leave a website if it took three seconds or more to load.

To test your site speed, you can use a tool like webpagetest.org. Let’s take a look at Ahrefs’ speed metrics using this tool.

Ahrefs' speed performance

We can see above that the speed index is under three seconds for Ahrefs. If your website loads in more than three seconds, then you may want to consider speeding up your website. 

Core Web Vitals

Core Web Vitals are quality signals that Google introduced to quantify the user experience of your website.

They are:

  • Largest Contentful Paint (LCP) – For load performance.
  • First Input Delay (FID) – For interactivity.
  • Cumulative Layout Shift (CLS) – For visual stability.

Here’s what Google classifies as good and bad scores for these metrics.

Metric   Good      Needs improvement   Poor
LCP      <=2.5s    <=4s                >4s
FID      <=100ms   <=300ms             >300ms
CLS      <=0.1     <=0.25              >0.25

Defining metrics for user experience is a benefit, as website owners can know exactly how their websites perform against Google’s expectations. 
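If you want to check these metrics outside of an SEO tool, one option is Google’s PageSpeed Insights API, which returns real-user Core Web Vitals field data where Google has enough traffic data for the URL. The Python sketch below is a minimal example; the target URL is a placeholder, and the exact response fields are worth confirming against the API documentation:

  import json
  import urllib.parse
  import urllib.request

  # Query the PageSpeed Insights API (v5) for a page. An API key is optional
  # for light, occasional use; the target URL here is a placeholder.
  target = "https://www.example.com/"
  endpoint = (
      "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
      + urllib.parse.urlencode({"url": target})
  )

  with urllib.request.urlopen(endpoint) as response:
      data = json.load(response)

  # "loadingExperience" holds field data gathered from real Chrome users,
  # including the Core Web Vitals metrics when they are available.
  metrics = data.get("loadingExperience", {}).get("metrics", {})
  for name, values in metrics.items():
      print(name, values.get("percentile"), values.get("category"))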

Monitoring Core Web Vitals and site performance may sound technical, but you can keep an eye on them using Ahrefs’ Site Audit.

For example, here’s a screenshot from the Performance dashboard highlighting two issues with CLS and LCP. 

Pages with poor CLS and poor LCP, via Ahrefs' Site Audit

As you can see from the above, Site Audit automatically identifies all low-performing pages for you.

On-page optimization

On-page optimization is another area of SEO that can help benefit your website. With on-page optimization, SEOs critically examine your website and audit it for any issues that may impact the user experience.

A good example of on-page optimization is adding subheadings, or heading tags, to your articles. Adding subheadings makes your content easier to read by establishing a visual hierarchy.

Subheadings improve readability by creating visual hierarchy

On-page optimization is less quantifiable than Core Web Vitals, but spending time on it will pay dividends for your website in the long run.

Ahrefs’ Site Audit can monitor headings, image alt text, internal linking, and other on-page optimization factors.

Here’s an example of a scheduled report you can get for heading optimization opportunities.

Heading optimization opportunities, via Ahrefs' Site Audit

You can use this report to identify many improvement opportunities for your website.

6. It puts your store online

If your business has a physical store, you can add it to a Google Business Profile for free.

Google My Business local listing example for Google San Francisco

Adding your business to Google’s business listings means you will appear on Google Maps when someone searches for your business or related keywords.

It’s a great way to highlight your business locally. For some businesses focused on local trade, this listing can be one of their most crucial organic search assets.

It also gives you a helpful way to communicate your business hours and opening times to your customers—something they will appreciate.

By having a solid organic presence and utilizing tools such as Google Business Profile, you can be sure that your online business will pick up sales even when you can’t open your physical store. 

So how do you set up your own Google Business Profile listing?

Setting up a Google Business Profile is a straightforward, three-step process.

  1. Claim your business profile 
  2. Add your business hours and details 
  3. Manage your profile, share any business updates, and respond to customer reviews

Once you have done this, you can monitor your business profile’s performance with the built-in analytics.

GMB analytics, via Google My Business

Using a Google Business Profile allows you to discover how people are searching for your website and helps you to understand how your business connects with customers online.

Building trust with customers is just as important online as it is offline. You wouldn’t buy something from a physical shop if the shop was run-down and the service was poor. 

The same applies to websites.

Your website should be fully operational and perform well in search engines. It should be secure and provide a great user experience to your customers. 

Having good SEO on your website shows you’re an authority in your industry. It shows you have the information and expertise customers are looking for.

It also means searchers will click on your website in the results because it is more prominent than your competitors—you will get the customers, and they won’t. 

The bottom line here is that by improving your website’s SEO, more visitors will trust your brand, which will drive more traffic and sales.

Learn more

Now you know the key benefits of SEO, you may want to start learning about it in more detail.

I’ve collected some helpful resources below to help you get started, so you can learn about SEO and start to reap the benefits:

Got more questions? Ping me on Twitter. 🙂



How To Get More Traffic By Fixing Keyword Cannibalizations

This post was sponsored by DinoRANK. The opinions expressed in this article are the sponsor’s own.

Google is a great source of qualified and recurring traffic for your business – that’s a fact.

Many people say that the key to SEO success is to establish yourself as an authoritative source for all the keywords in your industry, even niche keywords.

Unfortunately, your competition is doing the same thing. In some cases, you may even be competing with yourself.

Everyone is creating the same content to rank high on Google. So, you need to set yourself apart.

Your competition may be using the same niche keywords as you, but are they optimizing their domain’s SEO by cleaning up cannibalized content and keyword cannibalization?

They may not be; so, this is a great way to propel your website to the top of Google.

What Is Keyword Cannibalization In SEO?

As you may know, sometimes two or more URLs on the same domain may rank for the same keyword or group of keywords.

When this happens, Google does not know which piece of content to show on search engine results pages (SERPs).

When these URLs compete for the same search terms, it is called SEO cannibalization.

How Cleaning Up Cannibalization Can Instantly Boost Your Rankings

When you and your competition are ranking for the same keywords, yet your position on SERPs is lower, you may have a cannibalization issue.

If your content is cannibalizing itself, you not only have to fight your competitors for a top position – you also have to fight your own pages.

There’s enough competition out there without having it at home, right?

Cannibalization is detrimental because your own URLs compete with each other. Too many cannibalized URLs can even harm your domain authority, especially if you haven’t told Google that there is a clear difference between two similar pieces of content.

As you can see, cannibalizations have a negative influence on your domain’s SEO.

To solve the cannibalization problem, we must first know how to find these cannibalizations.

How To Find Cannibalization On Your Website

There are different ways to find cannibalization on your domain:

  • The Manual Way
  • The Easy Way

How To Find Cannibalized Keywords By Hand

There are two ways to locate content and keyword cannibalization without using a tool:

  1. Use a rank tracker or Google Search Console to see which positions your pages rank in and which keywords they rank for; then, find queries where more than one of your URLs appears (a small script like the sketch after this list can help).
  2. Review your site’s content manually to see if several pages address the same topic or target the same keywords. Then, search Google for the keyword you suspect you are cannibalizing and check whether you really are; this can also help you prevent it from happening.
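For the Search Console route, a small script can surface candidates quickly. The Python sketch below assumes you have an export containing both “query” and “page” columns (the standard UI export splits these, so combined data typically comes from the Search Console API or a reporting connector); the filename and column names are assumptions to adjust:

  import csv
  from collections import defaultdict

  pages_per_query = defaultdict(set)

  # Assumed export with "query" and "page" columns - adjust to your own data.
  with open("gsc_performance.csv", newline="", encoding="utf-8") as f:
      for row in csv.DictReader(f):
          pages_per_query[row["query"].strip().lower()].add(row["page"])

  # Queries where more than one URL ranks are cannibalization candidates.
  for query, pages in sorted(pages_per_query.items()):
      if len(pages) > 1:
          print(query)
          for page in sorted(pages):
              print("   ", page)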

How To Locate Keyword Cannibalization With A Single Click

Using DinoRANK, you can simply click one button and instantly see all your cannibalized content.

All you’ll need to do is create a project in the tool, and sync it with your Google Analytics and Google Search Console account. DinoRANK does the rest.

Screenshot from DinoRANK.com, January 2023

See what content is cannibalized on your website now →

Now, once you have located all the cannibalizations that affect your website, what decision should you make? 

What Is The Best Way To Fix Cannibalization?

Once you have found the cannibalizations on your website, you can take these steps to solve the problem:

  • Join or merge the two URLs into one.
  • Implement a 301 redirect.
  • Shift the focus of one of the two pieces of content and optimize or de-optimize it for the keyword it cannibalizes.
  • Place a canonical tag on one of the two URLs.
  • Remove one of the two pieces of content if you consider it thin or duplicate content.

How Do I Pick The Best Method?

What analysis should you do when facing cannibalization to perform the most optimal action?

Let’s say that you have two URLs on the same domain that rank for the same keyword: one URL is in position 5, and the other is in position 7 on the SERPs.

In this case, you should:

  1. Check the content of both pages to see if they are addressing the same topic or include the same keywords. If so, they are probably competing with each other and causing cannibalization.
  2. Review the page authority of each page, as measured by the number of inbound links, quality of links, site structure, etc. The page with higher authority will likely perform better in search results.
  3. Check the traffic received by each page to see which one is receiving more visits, and analyze which URL is responding better to the search intention of that or those keywords.
  4. Check the conversion rates of each page to see which is generating more sales or conversion actions.

Once the analysis is done, you will have enough elements to recognize what action you should take in each cannibalization.

Still not sure?

Don’t worry. DinoRANK will show you recommendations on how to proceed, depending on the case.

Screenshot from DinoRANK.com, January 2023

Usually, content is consolidated into one of the two cannibalized URLs (the one with higher authority or higher traffic), and the discarded URL should become a 301 redirect.

The impact of this repair is usually positive in a very high number of cases. 

However, if both pieces of content fulfill their function (for example, one URL belongs to a product card and the other to a blog post), a better option would be to optimize or de-optimize the content as appropriate, or to implement a canonical tag on one of the URLs to indicate to search engines which is the main page.

In any case, it is important to continuously monitor the performance of the pages and make adjustments if necessary to avoid cannibalization in the future; with DinoRANK you will have this under control.

How To Optimize Or De-Optimize Content That Is Cannibalizing

Search engines use algorithms to determine the relevance of a web page for a given search query, and one of the ways they do this is by identifying the keywords that appear most frequently on a page, relative to the keywords that appear on other similar pages.
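That weighting is essentially what TF*IDF captures: how often a term appears in one document (term frequency), discounted by how many documents in the corpus also contain it (inverse document frequency). Here is a minimal Python sketch with toy strings standing in for your pages:

  import math

  # Toy corpus standing in for the pages you are comparing.
  docs = [
      "red running shoes for trail running",
      "buy cheap running shoes online",
      "trail shoes and hiking boots guide",
  ]
  tokenized = [doc.split() for doc in docs]

  def tf_idf(term, doc_tokens, corpus):
      tf = doc_tokens.count(term) / len(doc_tokens)
      df = sum(1 for tokens in corpus if term in tokens)
      idf = math.log(len(corpus) / df) if df else 0.0
      return tf * idf

  # Score every unique term in the first document against the whole corpus.
  scores = {term: tf_idf(term, tokenized[0], tokenized) for term in set(tokenized[0])}
  for term, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
      print(f"{term:10} {score:.3f}")

Terms that appear often on the page but rarely elsewhere in the corpus score highest, which is the kind of signal a TF*IDF report is built on.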

Before optimizing or de-optimizing one of the two contents, it’s important to know all the keywords for each URL being cannibalized.

Then, you’ll need to analyze their semantic content with TF*IDF analysis.

So, analyzing a URL whose content you want to optimize/promote for a specific keyword with TF*IDF will allow you to understand which keywords are relevant to that content.

Screenshot from DinoRANK.com, January 2023

Expanding and semantically optimizing content that is already ranking highly, whether it is cannibalizing or not, is one of the best ways to gain more relevance for those terms that only received impressions or a few clicks. This is because you are now providing a greater semantic richness to that content.

With DinoRANK, you can use TF*IDF analysis on already published URLs to see how to optimize them.

You can also use this feature when you want to create new content based on a keyword or a group of keywords you want to rank for.

Screenshot from DinoRANK.com, January 2023

DinoRANK arranges the information visually, separating the graph into three filter layers: one keyword, two keywords, and three keywords.

Screenshot from DinoRANK.com, January 2023

In addition to the recommendations, you will also be able to quickly see the header structure used by your most direct competitors in Google’s Top 10.

In a very short time, you’ll discover opportunities and content ideas with an optimal heading structure, including the related semantic keywords needed to rank.

If, in addition, this semantic content expansion is complemented by internal links to the optimized URL, using anchor text containing the primary keyword or exact semantic keywords, you will strengthen the authority and topicality of the URL you want to promote.

With internal links, you provide greater semantic context and strengthen the authority of the page you have already optimized, thanks to the TF*IDF semantic prominence analysis.

If you want to use these features to work the SEO of your projects in a simple but effective way, you can try DinoRANK.

All the SEO your website needs can be found in DinoRANK.



Microsoft Announces ChatGPT Capabilities Coming To Bing

Microsoft announced today that it is bringing cutting-edge AI capabilities to its Bing search engine, with the addition of a new ChatGPT-like feature.

Microsoft revealed its plans for integrating ChatGPT at a private event held at its Redmond headquarters today, which centered around its partnership with OpenAI.

Unlike recent virtual events, this particular press conference was held in person and not broadcast online.

During the event, Microsoft CEO Satya Nadella highlighted the significance of this new feature and how it will revolutionize the way people interact with search engines.

“I think this technology is going to reshape pretty much every software category,” says Nadella.

Nadella proclaimed, “The race starts today,” and Microsoft is going to “move and move fast.”

The event attendees were given a sneak peek at the latest search experience, which Microsoft refers to as “your AI-powered copilot for the web.”

This new experience combines the all-new Bing search engine and Edge web browser, which are designed to complement each other.

Nadella explained that the new Bing would provide direct answers to questions and encourage users to be more creative.

He also stated that the current search experience is not working as efficiently as it should be, as 40% of the time, people click on search links and then immediately click back.

This clearly indicates that the search experience needs to be updated and improved. Nadella claims that the search engine user experience hasn’t changed in 20 years, and it’s time for Microsoft to adapt.

Introducing The New Bing

The new Bing is powered by a next-generation language model from OpenAI, which has been specifically customized for search purposes. It’s even more powerful than the ChatGPT model.

Microsoft has implemented a new way of working with OpenAI called the “Prometheus model,” which enhances the relevancy of answers, annotates them, keeps them up to date, and more.

The search index has also been improved by applying the AI model to the core search algorithm, which Nadella calls the largest jump in relevance ever.

It runs on a new user experience with an expanded search box that accepts up to 1,000 characters. Examples shared during the event look exactly like recent leaks.

The new Bing includes a chatbot that behaves similarly to ChatGPT, allowing users to interact with Bing in a natural language.

Bing’s new ChatGPT-like feature will take it a step further by allowing users to have an actual conversation with the search engine, with the ability to follow up on previous questions and provide more context for their search.

The new Bing is now available for a limited preview on desktop, and anyone can try it out by visiting Bing.com and performing sample searches.

You can also sign up to be notified when it becomes more widely available.

The preview will be expanded to millions of users in the near future, and a mobile version will be available soon.

The New Edge Browser

The chat interface Microsoft demonstrated in Bing is available as a sidebar feature in Edge, allowing users to access it without navigating to the Bing website. The interface can run alongside any webpage and interact with it.

During a demonstration, the AI assistant in Edge could summarize a 15-page PDF with one click and even translate a code snippet from Stack Overflow into another programming language.

Another benefit of the Edge browser’s “AI co-pilot” is having it complete tasks for you, such as filling out forms and writing emails.

In Summary

Microsoft has made a substantial leap in search engine technology by integrating a ChatGPT-like feature in its Bing search engine.

The new Bing is powered by a next-generation language model from OpenAI, which takes key learnings and advancements from ChatGPT and GPT-3.5.

Bing with the AI co-pilot is now available for a limited preview on desktop, and a mobile version will be available soon.

Additionally, the chat interface will be available as a sidebar feature in the new Edge browser, which has the ability to summarize information, translate code, and even complete tasks.


Source: Microsoft

Featured Image: Poetra.RH/Shutterstock


