When To Canonicalize, Noindex, Or Do Nothing With Similar Content

Picture your content as you would yourself. Are you carrying some baggage you could shed? Carrying something you want to keep, but would like to repurpose or see differently?

Website content is no different. Most of us have sat with a team debating which content to cut from a website, only to realize it is still needed, whether for a specific prospect, an internal team, or another audience.

While we look for ways to slim our websites for content management purposes, we also want to do the same to appease crawling search engine bots.

We want their (hopefully daily) visits to our websites to be fast and succinct.

Those visits should show them who we are, what we are about, and, if we must keep content that can't be removed, how we are labeling it for them.


Luckily, search engine crawlers want to understand our content just as much as we want them to. We have two tools for this: canonicalizing content and noindexing content.

Beware, however: doing this incorrectly could leave important website content misunderstood by search engine crawlers, or not read at all.

Canonicalize?


Canonical tags provide a great way of instructing search engines: “Yes, we know this content is not that unique or valuable, but we must have it.”

It can also be a great way to point value to content originating from another domain or vice versa.

Either way, this is your chance to show crawling bots how you perceive your website content.

To use it, place the tag within the head section of the page's source code.
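As a minimal sketch (the URLs are hypothetical), a page whose content duplicates another would carry a tag like this:

```html
<head>
  <!-- Declares https://www.example.com/preferred-page/ as the
       representative version of this page's content -->
  <link rel="canonical" href="https://www.example.com/preferred-page/" />
</head>
```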

The canonical tag is a great way to deal with content you know is duplicated or similar but that must remain on the site, whether to serve user needs or because the site maintenance team can't remove it quickly.


If you think this tag is an ideal fit for your website, review site sections that live at separate URLs but share similar content (e.g., copy, images, headings, title elements, etc.).

Website auditing tools such as Screaming Frog and the Semrush Site Audit section are a quick way to see content similarities.

If you think there might be some other similar content culprits out there, you can take a deeper look with tools such as Similar Page Checker and Siteliner, which will review your site for similar content.

Now that you have a good feel for cases of similarity, you need to understand if this lack of uniqueness is worthy of canonicalization. Here are a few examples and solutions:

Example 1: Your website exists at both HTTP and HTTPS versions of site pages, or your website exists with both www. and non-www. page versions.

Solution: Canonically tag to the page version with the most links (external links, internal links, etc.) until you can redirect all duplicated pages one-to-one.
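A sketch of that solution, assuming hypothetical URLs and that the HTTPS www version is your strongest page: every duplicate version carries the same tag until the redirects are in place.

```html
<!-- Placed on http://example.com/page/, http://www.example.com/page/,
     and https://example.com/page/ alike -->
<link rel="canonical" href="https://www.example.com/page/" />
```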


Example 2: You sell highly similar products whose pages have no unique copy, only slight variations in name, image, price, etc. Should you canonically point the specific product pages to the parent product page?

Solution: Here, my advice is to do nothing. These pages are unique enough to be indexed. Their unique names differentiate them, and that could help you in long-tail keyword searches.

Example 3: You sell t-shirts but have a page for every color and every shirt.

Solution: Canonical tag the color pages to reference the parent shirt page. Each page isn’t a particular product, just a very similar variation.
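A sketch with hypothetical URLs: each color variation page references the parent shirt page as the representative version.

```html
<!-- On https://www.example.com/t-shirts/classic-tee-blue/ -->
<link rel="canonical" href="https://www.example.com/t-shirts/classic-tee/" />

<!-- On https://www.example.com/t-shirts/classic-tee-red/ -->
<link rel="canonical" href="https://www.example.com/t-shirts/classic-tee/" />
```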

Use Case: Canonical Tagging Content That’s Unique Enough To Succeed

As in the example above, sometimes slightly similar content can still be appropriate for indexation.

What if the site sold shirts and had child pages for different shirt types, such as long sleeves and tank tops? Each type is a different product, not just a variation. As mentioned previously, this can prove successful for long-tail web searches.


Here’s a great example: An automotive sales site that features pages for car makes, associated models, and variations of those models (2Dr, 4Dr, V8, V6, deluxe edition, etc.). The initial thought with this site is that all variations are simply near duplications of the model pages.

You may think, why would we want to annoy search engines with this near duplicative content when we can canonicalize these pages to point to the model page as the representative page?

We moved in this direction, but anxiety over whether these pages could succeed led us to revise the tags so that each variation page canonically referenced itself rather than the parent model page.

Suppose you canonically tag to the parent model page. Even if you show the content importance/hierarchy to search engines, they may still rank the canonicalized page if the search is relatively specific.

So, what did we see?

We found that organic traffic increased to both child and parent pages. In my opinion, when you give credit back to the child pages, the parent page also appears to gain authority, since it now has many child pages receiving their own "credit."


Since September, when we revised the canonical tags, monthly organic traffic to this site area has grown 5x, with 754 pages driving organic traffic compared to the 154 recognized in the previous year.

Don’t Make These Canonicalization Mistakes

  • Setting canonical tags that pass through a redirect before resolving to the final page does a great disservice. It slows search engines, forcing them to jump between URLs while trying to understand content importance.
  • Similarly, if you point canonical tags at URL targets that return 404 errors, you essentially point search engines into a wall.
  • Canonically tagging to the wrong page version (i.e., www/non-www, HTTP/HTTPS). As discussed above, website crawling tools may reveal unintentional duplication. Don't mistakenly point page importance at a weaker page version (see the sketch below).
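To make those pitfalls concrete, here is a before-and-after sketch with hypothetical URLs:

```html
<!-- Mistake: the target 301-redirects to the HTTPS version,
     forcing crawlers to jump URLs -->
<link rel="canonical" href="http://example.com/page/" />

<!-- Mistake: the target returns a 404 error page -->
<link rel="canonical" href="https://www.example.com/deleted-page/" />

<!-- Better: point directly at the final, resolving, strongest version -->
<link rel="canonical" href="https://www.example.com/page/" />
```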

Noindex?

You can also utilize the meta robots noindex tag to exclude similar or duplicate content entirely.

Placing the noindex tag in the head section of your source code will stop search engines from indexing these pages.
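As a sketch, the tag looks like this in the head of a hypothetical page:

```html
<head>
  <!-- Asks search engines not to include this page in their index -->
  <meta name="robots" content="noindex" />
</head>
```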

Beware: While the meta robots noindex tag is a quick way to remove duplicate content from ranking consideration, it can be dangerous to your organic traffic if you fail to use it appropriately.

This tag has historically been used to pare down large sites so that only search-critical pages are presented, making crawl spend as efficient as possible.

However, you want search engines to see all relevant site content to understand site taxonomy and the hierarchy of pages.


If this tag doesn't scare you off, though, you can use it to steer search engines toward crawling and indexing only what you deem fresh, unique content.

Here are a few ways noindexing might be discussed as a solution:

Example 1: To aid your customers, you can provide documentation from the manufacturer, even though they already feature this on their website.

Solution: Continue providing documentation to aid your on-site customers but noindex these pages.

That content is already owned and indexed by the manufacturer, which likely has far more domain authority than you. In other words, you are not likely to be the ranking website for this content.

Example 2: You offer several different but similar products. The only differentiation is color, size, count, etc. We don’t want to waste crawl spend.


Solution: Solve this with canonical tags instead. A long-tail search could still drive qualified traffic because each page would remain indexed and able to rank.

Example 3: You have a lot of old products that you don’t sell much of anymore and are no longer a primary focus.

Solution: This scenario is usually surfaced by a content or sales audit. If the products do little for the company, consider retiring them.

Consider either canonically pointing these pages to relevant category pages or redirecting them there. These pages have age and trust, may have links, and may hold rankings.
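A sketch of the canonical option, with hypothetical URLs (the alternative would be a server-side 301 redirect to the same category URL):

```html
<!-- On a retired product page, deferring to its category page -->
<link rel="canonical" href="https://www.example.com/category/widgets/" />
```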

Use Case: Don’t Sacrifice Rankings/Traffic For Crawl Spend Considerations

With our own websites, we know we want to put our best foot forward for search engines.

We don’t want to waste their time when crawling, and we don’t want to create a perception that most of our content lacks uniqueness.


In one example, meta robots noindex tags were placed on child product variation pages during a domain transition/relaunch to reduce the bloat of somewhat similar product page content presented to search engines.

We tracked the total number of ranking keywords that transitioned from one domain to the other.

When the meta robots noindex tags were removed, the overall amount of ranking terms grew by 50%.


Don’t Make These Meta Robots Noindex Mistakes

  • Don’t place a meta robots noindex tag on a page with inbound link value. Instead, permanently redirect the page in question to another relevant site page; placing the tag would throw away the valuable link equity you have.
  • If you’re noindexing a page that is included in the main, footer, or supporting navigation, make sure the directive is “noindex, follow” rather than “noindex, nofollow,” so search engines crawling the site can still pass through the links on the noindexed page (see the sketch below).
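A sketch of the safer directive for navigation pages:

```html
<!-- Keeps the page out of the index while letting crawlers
     follow and pass through the links it contains -->
<meta name="robots" content="noindex, follow" />
```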

Conclusion

Sometimes it is hard to part ways with website content.

The canonical and meta robots noindex tags are a great way to preserve website functionality for all users while also instructing search engines.

In the end, be careful how you tag! It’s easy to lose search presence if you do not fully understand the tagging process.



Studio By WordPress & Other Free Tools

Studio by WordPress lets you create WordPress sites on your desktop, plus other similar tools.

WordPress announced the rollout of Studio by WordPress, a new local development tool that makes it easy for publishers to develop and update websites locally on their desktop or laptop, and that is also useful for learning how to use WordPress. Below is a look at Studio and other platforms that make it easy to develop WordPress websites right on your desktop.

Local Development Environments

Local environments are like web hosting spaces on the desktop that can be used to set up a WordPress site. They’re a fantastic way to try out new WordPress themes and plugins and learn how they work without messing up a live website or publishing something that might get accidentally indexed by Google. They’re also useful for checking whether an updated plugin conflicts with other plugins, letting you test updates offline before committing them to a live website.

Studio joins a list of popular local development environments that are specific to WordPress, as well as more advanced platforms that can be used for WordPress on the desktop and offer greater flexibility and options, but may be harder for non-developers to use.

Desktop WordPress Development Environments

There are currently a few local environments that are specific to WordPress. The advantage of a dedicated WordPress environment is that it makes it easy to start creating with WordPress for those who only need to work with WordPress sites and nothing more complicated than that.

Studio By WordPress.com

Studio is an open source project that allows developers and publishers to set up a WordPress site on their desktop in order to design, test or learn how to use WordPress.


According to the WordPress announcement:

“Say goodbye to manual tool configuration, slow site setup, and clunky local development workflows, and say hello to Studio by WordPress.com, our new, free, open source local WordPress development environment.

Once you have a local site running, you can access WP Admin, the Site Editor, global styles, and patterns, all with just one click—and without needing to remember and enter a username or password.”

The goal of Studio is to be a simple and fast way to create WordPress sites on the desktop. It’s currently available for Mac, and a Windows version is coming soon.

Download the Mac version here.

Other Popular WordPress Local Development Environments

DevKinsta

DevKinsta, developed by managed web host Kinsta, is another development environment specifically dedicated to quickly designing and testing WordPress sites on the desktop. It’s a popular choice that many developers endorse.


It’s a great tool for publishers, SEOs, and developers who want a tool that does one thing: create WordPress sites. That makes DevKinsta a solid consideration for anyone who is serious about developing WordPress sites or just wants to learn WordPress, especially the latest Gutenberg Blocks environment.

Download DevKinsta for free here.

Local WP

Local WP is a popular desktop development environment specifically made for WordPress users by WP Engine, a managed WordPress hosting provider.

Useful Features of Local WP

Local WP has multiple features that make it useful beyond simply developing and testing WordPress websites.

  • Image Optimizer
    It features a free image optimizer add-on that optimizes images on your desktop, which should be popular with those who have no other way to optimize images.
  • Upload Backups
    Another handy feature is the ability to upload backups to Dropbox and Google Drive.
  • Link Checker
    The tool has a built-in link checker that scans your local version of the website to identify broken links. This is a great way to check a site offline without using server resources and potentially slowing down your live site.
  • Import & Export Sites
    Local WP can import and export WordPress website files, so you can work on a copy of your current site on your desktop, test new plugins or themes, and upload the files back to your website when ready.

Advanced Local Development Environments

There are other local development environments that are not specific for WordPress but are nonetheless useful for designing and testing WordPress sites on the desktop. These tools are more advanced and are popular with developers who appreciate the freedom and options available in these platforms.

DDEV with Docker

An open source app that makes it easy to use Docker containerization to quickly install a content management system and start working, without having to deal with the Docker learning curve.

Download DDEV With Docker here.


Laragon

Laragon is a free local development environment that was recommended to me by someone who is an advanced coder because they said that it’s easy to use and fairly intuitive. They were right. I’ve used it and have had good experiences with it. It’s not a WordPress-specific tool so that must be kept in mind.

Laragon describes itself as an easy-to-use alternative to XAMPP and WAMP.

Download Laragon here.

MAMP

MAMP is a local development platform that’s popular with advanced coders and is available for Mac and Windows.

David McCan (Facebook profile), a WordPress trainer who writes about advanced WordPress topics on WebTNG, shared his experience with MAMP.

“MAMP is pretty easy to setup and it provides a full range of features. I currently have 51 local sites which are development versions of my production sites, that I use for testing plugins, and periodically use for new beta versions of WordPress core. It is easy to clone sites also. I haven’t noticed any system slowdown or lag.”

WAMP And XAMPP

WAMP is a Windows only development environment that’s popular with developers and WordPress theme and plugin publishers.


XAMPP is a PHP development platform that can be used on Linux, Mac, and Windows desktops.

Download WAMP here.

Download XAMPP here.

So Many Local Development Platforms

Studio by WordPress.com is an exciting new local development platform, and I’m looking forward to trying it out. But it’s not the only one, so it may be useful to try different solutions to see which works best for you.

Read more about Studio by WordPress:

Meet Studio by WordPress.com—a fast, free way to develop locally with WordPress


Big Update To Google’s Ranking Drop Documentation

Google updates documentation for diagnosing ranking drops

Google updated its guidance with five changes on how to debug ranking drops. The new version contains over 400 more words addressing small and large ranking drops. There’s room to quibble about some of the changes, but overall the revised version is a step up from what it replaced.

Change #1: Downplays Fixing Traffic Drops

The opening sentence was changed so that it offers less hope of bouncing back from an algorithmic traffic drop. Google also joined two sentences into one in the revised version of the documentation.

The documentation previously said that most traffic drops can be reversed and that identifying the reasons for a drop isn’t straightforward. The claim that most drops can be reversed was completely removed.

Here are the original two sentences:

“A drop in organic Search traffic can happen for several reasons, and most of them can be reversed. It may not be straightforward to understand what exactly happened to your site”

Now no hope is offered that “most of them can be reversed,” and there is more emphasis on the fact that understanding what happened is not straightforward.


This is the new guidance:

“A drop in organic Search traffic can happen for several reasons, and it may not be straightforward to understand what exactly happened to your site.”

Change #2: Security Or Spam Issues

Google updated the traffic graph illustrations so that they precisely align with the causes for each kind of traffic decline.

The previous version of the graph was labeled:

“Site-level technical issue (Manual Action, strong algorithmic changes)”

The problem with the previous label is that manual actions and strong algorithmic changes are not technical issues; the new version fixes that.

The updated version now reads:

“Large drop from an algorithmic update, site-wide security or spam issue”

Change #3: Technical Issues

There’s one more change to a graph label, also to make it more accurate.


This is how the previous graph was labeled:

“Page-level technical issue (algorithmic changes, market disruption)”

The updated graph is now labeled:

“Technical issue across your site, changing interests”

Now the graph and label are more specific about a sitewide change, and “changing interests” is more general, covering a wider range of changes than market disruption. Changing interests includes market disruption (where a new product makes a previous one obsolete or less desirable), but it also includes products that go out of style or lose their trendiness.


Change #4: Google Adds New Guidance For Algorithmic Changes

The biggest change by far is a brand-new section for algorithmic changes, which replaces two smaller sections: one about policy violations and manual actions, and a second about algorithm changes.

The old version of this one section had 108 words. The updated version contains 443 words.

A section that’s particularly helpful is where the guidance splits algorithmic update damage into two categories.


Two New Categories:

  • Small drop in position? For example, dropping from position 2 to 4.
  • Large drop in position? For example, dropping from position 4 to 29.

The two new categories are perfect and align with what I’ve seen in the search results for sites that have lost rankings. The reasons for moving up and down within the top ten are different from the reasons a site drops completely out of the top ten.

I don’t agree with the guidance for large drops. Google recommends reviewing your site, which is good advice for some sites that have lost rankings. But in other cases there’s nothing wrong with the site, and this is where less experienced SEOs tend to get stuck: recommendations for improving EEAT, adding author bios, or filing link disavows won’t solve what’s going on, because the problem lies elsewhere.

Here is the new guidance for debugging search position drops:

Algorithmic update
Google is always improving how it assesses content and updating its search ranking and serving algorithms accordingly; core updates and other smaller updates may change how some pages perform in Google Search results. We post about notable improvements to our systems on our list of ranking updates page; check it to see if there’s anything that’s applicable to your site.

If you suspect a drop in traffic is due to an algorithmic update, it’s important to understand that there might not be anything fundamentally wrong with your content. To determine whether you need to make a change, review your top pages in Search Console and assess how they were ranking:

Small drop in position? For example, dropping from position 2 to 4.
Large drop in position? For example, dropping from position 4 to 29.

Keep in mind that positions aren’t static or fixed in place. Google’s search results are dynamic in nature because the open web itself is constantly changing with new and updated content. This constant change can cause both gains and drops in organic Search traffic.

Small drop in position
A small drop in position is when there’s a small shift in position in the top results (for example, dropping from position 2 to 4 for a search query). In Search Console, you might see a noticeable drop in traffic without a big change in impressions.


Small fluctuations in position can happen at any time (including moving back up in position, without you needing to do anything). In fact, we recommend avoiding making radical changes if your page is already performing well.

Large drop in position
A large drop in position is when you see a notable drop out of the top results for a wide range of terms (for example, dropping from the top 10 results to position 29).

In cases like this, self-assess your whole website overall (not just individual pages) to make sure it’s helpful, reliable and people-first. If you’ve made changes to your site, it may take time to see an effect: some changes can take effect in a few days, while others could take several months. For example, it may take months before our systems determine that a site is now producing helpful content in the long term. In general, you’ll likely want to wait a few weeks to analyze your site in Search Console again to see if your efforts had a beneficial effect on ranking position.

Keep in mind that there’s no guarantee that changes you make to your website will result in noticeable impact in search results. If there’s more deserving content, it will continue to rank well with our systems.”

Change #5: Trivial Changes

The rest of the changes are relatively trivial but nonetheless make the documentation more precise.

For example, one of the headings was changed from this:


You recently moved your site

To this new heading:

Site moves and migrations

Google’s Updated Ranking Drops Documentation

Google’s updated documentation is well thought out, but I think the recommendations for large algorithmic drops are helpful in some cases and not in others. I have 25 years of SEO experience and have lived through every single Google algorithm update. With certain updates, the problem is not solved by trying to fix things, and Google’s guidance used to acknowledge that sometimes there’s nothing to fix. The documentation is better, but in my opinion it can be improved even further.

Read the new documentation here:

Debugging drops in Google Search traffic

Review the previous documentation:

Internet Archive Wayback Machine: Debugging drops in Google Search traffic


Google March 2024 Core Update Officially Completed A Week Ago


Google has officially completed its March 2024 Core Update, ending over a month of ranking volatility across the web.

However, Google didn’t confirm the rollout’s conclusion on its data anomaly page until April 26—a whole week after the update was completed on April 19.

Many in the SEO community had been speculating for days about whether the turbulent update had wrapped up.

The delayed transparency exemplifies Google’s communication issues with publishers and the need for clarity during core updates.

Google March 2024 Core Update Timeline & Status

First announced on March 5, the core algorithm update was completed on April 19, taking 45 days in total.


Unlike more routine core refreshes, Google warned this one was more complex.

Google’s documentation reads:

“As this is a complex update, the rollout may take up to a month. It’s likely there will be more fluctuations in rankings than with a regular core update, as different systems get fully updated and reinforce each other.”

The aftershocks were tangible, with some websites reporting losses of over 60% of their organic search traffic, according to data from industry observers.

The ripple effects also led to the deindexing of hundreds of sites that were allegedly violating Google’s guidelines.

Addressing Manipulation Attempts

In its official guidance, Google highlighted the criteria it looks for when targeting link spam and manipulation attempts:

  • Creating “low-value content” purely to garner manipulative links and inflate rankings.
  • Links intended to boost sites’ rankings artificially, including manipulative outgoing links.
  • The “repurposing” of expired domains with radically different content to game search visibility.

The updated guidelines warn:

“Any links that are intended to manipulate rankings in Google Search results may be considered link spam. This includes any behavior that manipulates links to your site or outgoing links from your site.”

John Mueller, a Search Advocate at Google, responded to the turbulence by advising publishers not to make rash changes while the core update was ongoing.


However, he suggested sites could proactively fix issues like unnatural paid links.

Mueller stated on Reddit:

“If you have noticed things that are worth improving on your site, I’d go ahead and get things done. The idea is not to make changes just for search engines, right? Your users will be happy if you can make things better even if search engines haven’t updated their view of your site yet.”

Emphasizing Quality Over Links

The core update made notable changes to how Google ranks websites.

Most significantly, Google reduced the importance of links in determining a website’s ranking.

In contrast to the description of links as “an important factor in determining relevancy,” Google’s updated spam policies stripped away the “important” designation, simply calling links “a factor.”

This change aligns with Google’s Gary Illyes’ statements that links aren’t among the top three most influential ranking signals.


Instead, Google is giving more weight to quality, credibility, and substantive content.

Consequently, long-running campaigns favoring low-quality link acquisition and keyword optimizations have been demoted.

With the update complete, SEOs and publishers are left to audit their strategies and websites to ensure alignment with Google’s new perspective on ranking.

Core Update Feedback

Google has opened a ranking feedback form related to this core update.

You can use this form until May 31 to provide feedback to Google’s Search team about any issues noticed after the core update.

While the feedback provided won’t be used to make changes for specific queries or websites, Google says it may help inform general improvements to its search ranking systems for future updates.


Google also updated its help documentation on “Debugging drops in Google Search traffic” to help people understand ranking changes after a core update.



FAQ

After the update, what steps should websites take to align with Google’s new ranking criteria?

After Google’s March 2024 Core Update, websites should:

  • Improve the quality, trustworthiness, and depth of their website content.
  • Stop heavily focusing on getting as many links as possible and prioritize relevant, high-quality links instead.
  • Fix any shady or spam-like SEO tactics on their sites.
  • Carefully review their SEO strategies to ensure they follow Google’s new guidelines.
