Everything You Need To Know About The X-Robots-Tag HTTP Header


Search engine optimization, in its most basic sense, relies upon one thing above all others: Search engine spiders crawling and indexing your site.

But nearly every website is going to have pages that you don’t want to include in this exploration.

For example, do you really want your privacy policy or internal search pages showing up in Google results?

In a best-case scenario, these pages are doing nothing to actively drive traffic to your site, and in a worst-case scenario, they could be diverting traffic away from more important pages.

Luckily, Google allows webmasters to tell search engine bots what pages and content to crawl and what to ignore. There are several ways to do this, the most common being the use of a robots.txt file or the meta robots tag.


We have an excellent and detailed explanation of the ins and outs of robots.txt, which you should definitely read.

But in high-level terms, it’s a plain text file that lives in your website’s root and follows the Robots Exclusion Protocol (REP).

Robots.txt provides crawlers with instructions about the site as a whole, while meta robots tags include directions for specific pages.

Some meta robots tags you might employ include:

  • index – tells search engines to add the page to their index.
  • noindex – tells them not to add a page to the index or include it in search results.
  • follow – instructs a search engine to follow the links on a page.
  • nofollow – tells it not to follow links.

There are a whole host of others as well.

Both robots.txt and meta robots tags are useful tools to keep in your toolbox, but there’s also another way to instruct search engine bots to noindex or nofollow: the X-Robots-Tag.

What Is The X-Robots-Tag?

The X-Robots-Tag is another way for you to control how your webpages are crawled and indexed by spiders. Sent as part of the HTTP header response for a URL, it controls indexing for an entire page, as well as the specific elements on that page.
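
For context, here is roughly what that looks like on the wire: a hypothetical response for a PDF URL, with the X-Robots-Tag sent alongside the other response headers (the values shown are purely illustrative).

HTTP/1.1 200 OK
Date: Tue, 01 Apr 2025 21:42:43 GMT
Content-Type: application/pdf
X-Robots-Tag: noindex, nofollow
(…)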


And whereas using meta robots tags is fairly straightforward, the X-Robots-Tag is a bit more complicated.

But this, of course, raises the question:

When Should You Use The X-Robots-Tag?

According to Google, “Any directive that can be used in a robots meta tag can also be specified as an X-Robots-Tag.”

While robots directives can be applied with both the meta robots tag and the X-Robots-Tag, there are certain situations where you would want to use the X-Robots-Tag – the two most common being when:

  • You want to control how your non-HTML files are being crawled and indexed.
  • You want to serve directives site-wide instead of on a page level.

For example, if you want to block a specific image or video from being crawled, the HTTP response method makes this easy.

The X-Robots-Tag header is also useful because it allows you to combine multiple directives within a single HTTP response, using a comma-separated list.

Maybe you don’t want a certain page to be cached and want it to be unavailable after a certain date. You can use a combination of “noarchive” and “unavailable_after” tags to instruct search engine bots to follow these instructions.
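
On an Apache server with mod_headers enabled, that combination might look something like the sketch below (the file match and the date are purely illustrative, and the date format you use should be one Google accepts, such as RFC 822 or ISO 8601):

# Illustrative only: don't serve a cached copy, and drop these files from results after the given date
<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noarchive, unavailable_after: 2025-07-01"
</Files>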


Essentially, the power of the X-Robots-Tag is that it is much more flexible than the meta robots tag.

The advantage of using an X-Robots-Tag with HTTP responses is that it allows you to use regular expressions to apply crawl directives to non-HTML files, as well as apply directives on a larger, global level.
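
As a minimal sketch of the site-wide case, assuming Apache with mod_headers: a single Header line placed in the server config or a root .htaccess file, with no <Files> wrapper, applies to every response served from that scope. This is handy, for example, for a staging host that should never be indexed.

# Applies to every response served from this config/.htaccess scope
Header set X-Robots-Tag "noindex, nofollow"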

To help you understand the difference between these directives, it’s helpful to categorize them by type. That is, are they crawler directives or indexer directives?

Here’s a handy cheat sheet to explain:

Crawler Directives

  • Robots.txt – uses the user agent, allow, disallow, and sitemap directives to specify where on-site search engine bots are allowed to crawl and not allowed to crawl.

Indexer Directives

  • Meta robots tag – allows you to specify and prevent search engines from showing particular pages on a site in search results.
  • Nofollow – allows you to specify links that should not pass on authority or PageRank.
  • X-Robots-Tag – allows you to control how specified file types are indexed.

Where Do You Put The X-Robots-Tag?

Let’s say you want to block specific file types. An ideal approach would be to add the X-Robots-Tag to an Apache configuration or a .htaccess file.

The X-Robots-Tag can be added to a site's HTTP responses in an Apache server configuration via the .htaccess file.

Real-World Examples And Uses Of The X-Robots-Tag

So that sounds great in theory, but what does it look like in the real world? Let’s take a look.

Let’s say we wanted search engines not to index .pdf file types. This configuration on Apache servers would look something like the below:

<Files ~ ".pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</Files>

In Nginx, it would look like the below:

location ~* \.pdf$ {
  add_header X-Robots-Tag "noindex, nofollow";
}

Now, let’s look at a different scenario. Let’s say we want to use the X-Robots-Tag to block image files, such as .jpg, .gif, .png, etc., from being indexed. You could do this with an X-Robots-Tag that would look like the below:

<Files ~ ".(png|jpe?g|gif)$">
Header set X-Robots-Tag "noindex"
</Files>
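
The Nginx equivalent would look something like the below, mirroring the PDF example above:

location ~* \.(png|jpe?g|gif)$ {
  add_header X-Robots-Tag "noindex";
}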

Please note that understanding how these directives work and the impact they have on one another is crucial.

For example, what happens if both an X-Robots-Tag and a meta robots tag are present when crawler bots discover a URL?

If that URL is blocked in robots.txt, the crawler never fetches the page, so its indexing and serving directives cannot be discovered and will not be followed.

In other words, if you want indexing or serving directives to be followed, the URLs containing those directives cannot be disallowed from crawling.
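
To make the pitfall concrete, here is a hypothetical robots.txt snippet (the /pdfs/ path is made up for illustration). With this rule in place, Googlebot never requests those URLs, never sees an X-Robots-Tag noindex header on them, and the files can still end up indexed without a description if other pages link to them:

User-agent: *
# Don't do this if you're relying on an X-Robots-Tag noindex for these files:
# blocking crawling here means the header is never seen
Disallow: /pdfs/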

Check For An X-Robots-Tag

There are a few different methods that can be used to check for an X-Robots-Tag on the site.


The easiest way to check is to install a browser extension that will tell you X-Robots-Tag information about the URL.

Screenshot of the Robots Exclusion Checker extension, December 2022

Another plugin you can use to determine whether an X-Robots-Tag is being used is the Web Developer plugin.

By clicking on the plugin in your browser and navigating to “View Response Headers,” you can see the various HTTP headers being used.


Another method that can be used at scale, in order to pinpoint issues on websites with millions of pages, is Screaming Frog.

After running a site through Screaming Frog, you can navigate to the “X-Robots-Tag” column.

This will show you which sections of the site are using the tag, along with which specific directives.

Screenshot of a Screaming Frog report showing the X-Robots-Tag column, December 2022

Using X-Robots-Tags On Your Site

Understanding and controlling how search engines interact with your website is the cornerstone of search engine optimization. And the X-Robots-Tag is a powerful tool you can use to do just that.

Just be aware: It’s not without its dangers. It is very easy to make a mistake and deindex your entire site.


That said, if you’re reading this piece, you’re probably not an SEO beginner. So long as you use it wisely, take your time and check your work, you’ll find the X-Robots-Tag to be a useful addition to your arsenal.



Featured Image: Song_about_summer/Shutterstock



Studio By WordPress & Other Free Tools


Studio by WordPress lets you create WordPress sites on your desktop. Here's a look at Studio, plus other similar tools.

WordPress announced the rollout of Studio by WordPress, a new local development tool that not only makes it easy for publishers to develop and update websites locally on their desktop or laptop but is also useful for learning how to use WordPress. Learn about Studio and other platforms that make it easy to develop websites with WordPress right on your desktop.

Local Development Environments

Local environments are like web hosting spaces on the desktop that can be used to set up a WordPress site. They're a fantastic way to try out new WordPress themes and plugins and learn how they work without messing up a live website or publishing something to the web that might get accidentally indexed by Google. They're also useful for testing whether an updated plugin conflicts with other plugins on a website, offline, before committing to updating the plugins on a live site.

Studio joins a list of popular local development environments that are specific to WordPress, as well as more advanced platforms that can be used for WordPress on the desktop and offer greater flexibility and options, but may be harder for non-developers to use.

Desktop WordPress Development Environments

There are currently a few local environments that are specific to WordPress. The advantage of using a dedicated WordPress environment is that it makes it easy to start creating with WordPress for those who only need to work with WordPress sites and nothing more complicated than that.

Studio By WordPress.com

Studio is an open source project that allows developers and publishers to set up a WordPress site on their desktop in order to design, test or learn how to use WordPress.


According to the WordPress announcement:

“Say goodbye to manual tool configuration, slow site setup, and clunky local development workflows, and say hello to Studio by WordPress.com, our new, free, open source local WordPress development environment.

Once you have a local site running, you can access WP Admin, the Site Editor, global styles, and patterns, all with just one click—and without needing to remember and enter a username or password.”

The goal of Studio is to be a simple and fast way to create WordPress sites on the desktop. It's currently available for Mac, and a Windows version is coming soon.

Download the Mac version here.

Other Popular WordPress Local Development Environments

DevKinsta

DevKinsta, developed by the managed web host Kinsta, is another development environment that's specifically dedicated to quickly designing and testing WordPress sites on the desktop. It's a popular choice that many developers endorse.


DevKinsta is focused on doing one thing: creating WordPress sites. That makes it a great tool for publishers, SEOs, and developers, and a solid consideration for anyone who is serious about developing WordPress sites or just wants to learn how to use WordPress, especially the latest Gutenberg Blocks environment.

Download DevKinsta for free here.

Local WP

Local WP is a popular desktop development environment specifically made for WordPress users by WP Engine, a managed WordPress hosting provider.

Useful Features of Local WP

Local WP has multiple features that make it useful beyond simply developing and testing WordPress websites.

  • Image Optimizer
    It features a free image optimizer add-on that optimizes images on your desktop, which should be popular with those who are unable to optimize images on their own.
  • Upload Backups
    Another handy feature is the ability to upload backups to Dropbox and Google Drive.
  • Link Checker
    The tool has a built-in link checker that scans your local version of the website to identify broken links. This is a great way to check a site offline without using server resources and potentially slowing down your live site.
  • Import & Export Sites
    Local WP has the super-handy ability to import and export WordPress website files, so you can work on a copy of your current WordPress site on your desktop, test out new plugins or themes, and upload the files back to your website when you're ready.

Advanced Local Development Environments

There are other local development environments that are not specific to WordPress but are nonetheless useful for designing and testing WordPress sites on the desktop. These tools are more advanced and are popular with developers who appreciate the freedom and options available in these platforms.

DDEV with Docker

DDEV is an open source app that makes it easy to use Docker software containerization to quickly install a content management system and start working, without having to deal with the Docker learning curve.

Download DDEV With Docker here.


Laragon

Laragon is a free local development environment that was recommended to me by someone who is an advanced coder because they said that it’s easy to use and fairly intuitive. They were right. I’ve used it and have had good experiences with it. It’s not a WordPress-specific tool so that must be kept in mind.

Laragon describes itself as an easy-to-use alternative to XAMPP and WAMP.

Download Laragon here.

MAMP

MAMP is a local development platform that's popular with advanced coders and is available for Mac and Windows.

David McCan (Facebook profile), a WordPress trainer who writes about advanced WordPress topics on WebTNG, shared his experience with MAMP.

“MAMP is pretty easy to setup and it provides a full range of features. I currently have 51 local sites which are development versions of my production sites, that I use for testing plugins, and periodically use for new beta versions of WordPress core. It is easy to clone sites also. I haven’t noticed any system slowdown or lag.”

WAMP And XAMPP

WAMP is a Windows only development environment that’s popular with developers and WordPress theme and plugin publishers.


XAMPP is a PHP development platform that can be used on Linux, Mac, and Windows desktops.

Download WAMP here.

Download XAMPP here.

So Many Local Development Platforms

Studio by WordPress.com is an exciting new local development platform and I’m looking forward to trying it out. But it’s not the only one so it may be useful to try out different solutions to see which one works best for you.

Read more about Studio by WordPress:

Meet Studio by WordPress.com—a fast, free way to develop locally with WordPress


Featured Image by Shutterstock/Wpadington



Big Update To Google’s Ranking Drop Documentation


Google updates documentation for diagnosing ranking drops

Google updated their guidance with five changes on how to debug ranking drops. The new version contains over 400 more words that address small and large ranking drops. There’s room to quibble about some of the changes but overall the revised version is a step up from what it replaced.

Change #1: Downplays Fixing Traffic Drops

The opening sentence was changed so that it offers less hope for bouncing back from an algorithmic traffic drop. Google also joined two sentences into one sentence in the revised version of the documentation.

The documentation previously said that most traffic drops can be reversed and that identifying the reasons for a drop isn't straightforward. The part about most of them being reversible was completely removed.

Here are the original two sentences:

“A drop in organic Search traffic can happen for several reasons, and most of them can be reversed. It may not be straightforward to understand what exactly happened to your site”

Now there's no hope offered that "most of them can be reversed," and more emphasis on the fact that understanding what happened is not straightforward.


This is the new guidance:

“A drop in organic Search traffic can happen for several reasons, and it may not be straightforward to understand what exactly happened to your site.”

Change #2 Security Or Spam Issues

Google updated the traffic graph illustrations so that they precisely align with the causes for each kind of traffic decline.

The previous version of the graph was labeled:

“Site-level technical issue (Manual Action, strong algorithmic changes)”

The problem with the previous label is that manual actions and strong algorithmic changes are not technical issues, and the new version fixes that.

The updated version now reads:

“Large drop from an algorithmic update, site-wide security or spam issue”

Change #3 Technical Issues

There’s one more change to a graph label, also to make it more accurate.


This is how the previous graph was labeled:

“Page-level technical issue (algorithmic changes, market disruption)”

The updated graph is now labeled:

“Technical issue across your site, changing interests”

Now the graph and label are more specific about it being a sitewide change, and "changing interests" is more general, covering a wider range of causes than market disruption. Changing interests includes market disruption (where a new product makes a previous one obsolete or less desirable), but it also includes products that go out of style or lose their trendiness.


Change #4 Google Adds New Guidance For Algorithmic Changes

The biggest change by far is the brand new section for algorithmic changes, which replaces two smaller sections: one about policy violations and manual actions, and a second about algorithm changes.

The old version of this one section had 108 words. The updated version contains 443 words.

A section that’s particularly helpful is where the guidance splits algorithmic update damage into two categories.


Two New Categories:

  • Small drop in position? For example, dropping from position 2 to 4.
  • Large drop in position? For example, dropping from position 4 to 29.

The two new categories are perfect and align with what I've seen in the search results for sites that have lost rankings. The reasons for moving up and down within the top ten are different from the reasons why a site drops completely out of the top ten.

I don't agree with the guidance for large drops. Google recommends reviewing your site, which is good advice for some sites that have lost rankings. But in other cases there's nothing wrong with the site at all, and this is where less experienced SEOs tend to get stuck: recommendations for improving E-E-A-T, adding author bios, or filing link disavows won't solve anything, because in some of these cases the problem lies somewhere else entirely.

Here is the new guidance for debugging search position drops:

Algorithmic update
Google is always improving how it assesses content and updating its search ranking and serving algorithms accordingly; core updates and other smaller updates may change how some pages perform in Google Search results. We post about notable improvements to our systems on our list of ranking updates page; check it to see if there’s anything that’s applicable to your site.

If you suspect a drop in traffic is due to an algorithmic update, it’s important to understand that there might not be anything fundamentally wrong with your content. To determine whether you need to make a change, review your top pages in Search Console and assess how they were ranking:

Small drop in position? For example, dropping from position 2 to 4.
Large drop in position? For example, dropping from position 4 to 29.

Keep in mind that positions aren’t static or fixed in place. Google’s search results are dynamic in nature because the open web itself is constantly changing with new and updated content. This constant change can cause both gains and drops in organic Search traffic.

Small drop in position
A small drop in position is when there’s a small shift in position in the top results (for example, dropping from position 2 to 4 for a search query). In Search Console, you might see a noticeable drop in traffic without a big change in impressions.


Small fluctuations in position can happen at any time (including moving back up in position, without you needing to do anything). In fact, we recommend avoiding making radical changes if your page is already performing well.

Large drop in position
A large drop in position is when you see a notable drop out of the top results for a wide range of terms (for example, dropping from the top 10 results to position 29).

In cases like this, self-assess your whole website overall (not just individual pages) to make sure it’s helpful, reliable and people-first. If you’ve made changes to your site, it may take time to see an effect: some changes can take effect in a few days, while others could take several months. For example, it may take months before our systems determine that a site is now producing helpful content in the long term. In general, you’ll likely want to wait a few weeks to analyze your site in Search Console again to see if your efforts had a beneficial effect on ranking position.

Keep in mind that there’s no guarantee that changes you make to your website will result in noticeable impact in search results. If there’s more deserving content, it will continue to rank well with our systems.”

Change #5 Trivial Changes

The rest of the changes are relatively trivial but nonetheless make the documentation more precise.

For example, one of the headings was changed from this:


You recently moved your site

To this new heading:

Site moves and migrations

Google’s Updated Ranking Drops Documentation

Google's updated documentation is well thought out, but I think the recommendations for large algorithmic drops are helpful in some cases and not in others. I have 25 years of SEO experience and have experienced every single Google algorithm update. There are certain updates where the problem is not solved by trying to fix things, and Google's guidance used to be that sometimes there's nothing to fix. The documentation is better, but in my opinion it can be improved even further.

Read the new documentation here:

Debugging drops in Google Search traffic

Review the previous documentation:

Internet Archive Wayback Machine: Debugging drops in Google Search traffic


Featured Image by Shutterstock/Tomacco



Google March 2024 Core Update Officially Completed A Week Ago



Google has officially completed its March 2024 Core Update, ending over a month of ranking volatility across the web.

However, Google didn’t confirm the rollout’s conclusion on its data anomaly page until April 26—a whole week after the update was completed on April 19.

Many in the SEO community had been speculating for days about whether the turbulent update had wrapped up.

The delayed transparency exemplifies Google's communication issues with publishers and the need for clarity during core updates.

Google March 2024 Core Update Timeline & Status

First announced on March 5, the core algorithm update is complete as of April 19. It took 45 days to complete.


Unlike more routine core refreshes, Google warned this one was more complex.

Google’s documentation reads:

“As this is a complex update, the rollout may take up to a month. It’s likely there will be more fluctuations in rankings than with a regular core update, as different systems get fully updated and reinforce each other.”

The aftershocks were tangible, with some websites reporting losses of over 60% of their organic search traffic, according to data from industry observers.

The ripple effects also led to the deindexing of hundreds of sites that were allegedly violating Google’s guidelines.

Addressing Manipulation Attempts

In its official guidance, Google highlighted the criteria it looks for when targeting link spam and manipulation attempts:

  • Creating “low-value content” purely to garner manipulative links and inflate rankings.
  • Links intended to boost sites’ rankings artificially, including manipulative outgoing links.
  • The “repurposing” of expired domains with radically different content to game search visibility.

The updated guidelines warn:

“Any links that are intended to manipulate rankings in Google Search results may be considered link spam. This includes any behavior that manipulates links to your site or outgoing links from your site.”

John Mueller, a Search Advocate at Google, responded to the turbulence by advising publishers not to make rash changes while the core update was ongoing.


However, he suggested sites could proactively fix issues like unnatural paid links.

Mueller stated on Reddit:

“If you have noticed things that are worth improving on your site, I’d go ahead and get things done. The idea is not to make changes just for search engines, right? Your users will be happy if you can make things better even if search engines haven’t updated their view of your site yet.”

Emphasizing Quality Over Links

The core update made notable changes to how Google ranks websites.

Most significantly, Google reduced the importance of links in determining a website’s ranking.

In contrast to the description of links as “an important factor in determining relevancy,” Google’s updated spam policies stripped away the “important” designation, simply calling links “a factor.”

This change aligns with Google’s Gary Illyes’ statements that links aren’t among the top three most influential ranking signals.


Instead, Google is giving more weight to quality, credibility, and substantive content.

Consequently, long-running campaigns favoring low-quality link acquisition and keyword optimizations have been demoted.

With the update complete, SEOs and publishers are left to audit their strategies and websites to ensure alignment with Google’s new perspective on ranking.

Core Update Feedback

Google has opened a ranking feedback form related to this core update.

You can use this form until May 31 to provide feedback to Google’s Search team about any issues noticed after the core update.

While the feedback provided won’t be used to make changes for specific queries or websites, Google says it may help inform general improvements to its search ranking systems for future updates.


Google also updated its help documentation on “Debugging drops in Google Search traffic” to help people understand ranking changes after a core update.


Featured Image: Rohit-Tripathi/Shutterstock

FAQ

After the update, what steps should websites take to align with Google’s new ranking criteria?

After Google’s March 2024 Core Update, websites should:

  • Improve the quality, trustworthiness, and depth of their website content.
  • Stop heavily focusing on getting as many links as possible and prioritize relevant, high-quality links instead.
  • Fix any shady or spam-like SEO tactics on their sites.
  • Carefully review their SEO strategies to ensure they follow Google’s new guidelines.
