Everything You Need To Know About The X-Robots-Tag HTTP Header

Search engine optimization, in its most basic sense, relies upon one thing above all others: Search engine spiders crawling and indexing your site.

But nearly every website is going to have pages that you don’t want to include in this exploration.

For example, do you really want your privacy policy or internal search pages showing up in Google results?

In a best-case scenario, these pages do nothing to actively drive traffic to your site, and in a worst-case scenario, they could divert traffic from more important pages.

Luckily, Google allows webmasters to tell search engine bots what pages and content to crawl and what to ignore. There are several ways to do this, the most common being using a robots.txt file or the meta robots tag.

We have an excellent and detailed explanation of the ins and outs of robots.txt, which you should definitely read.

But in high-level terms, it’s a plain text file that lives in your website’s root and follows the Robots Exclusion Protocol (REP).

Robots.txt provides crawlers with instructions about the site as a whole, while meta robots tags include directions for specific pages.

Some meta robots tags you might employ include:

  • index – tells search engines to add the page to their index.
  • noindex – tells them not to add a page to the index or include it in search results.
  • follow – instructs a search engine to follow the links on a page.
  • nofollow – tells them not to follow the links on a page.

And there are a whole host of others.
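
As a quick illustration, these directives sit in a meta tag in the <head> of an HTML page and might look like this (a minimal example combining two of them):

<meta name="robots" content="noindex, nofollow" />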

Both robots.txt and meta robots tags are useful tools to keep in your toolbox, but there’s also another way to instruct search engine bots to noindex or nofollow: the X-Robots-Tag.

What Is The X-Robots-Tag?

The X-Robots-Tag is another way for you to control how your webpages are crawled and indexed by spiders. Delivered as part of the HTTP header response for a URL, it can control indexing for the entire page, as well as for specific elements on that page.

And whereas using meta robots tags is fairly straightforward, the X-Robots-Tag is a bit more complicated.

But this, of course, raises the question:

When Should You Use The X-Robots-Tag?

According to Google, “Any directive that can be used in a robots meta tag can also be specified as an X-Robots-Tag.”

While robots directives can be set either in a page’s HTML (via the meta robots tag) or in the HTTP response headers (via the X-Robots-Tag), there are certain situations where you would want to use the X-Robots-Tag – the two most common being when:

  • You want to control how your non-HTML files are being crawled and indexed.
  • You want to serve directives site-wide instead of on a page level.

For example, if you want to block a specific image or video from being indexed, the HTTP response method makes this easy.

The X-Robots-Tag header is also useful because it allows you to combine multiple directives within an HTTP response, either as separate headers or as a comma-separated list.

Maybe you don’t want a certain page to be cached and want it to be unavailable after a certain date. You can use a combination of the “noarchive” and “unavailable_after” directives to instruct search engine bots to do exactly that.
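
For instance, a rough sketch of how that could be set in an Apache configuration or .htaccess file (the file name and date below are placeholders):

# Illustrative only: apply noarchive and an expiry date to one hypothetical page
<Files "example-page.html">
  Header set X-Robots-Tag "noarchive, unavailable_after: 25 Jun 2023 15:00:00 PST"
</Files>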

Essentially, the power of the X-Robots-Tag is that it is much more flexible than the meta robots tag.

The advantage of using an X-Robots-Tag with HTTP responses is that it allows you to use regular expressions to apply crawl and indexing directives to non-HTML files, as well as apply parameters on a larger, global level.

To help you understand the difference between these directives, it’s helpful to categorize them by type. That is, are they crawler directives or indexer directives?

Here’s a handy cheat sheet to explain:

Crawler directives:

  • Robots.txt – uses the user-agent, allow, disallow, and sitemap directives to specify where on a site search engine bots are allowed (and not allowed) to crawl.

Indexer directives:

  • Meta robots tag – allows you to specify and prevent search engines from showing particular pages on a site in search results.
  • Nofollow – allows you to specify links that should not pass on authority or PageRank.
  • X-Robots-Tag – allows you to control how specified file types are indexed.

Where Do You Put The X-Robots-Tag?

Let’s say you want to block specific file types. An ideal approach would be to add the X-Robots-Tag to an Apache configuration or a .htaccess file.

On Apache, that means the X-Robots-Tag is added to a site’s HTTP responses through the server configuration or a .htaccess file, so the directive applies to every matching response the server sends.

Real-World Examples And Uses Of The X-Robots-Tag

So that sounds great in theory, but what does it look like in the real world? Let’s take a look.

Let’s say we wanted search engines not to index .pdf file types. This configuration on Apache servers would look something like the below:

# Apply noindex, nofollow to every PDF the server delivers
<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</Files>

In Nginx, it would look like the below:

# Apply noindex, nofollow to every PDF the server delivers
location ~* \.pdf$ {
  add_header X-Robots-Tag "noindex, nofollow";
}

Now, let’s look at a different scenario. Let’s say we want to use the X-Robots-Tag to block image files, such as .jpg, .gif, .png, etc., from being indexed. You could do this with an X-Robots-Tag that would look like the below:

# Prevent common image formats from being indexed
<Files ~ "\.(png|jpe?g|gif)$">
  Header set X-Robots-Tag "noindex"
</Files>

Please note that understanding how these directives work and the impact they have on one another is crucial.

For example, what happens if crawler bots discover a URL that has both an X-Robots-Tag header and a meta robots tag? In general, Google combines the directives it finds and follows the most restrictive one.

Also remember that if a URL is blocked from crawling by robots.txt, any indexing and serving directives set on it – whether in a header or a meta tag – cannot be discovered and will not be followed.

In other words, if you want indexing directives to be followed, the URLs that carry them cannot be disallowed from crawling.
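
For example (hypothetical path), if your robots.txt contains a rule like the one below, crawlers will never request the matching URLs, so any X-Robots-Tag header (or meta robots tag) set on them will never be seen:

# Blocks crawling of everything under /private-pdfs/
User-agent: *
Disallow: /private-pdfs/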

Check For An X-Robots-Tag

There are a few different methods that can be used to check for an X-Robots-Tag on the site.

The easiest way to check is to install a browser extension that will tell you X-Robots-Tag information about the URL.

Screenshot of Robots Exclusion Checker, December 2022

Another extension you can use to determine whether an X-Robots-Tag is being used is the Web Developer plugin.

By clicking on the plugin in your browser and navigating to “View Response Headers,” you can see the various HTTP headers being used.
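
If you prefer the command line, you can inspect the same response headers with a quick curl request (the URL below is only a placeholder):

curl -I https://www.example.com/whitepaper.pdf
# Look for a line such as: X-Robots-Tag: noindex, nofollow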

Screenshot of the Web Developer plugin

Another method, better suited to scale when you need to pinpoint issues on websites with millions of pages, is Screaming Frog.

After running a site through Screaming Frog, you can navigate to the “X-Robots-Tag” column.

This will show you which sections of the site are using the tag, along with which specific directives.

Screenshot of Screaming Frog report, X-Robots-Tag column, December 2022

Using X-Robots-Tags On Your Site

Understanding and controlling how search engines interact with your website is the cornerstone of search engine optimization. And the X-Robots-Tag is a powerful tool you can use to do just that.

Just be aware: It’s not without its dangers. It is very easy to make a mistake and deindex your entire site.

That said, if you’re reading this piece, you’re probably not an SEO beginner. As long as you use it wisely, take your time, and check your work, you’ll find the X-Robots-Tag to be a useful addition to your arsenal.

Featured Image: Song_about_summer/Shutterstock


AI Content In Search Results

Google has released a statement regarding its approach to AI-generated content in search results.

The company has a long-standing policy of rewarding high-quality content, regardless of whether humans or machines produce it.

Above all, Google’s ranking systems aim to identify content that demonstrates expertise, experience, authoritativeness, and trustworthiness (E-E-A-T).

Google advises creators looking to succeed in search results to produce original, high-quality, people-first content that demonstrates E-E-A-T.

The company has updated its “Creating helpful, reliable, people-first content” help page with guidance on evaluating content in terms of “Who, How, and Why.”

Here’s how AI-generated content fits into Google’s approach to ranking high-quality content in search results.

Quality Over Production Method

Focusing on the quality of content rather than the production method has been a cornerstone of Google’s approach to ranking search results for many years.

A decade ago, there were concerns about the rise in mass-produced human-generated content.

Rather than banning all human-generated content, Google improved its systems to reward quality content.

Google’s focus on rewarding quality content, regardless of production method, continues to this day through its ranking systems and the helpful content system introduced last year.

Automation & AI-Generated Content

Using automation, including AI, to generate content with the primary purpose of manipulating ranking in search results violates Google’s spam policies.

Google’s spam-fighting efforts, including its SpamBrain system, will continue to combat such practices.

However, Google realizes not all use of automation and AI-generated content is spam.

For example, publishers automate helpful content such as sports scores, weather forecasts, and transcripts.

Google says it will continue to take a responsible approach toward AI-generated content while maintaining a high bar for information quality and helpfulness in search results.

Google’s Advice For Publishers

For creators considering AI-generated content, here’s what Google advises.

Google’s concept of E-E-A-T is outlined in the “Creating helpful, reliable, people-first content” help page, which has been updated with additional guidance.

The updated help page asks publishers to think about “Who, How, and Why” concerning how content is produced.

“Who” refers to the person who created the content, and it’s important to make this clear by providing a byline or background information about the author.

“How” relates to the method used to create the content, and it’s helpful to readers to know if automation or AI was involved. If AI was involved in the content production process, Google wants you to be transparent and explain why it was used.

“Why” refers to the purpose of creating content, which should be to help people rather than to manipulate search rankings.

Evaluating your content in this way, regardless of whether AI-generated or not, will help you stay in line with what Google’s systems reward.


Featured Image: Alejandro Corral Mena/Shutterstock

Seven tips to optimize page speed in 2023


30-second summary:

  • Google has gradually increased the impact of page load time on website rankings
  • Google has introduced the three Core Web Vitals metrics as ranking factors to measure user experience
  • The following steps can help you get a better idea of your website’s performance through multiple tests

A fast website not only delivers a better experience but can also increase conversion rates and improve your search engine rankings. Google has introduced the three Core Web Vitals metrics to measure user experience and is using them as a ranking factor.

Let’s take a look at what you can do to test and optimize the performance of your website.

Start in Google Search Console

Want to know if optimizing Core Web Vitals is something you should be thinking about? Use the page experience report in Google Search Console to check if any of the pages on your website are loading too slowly.

Search Console shows data that Google collects from real users in Chrome, and this is also the data that’s used as a ranking signal. You can see exactly what page URLs need to be optimized.


Run a website speed test

Google’s real user data will tell you how fast your website is, but it won’t provide an analysis that explains why your website is slow.

Run a free website speed test to find out. Simply enter the URL of the page you want to test. You’ll get a detailed performance report for your website, including recommendations on how to optimize it.


Use priority hints to optimize the Largest Contentful Paint

Priority Hints are a browser feature introduced in 2022. They allow website owners to indicate how important an image or other resource is on the page.

This is especially important when optimizing the Largest Contentful Paint, one of the three Core Web Vitals metrics. It measures how long it takes for the main page content to appear after opening the page.

By default, browsers assume that all images are low priority until the page starts rendering and the browser knows which images are visible to the user. That way bandwidth isn’t wasted on low-priority images near the bottom of the page or in the footer. But it also slows down important images at the top of the page.

Adding a fetchpriority="high" attribute to the img element that’s responsible for the Largest Contentful Paint ensures that it’s downloaded quickly.
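
A minimal sketch of what that looks like (the file name and alt text are placeholders):

<img src="hero-image.jpg" fetchpriority="high" alt="Hero image">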

Use native image lazy loading for optimization

Image lazy loading means only loading images when they become visible to the user. It’s a great way to help the browser focus on the most important content first.

However, image lazy loading can also cause images to take longer to load, especially when using a JavaScript lazy loading library. In that case, the browser first needs to load the JavaScript library before it can start loading images. This long request chain means that it takes a while for the browser to load the image.


Today browsers support native lazy loading with the loading="lazy" attribute for images. That way you can get the benefits of lazy loading without incurring the cost of having to download a JavaScript library first.
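
For example (placeholder file name and alt text), a below-the-fold image can be lazy-loaded natively like this:

<img src="gallery-photo.jpg" loading="lazy" alt="Gallery photo">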

Remove and optimize render-blocking resources

Render-blocking resources are network requests that the browser needs to make before it can show any page content to the user. They include the HTML document, CSS stylesheets, and some JavaScript files.

Since these resources have such a big impact on page load time, you should check each one to see if it’s truly necessary. The async keyword on the HTML script tag lets you load JavaScript code without blocking rendering.
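
A minimal sketch of that (the script name is a placeholder):

<script src="analytics.js" async></script>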

If a resource has to block rendering, check if you can optimize the request so the resource loads more quickly – for example, by improving compression or loading the file from your main web server instead of from a third party.


Optimize with the new Interaction to Next Paint metric

Google has announced a new metric called Interaction to Next Paint. This metric measures how quickly your site responds to user input and is likely to become one of the Core Web Vitals in the future.

You can already see how your website is doing on this metric using tools like PageSpeed Insights.


Continuously monitor your site performance

One-off site speed tests can identify performance issues on your website, but they don’t make it easy to keep track of your test results and confirm that your optimizations are working.

DebugBear continuously monitors your website and alerts you when there’s a problem. The tool also makes it easy to show off the impact of your work to clients and share test results with your team.

Try DebugBear with a free 14-day trial.


 

What Is User Experience? How Design Matters To SEO


User experience is the foundation of a site’s usability, and it’s an aspect of on-page SEO that many people overlook.

If your site lacks the positive user experience and ease of use that end users require to navigate your site, you’ll push visitors to your competitors.

In this guide, you’ll learn what user experience (UX) entails, the types of experiences, the difference between UI and UX, and why it matters to SEO.

What Is User Experience (UX)?

UX is how people interact with your website.

You’ll also find this term used for products, but we’re focusing strictly on websites at the moment.

If you have an intuitive user interface design, users will have an easier time navigating your site and finding the information they want.

If you do have a digital product, such as a SaaS solution, this interaction will also occur on your digital product.

User experience determines how visitors feel while they interact with your site and how easily they can accomplish what they came to do.

In short, user experience can provide a positive experience with your website – or it can lead to frustration among users.

Note: Usability is not UX design. It’s a component of UX that works with design to create the experience your users desire.

What Are The Types Of User Experience?

User experience evaluation must look at the three types of UX design to best understand the needs of the end user.

The three types of UX include:

  • Information: One aspect of a content strategy that goes overlooked is information architecture. Time must be spent on how information on a site is organized and presented. User flows and navigation must be considered for all forms of information you present.
  • Interaction: Your site has an interaction design pattern – or a certain way that users interact with the site. Components of a site that fall under the interaction UX type include buttons, interfaces, and menus.
  • Visual design: Look and feel matter for the end user. You want your website to have cohesion between its color, typography, and images. User interface (UI) will fall under this type of UX, but it’s important to note that UI is not interchangeable with UX.

What Is The Difference Between UI & UX?

Speaking of UX and UI, it’s important to have a firm understanding of the difference between the two to better understand user experience.

User Interface

UI design is your site’s visual elements, including things like buttons, menus, typography, colors, and imagery.

Visual elements like these on your site are part of the user interface.

UI definitely overlaps with UX to an extent, but they’re not the same.

Steve Krug also has a great book on usability, titled “Don’t Make Me Think, Revisited: A Common Sense Approach to Web Usability.” It was first published in 2000, and the book is a #1 bestseller today.

Steve’s insight from over 20 years ago (although we’re now on the 3rd edition of the book) provides guidelines on usability that include:

  • Desktop.
  • Mobile.
  • Ease of use.
  • Layouts.
  • Everything UX.

If there’s one thing this book will teach you about usability, it’s to focus on intuitive navigation. Frustrating website users is the exact opposite of a good user experience.

User Experience

UX builds on UI and covers how the user will:

  • Interact with your site.
  • Feel during the interaction.

Think of Google for a moment.

Its simple landing page – visually appealing, but spartan in nature – is the face of the internet. In terms of UX, Google is one of the best sites in the world, although it lacks a spectacular UI.

The UI needs to be functional and appealing, but the UX is what stands out the most.

Imagine if you tried performing a search on Google and it displayed the wrong results or took one minute for a query to run. In this case, even the nicest UI would not compensate for the poor UX.

Peter Morville’s user experience honeycomb is one of the prime examples of how to move beyond simple usability and focus on UX in new, exciting ways.

The honeycomb includes multiple points that are all combined to maximize the user experience. These facets are:

  • Accessible.
  • Credible.
  • Desirable.
  • Findable.
  • Usable.
  • Useful.
  • Valuable.

When you focus on all of these elements, you’ll improve the user experience dramatically.

Why User Experience Matters To SEO

By this point, you understand that UX is very important to your site’s visitors and audience.

A lot of time, analysis, and refinement must go into UX design. However, there’s another reason to redirect your attention to user experience: SEO.

Google Page Experience Update

When Google’s Page Experience Update was fully rolled out, it had an impact on websites that offered a poor user experience.

The page experience update is now slowly rolling out for desktop. It will be complete by the end of March 2022. Learn more about the update: https://t.co/FQvMx3Ymaf

— Google Search Central (@googlesearchc) February 22, 2022

Multiple aspects of UX are part of the ranking factors of the update, including:

  • Intrusive adverts.
  • Core Web Vitals.
  • HTTPS Security.

You can run a Core Web Vitals report (in Google Search Console, for example) and make corrections to meet these requirements. Additionally, you should know whether your site has intrusive ads that irritate users, and whether your site lacks HTTPS.

Page performance also works in your SEO’s favor. Google’s research shows that focusing on UX can:

  • Reduce site abandonment by as much as 24%.
  • Improve web conversions.
  • Increase the average page views per session by as much as 15%.
  • Boost advertising revenue by 18% or more.

When you spend time improving your site’s UX, you benefit from higher rankings, lower page abandonment, improved conversions, and even more revenue.

Plus, many of the practices to improve UX are also crucial components of a site’s on-page SEO, such as:

  • Proper header usage.
  • Adding lists to your content.
  • Making use of images.
  • Optimizing images for faster loading times.
  • Filling content gaps with useful information.
  • Reducing “content fluff.”
  • Using graphs.
  • Testing usability across devices.

When you improve UX, you create a positive experience for users, while also improving many of the on-page SEO foundations of your website.

Final Comments

Customer experience must go beyond simple responsive web design.

Hick’s law dictates that when you present more choices to users, it takes longer to reach a decision. You’ve likely seen this yourself when shopping online and finding hundreds of options.

When people land on your site, they’re looking for answers or knowledge – not confusion.

User research, usability testing, and revisiting user experience design often will help you inch closer to satisfying the SEO requirements of design while keeping your visitors (or customers) happier.

Featured Image: NicoElNino/Shutterstock