How to Make Fewer HTTP Requests

When you browse the internet, do you ever stop to wonder what’s happening in the background? If the answer’s no, don’t worry. You’re not alone. Most marketers, even great ones, don’t give the “tech stuff” much thought. How a website performs is just something for IT specialists to worry about, right?

No, unfortunately.

If your website’s slow or clunky, it directly affects the user experience. In fact, 40 percent of people won’t hang around if your website takes more than 3 seconds to load. With this in mind, it’s crucial you know how to fix a sluggish website and streamline your page loading times before you lose leads.

Where do you start? Well, one way is to make fewer HTTP requests for your website.

Although an HTTP request sounds like a really technical term best reserved for engineers and IT pros, don’t panic. It’s something any good marketer can understand. Now, let’s take a deep dive into how these requests work and how you can use this knowledge to boost your website’s performance.

What Are HTTP Requests?

Before we get started, it’s crucial you’re clear on what HTTP requests actually are.

HTTP stands for “HyperText Transfer Protocol.” Think of HTTP as the language browsers and web servers use to talk to each other. We (thankfully) don’t need to cover all the intricacies of web code to understand how HTTP affects load time, but here’s a breakdown of the key steps marketers need to know.

When someone wants to visit your website, their browser sends a “request” to your server. This is known as an HTTP request. Your server acknowledges the request and kicks into gear, ready to display the webpage.

Here’s where it gets a little tricky, though. The browser can’t display the page right away. It needs copies of various files, such as scripts, stylesheets, and images, to load the page properly.

How does the browser get these files? By making multiple HTTP requests. If the browser doesn’t make these requests, the page components won’t load.

Depending on how many components your page has, these requests can really add up, which is a problem. Here’s why.
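To make this concrete, here’s a rough Python sketch (standard library only, and the sample markup is invented) that counts the tags in a page’s HTML that typically trigger extra HTTP requests:

```python
from html.parser import HTMLParser

class RequestCounter(HTMLParser):
    """Counts tags that typically trigger extra HTTP requests."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Each of these fetches another file from a server.
        if tag in ("img", "script", "iframe", "video", "audio") and attrs.get("src"):
            self.count += 1
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.count += 1

# Hypothetical page markup for illustration.
sample_html = """
<html><head>
<link rel="stylesheet" href="/style.css">
<script src="/app.js"></script>
</head><body>
<img src="/logo.png"><img src="/banner.jpg">
</body></html>
"""

counter = RequestCounter()
counter.feed(sample_html)
print(counter.count)  # 1 stylesheet + 1 script + 2 images = 4
```

Even this tiny page triggers four extra requests beyond the HTML itself, which is why media-heavy pages rack up request counts so quickly.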

Why You Need Fewer HTTP Requests

There are two simple reasons why every website should aim to reduce the HTTP requests associated with it.

Firstly, let’s start with page load time. The more HTTP requests your site receives, the longer it takes for the requested page to load. For example, a page with 20 HTTP requests will load faster than a page with 70 requests.

The issue? People don’t want to hang around waiting for a website to load.

  • 39 percent of visitors won’t return if your images or videos don’t load properly, as research by SAG IPL shows.
  • 45 percent of respondents won’t buy from a retailer if the website takes too long to load, according to research by Unbounce.

In short, with so much competition out there, you’ll lose leads if your website takes too long to load, or if it doesn’t load properly at all.

Next, let’s think about what impact these lost leads have on your metrics.

According to Google, the probability of a bounce increases by 32 percent as page load time goes from one second to three seconds, and to make matters worse, poor loading time affects your SEO ranking. Delays in page loading time can cut page views by 11 percent, which tells Google your page isn’t offering value.

Think about it this way: If your website doesn’t impress visitors, they won’t shop with you. They won’t recommend you to their friends. In time, this leads to a lower search ranking, fewer visitors, and reduced conversion rates overall.

What can we take from all this? Well, too many HTTP requests directly affect your key metrics and your marketability online.

How to Identify Unnecessary HTTP Requests

OK, we’re clear on how HTTP requests work and why you need fewer of them. How do you identify these excess requests, though? By doing two things: identifying how many requests you’re dealing with, and grading your website performance.

Establish the Number of HTTP Requests Your Website Receives

You can’t eliminate HTTP requests until you know how many your website receives. Luckily, there are tools available to help you identify the number.

For example, HubSpot’s Website Grader gives you a free website “health check” so you can instantly see how many requests you’re receiving:

Make Fewer HTTP Requests - Website Grader HubSpot

If you use Chrome, you can also use Chrome’s DevTools to identify the number of HTTP requests. Simply right-click the page you want to check, click “Inspect,” then click the “Network” option. The image you’ll see looks something like this:

Make Fewer HTTP Requests - Chrome’s DevTools

This page receives 112 requests.

Grade Your Website Performance

When was the last time you assessed your website’s performance and, most importantly, page loading time? If you can’t remember, now’s a great time to run an audit.

You can try Ubersuggest for this. It’s really simple to use. Simply open Ubersuggest, type in your URL, and click “Site Audit” from the sidebar when the search results finish loading.

Once you’ve clicked “Site Audit,” you’ll see an overview of your website’s speed. It’ll look something like this:

Make Fewer HTTP Requests - Site Audit with Ubersuggest

A low score indicates you’re suffering from poor loading times. For example, if your mobile website takes 6 seconds to load but your desktop site loads in 2 seconds, there’s a problem specific to your mobile site.

Don’t worry if you’re unhappy with your page loading times or the number of HTTP requests you’re seeing. Now you know there’s a problem, you can begin streamlining those HTTP requests and ensure your page loads as quickly as possible. Let’s look at how to do just that.

8 Steps to Make Fewer HTTP Requests

Although every website is unique, we can usually blame excessive HTTP requests on a few common problems. With this in mind, here are eight simple steps you can take right now to reduce the number of requests passing through your website.

1. Remove Unnecessary Plug-Ins

Plug-ins are great. They add new functionality to your website and make your web pages more engaging. However, too many plug-ins clutter your page and hold up loading times. While there’s no “right” number of plug-ins, a good rule of thumb is to keep them minimal.

First, identify which plug-ins you use. Do they add value to your website? If the answer’s no, they can go. If it’s a plug-in you only use now and then, you can always install it when it’s required and delete it again afterward.

Not sure how to identify your plug-ins? Reach out to me and see how I can help you better understand your website’s performance.

2. Replace Heavy Plug-Ins With Streamlined Ones

OK, so you can’t remove every plug-in. However, if you want to make fewer HTTP requests, you can often replace resource-heavy plug-ins with more streamlined options.

For example, maybe you want to add social media buttons to your page. Great. Social media shares can increase engagement and boost your exposure. However, the plug-ins can be resource-intensive.

To streamline your social media plug-ins, use tools like Novashare. This tool won’t slow your page down, but it will help you reduce the HTTP requests generated by your social sharing plug-ins:

Steps to Make Fewer HTTP Requests - Replace Heavy Plug-Ins With Streamlined Ones

3. Remove Images You Do Not Need

Sure, images can improve your website’s visual appeal and boost the user experience. However, unless an image helps your reader understand your content in some way, or it’s a highly useful piece of content in its own right, like an infographic, it might be worth deleting.

Remember, every image creates an HTTP request. While those fun GIFs might have visual appeal, they won’t impress your audience if they affect load time.

Audit every individual web page and don’t be afraid to get a little ruthless. If the image doesn’t add value to your content, delete it.

4. Reduce the File Size for Remaining Images

Once you’ve deleted the unnecessary images, you need to optimize the ones you plan on keeping. In this context, “optimizing” doesn’t mean using alt text or keywords, although you should optimize for SEO, too.

Instead, what I mean is compressing each image. Compression preserves the image quality while reducing the overall file size, which improves load time.

If you don’t have access to image editing tools like Adobe, try free tools like Squoosh instead. You can tinker with the image to find the perfect balance between file size (which should be less than 1 MB, ideally) and image quality:

Steps to Make Fewer HTTP Requests - Reduce the File Size for Remaining Images
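If you want to find the worst offenders in bulk first, here’s a small Python sketch (standard library only; the folder path is hypothetical) that flags images above the roughly 1 MB guideline mentioned above:

```python
import os

def oversized_images(folder, limit_bytes=1_000_000):
    """Return image files larger than the limit (about 1 MB by default)."""
    image_exts = (".png", ".jpg", ".jpeg", ".gif", ".webp")
    flagged = []
    for root, _dirs, files in os.walk(folder):
        for name in files:
            if name.lower().endswith(image_exts):
                path = os.path.join(root, name)
                if os.path.getsize(path) > limit_bytes:
                    flagged.append(path)
    return flagged

# Hypothetical usage: point it at your site's static assets directory.
# print(oversized_images("static/images"))
```

Anything this flags is a good first candidate for compression in Squoosh or a similar tool.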

5. Drop Unnecessary Videos

Just like not every image adds value to your content, some videos detract from the user experience and increase the page loading time.

To be honest, this tip’s really simple. Just like you should cull any images or plug-ins you don’t need, limit how many videos you’re playing on any webpage.

How do you know which videos to delete? Well, there’s no rule here. However, if it doesn’t educate your audience or add value in some way, cut it or replace it with a shorter, comparable video.

6. Enable Lazy Load

“Lazy loading” means an image or video won’t load until the user begins scrolling down your webpage. How does this reduce HTTP requests?

Since the media doesn’t load right away, it won’t trigger an HTTP request for the initial page load. It doesn’t affect the user experience either, since users won’t know the difference between a regular and a lazy load. All they’ll know is that the images or videos are viewable once they scroll down.

To enable lazy load, try out plug-ins like the aptly-named LazyLoad. The script takes up less than 10 KB of space, so it’s not resource-intensive. Just install the plug-in and it gets to work immediately:

Steps to Make Fewer HTTP Requests - Enable Lazy Load
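You may not even need a plug-in: modern browsers natively support lazy loading through the loading="lazy" attribute on images and iframes. A minimal markup sketch (the file path and video ID are placeholders):

```html
<!-- The browser defers fetching these until the user scrolls near them. -->
<img src="/photos/team.jpg" alt="Our team" loading="lazy">
<iframe src="https://www.youtube.com/embed/VIDEO_ID" loading="lazy"></iframe>
```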

7. Use Content Caching

Caching is a great way to reduce HTTP requests.

Essentially, caching means a visitor’s browser stores copies of the scripts it used to display your webpage, rather than deleting them all. When the visitor returns, there’s no need to make all those HTTP requests again, because the scripts they need are already stored in the browser (unless they clear their cache).

Let me give you some tips for priming your website for content caching.

  • Don’t use cookies unless they’re essential.
  • Always use the same URL if you serve the content across different pages.
  • Build up a library of images and videos and reuse them.
  • Try out free tools like REDbot to assess your website’s cacheability.

Steps to Make Fewer HTTP Requests - Use Content Caching
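As a mental model, here’s a toy Python sketch (not how any real browser is implemented) showing why a warmed cache means fewer requests on a repeat visit:

```python
# A toy model of browser caching: the first visit requests every file;
# a repeat visit only requests what isn't already cached.

def fetch_page(resources, cache):
    """Return the resources that trigger real HTTP requests."""
    requests_made = [r for r in resources if r not in cache]
    cache.update(requests_made)  # the browser stores copies for next time
    return requests_made

# Hypothetical page assets for illustration.
page = ["/index.html", "/style.css", "/app.js", "/logo.png"]
cache = set()

first_visit = fetch_page(page, cache)   # 4 requests
repeat_visit = fetch_page(page, cache)  # 0 requests: everything is cached
```

Real caches are governed by response headers and expiry times, but the basic trade is the same: files served from the cache are requests your server never sees.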

8. Reduce Third-Party Requests

If a visitor’s browser needs to request or download data from a third party, such as YouTube or Google Analytics, to display a website properly, it’s called a third-party request. The issue? How long your page takes to load depends on how quickly the third-party server responds.

This is a huge problem because you’re not in control of your page loading time. To take back control, think about lazy loading third-party content like embedded YouTube videos. You could also try hosting scripts for necessary programs like Google Analytics locally rather than externally.

Finally, if a plug-in you use makes its own third-party requests, switch it for another plug-in where possible.

How to Make Fewer HTTP Requests

  1. Remove Unnecessary Plug-Ins

    Figure out which plug-ins are installed and remove those that you don’t use.

  2. Replace Heavy Plug-Ins With Streamlined Ones

    Audit the plug-ins you keep and replace them with more efficient ones if they’re available.

  3. Remove Unnecessary Images

    Delete images that don’t add value since each one creates an HTTP request.

  4. Reduce the File Size for Remaining Images

    Compress the images you keep to reduce load time.

  5. Drop Unnecessary Videos

    Only keep videos that add value to your page.

  6. Enable Lazy Load

    Use a plug-in that allows images and videos to load once a user scrolls.

  7. Use Content Caching

To prepare your site for content caching, avoid cookies unless they’re essential; use the same URL for content served on different pages; build an image library and reuse those images; and audit your site’s ability to be cached.

  8. Reduce Third-Party Requests

    Try not to include content that pulls from a third party, like YouTube, since your page load time depends on theirs. You should also replace plug-ins that rely on third-party requests.

Conclusion

HTTP requests are essential to displaying a website and giving your audience an engaging experience. However, too many HTTP requests can disrupt your website performance and deter would-be customers from doing business with you.

The good news? With a few simple tweaks, you can ensure browsers make fewer HTTP requests to your website. You can boost page loading time, improve a webpage’s visual appeal, and, ultimately, increase conversions in the long run.

If you’re not sure where to get started with improving your website’s performance, check out my consulting services and we’ll see how I can help.

Have you tried reducing the number of HTTP requests on your website? Which strategies are working for you?



The Current State of Google PageRank & How It Evolved


PageRank (PR) is an algorithm that improves the quality of search results by using links to measure the importance of a page. It considers links as votes, with the underlying assumption being that more important pages are likely to receive more links.

PageRank was created by Google co-founders Sergey Brin and Larry Page in 1997 when they were at Stanford University, and the name is a reference to both Larry Page and the term “webpage.” 

In many ways, it’s similar to a metric called “impact factor” for journals, where more cited = more important. It differs a bit in that PageRank considers some votes more important than others. 

By using links along with content to rank pages, Google’s results were better than its competitors’. Links became the currency of the web.

Want to know more about PageRank? Let’s dive in.

Google still uses PageRank

In terms of modern SEO, PageRank is one of the algorithms comprising Experience, Expertise, Authoritativeness, and Trustworthiness (E-E-A-T).

Google’s algorithms identify signals about pages that correlate with trustworthiness and authoritativeness. The best known of these signals is PageRank, which uses links on the web to understand authoritativeness.

Source: How Google Fights Disinformation

We’ve also had confirmation from Google reps like Gary Illyes, who said that Google still uses PageRank and that links are used for E-A-T (now E-E-A-T).

When I ran a study to measure the impact of links and effectively removed the links using the disavow tool, the drop was obvious. Links still matter for rankings.

PageRank has also been a confirmed factor when it comes to crawl budget. It makes sense that Google wants to crawl important pages more often.

Fun math, why the PageRank formula was wrong 

Crazy fact: The formula published in the original PageRank paper was wrong. Let’s look at why. 

PageRank was described in the original paper as a probability distribution—or how likely you were to be on any given page on the web. This means that if you sum up the PageRank for every page on the web together, you should get a total of 1.

Here’s the full PageRank formula from the original paper published in 1997:

PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))

Simplified a bit and assuming the damping factor (d) is 0.85 as Google mentioned in the paper (I’ll explain what the damping factor is shortly), it’s:

PageRank for a page = 0.15 + 0.85 (a portion of the PageRank of each linking page split across its outbound links)

In the paper, they said that the sum of the PageRank for every page should equal 1. But that’s not possible if you use the formula in the paper. Each page would have a minimum PageRank of 0.15 (1-d). Just a few pages would put the total at greater than 1. You can’t have a probability greater than 100%. Something is wrong!

The formula should actually divide that (1-d) by the number of pages on the internet for it to work as described. It would be:

PageRank for a page = (0.15/number of pages on the internet) + 0.85 (a portion of the PageRank of each linking page split across its outbound links)

It’s still complicated, so let’s see if I can explain it with some visuals.

1. A page is given an initial PageRank score based on the links pointing to it. Let’s say I have five pages with no links. Each gets a PageRank of (1/5) or 0.2.

PageRank example of five pages with no links yet

2. This score is then distributed to other pages through the links on the page. If I add some links to the five pages above and calculate the new PageRank for each, then I end up with this: 

PageRank example of five pages after one iteration

You’ll notice that the scores are favoring the pages with more links to them.

3. This calculation is repeated as Google crawls the web. If I calculate the PageRank again (called an iteration), you’ll see that the scores change. It’s the same pages with the same links, but the base PageRank for each page has changed, so the resulting PageRank is different.

PageRank example of five pages after two iterations

The PageRank formula also has a so-called “damping factor,” the “d” in the formula, which simulates the probability of a random user continuing to click on links as they browse the web. 

Think of it like this: The probability of you clicking a link on the first page you visit is reasonably high. But the likelihood of you then clicking a link on the next page is slightly lower, and so on and so forth.

If a strong page links directly to another page, it’s going to pass a lot of value. If the link is four clicks away, the value transferred from that strong page will be a lot less because of the damping factor.
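The iteration described above is easy to express in code. Here’s a minimal Python sketch of the corrected formula; the five-page link graph below is hypothetical (the article’s own example graph appears only in its figures):

```python
# PR(p) = (1 - d)/N + d * sum(PR(q) / outdegree(q) for each page q linking to p)

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        pr = {
            p: (1 - d) / n
            + d * sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            for p in pages
        }
    return pr

# Hypothetical five-page web.
graph = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
    "E": ["A", "C"],
}
scores = pagerank(graph)
# The scores form a probability distribution (they sum to 1), and "C",
# with four inbound links, ends up with the highest PageRank.
```

Note that the sum only stays at exactly 1 because every page here has at least one outbound link; dangling pages need extra handling in a production implementation.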

Example showing PageRank damping factor

History of PageRank

The first PageRank patent was filed on January 9, 1998. It was titled “Method for node ranking in a linked database.” This patent expired on January 9, 2018, and was not renewed. 

Google first made PageRank public when the Google Directory launched on March 15, 2000. This was a version of the Open Directory Project but sorted by PageRank. The directory was shut down on July 25, 2011.

It was December 11, 2000, when Google launched PageRank in the Google toolbar, which was the version most SEOs obsessed over.

This is how it looked when PageRank was included in Google’s toolbar. 

PageRank 8/10 in Google's old toolbar

PageRank in the toolbar was last updated on December 6, 2013, and was finally removed on March 7, 2016.

The PageRank shown in the toolbar was a little different. It used a simple 0–10 numbering system to represent the PageRank. But PageRank itself is a logarithmic scale where achieving each higher number becomes increasingly difficult.
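To illustrate what a logarithmic 0-10 scale means in practice, here’s a hypothetical Python sketch. Google never published how the toolbar value was derived, so the base of 5 and the mapping below are purely assumptions for illustration:

```python
import math

def toolbar_score(pagerank, base=5, min_pr=1e-10):
    """Map a raw (tiny) PageRank value onto a 0-10 logarithmic scale.

    Hypothetical: the real base and cutoffs were never disclosed.
    """
    score = round(math.log(pagerank / min_pr, base))
    return max(0, min(10, score))

# On a scale like this, each extra toolbar point requires roughly
# base-times more raw PageRank, which is why moving from PR 7 to PR 8
# was so much harder than moving from PR 2 to PR 3.
```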

PageRank even made its way into Google Sitemaps (now known as Google Search Console) on November 17, 2005. It was shown in categories of high, medium, low, or N/A. This feature was removed on October 15, 2009.

Link spam

Over the years, there have been a lot of different ways SEOs have abused the system in the search for more PageRank and better rankings. Google has a whole list of link schemes that include:

  • Buying or selling links—exchanging links for money, goods, products, or services.
  • Excessive link exchanges.
  • Using software to automatically create links.
  • Requiring links as part of a terms of service, contract, or other agreement.
  • Text ads that don’t use nofollow or sponsored attributes.
  • Advertorials or native advertising that includes links that pass ranking credit.
  • Articles, guest posts, or blogs with optimized anchor text links.
  • Low-quality directories or social bookmark links.
  • Keyword-rich, hidden, or low-quality links embedded in widgets that get put on other websites.
  • Widely distributed links in footers or templates. For example, hard-coding a link to your website into the WP Theme that you sell or give away for free.
  • Forum comments with optimized links in the post or signature.

The systems to combat link spam have evolved over the years. Let’s look at some of the major updates.

Nofollow

On January 18, 2005, Google announced it had partnered with other major search engines to introduce the rel="nofollow" attribute. It encouraged users to add the nofollow attribute to blog comments, trackbacks, and referrer lists to help combat spam.

Here’s an excerpt from Google’s official statement on the introduction of nofollow:

If you’re a blogger (or a blog reader), you’re painfully familiar with people who try to raise their own websites’ search engine rankings by submitting linked blog comments like “Visit my discount pharmaceuticals site.” This is called comment spam, we don’t like it either, and we’ve been testing a new tag that blocks it. From now on, when Google sees the attribute (rel="nofollow") on hyperlinks, those links won’t get any credit when we rank websites in our search results.

Almost all modern systems use the nofollow attribute on blog comment links. 

SEOs even began to abuse nofollow—because of course we did. Nofollow was used for PageRank sculpting, where people would nofollow some links on their pages to make other links stronger. Google eventually changed the system to prevent this abuse.

In 2009, Google’s Matt Cutts confirmed that this would no longer work and that PageRank would be distributed across all links even if a nofollow attribute was present (but only passed through the followed links).

Google added a couple more link attributes that are more specific versions of the nofollow attribute on September 10, 2019. These included rel="ugc", meant to identify user-generated content, and rel="sponsored", meant to identify links that are paid or affiliate.
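In markup, the three attributes look like this (the URLs are placeholders):

```html
<!-- A link you don't vouch for, such as an untrusted blog comment link: -->
<a href="https://example.com" rel="nofollow">some site</a>

<!-- A link inside user-generated content: -->
<a href="https://example.com" rel="ugc">commenter's site</a>

<!-- A paid, sponsored, or affiliate link: -->
<a href="https://example.com" rel="sponsored">advertiser</a>
```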

Algorithms targeting link spam

As SEOs found new ways to game links, Google worked on new algorithms to detect this spam. 

When the original Penguin algorithm launched on April 24, 2012, it hurt a lot of websites and website owners. Google gave site owners a way to recover later that year by introducing the disavow tool on October 16, 2012.

When Penguin 4.0 launched on September 23, 2016, it brought a welcome change to how link spam was handled by Google. Instead of hurting websites, it began devaluing spam links. This also meant that most sites no longer needed to use the disavow tool. 

Google launched its first Link Spam Update on July 26, 2021. This recently evolved, and a Link Spam Update on December 14, 2022, announced the use of an AI-based detection system called SpamBrain to neutralize the value of unnatural links. 

The original version of PageRank hasn’t been used since 2006, according to a former Google employee. The employee said it was replaced with another less resource-intensive algorithm.

They replaced it in 2006 with an algorithm that gives approximately-similar results but is significantly faster to compute. The replacement algorithm is the number that’s been reported in the toolbar, and what Google claims as PageRank (it even has a similar name, and so Google’s claim isn’t technically incorrect). Both algorithms are O(N log N) but the replacement has a much smaller constant on the log N factor, because it does away with the need to iterate until the algorithm converges. That’s fairly important as the web grew from ~1-10M pages to 150B+.

Remember those iterations and how PageRank kept changing with each iteration? It sounds like Google simplified that system.

What else has changed?

Some links are worth more than others

Rather than splitting the PageRank equally between all links on a page, some links are valued more than others. There’s speculation from patents that Google switched from a random surfer model (where a user may go to any link) to a reasonable surfer model (where some links are more likely to be clicked than others so they carry more weight).

Some links are ignored

There have been several systems put in place to ignore the value of certain links. We’ve already talked about a few of them, including:

  • Nofollow, UGC, and sponsored attributes.
  • Google’s Penguin algorithm.
  • The disavow tool.
  • Link Spam updates.

Google also won’t count any links on pages that are blocked by robots.txt. It won’t be able to crawl these pages to see any of the links. This system was likely in place from the start.
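You can check what a given robots.txt blocks with Python’s standard library; the rules below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Links on /private/ pages can never be seen by a crawler obeying these rules.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post.html"))     # True
```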

Some links are consolidated

Google has a canonicalization system that helps it determine what version of a page should be indexed and to consolidate signals from duplicate pages to that main version.

Canonicalization signals

Canonical link elements were introduced on February 12, 2009, and allow users to specify their preferred version.

Redirects were originally said to pass the same amount of PageRank as a link. But at some point, this system changed and no PageRank is currently lost.

A bit is still unknown

When pages are marked as noindex, we don’t exactly know how Google treats the links. Even Googlers have conflicting statements.

According to John Mueller, pages that are marked noindex will eventually be treated as noindex, nofollow. This means that the links eventually stop passing any value.

According to Gary, Googlebot will discover and follow the links as long as a page still has links to it.

These aren’t necessarily contradictory. But if you go by Gary’s statement, it could be a very long time before Google stops crawling and counting links—perhaps never.

Can you still check your PageRank?

There’s currently no way to see Google’s PageRank.

URL Rating (UR) is a good replacement metric for PageRank because it has a lot in common with the PageRank formula. It shows the strength of a page’s link profile on a 100-point scale. The bigger the number, the stronger the link profile.

Screenshot showing UR score from Ahrefs overview 2.0

Both PageRank and UR account for internal and external links when being calculated. Many of the other strength metrics used in the industry completely ignore internal links. I’d argue link builders should be looking more at UR than metrics like DR, which only accounts for links from other sites.

However, it’s not exactly the same. UR ignores the value of some links and doesn’t count nofollow links. We don’t know exactly which links Google ignores or which links users have disavowed, both of which affect Google’s PageRank calculation. We also may make different decisions on how we treat some canonicalization signals, like canonical link elements and redirects.

So our advice is to use it but know that it may not be exactly like Google’s system.

We also have Page Rating (PR) in Site Audit’s Page Explorer. This is similar to an internal PageRank calculation and can be useful to see what the strongest pages on your site are based on your internal link structure.

Page rating in Ahrefs' Site Audit

How to improve your PageRank

Since PageRank is based on links, to increase your PageRank, you need better links. Let’s look at your options.

Redirect broken pages

Redirecting old pages on your site to relevant new pages can help reclaim and consolidate signals like PageRank. Websites change over time, and people don’t seem to like to implement proper redirects. This may be the easiest win, since those links already point to you but currently don’t count for you.

Here’s how to find those opportunities:

I usually sort this by “Referring domains.”

Best by links report filtered to 404 status code to show pages you may want to redirect

Take those pages and redirect them to the current pages on your site. If you don’t know exactly where they go or don’t have the time, I have an automated redirect script that may help. It looks at the old content from archive.org and matches it with the closest current content on your site. This is where you likely want to redirect the pages.
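One way to automate the matching is fuzzy comparison between old and new URL slugs. Here’s a minimal Python sketch using the standard library; the URL lists are hypothetical, and unlike the author’s script it doesn’t pull old page content from archive.org, it only compares the URLs themselves:

```python
import difflib

# Hypothetical dead URLs (404s with backlinks) and live URLs.
old_urls = ["/blog/seo-tips-2015", "/blog/link-buildng-guide", "/products/old-widget"]
new_urls = ["/blog/seo-tips", "/blog/link-building-guide", "/products/widget-pro"]

def suggest_redirects(old, new, cutoff=0.6):
    """Map each dead URL to the closest-matching live URL, if any."""
    redirects = {}
    for url in old:
        matches = difflib.get_close_matches(url, new, n=1, cutoff=cutoff)
        if matches:
            redirects[url] = matches[0]
    return redirects

plan = suggest_redirects(old_urls, new_urls)
```

Treat the output as suggestions to review by hand, then implement the approved mappings as 301 redirects.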

Internal links

Backlinks aren’t always within your control. People can link to any page on your site they choose, and they can use whatever anchor text they like.

Internal links are different. You have full control over them.

Internally link where it makes sense. For instance, you may want to link more to pages that are more important to you.

We have a tool within Site Audit called Internal Link Opportunities that helps you quickly locate these opportunities. 

This tool works by looking for mentions of keywords that you already rank for on your site. Then it suggests them as contextual internal link opportunities.

For example, the tool shows a mention of “faceted navigation” in our guide to duplicate content. As Site Audit knows we have a page about faceted navigation, it suggests we add an internal link to that page.

Example of an internal link opportunity

External links

You can also get more links from other sites to your own to increase your PageRank. We have a lot of guides around link building already. Some of my favorites are:

Final thoughts

Even though PageRank has changed, we know that Google still uses it. We may not know all the details or everything involved, but it’s still easy to see the impact of links.

Also, Google just can’t seem to get away from using links and PageRank. It once experimented with not using links in its algorithm and decided against it.

So we don’t have a version like that that is exposed to the public but we have our own experiments like that internally and the quality looks much much worse. It turns out backlinks, even though there is some noise and certainly a lot of spam, for the most part are still a really really big win in terms of quality of search results.

We played around with the idea of turning off backlink relevance and at least for now backlinks relevance still really helps in terms of making sure that we turn the best, most relevant, most topical set of search results.

Source: YouTube (Google Search Central)

If you have any questions, message me on Twitter.




Chrome 110 Changes How Web Share API Embeds Third Party Content


Chrome 110, scheduled to roll out on February 7, 2023, contains a change to how the browser handles the Web Share API. It improves privacy and security by requiring third-party content to be explicitly allowed to use the Web Share API.

This might not be something that an individual publisher needs to act on.

It’s probably more relevant on the developer side where they are making things like web apps that use the Web Share API.

Nevertheless, it’s good to know what it is for the rare situation when it might be useful for diagnosing why a webpage doesn’t work.

The Mozilla developer page describes the Web Share API:

“The Web Share API allows a site to share text, links, files, and other content to user-selected share targets, utilizing the sharing mechanisms of the underlying operating system.

These share targets typically include the system clipboard, email, contacts or messaging applications, and Bluetooth or Wi-Fi channels.

…Note: This API should not be confused with the Web Share Target API, which allows a website to specify itself as a share target”

allow="web-share" Attribute

An attribute is HTML markup that modifies an HTML element in some way.

For example, the nofollow attribute modifies the <a> anchor element, by signaling the search engines that the link is not trusted.

The <iframe> is an HTML element, and it can be modified with the allow=”web-share” attribute.

An <iframe> allows a webpage to embed HTML, usually from another website.

Iframes are everywhere, such as in advertisements and embedded videos.

The problem with an iframe that contains content from another site is that it creates the possibility of showing unwanted content or allowing malicious activity.

And that’s the problem that the allow=”web-share” attribute solves by setting a permission policy for the iframe.

This specific permission policy (allow=”web-share”) tells the browser that the third-party content inside the iframe is allowed to trigger the share action.

Google’s announcement uses this example of the attribute in use:

<iframe allow="web-share" src="https://third-party.example.com/iframe.html"></iframe>

Google calls this “a potentially breaking change in the Web Share API.”

The announcement warns:

“If a sharing action needs to happen in a third-party iframe, a recent spec change requires you to explicitly allow the operation.

Do this by adding an allow attribute to the <iframe> tag with a value of web-share.

This tells the browser that the embedding site allows the embedded third-party iframe to trigger the share action.”

Read the announcement at Google’s Chrome webpage:

New requirements for the Web Share API in third-party iframes

Featured image by Shutterstock/Krakenimages.com



SEO

All You Need to Know to Get Them

Do you want to jump to the first position in Google without building links or significantly updating your content? Featured snippets can help you with that.

Featured snippets are a special type of search result showing a quick answer to the search query at the top of Google’s results page. Google pulls this information from one of the top-ranking pages that then gets elevated to the top of the organic search results this way.

You may be wondering how that’s a good thing for the website that owns the featured snippet. Users see your content on the SERP, and that may mean losing clicks, right?

Well, yes and no. Check this example:

Featured snippet example

If this question were possible to answer thoroughly in a few sentences, most of us would be out of work. So while the snippet tells you the absolute basics, you still have to click to learn more.

That’s just one example. Featured snippets are one of the most prominent SERP features—and they’re evolving all the time.

Follow this guide to learn everything you need to know about featured snippets and what it takes to optimize for them. 

What types of featured snippets are there?

There are five types of featured snippets that Google shows depending on the intent behind the search query: 

  1. Paragraph
  2. Numbered list
  3. Bullet list
  4. Table
  5. Video

Let’s check an example for each type. 

1. Paragraph

Paragraph featured snippet

This one is a bit special because Google sometimes combines featured snippets with People Also Ask (PAA) boxes. You can see additional questions related to the search query there and click on them to see more information. That often comes from a different source than the featured snippet itself, as you can see in this case:

Featured snippet with PAA box

2. Numbered list

Numbered list featured snippet

This is an interesting case of a featured snippet where Google shows only the first point along with its own numbered list.

3. Bullet list

Bullet list featured snippet

4. Table

Table featured snippet

5. Video

Video featured snippet

It’s also important to note that there are other “snippet-like” results. You need to know about these to avoid any confusion:

Knowledge panel

Knowledge panel example

Knowledge card

Knowledge card example

Entity carousel

Entity carousel example

These three SERP features have one thing in common. They don’t pull answers from just one of the top-ranking search results, as they’re based on entities in the knowledge graph. While they may contain a link to the source of information (song lyrics, for example), it’s never in the form of a clickable title as we have in featured snippets.

How featured snippets influence search and SEO

Google introduced featured snippets in 2014, and I would say that they’re one of the most prominent SERP changes of the past decade. There are quite a few things that featured snippets changed for both users and SEOs.

Shortcut to the top organic position

If your content is ranking on the first SERP for a search query that shows a featured snippet, you can “win” that snippet and shortcut your way to the top position. Let’s break this down.

Our study found that featured snippets come from pages that already rank in the top 10. Moreover, the vast majority of featured snippets pages rank in the top five.

In other words, the higher your content ranks, the more likely it is to win a featured snippet.

Getting to the first SERP is a more manageable goal than ranking #1 for a keyword. But if that keyword triggers a featured snippet, it makes the first position a bit more attainable.

Fewer clicks… sometimes

In the past, the page owning the featured snippet would also be listed in the standard “blue link” search results somewhere on the first SERP. But in January 2020, Google introduced featured snippet deduplication.

Once your page gets elevated to the featured snippet, you lose that “regular” search result.

Besides the slight traffic losses caused by deduplication, some people also think that featured snippets reduce clicks on the search results. After all, if the answer to the query is right on the SERP, why would you click on a result?

While this is the case for some queries, it’s certainly not the case for them all. It depends on whether Google can provide a satisfactory answer in the snippet.

For example, take a look at the featured snippet for this query:

Featured snippet with a straightforward answer

The answer is right there for most people. And that’s why there’s only a 19% chance, on average, that the search for this query results in a click.

Example of a keyword with low Clicks Per Search

Now take a look at the snippet for “how does the stock market work”:

Featured snippet providing only a basic answer

Because it gives a basic answer to the question, most searchers will probably want to know more. 

That is most likely why, on average, 82% of searches for this query result in a click.

Example of a keyword with high Clicks Per Search

The takeaway here is that targeting keywords with a low number of Clicks Per Search (CPS) is rarely a good idea.

Pay attention to this when researching keywords in Ahrefs’ Keywords Explorer.

CPS column in Ahrefs' Keywords Explorer

Featured snippets as superb branding opportunities

Clicks aside, featured snippets are the first thing that users see in the search results if there are no search ads. They’re even more prominent on mobile devices where they’re often the only thing people initially see:

Featured snippet on mobile

This is a very compelling argument in favor of featured snippets.

Increasing your share of voice on the SERPs is arguably one of the most important SEO KPIs. That’s because brand-building is proven to be the primary driver of long-term growth.

The more your brand is visible on the SERPs for relevant topics, the more you’ll be seen as a market leader.

You can opt out of featured snippets (don’t do that, though)

Cyrus Shepard led the way in experimenting with opting out of featured snippets after the SERP deduplication and discovered that it led to a 12% traffic loss.

That said, if you still want to opt out of featured snippets, Google offers various ways to do that. Just be aware that the nosnippet methods (the robots meta tag and the data-nosnippet attribute) also block your content from appearing in traditional “blue link” snippets. I don’t recommend using those because Google could then only use your hard-coded title tag and meta description.

So the best way to remove your page from featured snippets is to use the max-snippet robots meta tag. This tag specifies the maximum number of characters Google can show in your text snippets.

And because featured snippets are longer than the descriptions in regular snippets, you can cap the snippet length just above the usual maximum length of meta descriptions, which is around 160 characters.

You’ll just have to paste this code snippet into the <head> section of the page that you wish to remove from the featured snippets:

<meta name="robots" content="max-snippet:170">

While this method doesn’t guarantee not appearing in shorter featured snippets, it still outweighs the cons of using the more restrictive methods.
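If you can’t edit the page’s HTML, Google also accepts the same robots directives as an HTTP response header. The raw header line would look like this (how you set it depends on your server configuration):

```
X-Robots-Tag: max-snippet:170
```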

Recommendation

If you’re thinking of opting out, it pays to first check which position your page would rank for the keyword without owning the featured snippet.

For example, here’s a featured snippet that we own:

Featured snippet for the keyword "h1 tag"

If we append “&num=9” to the Google search results URL (e.g., https://www.google.com/search?q=h1+tag&num=9), preferably in Incognito mode, we can see where we’d rank if we weren’t in the snippet:

Seeing the true position of the featured snippet page

In this case, if we decided to opt out, we would be in the second or third position, depending on which page would take over the featured snippet (you’ll see how to check that later).

Being in lower positions and opting out can hurt your traffic. You’ve been warned. 

How to find and optimize featured snippets that you already own

Google Search Console doesn’t show any information regarding featured snippets. You’ll have to use third-party tools like Ahrefs’ Site Explorer to dig into them.

Let’s stick with Site Explorer. Paste in your site, then head to the Organic keywords report to see the keywords you rank for, then filter only for those where Google shows your page in the featured snippet:

Filtering for your own featured snippets

As you can see above, Ahrefs’ domain currently ranks for 1,042 keywords with featured snippets in the U.S. 

In the previous version of this article, I recommended filtering for keywords with the highest search volume and checking the most important featured snippets manually. That’s because Google sometimes pulls content that isn’t optimal, and you’d want these important featured snippets to be perfect.

However, Google keeps improving. This time, I didn’t find a single keyword where I’d bother editing the section Google pulls the snippet from.

While you may come across featured snippets that can do with a bit of polishing, I don’t recommend editing things unless Google pulls poorly formatted, misleading, or just plain wrong information.

It’s better to own an imperfect featured snippet than to risk losing it to a competitor by revising it.

How to get more featured snippets

Winning more featured snippets is a simple way to potentially increase organic traffic to your site. Below, we’ll discuss a few ways to do that.

Leverage content that you already have and rank for

Here, we’ll be looking at pages that already rank in the top 10 for a particular term yet don’t own the snippet. It’s possible to win the snippet just by making a few tweaks to your page.

How to find these opportunities? It’s easy.

Go to Site Explorer and filter keywords that trigger featured snippets where your website is ranking in positions #2–10.

Checking featured snippet opportunities

This is an easy way to filter out the vast majority, if not all, of the featured snippets that you already own, since those predominantly rank in the first position. There are cases where they appear in lower positions, but that’s rare these days. In fact, all of our 1,042 featured snippets rank in the first position.

In other words, we now have a list of low-hanging opportunities to steal featured snippets from your competitors. Let’s get you prepared for the heist.

We need to prioritize. Stealing 7,064 featured snippets at once is mission impossible.

I reduced the list to just 21 keywords by prioritizing those with higher search volumes where we rank in positions #2–5.

Filtering down featured snippet opportunities

Now things look much more manageable.

The search volume filter is an obvious one, as there’s no point in targeting long-tail keywords at this stage. As for the position filter, our study showed that the higher a page ranks for a search query, the more likely it is to own the featured snippet.

Again, these filters will be different for you. However, if you don’t already rank for a substantial number of keywords, I suggest focusing on creating more great content and building links first.

So we’ve got the list. What’s the battle plan?

In our case, I’ll prioritize further by manually checking for keywords with solid business value. Let’s take a look at some of those keywords:

Keywords with good featured snippet opportunities

For example, the search query “most searched thing on google” at the top is less valuable for us than “seo content” at the bottom even though the first has twice the search volume. People who want to learn about creating search-optimized content are much more likely to become our customers one day.

Taking that “seo content” query into account, this is what I see:

Competing featured snippet example

The first thing I’ll do here is check whether our page even qualifies for the featured snippet at the moment. That can dictate how big a change we need to make. You do that by excluding the domain that owns the current featured snippet with the minus (-) search operator combined with site: (e.g., searching for “seo content -site:coursera.org”).

Checking the featured snippet queue

In this case, there’s no other page in the featured snippet “queue,” which is an indicator that we currently don’t provide a good, short answer to the search query in the eyes of Google.

Just so you know, here’s an example of a featured snippet that has other eligible pages in line:

Example of a featured snippet with a queue

After excluding the Coursera domain, we can see what Google considers as the second-best option:

Second featured snippet in line

And you can go on to even see the third domain in line, and so on. But back to optimizing for the “seo content” featured snippet.

Competing featured snippet

We can clearly tell that a short, definition-style paragraph is the way to go here. Let’s check what we have in our content:

Featured snippet content section to be optimized

So the appropriate section exists; that’s a check. An interesting thing here is that Google ranks a page that targets the keyword in reverse order. Let’s see if other pages qualified for ranking there in the past by opening that keyword in Ahrefs’ Keywords Explorer and scrolling down to the Position history:

Position history in Ahrefs' Keywords Explorer

I filtered only for URLs that held the featured snippet at some point in the past two years. We can see that the rest target “seo content” in the original word order, and that Backlinko claimed the first position for a long time. But we need to check whether Google was even showing the featured snippet back then.

You do that by scrolling further down in Keywords Explorer to the SERP overview. Select a date where you want to investigate the SERP for comparison. In this case, I need any SERP between July and September 2021:

Historical SERP overview for the keyword "seo content"

There it is: The featured snippet was there, claimed by another page. The last thing I need here is to check the section that was ranking back then by opening the URL on Archive.org after clicking on the caret:

Checking a page on Archive.org

And selecting a screenshot of that page during the time it was ranking for the featured snippet:

Historically ranking featured snippet section

We see three rather different definitions. There’s definitely room for the featured snippet optimization. I’d make our definition a bit longer, change the second sentence, and fit in the mention of keywords because I think that’s important. I’d change it from:

SEO content is content designed to rank in search engines. It could be a blog post, product or landing page, interactive tool, or something else.

To something like this:

SEO content is content designed to rank high in search engines for a specific keyword. Creating it requires researching and covering what searchers would find valuable.

I can honestly say that I feel this definition is superior to the competing ones. That should be your ultimate goal when it comes to optimizing for featured snippets regardless of the format.

This was quite an interesting example. One last thing to note here is that your snippet-worthy information needs to be formatted in a way that Google can easily parse, understand, and interpret. A good rule of thumb is that if the reader comes across that information easily, then Google should be able to as well.

Create new content with featured snippets in mind

Let’s make one thing clear from the start: Scoring a featured snippet should be just the icing on the cake, not the main purpose of why and how you cover a certain topic.

The prerequisite for winning the featured snippet is ranking well, so that should still be the focus. For this reason, I investigate potential featured snippet opportunities only after selecting a topic.

Since the major factor of being successful in SEO is aligning with the search intent, you should always analyze the competing pages on the SERP. Let’s take our main topic here as an example because it doesn’t get better than optimizing content to win featured snippets for “featured snippets” keywords.

I have my “featured snippets” topic, and you should select yours based on your keyword research. Look it up in Ahrefs’ Keywords Explorer and scroll down to the SERP overview:

SERP overview for the keyword "featured snippet"

I see that the main keyword triggers a featured snippet, so I’m in the difficult position of trying to dethrone Google there:

Featured snippet for the keyword "featured snippets"

Honestly, this is a case of a bad featured snippet. It doesn’t really provide value to the searcher. I don’t learn what a featured snippet is or how it works. Google has a clear advantage here, having coined the term, so it’s almost a branded search. But I’ll try my best to create the definition-style paragraph that I think searchers want to see.

We already went through the process of creating content for the “seo content” featured snippet, so this is just a rinse-and-repeat process—provide the best answer possible using a suitable format.

Since pages can rank for thousands of keywords, there are naturally many more featured snippet opportunities than just the one for the main keyword. The easiest way to check these is to click through a few top-ranking pages to see all the keywords they rank for: 

Checking organic keywords of the top-ranking page

And filter the report for keywords that trigger featured snippets and have a certain minimum search volume to make it worthwhile (as we’ve already shown earlier). I also included a “1–20” Position filter to make the list as relevant as possible:

Checking other featured snippets opportunities

Some of those keywords will be almost identical, sharing the same search intent and featured snippet. I don’t need to check the featured snippets for keywords like “snippet google” or “what is a featured snippet” because the answer, and how you optimize for it, remains the same.

We’re looking for keywords that can trigger different featured snippets and are aligned with sections we cover in the article. There are a bunch of these opportunities around optimizing and getting featured snippets:

Other featured snippet opportunities

Look them up and see what Google shows there:

Featured snippet for the keyword "how to get featured snippet"

So if I want to have a chance to rank for this, I should include a straight-to-the-point paragraph on how to get a featured snippet instead of explaining the whole process across many pages. This looks like something that can fit nicely into the “Final thoughts” section to sum it up, so I’ll do that.

And since different pages rank for different keywords, it pays off to repeat this process for one to two more top-ranking pages. I found that I should also optimize for the “types of featured snippets” keyword here.

Even if you don’t end up winning the featured snippets, you’re still answering searchers’ questions in the best way possible. That in itself is critical to your content’s success on the SERPs.

Here are a few copywriting tips for winning featured snippets to wrap this section up. You should:

  1. Format and structure your content correctly (H1–H6, etc.).
  2. Try to avoid overcomplicated sentences. Succinct explanations win.
  3. Use the language of your audience. In the end, Google uses featured snippets as answers in voice search.
  4. Use the ”inverted pyramid” method (where it makes sense).
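Putting the first two tips together, a snippet-friendly section pairs a clear heading with a direct answer in the first sentence. A hypothetical markup sketch:

```html
<h2>How do you get a featured snippet?</h2>
<p>To win a featured snippet, rank on the first page for the keyword,
then answer the query directly and succinctly in the first sentence
of the matching section.</p>
```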

pro tip

If your content includes sections that contain a sequence of steps to achieve a certain result or you have FAQ sections, use appropriate schema markup to highlight these structured sections for Google.

First, it’s a good idea to do so regardless of featured snippets because it can enhance your plain search result into a rich snippet. But I’ve also seen such pages dominate the combined featured snippets with PAA boxes where everything was from a single source. 
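As a rough sketch, FAQ schema markup for one question looks like this (the question and answer text are placeholders; it goes in a script tag on the page that shows the FAQ):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is a featured snippet?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "A featured snippet is a highlighted answer box shown at the top of Google's search results."
    }
  }]
}
</script>
```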

How to keep track of your featured snippets

Getting a featured snippet is equivalent to ranking first for a keyword. You may already be tracking keyword positions, so let me help you expand that to tracking featured snippets.

Enter Ahrefs’ Rank Tracker.

First of all, I track all important keywords regardless of their SERP features. But we can begin by adding the most important keywords that also trigger featured snippets.

You can do that in a few clicks through the Organic keywords report we’ve already shown multiple times here. You just have to create a Rank Tracking project first for it to appear here:

Adding keywords triggering featured snippets to Ahrefs' Rank Tracker

You’re all set to see when you win or lose a featured snippet. Go to the Rank Tracker’s Overview report, click on the “SERP features” tab, and check the “Featured snippet” row:

Checking SERP featured in Rank Tracker

As you can see, from the keywords I’m tracking, the project lost eight featured snippets, while 12 new ones appeared on the SERP over the tracked time period (last 30 days). 

Here are the key parts to keep an eye on:

  1. Number of featured snippets you currently own (plus the +/- change in the selected period)
  2. Number of featured snippets in total for the keywords you’re tracking (plus the +/- change for the period)
  3. Percentage of all the featured snippets among the tracked keywords that you own (9%, in this example)

You can also change the view from “all tracked features” to “featured snippets” to see your progress over time:

Progress of featured snippets in Rank Tracker

To delve deeper into the specifics on the keyword level, select the “Featured snippet” filter:

Filtering for keywords that only trigger featured snippets

And scroll down to the keywords list to see the time comparison data (30 days, in this example):

Featured snippet changes over the past 30 days

We can see that the top keyword is among our new featured snippets. But it is more helpful to isolate the featured snippet movements only.

To isolate the winning cases, we’ll need to apply two filters:

  • Position – Improved (you rank higher than at the start of your selected period).
  • SERP features – You rank for the featured snippet.

Filtering for won featured snippets

Again, scroll down and see the featured snippet winners of the month (or whatever period you choose):

Won featured snippets

To see lost featured snippets, just apply the reverse filters: a decline in positions within the top 10, and only featured snippets that you don’t own. Unfortunately, you can’t currently isolate cases where you lost the snippet itself, so you’ll see all declines in the top 10.

Look for keywords that dropped from the first position, like these first two:

Lost featured snippets

You may want to consider checking the position drops regardless of featured snippets anyway. Sort the table by traffic and pay attention to huge traffic drops. 

Final thoughts

You should now know everything necessary to win those coveted SERP jumps to the first position. To sum it up:

Optimizing for featured snippets is about providing a brief and valuable answer to the search query in the most suitable format. Getting the featured snippet involves following all the best SEO practices to make the content rank well for the target keyword.

If you have any comments or questions, don’t hesitate to ping me on Twitter.


