
How HTTP/3 Helps Feed SEO’s Need For Speed


The evolution of the web never stands still.

As new technologies are developed, consumer behaviors change and the core infrastructure that underpins the internet is forced to adapt.

The HTTP protocol – used to transfer data between client and server – has gone through a number of different iterations, all of which have enhanced the core functionality with new and exciting features.

After an 18-year gap between the adoption of HTTP/1.1 in 1997 and HTTP/2 in 2015, development has picked up the pace, with the draft proposal for HTTP/3 submitted merely three years later.

What Is HTTP/3?

At its core, HTTP/3 is an overhaul of the underlying transport layer used to manage file transfers.

It represents a move away from TCP (Transmission Control Protocol) to UDP (User Datagram Protocol), addressing several TCP limitations and improving performance and security for users.

Although it’s still waiting for final review before publication, 73% of web browsers already support the protocol.

This number will significantly increase once Safari makes it a core feature; currently, it’s experimental and has to be enabled via the developer menu.

Screenshot of HTTP/3 browser support from Caniuse.com, April 2022

The HTTP/3 protocol is already used by 25% of the top 10 million websites, including Google and Facebook.

In fact, if you’re using technologies like Google Analytics, Tag Manager, or Fonts, you’re already partially utilizing the protocol.

What Are HTTP/3’s Main Advantages Over HTTP/2 And HTTP/1?

To fully appreciate the advantages of HTTP/3, it’s worth stepping back to understand how HTTP/1.1 worked, and the problems HTTP/2 was designed to solve.

When being sent, files (HTML, JS, CSS, images, etc.) are broken down into smaller, individual packets with the data transmitted over time.

HTTP/1.1 was designed to give each file its own connection. As websites became increasingly complex, more files were needed to load each page.

Total website requests over time. Image from HTTP Archive, April 2022

Browsers limit the number of parallel connections available, creating a bottleneck and slowing loading times. This resulted in several necessary workarounds to maximize performance, such as domain sharding and image sprites.

By introducing multiplexing, HTTP/2 solved the problem caused by connection limits, allowing multiple files to be transferred over a single connection.

The other major improvement was the introduction of better header compression, alongside a few other features that have proved less successful in practice (see Ruth’s excellent HTTP/2 guide for more details).

Yet these enhancements didn’t fix all of the problems with the TCP protocol.

TCP delivers packets strictly in order, meaning that if a packet is lost, the entire connection is held up until that packet is successfully received. This problem, known as head-of-line blocking, negates some of the benefits of multiplexing.

Another challenge with TCP is that it's entirely detached from the TLS protocol.

This is by design, as sites can be both secure and insecure.

As a result, a server and client must make multiple round trips to negotiate a connection before transmitting data.

How Does HTTP/3 Solve These Problems?

By moving from TCP to UDP, HTTP/3 introduces three main features that set it apart from HTTP/1.1 and HTTP/2.

Independent Byte Streams

HTTP/3 solves head-of-line blocking by introducing independent byte streams for individual files. Only the data for an individual stream is blocked while the lost packet is resent, not the entire connection.

To illustrate this further, it’s worth thinking back to the fantastic truck analogy Tom Anthony used in his seminal presentation on HTTP/2 (now updated for HTTP/3).

The basic premise is that with HTTP/1.1, you end up with multiple trucks queuing to go on the same road (connection).

Screenshot from @TomAnthonySEO, An introduction to HTTP/3, April 2022

In contrast, HTTP/2 allows multiple trucks to be in the same lane simultaneously.

Screenshot from @TomAnthonySEO, An introduction to HTTP/3, April 2022

Unfortunately, with TCP, if a truck stalls, the entire road is blocked until the truck starts moving again.

Screenshot from @TomAnthonySEO, An introduction to HTTP/3, April 2022

With HTTP/3 and UDP, the other trucks can just drive around it.

TLS Integration

Because HTTP/3 incorporates TLS 1.3 into the protocol itself, rather than having two distinct protocols operating independently, only a single handshake is required, reducing the number of round trips from two (or three if using TLS 1.2) to one.

This change means faster – and more secure – connections for users.

One consequence of this change is that HTTP/3 can only be used on a secure site, because encryption is baked into the transport itself. Interestingly, this wasn't the case with HTTP/2, which can technically be used on an insecure site – although none of the major browsers allow you to do so.

Connection Migration

Rather than using IPs to route packets, HTTP/3 instead uses connection IDs.

By doing so, it can handle network changes without the need to re-establish a connection.

This is hugely advantageous in a mobile-first world, where users frequently switch between Wi-Fi and cellular networks, improving both speed and connection stability.

Going back to our truck analogy, switching networks over TCP is like coming to a junction and having to queue again before you can move on to the next road.

With HTTP/3, there's a slip road, allowing you to move between the two seamlessly.

Does HTTP/3 Have Any Disadvantages?

Although HTTP/3 has some clear performance benefits, its detractors have emphasized several disadvantages.

First, the protocol will provide limited benefit to users on fast connections, with the slowest 1% to 10% seeing most of the gains.

But, as far as Core Web Vitals go, this could actually be very beneficial.

CWV scores are aggregated globally, so it's entirely possible for a specific subset of users in a distant geographic location to pull them down.

Equally, in a mobile-first world, even users with fast devices and close geographic proximity can suffer from temporary network issues, which may have an adverse effect on CWV.

The more mobile your users, the higher the probability of this having an impact.

Another complaint is that switching to HTTP/3 requires a fairly major server upgrade because it fundamentally changes how the transport layer works.

Additionally, the use of UDP introduces higher CPU requirements, which may put more pressure on servers.

Both arguments are fair, but CPU usage is currently being optimized.

Also, as we’ll see in the implementation section below, many CDN providers are already providing relatively simple HTTP/3 solutions that can easily be deployed at the edge.

Does HTTP/3 Matter For SEO?

While Googlebot has supported HTTP/2 since November 2020, with half of all URLs now crawled using the protocol, it doesn't currently support HTTP/3.

HTTP/2 is only used when there is a clear benefit to doing so, i.e., when using HTTP/2 will lead to significant resource savings for both servers and Googlebot.

This will undoubtedly continue to ramp up over time, but given the five-year gap between the publication of the HTTP/2 protocol and Googlebot support, HTTP/3 support is likely still a way off.

That said, implementing HTTP/3 could still have an indirect SEO impact – if supporting the protocol leads to better Core Web Vitals scores.

Upgrading your server infrastructure to support HTTP/3 – or, for that matter, HTTP/2 – is just one of many potential enhancements that you can leverage to ensure your website is as performant as possible.

And the benefits of having a performant website, including reduced bounce rates, increased time on site, and higher conversion rates, extend beyond SEO.

To see which protocol Googlebot is using to crawl a site, you can look for a notification in Google Search Console (GSC) or check Googlebot requests within your server access logs.

While formats vary, the protocol used is commonly listed in the HTTP request found within quotation marks, alongside the request method and URL path.

50.56.92.47 [18/Apr/2022:10:00:00 -0100] "GET /seo/technical-seo-auditing/ HTTP/1.1" 200 684 "https://moz.com/" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

Example of an Apache request (Combined Log Format).
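If you'd rather not scan the logs by hand, a short script can tally which protocol versions Googlebot is using. The following is a minimal sketch in Python, assuming an Apache-style combined log saved as access.log (the filename and the simple "Googlebot" substring filter are assumptions; adjust them for your own setup).

import re
from collections import Counter

protocol_counts = Counter()

with open("access.log") as log_file:  # assumed filename; point this at your access log
    for line in log_file:
        if "Googlebot" not in line:  # crude user-agent filter for Googlebot requests
            continue
        # The quoted request looks like: "GET /path HTTP/1.1"
        match = re.search(r'"[A-Z]+ \S+ (HTTP/[\d.]+)"', line)
        if match:
            protocol_counts[match.group(1)] += 1

print(protocol_counts)  # e.g. Counter({'HTTP/1.1': 812, 'HTTP/2.0': 388})

Comparing these counts over a few weeks makes it easy to see how much of Googlebot's crawling has shifted between protocol versions.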

How To Check If A Website Supports HTTP/3

If you're unsure whether or not a website supports HTTP/3, you can check using an online tool like https://http3check.net/

Screenshot of http3check.net, April 2022

Alternatively, both Chrome and Firefox display the protocol per request within the dev tools network tab.

The protocol column isn't visible by default but can be enabled by right-clicking on the column headers in the network tab and selecting "Protocol." HTTP/3 requests are labeled "h3."

Screenshot of the Chrome network tab showing HTTP/3 (h3) requests, April 2022

It’s also possible to check using the command line and curl.

curl --http3 https://website.com/

As many sites only have HTTP/3 enabled for page resources (usually those hosted on a CDN), using dev tools will give a more accurate picture and allow you to better assess the available opportunities.
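Note that the --http3 flag is only available if your curl build was compiled with HTTP/3 support. Another rough signal is the Alt-Svc response header, which servers use to advertise HTTP/3 ("h3") availability to clients. Below is a minimal sketch using Python's standard library; website.com is a placeholder matching the curl example above, and this only shows that HTTP/3 is advertised, not that your connection actually negotiated it.

import urllib.request

# Placeholder URL; swap in the site you want to check.
request = urllib.request.Request("https://website.com/", method="HEAD")

with urllib.request.urlopen(request) as response:
    alt_svc = response.headers.get("alt-svc")

if alt_svc and "h3" in alt_svc:
    print("HTTP/3 advertised via Alt-Svc:", alt_svc)
else:
    print("No HTTP/3 advertisement found in the Alt-Svc header")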

How Can I Implement HTTP/3?

The easiest way by far to enable HTTP/3 is via a CDN.

Several major providers, including Cloudflare, Google Cloud, and Fastly already support the protocol.

According to W3Techs, 22% of the top 10 million websites use Cloudflare, where you can easily enable HTTP/3 in the dashboard.

Screenshot of the Cloudflare dashboard, April 2022
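If you manage Cloudflare programmatically, the same toggle is also exposed as an http3 zone setting in Cloudflare's API. The sketch below, using Python's standard library, assumes the zones/{zone_id}/settings/http3 endpoint; the zone ID and API token are placeholders, and it's worth confirming the endpoint and required permissions against Cloudflare's current API docs before relying on it.

import json
import urllib.request

ZONE_ID = "YOUR_ZONE_ID"      # placeholder: your Cloudflare zone ID
API_TOKEN = "YOUR_API_TOKEN"  # placeholder: token with permission to edit zone settings

request = urllib.request.Request(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/settings/http3",
    data=json.dumps({"value": "on"}).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="PATCH",
)

with urllib.request.urlopen(request) as response:
    print(json.load(response))  # a successful response echoes the http3 setting with "value": "on"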

If you’re unsure what tech stack you’re dealing with, use Builtwith or Wappalyzer and see if a CDN is listed.

Screenshot of Wappalyzer, April 2022

If a site is using Cloudflare and all of the requests are HTTP/2, you’ve found an easy and impactful recommendation to make.

If implementation via a CDN isn’t possible, a server change is required.

Various implementations are available, depending on the language used, but web servers haven’t universally adopted these.

Therefore, the feasibility of implementing HTTP/3 is likely to depend on the type of software you’re using.

Server HTTP/3 support, April 2022

Unfortunately, although 32% of web servers run Apache, the project has yet to begin working on HTTP/3 support due to limited developer resources.

Similarly, enabling the protocol on Node requires a workaround due to the lack of OpenSSL support.

Windows (IIS) is the latest provider to offer the protocol natively, but it requires Windows Server 2022 and Windows 11 or later.

Wrapping Up

HTTP/3 is another significant step forward for the web and will provide a much-needed performance boost to support its continuing evolution.

As SEO and digital marketing professionals, we should be aware of the benefits the protocol brings ahead of its imminent publication, so we can start recommending its use and allow our users to reap the benefits for years to come.



Featured Image: VectorHot/Shutterstock




Google Rolls Out New ‘Web’ Filter For Search Results


Google is introducing a filter that allows you to view only text-based webpages in search results.

The “Web” filter, rolling out globally over the next two days, addresses demand from searchers who prefer a stripped-down, simplified view of search results.

Danny Sullivan, Google’s Search Liaison, states in an announcement:

“We’ve added this after hearing from some that there are times when they’d prefer to just see links to web pages in their search results, such as if they’re looking for longer-form text documents, using a device with limited internet access, or those who just prefer text-based results shown separately from search features.”

The new functionality is a throwback to when search results were more straightforward. Now, they often combine rich media like images, videos, and shopping ads alongside the traditional list of web links.

How It Works

On mobile devices, the “Web” filter will be displayed alongside other filter options like “Images” and “News.”

Screenshot from: twitter.com/GoogleSearchLiaison, May 2024.

If Google’s systems don’t automatically surface it based on the search query, desktop users may need to select “More” to access it.

Screenshot from: twitter.com/GoogleSearchLiaison, May 2024.

More About Google Search Filters

Google’s search filters allow you to narrow results by type. The options displayed are dynamically generated based on your search query and what Google’s systems determine could be most relevant.

The “All Filters” option provides access to filters that are not shown automatically.

Alongside filters, Google also displays “Topics” – suggested related terms that can further refine or expand a user’s original query into new areas of exploration.

For more about Google’s search filters, see its official help page.


Featured Image: egaranugrah/Shutterstock





Why Google Can’t Tell You About Every Ranking Drop


In a recent Twitter exchange, Google’s Search Liaison, Danny Sullivan, provided insight into how the search engine handles algorithmic spam actions and ranking drops.

The discussion was sparked by a website owner’s complaint about a significant traffic loss and the inability to request a manual review.

Sullivan clarified that a site could be affected by an algorithmic spam action or simply not ranking well due to other factors.

He emphasized that many sites experiencing ranking drops mistakenly attribute it to an algorithmic spam action when that may not be the case.

“I’ve looked at many sites where people have complained about losing rankings and decide they have a algorithmic spam action against them, but they don’t. “

Sullivan’s full statement will help you understand Google’s transparency challenges.

Additionally, he explains why the desire for manual review to override automated rankings may be misguided.

Challenges In Transparency & Manual Intervention

Sullivan acknowledged the idea of providing more transparency in Search Console, potentially notifying site owners of algorithmic actions similar to manual actions.

However, he highlighted two key challenges:

  1. Revealing algorithmic spam indicators could allow bad actors to game the system.
  2. Algorithmic actions are not site-specific and cannot be manually lifted.

Sullivan expressed sympathy for the frustration of not knowing the cause of a traffic drop and the inability to communicate with someone about it.

However, he cautioned against the desire for a manual intervention to override the automated systems’ rankings.

Sullivan states:

“…you don’t really want to think “Oh, I just wish I had a manual action, that would be so much easier.” You really don’t want your individual site coming the attention of our spam analysts. First, it’s not like manual actions are somehow instantly processed. Second, it’s just something we know about a site going forward, especially if it says it has change but hasn’t really.”

Determining Content Helpfulness & Reliability

Moving beyond spam, Sullivan discussed various systems that assess the helpfulness, usefulness, and reliability of individual content and sites.

He acknowledged that these systems are imperfect and some high-quality sites may not be recognized as well as they should be.

“Some of them ranking really well. But they’ve moved down a bit in small positions enough that the traffic drop is notable. They assume they have fundamental issues but don’t, really — which is why we added a whole section about this to our debugging traffic drops page.”

Sullivan revealed ongoing discussions about providing more indicators in Search Console to help creators understand their content’s performance.

“Another thing I’ve been discussing, and I’m not alone in this, is could we do more in Search Console to show some of these indicators. This is all challenging similar to all the stuff I said about spam, about how not wanting to let the systems get gamed, and also how there’s then no button we would push that’s like “actually more useful than our automated systems think — rank it better!” But maybe there’s a way we can find to share more, in a way that helps everyone and coupled with better guidance, would help creators.”

Advocacy For Small Publishers & Positive Progress

In response to a suggestion from Brandon Saltalamacchia, founder of RetroDodo, about manually reviewing “good” sites and providing guidance, Sullivan shared his thoughts on potential solutions.

He mentioned exploring ideas such as self-declaration through structured data for small publishers and learning from that information to make positive changes.

“I have some thoughts I’ve been exploring and proposing on what we might do with small publishers and self-declaring with structured data and how we might learn from that and use that in various ways. Which is getting way ahead of myself and the usual no promises but yes, I think and hope for ways to move ahead more positively.”

Sullivan said he can’t make promises or implement changes overnight, but he expressed hope for finding ways to move forward positively.


Featured Image: Tero Vesalainen/Shutterstock





56 Google Search Statistics to Bookmark for 2024


If you’re curious about the state of Google search in 2024, look no further.

Each year we pick, vet, and categorize a list of up-to-date statistics to give you insights from trusted sources on Google search trends.

General Google Search Statistics

  1. Google has a web index of "about 400 billion documents". (The Capitol Forum)
  2. Google’s search index is over 100 million gigabytes in size. (Google)
  3. There are an estimated 3.5 billion searches on Google each day. (Internet Live Stats)
  4. 61.5% of desktop searches and 34.4% of mobile searches result in no clicks. (SparkToro)
  5. 15% of all Google searches have never been searched before. (Google)
  6. 94.74% of keywords get 10 monthly searches or fewer. (Ahrefs)
  7. The most searched keyword in the US and globally is “YouTube,” and youtube.com gets the most traffic from Google. (Ahrefs)
  8. 96.55% of all pages get zero search traffic from Google. (Ahrefs)
  9. 50-65% of all number-one spots are dominated by featured snippets. (Authority Hacker)
  10. Reddit is the most popular domain for product review queries. (Detailed)

Google Market Share And Usage Statistics

  1. Google is the most used search engine in the world, with a mobile market share of 95.32% and a desktop market share of 81.95%. (Statista)
  2. Google.com generated 84.2 billion visits a month in 2023. (Statista)
  3. Google generated $307.4 billion in revenue in 2023. (Alphabet Investor Relations)
  4. 63.41% of all US web traffic referrals come from Google. (SparkToro)
  5. 92.96% of global traffic comes from Google Search, Google Images, and Google Maps. (SparkToro)
  6. Only 49% of Gen Z women use Google as their search engine. The rest use TikTok. (Search Engine Land)

Mobile And Local Search Statistics

  1. 58.67% of all website traffic worldwide comes from mobile phones. (Statista)
  2. 57% of local search queries are submitted using a mobile device or tablet. (ReviewTrackers)
  3. 51% of smartphone users have discovered a new company or product when conducting a search on their smartphones. (Think With Google)
  4. 54% of smartphone users search for business hours, and 53% search for directions to local stores. (Think With Google)
  5. 18% of local searches on smartphones lead to a purchase within a day vs. 7% of non-local searches. (Think With Google)
  6. 56% of in-store shoppers used their smartphones to shop or research items while they were in-store. (Think With Google)
  7. 60% of smartphone users have contacted a business directly using the search results (e.g., “click to call” option). (Think With Google)
  8. 63.6% of consumers say they are likely to check reviews on Google before visiting a business location. (ReviewTrackers)
  9. 88% of consumers would use a business that replies to all of its reviews. (BrightLocal)
  10. Customers are 2.7 times more likely to consider a business reputable if they find a complete Business Profile on Google Search and Maps. (Google)
  11. Customers are 70% more likely to visit and 50% more likely to consider purchasing from businesses with a complete Business Profile. (Google)
  12. 76% of people who search on their smartphones for something nearby visit a business within a day. (Think With Google)
  13. 28% of searches for something nearby result in a purchase. (Think With Google)
  14. Mobile searches for "store open near me" (such as "grocery store open near me") have grown by over 250% in the last two years. (Think With Google)

Image Search Statistics

  1. People use Google Lens for 12 billion visual searches a month. (Google)
  2. 50% of online shoppers say images helped them decide what to buy. (Think With Google)
  3. There are an estimated 136 billion indexed images on Google Image Search. (Photutorial)
  4. 15.8% of Google SERPs show images. (Moz)
  5. People click on 3D images almost 50% more than static ones. (Google)

Google Discover Statistics

  1. More than 800 million people use Google Discover monthly to stay updated on their interests. (Google)
  2. 46% of Google Discover URLs are news sites, 44% e-commerce, 7% entertainment, and 2% travel. (Search Engine Journal)
  3. Even though news sites accounted for under 50% of Google Discover URLs, they received 99% of Discover clicks. (Search Engine Journal)
  4. Most Google Discover URLs only receive traffic for three to four days, with most of that traffic occurring one to two days after publishing. (Search Engine Journal)
  5. The clickthrough rate (CTR) for Google Discover is 11%. (Search Engine Journal)

Google Ads Statistics

  1. 91.45% of search volumes in Google Ads Keyword Planner are overestimates. (Ahrefs)
  2. For every $1 a business spends on Google Ads, they receive $8 in profit through Google Search and Ads. (Google)
  3. Google removed 5.5 billion ads, suspended 12.7 million advertiser accounts, restricted over 6.9 billion ads, and restricted ads from showing up on 2.1 billion publisher pages in 2023. (Google)
  4. The average shopping click-through rate (CTR) across all industries is 0.86% for Google Ads. (Wordstream)
  5. The average shopping cost per click (CPC) across all industries is $0.66 for Google Ads. (Wordstream)
  6. The average shopping conversion rate (CVR) across all industries is 1.91% for Google Ads. (Wordstream)

Voice Search Statistics

  1. 58% of consumers ages 25-34 use voice search daily. (UpCity)
  2. 16% of people use voice search for local “near me” searches. (UpCity)
  3. 67% of consumers say they’re very likely to use voice search when seeking information. (UpCity)
  4. Active users of the Google Assistant grew 4X over the past year, as of 2019. (Think With Google)
  5. Google Assistant hit 1 billion app installs. (Android Police)

Search Generative Experience (SGE) Statistics

  1. AI-generated answers from SGE were available for 91% of entertainment queries but only 17% of healthcare queries. (Statista)
  2. The AI-generated answers in Google’s Search Generative Experience (SGE) do not match any links from the top 10 Google organic search results 93.8% of the time. (Search Engine Journal)
  3. Google displays a Search Generative element for 86.8% of all search queries. (Authoritas)
  4. 62% of generative links came from sources outside the top 10 ranking organic domains. Only 20.1% of generative URLs directly match an organic URL ranking on page one. (Authoritas)
  5. 70% of SEOs said that they were worried about the impact of SGE on organic search. (Aira)



