
HTTP/2 Rapid Reset DDoS Vulnerability Affects Virtually Any Site


Details have emerged of a new form of DDoS attack that requires relatively minimal resources to launch attacks of unprecedented scale, making it a clear danger for websites as server software companies race to release patches that protect against it.

HTTP/2 Rapid Reset Exploit

The vulnerability takes advantage of stream multiplexing, a feature of the HTTP/2 (and HTTP/3) network protocols that allows multiple streams of data to flow between a browser and a server at the same time.

This means that the browser can request multiple resources from a server and get them all returned, rather than having to wait for each resource to download one at a time.

The exploit that was publicly announced by Cloudflare, Amazon Web Services (AWS) and Google is called HTTP/2 Rapid Reset.

The vast majority of modern web servers use the HTTP/2 network protocol.

Because there is currently no software patch to close the HTTP/2 security hole, virtually every server is vulnerable.

An exploit that is new and has no available mitigation is known as a zero-day exploit.

The good news is that server software companies are working on patches to close the HTTP/2 weakness.

How The HTTP/2 Rapid Reset Vulnerability Works

The HTTP/2 network protocol has a server setting that limits how many concurrent requests (streams) are allowed at any given time.

Requests that exceed that number are denied.

Another feature of the HTTP/2 protocol allows a request to be cancelled, which removes that data stream from the count against the preset limit.

This is a good thing because it frees up the server to turn around and process another data stream.

However, what the attackers discovered is that it’s possible to send millions (yes, millions) of requests and cancellations to a server and overwhelm it.
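
To make the mechanics concrete, here is a minimal, conceptual sketch of a single open-then-cancel cycle using the Python `h2` package (an assumed choice; any HTTP/2 frame library would do). It only builds the frame bytes for one stream to illustrate the pattern described above; it is not a working attack tool.

```python
# Conceptual sketch of the HEADERS-then-RST_STREAM pattern behind Rapid Reset.
# Assumes the `h2` package; builds the frame bytes for ONE stream only.
import h2.connection
import h2.errors

conn = h2.connection.H2Connection()
conn.initiate_connection()

stream_id = 1

# 1. Open a stream: an ordinary GET request (a HEADERS frame).
conn.send_headers(stream_id, [
    (":method", "GET"),
    (":path", "/"),
    (":scheme", "https"),
    (":authority", "example.com"),
])

# 2. Immediately cancel it (an RST_STREAM frame with the CANCEL error code).
#    The cancellation frees the slot counted against the server's concurrent
#    stream limit, which is what the exploit repeats at enormous volume.
conn.reset_stream(stream_id, error_code=h2.errors.ErrorCodes.CANCEL)

frame_bytes = conn.data_to_send()  # the bytes a client would write to the socket
print(len(frame_bytes), "bytes: connection preface + SETTINGS + HEADERS + RST_STREAM")
```

The imbalance is the whole point: each open-then-cancel cycle costs the attacker almost nothing to send, while the server still does the work of setting up and tearing down the stream.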

How Bad Is HTTP/2 Rapid Reset?

The HTTP/2 Rapid Reset exploit is extraordinarily bad because servers currently have no defense against it.

Cloudflare noted that it had blocked a DDoS attack roughly three times larger than the largest attack it had ever seen before.

The largest attack it blocked exceeded 201 million requests per second (RPS).

Google reported a DDoS attack that exceeded 398 million RPS.

But that’s not the full extent of how bad this exploit is.

What makes this exploit even worse is that it takes a relatively trivial amount of resources to launch an attack.

DDoS attacks of this size normally require a botnet of hundreds of thousands to millions of infected computers.

The HTTP/2 Rapid Reset exploit requires as few as 20,000 infected computers to launch attacks that are three times larger than the largest DDoS attacks ever recorded.

That means that the bar is much lower for hackers to gain the ability to launch devastating DDOS attacks.
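
To put those numbers in perspective, here is a purely illustrative back-of-the-envelope calculation; the per-machine rate below is a hypothetical figure, not one taken from the advisories.

```python
# Hypothetical illustration: how a small botnet reaches record-scale volumes
# when each request is cancelled immediately and therefore costs almost nothing.
bots = 20_000              # botnet size cited above
pairs_per_second = 30_000  # HYPOTHETICAL request/cancel pairs per bot per second
total_rps = bots * pairs_per_second
print(f"{total_rps:,} requests per second")  # 600,000,000 -- beyond the 398M record
```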

How To Protect Against HTTP/2 Rapid Reset?

Server software publishers are working to release patches that close the HTTP/2 weakness. Cloudflare says its customers are already protected and don’t have to worry.

Cloudflare advises that in the worst-case scenario, if a server is under attack and defenseless, the server administrator can downgrade the HTTP network protocol to HTTP/1.1.

Downgrading the protocol will stop attackers from continuing this particular attack, though server performance may slow down (which is at least better than being offline).
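
For illustration only, here is a conceptual sketch (not any vendor’s actual patch) of the kind of per-connection heuristic a server operator could apply while waiting for official fixes: track how often a client cancels streams and drop connections that open streams mostly to cancel them. All names and thresholds here are hypothetical.

```python
# Conceptual per-connection heuristic (hypothetical names and thresholds):
# flag clients that open many streams but cancel almost all of them.
from dataclasses import dataclass

RESET_LIMIT = 100          # hypothetical: cancellations tolerated per connection
MIN_COMPLETED_RATIO = 0.1  # hypothetical: minimum completed/opened stream ratio

@dataclass
class ConnectionStats:
    opened: int = 0            # streams the client has opened
    completed: int = 0         # streams that finished normally
    reset_by_client: int = 0   # streams the client cancelled (RST_STREAM)

    def should_close(self) -> bool:
        """Return True when the connection looks like a rapid-reset flood."""
        if self.reset_by_client < RESET_LIMIT:
            return False
        return self.completed / max(self.opened, 1) < MIN_COMPLETED_RATIO

# Example: almost every stream on this connection was cancelled by the client.
stats = ConnectionStats(opened=5_000, completed=3, reset_by_client=4_990)
print(stats.should_close())  # True -> send GOAWAY and drop the connection
```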

Read The Security Bulletins

Cloudflare Blog Post:
HTTP/2 Zero-Day Vulnerability Results in Record-Breaking DDoS Attacks

Google Cloud Security Alert:
Google mitigated the largest DDoS attack to date, peaking above 398 million rps

AWS Security Alert:
CVE-2023-44487 – HTTP/2 Rapid Reset Attack

Featured Image by Shutterstock/Illusmile


Google Rolls Out New ‘Web’ Filter For Search Results


Google is introducing a filter that allows you to view only text-based webpages in search results.

The “Web” filter, rolling out globally over the next two days, addresses demand from searchers who prefer a stripped-down, simplified view of search results.

Danny Sullivan, Google’s Search Liaison, states in an announcement:

“We’ve added this after hearing from some that there are times when they’d prefer to just see links to web pages in their search results, such as if they’re looking for longer-form text documents, using a device with limited internet access, or those who just prefer text-based results shown separately from search features.”

The new functionality is a throwback to when search results were more straightforward. Now, they often combine rich media like images, videos, and shopping ads alongside the traditional list of web links.

How It Works

On mobile devices, the “Web” filter will be displayed alongside other filter options like “Images” and “News.”

Screenshot from: twitter.com/GoogleSearchLiaison, May 2024.

If Google’s systems don’t automatically surface it based on the search query, desktop users may need to select “More” to access it.

Screenshot from: twitter.com/GoogleSearchLiaison, May 2024.

More About Google Search Filters

Google’s search filters allow you to narrow results by type. The options displayed are dynamically generated based on your search query and what Google’s systems determine could be most relevant.

The “All Filters” option provides access to filters that are not shown automatically.

Alongside filters, Google also displays “Topics” – suggested related terms that can further refine or expand a user’s original query into new areas of exploration.

For more about Google’s search filters, see its official help page.


Featured Image: egaranugrah/Shutterstock




Why Google Can’t Tell You About Every Ranking Drop


In a recent Twitter exchange, Google’s Search Liaison, Danny Sullivan, provided insight into how the search engine handles algorithmic spam actions and ranking drops.

The discussion was sparked by a website owner’s complaint about a significant traffic loss and the inability to request a manual review.

Sullivan clarified that a site could be affected by an algorithmic spam action or simply not ranking well due to other factors.

He emphasized that many site owners experiencing ranking drops mistakenly attribute them to an algorithmic spam action when that may not be the case.

“I’ve looked at many sites where people have complained about losing rankings and decide they have a algorithmic spam action against them, but they don’t. “

Sullivan’s full statement will help you understand Google’s transparency challenges.

Additionally, he explains why the desire for manual review to override automated rankings may be misguided.

Challenges In Transparency & Manual Intervention

Sullivan acknowledged the idea of providing more transparency in Search Console, potentially notifying site owners of algorithmic actions similar to manual actions.

However, he highlighted two key challenges:

  1. Revealing algorithmic spam indicators could allow bad actors to game the system.
  2. Algorithmic actions are not site-specific and cannot be manually lifted.

Sullivan expressed sympathy for the frustration of not knowing the cause of a traffic drop and the inability to communicate with someone about it.

However, he cautioned against the desire for a manual intervention to override the automated systems’ rankings.

Sullivan states:

“…you don’t really want to think “Oh, I just wish I had a manual action, that would be so much easier.” You really don’t want your individual site coming the attention of our spam analysts. First, it’s not like manual actions are somehow instantly processed. Second, it’s just something we know about a site going forward, especially if it says it has change but hasn’t really.”

Determining Content Helpfulness & Reliability

Moving beyond spam, Sullivan discussed various systems that assess the helpfulness, usefulness, and reliability of individual content and sites.

He acknowledged that these systems are imperfect and some high-quality sites may not be recognized as well as they should be.

“Some of them ranking really well. But they’ve moved down a bit in small positions enough that the traffic drop is notable. They assume they have fundamental issues but don’t, really — which is why we added a whole section about this to our debugging traffic drops page.”

Sullivan revealed ongoing discussions about providing more indicators in Search Console to help creators understand their content’s performance.

“Another thing I’ve been discussing, and I’m not alone in this, is could we do more in Search Console to show some of these indicators. This is all challenging similar to all the stuff I said about spam, about how not wanting to let the systems get gamed, and also how there’s then no button we would push that’s like “actually more useful than our automated systems think — rank it better!” But maybe there’s a way we can find to share more, in a way that helps everyone and coupled with better guidance, would help creators.”

Advocacy For Small Publishers & Positive Progress

In response to a suggestion from Brandon Saltalamacchia, founder of RetroDodo, about manually reviewing “good” sites and providing guidance, Sullivan shared his thoughts on potential solutions.

He mentioned exploring ideas such as self-declaration through structured data for small publishers and learning from that information to make positive changes.

“I have some thoughts I’ve been exploring and proposing on what we might do with small publishers and self-declaring with structured data and how we might learn from that and use that in various ways. Which is getting way ahead of myself and the usual no promises but yes, I think and hope for ways to move ahead more positively.”

Sullivan said he can’t make promises or implement changes overnight, but he expressed hope for finding ways to move forward positively.


Featured Image: Tero Vesalainen/Shutterstock




56 Google Search Statistics to Bookmark for 2024


If you’re curious about the state of Google search in 2024, look no further.

Each year we pick, vet, and categorize a list of up-to-date statistics to give you insights from trusted sources on Google search trends.

General Google Search Statistics

  1. Google has a web index of “about 400 billion documents”. (The Capitol Forum)
  2. Google’s search index is over 100 million gigabytes in size. (Google)
  3. There are an estimated 3.5 billion searches on Google each day. (Internet Live Stats)
  4. 61.5% of desktop searches and 34.4% of mobile searches result in no clicks. (SparkToro)
  5. 15% of all Google searches have never been searched before. (Google)
  6. 94.74% of keywords get 10 monthly searches or fewer. (Ahrefs)
  7. The most searched keyword in the US and globally is “YouTube,” and youtube.com gets the most traffic from Google. (Ahrefs)
  8. 96.55% of all pages get zero search traffic from Google. (Ahrefs)
  9. 50-65% of all number-one spots are dominated by featured snippets. (Authority Hacker)
  10. Reddit is the most popular domain for product review queries. (Detailed)

Google Market Share & Usage Statistics

  1. Google is the most used search engine in the world, with a mobile market share of 95.32% and a desktop market share of 81.95%. (Statista)
  2. Google.com generated 84.2 billion visits a month in 2023. (Statista)
  3. Google generated $307.4 billion in revenue in 2023. (Alphabet Investor Relations)
  4. 63.41% of all US web traffic referrals come from Google. (SparkToro)
  5. 92.96% of global traffic comes from Google Search, Google Images, and Google Maps. (SparkToro)
  6. Only 49% of Gen Z women use Google as their search engine. The rest use TikTok. (Search Engine Land)

Mobile & Local Search Statistics

  1. 58.67% of all website traffic worldwide comes from mobile phones. (Statista)
  2. 57% of local search queries are submitted using a mobile device or tablet. (ReviewTrackers)
  3. 51% of smartphone users have discovered a new company or product when conducting a search on their smartphones. (Think With Google)
  4. 54% of smartphone users search for business hours, and 53% search for directions to local stores. (Think With Google)
  5. 18% of local searches on smartphones lead to a purchase within a day vs. 7% of non-local searches. (Think With Google)
  6. 56% of in-store shoppers used their smartphones to shop or research items while they were in-store. (Think With Google)
  7. 60% of smartphone users have contacted a business directly using the search results (e.g., “click to call” option). (Think With Google)
  8. 63.6% of consumers say they are likely to check reviews on Google before visiting a business location. (ReviewTrackers)
  9. 88% of consumers would use a business that replies to all of its reviews. (BrightLocal)
  10. Customers are 2.7 times more likely to consider a business reputable if they find a complete Business Profile on Google Search and Maps. (Google)
  11. Customers are 70% more likely to visit and 50% more likely to consider purchasing from businesses with a complete Business Profile. (Google)
  12. 76% of people who search on their smartphones for something nearby visit a business within a day. (Think With Google)
  13. 28% of searches for something nearby result in a purchase. (Think With Google)
  14. Mobile searches for "store open near me" (such as "grocery store open near me") have grown by over 250% in the last two years. (Think With Google)

Visual Search Statistics

  1. People use Google Lens for 12 billion visual searches a month. (Google)
  2. 50% of online shoppers say images helped them decide what to buy. (Think With Google)
  3. There are an estimated 136 billion indexed images on Google Image Search. (Photutorial)
  4. 15.8% of Google SERPs show images. (Moz)
  5. People click on 3D images almost 50% more than static ones. (Google)

Google Discover Statistics

  1. More than 800 million people use Google Discover monthly to stay updated on their interests. (Google)
  2. 46% of Google Discover URLs are news sites, 44% e-commerce, 7% entertainment, and 2% travel. (Search Engine Journal)
  3. Even though news sites accounted for under 50% of Google Discover URLs, they received 99% of Discover clicks. (Search Engine Journal)
  4. Most Google Discover URLs only receive traffic for three to four days, with most of that traffic occurring one to two days after publishing. (Search Engine Journal)
  5. The clickthrough rate (CTR) for Google Discover is 11%. (Search Engine Journal)

Google Ads Statistics

  1. 91.45% of search volumes in Google Ads Keyword Planner are overestimates. (Ahrefs)
  2. For every $1 a business spends on Google Ads, they receive $8 in profit through Google Search and Ads. (Google)
  3. Google removed 5.5 billion ads, suspended 12.7 million advertiser accounts, restricted over 6.9 billion ads, and restricted ads from showing up on 2.1 billion publisher pages in 2023. (Google)
  4. The average shopping click-through rate (CTR) across all industries is 0.86% for Google Ads. (Wordstream)
  5. The average shopping cost per click (CPC) across all industries is $0.66 for Google Ads. (Wordstream)
  6. The average shopping conversion rate (CVR) across all industries is 1.91% for Google Ads. (Wordstream)

Voice Search Statistics

  1. 58% of consumers ages 25-34 use voice search daily. (UpCity)
  2. 16% of people use voice search for local “near me” searches. (UpCity)
  3. 67% of consumers say they’re very likely to use voice search when seeking information. (UpCity)
  4. Active users of the Google Assistant grew 4X over the past year, as of 2019. (Think With Google)
  5. Google Assistant hit 1 billion app installs. (Android Police)

Google SGE Statistics

  1. AI-generated answers from SGE were available for 91% of entertainment queries but only 17% of healthcare queries. (Statista)
  2. The AI-generated answers in Google’s Search Generative Experience (SGE) do not match any links from the top 10 Google organic search results 93.8% of the time. (Search Engine Journal)
  3. Google displays a Search Generative element for 86.8% of all search queries. (Authoritas)
  4. 62% of generative links came from sources outside the top 10 ranking organic domains. Only 20.1% of generative URLs directly match an organic URL ranking on page one. (Authoritas)
  5. 70% of SEOs said that they were worried about the impact of SGE on organic search. (Aira)



