How to Use PageSpeed Insights

PageSpeed Insights (PSI) is a tool provided by Google that analyzes a page’s performance and provides suggestions for improving its speed and user experience.

PageSpeed Insights works by analyzing a webpage’s HTML, CSS, fonts, and JavaScript, and provides suggestions to optimize the performance of the page. This includes things like compressing images, minifying code, and reducing the number of HTTP requests made by the page.

Let’s look at PageSpeed Insights more closely.

How to use PageSpeed Insights

To start, go to PageSpeed Insights. Enter a URL and click “Analyze.”

Enter a URL in PageSpeed Insights

You’ll have the option to switch between the Desktop and Mobile analysis. Mobile scores will usually be worse than Desktop scores, and the Mobile data tends to reveal more issues for you to resolve, so that’s what I recommend looking at.

Select between Mobile or Desktop analysis

Field data

The next section contains data from real users of your website. PageSpeed Insights pulls this from the Chrome User Experience Report (CrUX), which contains data from Chrome users who have opted in to sharing it.

At the top is a tab to switch between page and origin (similar to domain) level data, which aggregates the data for many pages. You may not have data for all pages or even origin data. It depends on how many people visit your website and opt in to sharing this information.

As of April 2023, there are ~29.5 million origins in the CrUX dataset.
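
The same field data is available programmatically through the Chrome UX Report API, which can make the page vs. origin distinction clearer. Here’s a minimal sketch in Python; it assumes you have a Google API key with the CrUX API enabled, and the metric field names reflect the API at the time of writing, so check the docs for the current schema.

```python
# Minimal sketch: query the Chrome UX Report (CrUX) API for page-level or
# origin-level field data. Assumes a Google API key with the CrUX API enabled;
# "requests" is a third-party library (pip install requests).
import requests

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
API_KEY = "YOUR_API_KEY"  # placeholder

def query_crux(target: str, level: str = "url", form_factor: str = "PHONE") -> dict:
    """Fetch field data for one page ("url") or a whole origin ("origin")."""
    body = {level: target, "formFactor": form_factor}
    resp = requests.post(CRUX_ENDPOINT, params={"key": API_KEY}, json=body, timeout=30)
    resp.raise_for_status()
    return resp.json()

# Page-level data may be missing if the page doesn't get enough traffic...
page_data = query_crux("https://example.com/blog/post", level="url")
# ...while origin-level data aggregates real-user data across the whole site.
origin_data = query_crux("https://example.com", level="origin")

for name, metric in origin_data["record"]["metrics"].items():
    print(name, "p75 =", metric["percentiles"]["p75"])
```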

Select between URL or Origin metrics

The next section is all about Core Web Vitals (CWV), including a pass/fail assessment. The main metrics are Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS). These CWV metrics are the ones Google uses in its rankings.

CWV metrics in PageSpeed Insights

The numbers are color coded to show you that green = good, orange = needs improvement, and red = poor. For a metric to be considered “good,” at least 75% of user experiences need to meet the “good” threshold. These are the thresholds:

Metric   Good         Needs improvement    Poor
LCP      ≤ 2500 ms    2500 ms–4000 ms      > 4000 ms
FID      ≤ 100 ms     100 ms–300 ms        > 300 ms
CLS      ≤ 0.1        0.1–0.25             > 0.25
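
If you’re applying these thresholds to data you’ve pulled yourself, the assessment is just a comparison of the 75th-percentile value against the cut-offs above. A minimal sketch in Python:

```python
# Classify a 75th-percentile value as "good", "needs improvement", or "poor"
# using the CWV thresholds from the table above.
THRESHOLDS = {
    "LCP": (2500, 4000),  # milliseconds
    "FID": (100, 300),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout shift score
}

def rate(metric: str, p75: float) -> str:
    good, poor = THRESHOLDS[metric]
    if p75 <= good:
        return "good"
    if p75 <= poor:
        return "needs improvement"
    return "poor"

print(rate("LCP", 2300))  # good
print(rate("FID", 180))   # needs improvement
print(rate("CLS", 0.31))  # poor
```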

If you click “Expand view,” you’ll see the distribution for each metric.

Distribution of CWV metrics

There are additional metrics from the CrUX database that are not currently used in the rankings. These include First Contentful Paint (FCP), Interaction to Next Paint (INP), and Time to First Byte (TTFB). INP will be replacing FID as a CWV metric in March 2024.

Additional metrics from CrUX

The same color coding and 75th-percentile assessment apply to these metrics too. These are the thresholds:

Metric   Good         Needs improvement    Poor
FCP      ≤ 1800 ms    1800 ms–3000 ms      > 3000 ms
INP      ≤ 200 ms     200 ms–500 ms        > 500 ms
TTFB     ≤ 800 ms     800 ms–1800 ms       > 1800 ms

If you click “Expand view,” you’ll see the distribution for each metric.

Distribution of additional CrUX metrics

The last section tells you a bit about where this data comes from. The data is from real user experiences and is a rolling average over a 28-day period.

More info about the CrUX data

Lab data

Lighthouse is an open-source tool for measuring the performance and quality of webpages. It can be run in your own browser, but in the case of PageSpeed Insights, it runs on Google’s servers.

You’ll see scores for Performance, Accessibility, Best Practices, and SEO. All of these are really just checks against best practices, so they give you a rough indicator rather than a complete picture of how well you’re doing in each area.

Scores for Performance, Accessibility, Best Practices, and SEO

Once again, the metrics are color coded to quickly give you an idea of what is good and what you may need to work on. 

For the purpose of this article, we’re going to focus on the “Performance” section, since that is what SEOs use the tool for. First up, you have a performance score and a screenshot of the page.

Performance score in PageSpeed Insights

You’ll be scored between 0 and 100. The current score thresholds are:

  • Good: scores of 90–100
  • Needs improvement: scores of 50–89
  • Poor: scores of 0–49

Keep in mind that you can have a good score but still have a slow page that doesn’t pass CWV. Other factors, such as network conditions, server load, caching, and the user device, also affect page load time.

Only 2% of tested pages score 100. A score of 50 puts you in the top 25% of tested pages. 

The score and metrics can change each time you run a test. This can happen because of network conditions, load, or browsers that make different decisions in the page-loading process. I recommend running three to five tests and averaging the results. 

The score is based on a weighted calculation of several metrics, and the weights differ between Mobile and Desktop. These are the current weights for Mobile, but check the Lighthouse scoring calculator for the latest info.

How Lighthouse scores are computed
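
To make the weighting concrete, here’s an illustrative sketch. Each metric is first converted into its own 0–100 score, and the performance score is the weighted average of those scores. The weights below match the Mobile weights shown in the Lighthouse calculator at the time of writing; treat them as likely to change between Lighthouse versions.

```python
# Illustrative sketch of the Lighthouse performance score: each metric gets
# its own 0-100 score first, then the scores are combined as a weighted
# average. Weights shown are the Lighthouse v10 Mobile weights and may change.
WEIGHTS = {
    "FCP": 0.10,
    "Speed Index": 0.10,
    "LCP": 0.25,
    "TBT": 0.30,
    "CLS": 0.25,
}

def performance_score(metric_scores: dict[str, float]) -> float:
    """metric_scores maps each metric name to its individual 0-100 score."""
    return sum(weight * metric_scores[name] for name, weight in WEIGHTS.items())

# Hypothetical per-metric scores for a single test run:
run = {"FCP": 95, "Speed Index": 90, "LCP": 70, "TBT": 55, "CLS": 100}
print(performance_score(run))  # 77.5
```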

There’s another metric section, this time for the lab test metrics. You’ll find LCP and CLS here but won’t find the FID or INP metrics from CWV. Those require real user interaction, which lab testing doesn’t reproduce. Instead, you can use Total Blocking Time (TBT) as a proxy metric to work on improving.

Lighthouse metrics in PageSpeed Insights

You can also click the “Expand view” button to get an expanded view that includes definitions for the metrics and links with more details.

Additional details for Lighthouse metrics in PageSpeed Insights

The last section tells you a bit about where this data comes from.

Details on the lab test conditions

If you hover over this information, you’ll get even more info on the test conditions. PageSpeed Insights used a Moto G4 as the test device for many years, but it looks like that has changed to a Moto G Power. You can also see the location of the test, which will be North America, Europe, or Asia.

Hover over the data to see even more details on the lab test conditions

There are some snapshots that visually show you how a page loaded over time.

Screenshots of the page loading progression

If you click “View Treemap,” you can find the largest files and how much of the code is unused on the page.

Treemap view of the largest files on the page

By default, you’ll see issues related to all of the metrics. There are buttons where you can filter to issues impacting specific metrics that you may want to improve.

Select a metric to see only issues related to that metric

The “Opportunities” and “Diagnostics” sections will show you issues that can help your page performance.

Opportunities for improvement in PageSpeed Insights
Diagnostics in PageSpeed Insights reveal more information and issues

The estimated savings and improvements shown aren’t always realistic. Other blockers can get in the way, so you may not get the improvement shown, or in some cases any improvement at all, when you fix an issue. Sometimes you have to fix multiple issues to actually see an improvement.

You can click to expand any of the elements. You’ll receive some guidance on how to fix each issue. The recommendations can change based on the system that’s being tested. For instance, I tested a page on our WordPress blog, and I saw WordPress-specific guidance.

Additional guidance for fixing issues

The tips are useful for translating some of the issues into terms you may have heard. For instance, the “Defer offscreen images” issue tells you that you should be lazy loading images. You can then search for a WordPress plugin that handles lazy loading.

There is additional information that shows what the LCP image is, what elements are causing CLS, and what elements are blocking the main thread (what you need to reduce to improve FID/INP). This information can help you target fixes toward these elements.

Find your LCP element
See what layout shifts are happening on your page
Long main thread tasks lead to higher delays and impact FID and INP

There’s also a section for passed audits, showing you where you’re already doing a good job. You may still be able to improve these. But you’re likely better off spending time on other issues.

Passed audits show you what you're doing well

PageSpeed Insights has a great API. It lets you pull the field data from CrUX and the lab data from the Lighthouse test. You can also get page-level CWV data in bulk that can only be accessed via PageSpeed Insights.
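
Here’s a minimal sketch of pulling both the field (CrUX) data and the lab (Lighthouse) data for a single URL in Python. It assumes you’ve created an API key for the PageSpeed Insights API, and the response field names reflect the v5 API at the time of writing:

```python
# Minimal sketch: pull field (CrUX) and lab (Lighthouse) data from the
# PageSpeed Insights API v5. Assumes an API key; "requests" is a third-party
# library (pip install requests).
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
API_KEY = "YOUR_API_KEY"  # placeholder

def run_psi(url: str, strategy: str = "mobile") -> dict:
    params = {"url": url, "strategy": strategy, "key": API_KEY}
    resp = requests.get(PSI_ENDPOINT, params=params, timeout=60)
    resp.raise_for_status()
    return resp.json()

report = run_psi("https://example.com/")

# Lab data: the Lighthouse performance score is reported as 0-1, so scale it.
lab_score = report["lighthouseResult"]["categories"]["performance"]["score"] * 100
print("Lighthouse performance score:", round(lab_score))

# Field data: 75th-percentile LCP from CrUX, if enough real-user data exists.
field = report.get("loadingExperience", {}).get("metrics", {})
lcp = field.get("LARGEST_CONTENTFUL_PAINT_MS", {})
print("Field LCP p75 (ms):", lcp.get("percentile"), "/", lcp.get("category"))
```

Because lab scores fluctuate between runs, you could call run_psi three to five times and average the scores, in line with the advice above.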

The problem is that not everyone has the skills to query the data in bulk and store it. But we make it easy for you in Ahrefs’ Site Audit. Follow the instructions for setting up CWV in the Crawl settings.

Enter your PSI key in Ahrefs' Site Audit

When you run a crawl, we’ll pull the data from PageSpeed Insights into the Performance report, and you can drill into problematic pages.

Data from PSI in Performance report, via Ahrefs' Site Audit

We also show the data from previous crawls, which you can use to monitor performance over time. Additionally, you can see the distribution of affected pages for each individual metric.

CrUX and Lighthouse test history in Ahrefs' Site Audit

For evaluation and monitoring, I’d use Ahrefs, as shown above, to see the problematic pages and trends, as well as the Core Web Vitals report in Google Search Console (GSC).

The benefit of GSC is that it buckets similar URLs. The bucketed pages likely share one system or template, so when you fix the issues, you fix them for all of the pages in that bucket.

Groupings of pages with similar issues in Google Search Console

Once you know what pages or templates you need to work on, you can use the guidance from PageSpeed Insights to make improvements. I’d also recommend checking out our guides on LCP, FID, and CLS for how to fix various issues.

To check the improvements, you can use PageSpeed Insights or run another crawl in Site Audit to get the data in bulk. The CWV data will take longer to show the impact of any changes because it is a 28-day average, so use PSI or the PSI data in Ahrefs to check if the changes you made improved the lab test metrics.

If you have any questions, message me on Twitter.



Reddit Limits Search Engine Access, Google Remains Exception

Reddit has recently tightened its grip on who can access its content, blocking major search engines from indexing recent posts and comments.

This move has sparked discussions in the SEO and digital marketing communities about the future of content accessibility and AI training data.

What’s Happening?

As first reported by 404 Media, Reddit updated its robots.txt file, preventing most web crawlers from accessing its latest content.

Google, however, remains an exception, likely due to a $60 million deal that allows the search giant to use Reddit’s content for AI training.

Brent Csutoras, founder of Search Engine Journal, offers some context:

“Since taking on new investors and starting their pathway to IPO, Reddit has moved away from being open-source and allowing anyone to scrape their content and use their APIs without paying.”

The Google Exception

Currently, Google is the only major search engine able to display recent Reddit results when users search with “site:reddit.com.”

This exclusive access sets Google apart from competitors like Bing and DuckDuckGo.

Why This Matters

For users who rely on appending “Reddit” to their searches to find human-generated answers, this change means they’ll be limited to using Google or search engines that pull from Google’s index.

It presents new challenges for SEO professionals and marketers in monitoring and analyzing discussions on one of the internet’s largest platforms.

The Bigger Picture

Reddit’s move aligns with a broader trend of content creators and platforms seeking compensation for using their data in AI training.

As Csutoras points out:

“Publications, artists, and entertainers have been suing OpenAI and other AI companies, blocking AI companies, and fighting to avoid using public content for AI training.”

What’s Next?

While this development may seem surprising, Csutoras suggests it’s a logical step for Reddit.

He notes:

“It seems smart on Reddit’s part, especially since similar moves in the past have allowed them to IPO and see strong growth for their valuation over the last two years.”


FAQ

What is the recent change Reddit has made regarding content accessibility?

Reddit has updated its robots.txt file to block major search engines from indexing its latest posts and comments. This change exempts Google due to a $60 million deal, allowing Google to use Reddit’s content for AI training purposes.

Why does Google have exclusive access to Reddit’s latest content?

Google has exclusive access to Reddit’s latest content because of a $60 million deal that allows Google to use Reddit’s content for AI training. This agreement sets Google apart from other search engines like Bing and DuckDuckGo, which are unable to index new Reddit posts and comments.

What broader trend does Reddit’s recent move reflect?

Reddit’s decision to limit search engine access aligns with a larger trend where content creators and platforms seek compensation for the use of their data in AI training. Many publications, artists, and entertainers are taking similar actions to either block or demand compensation from AI companies using their content.


Google Cautions On Blocking GoogleOther Bot

Google’s Gary Illyes answered a question about the non-search features that the GoogleOther crawler supports, then added a caution about the consequences of blocking GoogleOther.

What Is GoogleOther?

GoogleOther is a generic crawler created by Google for purposes that fall outside those of the bots that specialize in Search, Ads, Video, Images, News, Desktop, and Mobile crawling. It can be used by internal teams at Google for research and development related to various products.

The official description of GoogleOther is:

“GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.”

Something that may be surprising is that there are actually three kinds of GoogleOther crawlers.

Three Kinds Of GoogleOther Crawlers

  1. GoogleOther
    Generic crawler for public URLs
  2. GoogleOther-Image
    Optimized to crawl public image URLs
  3. GoogleOther-Video
    Optimized to crawl public video URLs

All three GoogleOther crawlers can be used for research and development, which is the only purpose Google publicly acknowledges for all three versions.

What Non-Search Features Does GoogleOther Support?

Google doesn’t say which specific non-search features GoogleOther supports, probably because it doesn’t really “support” a specific feature. It exists for research and development crawling, which could be in support of a new product or an improvement to a current one; it’s a deliberately open and generic purpose.

This is the question Gary read aloud:

“What non-search features does GoogleOther crawling support?”

Gary Illyes answered:

“This is a very topical question, and I think it is a very good question. Besides what’s in the public I don’t have more to share.

GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.

Historically Googlebot was used for this, but that kind of makes things murky and less transparent, so we launched GoogleOther so you have better controls over what your site is crawled for.

That said GoogleOther is not tied to a single product, so opting out of GoogleOther crawling might affect a wide range of things across the Google universe; alas, not Search, search is only Googlebot.”

It Might Affect A Wide Range Of Things

Gary is clear that blocking GoogleOther wouldn’t have an effect on Google Search because Googlebot is the crawler used for indexing content. So if a site owner wants to block any of the three versions of GoogleOther, it should be okay to do that without a negative effect on search rankings.
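
If a site owner does want to block it, the usual mechanism is a robots.txt group that targets the GoogleOther user agents while leaving Googlebot alone. Here’s a minimal sketch that uses Python’s built-in robots.txt parser to sanity-check a hypothetical rule set; the rules shown are an example, not a recommendation:

```python
# Hypothetical robots.txt that blocks the three GoogleOther crawlers while
# leaving Googlebot (and therefore Search) untouched, checked with Python's
# standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

robots_txt = """
User-agent: GoogleOther
User-agent: GoogleOther-Image
User-agent: GoogleOther-Video
Disallow: /

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for agent in ("GoogleOther", "GoogleOther-Image", "GoogleOther-Video", "Googlebot"):
    allowed = parser.can_fetch(agent, "https://example.com/some-page")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```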

But Gary also cautioned about the outcome of blocking GoogleOther, saying that it could affect other products and services across Google. He didn’t state which other products it could affect, nor did he elaborate on the pros and cons of blocking GoogleOther.

Pros And Cons Of Blocking GoogleOther

Whether or not to block GoogleOther doesn’t necessarily have a straightforward answer. There are several considerations that determine whether doing so makes sense.

Pros

Inclusion in research for a future Google product related to search (Maps, Shopping, Images, or a new search feature) could be useful. A site included in that kind of research might be one of the few chosen to test a feature that benefits it, for example one that increases earnings.

Another consideration is that blocking GoogleOther to save server resources isn’t necessarily a valid reason, because GoogleOther doesn’t seem to crawl often enough to make a noticeable impact.

If blocking Google from using site content for AI is a concern, then blocking GoogleOther will have no impact on that at all. GoogleOther has nothing to do with crawling for Google Gemini apps or Vertex AI, including any future products that will be used for training associated language models. The bot for that specific use case is Google-Extended.

Cons

On the other hand, it might not be helpful to allow GoogleOther if it’s being used to test something related to fighting spam and the site has something to hide.

It’s possible that a site owner might not want to participate if GoogleOther comes crawling for market research or for training machine learning models (for internal purposes) that are unrelated to public-facing products like Gemini and Vertex.

Allowing GoogleOther to crawl a site for unknown purposes is like giving Google a blank check to use your site data in any way it sees fit, outside of training public-facing LLMs or the purposes tied to named bots like Googlebot.

Takeaway

Should you block GoogleOther? It’s a coin toss. There are potential benefits, but in general there isn’t enough information to make an informed decision.

Listen to the Google SEO Office Hours podcast at the 1:30 mark.

AI Search Boosts User Satisfaction

A new study finds that despite concerns about AI in online services, users are more satisfied with search engines and social media platforms than before.

The American Customer Satisfaction Index (ACSI) conducted its annual survey of search and social media users, finding that satisfaction has either held steady or improved.

This comes at a time when major tech companies are heavily investing in AI to enhance their services.

Search Engine Satisfaction Holds Strong

Google, Bing, and other search engines have rapidly integrated AI features into their platforms over the past year. While critics have raised concerns about potential negative impacts, the ACSI study suggests users are responding positively.

Google maintains its position as the most satisfying search engine with an ACSI score of 81, up 1% from last year. Users particularly appreciate its AI-powered features.

Interestingly, Bing and Yahoo! have seen notable improvements in user satisfaction, notching 3% gains to reach scores of 77 and 76, respectively. These are their highest ACSI scores in over a decade, likely due to their AI enhancements launched in 2023.

The study hints at the potential of new AI-enabled search functionality to drive further improvements in the customer experience. Bing has seen its market share improve by small but notable margins, rising from 6.35% in the first quarter of 2023 to 7.87% in Q1 2024.

Customer Experience Improvements

The ACSI study shows improvements across nearly all benchmarks of the customer experience for search engines. Notable areas of improvement include:

  • Ease of navigation
  • Ease of using the site on different devices
  • Loading speed performance and reliability
  • Variety of services and information
  • Freshness of content

These improvements suggest that AI enhancements positively impact various aspects of the search experience.

Social Media Sees Modest Gains

For the third year in a row, user satisfaction with social media platforms is on the rise, increasing 1% to an ACSI score of 74.

TikTok has emerged as the new industry leader among major sites, edging past YouTube with a score of 78. This underscores the platform’s effective use of AI-driven content recommendations.

Meta’s Facebook and Instagram have also seen significant improvements in user satisfaction, showing 3-point gains. While Facebook remains near the bottom of the industry at 69, Instagram’s score of 76 puts it within striking distance of the leaders.

Challenges Remain

Despite improvements, the study highlights ongoing privacy and advertising challenges for search engines and social media platforms. Privacy ratings for search engines remain relatively low but steady at 79, while social media platforms score even lower at 73.

Advertising experiences emerge as a key differentiator between higher- and lower-satisfaction brands, particularly in social media. New ACSI benchmarks reveal user concerns about advertising content’s trustworthiness and personal relevance.

Why This Matters For SEO Professionals

This study provides an independent perspective on how users are responding to the AI push in online services. For SEO professionals, these findings suggest that:

  1. AI-enhanced search features resonate with users, potentially changing search behavior and expectations.
  2. The improving satisfaction with alternative search engines like Bing may lead to a more diverse search landscape.
  3. The continued importance of factors like content freshness and site performance in user satisfaction aligns with long-standing SEO best practices.

As AI becomes more integrated into our online experiences, SEO strategies may need to adapt to changing user preferences.

