A Complete Guide To Local Markup & Rich Results

How important is schema markup for local search engine optimization (SEO)?

Most local SEO experts and webmasters are familiar with the impact of having well-optimized SEO elements on their landing pages, such as optimized title tags, well-written content, and more.

However, what exactly can you accomplish by applying schema markup to your local business website?

Quite a bit, actually.

When it comes to organic search, there are several reasons why having a proper and thorough schema applied to your website is a substantial competitive advantage.

In fact, Google has reiterated time and time again that schema helps search crawlers do their job more effectively by helping them comprehend a landing page and deliver relevant information in the SERPs.

In this post, we will share a few recommendations to help your local business get the most out of using schema to boost your local SEO.

First, let’s start with defining what exactly schema markup is.

The Difference Between Schema, Structured Data & Rich Results

The terms “structured data” and “schema” are often used interchangeably in webmaster and SEO circles.

However, before we dive into the recommendations it’s helpful to know the semantic differences between these terms.

Structured Data

Google defines structured data as “a standardized format for providing information about a page and classifying the page content.”

To put it simply, this format was developed to help search engines accurately understand a webpage to properly display snippets of information in the search results pages.

Schema

Schema is a form of structured data whose vocabulary is published and maintained at schema.org.

Schema.org launched in 2011 as a collaborative project between the major search engines Google, Microsoft (Bing), and Yahoo, with Yandex joining later that year.

Utilizing the markup available on schema.org enables a landing page to be eligible for rich results.

Rich Results

Rich results (formerly called rich snippets) are any extra information you see in the search engine results pages (SERPs) beyond the typical blue title link and meta description (breadcrumbs, review stars, sitelinks, etc.).

Google provides two tools to audit structured data on your website: the Schema Markup Validator and the Rich Results Test.

Below are a few examples of local businesses that are benefitting from rich results:

Review Rich Results Example

Image from Google, May 2022

Breadcrumb Rich Results Example

Image from Google, May 2022

Sitelink Rich Results Example

Image from Google, May 2022

FAQ Rich Results Example

Image from Google, May 2022

Is Structured Data A Local Ranking Signal?

There has been much debate over the years about whether or not structured data in itself is a search engine ranking signal.

Google Search Advocate John Mueller has specified more than once that structured data by itself is not a direct search engine ranking signal.

However, structured data indirectly improves search engine visibility through the following means.

Structured Data Helps Search Engine Crawlers Better Comprehend Landing Pages

Properly and thoroughly implemented structured data makes the search crawler’s job easier.

A good analogy would be comparing website properties (content, images, media files, etc.) to a garage full of various boxes and items (snow shovel for the winter, inflatable pool for the summer, etc.).

Let’s say you are having a garage sale and you want to attract visitors (i.e., more website visitors).

It’s Google’s job to advertise your garage sale on the search results pages.

For most websites, Google provides the bare minimum: a blue title link and meta description.

However, if your website is properly marked up with structured data, then Google may very well reward your website with a bigger advertisement (i.e., rich results) for your garage sale.

Structured data essentially puts labels on the different objects in your garage making the Google search crawler’s job easier.

Structured Data Improves The Possibility Of Obtaining Rich Results Which Improves Click-through Rates

A rich result is much more eye-catching in the search results and will most likely improve your click-through rate (CTR).

The CTR boost can vary depending on what kind of rich result is obtained; FAQ results, for example, do very well.

This means your landing page is receiving more traffic because users are seeing relevant snippets about what it contains.

There is also some debate about whether increased CTR might be a positive SEO signal in itself (signaling more engagement and relevance).

Either way, having an improved CTR means more traffic wherever your website ranks.

What Structured Data Is Recommended For Local Business Websites?

Most local websites have at least some basic structured data enabled.

However, the more thoroughly and accurately structured data is applied, the better.

Next, we’ll offer some step-by-step recommendations for how to properly apply structured data:

Select The Best Schema.org Category

Schema.org provides several different schema property options that are uniquely relevant for local businesses.

In order to have the necessary local business schema properties (which will be discussed in further detail below), it is imperative to select the most relevant schema category for your local business.

For example, if you are promoting an ice cream chain, the most relevant category is schema.org/IceCreamShop.

If you are trying to promote a local hardware chain then you’d select schema.org/HardwareStore.

Relevant schema categories will help Google better topically understand your website.
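As a minimal illustration (the business name and URL are placeholders), the category you select becomes the @type value of your JSON-LD entity:

  {
    "@context": "https://schema.org",
    "@type": "IceCreamShop",
    "name": "Example Ice Cream Co.",
    "url": "https://www.example.com/"
  }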

What If There Are No Relevant Schema Categories For My Local Business?

If you can’t find a schema.org category that is relevant for your business then the default category should be schema.org/LocalBusiness.

If you’re technically inclined, it is possible to post new schema category recommendations on the schema.org GitHub forum.

The schema.org developers respond to detailed recommendations on this forum and occasionally create new schema.org properties.

I Selected The Most Accurate Category So What Should I Implement?

After you’ve selected the appropriate category for your business, you must include the schema.org sub-properties below to ensure your schema validates.

Errors could disqualify you from obtaining rich results.

The following schema properties are required for validation (a minimal example follows the list):

  • url: The URL of the associated landing page.
  • name: The name of the business.
  • openingHours: The opening and closing hours of the business.
  • telephone: A contact telephone number for the business.
  • image: This can be any relevant image file on your landing page. It is recommended to use a storefront image if one is available.
  • logo: A link to your business logo image.
  • address: The business address, which should be visible on the landing page.
  • geo: The geographic coordinates of your business location.
  • areaServed: It is recommended to use a ZIP code for this schema property.
  • mainContentOfPage: The main body content of your landing page.
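To make this concrete, below is a minimal JSON-LD sketch of the required properties for a fictional hardware store (every name, URL, and value is a placeholder, not a definitive implementation). It would go inside a script tag of type application/ld+json on the landing page. Note that mainContentOfPage is defined on the WebPage type rather than on the business entity itself, so it is omitted here:

  {
    "@context": "https://schema.org",
    "@type": "HardwareStore",
    "url": "https://www.example.com/locations/springfield",
    "name": "Example Hardware of Springfield",
    "openingHours": "Mo-Sa 08:00-20:00",
    "telephone": "+1-555-555-0100",
    "image": "https://www.example.com/img/springfield-storefront.jpg",
    "logo": "https://www.example.com/img/logo.png",
    "address": {
      "@type": "PostalAddress",
      "streetAddress": "123 Main St",
      "addressLocality": "Springfield",
      "addressRegion": "IL",
      "postalCode": "62701",
      "addressCountry": "US"
    },
    "geo": {
      "@type": "GeoCoordinates",
      "latitude": 39.7817,
      "longitude": -89.6501
    },
    "areaServed": "62701"
  }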

Common schema properties that are highly recommended (a short snippet follows the list):

  • review: A review of your local business.
  • aggregateRating: The overall rating of the item, based on a collection of reviews or ratings. Make sure to follow Google’s rules on review rich results here.
  • FAQPage: If you have an FAQ page, it is imperative to add this specialty schema. Make sure to follow Google’s rules and guidelines.
  • alternateName: Businesses commonly have related names, e.g., Acme Stores vs. Acme Inc. The alternateName property marks up other well-known corporate name variations (including abbreviations).
  • sameAs: A reference to third-party websites related to the website’s identity, i.e., Facebook pages, YouTube channel pages, Wikipedia pages, etc.
  • hasMap: A URL to the map of your local business.
  • breadcrumb: This schema marks up the existing breadcrumb navigation structure on your website. It is highly recommended because it often appears in the SERPs as a rich result.
  • department: Many chain retailers have internal departments (e.g., pharmacies inside grocery stores). This specialty schema helps mark up those departments.
  • priceRange: The price range of the business, for example, $$$.
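Continuing the hedged sketch above (same placeholder business), a few of these recommended properties would be added to the same JSON-LD entity like so:

  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "120"
  },
  "alternateName": "Example Hardware Inc.",
  "sameAs": [
    "https://www.facebook.com/examplehardware",
    "https://www.youtube.com/@examplehardware"
  ],
  "hasMap": "https://www.google.com/maps/place/Example+Hardware",
  "priceRange": "$$"

FAQPage, by contrast, is its own schema.org type, so FAQ content is marked up as a separate entity rather than as a property of the business.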

More advanced schema types (an example follows the list):

  • Sitelinks search box: A sitelinks search box is a quick way for users to do an internal search on your website directly from the Google SERP instead of visiting your website first.
  • additionalType: This is a specialty schema property that helps Google understand what your website is topically related to. This can be accomplished by using Wikipedia categories as values for this property. For example, if a local business sells sporting gear, it is recommended to set additionalType to https://en.wikipedia.org/wiki/Sports_equipment.
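For the sitelinks search box, Google’s documentation uses a WebSite entity with a SearchAction. A minimal sketch (the domain and search URL pattern are placeholders; your site’s internal search must actually support the URL template you declare):

  {
    "@context": "https://schema.org",
    "@type": "WebSite",
    "url": "https://www.example.com/",
    "potentialAction": {
      "@type": "SearchAction",
      "target": {
        "@type": "EntryPoint",
        "urlTemplate": "https://www.example.com/search?q={search_term_string}"
      },
      "query-input": "required name=search_term_string"
    }
  }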

How Do You Make Sure Your Structured Data Is Validated?

It is very important to make sure your structured data is properly validated.

If it’s not then your landing page will most likely not qualify for rich results.

Google specifically says that if there are errors, the page cannot appear in Google Search as a rich result.

As mentioned earlier, there are two different tools to make sure your schema is properly validated: the Schema Markup Validator and the Rich Results Test.

Google Search Console also provides enhancement reports on structured data which will be explained in further detail below.

Schema Markup Validator

The Schema Markup Validator enables you to get into the details of structured data itself.

It shows both errors and warnings.

It also allows you to test structured data before it’s enabled on your webpages by pasting code directly into the tool.

Example Of Schema Markup Validator Result

Image from Schema Markup Validator, May 2022

Also note that while it’s imperative to correct structured data errors, you will also often see structured data “warnings.”

These warnings are of much less concern, and Google’s John Mueller has even mentioned that you don’t have to fix all warnings.

A lot of sites have warnings with structured data, and that’s perfectly fine.

Rich Results Test

The Rich Results Test is Google’s official tool to see which rich results can be generated by structured data.

This tool also lets you preview how rich results will look in Google SERPs.

Example Of Rich Results Test Preview

Image from Rich Results Test tool, May 2022

The Rich Results Test tool will report structured data errors and warnings as well.

As mentioned earlier, warnings are common and won’t prevent rich results from appearing.

However, structured data errors must be resolved to qualify for rich results.

Structured Data Monitoring Via Google Search Console

Google also offers sitewide structured data monitoring via Google Search Console.

It is highly recommended to have a verified Google Search Console account for your local business website to enable monitoring.

Google Search Console will provide sitewide enhancement reports on how many webpages have validated structured data, warnings, and errors.

Google also sends notification emails if there are issues with structured data on your local business website.

It is recommended to pay attention to these notifications.

Example Of Sitewide Structured Data Report

Image from Google Search Console, May 2022

How Can I Tell How Many Rich Results My Website Is Getting In The SERPs?

Besides spot-checking rich results, it would be ideal to see how well a local business website is performing across all the Google SERPs.

There are a few third-party SEO tools that scrape Google SERPs and provide reports.

One notable tool, Semrush, has a “SERP Feature” report that shows how many aggregate rich results your website is getting.

Example Of Semrush SERP Feature Report

Image from Semrush, May 2022

Is There Anything I Should Avoid When Using Structured Data?

Structured data is meant to be code that labels or marks up existing properties on your local business website.

Google explicitly requires that your structured data matches what is on the associated landing page.

However, structured data spam does exist, and Google can apply manual penalties if it believes a webmaster is egregiously breaking the rules.

Make sure to follow Google’s structured data guidelines carefully.

Conclusion

There is no drawback to applying properly formatted and relevant structured data to your local business’s website.

Also, schema.org is continually coming out with new schema properties along with more integration via Google Search Console.

Most common SEO strategies (meta tag optimization, custom copywriting, design changes, etc.) usually require significant effort and visible on-page website updates.

In comparison, structured data updates are invisible to users visiting your website.

They also don’t require any direct changes to anything on your website beyond including a new source code script.

They also have great potential to substantially improve visibility in the Google SERPs via rich results.

If you’re a local business looking to further optimize your website, visit schema.org and work with a webmaster to start applying structured data.

Featured Image: Hangouts Vector Pro/Shutterstock



No Algorithmic Actions For Site Reputation Abuse Yet

Google’s Search Liaison, Danny Sullivan, has confirmed that the search engine hasn’t launched algorithmic actions targeting site reputation abuse.

This clarification addresses speculation within the SEO community that recent traffic drops are related to Google’s previously announced policy update.

Sullivan Says No Update Rolled Out

Lily Ray, an SEO professional, shared a screenshot on Twitter showing a significant drop in traffic for the website Groupon starting on May 6.

Ray suggested this was evidence that Google had begun rolling out algorithmic penalties for sites violating the company’s site reputation abuse policy.

However, Sullivan quickly stepped in, stating:

“We have not gone live with algorithmic actions on site reputation abuse. I well imagine when we do, we’ll be very clear about that. Publishers seeing changes and thinking it’s this — it’s not — results change all the time for all types of reasons.”

Sullivan added that when the actions are rolled out, they will only impact specific content, not entire websites.

This is an important distinction, as it suggests that even if a site has some pages manually penalized, the rest of the domain can rank normally.

Background On Google’s Site Reputation Abuse Policy

Earlier this year, Google announced a new policy to combat what it calls “site reputation abuse.”

This refers to situations where third-party content is published on authoritative domains with little oversight or involvement from the host site.

Examples include sponsored posts, advertorials, and partner content that is loosely related to or unrelated to a site’s primary purpose.

Under the new policy, Google is taking manual action against offending pages and plans to incorporate algorithmic detection.

What This Means For Publishers & SEOs

While Google hasn’t launched any algorithmic updates related to site reputation abuse, the manual actions have publishers on high alert.

Those who rely heavily on sponsored content or partner posts to drive traffic should audit their sites and remove any potential policy violations.

Sullivan’s confirmation that algorithmic changes haven’t occurred may provide temporary relief.

His statements also serve as a reminder that significant ranking fluctuations can happen at any time due to various factors, not just specific policy rollouts.


FAQ

Will Google’s future algorithmic actions impact entire websites or specific content?

When Google eventually rolls out algorithmic actions for site reputation abuse, these actions will target specific content rather than the entire website.

This means that if certain pages are found to be in violation, only those pages will be affected, allowing other parts of the site to continue ranking normally.

What should publishers and SEOs do in light of Google’s site reputation abuse policy?

Publishers and SEO professionals should audit their sites to identify and remove any content that may violate Google’s site reputation abuse policy.

This includes sponsored posts and partner content that doesn’t align with the site’s primary purpose. Taking these steps can mitigate the risk of manual penalties from Google.

What is the context of the recent traffic drops seen in the SEO community?

Google claims the recent drops for coupon sites aren’t linked to any algorithmic actions for site reputation abuse. Traffic fluctuations can occur for various reasons and aren’t always linked to a specific algorithm update.


Featured Image: sockagphoto/Shutterstock




WP Rocket WordPress Plugin Now Optimizes LCP Core Web Vitals Metric

WP Rocket, the WordPress page speed performance plugin, just announced the release of a new version that will help publishers optimize for Largest Contentful Paint (LCP), an important Core Web Vitals metric.

Largest Contentful Paint (LCP)

LCP is a page speed metric designed to show how long it takes for a user to perceive that the page has loaded and is ready to be interacted with. It measures the time it takes for the main content element to fully load, which gives an idea of how usable a webpage is. The faster the LCP, the better the user experience will be.
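For context, you can watch LCP candidates yourself using the standard PerformanceObserver browser API (this is generic browser tooling, not part of WP Rocket):

  // Log LCP candidates as the page loads; the last entry reported
  // before the first user input is the final LCP element.
  new PerformanceObserver((entryList) => {
    for (const entry of entryList.getEntries()) {
      console.log('LCP candidate at', entry.startTime, 'ms:', entry.element);
    }
  }).observe({ type: 'largest-contentful-paint', buffered: true });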

WP Rocket 3.16

WP Rocket is a caching plugin that helps a site perform faster. The way page caching generally works is that the website stores frequently accessed webpages and resources so that when someone visits a page, the website doesn’t have to fetch the data from the database, which takes time, but can instead serve the webpage from the cache. This is super important when a website has a lot of visitors, because fetching and rebuilding the same pages over and over for every visitor can use a lot of server resources.
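As a conceptual sketch only (WP Rocket itself is a PHP plugin and its real implementation is far more involved), the core page-caching idea looks like this:

  // Serve a stored copy of a page when one exists; otherwise build it
  // once and keep it for later visitors.
  const pageCache = new Map();

  async function handleRequest(url, buildPage) {
    if (pageCache.has(url)) {
      return pageCache.get(url); // fast path: no database work needed
    }
    const html = await buildPage(url); // slow path: query the DB, render templates
    pageCache.set(url, html);
    return html;
  }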

The latest version of WP Rocket (3.16) now contains automatic LCP optimization, which means it will prioritize the on-page elements from the main content so that they are served first, thereby improving LCP scores and providing a better user experience.

Because it’s automatic, there’s really nothing to fiddle around with or fine-tune.

According to WP Rocket:

  • Automatic LCP Optimization: Optimizes the Largest Contentful Paint, a critical metric for website speed, automatically enhancing overall PageSpeed scores.
  • Smart Management of Above-the-Fold Images: Automatically detects and prioritizes critical above-the-fold images, loading them immediately to improve user experience and performance metrics.

All new functionalities operate seamlessly in the background, requiring no direct intervention from the user. Upon installing or upgrading to WP Rocket 3.16, these optimizations are automatically enabled, though customization options remain accessible for those who prefer manual control.

Read the official announcement:

WP Rocket 3.16: Improving LCP and PageSpeed Score Automatically

Featured Image by Shutterstock/ICONMAN66


Optimizing Interaction To Next Paint: A Step-By-Step Guide

This post was sponsored by DebugBear. The opinions expressed in this article are the sponsor’s own.

Keeping your website fast is important for user experience and SEO.

The Core Web Vitals initiative by Google provides a set of metrics to help you understand the performance of your website.

The three Core Web Vitals metrics are:

  • Largest Contentful Paint (LCP), which measures loading performance.
  • Interaction to Next Paint (INP), which measures responsiveness to user interactions.
  • Cumulative Layout Shift (CLS), which measures visual stability.

This post focuses on the recently introduced INP metric and what you can do to improve it.

How Is Interaction To Next Paint Measured?

INP measures how quickly your website responds to user interactions – for example, a click on a button. More specifically, INP measures the time in milliseconds between the user input and when the browser has finished processing the interaction and is ready to display any visual updates on the page.

Your website needs to complete this process in under 200 milliseconds to get a “Good” score. Values over half a second are considered “Poor”. A poor score in a Core Web Vitals metric can negatively impact your search engine rankings.
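If you want to see these interaction durations for yourself, the standard Event Timing API exposes them in the browser (a minimal sketch, independent of any particular RUM tool):

  // Log slow interactions; durations approaching or exceeding 200 ms
  // are the ones that push INP out of the "Good" range.
  new PerformanceObserver((entryList) => {
    for (const entry of entryList.getEntries()) {
      console.log(entry.name, 'took', entry.duration, 'ms');
    }
  }).observe({ type: 'event', durationThreshold: 100, buffered: true });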

Google collects INP data from real visitors on your website as part of the Chrome User Experience Report (CrUX). This CrUX data is what ultimately impacts rankings.

Image created by DebugBear, May 2024

How To Identify & Fix Slow INP Times

The factors causing poor Interaction to Next Paint can often be complex and hard to figure out. Follow this step-by-step guide to understand slow interactions on your website and find potential optimizations.

1. How To Identify A Page With Slow INP Times

Different pages on your website will have different Core Web Vitals scores. So you need to identify a slow page and then investigate what’s causing it to be slow.

Using Google Search Console

One easy way to check your INP scores is using the Core Web Vitals section in Google Search Console, which reports data based on the Google CrUX data we’ve discussed before.

By default, page URLs are grouped into URL groups that cover many different pages. Be careful here – not all pages might have the problem that Google is reporting. Instead, click on each URL group to see if URL-specific data is available for some pages and then focus on those.

Screenshot of Google Search Console, May 2024

Using A Real-User Monitoring (RUM) Service

Google won’t report Core Web Vitals data for every page on your website, and it only provides the raw measurements without any details to help you understand and fix the issues. To get that you can use a real-user monitoring tool like DebugBear.

Real-user monitoring works by installing an analytics snippet on your website that measures how fast your website is for your visitors. Once that’s set up you’ll have access to an Interaction to Next Paint dashboard like this:

Screenshot of the DebugBear Interaction to Next Paint dashboard, May 2024

You can identify pages you want to optimize in the list, hover over the URL, and click the funnel icon to look at data for that specific page only.

Image created by DebugBear, May 2024

2. Figure Out What Element Interactions Are Slow

Different visitors on the same page will have different experiences. A lot of that depends on how they interact with the page: if they click on a background image there’s no risk of the page suddenly freezing, but if they click on a button that starts some heavy processing then that’s more likely. And users in that second scenario will experience much higher INP.

To help with that, RUM data provides a breakdown of what page elements users interacted with and how big the interaction delays were.

Screenshot of the DebugBear INP Elements view, May 2024

The screenshot above shows different INP interactions sorted by how frequent these user interactions are. To make optimizations as easy as possible you’ll want to focus on a slow interaction that affects many users.

In DebugBear, you can click on the page element to add it to your filters and continue your investigation.

3. Identify What INP Component Contributes The Most To Slow Interactions

INP delays can be broken down into three different components:

  • Input Delay: Background code that blocks the interaction from being processed.
  • Processing Time: The time spent directly handling the interaction.
  • Presentation Delay: Displaying the visual updates to the screen.

You should focus on which INP component is the biggest contributor to the slow INP time, and ensure you keep that in mind during your investigation.

Screenshot of the DebugBear INP Components, May 2024

In this scenario, Processing Time is the biggest contributor to the slow INP time for the set of pages you’re looking at, but you need to dig deeper to understand why.

High processing time indicates that code intercepting the user interaction is running slowly. If instead you saw a high input delay, that suggests that background tasks are blocking the interaction from being processed, for example due to third-party scripts.

4. Check Which Scripts Are Contributing To Slow INP

Sometimes browsers report specific scripts that are contributing to a slow interaction. Your website likely contains both first-party and third-party scripts, both of which can contribute to slow INP times.

A RUM tool like DebugBear can collect and surface this data. The main thing you want to look at is whether you mostly see your own website code or code from third parties.

Screenshot of the INP Primary Script Domain Grouping in DebugBear, May 2024

Tip: When you see a script, or source code function marked as “N/A”, this can indicate that the script comes from a different origin and has additional security restrictions that prevent RUM tools from capturing more detailed information.

This now begins to tell a story: it appears that analytics/third-party scripts are the biggest contributors to the slow INP times.

5. Identify Why Those Scripts Are Running

At this point, you now have a strong suspicion that most of the INP delay, at least on the pages and elements you’re looking at, is due to third-party scripts. But how can you tell whether those are general tracking scripts or if they actually have a role in handling the interaction?

DebugBear offers a breakdown that helps see why the code is running, called the INP Primary Script Invoker breakdown. That’s a bit of a mouthful – multiple different scripts can be involved in slowing down an interaction, and here you just see the biggest contributor. The “Invoker” is just a value that the browser reports about what caused this code to run.

Screenshot of the INP Primary Script Invoker Grouping in DebugBear, May 2024

The following invoker names are examples of page-wide event handlers:

  • onclick
  • onmousedown
  • onpointerup

You can see those a lot in the screenshot above, which tells you that the analytics script is tracking clicks anywhere on the page.

In contrast, if you saw invoker names like these, that would indicate event handlers for a specific element on the page (a short sketch of both patterns follows the list):

  • .load_more.onclick
  • #logo.onclick
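To make the distinction concrete, here is a hedged sketch of the two handler patterns those invoker names correspond to (trackClick and loadMore are hypothetical functions):

  // Page-wide handler: fires for every click on the page (a typical
  // analytics pattern) and reports a generic invoker such as "onclick".
  document.addEventListener('click', (event) => trackClick(event.target));

  // Element-specific handler: fires only for one button and reports an
  // invoker such as ".load_more.onclick".
  document.querySelector('.load_more')
    ?.addEventListener('click', () => loadMore());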

6. Review Specific Page Views

A lot of the data you’ve seen so far is aggregated. It’s now time to look at the individual INP events, to form a definitive conclusion about what’s causing slow INP in this example.

Real user monitoring tools like DebugBear generally offer a way to review specific user experiences. For example, you can see what browser they used, how big their screen is, and what element led to the slowest interaction.

Screenshot of a Page View in DebugBear Real User Monitoring, May 2024

As mentioned before, multiple scripts can contribute to overall slow INP. The INP Scripts section shows you the scripts that were run during the INP interaction:

Screenshot of the DebugBear INP script breakdown, May 2024

You can review each of these scripts in more detail to understand why they run and what’s causing them to take longer to finish.

7. Use The DevTools Profiler For More Information

Real user monitoring tools have access to a lot of data, but for performance and security reasons they can access nowhere near all the available data. That’s why it’s a good idea to also use Chrome DevTools to measure your page performance.

To debug INP in DevTools you can measure how the browser processes one of the slow interactions you’ve identified before. DevTools then shows you exactly how the browser is spending its time handling the interaction.

Screenshot of a performance profile in Chrome DevTools, May 2024

How You Might Resolve This Issue

In this example, you or your development team could resolve this issue by:

  • Working with the third-party script provider to optimize their script.
  • Removing the script if it is not essential to the website, or finding an alternative provider.
  • Adjusting how your own code interacts with the script.

How To Investigate High Input Delay

In the previous example most of the INP time was spent running code in response to the interaction. But often the browser is already busy running other code when a user interaction happens. When investigating the INP components you’ll then see a high input delay value.

This can happen for various reasons, for example:

  • The user interacted with the website while it was still loading.
  • A scheduled task is running on the page, for example an ongoing animation.
  • The page is loading and rendering new content.

To understand what’s happening, you can review the invoker name and the INP scripts section of individual user experiences.

Screenshot of the INP Component breakdown within DebugBear, May 2024

In this screenshot, you can see that a timer is running code that coincides with the start of a user interaction.

The script can be opened to reveal the exact code that is run:

Screenshot of INP script details in DebugBear, May 2024

The source code shown in the previous screenshot comes from a third-party user tracking script that is running on the page.

At this stage, you and your development team can continue with the INP workflow presented earlier in this article. For example, debugging with browser DevTools or contacting the third-party provider for support.

How To Investigate High Presentation Delay

Presentation delay tends to be more difficult to debug than input delay or processing time. Often it’s caused by browser behavior rather than a specific script. But as before, you still start by identifying a specific page and a specific interaction.

You can see an example interaction with high presentation delay here:

Screenshot of an interaction with high presentation delay, May 2024

You see that this happens when the user enters text into a form field. In this example, many visitors pasted large amounts of text that the browser had to process.

Here the fix was to delay the processing, show a “Waiting…” message to the user, and then complete the processing later on. You can see how the INP score improves from May 3:

Screenshot of an Interaction to Next Paint timeline in DebugBear, May 2024
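A minimal sketch of that kind of fix (the element names and the processText function are hypothetical, not the actual code from this example): paint cheap visual feedback first, then yield so the browser can update the screen before the heavy work runs.

  const input = document.querySelector('#text-input');
  const status = document.querySelector('#status');

  input.addEventListener('input', () => {
    status.textContent = 'Waiting…'; // cheap update, painted immediately
    setTimeout(() => {
      processText(input.value);      // heavy processing happens after the paint
      status.textContent = 'Done';
    }, 0);
  });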

Get The Data You Need To Improve Interaction To Next Paint

Setting up real user monitoring helps you understand how users experience your website and what you can do to improve it. Try DebugBear now by signing up for a free 14-day trial.

Screenshot of the DebugBear Core Web Vitals dashboard, May 2024

Google’s CrUX data is aggregated over a 28-day period, which means that it’ll take a while before you notice a regression. With real-user monitoring you can see the impact of website changes right away and get alerted automatically when there’s a big change.

DebugBear monitors lab data, CrUX data, and real user data. That way you have all the data you need to optimize your Core Web Vitals in one place.

This article has been sponsored by DebugBear, and the views presented herein represent the sponsor’s perspective.

Ready to start optimizing your website? Sign up for DebugBear and get the data you need to deliver great user experiences.


Image Credits

Featured Image: Image by Redesign.co. Used with permission.
