Optimizing Interaction To Next Paint: A Step-By-Step Guide

This post was sponsored by DebugBear. The opinions expressed in this article are the sponsor’s own.

Keeping your website fast is important for user experience and SEO.

The Core Web Vitals initiative by Google provides a set of metrics to help you understand the performance of your website.

The three Core Web Vitals metrics are:

  • Largest Contentful Paint (LCP), which measures loading performance.
  • Interaction to Next Paint (INP), which measures responsiveness.
  • Cumulative Layout Shift (CLS), which measures visual stability.

This post focuses on the recently introduced INP metric and what you can do to improve it.

How Is Interaction To Next Paint Measured?

INP measures how quickly your website responds to user interactions – for example, a click on a button. More specifically, INP measures the time in milliseconds between the user input and when the browser has finished processing the interaction and is ready to display any visual updates on the page.

Your website needs to complete this process in under 200 milliseconds to get a “Good” score. Values over half a second are considered “Poor”. A poor score in a Core Web Vitals metric can negatively impact your search engine rankings.
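To make the thresholds concrete, here is a small sketch that classifies an INP value using the 200 ms and 500 ms boundaries described above (the function name is ours, not part of any library):

```javascript
// Classify an INP value (in milliseconds) using Google's published
// thresholds: 200 ms or less is "good", over 500 ms is "poor",
// and anything in between "needs improvement".
function rateINP(inpMs) {
  if (inpMs <= 200) return "good";
  if (inpMs <= 500) return "needs improvement";
  return "poor";
}
```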

Google collects INP data from real visitors on your website as part of the Chrome User Experience Report (CrUX). This CrUX data is what ultimately impacts rankings.

Image created by DebugBear, May 2024

How To Identify & Fix Slow INP Times

The factors causing poor Interaction to Next Paint can often be complex and hard to figure out. Follow this step-by-step guide to understand slow interactions on your website and find potential optimizations.

1. How To Identify A Page With Slow INP Times

Different pages on your website will have different Core Web Vitals scores. So you need to identify a slow page and then investigate what’s causing it to be slow.

Using Google Search Console

One easy way to check your INP scores is using the Core Web Vitals section in Google Search Console, which reports data based on the Google CrUX data we’ve discussed before.

By default, page URLs are grouped into URL groups that cover many different pages. Be careful here – not all pages might have the problem that Google is reporting. Instead, click on each URL group to see if URL-specific data is available for some pages and then focus on those.

Screenshot of Google Search Console, May 2024

Using A Real-User Monitoring (RUM) Service

Google won’t report Core Web Vitals data for every page on your website, and it only provides the raw measurements without any details to help you understand and fix the issues. To get that you can use a real-user monitoring tool like DebugBear.

Real-user monitoring works by installing an analytics snippet on your website that measures how fast your website is for your visitors. Once that’s set up you’ll have access to an Interaction to Next Paint dashboard like this:

Screenshot of the DebugBear Interaction to Next Paint dashboard, May 2024

You can identify pages you want to optimize in the list, hover over the URL, and click the funnel icon to look at data for that specific page only.

Image created by DebugBear, May 2024

2. Figure Out What Element Interactions Are Slow

Different visitors on the same page will have different experiences. A lot of that depends on how they interact with the page: if they click on a background image there’s no risk of the page suddenly freezing, but if they click on a button that starts some heavy processing then that’s more likely. And users in that second scenario will experience much higher INP.

To help with that, RUM data provides a breakdown of what page elements users interacted with and how big the interaction delays were.

Screenshot of the DebugBear INP Elements view, May 2024

The screenshot above shows different INP interactions sorted by how frequent these user interactions are. To make optimizations as easy as possible you’ll want to focus on a slow interaction that affects many users.

In DebugBear, you can click on the page element to add it to your filters and continue your investigation.

3. Identify What INP Component Contributes The Most To Slow Interactions

INP delays can be broken down into three different components:

  • Input Delay: Background code that blocks the interaction from being processed.
  • Processing Time: The time spent directly handling the interaction.
  • Presentation Delay: Displaying the visual updates to the screen.

You should focus on which INP component is the biggest contributor to the slow INP time, and ensure you keep that in mind during your investigation.
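These three components map onto the browser's Event Timing API. As a sketch of how they can be derived from a `PerformanceEventTiming` entry (the field names come from the real API; the helper function is our own illustration):

```javascript
// Break an Event Timing entry into the three INP components.
// startTime       – when the user input happened
// processingStart – when event handlers began running
// processingEnd   – when event handlers finished
// duration        – from startTime until the next paint
function inpComponents(entry) {
  return {
    inputDelay: entry.processingStart - entry.startTime,
    processingTime: entry.processingEnd - entry.processingStart,
    presentationDelay: entry.startTime + entry.duration - entry.processingEnd,
  };
}
```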

Screenshot of the DebugBear INP Components, May 2024

In this scenario, Processing Time is the biggest contributor to the slow INP time for the set of pages you’re looking at, but you need to dig deeper to understand why.

High processing time indicates that there is code intercepting the user interaction and running slow performing code. If instead you saw a high input delay, that suggests that there are background tasks blocking the interaction from being processed, for example due to third-party scripts.

4. Check Which Scripts Are Contributing To Slow INP

Sometimes browsers report specific scripts that are contributing to a slow interaction. Your website likely contains both first-party and third-party scripts, both of which can contribute to slow INP times.

A RUM tool like DebugBear can collect and surface this data. The main thing you want to look at is whether you mostly see your own website code or code from third parties.

Screenshot of the INP Primary Script Domain Grouping in DebugBear, May 2024

Tip: When you see a script or source code function marked as “N/A”, this can indicate that the script comes from a different origin and has additional security restrictions that prevent RUM tools from capturing more detailed information.

This now begins to tell a story: it appears that analytics/third-party scripts are the biggest contributors to the slow INP times.

5. Identify Why Those Scripts Are Running

At this point, you now have a strong suspicion that most of the INP delay, at least on the pages and elements you’re looking at, is due to third-party scripts. But how can you tell whether those are general tracking scripts or if they actually have a role in handling the interaction?

DebugBear offers a breakdown that helps you see why the code is running, called the INP Primary Script Invoker breakdown. That’s a bit of a mouthful – multiple different scripts can be involved in slowing down an interaction, and here you just see the biggest contributor. The “Invoker” is simply a value the browser reports about what caused this code to run.

Screenshot of the INP Primary Script Invoker Grouping in DebugBear, May 2024

The following invoker names are examples of page-wide event handlers:

  • onclick
  • onmousedown
  • onpointerup

You can see those a lot in the screenshot above, which tells you that the analytics script is tracking clicks anywhere on the page.

In contrast, if you saw invoker names like these that would indicate event handlers for a specific element on the page:

  • .load_more.onclick
  • #logo.onclick
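To make the distinction concrete, here is a tiny sketch (our own helper, not part of any browser API or DebugBear feature) that separates the two kinds of invoker names:

```javascript
// Page-wide handlers are reported as bare event names ("onclick",
// "onpointerup"), while element-specific handlers are prefixed with a
// selector-like target ("#logo.onclick", ".load_more.onclick").
function classifyInvoker(invoker) {
  return /^on[a-z]+$/.test(invoker) ? "page-wide" : "element-specific";
}
```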

6. Review Specific Page Views

A lot of the data you’ve seen so far is aggregated. It’s now time to look at the individual INP events, to form a definitive conclusion about what’s causing slow INP in this example.

Real user monitoring tools like DebugBear generally offer a way to review specific user experiences. For example, you can see what browser they used, how big their screen is, and what element led to the slowest interaction.

Screenshot of a Page View in DebugBear Real User Monitoring, May 2024

As mentioned before, multiple scripts can contribute to overall slow INP. The INP Scripts section shows you the scripts that were run during the INP interaction:

Screenshot of the DebugBear INP script breakdown, May 2024

You can review each of these scripts in more detail to understand why they run and what’s causing them to take longer to finish.

7. Use The DevTools Profiler For More Information

Real user monitoring tools have access to a lot of data, but for performance and security reasons they can’t capture everything the browser knows. That’s why it’s a good idea to also use Chrome DevTools to measure your page performance.

To debug INP in DevTools you can measure how the browser processes one of the slow interactions you’ve identified before. DevTools then shows you exactly how the browser is spending its time handling the interaction.

Screenshot of a performance profile in Chrome DevTools, May 2024

How You Might Resolve This Issue

In this example, you or your development team could resolve this issue by:

  • Working with the third-party script provider to optimize their script.
  • Removing the script if it is not essential to the website, or finding an alternative provider.
  • Adjusting how your own code interacts with the script.

How To Investigate High Input Delay

In the previous example most of the INP time was spent running code in response to the interaction. But often the browser is already busy running other code when a user interaction happens. When investigating the INP components you’ll then see a high input delay value.

This can happen for various reasons, for example:

  • The user interacted with the website while it was still loading.
  • A scheduled task is running on the page, for example an ongoing animation.
  • The page is loading and rendering new content.
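When long-running background tasks are the culprit, a common mitigation is to break the work into chunks that yield back to the event loop, so pending user input can be handled in between. A minimal sketch of the idea (the chunk size and helper name are our own choices):

```javascript
// Process a large array in small chunks, yielding to the event loop
// between chunks so user input isn't blocked for long stretches.
async function processInChunks(items, handleItem, chunkSize = 50) {
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      handleItem(item);
    }
    // Yield so queued input events (and paints) can run.
    await new Promise((resolve) => setTimeout(resolve, 0));
  }
}
```

Newer Chrome versions also expose `scheduler.yield()` for the same purpose, but the `setTimeout` pattern above works in all browsers.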

To understand what’s happening, you can review the invoker name and the INP scripts section of individual user experiences.

Screenshot of the INP Component breakdown within DebugBear, May 2024

In this screenshot, you can see that a timer is running code that coincides with the start of a user interaction.

The script can be opened to reveal the exact code that is run:

Screenshot of INP script details in DebugBear, May 2024

The source code shown in the previous screenshot comes from a third-party user tracking script that is running on the page.

At this stage, you and your development team can continue with the INP workflow presented earlier in this article. For example, debugging with browser DevTools or contacting the third-party provider for support.

How To Investigate High Presentation Delay

Presentation delay tends to be more difficult to debug than input delay or processing time. Often it’s caused by browser behavior rather than a specific script. But as before, you still start by identifying a specific page and a specific interaction.

You can see an example interaction with high presentation delay here:

Screenshot of an interaction with high presentation delay, May 2024

You see that this happens when the user enters text into a form field. In this example, many visitors pasted large amounts of text that the browser had to process.

Here the fix was to delay the processing, show a “Waiting…” message to the user, and then complete the processing later on. You can see how the INP score improves from May 3:

Screenshot of an Interaction to Next Paint timeline in DebugBear, May 2024
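The fix described above follows a common pattern: render cheap feedback first, then defer the heavy work so the browser can paint in between. A sketch under those assumptions (`showWaitingMessage` and `processText` are hypothetical placeholders for the page's own code):

```javascript
// On a large paste: update the UI immediately, then defer the expensive
// text processing so the "Waiting…" message can be painted first.
function handleLargePaste(text, { showWaitingMessage, processText, onDone }) {
  showWaitingMessage(); // cheap synchronous UI update
  setTimeout(() => {
    // Runs after the browser has had a chance to paint.
    onDone(processText(text));
  }, 0);
}
```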

Get The Data You Need To Improve Interaction To Next Paint

Setting up real user monitoring helps you understand how users experience your website and what you can do to improve it. Try DebugBear now by signing up for a free 14-day trial.

Screenshot of the DebugBear Core Web Vitals dashboard, May 2024

Google’s CrUX data is aggregated over a 28-day period, which means that it’ll take a while before you notice a regression. With real-user monitoring you can see the impact of website changes right away and get alerted automatically when there’s a big change.

DebugBear monitors lab data, CrUX data, and real user data. That way you have all the data you need to optimize your Core Web Vitals in one place.

This article has been sponsored by DebugBear, and the views presented herein represent the sponsor’s perspective.

Ready to start optimizing your website? Sign up for DebugBear and get the data you need to deliver great user experiences.


Image Credits

Featured Image: Image by Redesign.co. Used with permission.

The 6 Biggest SEO Challenges You’ll Face in 2024

Seen any stressed-out SEOs recently? If so, that’s because they’ve got their work cut out this year.

Between navigating Google’s never-ending algorithm updates, fighting off competitors, and getting buy-in for projects, there are many significant SEO challenges to consider.

So, which ones should you focus on? Here are the six biggest ones I think you should pay close attention to.

Make no mistake—Google’s algorithm updates can make or break your site.

Core updates, spam updates, helpful content updates—you name it, they can all impact your site’s performance.

As we can see below, the frequency of Google updates has increased in recent years, meaning that the likelihood of being impacted by a Google update has also increased.

How to deal with it:

Recovering from a Google update isn’t easy—and sometimes, websites that get hit by updates may never fully recover.

For the reasons outlined above, most businesses try to stay on the right side of Google and avoid incurring Google’s wrath.

SEOs do this by following Google’s Search Essentials and SEO best practices, and by avoiding risky black hat SEO tactics. But sadly, even if you think you’ve done this, there is no guarantee that you won’t get hit.

If you suspect a website has been impacted by a Google update, the fastest way to check is to plug the domain into Ahrefs’ Site Explorer.

Ahrefs Site Explorer screenshot

Here’s an example of a website likely affected by Google’s August 2023 Core Update. The traffic drop started on the update’s start date.

Website impacted by Google's August 2023 Core Update
Hover over the G circles on the X axis to get information about each update.

From this screen, you can see if a drop in traffic correlates with a Google update. If there is a strong correlation, then that update may have hit the site. To remedy it, you will need to understand the update and take action accordingly.

Follow SEO best practices

It’s important your website follows SEO best practices so you can understand why it has been affected and determine what you need to do to fix things.

For example, you might have missed significant technical SEO issues impacting your website’s traffic. To rule this out, it’s worth using Site Audit to run a technical crawl of your website.

Site Audit screenshot, via Ahrefs Site Audit

Monitor the latest SEO news

In addition to following best practices, it’s a good idea to monitor the latest SEO news. You can do this through various social media channels like X or LinkedIn, but I find the two websites below to be some of the most reliable sources of SEO news.

Even if you escape Google’s updates unscathed, you’ve still got to deal with your competitors vying to steal your top-ranking keywords from right under your nose.

This may sound grim, but it’s a mistake to underestimate them. Most of the time, they’ll be trying to improve their website’s SEO just as much as you are.

And these days, your competitors will:

How to deal with it:

If you want to stay ahead of your competitors, you need to do these two things:

Spy on your competitors and monitor their strategy

Ok, so you don’t have to be James Bond, but by using a tool like Ahrefs Site Explorer and our Google Looker Studio Integration (GLS), you can extract valuable information and keep tabs on your competitors, giving you a competitive advantage in the SERPs.

Using a tool like Site Explorer, you can use the Organic Competitors report to understand the competitor landscape:

Organic competitors screenshot, via Ahrefs' Site Explorer

You can check out their Organic traffic performance across the years:

Year on Year comparison of organic traffic, via Ahrefs' Site Explorer

You can use Calendar to see on which days changes in Positions, Pages, Referring domains, and Backlinks occurred:

Screenshot of Ahrefs' Calendar, via Ahrefs' Site Explorer

You can see their Top pages’ organic traffic and Organic keywords:

Top pages report, via Ahrefs' Site Explorer

And much, much more.

If you want to monitor your most important competitors more closely, you can even create a dashboard using Ahrefs’ GLS integration.

Google Looker Studio integration screenshot

Acquire links and create content that your competitors can’t recreate easily

Once you’ve done enough spying, it’s time to take action.

Links and content are the bread and butter for many SEOs. But a lot of the time the links that are acquired and the content that is created just aren’t that great.

So, to stand the best chance of maintaining your rankings, you need to work on getting high-quality backlinks and producing high-quality content that your competitors can’t easily recreate.

It’s easy to say this, but what does it mean in practice?

The best way to create this type of content is to create deep content.

At Ahrefs, we do this by running surveys, getting quotes from industry experts, running data studies, creating unique illustrations or diagrams, and generally fine-tuning our content until it is the best it can be.

As if competing against your competitors wasn’t enough, you must also compete against Google for clicks.

As Google not-so-subtly transitions from a search engine to an answer engine, it’s becoming more common for it to supply the answer to search queries—rather than the search results themselves.

The result is that even the once top-performing organic search websites have a lower click-through rate (CTR) because they’re further down the page—or not on the first page.

Whether you like it or not, Google is reducing traffic to your website through two mechanisms:

  • AI overviews – Where Google generates an answer based on sources on the internet
  • Zero-click searches – Where Google shows the answer in the search results

With AI overviews, we can see that the traditional organic search results are not visible.

And with zero-click searches, Google supplies the answer directly in the SERP, so the user doesn’t have to click anything unless they want to know more.

Zero Click searches example, via Google.com

These features have one thing in common: They are pushing the organic results further down the page.

With AI Overviews, even when links are included, Kevin Indig’s AI overviews traffic impact study suggests that AI overviews will reduce organic clicks.

In the example below, shared by Aleyda, we can see that even ranking organically in the number one position doesn’t mean much when Ads and an AI overview sit above you. And when the AI overview answer contains no links, it simply perpetuates the zero-click model in a new format.

How to deal with it:

You can’t control how Google changes the SERPs, but you can do two things:

Make your website the best it can be

Focusing on making your website the best it can be will naturally make it more authoritative over time. This isn’t a guarantee that your website will be included in the AI overview, but it’s better than doing nothing.

Prevent Google from showing your website in an AI Overview

If you want to be excluded from Google’s AI Overviews, Google says you can use the nosnippet rule to prevent your content from appearing in them.

nosnippet code explanation screenshot, via Google's documentation
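These snippet controls are part of Google's documented robots rules. A minimal sketch of the two forms looks like this:

```html
<!-- Page-level: opt the whole page out of snippets (including AI Overviews) -->
<meta name="robots" content="nosnippet">

<!-- Element-level: exclude just one section of the page -->
<p data-nosnippet>This text will not be used in snippets.</p>
```

Note that opting out of snippets also removes your content from regular search result snippets, not just AI Overviews, so weigh the trade-off before applying it site-wide.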

One of the reasons marketers gravitated towards Google in the early days was that it was relatively easy to set up a website and get traffic.

Recently, there have been a few high-profile examples of smaller websites that have been impacted by Google:

Apart from the algorithmic changes, I think there are two reasons for this:

  • Large authoritative websites with bigger budgets and SEO teams are more likely to rank well in today’s Google
  • User-generated content sites like Reddit and Quora have been given huge traffic boosts from Google, which has displaced smaller sites from the SERPs that used to rank for these types of keyword queries

Here’s Reddit’s traffic increase over the last year:

Reddit's organic traffic increase, via Ahrefs Site Explorer

And here’s Quora’s traffic increase:

Quora's organic traffic increase, via Ahrefs Site Explorer

How to deal with it:

There are three key ways I would deal with this issue in 2024:

Focus on targeting the right keywords using keyword research

Knowing which keywords to target is really important for smaller websites. Sadly, you can’t just write about a big term like “SEO” and expect to rank for it in Google.

Use a tool like Keywords Explorer to do a SERP analysis for each keyword you want to target. Use the effort-to-reward ratio to ensure you are picking the right keyword battles:

Effort to reward ratio illustration

If you’re concerned about Reddit, Quora, or other UGC sites stealing your clicks, you can also use Keywords Explorer to target SERPs where these websites aren’t present.

To do this:

  • Enter your keyword in the search bar and head to the matching terms report
  • Click on the SERP features drop-down box
  • Select Not on SERP and select Discussions and forums
Example of removing big UGC sites from keyword searches using filters in Ahrefs' Keywords Explorer

This method can help you find SERPs where these types of sites are not present.

Build more links to become more authoritative

Another approach you could take is to double down on the SEO basics and start building more high-quality backlinks.

Write deep content

Most SEOs aren’t churning out 500-word blog posts and hoping for the best, but the content they do create is often not as deep as it could possibly be.

This is often due to time constraints, budget, and inclination. But to be competitive in the AI era, deep content is exactly what you should be creating.

As your website grows, the challenge of maintaining the performance of your content portfolio gets increasingly difficult.

And what may have been an “absolute banger” of an article in 2020 might not be such a great article now—so you’ll need to update it to keep the clicks rolling in.

So how can you ensure that your content is the best it can be?

How to deal with it:

Here’s the process I use:

Steal this content updating framework

And here’s a practical example of this in action:

Use Page Inspect with Overview to identify pages that need updating

Here’s an example of an older article Michal Pecánek wrote that I recently updated. Using Page Inspect, we can pinpoint the exact date of the update as May 10, 2024, with no other major changes in the last year.

Ahrefs Page Inspect screenshot, via Ahrefs' Site Explorer

According to Ahrefs, this update almost doubled the page’s organic traffic, underlining the value of updating old content. Before the update, the content had reached its lowest performance ever.

Example of a content update and the impact on organic traffic, via Ahrefs' Site Explorer

So, what changed to casually double the traffic? Clicking on Page Inspect gives us our answer.

Page Inspect detail screenshot, via Ahrefs' Site Explorer

I was focused on achieving three aims with this update:

  • Keeping Michal’s original framework for the post intact
  • Making the content as concise and readable as it can be
  • Refreshing the template (the main draw of the post) and explaining how to use the updated version in a beginner-friendly way to match the search intent

Getting buy-in for SEO projects has never been easy compared to other channels. Unfortunately, this meme perfectly describes my early days of agency life.

SEO meme, SEO vs PPC budgets

SEO is not an easy sell—either internally or externally to clients.

With companies hiring fewer SEO roles this year, the appetite for risk seems lower than in previous years.

SEO can also be slow to take impact, meaning getting buy-in for projects is harder than other channels.

How long does SEO take illustration

How to deal with it:

My colleague Despina Gavoyannis has written a fantastic article about how to get SEO buy-in, here is a summary of her top tips:

  • Find key influencers and decision-makers within the organization, starting with cross-functional teams before approaching executives. (And don’t forget the people who’ll actually implement your changes—developers.)
  • Adapt your language and communicate the benefits of SEO initiatives in terms that resonate with different stakeholders’ priorities.
  • Highlight the opportunity costs of not investing in SEO by showing the potential traffic and revenue being missed out on using metrics like Ahrefs’ traffic value.
  • Collaborate cross-functionally by showing how SEO can support other teams’ goals, e.g. helping the editorial team create content that ranks for commercial queries.

And perhaps most important of all: build better business cases and SEO opportunity forecasts.

If you just want to show the short-term trend for a keyword, you can use Keywords Explorer:

Forecasting feature for keywords, via Ahrefs' Keywords Explorer
The forecasted trend is shown in orange as a dotted line.

If you want to show the Traffic potential of a particular keyword, you can use our Traffic potential metric in SERP overview to gauge this:

Traffic potential example, via Ahrefs' Site Explorer

And if you want to go the whole hog, you can create an SEO forecast. You can use a third-party tool to create a forecast, but I recommend you use Patrick Stox’s SEO forecasting guide.

Final thoughts

Of all the SEO challenges mentioned above, the one keeping SEOs awake at night is AI.

It’s swept through our industry like a hurricane, presenting SEOs with many new challenges. The SERPs are changing, competitors are using AI tools, and the bar for creating basic content has been lowered, all thanks to AI.

If you want to stay competitive, you need to arm yourself with the best SEO tools and search data on the market—and for me, that always starts with Ahrefs.

Got questions? Ping me on X.



Why Now’s The Time To Adopt Schema Markup

There is no better time for organizations to prioritize Schema Markup.

Why is that so, you might ask?

First of all, Schema Markup (aka structured data) is not new.

Google has been awarding sites that implement structured data with rich results. If you haven’t taken advantage of rich results in search, it’s time to gain a higher click-through rate from these visual features in search.

Secondly, now that search is primarily driven by AI, helping search engines understand your content is more important than ever.

Schema Markup allows your organization to clearly articulate what your content means and how it relates to other things on your website.

The final reason to adopt Schema Markup is that, when done correctly, you can build a content knowledge graph, which is a critical enabler in the age of generative AI. Let’s dig in.

Schema Markup For Rich Results

Schema.org has been around since 2011. Back then, Google, Bing, Yahoo, and Yandex worked together to create the standardized Schema.org vocabulary to enable website owners to translate their content to be understood by search engines.

Since then, Google has incentivized websites to implement Schema Markup by awarding rich results to websites with certain types of markup and eligible content.

Websites that achieve these rich results tend to see higher click-through rates from the search engine results page.

In fact, Schema Markup is one of the most well-documented SEO tactics that Google explicitly tells you to use. With so many things in SEO that have to be reverse-engineered, this one is straightforward and highly recommended.

You might have delayed implementing Schema Markup due to the lack of applicable rich results for your website. That might have been true at one point, but I’ve been doing Schema Markup since 2013, and the number of rich results available is growing.

Even though Google deprecated how-to rich results and changed the eligibility of FAQ rich results in August 2023, it introduced seven new rich results in the months following – the most new rich results introduced in a year!

These rich results include vehicle listing, course info, profile page, discussion forum, organization, vacation rental, and product variants.

There are now 35 rich results that you can use to stand out in search, and they apply to a wide range of industries such as healthcare, finance, and tech.

Here are some widely applicable rich results you should consider utilizing:

  • Breadcrumb.
  • Product.
  • Reviews.
  • JobPosting.
  • Video.
  • Profile Page.
  • Organization.

With so many opportunities to take control of how you appear in search, it’s surprising that more websites haven’t adopted it.

A statistic from Web Data Commons’ October 2023 Extractions Report showed that only 50% of pages had structured data.

Of the pages with JSON-LD markup, these were the top types of entities found.

  • http://schema.org/ListItem (2,341,592,788 Entities)
  • http://schema.org/ImageObject (1,429,942,067 Entities)
  • http://schema.org/Organization (907,701,098 Entities)
  • http://schema.org/BreadcrumbList (817,464,472 Entities)
  • http://schema.org/WebSite (712,198,821 Entities)
  • http://schema.org/WebPage (691,208,528 Entities)
  • http://schema.org/Offer (623,956,111 Entities)
  • http://schema.org/SearchAction (614,892,152 Entities)
  • http://schema.org/Person (582,460,344 Entities)
  • http://schema.org/EntryPoint (502,883,892 Entities)

(Source: October 2023 Web Data Commons Report)

Most of the types on the list are related to the rich results mentioned above.

For example, ListItem and BreadcrumbList are required for the Breadcrumb Rich Result, SearchAction is required for Sitelink Search Box, and Offer is required for the Product Rich Result.

This tells us that most websites are using Schema Markup for rich results.

Even though these Schema.org types can help your site achieve rich results and stand out in search, on their own they don't tell search engines in detail what each page is about, nor do they make your site more semantic.

Help AI Search Engines Understand Your Content

Have you ever seen your competitor's sites using specific Schema.org types that are not found in Google's structured data documentation (e.g., MedicalClinic, IndividualPhysician, Service)?

The Schema.org vocabulary has over 800 types and properties to help websites explain what the page is about. However, Google’s structured data features only require a small subset of these properties for websites to be eligible for a rich result.

Many websites that solely implement Schema Markup to get rich results tend to be less descriptive with their Schema Markup.

AI search engines now look at the meaning and intent behind your content to provide users with more relevant search results.

Therefore, organizations that want to stay ahead should use more specific Schema.org types and leverage appropriate properties to help search engines better understand and contextualize their content. You can be descriptive with your content while still achieving rich results.

For example, each type (e.g. Article, Person, etc.) in the Schema.org vocabulary has 40 or more properties to describe the entity.

The properties are there to help you fully describe what the page is about and how it relates to other things on your website and the web. In essence, it’s asking you to describe the entity or topic of the page semantically.
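As an illustration, here is a sketch of what more descriptive markup for a medical clinic page could look like. Every name, address, and phone number below is a made-up placeholder, and you would tailor the properties to your actual content:

```json
{
  "@context": "https://schema.org",
  "@type": "MedicalClinic",
  "name": "Example Family Clinic",
  "url": "https://www.example-clinic.com/",
  "medicalSpecialty": "PrimaryCare",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  },
  "openingHours": "Mo-Fr 09:00-17:00"
}
```

MedicalClinic is not tied to any rich result in Google's gallery, but properties like medicalSpecialty describe the page far more precisely than a generic LocalBusiness type would – exactly the kind of semantic detail AI search engines can use.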

The word 'semantic' refers to understanding the meaning of language.

Note that the word “understanding” is part of the definition. Funny enough, in October 2023, John Mueller at Google released a Search Update video. In this six-minute video, he leads with an update on Schema Markup.

For the first time, Mueller described Schema Markup as "a code you can add to your web pages, which search engines can use to better understand the content."

While Mueller has historically spoken a lot about Schema Markup, he typically talked about it in the context of rich result eligibility. So, why the change?

This shift in thinking about Schema Markup for enhanced search engine understanding makes sense. With AI’s growing role and influence in search, we need to make it easy for search engines to consume and understand the content.

Take Control Of AI By Shaping Your Data With Schema Markup

Now, if being understood and standing out in search is not a good enough reason to get started, then doing it to help your enterprise take control of your content and prepare it for artificial intelligence is.

In February 2024, Gartner published a report on "30 Emerging Technologies That Will Guide Your Business Decisions," highlighting generative AI and knowledge graphs as critical emerging technologies companies should invest in within the next 0-1 years.

Knowledge graphs are collections of relationships between entities defined using a standardized vocabulary that enables new knowledge to be gained by way of inferencing.

Good news! When you implement Schema Markup to define and connect the entities on your site, you are creating a content knowledge graph for your organization.

Thus, your organization gains a critical enabler for generative AI adoption while reaping its SEO benefits.
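Connecting entities is what turns isolated markup into a graph. In JSON-LD, this is done by giving each entity an `@id` and referencing that identifier from other entities. A sketch with placeholder URLs:

```json
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "@id": "https://www.example.com/#organization",
      "name": "Example Corp",
      "url": "https://www.example.com/"
    },
    {
      "@type": "WebPage",
      "@id": "https://www.example.com/about/#webpage",
      "name": "About Example Corp",
      "about": { "@id": "https://www.example.com/#organization" }
    }
  ]
}
```

Because the WebPage's about property points at the Organization's @id instead of repeating its details, search engines can resolve both nodes to the same entity – and reuse that identity anywhere else on your site it appears.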

Learn more about building content knowledge graphs in my article, Extending Your Schema Markup From Rich Results to Knowledge Graphs.

We can also look at other experts in the knowledge graph field to understand the urgency of implementing Schema Markup.

In his LinkedIn post, Tony Seale, Knowledge Graph Architect at UBS in the UK, said,

“AI does not need to happen to you; organizations can shape AI by shaping their data.

It is a choice: We can allow all data to be absorbed into huge ‘data gravity wells’ or we can create a network of networks, each of us connecting and consolidating our data.”

The "network of networks" Seale refers to is the concept of knowledge graphs – the same knowledge graphs that can be built from your web data using semantic Schema Markup.

The AI revolution has only just begun, and there is no better time than now to shape your data, starting with your web content through the implementation of Schema Markup.

Use Schema Markup As The Catalyst For AI

In today’s digital landscape, organizations must invest in new technology to keep pace with the evolution of AI and search.

Whether your goal is to stand out on the SERP or ensure your content is understood as intended by Google and other search engines, the time to implement Schema Markup is now.

With Schema Markup, SEO pros can become heroes, enabling generative AI adoption through content knowledge graphs while delivering tangible benefits, such as increased click-through rates and improved search visibility.



Featured Image by author


SEO

Google Quietly Ends Covid-Era Rich Results

Google has removed the Covid-era structured data support for the Home Activities rich results, which had allowed online events to be surfaced in search since August 2020. The removal was announced in a brief note in the Search documentation changelog.

Home Activities Rich Results

The structured data for the Home Activities rich results allowed providers of online livestreams, pre-recorded events and online events to be findable in Google Search.

The original documentation has been completely removed from the Google Search Central pages and now redirects to a changelog notation explaining that the Home Activities rich result is no longer available for display.

The original purpose was to allow people to discover things to do from home while in quarantine, particularly online classes and events. Google’s rich results surfaced details of how to watch, description of the activities and registration information.

Providers of online events were required to use Event or Video structured data. Publishers and businesses who still have this kind of structured data should be aware that this rich result is no longer surfaced, but there's no need to remove the markup: publishing structured data that isn't used for rich results doesn't hurt anything.

The changelog for Google’s official documentation explains:

“Removing home activity documentation
What: Removed documentation on home activity structured data.

Why: The home activity feature no longer appears in Google Search results.”

Read more about Google’s Home Activities rich results:

Google Announces Home Activities Rich Results

Read the Wayback Machine’s archive of Google’s original announcement from 2020:

Home activities

Featured Image by Shutterstock/Olga Strel
