How To Set Up Scroll Depth Tracking In GA4

If you are familiar with Google Analytics 4 (GA4), you probably know already that it has built-in scroll tracking by default.

And naturally, you might be asking yourself: why do I need custom tracking for scroll depth?

GA4's built-in scroll event triggers when a user has scrolled through approximately 90% of the page – but you most likely want to know more than that, such as how many users scrolled through 50% or even 25% of the page.

Here is why you would need to set up custom scroll depth tracking in GA4.

By tracking scroll depth, you can gain insights into user engagement and behavior, understand how much content users are consuming, and optimize accordingly.

We will be looking into implementation by using Google Tag Manager (GTM).

Overall, I would highly recommend using GTM for your GA4 setup over gtag.js: it has very handy features and requires less time to maintain.

At the very least, you can go hybrid and use GTM with the dataLayer.push method whenever you need more granular control over tracking.
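For instance, a hybrid dataLayer.push for a custom interaction might look like the sketch below. The event and parameter names are illustrative, not a GA4 convention, and the guard simply lets the snippet run outside a browser:

```javascript
// Ensure the dataLayer exists before pushing (standard GTM pattern).
// The typeof guard keeps the snippet runnable outside a browser too.
const dl =
  typeof window !== 'undefined'
    ? (window.dataLayer = window.dataLayer || [])
    : [];

// Push a custom event that a GTM "Custom Event" trigger can listen for.
// "article_milestone" and its parameters are hypothetical names.
dl.push({
  event: 'article_milestone',
  milestone_type: 'half_read',
  article_id: 'example-article-123',
});
```

In GTM, a Custom Event trigger matching `article_milestone` would then fire whatever tag you attach to it.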

We will learn how to set up scroll depth tracking and build an example content engagement dashboard in Looker Studio.

How To Set Up Scroll Depth Tracking

First of all, we have to disable default scroll tracking in the data stream's Enhanced measurement settings.

Navigate to Admin > Data Streams, and click your stream.

Enhanced measurements.

Disable scroll tracking in the popup dialog.

Scroll tracking.

Navigate to GA4’s custom definitions setting page and add the custom dimension “scroll_percentage” (you can name it anything). We will be using this to send scroll depth thresholds.

Scroll_percentage custom dimension.

Navigate to Variables in Google Tag Manager and enable these three built-in variables:

  • Scroll Depth Threshold.
  • Scroll Depth Units.
  • Scroll Direction.
Scrolling variables.

Navigate to Triggers and add a scroll depth trigger with the name “Custom Scroll.”

In the “Percentages” settings, add the scroll depth levels you want to track as a comma-separated list.

Please note: If you have a high-traffic website and add too many thresholds (21, 22, 23 … 90), you may hit GA4's 1M daily events BigQuery export limit and lose one of GA4's greatest benefits.
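To sanity-check your threshold list against that limit, you can estimate the daily event volume it would generate. A rough back-of-the-envelope sketch (all traffic numbers are hypothetical, and it crudely assumes every pageview reaches the average depth):

```javascript
// Rough estimate of daily scroll events: each pageview can fire one event
// per threshold the reader crosses. This is a deliberate simplification.
function estimateDailyScrollEvents(dailyPageviews, thresholds, avgDepthPercent) {
  // Count only thresholds the average visit actually reaches.
  const reached = thresholds.filter((t) => t <= avgDepthPercent).length;
  return dailyPageviews * reached;
}

// 200k pageviews/day, thresholds every 25%, average reader reaching ~60%:
// two thresholds crossed, so 400,000 events/day - well under 1M.
const modest = estimateDailyScrollEvents(200000, [25, 50, 75, 90], 60);

// The same traffic with one-percent steps from 21 to 90 blows past the limit.
const steps = Array.from({ length: 70 }, (_, i) => i + 21);
const granular = estimateDailyScrollEvents(200000, steps, 60);
```

A handful of well-chosen thresholds usually gives you all the reporting granularity you need at a fraction of the event volume.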

You may notice that you can also track scroll depth using pixel values instead of percentages.

In some cases, that may make sense, but in my opinion, pixel-based tracking has limited use cases.

For instance, tracking a scroll depth of 1000 pixels might not accurately tell you how much of an article users have read, since articles can vary greatly in length, ranging from 2000 pixels to 10,000 pixels.
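Percentages normalize for page length, which is why they are usually the better fit for articles. A minimal sketch of how a percentage-based depth can be derived from scroll position (illustrative helpers, not GTM's internal code):

```javascript
// Derive a vertical scroll percentage from scroll position and page height.
// In a browser these values come from window.scrollY, window.innerHeight,
// and document.documentElement.scrollHeight.
function computeScrollPercent(scrollY, viewportHeight, documentHeight) {
  const scrollable = documentHeight - viewportHeight;
  if (scrollable <= 0) return 100; // page fits entirely in the viewport
  const percent = (scrollY / scrollable) * 100;
  return Math.min(100, Math.max(0, Math.round(percent)));
}

// Map a raw percentage onto the thresholds configured in the trigger.
function crossedThresholds(percent, thresholds) {
  return thresholds.filter((t) => percent >= t);
}
```

A reader halfway down a 1,700px page and a reader halfway down a 10,000px page both report 50%, which is exactly the comparability pixel values lack.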

Go to Tags in GTM and add an Event tag.

Event tag.

In the event name, type “scroll” and set the custom parameter to “scroll_percentage.”

Choose the trigger "Custom Scroll."

Set a trigger "Custom Scroll."

In a nutshell, custom scroll tracking works by:

  • Disabling default scroll tracking.
  • Re-adding the same scroll tracking with the event name “scroll.”
  • Sending scroll depth thresholds in a custom parameter.
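The hit this setup produces is equivalent to what you would send yourself with gtag.js; in GTM, the built-in Scroll Depth Threshold variable supplies the value. A self-contained sketch (the stub stands in for the real gtag function the GA4 snippet provides):

```javascript
// Stand-in for gtag.js so the example runs on its own; in a browser,
// gtag() is defined by the GA4 snippet and queues commands onto the dataLayer.
const sent = [];
function gtag(command, eventName, params) {
  sent.push({ command, eventName, params });
}

// What the GTM Event tag effectively sends when the 75% trigger fires.
// "scroll_percentage" is the custom parameter registered as a custom dimension.
gtag('event', 'scroll', { scroll_percentage: 75 });
```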

Once you have the scroll depth tracking set up, let's dive into how to build a Looker Studio (formerly Google Data Studio) dashboard that reports articles with their average scroll depth.

How To Create Looker Studio Report On Scroll Depth

Open Looker Studio and add GA4 as a source.

Looker Studio.

Insert Pivot Table as a chart type.

Pivot Table.

Add “Page path” as a row dimension and scroll_percentage as a column dimension. As a metric, add “Total Users” and “Event Count.”

Pivot Table with Data.

Apply a filter with “Event name” containing “scroll” to filter out other events, such as “page_view” from the data.

Because of how GA4 structures its data, the table pulls in all events, and a filter is needed to eliminate non-scroll events.

How To Create Scroll Depth Report In Explorations

The same report can be created in Explorations, though I recommend Looker Studio because you can share it with your editorial team and it is easier to read.

In order to create a scroll depth report in explorations, navigate to Explore and create a blank “Free Form” report. Add dimensions such as “Page path” and “scroll_percentage.”


Build the report according to the screenshot.

Scroll depth report.

Don’t forget to filter only events with the name “scroll.”

It is also useful to set conversions based on scroll depth. For example, you could count a conversion whenever someone reads 50% of an article.

In order to set up a conversion based on scroll depth, we will use one of GA4's greatest features: creating a new event based on another event's parameters.

Navigate to Events settings and click the Create event button in the upper right corner.


In the popup dialog, enter the conditions "event_name = scroll" and "scroll_percentage = 50," and name your event.
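Conceptually, the Create event feature applies a matching rule to each incoming event and emits a derived event when all conditions match. A sketch of that logic (illustrative only, not Google's implementation; "scroll_50" is a hypothetical name for the derived event):

```javascript
// Illustrative re-creation of GA4's "Create event" matching rule: when an
// incoming event satisfies every condition, a derived event is emitted.
function deriveEvent(incoming) {
  // Conditions configured in the dialog: event_name = scroll,
  // scroll_percentage = 50. Parameter values are compared as strings here
  // so both numeric and string payloads match.
  if (
    incoming.name === 'scroll' &&
    String(incoming.params.scroll_percentage) === '50'
  ) {
    return { name: 'scroll_50', params: { ...incoming.params } };
  }
  return null; // no derived event for non-matching hits
}
```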

50% scroll custom event based on scroll depth parameter.

And from Conversions settings, mark it as a conversion.



Here we’ve discussed how to track scroll depth in GA4, which is an important metric to measure your audience engagement.

I would recommend trying to build different segments and comparing how each type of user is engaging with your content.

Consider also blending data with Google Search Console to see which keywords drive more engagement with your content.

In the future, we will cover more on how to use GA4 and help you set it up.

To stay updated with our upcoming guides, we invite you to subscribe to our newsletter.


Featured Image: fizkes/Shutterstock




Why Now's The Time To Adopt Schema Markup

There is no better time for organizations to prioritize Schema Markup.

Why is that so, you might ask?

First of all, Schema Markup (aka structured data) is not new.

Google has been awarding sites that implement structured data with rich results. If you haven’t taken advantage of rich results in search, it’s time to gain a higher click-through rate from these visual features in search.

Secondly, now that search is primarily driven by AI, helping search engines understand your content is more important than ever.

Schema Markup allows your organization to clearly articulate what your content means and how it relates to other things on your website.

The final reason to adopt Schema Markup is that, when done correctly, you can build a content knowledge graph, which is a critical enabler in the age of generative AI. Let’s dig in.

Schema Markup For Rich Results

Schema Markup has been around since 2011. Back then, Google, Bing, Yahoo, and Yandex worked together to create the standardized Schema.org vocabulary to enable website owners to translate their content to be understood by search engines.

Since then, Google has incentivized websites to implement Schema Markup by awarding rich results to websites with certain types of markup and eligible content.

Websites that achieve these rich results tend to see higher click-through rates from the search engine results page.

In fact, Schema Markup is one of the most well-documented SEO tactics that Google tells you to do. With so many things in SEO that are backward-engineered, this one is straightforward and highly recommended.

You might have delayed implementing Schema Markup due to the lack of applicable rich results for your website. That might have been true at one point, but I’ve been doing Schema Markup since 2013, and the number of rich results available is growing.

Even though Google deprecated how-to rich results and changed the eligibility of FAQ rich results in August 2023, it introduced six new rich results in the months following – the most new rich results introduced in a year!

These rich results include vehicle listing, course info, profile page, discussion forum, organization, vacation rental, and product variants.

There are now 35 rich results that you can use to stand out in search, and they apply to a wide range of industries such as healthcare, finance, and tech.

Here are some widely applicable rich results you should consider utilizing:

  • Breadcrumb.
  • Product.
  • Reviews.
  • JobPosting.
  • Video.
  • Profile Page.
  • Organization.

With so many opportunities to take control of how you appear in search, it’s surprising that more websites haven’t adopted it.

A statistic from Web Data Commons’ October 2023 Extractions Report showed that only 50% of pages had structured data.

Of the pages with JSON-LD markup, these were the top types of entities found.

  • (2,341,592,788 Entities)
  • (1,429,942,067 Entities)
  • (907,701,098 Entities)
  • (817,464,472 Entities)
  • (712,198,821 Entities)
  • (691,208,528 Entities)
  • (623,956,111 Entities)
  • (614,892,152 Entities)
  • (582,460,344 Entities)
  • (502,883,892 Entities)

(Source: October 2023 Web Data Commons Report)

Most of the types on the list are related to the rich results mentioned above.

For example, ListItem and BreadcrumbList are required for the Breadcrumb Rich Result, SearchAction is required for Sitelink Search Box, and Offer is required for the Product Rich Result.
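For example, here is a minimal BreadcrumbList for a hypothetical site, built as a JavaScript object and serialized to the JSON-LD you would embed in a `<script type="application/ld+json">` tag. The Schema.org types are real; every URL and name is a placeholder:

```javascript
// Minimal BreadcrumbList markup. Each ListItem carries a position, a name,
// and the item URL; the list order mirrors the site's navigation path.
const breadcrumb = {
  '@context': 'https://schema.org',
  '@type': 'BreadcrumbList',
  itemListElement: [
    { '@type': 'ListItem', position: 1, name: 'Home', item: 'https://example.com/' },
    { '@type': 'ListItem', position: 2, name: 'Blog', item: 'https://example.com/blog/' },
  ],
};

// Serialize for embedding in a <script type="application/ld+json"> tag.
const jsonLd = JSON.stringify(breadcrumb, null, 2);
```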

This tells us that most websites are using Schema Markup for rich results.

Even though these types can help your site achieve rich results and stand out in search, they don’t necessarily tell search engines what each page is about in detail and help your site be more semantic.

Help AI Search Engines Understand Your Content

Have you ever seen your competitors' sites using specific types that are not found in Google's structured data documentation (e.g., MedicalClinic, IndividualPhysician, Service)?

The Schema.org vocabulary has over 800 types and properties to help websites explain what the page is about. However, Google's structured data features only require a small subset of these properties for websites to be eligible for a rich result.

Many websites that solely implement Schema Markup to get rich results tend to be less descriptive with their Schema Markup.

AI search engines now look at the meaning and intent behind your content to provide users with more relevant search results.

Therefore, organizations that want to stay ahead should use more specific types and leverage appropriate properties to help search engines better understand and contextualize their content. You can be descriptive with your content while still achieving rich results.

For example, each type (e.g. Article, Person, etc.) in the vocabulary has 40 or more properties to describe the entity.

The properties are there to help you fully describe what the page is about and how it relates to other things on your website and the web. In essence, it’s asking you to describe the entity or topic of the page semantically.
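As a sketch of what describing an entity semantically can look like, this Article markup goes beyond the rich-result minimum by using properties such as `about`, `mentions`, and `isPartOf` to relate the page to other entities. The types and properties are real Schema.org terms; all values are placeholders:

```javascript
// A more semantic Article: beyond what a rich result requires, "about",
// "mentions", and "isPartOf" connect this page to other entities, and
// "@id" gives the entity a stable identifier for a knowledge graph.
const article = {
  '@context': 'https://schema.org',
  '@type': 'Article',
  '@id': 'https://example.com/blog/schema-guide#article',
  headline: 'A Guide to Semantic Schema Markup',
  author: {
    '@type': 'Person',
    name: 'Jane Doe',
    sameAs: 'https://example.com/team/jane',
  },
  about: {
    '@type': 'Thing',
    name: 'Structured data',
    sameAs: 'https://en.wikipedia.org/wiki/Structured_data',
  },
  mentions: [{ '@type': 'Organization', name: 'Example Corp' }],
  isPartOf: { '@type': 'Blog', '@id': 'https://example.com/blog#blog' },
};
```

The same `@id` values can then be reused across pages, which is how individual snippets of markup link up into a content knowledge graph.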

The word ‘semantic’ is about understanding the meaning of language.

Note that the word “understanding” is part of the definition. Funny enough, in October 2023, John Mueller at Google released a Search Update video. In this six-minute video, he leads with an update on Schema Markup.

For the first time, Mueller described Schema Markup as “a code you can add to your web pages, which search engines can use to better understand the content.”

While Mueller has historically spoken a lot about Schema Markup, he typically talked about it in the context of rich result eligibility. So, why the change?

This shift in thinking about Schema Markup for enhanced search engine understanding makes sense. With AI’s growing role and influence in search, we need to make it easy for search engines to consume and understand the content.

Take Control Of AI By Shaping Your Data With Schema Markup

Now, if being understood and standing out in search is not a good enough reason to get started, then doing it to help your enterprise take control of your content and prepare it for artificial intelligence is.

In February 2024, Gartner published a report on “30 Emerging Technologies That Will Guide Your Business Decisions,” highlighting generative AI and knowledge graphs as critical emerging technologies companies should invest in within the next 0-1 years.

Knowledge graphs are collections of relationships between entities defined using a standardized vocabulary that enables new knowledge to be gained by way of inferencing.

Good news! When you implement Schema Markup to define and connect the entities on your site, you are creating a content knowledge graph for your organization.

Thus, your organization gains a critical enabler for generative AI adoption while reaping its SEO benefits.

Learn more about building content knowledge graphs in my article, Extending Your Schema Markup From Rich Results to Knowledge Graphs.

We can also look at other experts in the knowledge graph field to understand the urgency of implementing Schema Markup.

In his LinkedIn post, Tony Seale, Knowledge Graph Architect at UBS in the UK, said,

“AI does not need to happen to you; organizations can shape AI by shaping their data.

It is a choice: We can allow all data to be absorbed into huge ‘data gravity wells’ or we can create a network of networks, each of us connecting and consolidating our data.”

The “network of networks” Seale refers to is the concept of knowledge graphs – the same knowledge graph that can be built from your web data using semantic Schema Markup.

The AI revolution has only just begun, and there is no better time than now to shape your data, starting with your web content through the implementation of Schema Markup.

Use Schema Markup As The Catalyst For AI

In today’s digital landscape, organizations must invest in new technology to keep pace with the evolution of AI and search.

Whether your goal is to stand out on the SERP or ensure your content is understood as intended by Google and other search engines, the time to implement Schema Markup is now.

With Schema Markup, SEO pros can become heroes, enabling generative AI adoption through content knowledge graphs while delivering tangible benefits, such as increased click-through rates and improved search visibility.


Featured Image by author




Google Quietly Ends Covid-Era Rich Results

Google has removed the Covid-era structured data associated with the Home Activities rich results, which had allowed online events to be surfaced in search since August 2020. A mention of the removal was published in the search documentation changelog.

Home Activities Rich Results

The structured data for the Home Activities rich results allowed providers of online livestreams, pre-recorded events and online events to be findable in Google Search.

The original documentation has been completely removed from the Google Search Central webpages and now redirects to a changelog notation explaining that the Home Activities rich result is no longer available for display.

The original purpose was to allow people to discover things to do from home while in quarantine, particularly online classes and events. Google’s rich results surfaced details of how to watch, description of the activities and registration information.

Providers of online events were required to use Event or Video structured data. Publishers and businesses that have this kind of structured data should be aware that these rich results are no longer surfaced. However, it's not necessary to remove the markup if doing so is a burden: publishing structured data that isn't used for rich results won't hurt anything.

The changelog for Google’s official documentation explains:

“Removing home activity documentation
What: Removed documentation on home activity structured data.

Why: The home activity feature no longer appears in Google Search results.”

Read more about Google’s Home Activities rich results:

Google Announces Home Activities Rich Results

Read the Wayback Machine’s archive of Google’s original announcement from 2020:

Home activities

Featured Image by Shutterstock/Olga Strel




Google's Gary Illyes: Lastmod Signal Is Binary

In a recent LinkedIn discussion, Gary Illyes, Analyst at Google, revealed that the search engine takes a binary approach when assessing a website’s lastmod signal from sitemaps.

The revelation came as Illyes encouraged website owners to upgrade to WordPress 6.5, which now natively supports the lastmod element in sitemaps.

When Mark Williams-Cook asked if Google has a “reputation system” to gauge how much to trust a site’s reported lastmod dates, Illyes stated, “It’s binary: we either trust it or we don’t.”

No Shades Of Gray For Lastmod

The lastmod tag indicates the date of the most recent significant update to a webpage, helping search engines prioritize crawling and indexing.

Illyes’ response suggests Google doesn’t factor in a website’s history or gradually build trust in the lastmod values being reported.

Google either accepts the lastmod dates provided in a site’s sitemap as accurate, or it disregards them.

This binary approach reinforces the need to implement the lastmod tag correctly and only specify dates when making meaningful changes.
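In practice, that means emitting lastmod in the W3C date format and updating it only when the content meaningfully changes. A minimal sketch of generating one sitemap `<url>` entry (the URL and date are placeholders, and the helper name is illustrative):

```javascript
// Build one sitemap <url> entry with a lastmod in the W3C date format
// (YYYY-MM-DD) used by the sitemaps protocol.
function sitemapUrlEntry(loc, lastModifiedDate) {
  // toISOString() gives "YYYY-MM-DDTHH:mm:ss.sssZ"; keep the date part.
  const lastmod = lastModifiedDate.toISOString().slice(0, 10);
  return [
    '  <url>',
    `    <loc>${loc}</loc>`,
    `    <lastmod>${lastmod}</lastmod>`,
    '  </url>',
  ].join('\n');
}

// The date should reflect the last meaningful change, not the last rebuild.
const entry = sitemapUrlEntry(
  'https://example.com/post/',
  new Date('2024-04-01T10:30:00Z')
);
```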

Illyes commended the WordPress developer community for their work on version 6.5, which automatically populates the lastmod field without extra configuration.

Accurate Lastmod Essential For Crawl Prioritization

While convenient for WordPress users, the native lastmod support is only beneficial if Google trusts you’re using it correctly.

Inaccurate lastmod tags could lead to Google ignoring the signal when scheduling crawls.

Illyes' confirmation of Google's stance shows there's no room for error when using this tag.

Why SEJ Cares

Understanding how Google acts on lastmod can help ensure Google displays new publish dates in search results when you update your content.

It’s an all-or-nothing situation – if the dates are deemed untrustworthy, the signal could be disregarded sitewide.

With the information revealed by Illyes, you can ensure your implementation follows best practices to the letter.

Featured Image: Danishch/Shutterstock

