What Is Largest Contentful Paint: An Easy Explanation

Largest Contentful Paint (LCP) is a Google user experience metric integrated into ranking systems in 2021.

LCP is one of the three Core Web Vitals (CWV), a set of metrics that track technical performance factors that impact user experience.

Core Web Vitals occupy a somewhat paradoxical position: Google provides guidance highlighting their importance while downplaying their impact on rankings.

LCP, like the other CWV signals, is useful for diagnosing technical issues and ensuring your website meets a base level of functionality for users.

What Is Largest Contentful Paint?

LCP measures how long it takes for the main content of a page to download and render.

Specifically, it is the time from page load initiation to the rendering of the largest image or block of text within the user's viewport. Anything below the fold doesn't count.

Images, video poster images, background images, and block-level text elements like paragraph tags are typical elements measured.

LCP consists of the following sub-metrics:

  • Time to first byte (TTFB).
  • Resource load delay.
  • Resource load duration.
  • Element render delay.

Optimizing for LCP means optimizing each of these sub-metrics so that it takes less than 2.5 seconds to load and display the LCP resource.

Here is the threshold scale for your reference:

  • Good: 2.5 seconds or less.
  • Needs improvement: 2.5 to 4 seconds.
  • Poor: more than 4 seconds.

Let's dive into what these sub-metrics mean and how you can improve each of them.

Time To First Byte (TTFB)

TTFB is the server response time: it measures how long it takes for the user's browser to receive the first byte of data from your server. This includes DNS lookup time, the time the server takes to process the request, and any redirects.

Optimizing TTFB can significantly reduce the overall load time and improve LCP.
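
If you want a quick way to check TTFB on real page loads, here is a minimal sketch using the standard Navigation Timing API; the 800 ms budget in the comment is an illustrative target, not an official limit.

```html
<!-- Sketch: log the current page's TTFB via the Navigation Timing API. -->
<script>
  window.addEventListener("load", () => {
    const [nav] = performance.getEntriesByType("navigation");
    if (!nav) return; // API not available in this browser
    const ttfb = nav.responseStart; // ms from navigation start to first byte
    console.log(`TTFB: ${Math.round(ttfb)} ms`);
    if (ttfb > 800) {
      console.warn("High TTFB: check server processing, caching, and redirects.");
    }
  });
</script>
```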

Server response time largely depends on:

  • Database queries.
  • CDN cache misses.
  • Inefficient server-side rendering.
  • Hosting.

Let’s review each:

1. Database Queries

If your response time is high, try to identify the source.

For example, it may be due to poorly optimized queries or a high volume of queries slowing down the server's response time. If you have a MySQL database, you can enable the slow query log to identify which queries take the longest.

If you have a WordPress website, you can use the Query Monitor plugin to see how much time SQL queries take.

Other great tools are Blackfire and New Relic, which don't depend on the CMS or stack you use but require installation on your hosting/server.

2. CDN Cache Misses

A CDN cache miss occurs when a requested resource is not found in the CDN's cache and the request is forwarded to the origin server. This extra round trip increases latency and load times for the end user.

Usually, your CDN provider has a report on how many cache misses you have.

Example of a CDN cache report.

If you observe a high percentage of cache misses (>10%), you may need to contact your CDN provider, or your hosting support if you use managed hosting with caching integrated, to resolve the issue.

One common cause of cache misses is a search spam attack.

For example, a dozen spammy domains link to your internal search pages with random spammy queries like [/?q=甘肃代], which are never cached because the search term is different each time. The issue is that Googlebot crawls them aggressively, which can drive up server response times and cache misses.

In that case, and in general, it is good practice to block search or facet URLs via robots.txt. But once you block them, you may find that those URLs still get indexed because they have backlinks from low-quality websites.

However, don't worry: John Mueller said this will clear up in time.

Here is a real-life example from the search console of high server response time (TTFB) caused by cache misses:

Crawl spike of 404 search pages that have high server response time.

3. Inefficient Server-Side Rendering

You may have certain components on your website that depend on third-party APIs.

For example, you may have seen read and share counts on SEJ's articles. We fetch those numbers from different APIs, but instead of fetching them when a request hits the server, we prefetch them and store them in our database for faster display.

Imagine if we connected to the share count and GA4 APIs on every request to the server. Each API call takes about 300-500 ms to execute, so we would add roughly 1,000 ms of delay due to inefficient server-side rendering. So, make sure your backend is optimized.

4. Hosting

Be aware that hosting is highly important for a low TTFB. By choosing the right hosting, you may be able to cut your TTFB by a factor of two to three.

Choose hosting with a CDN and caching integrated into the system. This will help you avoid purchasing a CDN separately and save the time needed to maintain it.

So, investing in the right hosting will pay off.

Now, let's look into the other sub-metrics mentioned above that contribute to LCP.

Resource Load Delay

Resource load delay is the time it takes for the browser to locate and start downloading the LCP resource.

For example, if your hero section uses a background image defined in CSS, the browser can't discover that image until the CSS file has been downloaded, so the load delay equals the time needed to fetch that CSS before the LCP image can start downloading.

When the LCP element is a text block, this delay is zero.

By optimizing how quickly these resources are identified and loaded, you can improve the time it takes to display critical content. One way to do this is to preload CSS files and LCP images with fetchpriority="high" so the browser prioritizes loading above-the-fold images and styles.
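
As a rough sketch of those hints (the file paths below are placeholders), the markup could look like this:

```html
<!-- Sketch: tell the browser early that the hero image and styles are important. -->
<head>
  <link rel="preload" as="style" href="/css/styles.css">
  <link rel="preload" as="image" href="/img/hero.webp" fetchpriority="high">
  <link rel="stylesheet" href="/css/styles.css">
</head>
<body>
  <img src="/img/hero.webp" alt="Hero banner" width="1200" height="600" fetchpriority="high">
</body>
```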

But a better approach, if you have enough control over the website, is to inline the critical CSS required for above-the-fold content, so the browser doesn't spend time downloading a separate CSS file. This saves bandwidth and helps images load faster.
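
A minimal sketch of that approach, assuming you can split your CSS into a small critical portion and a larger non-critical file (paths and rules are placeholders); the media="print" swap shown is one common pattern for loading the rest without blocking rendering:

```html
<head>
  <style>
    /* Critical, above-the-fold rules only */
    .hero { min-height: 60vh; display: flex; align-items: center; }
    .hero h1 { font-size: 2.5rem; }
  </style>
  <!-- Load the full stylesheet without blocking the first render -->
  <link rel="stylesheet" href="/css/full.css" media="print" onload="this.media='all'">
  <noscript><link rel="stylesheet" href="/css/full.css"></noscript>
</head>
```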

Of course, it’s even better if you design webpages to avoid hero images or sliders, as those usually don’t add value, and users tend to scroll past them since they are distracting.

Another major factor contributing to load delay is redirects.

If you have external backlinks that go through redirects, there's not much you can do. But you do have control over your internal links, so try to find internal links that point to redirects, usually caused by missing trailing slashes, non-WWW versions, or changed URLs, and replace them with the final destination URLs.

There are a number of technical SEO tools you can use to crawl your website and find redirects to be replaced.

Resource Load Duration

Resource load duration refers to the actual time spent downloading the LCP resource.

Even if the browser quickly finds and starts downloading resources, slow download speeds can still affect LCP negatively. It depends on the size of the resources, the server’s network connection speed, and the user’s network conditions.

You can reduce resource load duration by implementing the following (a combined markup sketch follows the list):

  • The WebP image format.
  • Properly sized images (make the intrinsic size of the image match the visible size).
  • Load prioritization, using fetchpriority="high" on important resources and fetchpriority="low" on non-critical ones.
  • A CDN.
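
Here is what several of those points can look like combined in markup; the URLs, sizes, and dimensions are placeholders:

```html
<!-- Sketch: a right-sized WebP LCP image with explicit dimensions,
     responsive variants, and a high fetch priority. -->
<img
  src="/img/hero-800.webp"
  srcset="/img/hero-400.webp 400w,
          /img/hero-800.webp 800w,
          /img/hero-1600.webp 1600w"
  sizes="(max-width: 800px) 100vw, 800px"
  width="800" height="450"
  alt="Product hero"
  fetchpriority="high">
```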

Element Render Delay

Element render delay is the time it takes for the browser to process and render the LCP element.

This metric is influenced by the complexity of your HTML, CSS, and JavaScript.

Minimizing render-blocking resources and optimizing your code can help reduce this delay. However, heavy JavaScript may be running that blocks the main thread, delaying the rendering of the LCP element until those tasks are completed.
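
One simple mitigation, sketched below with placeholder script names, is to keep non-critical JavaScript from blocking parsing and rendering:

```html
<!-- defer: download in parallel, execute in order after HTML parsing finishes -->
<script src="/js/app.js" defer></script>
<!-- async: download in parallel, execute as soon as it arrives (for independent scripts) -->
<script src="/js/analytics.js" async></script>
```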

This is where a low Total Blocking Time (TBT) value matters: TBT measures the total time the main thread is blocked by long tasks during page load, indicating the presence of heavy scripts that can delay rendering and responsiveness.

One way to improve not only load delay and duration but all CWV metrics for subsequent navigations within your website is to implement the Speculation Rules API. By prerendering the pages users are most likely to visit next, for example as they hover over a link, you can make those pages load almost instantaneously.
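
A minimal sketch of speculation rules, assuming your article URLs live under /blog/ (adjust the pattern to your site); browsers that don't support the API simply ignore the block:

```html
<script type="speculationrules">
{
  "prerender": [{
    "where": { "href_matches": "/blog/*" },
    "eagerness": "moderate"
  }]
}
</script>
```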

Beware These Scoring “Gotchas”

All elements in the user’s screen (the viewport) are used to calculate LCP. That means that images rendered off-screen and then shifted into the layout, once rendered, may not count as part of the Largest Contentful Paint score.

On the opposite end, elements starting in the user viewport and then getting pushed off-screen may be counted as part of the LCP calculation.

How To Measure The LCP Score

There are two kinds of scoring tools. The first is called Field Tools, and the second is called Lab Tools.

Field tools report actual measurements collected from real users visiting a site.

Lab tools produce a score from a simulated page load, using throttling that approximates the network and device conditions a typical mobile phone user might encounter.

One way to find LCP resources and measure how long they take to display is via the DevTools > Performance report.
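
Alternatively, a small script using the PerformanceObserver API can log LCP candidates directly in the browser console:

```html
<script>
  new PerformanceObserver((entryList) => {
    for (const entry of entryList.getEntries()) {
      // The last candidate reported before user input is the final LCP element.
      console.log("LCP candidate:", Math.round(entry.startTime), "ms", entry.element);
    }
  }).observe({ type: "largest-contentful-paint", buffered: true });
</script>
```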

You can read more in our in-depth guide on how to measure CWV metrics, where you can learn how to troubleshoot not only LCP but other metrics altogether.

LCP Optimization Is A Much More In-Depth Subject

Improving LCP is a crucial step toward improving CWV, but it can be the most challenging CWV metric to optimize.

LCP consists of multiple layers of sub-metrics, each requiring a thorough understanding for effective optimization.

This guide has given you a basic idea of how to improve LCP, and the insights you've gained so far will help you make significant improvements.

But there’s still more to learn. Optimizing each sub-metric is a nuanced science. Stay tuned, as we’ll publish in-depth guides dedicated to optimizing each sub-metric.

Featured image credit: BestForBest/Shutterstock

Mediavine Bans Publisher For Overuse Of AI-Generated Content

According to details surfacing online, ad management firm Mediavine is terminating publishers’ accounts for overusing AI.

Mediavine is a leading ad management company providing products and services to help website publishers monetize their content.

The company holds elite status as a Google Certified Publishing Partner, which indicates that it meets Google’s highest standards and requirements for ad networks and exchanges.

AI Content Triggers Account Terminations

The terminations came to light in a post on the Reddit forum r/Blogging, where a user shared an email they received from Mediavine citing “overuse of artificially created content.”

Trista Jensen, Mediavine’s Director of Ad Operations & Market Quality, states in the email:

“Our third party content quality tools have flagged your sites for overuse of artificially created content. Further internal investigation has confirmed those findings.”

Jensen stated that due to the overuse of AI content, “our top partners will stop spending on your sites, which will negatively affect future monetization efforts.”

Consequently, Mediavine terminated the publisher’s account “effective immediately.”

The Risks Of Low-Quality AI Content

This strict enforcement aligns with Mediavine’s publicly stated policy prohibiting websites from using “low-quality, mass-produced, unedited or undisclosed AI content that is scraped from other websites.”

In a March 7 blog post titled “AI and Our Commitment to a Creator-First Future,” the company declared opposition to low-value AI content that could “devalue the contributions of legitimate content creators.”

Mediavine warned in the post:

“Without publishers, there is no open web. There is no content to train the models that power AI. There is no internet.”

The company says it’s using its platform to “advocate for publishers” and uphold quality standards in the face of AI’s disruptive potential.

Mediavine states:

“We’re also developing faster, automated tools to help us identify low-quality, mass-produced AI content across the web.”

Targeting ‘AI Clickbait Kingpin’ Tactics

While the Reddit user’s identity wasn’t disclosed, the incident has drawn connections to the tactics of Nebojša Vujinović Vujo, who was dubbed an “AI Clickbait Kingpin” in a recent Wired exposé.

According to Wired, Vujo acquired over 2,000 dormant domains and populated them with AI-generated, search-optimized content designed purely to capture ad revenue.

His strategies represent the low-quality, artificial content Mediavine has vowed to prohibit.

Potential Implications

Lost Revenue

Mediavine’s terminations highlight potential implications for publishers that rely on artificial intelligence to generate website content at scale.

Perhaps the most immediate and tangible implication is the risk of losing ad revenue.

For publishers that depend heavily on programmatic advertising or sponsored content deals as key revenue drivers, being blocked from major ad networks could devastate their business models.

Devalued Domains

Another potential impact is the devaluation of domains and websites built primarily on AI-generated content.

If this pattern of AI content overuse triggers account terminations from companies like Mediavine, it could drastically diminish the value proposition of scooping up these domains.

Damaged Reputations & Brands

Beyond the lost monetization opportunities, publishers leaning too heavily into automated AI content also risk permanent reputational damage to their brands.

Once an authority such as an ad network flags a website for AI overuse, it could impact how that site is perceived by readers, other industry partners, and search engines.

In Summary

AI has value as an assistive tool for publishers, but relying heavily on automated content creation poses significant risks.

These include monetization challenges, potential reputation damage, and increasing regulatory scrutiny. Mediavine’s strict policy illustrates the possible consequences for publishers.

It’s important to note that Mediavine’s move to terminate publisher accounts over AI content overuse represents an independent policy stance taken by the ad management firm itself.

The action doesn’t directly reflect the content policies or enforcement positions of Google, whose publishing partner program Mediavine is certified under.

We have reached out to Mediavine requesting a comment on this story. We’ll update this article with more information when it’s provided.


Featured Image: Simple Line/Shutterstock

Google's Guidance About The Recent Ranking Update

Google's Danny Sullivan explained the recent update, addressing site recoveries and cautioning against making radical changes to improve rankings. He also offered advice for publishers whose rankings didn't improve after the last update.

Google’s Still Improving The Algorithm

Danny said that Google is still working on its ranking algorithm, indicating that more (positive) changes are likely on the way. The main idea he was getting across is that they're still trying to fill the gaps in surfacing high-quality content from independent sites, which is good because big-brand sites don't necessarily have the best answers.

He wrote:

“…the work to connect people with “a range of high quality sites, including small or independent sites that are creating useful, original content” is not done with this latest update. We’re continuing to look at this area and how to improve further with future updates.”

A Message To Those Who Were Left Behind

There was a message to those publishers whose work failed to recover with the latest update, to let them know that Google is still working to surface more of the independent content and that there may be relief on the next go.

Danny advised:

“…if you’re feeling confused about what to do in terms of rankings…if you know you’re producing great content for your readers…If you know you’re producing it, keep doing that…it’s to us to keep working on our systems to better reward it.”

Google Cautions Against “Improving” Sites

Something really interesting that he mentioned was a caution against trying to improve the rankings of something that's already on page one in order to rank even higher. Tweaking a site to move from, say, position six to something higher has always been risky for many reasons I won't elaborate on here. But Danny's warning increases the pressure to not just think twice before optimizing a page for search engines but to think three times and then some more.

Danny cautioned that sites that make it to the top of the SERPs should consider that a win and let it ride instead of making changes right now in order to improve their rankings. The reason for that caution is that the search results continue to change, and the implication is that changing a site now may negatively impact its rankings in a newly updated search index.

He wrote:

“If you’re showing in the top results for queries, that’s generally a sign that we really view your content well. Sometimes people then wonder how to move up a place or two. Rankings can and do change naturally over time. We recommend against making radical changes to try and move up a spot or two”

How Google Handled Feedback

There was also some light shed on what Google did with all the feedback it received from publishers who lost rankings. Danny wrote that the feedback and site examples he received were summarized and sent to the search engineers for review. They continue to use that feedback for the next round of improvements.

He explained:

“I went through it all, by hand, to ensure all the sites who submitted were indeed heard. You were, and you continue to be. …I summarized all that feedback, pulling out some of the compelling examples of where our systems could do a better job, especially in terms of rewarding open web creators. Our search engineers have reviewed it and continue to review it, along with other feedback we receive, to see how we can make search better for everyone, including creators.”

Feedback Itself Didn’t Lead To Recovery

Danny also pointed out that sites that recovered their rankings did not do so because they submitted feedback to Google. Danny wasn't specific about this point, but it conforms with previous statements about Google's algorithms, namely that fixes are implemented at scale. So instead of saying, "Hey, let's fix the rankings of this one site," it's more about figuring out whether the problem is symptomatic of something wider in scale and how to change things for everybody with the same problem.

Danny wrote:

“No one who submitted, by the way, got some type of recovery in Search because they submitted. Our systems don’t work that way.”

That the feedback didn't lead to recovery but was used as data shouldn't be surprising. Even as far back as the 2004 Florida update, Matt Cutts collected feedback from people, including myself, and I didn't see a recovery from a false positive until everyone else also got their rankings back.

Takeaways

Google’s work on their algorithm is ongoing:
Google is continuing to tune its algorithms to improve its ability to rank high quality content, especially from smaller publishers. Danny Sullivan emphasized that this is an ongoing process.

What content creators should focus on:
Danny’s statement encouraged publishers to focus on consistently creating high quality content and not to focus on optimizing for algorithms. Focusing on quality should be the priority.

What should publishers do if their high-quality content isn’t yet rewarded with better rankings?
Publishers who are certain of the quality of their content are encouraged to hold steady and keep it coming because Google’s algorithms are still being refined.

Read the post on LinkedIn.

Featured Image by Shutterstock/Cast Of Thousands

Plot Up To Five Metrics At Once

Google has rolled out changes to Analytics, adding features to help you make more sense of your data.

The update brings several key improvements:

  • You can now compare up to five different metrics side by side.
  • A new tool automatically spots unusual trends in your data.
  • A more detailed report on transactions gives a closer look at revenue.
  • The acquisition reports now separate user and session data more clearly.
  • It’s easier to understand what each report does with new descriptions.

Here’s an overview of these new features, why they matter, and how they might help improve your data analysis and decision-making.

Plot Rows: Enhanced Data Visualization

The most prominent addition is the “Plot Rows” feature.

You can now visualize up to five rows of data simultaneously within your reports, allowing for quick comparisons and trend analysis.

This feature is accessible by selecting the desired rows and clicking the “Plot Rows” option.

Anomaly Detection: Spotting Unusual Patterns

Google Analytics has implemented an anomaly detection system to help you identify potential issues or opportunities.

This new tool automatically flags unusual data fluctuations, making it easier to spot unexpected traffic spikes, sudden drops, or other noteworthy trends.

Improved Report Navigation & Understanding

Google Analytics has added hover-over descriptions for report titles.

These brief explanations provide context and include links to more detailed information about each report’s purpose and metrics.

Key Event Marking In Events Report

The Events report allows you to mark significant events for easy reference.

This feature, accessed through a three-dot menu at the end of each event row, helps you prioritize and track important data points.

New Transactions Report For Revenue Insights

For ecommerce businesses, the new Transactions report offers granular insights into revenue streams.

This feature provides information about each transaction, utilizing the transaction_id parameter to give you a comprehensive view of sales data.

Scope Changes In Acquisition Reports

Google has refined its acquisition reports to offer more targeted metrics.

The User Acquisition report now includes user-related metrics such as Total Users, New Users, and Returning Users.

Meanwhile, the Traffic Acquisition report focuses on session-related metrics like Sessions, Engaged Sessions, and Sessions per Event.

What To Do Next

As you explore these new features, keep in mind:

  • Familiarize yourself with the new Plot Rows function to make the most of comparative data analysis.
  • Pay attention to the anomaly detection alerts, but always investigate the context behind flagged data points.
  • Take advantage of the more detailed Transactions report to understand your revenue patterns better.
  • Experiment with the refined acquisition reports to see which metrics are most valuable for your needs.

As with any new tool, there will likely be a learning curve as you incorporate these features into your workflow.


FAQ

What is the “Plot Rows” feature in Google Analytics?

The “Plot Rows” feature allows you to visualize up to five rows of data at the same time. This makes it easier to compare different metrics side by side within your reports, facilitating quick comparisons and trend analysis. To use this feature, select the desired rows and click the “Plot Rows” option.

How does the new anomaly detection system work in Google Analytics?

Google Analytics’ new anomaly detection system automatically flags unusual data patterns. This tool helps identify potential issues or opportunities by spotting unexpected traffic spikes, sudden drops, or other notable trends, making it easier for users to focus on significant data fluctuations.

What improvements have been made to the Transactions report in Google Analytics?

The enhanced Transactions report provides detailed insights into revenue for ecommerce businesses. It utilizes the transaction_id parameter to offer granular information about each transaction, helping businesses get a better understanding of their revenue streams.


Featured Image: Vladimka production/Shutterstock


