SEARCHENGINES

What Are Core Web Vitals and Why Should You Care?

When it comes to optimizing your website, you do so for various reasons. It’s important that your site is set up to generate leads, optimized for SEO, and encourages email sign-ups so you can grow your email list, all while providing an optimal user experience.

You may not have heard of them before, but Core Web Vitals are especially important for improving the experience of your users. Let’s dive into what Core Web Vitals are and what you can do to make sure your website has good Core Web Vitals.

What Are Core Web Vitals?

Web Vitals is an initiative by Google that provides guidance on delivering a great user experience (UX) on the web. Core Web Vitals, a subset of Web Vitals, helps you judge your website’s UX against a distinct set of UX metrics.

Google intends to evolve Core Web Vitals over time. However, as of 2021, the current set of Core Web Vitals focuses on the following three areas of UX: loading, interactivity, and visual stability.

These three areas are measured as Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS).

Why Core Web Vitals Are Important

In May 2020, Google announced that Core Web Vitals would be incorporated into Google’s Search Ranking Signals. What that means is your website’s Core Web Vitals will influence how well your website ranks in Google Search, along with many other ranking factors.

Google began rolling out Core Web Vitals as ranking signals in Google Search in June 2021, with the rollout scheduled to be completed by the end of August 2021.

Since 2015, Google has been increasing the importance of good website UX as a Search Ranking Signal. This puts more emphasis on website owners to ensure their website UX is excellent if they want to be rewarded with good Google search rankings.

How Can I Tell if My Website Has Good Core Web Vitals?

Unlike most of the other Search Ranking Signals, Google has made it easy to know if your website has good or bad Core Web Vitals.

To get a quick overview of your website’s Core Web Vitals, you can use either Google PageSpeed Insights or Google Lighthouse.

For a fast and simplified overview of your website’s Core Web Vitals, enter your website URL into PageSpeed Insights and hit the Analyze button.

To check your website’s Core Web Vitals as well as other user-experience metrics, you can run a Google Lighthouse audit from within Chrome DevTools.

Although there are other ways to check your website’s Core Web Vitals, these two tools use field data wherever possible as opposed to lab data. Field data is data collected from real-life users who have visited your website in the past. Lab data is data collected in a simulated, controlled environment rather than from real visitors.

Google recommends using field data over lab data when testing a live website’s vitals, as field data is closer to what an actual website user would experience in the real world.

What Can I Do to Make Sure My Website Has Good Core Web Vitals?

Ensuring your website is golden when it comes to Core Web Vitals isn’t necessarily a quick SEO tactic you can implement in just 15 minutes. However, that doesn’t mean it isn’t worth doing.

Fortunately, Google has outlined what metrics you should aim for if you want to have good Core Web Vitals. They are as follows:

  • Largest Contentful Paint: within 2.5 seconds of when the page first starts loading.
  • First Input Delay: 100 milliseconds or less.
  • Cumulative Layout Shift: 0.1 or less.
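Expressed in code, checking a set of measured values against these “good” thresholds could look like the following sketch. The function and object names here are illustrative only, not part of any Google API:

```javascript
// Google's "good" thresholds for the three Core Web Vitals.
// LCP is in seconds, FID in milliseconds, CLS is unitless.
const GOOD_THRESHOLDS = { lcp: 2.5, fid: 100, cls: 0.1 };

// Returns true only when every measured vital is within the "good" range.
// `vitals` is an object like { lcp: 2.1, fid: 80, cls: 0.05 }.
function hasGoodCoreWebVitals(vitals) {
  return (
    vitals.lcp <= GOOD_THRESHOLDS.lcp &&
    vitals.fid <= GOOD_THRESHOLDS.fid &&
    vitals.cls <= GOOD_THRESHOLDS.cls
  );
}
```

For example, a page measuring an LCP of 2.1 seconds, an FID of 80 milliseconds, and a CLS of 0.05 passes all three checks, while an LCP of 4 seconds would fail.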

If your website’s Core Web Vitals are not within the thresholds outlined above, the next section will tell you how you can improve them.

Improving Largest Contentful Paint (LCP)

The time it takes for a browser to receive content from your server can have an adverse effect on how quickly your website appears on the screen. A faster server response time improves page-load metrics, including LCP and time to first byte (TTFB). You can improve your TTFB by:

  • Cleaning up your server and ensuring it has enough resources
  • Implementing a content delivery network (CDN)
  • Ensuring all static website files such as CSS and JavaScript are cached
  • Making sure HTML pages are served cache-first
  • Establishing third-party connections early
  • Using signed exchanges (SXGs) wherever possible

Unoptimized WordPress websites are notorious for having poor LCP/TTFB metrics. That’s why investing in a high-performance WordPress hosting server is important if you want to achieve the best Core Web Vitals scores.

Another area that can drastically improve your LCP is optimizing the way your website loads static JavaScript and CSS files. Here are a few ways you can optimize your website CSS and JS file handling:

  • Reduce/remove render-blocking JS and CSS
  • Compress and minify JS and CSS
  • Defer non-critical JS and CSS while making critical JS and CSS inline

Aside from CSS and JavaScript, your website assets (images, videos, fonts, etc.) could be hurting your LCP. Here are a few ways to minimize the impact your website assets have on your LCP:

  • Remove unnecessary images
  • Optimize and compress all images
  • Use modern image formats such as WebP or AVIF where browser support allows
  • Lazy-load website assets and consider using an image CDN
  • Preload any custom web fonts and use the WOFF2 format
  • Ensure GZIP is enabled on your web server or CDN

Improving First Input Delay (FID)

FID measures the time from when a user first interacts with your web page to the point where the browser is actually able to begin responding to that interaction. A real-life human interaction is necessary in order to measure this response delay, which means lab tests cannot simulate FID. If you are unable to test your website with field data, total blocking time (TBT) can be used as a fallback metric. Here are a few ways to improve your website FID and TBT metrics:

  • Remove or reduce unnecessary JavaScript
  • Break up long tasks into smaller asynchronous tasks
  • Optimize your web page for interaction readiness
  • Reduce reliance on cascading fetches and third-party JavaScripts
  • Consider loading non-critical third-party JavaScript on demand
  • Use a web worker for non-UI operations
  • Reduce JavaScript execution time by deferring unused JS
  • Minimize unused polyfills
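To illustrate the advice about breaking up long tasks, here is a minimal sketch of chunked processing. The helper names are my own invention, not from any particular library; the idea is simply to yield back to the event loop between chunks so the browser has a chance to respond to user input instead of being blocked by one long task:

```javascript
// Yield control back to the event loop so pending user input can be handled.
function yieldToMain() {
  return new Promise((resolve) => setTimeout(resolve, 0));
}

// Process `items` in chunks of `chunkSize`, yielding between chunks.
// Doing all the work in one long synchronous loop would block the main
// thread and hurt FID/TBT.
async function processInChunks(items, handler, chunkSize = 50) {
  const results = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      results.push(handler(item));
    }
    await yieldToMain(); // let the browser handle any queued interactions
  }
  return results;
}
```

The same pattern works for any main-thread work you control, such as parsing a large response or hydrating a long list.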

Improving Cumulative Layout Shift (CLS)

Have you ever gone to click a website button only for it to shift its position when the page finally loads? That’s an example of a website with poor CLS. The CLS metric measures how unstable your website layout is by summing up layout shifts that happen without user input. It also looks at how much visible content has shifted in view as well as the distance it shifted by. If your website has poor CLS, here are a few ways to improve it:

  • Always include width and height size attributes on your website images and videos
  • If your website has banner ads, statically reserve a space for the ad slots
  • Avoid placing non-sticky ads at the top of the viewport
  • Use a placeholder or fallback to reserve the space for iframes and embeds
  • Avoid pop-ups that shift the layout of your web pages when they are still loading
  • Avoid using animations that shift your website layout
  • Use the CSS transform property for animations instead of properties that trigger layout changes, such as width, height, top, or left
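For reference, Google scores a single layout shift as its impact fraction (the share of the viewport affected by the unstable elements) multiplied by its distance fraction (the greatest distance the elements moved, relative to the viewport’s largest dimension). The sketch below shows that arithmetic with illustrative function names; note that the real CLS metric groups shifts into session windows, while this simplified version just sums them:

```javascript
// Score for one layout shift: impact fraction x distance fraction.
// impactFraction: portion of the viewport occupied by unstable elements
//                 across the two frames (0-1).
// distanceFraction: greatest distance moved, divided by the viewport's
//                   largest dimension (0-1).
function layoutShiftScore(impactFraction, distanceFraction) {
  return impactFraction * distanceFraction;
}

// Simplified CLS-style total: the sum of individual shift scores.
function cumulativeLayoutShift(shifts) {
  return shifts.reduce(
    (total, s) => total + layoutShiftScore(s.impactFraction, s.distanceFraction),
    0
  );
}
```

For example, an element occupying half the viewport that shifts by 14% of the viewport’s largest dimension scores 0.5 × 0.14 = 0.07, which is still within the 0.1 “good” threshold.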

Wrapping Up

Website optimization is crucial for success. Your site plays a key role in growing your email list and getting you new leads to engage with. It needs to be built successfully so it leaves an impression and delights your users.

Improving your Core Web Vitals will not only help improve your user experience but also your website’s ability to rank well in Google search.

Author:
James, Founder of jamesbanks.co, teaches entrepreneurs how to start and scale online businesses with search engine marketing. His content stems from growing over a thousand businesses over the last decade as a digital marketing consultant and agency co-owner.


Google Says If You Redesign Your Site Your Rankings May Go Nuts


Gary Illyes from the Google Search Relations team posted another PSA on LinkedIn. This time he said, “when you redesign a site, its rankings in search engines may go nuts.”

Yes, this is probably super obvious to most of you reading this site but Gary dives a bit deeper.

He said, “Among other things, search engines use the HTML of your pages to make sense of the content. If for example you break up paragraphs, remove H tags in favor of CSS styling, or add breaking tags (especially true for CJK languages), you change the HTML parsers’ output, which in turn may change the site’s rankings.”

In short, when redesigning, sure – go ahead – make the site pretty. But changing the core HTML can result in ranking changes.

Gary recommends, “try to use semantically similar HTML when you redesign the site and avoid adding tags where you don’t actually need them.”

So if you can change the design but at the same time keep things in the HTML looking similar, that is your best bet. Change a lot without changing a lot – if that makes sense.

Forum discussion at LinkedIn.



Yandex Search Ranking Factors Leaked & Exposed


Yandex allegedly had a boatload of its source code, spanning all of its technology, leaked by a disgruntled employee, and part of that leak was the source code for Russia’s largest search engine, Yandex Search. As you can imagine, SEOs and others are diving in and seeing what they can learn from the source code.

I personally did not download the source code, so I did not go through it myself, but I wanted to share what people found via Twitter from their investigations of the source code.


Will this help you do SEO on Google? Probably not but hey, it is super interesting.

Forum discussion at WebmasterWorld.





Unconfirmed Google Update Impacting Product Reviews Sites On January 26th


On Thursday, January 26th, there were some signs of a possible Google search ranking algorithm update. The signals and chatter I was tracking, honestly, were not at super high levels. However, Glenn Gabe has now shared some really shocking charts of sites that were previously impacted by Google updates and are now seeing big swings again. This is not necessarily a product reviews update, but rather sites in the product reviews space are seeing massive swings on the 26th.

Let me first share Glenn’s charts, which he posted on Twitter and said, “Heads-up, run a product reviews site? Google pushed something on 1/26 that, once again, impacted some product review sites HEAVILY. These are sites I’ve documented before with crazy surges/drops, even outside of Product Reviews Updates. Let’s see if this sticks. I hope it does.”


Starting on the 25th, I began to see limited chatter kick into gear at WebmasterWorld. The chatter there was not specific to the product reviews update but it could have been related to it. Here are some quotes from there:

My site has been climbing rankings all week and yet today the traffic has been dismal (UK). Do I smell an update?

My ranks, as I said, have been getting better after being in decline steadily since this time last year. I haven’t done anything to improve. Could this mean that google are indeed looking at different ranking signals, not just links? I could have gained a link or two that has turned it around I suppose but no evidence so far.

After great improvement over the month of January, got a big bang down since January 24 onwards. Changes again?

Definitely, something is wrong with Google. My traffic is going down without any reason. Are there any delays with Analytics?

I don’t know what’s going on, but my website has been in free fall for a week now. And it’s the usual picture again, old news articles or meaningless keyword spam ranks in my area on the top places. It’s really no fun anymore …

Google Tracking Tools

Here is what the tracking tools (Mozcast, Semrush, Cognitive SEO, Advanced Web Rankings, RankRanger, Accuranker, SERPmetrics, and Algoroo) are showing: [charts omitted]

So the tools are not lighting up, but this may have been a tweak to the product reviews update, a new update impacting product review sites, or something limited to those segments of websites.

Have any of you seen any big changes around January 26th?

Forum discussion at Twitter.



