
What Are Core Web Vitals & How Can You Improve Them?

Core Web Vitals are speed metrics that are part of Google’s Page Experience signals used to measure user experience. The metrics measure visual load with Largest Contentful Paint (LCP), visual stability with Cumulative Layout Shift (CLS), and interactivity with First Input Delay (FID).

Mobile page experience and the included Core Web Vitals metrics have officially been used for ranking pages since June 2021. Desktop signals have also been used as of February 2022.

Google's Page Experience signals include HTTPS, no intrusive interstitials, mobile-friendliness, and Core Web Vitals

The easiest way to see the metrics for your site is with the Core Web Vitals report in Google Search Console. With the report, you can easily see if your pages are categorized as “poor URLs,” “URLs need improvement,” or “good URLs.”

The thresholds for each category are as follows:

Metric   Good      Needs improvement   Poor
LCP      <=2.5s    <=4s                >4s
FID      <=100ms   <=300ms             >300ms
CLS      <=0.1     <=0.25              >0.25

And here’s how the report looks:

Mobile and desktop Core Web Vitals report in Google Search Console

If you click into one of these reports, you get a better breakdown of the issues with categorization and the number of URLs impacted.

Breakdown of Core Web Vitals issues in GSC

Clicking into one of the issues gives you a breakdown of the page groups that are impacted. This grouping makes sense because most changes to improve Core Web Vitals are made in a particular page template that affects many pages. You make the change once in the template, and it will be fixed across the pages in the group.

GSC page groups with specific issues

Now that you know what pages are impacted, here’s some more information about Core Web Vitals and how you can get your pages to pass the checks:

Quick facts about Core Web Vitals

Fact 1: The metrics are split between desktop and mobile. Mobile signals are used for mobile rankings, and desktop signals are used for desktop rankings.

Fact 2: The data comes from the Chrome User Experience Report (CrUX), which records data from opted-in Chrome users. The metrics are assessed at the 75th percentile of users. So if 70% of your users are in the “good” category and 5% are in the “needs improvement” category, then your page will still be judged as “needs improvement.”

Fact 3: The metrics are assessed for each page. But if there isn’t enough data, Google Webmaster Trends Analyst John Mueller states that signals from sections of a site or the overall site may be used. In our Core Web Vitals data study, we looked at over 42 million pages and found that only 11.4% of the pages had metrics associated with them.

Fact 4: With the addition of these new metrics, Accelerated Mobile Pages (AMP) was removed as a requirement from the Top Stories feature on mobile. Since new stories won’t actually have data on the speed metrics, it’s likely the metrics from a larger category of pages or even the entire domain may be used.

Fact 5: Single Page Applications don’t measure a couple of metrics, FID and LCP, through page transitions. There are a couple of proposed changes, including the App History API and potentially a change in the metric used to measure interactivity that would be called “Responsiveness.”

Fact 6: The metrics may change over time, and the thresholds may as well. Google has already changed the metrics used for measuring speed in its tools over the years, as well as its thresholds for what is considered fast or not.

Core Web Vitals have already changed, and there are more proposed changes to the metrics. I wouldn’t be surprised if page size was added. You can pass the current metrics by prioritizing assets and still have an extremely large page. It’s a pretty big miss, in my opinion.

Are Core Web Vitals important for SEO?

There are over 200 ranking factors, many of which don’t carry much weight. When talking about Core Web Vitals, Google reps have referred to these as tiny ranking factors or even tiebreakers. I don’t expect much, if any, improvement in rankings from improving Core Web Vitals. Still, they are a factor, and this tweet from John shows how the boost may work.

There have been ranking factors targeting speed metrics for many years. So I wasn’t expecting much, if any, impact to be visible when the mobile page experience update rolled out. Unfortunately, there were also a couple of Google core updates during the time frame for the Page Experience update, which makes determining the impact too messy to draw a conclusion.

There are a couple of studies that found some positive correlation between passing Core Web Vitals and better rankings, but I personally look at these results with skepticism. It’s like saying a site that focuses on SEO tends to rank better. If a site is already working on Core Web Vitals, it likely has done a lot of other things right as well. And people did work on them, as you can see in the chart below from our data study.

Graph showing percentage of good FID, LCP, and CLS over time

Let’s look at each of the Core Web Vitals in more detail.

Components of Core Web Vitals

Here are the three current components of Core Web Vitals and what they measure:

  • Largest Contentful Paint (LCP) – Visual load
  • Cumulative Layout Shift (CLS) – Visual stability
  • First Input Delay (FID) – Interactivity

Note there are additional Web Vitals that serve as proxy measures or supplemental metrics but are not used in the ranking calculations. The Web Vitals metrics for visual load include Time to First Byte (TTFB) and First Contentful Paint (FCP). Total Blocking Time (TBT) and Time to Interactive (TTI) help to measure interactivity.

Largest Contentful Paint

LCP is the single largest visible element loaded in the viewport.

The largest element is usually going to be a featured image or maybe the <h1> tag. But it could also be any of these:

  • <img> element
  • <image> element inside an <svg> element
  • Image inside a <video> element
  • Background image loaded with the url() function
  • Blocks of text

<svg> and <video> may be added in the future.

How to see LCP

In PageSpeed Insights, the LCP element will be specified in the “Diagnostics” section. Also, notice there is a tab to select LCP that will only show issues related to LCP.

Largest Contentful Paint issues in PageSpeed Insights point to the blue LCP tab

In Chrome DevTools, follow these steps:

  1. Performance > check “Screenshots”
  2. Click “Start profiling and reload page”
  3. LCP is on the timing graph
  4. Click the node; this is the element for LCP

Checking LCP in Chrome DevTools

Optimizing LCP

As we saw in PageSpeed Insights, there are a lot of issues that need to be solved, making LCP the hardest metric to improve, in my opinion. In our study, I noticed that most sites didn’t seem to improve their LCP over time.

Here are a few concepts to keep in mind and some ways you can improve LCP.

1. Smaller is faster

If you can get rid of any files or reduce their sizes, then your page will load faster. This means you may want to delete any files not being used or parts of the code that aren’t used.

How you go about this will depend a lot on your setup, but the process is usually referred to as tree shaking. This is commonly done via some kind of automated process. But in some systems, this step may not be worth the effort.

There’s also compression, which makes the file sizes smaller. Pretty much every file type used to build your website can be compressed, including CSS, JavaScript, images, and HTML.

2. Closer is faster

Information takes time to travel. The further you are from a server, the longer it takes for the data to be transferred. Unless you serve a small geographical area, having a Content Delivery Network (CDN) is a good idea.

CDNs give you a way to connect and serve your site that’s closer to users. It’s like having copies of your server in different locations around the world.

3. Use the same server if possible

When you first connect to a server, there’s a process that involves a DNS lookup and negotiating a secure connection. This takes some time, and each new connection you need to make adds additional delay while it goes through the same process. If you host your resources on the same server, you can eliminate those extra delays.

If you can’t use the same server, you may want to use preconnect or DNS-prefetch to start connections earlier. A browser will typically wait for the HTML to finish downloading before starting a connection. But with preconnect or DNS-prefetch, it starts earlier than it normally would. Do note that DNS-prefetch has better support than preconnect.
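
For example, if your fonts come from a third-party host, the hints look something like this (the host below is just a placeholder):

<link rel="preconnect" href="https://fonts.example-cdn.com" crossorigin>
<!-- Fallback for browsers that only support DNS prefetching -->
<link rel="dns-prefetch" href="https://fonts.example-cdn.com">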

4. Cache what you can

When you cache resources, they’re downloaded for the first page view but don’t need to be downloaded for subsequent page views. With the resources already available, additional page loads will be much faster. Check out how few files are downloaded in the second page load in the waterfall charts below.

First load of the page:

Waterfall chart for the first load of the page

Second load of the page:

Waterfall chart for the second load of the page, which is much smaller

5. Prioritization of resources

To pass the LCP check, you should prioritize how your resources are loaded in the critical rendering path. What I mean by that is you want to rearrange the order in which the resources are downloaded and processed. You should first load the resources needed to get the content users see immediately, then load the rest.

Many sites can get to a passing time for LCP by just adding some preload statements for things like the main image, as well as necessary stylesheets and fonts. Let’s look at how to optimize the various resource types.

Images Early

If you don’t need the image, the most impactful solution is to simply get rid of it. If you must have the image, I suggest optimizing the size and quality to keep it as small as possible.

On top of that, you may want to preload the image. This is going to start the download of that image a little earlier. This means it’s going to display a little earlier. A preload statement for a responsive image looks like this:

<link rel="preload" as="image" href="cat.jpg"
imagesrcset="cat_400px.jpg 400w, cat_800px.jpg 800w, cat_1600px.jpg 1600w"
imagesizes="50vw">

Images Late

You should lazy load any images that you don’t need immediately. This loads images later in the process or when a user is close to seeing them. You can use loading="lazy" like this:

<img src="cat.jpg" alt="cat" loading="lazy">

CSS Early

We already talked about removing unused CSS and minifying the CSS you have. The other major thing you should do is to inline critical CSS. What this does is it takes the part of the CSS needed to load the content users see immediately and then applies it directly into the HTML. When the HTML is downloaded, all the CSS needed to load what users see is already available.
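
As a rough sketch (the selectors and rules here are placeholders), the critical rules go straight into the <head>, and the full stylesheet can then be loaded later:

<head>
  <style>
    /* Placeholder critical CSS: only the rules needed for above-the-fold content */
    header { height: 80px; background: #fff; }
    .hero { min-height: 60vh; }
  </style>
</head>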

Inlining critical CSS moves part of the CSS into the HTML
CSS Late

With any extra CSS that isn’t critical, you’ll want to apply it later in the process. You can go ahead and start downloading the CSS with a preload statement but not apply the CSS until later with an onload event. This looks like:

<link rel="preload" href="https://ahrefs.com/blog/core-web-vitals/stylesheet.css" as="style" onload="this.rel='stylesheet'">

Fonts

I’m going to give you a few options here, from good to best:

Good: Preload your fonts. Even better if you use the same server to get rid of the connection.

Better: Font-display: optional. This can be paired with a preload statement. This is going to give your font a small window of time to load. If the font doesn’t make it in time, the initial page load will simply show a default font. Your custom font will then be cached and show up on subsequent page loads.

Best: Just use a system font. There’s nothing to load—so no delays.
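
Here’s roughly what the “better” option looks like, assuming a self-hosted font at a placeholder path:

<link rel="preload" href="/fonts/custom.woff2" as="font" type="font/woff2" crossorigin>
<style>
  @font-face {
    font-family: "Custom";
    src: url("/fonts/custom.woff2") format("woff2");
    font-display: optional; /* show a default font if this doesn't load in time */
  }
</style>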

JavaScript Early

We already talked about removing unused JavaScript and minifying what you have. If you’re using a JavaScript framework, then you may want to prerender or server-side render (SSR) the page.

Your other options are to inline the JavaScript needed early. It’s similar to what we discussed about CSS, where you load portions of the code within the HTML or preload the JavaScript files so that you get them earlier. This should only be done for assets needed to load the content above the fold or if some functionality depends on this JavaScript.
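
For instance, here’s a small sketch with placeholder file names: inline the tiny bit of script the above-the-fold content depends on, and preload anything else it needs early.

<script>
  // Placeholder: small snippet needed immediately, inlined to avoid an extra request
  document.documentElement.classList.add("js");
</script>
<link rel="preload" href="/js/hero.js" as="script">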

JavaScript Late

Any JavaScript you don’t need immediately should be loaded later. There are two main ways to do that—defer and async attributes. These attributes can be added to your script tags.

Usually, a script being downloaded blocks the parser while downloading and executing. Async will let the parsing and downloading occur at the same time but still block parsing during the script execution. Defer will not block parsing during the download and only execute after the HTML has finished parsing.

How async and defer impact html loading

Which should you use? For anything that you want earlier or that has dependencies, I’ll lean toward async. For instance, I tend to use async on analytics tags so that more users are recorded. You’ll want to defer anything that is not needed until later or doesn’t have dependencies. The attributes are pretty easy to add. Check out these examples:

Normal:

<script src="https://www.domain.com/file.js"></script>

Async:

<script src="https://www.domain.com/file.js" async></script>

Defer:

<script src="https://www.domain.com/file.js" defer></script>

Misc

There are a few other technologies that you may want to look at to help with performance. These include Speculative Prerendering, Early Hints, Signed Exchanges, and HTTP/3.


Cumulative Layout Shift

CLS measures how elements move around or how stable the page layout is. It takes into account the size of the content and the distance it moves. Google has already updated how CLS is measured. Previously, it would continue to measure even after the initial page load. But now it’s restricted to a five-second time frame where the most shifting occurs.

It can be annoying if you try to click something on a page that shifts and you end up clicking on something you don’t intend to. It happens to me all the time. I click on one thing and, suddenly, I’m clicking on an ad and am now not even on the same website. As a user, I find that frustrating.

Example of the layout shifting when trying to click a link

Common causes of CLS include:

  • Images without dimensions.
  • Ads, embeds, and iframes without dimensions.
  • Injecting content with JavaScript.
  • Applying fonts or styles late in the load.

How to see CLS

In PageSpeed Insights, if you select CLS, you can see all the related issues. The main one to pay attention to here is “Avoid large layout shifts.”

CLS issues in PageSpeed Insights

You can also see layout shifts with WebPageTest. In Filmstrip View, use the following options:

  • Highlight Layout Shifts
  • Thumbnail Size: Huge
  • Thumbnail Interval: 0.1 secs

Notice how our font restyles between 5.1 secs and 5.2 secs, shifting the layout as our custom font is applied.

Layout shift from applying a custom font

Smashing Magazine also had an interesting technique where it outlined everything with a 3px solid red line and recorded a video of the page loading to identify where layout shifts were happening.

Optimizing CLS

In most cases, to optimize CLS, you’re going to be working on issues related to images, fonts, or possibly injected content. Let’s look at each case.

Images

For images, what you need to do is reserve the space so that there’s no shift and the image simply fills that space. This can mean setting the height and width of images by specifying them within the <img> tag like this:

<img src="cat.jpg" width="640" height="360" alt="cat with string" />

For responsive images, you need to use a srcset like this (the additional image sizes below are illustrative):

<img
  width="1000"
  height="1000"
  src="https://ahrefs.com/blog/core-web-vitals/puppy-1000.jpg"
  srcset="puppy-1000.jpg 1000w, puppy-2000.jpg 2000w, puppy-3000.jpg 3000w"
  alt="Puppy with balloons" />

And reserve the max space needed for any dynamic content like ads.

Fonts

For fonts, the goal is to get the font on the screen as fast as possible and to not swap it with another font. When a font is loaded or changed, you end up with a noticeable shift like a Flash of Invisible Text (FOIT) or Flash of Unstyled Text (FOUT).

If you can use a system font, do that. There’s nothing to load, so there are no delays or changes that will cause a shift.

If you have to use a custom font, the current best method for minimizing CLS is to combine <link rel="preload"> (which is going to try to grab your font as soon as possible) and font-display: optional (which is going to give your font a small window of time to load). If the font doesn’t make it in time, the initial page load will simply show a default font. Your custom font will then be cached and show up on subsequent page loads.

Injected content

When content is dynamically inserted above existing content, this causes a layout shift. If you’re going to do this, reserve enough space for it ahead of time.
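
For example, here’s a minimal sketch with a placeholder ad slot; the container keeps its height whether or not the content loads, so nothing below it moves:

<div style="min-height: 250px;">
  <!-- Ad or other injected content gets inserted here; the reserved height prevents a shift -->
</div>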


First Input Delay

FID is the time from when a user interacts with your page to when the page responds. You can also think of it as responsiveness.

Example interactions:

  • Clicking on a link or button
  • Inputting text into a blank field
  • Selecting a drop-down menu
  • Clicking a checkbox

Some events like scrolling or zooming are not counted.

It can be frustrating trying to click something, and nothing happens on the page.

Not all users will interact with a page, so the page may not have an FID value. This is also why lab test tools won’t have the value because they’re not interacting with the page. What you may want to look at for lab tests is Total Blocking Time (TBT). In PageSpeed Insights, you can use the TBT tab to see related issues.

TBT issues in PageSpeed Insights

What causes the delay?

JavaScript competing for the main thread. There’s just one main thread, and JavaScript competes to run tasks on it. Think of it like JavaScript having to take turns to run.

While a task is running, a page can’t respond to user input. This is the delay that is felt. The longer the task, the longer the delay experienced by the user. The breaks between tasks are the opportunities that the page has to switch to the user input task and respond to what they wanted to do.

Optimizing FID

Most pages pass FID checks. But if you need to work on FID, there are just a few items you can work on. If you can reduce the amount of JavaScript running, then do that.

If you’re on a JavaScript framework, there’s a lot of JavaScript needed for the page to load. That JavaScript can take a while to process in the browser, and that can cause delays. If you use prerendering or server-side rendering (SSR), you shift this burden from the browser to the server.

Another option is to break up the JavaScript so that it runs for less time. You take those long tasks that delay response to user input and break them into smaller tasks that block for less time. This is done with code splitting, which breaks the tasks into smaller chunks.
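
Here’s a minimal sketch of that idea with placeholder names, using a dynamic import so the extra code is only fetched and run when it’s actually needed:

<script type="module">
  // Placeholder example: load the menu code only when the user asks for it
  document.querySelector("#menu-button")?.addEventListener("click", async () => {
    const { openMenu } = await import("/js/menu.js"); // hypothetical module path
    openMenu();
  });
</script>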

There’s also the option of moving some of the JavaScript to a service worker. I did mention that JavaScript competes for the one main thread in the browser, but this is sort of a workaround that gives it another place to run.

There are some trade-offs as far as caching goes. And the service worker can’t access the DOM, so it can’t do any updates or changes. If you’re going to move JavaScript to a service worker, you really need to have a developer that knows what to do.
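
Registering one looks something like this (the file path is a placeholder); the caching and fetch-handling logic would live in the worker script your developer writes:

<script>
  // Placeholder path: the service worker runs off the main thread
  if ("serviceWorker" in navigator) {
    navigator.serviceWorker.register("/sw.js");
  }
</script>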


Tools for measuring Core Web Vitals

There are many tools you can use for testing and monitoring. Generally, you want to see the actual field data, which is what you’ll be measured on. But the lab data is more useful for testing.

The difference between lab and field data is that field data looks at real users, network conditions, devices, caching, etc. But lab data is consistently tested based on the same conditions to make the test results repeatable.

Many of these tools use Lighthouse as the base for their lab tests. The exception is WebPageTest, although you can run Lighthouse tests with it as well. The field data comes from CrUX.

Field Data

There are some additional tools you can use to gather your own Real User Monitoring (RUM) data that provide more immediate feedback on how speed improvements impact your actual users (rather than just relying on lab tests).

Lab Data

PageSpeed Insights is great to check one page at a time. But if you want both lab data and field data at scale, the easiest way to get that is through the API. You can connect to it easily with Ahrefs Webmaster Tools (free) or Ahrefs’ Site Audit and get reports detailing your performance.

CWV reports in Ahrefs' Site Audit

Note that the Core Web Vitals data shown will be determined by the user-agent you select for your crawl during the setup.

I also like the report in GSC because you can see the field data for many pages at once. But the data is a bit delayed and on a 28-day rolling average, so changes may take some time to show up in the report.

Another thing that may be useful: you can look up the scoring weights for Lighthouse at any point in time and see the historical changes. This can give you some idea of why your scores have changed and what Google may be weighting more over time.

Lighthouse scoring calculator with metric weights

Final thoughts

I don’t think Core Web Vitals have much impact on SEO and, unless you are extremely slow, I generally won’t prioritize fixing them. If you want to argue for Core Web Vitals improvements, I think that’s hard to do for SEO.

However, you can make a case for it for user experience. Or as I mentioned in my page speed article, improvements should help you record more data in your analytics, which “feels” like an increase. You may also be able to make a case for more conversions, as there are a lot of studies out there that show this (but it also may be a result of recording more data).

Here’s another key point: work with your developers; they are the experts here. Page speed can be extremely complex. If you’re on your own, you may need to rely on a plugin or service (e.g., WP Rocket or Autoptimize) to handle this.

Things will get easier as new technologies are rolled out and many of the platforms like your CMS, your CDN, or even your browser take on some of the optimization tasks. My prediction is that within a few years, most sites won’t even have to worry much because most of the optimizations will already be handled.

Many of the platforms are already rolling out or working on things that will help you.

Already, WordPress is preloading the first image and is putting together a team to work on Core Web Vitals. Cloudflare has already rolled out many things that will make your site faster, such as Early Hints, Signed Exchanges, and HTTP/3. I expect this trend to continue until site owners don’t even have to worry about working on this anymore.

As always, message me on Twitter if you have any questions.

Top 5 Essential SEO Reporting Tools For Agencies


Your clients trust you to create real results and hit KPIs that drive their businesses forward.

Understanding the intricacies of how that works can be difficult, so it’s essential to demonstrate your progress and efforts.

SEO reporting software showcases important metrics in a digestible, visually represented way. These tools save guesswork and manual referencing, highlighting achievements over a specified time.

A great tool can also help you formulate action items, gauge the performance of campaigns, and see real results that can help you create new and innovative evaluations.

The latest and allegedly greatest tools hit the market all the time, promising to transform how you conduct reports.

Certainly, you have to weigh a few factors when deciding which software to implement. Price, features, and ease of use are the most important to consider.

A cost-effective tool with a steep learning curve might not be worth it for the features. Similarly, an expensive tool might be more appealing if it is user-friendly but could quickly run up costs.

Just like any transformational business decision, you’ll have to weigh the pros and cons carefully to determine the right one for you.

Key Takeaways

  • Cost, accessibility, and features are the common thread of comparison for SEO reporting tools.
  • To truly get the best use out of an SEO reporting tool for your agency, you’ll need to weigh several details, including scalability, customization, integrations, and access to support.
  • What might be considered a subpar tool could be a game-changer for an agency. Due diligence and research are the keys to knowing what will work for your team.

What To Look For In SEO Reporting Tools

It can be tough to make heads or tails of the available tools and choose which will benefit your agency the most.

Here are the 10 essential requirements of SEO reporting tools.

1. Accurate And Current Regional Data

SEO reporting is all about data. The software must have access to accurate and current data localized to your client’s targeted region.

Search data from the U.S. is meaningless if your client tries to rank for [London plumbing services], so localization matters.

The tool must update data regularly and with reliable accuracy so you can make informed decisions about where your client stands against the competition.

2. Integration With Third-Party Tools

Especially for full-scale digital marketing campaigns, the ability to report on all KPIs in one place is essential.

The more available integrations with third-party tools (e.g., Google Analytics, Google Business Profile, Majestic), the better.

Some tools even allow you to upload custom data sets.

3. Scalability

You don’t want to have to retrain or reinvest in new software every time your agency reaches a new tier.

The right SEO reporting tool should work well for your current business size and leave room for expansion as you onboard more clients.

4. Strong Suite Of Features

A great SEO reporting tool should include:

  • Position tracking.
  • Backlink monitoring.
  • Competitor data.
  • Analytics.

It is a bonus if the tool has reporting features for social media, email marketing, call tracking, and/or paid ads to make it a full-suite digital marketing software.

5. Continually Improving And Updating Features

SEO is constantly evolving, and so should SEO reporting tools.

As we continue the transition from website optimization to web presence optimization, a tool’s ability to integrate new features is essential.

6. Ability To Customize Reports

Each client will have different KPIs, objectives, and priorities.

Presenting the information that clients want to see is paramount to successful campaigns and retention.

Your reporting software of choice should be able to emphasize the correct data at the right times.

7. Client Integration

A good SEO reporting tool must have the client in mind.

It should have a simple bird’s eye overview of the basics but also be easy for clients to dig into the data at a deeper level.

This can mean automated summary reports or 24/7 client access to the dashboard.

8. Ability To White Label Reports

While white labeling is not essential (no client will sniff at receiving a report with a Google logo in the top corner), it helps keep branding consistent and gives a professional sheen to everything you send a client’s way.

9. Access To Support Resources

Quality support resources can help you find a detour when you encounter a roadblock.

Whether it’s detailed support documentation, a chat feature/support desk, or responsive customer support on social media, finding the help you need to solve the issue is important.

10. Cost-To-Value Ratio

With a proper process, time investment, and leveraging support resources, it is possible to get better results from a free reporting tool than one that breaks the bank.


Top 5 SEO Reporting Tools

In evaluating five of the most popular SEO reporting tools, based on the above criteria, here is how they stack up:

1. AgencyAnalytics

My Overall Rating: 4.7/5

Image credit: AgencyAnalytics, December 2022

AgencyAnalytics is a quality introductory/intermediate reporting tool for agencies.

Among the tools on this list, it is one of the easiest to use for small to mid-sized agencies.

It starts at $12 per month, per client, with unlimited staff and client logins, a white-label dashboard, and automated branded reports. The minimum purchase requirements mean the first two tiers work out to $60 per month and $180 per month, respectively. But your ability to change the payment based on the number of clients could help keep costs lean.

AgencyAnalytics comes with 70+ supported third-party data integrations.

However, this reliance on third-party data means you may have incomplete reports when there is an interruption in the transmission.

Though new integrations are always being added, they can be glitchy at first, making them unreliable to share with clients until stabilized.

With the ability for clients to log in and view daily data updates, it provides real-time transparency.

Automated reports can be customized, and the drag-and-drop customized dashboard makes it easy to emphasize priority KPIs.

2. SE Ranking

My Overall Rating: 4.5/5

SE Ranking has plans starting at $39.20 per month, although the $87.20 per month plan is necessary if you need historical data or more than 10 projects.

Setup is a breeze, as the on-screen tutorial guides you through the process.

SE Ranking features a strong collection of SEO-related tools, including current and historical position tracking, competitor SEO research, keyword suggestion, a backlink explorer, and more.

SE Ranking is hooked up with Zapier, which allows users to integrate thousands of apps and provide a high level of automation between apps like Klipfolio, Salesforce, HubSpot, and Google Apps.

SE Ranking is an effective SEO reporting tool at a beginner to intermediate level.

However, you may want to look in a different direction if your agency requires more technical implementations or advanced customization.

3. Semrush

My Overall Rating: 4.4/5

Semrush is one of the most SEO-focused reporting tools on the list, which is reflected in its features.

Starting at $229.95 per month for the agency package, it’s one of the more expensive tools on the list. But Semrush provides a full suite of tools that can be learned at an intermediate level.

A major downside of Semrush, especially for cost-conscious agencies, is that an account comes with only one user login.

Having to purchase individual licenses for each SEO analyst or account manager adds up quickly, and the users you can add are limited by the plan features. This makes scalability an issue.

Semrush has both branded and white-label reports, depending on your subscription level. It uses a proprietary data stream, tracking more than 800 million keywords.

The ever-expanding “projects” feature covers everything from position tracking to backlink monitoring and social media analysis.

Though it doesn’t fall specifically under the scope of SEO reporting, Semrush’s innovation makes it a one-stop shop for many agencies.

Project features include Ad Builder, which helps craft compelling ad text for Google Ads, and Social Media Poster, which allows agencies to schedule client social posts.

Combining such diverse features under the Semrush umbrella offsets its relatively high cost, especially if you can cancel other redundant software.

4. Looker Studio

My Overall Rating: 3.6/5

Screenshot from Looker Studio, December 2022

Formerly known as Google Data Studio, Looker Studio is a Google service that has grown considerably since its initial launch.

Though it is much more technical and requires more time investment to set up than most other tools on this list, it should be intuitive for staff familiar with Google Analytics.

If you’re on the fence, Looker Studio is completely free.

A major upside to this software is superior integration with other Google properties like Analytics, Search Console, Ads, and YouTube.

Like other reporting tools, it also allows third-party data integration, but the ability to query data from databases, including MySQL, PostgreSQL, and Google’s Cloud SQL, sets it apart.

With proper setup, you can customize reports with important KPIs, pulling from lead and customer information. For eCommerce clients, you can even integrate sales data.

Though the initial setup will be much more technical, the ability to import templates saves time and effort.

You can also create your own templates that better reflect your processes and can be shared across clients. Google also has introductory video walk-throughs to help you get started.

5. Authority Labs

My Overall Rating: 3.2/5

Authority Labs ranking report. Image credit: Authority Labs, December 2022

Authority Labs does the job if you’re looking for a straightforward position-tracking tool.

Authority Labs is $49 per month for unlimited users, though you will need to upgrade to the $99 per month plan for white-label reporting.

You can track regional ranking data, get insights into “(not provided)” keywords, track competitor keywords, and schedule automated reporting.

However, lacking other essential features like backlink monitoring or analytic data means you will have to supplement this tool to provide a full SEO reporting picture for clients.

Conclusion

There are many quality SEO reporting tools on the market. What makes them valuable depends on their ability to work for your clients’ needs.

SE Ranking has a fantastic cost-to-value ratio, while Looker Studio has advanced reporting capabilities if you can withstand a higher barrier to entry.

AgencyAnalytics prioritizes client access, which is a big deal if transparency is a core value for your agency.

Authority Labs keeps it lean and clean, while Semrush always adds innovative features.

These five are simply a snapshot of what is available. New and emerging tools might offer features more appealing to your current clients, or fill gaps that other software, despite being a great solution overall, leaves open.

Ultimately, you need to consider what matters most to your agency. Is it:

  • Feature depth?
  • Scalability?
  • Cost-to-value ratio?

Once you weigh the factors that matter most for your agency, you can find the right SEO reporting tool. In the meantime, don’t shy away from testing out a few for a trial period.

If you don’t want to sign up for a full month’s usage, you can also explore walkthrough videos and reviews from current users. The most informed decision requires an understanding of the intricate details.


Featured Image: Paulo Bobita/Search Engine Journal



How to Block ChatGPT From Using Your Website Content

There is concern about the lack of an easy way to opt out of having one’s content used to train large language models (LLMs) like ChatGPT. There is a way to do it, but it’s neither straightforward nor guaranteed to work.

How AIs Learn From Your Content

Large Language Models (LLMs) are trained on data that originates from multiple sources. Many of these datasets are open source and are freely used for training AIs.

Some of the sources used are:

  • Wikipedia
  • Government court records
  • Books
  • Emails
  • Crawled websites

There are actually portals and websites offering datasets that are giving away vast amounts of information.

One of the portals is hosted by Amazon, offering thousands of datasets at the Registry of Open Data on AWS.

Screenshot from Amazon, January 2023

The Amazon portal is just one of many portals that contain even more datasets.

Wikipedia lists 28 portals for downloading datasets, including the Google Dataset and the Hugging Face portals for finding thousands of datasets.

Datasets of Web Content

OpenWebText

A popular dataset of web content is called OpenWebText. OpenWebText consists of URLs found on Reddit posts that had at least three upvotes.

The idea is that these URLs are trustworthy and will contain quality content. I couldn’t find information about a user agent for their crawler; maybe it’s just identified as Python, I’m not sure.

Nevertheless, we do know that if your site is linked from Reddit with at least three upvotes then there’s a good chance that your site is in the OpenWebText dataset.

More information about OpenWebText is here.

Common Crawl

One of the most commonly used datasets for Internet content is offered by a non-profit organization called Common Crawl.

Common Crawl data comes from a bot that crawls the entire Internet.

The data is downloaded by organizations wishing to use the data and then cleaned of spammy sites, etc.

The name of the Common Crawl bot is CCBot.

CCBot obeys the robots.txt protocol, so it is possible to block Common Crawl with robots.txt and prevent your website data from making it into another dataset.

However, if your site has already been crawled then it’s likely already included in multiple datasets.

Nevertheless, by blocking Common Crawl it’s possible to opt out your website content from being included in new datasets sourced from newer Common Crawl data.

The CCBot User-Agent string is:

CCBot/2.0

Add the following to your robots.txt file to block the Common Crawl bot:

User-agent: CCBot
Disallow: /

An additional way to confirm if a CCBot user agent is legit is that it crawls from Amazon AWS IP addresses.

CCBot also obeys the nofollow robots meta tag directives.

Use this in your robots meta tag:

<meta name="robots" content="nofollow">

Blocking AI From Using Your Content

Search engines allow websites to opt out of being crawled. Common Crawl also allows opting out. But there is currently no way to remove one’s website content from existing datasets.

Furthermore, research scientists don’t seem to offer website publishers a way to opt out of being crawled.

The article, Is ChatGPT Use Of Web Content Fair? explores the topic of whether it’s even ethical to use website data without permission or a way to opt out.

Many publishers may appreciate it if, in the near future, they are given more say in how their content is used, especially by AI products like ChatGPT.

Whether that will happen is unknown at this time.


Featured image by Shutterstock/ViDI Studio



Google’s Mueller Criticizes Negative SEO & Link Disavow Companies

John Mueller recently made strong statements against SEO companies that provide negative SEO and other agencies that provide link disavow services outside of the tool’s intended purpose, saying that they are “cashing in” on clients who don’t know better.

While many frequently say that Mueller and other Googlers are ambiguous, even on the topic of link disavows.

The fact however is that Mueller and other Googlers have consistently recommended against using the link disavow tool.

This may be the first time Mueller actually portrayed SEOs who liberally recommend link disavows in a negative light.

What Led to John Mueller’s Rebuke

The context of Mueller’s comments about negative SEO and link disavow companies started with a tweet by Ryan Jones (@RyanJones).

Ryan tweeted that he was shocked at how many SEOs regularly offer disavowing links.

He tweeted:

“I’m still shocked at how many seos regularly disavow links. Why? Unless you spammed them or have a manual action you’re probably doing more harm than good.”

Ryan is shocked because Google has consistently recommended the tool only for disavowing paid/spammy links that the sites (or their SEOs) are responsible for.

And yet, here we are, eleven years later, and SEOs are still misusing the tool for removing other kinds of links.

Here’s the background information about that.

Link Disavow Tool

From the mid-2000s until the Penguin Update in April 2012, there was a thriving open market for paid links. The commerce in paid links was staggering.

I knew of one publisher with around fifty websites who received a $30,000 check every month for hosting paid links on his site.

Even though I advised my clients against it, some of them still purchased links because they saw everyone else was buying them and getting away with it.

The Penguin Update caused the link-selling boom to collapse.

Thousands of websites lost rankings.

SEOs and affected websites strained under the burden of having to contact all the sites from which they purchased paid links to ask to have them removed.

So some in the SEO community asked Google for a more convenient way to disavow the links.

Months went by and after resisting the requests, Google relented and released a disavow tool.

Google cautioned from the very beginning to only use the tool for disavowing links that the site publishers (or their SEOs) are responsible for.

The first paragraph of Google’s October 2012 announcement of the link disavow tool leaves no doubt on when to use the tool:

“Today we’re introducing a tool that enables you to disavow links to your site.

If you’ve been notified of a manual spam action based on ‘unnatural links’ pointing to your site, this tool can help you address the issue.

If you haven’t gotten this notification, this tool generally isn’t something you need to worry about.”

The message couldn’t be clearer.

But at some point in time, link disavowing became a service applied to random and “spammy looking” links, which is not what the tool is for.

Link Disavow Takes Months To Work

There are many anecdotes about link disavows that helped sites regain rankings.

They aren’t lying, I know credible and honest people who have made this claim.

But here’s the thing, John Mueller has confirmed that the link disavow process takes months to work its way through Google’s algorithm.

Sometimes things happen that are not related, no correlation. It just looks that way.

John shared how long it takes for a link disavow to work in a Webmaster Hangout:

“With regards to this particular case, where you’re saying you submitted a disavow file and then the ranking dropped or the visibility dropped, especially a few days later, I would assume that that is not related.

So in particular with the disavow file, what happens is we take that file into account when we reprocess the links kind of pointing to your website.

And this is a process that happens incrementally over a period of time where I would expect it would have an effect over the course of… I don’t know… maybe three, four, five, six months …kind of step by step going in that direction.

So if you’re saying that you saw an effect within a couple of days and it was a really strong effect then I would assume that this effect is completely unrelated to the disavow file. …it sounds like you still haven’t figured out what might be causing this.”

John Mueller: Negative SEO and Link Disavow Companies are Making Stuff Up

Context is important to understand what was said.

So here’s the context for John Mueller’s remark.

An SEO responded to Ryan’s tweet about being shocked at how many SEOs regularly disavow links.

The person responding to Ryan tweeted that disavowing links was still important, that agencies provide negative SEO services to take down websites and that link disavow is a way to combat the negative links.

The SEO (SEOGuruJaipur) tweeted:

“Google still gives penalties for backlinks (for example, 14 Dec update, so disavowing links is still important.”

SEOGuruJaipur next began tweeting about negative SEO companies.

Negative SEO companies are those that will build spammy links to a client’s competitor in order to make the competitor’s rankings drop.

SEOGuruJaipur tweeted:

“There are so many agencies that provide services to down competitors; they create backlinks for competitors such as comments, bookmarking, directory, and article submission on low quality sites.”

SEOGuruJaipur continued discussing negative SEO link builders, saying that only high trust sites are immune to the negative SEO links.

He tweeted:

“Agencies know what kind of links hurt the website because they have been doing this for a long time.

It’s only hard to down for very trusted sites. Even some agencies provide a money back guarantee as well.

They will provide you examples as well with proper insights.”

John Mueller tweeted his response to the above tweets:

“That’s all made up & irrelevant.

These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.”

Then someone else joined the discussion, and Mueller tweeted a response:

“Don’t waste your time on it; do things that build up your site instead.”

Unambiguous Statement on Negative SEO and Link Disavow Services

A statement by John Mueller (or anyone) can appear to conflict with prior statements when taken out of context.

That’s why I not only placed his statements into their original context but also laid out the history, going back eleven years, that is part of that discussion.

It’s clear that John Mueller feels that those selling negative SEO services and those providing disavow services outside of the intended use are “making stuff up” and “cashing in” on clients who might not “know better.”

Featured image by Shutterstock/Asier Romero


