SEO

Google Search Console URL Inspection API: What It Is & How to Use It

Diagnosing technical issues can be one of the most time-consuming but important aspects of running a website.

To make things worse, Google only lets you inspect one URL at a time to diagnose potential issues (this is done within Google Search Console).

Luckily, there is now a faster way to test your website: enter the Google Search Console URL Inspection API…

What is the Google Search Console URL Inspection API?

The Google Search Console URL Inspection API is a way to bulk-check the data that Google Search Console has on URLs. Its purpose is to help developers and SEOs more efficiently debug and optimize their pages using Google’s own data.

Here’s an example of me using the API to check whether a few URLs are indexed and submitted in my sitemap:

[Image: API results table showing the URLs are indexed but not submitted in the sitemap]

What type of data can you get from the Google Search Console URL Inspection API?

The Google Search Console URL Inspection API allows you to pull a wide range of data. Below is a list of some of the most discussed features:

lastCrawlTime

With this field, you can see exactly when Googlebot last crawled a given URL. This is extremely useful for SEOs and developers who want to measure how frequently Google crawls their sites. Previously, you could only get this kind of data through log file analysis or by spot-checking individual URLs in Google Search Console.

robotsTxtState

With this field, you can understand whether you have any robots.txt rules that will block Googlebot. This is something you can check manually, but being able to test it at scale with Google’s own data is a fantastic step forward.

googleCanonical and userCanonical

Google sometimes selects a different canonical from the one specified in the code. When that happens, being able to compare the two (side by side and at scale) via the API makes it much easier to work out the appropriate changes.

crawledAs

This field tells you which user agent Google used to crawl a URL: mobile or desktop. The possible values are below for reference:

  • DESKTOP – Desktop user agent
  • MOBILE – Mobile user agent

pageFetchState

Understanding the pageFetchState can help you diagnose server errors, 404s, soft 404s, redirection errors, pages blocked by robots.txt, and invalid URLs. The possible values are below for reference:

  • PAGE_FETCH_STATE_UNSPECIFIED – Unknown fetch state
  • SUCCESSFUL – Successful fetch
  • SOFT_404 – Soft 404
  • BLOCKED_ROBOTS_TXT – Blocked by robots.txt
  • NOT_FOUND – Not found (404)
  • ACCESS_DENIED – Blocked due to unauthorized request (401)
  • SERVER_ERROR – Server error (5xx)
  • REDIRECT_ERROR – Redirection error
  • ACCESS_FORBIDDEN – Blocked due to access forbidden (403)
  • BLOCKED_4XX – Blocked due to another 4xx issue (not 403 or 404)
  • INTERNAL_CRAWL_ERROR – Internal error
  • INVALID_URL – Invalid URL

indexingState

The indexing state tells you the current indexation status of your URLs. Apart from the more obvious PASS and FAIL responses, there are three others:

  • NEUTRAL is equivalent to the “Excluded” message in Search Console.
  • PARTIAL is equivalent to the “Valid with warnings” message in Search Console.
  • VERDICT_UNSPECIFIED means that Google is unable to come to a conclusion about the URL(s) in question.

coverageState 

This gives you detail on whether a URL has been submitted in your sitemap and indexed.

referringUrls

This allows you to see where each page is linked from, according to Google.


sitemap

This enables you to understand which URLs are included in the sitemap(s).

Other uses for the API

You can also use the API to inspect your AMP site—if you have one.

How to use the Google Search Console URL Inspection API step by step

Using the Google Search Console URL Inspection API involves making a request to Google. The request parameters you need to define are the URL you want to inspect and the URL of the property in Google Search Console.

The request body contains data with the following structure:

[Image: structure of the request body]

If you are curious to learn more about how to use the API, Google has extensive documentation about this.
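If you want to try it directly, here is a minimal sketch in Python against the documented urlInspection/index:inspect endpoint. It assumes you have already obtained an OAuth 2.0 access token with the Search Console scope (authentication is omitted for brevity, and the property/URL values are placeholders):

```python
import requests

# Documented endpoint for the URL Inspection API.
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

TOKEN = "ya29.your-oauth-access-token"  # placeholder OAuth 2.0 access token

def inspect_url(access_token: str, site_url: str, page_url: str) -> dict:
    """Fetch Google's index data for one URL in a property you own."""
    body = {
        "inspectionUrl": page_url,  # the URL to inspect
        "siteUrl": site_url,        # your property, e.g. "https://example.com/"
                                    # or "sc-domain:example.com" for a domain property
    }
    response = requests.post(
        ENDPOINT,
        json=body,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# Print a few of the fields discussed above (URLs are hypothetical).
result = inspect_url(TOKEN, "https://example.com/", "https://example.com/blog/")
index_status = result["inspectionResult"]["indexStatusResult"]
for field in ("lastCrawlTime", "robotsTxtState", "googleCanonical", "coverageState"):
    print(field, "=", index_status.get(field))
```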

Below is an example of the type of response you can get from the API:

[Image: example JSON response from the API]

If you’re not comfortable with code or just want to give it a go straight away, you can use valentin.app’s free Google Bulk Inspect URLs tool. The tool provides a quick way to query the API without any coding skills!

Here’s how to use it. You can:

  1. Go to https://valentin.app/inspect.html, authorize access to your Google account, and select the Search Console property you want to test. Then paste your URLs into the box below. (The data will be processed in your browser and not uploaded to a server or shared with anyone.)
  2. Click the “Inspect URLs” button. The data will start to pull from the API.
  3. Export the data as a CSV or Excel file by clicking the button.
  4. Analyze the data and check for any potential issues.

How can you use the Google Search Console URL Inspection API in practice?

In theory, the Google Search Console URL Inspection API seems like a great way to understand more about your website. However, you can pull so much data that it’s difficult to know where to start. So let’s look at a few examples of use cases.

1. Site migration – diagnosing any technical issues

Site migrations can cause all kinds of issues. For example, developers can accidentally block Google from crawling your site or certain pages via robots.txt.

Luckily, the Google Search Console URL Inspection API makes auditing for these issues a doddle.

For example, you can check whether you’re blocking Googlebot from crawling URLs in bulk by calling robotsTxtState.

Here is an example of me using the Google Search Console URL Inspection API (via valentin.app) to call robotsTxtState to see the current status of my URLs.

[Image: API results table showing the robots.txt status of the URLs as "allowed"]

As you can see, these pages are not blocked by robots.txt, and there are no issues here.
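If you're querying the API directly rather than through the tool, a small sketch like this can flag anything Googlebot is blocked from (it reuses the hypothetical inspect_url() helper from earlier; the property and URLs are placeholders):

```python
# Bulk-check robotsTxtState (assumes the inspect_url() helper sketched above).
urls = [
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/pricing/",
]

for url in urls:
    data = inspect_url(TOKEN, "https://example.com/", url)
    state = data["inspectionResult"]["indexStatusResult"].get("robotsTxtState")
    if state != "ALLOWED":  # e.g. DISALLOWED, or unspecified if Google can't tell
        print(f"Check robots.txt rules for {url}: {state}")
```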

TIP

Site migrations can sometimes lead to unforeseen technical SEO issues. After the migration, we recommend using a tool such as Ahrefs' Site Audit to check your website for over 100 predefined SEO issues.

2. Understand if Google has respected your declared canonicals

If you make a change to the canonical tags across your site, you will want to know whether or not Google is respecting them.


You may be wondering why Google ignores the canonical that you declared. Google can do this for a number of reasons, for example:

  • Your declared canonical is not https. (Google prefers https for canonicals.)
  • Google has chosen a page that it believes is a better canonical page than your declared canonical.
  • Your declared canonical is a noindex page.

Below is an example of me using the Google Search Console URL Inspection API to see whether Google has respected my declared canonicals:

[Image: API results table showing Google respects the declared canonicals]

As we can see from the above screenshot, there are no issues with these particular pages, and Google is respecting the canonicals.

TIP

To quickly see if the googleCanonical matches the userCanonical, export the data from the Google Bulk Inspect URLs tool to CSV and use an IF formula in Excel. For example, assuming your googleCanonical data is in Column A and your userCanonical is in Column B, you can use the formula =IF(A2=B2,"Self referencing","Non self referencing") to flag non-matching canonicals.
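If you'd rather do the same check in Python, here's a minimal pandas sketch. The filename and column names are assumptions; adjust them to match your export's actual headers:

```python
import pandas as pd

# Load the CSV exported from the bulk inspection tool (hypothetical filename).
df = pd.read_csv("inspection_results.csv")

# Flag rows where Google chose a different canonical from the declared one.
# "URL", "googleCanonical", and "userCanonical" are assumed column names.
mismatches = df[df["googleCanonical"] != df["userCanonical"]]
print(mismatches[["URL", "googleCanonical", "userCanonical"]])
```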

3. Understand when Google recrawls after you make changes to your site

When you update many pages on your website, you will want to know the impact of your efforts. This can only happen after Google has recrawled your site.

With the Google Search Console URL Inspection API, you can see the precise time Google crawled your pages by using lastCrawlTime.

If you can’t get access to the log files for your website, then this is a great alternative to understand how Google crawls your site.


Here’s an example of me checking this:

[Image: API results table showing the "last crawl" date and time for each URL]

As you can see in the screenshot above, lastCrawlTime shows the date and time each URL was crawled. In this example, the most recently crawled URL is the homepage.

Knowing when Google has recrawled your pages after you make changes lets you tie any subsequent positive or negative movement back to those changes.
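If you're querying the API directly, lastCrawlTime comes back as an ISO 8601 timestamp, so checking how stale a URL is takes only a few lines (a sketch reusing the hypothetical inspect_url() helper from earlier):

```python
from datetime import datetime, timezone

# How many days since Googlebot last crawled this URL? (URL is a placeholder.)
result = inspect_url(TOKEN, "https://example.com/", "https://example.com/blog/")
last_crawl = result["inspectionResult"]["indexStatusResult"]["lastCrawlTime"]

# lastCrawlTime is an RFC 3339 string such as "2022-02-25T06:14:56Z".
crawled_at = datetime.fromisoformat(last_crawl.replace("Z", "+00:00"))
age_days = (datetime.now(timezone.utc) - crawled_at).days
print(f"Last crawled {age_days} days ago")
```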

FAQs

How do you get around the Google Search Console URL Inspection API limits?

Although the Google Search Console URL Inspection API is limited to 2,000 queries per day, that limit applies per Search Console property.

This means that if you verify multiple properties for the same website in Google Search Console, each property gets its own 2,000-query allowance, effectively raising the daily ceiling.

Google Search Console allows up to 1,000 properties per account, so this should be more than enough for most users.

Can I use the Google Search Console URL Inspection API on any website?

Another potential limiting factor is that you can only run the Google Search Console URL Inspection API on a property you own in Google Search Console. If you don't have access to the property, you cannot audit it with the API.


In practice, this means auditing a site you don't have access to isn't possible.

How accurate is the data?

Accuracy of Search Console data has been a recurring issue for Google over the last few years, and the API simply exposes that same data. So, arguably, the Google Search Console URL Inspection API is only as good as the data behind it.

[Embedded tweet: Tim Guillot to John Mueller about Search Console data issues, with Tim adding that the issues are "not fixed"]

As we have previously shown in our study of Google Keyword Planner’s accuracy, data from Google is often not as accurate as people assume it to be.

Final thoughts

The Google Search Console URL Inspection API is a great way for site owners to get bulk data directly from Google on a larger scale than was previously possible in Google Search Console.

Daniel Waisberg and the team behind the Google Search Console URL Inspection API have definitely done a great job of getting this released into the wild.

But one of the criticisms of the Google Search Console URL Inspection API from the SEO community is that the query rate limit is too low for larger sites. (It is capped at 2,000 queries per day, per property.)


For larger sites, this is not enough, and even with the possible workarounds, the cap still seems to be on the low side.

What’s your experience of using the Google Search Console URL Inspection API? Got more questions? Ping me on Twitter. 🙂





SEO

WordPress Releases A Performance Plugin For “Near-Instant Load Times”


[Image: WordPress speculative loading plugin]

WordPress has released an official plugin that adds support for a cutting-edge technology called speculative loading, which can help boost site performance and improve the user experience for site visitors.

Speculative Loading

Rendering is what your browser does when it downloads the HTML, images, and other resources and assembles them into the webpage you see. Prerendering is putting that webpage together (rendering it) in the background, before the user actually navigates to it.

What this plugin does is enable the browser to prerender the entire webpage a user might navigate to next. It does that by anticipating which page the user is likely to visit based on where they are hovering.

Chrome prefers to prerender only when there is at least an 80% probability of the user navigating to another webpage. The official Chrome support page for prerendering explains:

"Pages should only be prerendered when there is a high probability the page will be loaded by the user. This is why the Chrome address bar prerendering options only happen when there is such a high probability (greater than 80% of the time)."

There is also a caveat on that same developer page that prerendering may not happen depending on user settings, memory usage, and other scenarios (more details below about how analytics handles prerendering).


The Speculation Rules API solves a problem that previous solutions could not, because in the past they simply prefetched resources like JavaScript and CSS without actually prerendering the entire webpage.

The official WordPress announcement explains it like this:

"Introducing the Speculation Rules API

The Speculation Rules API is a new web API that solves the above problems. It allows defining rules to dynamically prefetch and/or prerender URLs of certain structure based on user interaction, in JSON syntax—or in other words, speculatively preload those URLs before the navigation. This API can be used, for example, to prerender any links on a page whenever the user hovers over them.

Also, with the Speculation Rules API, 'prerender' actually means to prerender the entire page, including running JavaScript. This can lead to near-instant load times once the user clicks on the link as the page would have most likely already been loaded in its entirety. However that is only one of the possible configurations."
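For context, speculation rules are just a JSON block inside a script tag. Below is a minimal, hypothetical example of a document-level rule following the syntax in Chrome's Speculation Rules documentation (the URL patterns are placeholders, not what the plugin actually outputs):

```html
<script type="speculationrules">
{
  "prerender": [
    {
      "source": "document",
      "where": {
        "and": [
          { "href_matches": "/*" },
          { "not": { "href_matches": "/wp-admin/*" } }
        ]
      },
      "eagerness": "moderate"
    }
  ]
}
</script>
```

Here, "moderate" eagerness tells the browser it may start speculating when the user hovers over a matching link, which lines up with the hover-based behavior described above.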

The new WordPress plugin adds support for the Speculation Rules API. The Mozilla developer pages, a great resource for understanding web technologies, describe it like this:

“The Speculation Rules API is designed to improve performance for future navigations. It targets document URLs rather than specific resource files, and so makes sense for multi-page applications (MPAs) rather than single-page applications (SPAs).

The Speculation Rules API provides an alternative to the widely available <link rel="prefetch"> feature and is designed to supersede the Chrome-only deprecated <link rel="prerender"> feature. It provides many improvements over these technologies, along with a more expressive, configurable syntax for specifying which documents should be prefetched or prerendered."


See also: Are Websites Getting Faster? New Data Reveals Mixed Results

Performance Lab Plugin

The new plugin was developed by the official WordPress performance team, which occasionally rolls out new plugins for users to test ahead of possible inclusion in WordPress core. So it's a good opportunity to be among the first to try new performance technologies.

The new WordPress plugin is set by default to prerender "WordPress frontend URLs," which are pages, posts, and archive pages. How it works can be fine-tuned under the settings:

Settings > Reading > Speculative Loading

Browser Compatibility

The Speculation Rules API has been supported since Chrome 108; however, the specific rules used by the new plugin require Chrome 121 or higher. Chrome 121 was released in early 2024.

Browsers that do not support the API will simply ignore the rules, with no effect on the user experience.

Check out the new Speculative Loading WordPress plugin developed by the official core WordPress performance team.


How Analytics Handles Prerendering

A WordPress developer commented with a question asking how analytics would handle prerendering, and the answer was that it's up to the analytics provider to detect a prerender and not count it as a page load or site visit.

Fortunately, both Google Analytics and Google Publisher Tag (GPT) are able to handle prerenders. The Chrome developers support page has a note about how analytics handles prerendering:

“Google Analytics handles prerender by delaying until activation by default as of September 2023, and Google Publisher Tag (GPT) made a similar change to delay triggering advertisements until activation as of November 2023.”
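Under the hood, this kind of detection relies on the browser's prerendering API. Here's a minimal, hypothetical sketch of how a script can defer a tracking call until a prerendered page is actually shown (sendPageView is a placeholder for your own analytics call):

```html
<script>
  // Placeholder for your real analytics/tracking call.
  function sendPageView() { /* ... */ }

  if (document.prerendering) {
    // The page is being prerendered: wait until it is actually
    // activated (shown to the user) before counting a page view.
    document.addEventListener("prerenderingchange", sendPageView, { once: true });
  } else {
    sendPageView();
  }
</script>
```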

Possible Conflict With Ad Blocker Extensions

There are a couple of things to be aware of with this plugin, aside from the fact that it's an experimental feature that requires Chrome 121 or higher.

A WordPress plugin developer has commented that the feature may not work in browsers running the uBlock Origin ad-blocking extension.

Download the plugin:
Speculative Loading Plugin by the WordPress Performance Team

Read the announcement at WordPress
Speculative Loading in WordPress


See also: WordPress, Wix & Squarespace Show Best CWV Rate Of Improvement


SEO

10 Paid Search & PPC Planning Best Practices


Whether you are new to paid media or reevaluating your efforts, it’s critical to review your performance and best practices for your overall PPC marketing program, accounts, and campaigns.

Revisiting your paid media plan is an opportunity to ensure your strategy aligns with your current goals.

Reviewing best practices for pay-per-click is also a great way to keep up with trends and improve performance with newly released ad technologies.

As you review, you’ll find new strategies and features to incorporate into your paid search program, too.

Here are 10 PPC best practices to help you adjust and plan for the months ahead.


1. Goals

When planning, it is best practice to define goals for the overall marketing program, ad platforms, and at the campaign level.

Defining primary and secondary goals guides the entire PPC program. For example, your primary conversion may be to generate leads from your ads.

You'll also want to look at secondary goals, such as brand awareness, which sits higher in the sales funnel and can drive the interest that ultimately generates sales leads.

2. Budget Review & Optimization

Some advertisers get stuck in a rut and forget to review and reevaluate the distribution of their paid media budgets.

To best utilize budgets, consider the following:

  • Reconcile your planned vs. actual spend for each account or campaign on a regular basis. Depending on the budget size, monthly, quarterly, or semiannual reviews will work as long as you can hit budget numbers.
  • Determine whether any campaigns should be eliminated at this time to free up budget for other campaigns.
  • Is there additional traffic available to capture and grow results for successful campaigns? The ad platforms often include a tool that provides an estimated daily budget with clicks and costs, which is a useful indicator of extra click potential.
  • If other paid media channels are performing poorly, does it make sense to shift those budgets elsewhere?
  • For the overall paid search and paid social budget, can your company invest more in the campaigns that are delivering positive results?

3. Consider New Ad Platforms

If you can shift or increase your budgets, why not test out a new ad platform? Knowing your audience and where they spend time online will help inform your decision when choosing ad platforms.

Go beyond your comfort zone in Google, Microsoft, and Meta Ads.


Here are a few other advertising platforms to consider testing:

  • LinkedIn: Most appropriate for professional and business targeting. LinkedIn audiences can also be reached through Microsoft Ads.
  • TikTok: Younger Gen Z audience (16 to 24), video.
  • Pinterest: Products, services, and consumer goods with a female-focused target.
  • Snapchat: Younger demographic (13 to 35), video ads, app installs, filters, lenses.

Need more detailed information and even more ideas? Read more about the 5 Best Google Ads Alternatives.

4. Top Topics in Google Ads & Microsoft Ads

Recently, trends in search and social ad platforms have presented opportunities to connect with prospects more precisely, creatively, and effectively.

Don’t overlook newer targeting and campaign types you may not have tried yet.

  • Video: Incorporating video into your PPC accounts takes some planning for the goals, ad creative, targeting, and ad types. There is a lot of opportunity here as you can simply include video in responsive display ads or get in-depth in YouTube targeting.
  • Performance Max: This automated campaign type serves across all of Google's ad inventory. Microsoft Ads recently released PMax too, so you can plan for consistency in campaign types across platforms. Do you want to allocate budget to PMax campaigns? Learn more about how PMax compares to search.
  • Automation: While AI can’t replace human strategy and creativity, it can help manage your campaigns more easily. During planning, identify which elements you want to automate, such as automatically created assets and/or how to successfully guide the AI in the Performance Max campaigns.

While exploring new features, check out some hidden PPC features you probably don’t know about.

5. Revisit Keywords

The role of keywords has evolved over the past several years with match types being less precise and loosening up to consider searcher intent.

For example, [exact match] keywords previously would literally match with the exact keyword search query. Now, ads can be triggered by search queries with the same meaning or intent.

A great planning exercise is to lay out keyword groups and evaluate if they are still accurately representing your brand and product/service.


Review search term queries triggering ads to discover trends and behavior you may not have considered. It’s possible this has impacted performance and conversions over time.

Critical to your strategy:

  • Review the current keyword rules and determine if this may impact your account in terms of close variants or shifts in traffic volume.
  • Brush up on how keywords work in each platform because the differences really matter!
  • Review search term reports more frequently for irrelevant queries that may pop up from match type changes. Incorporate these into match type adjustments or negative keyword lists as appropriate.

6. Revisit Your Audiences

Review the audiences you selected in the past, especially given so many campaign types that are intent-driven.

Automated features that expand your audience could be helpful, but keep an eye out for performance metrics and behavior on-site post-click.

Remember, an audience is simply a list of users who are grouped together by interests or behavior online.

Therefore, there are unlimited ways to mix and match those audiences and target per the sales funnel.

Here are a few opportunities to explore and test:

  • LinkedIn user targeting: Outside of LinkedIn itself, this is available exclusively in Microsoft Ads.
  • Detailed demographics: Marital status, parental status, home ownership, education, household income.
  • In-market and custom intent: Searches and online behavior signaling buying cues.
  • Remarketing: Your website visitors, people who interacted with your ads, and video/YouTube viewers.

Note: Audience availability varies by campaign type and seems to be updated frequently, so make this a regular checkpoint in your campaign management for all platforms.

7. Organize Data Sources

You will likely be running campaigns on different platforms with combinations of search, display, video, etc.

Looking back at your goals, what is the important data, and which platforms will you use to review and report? Can you get the majority of data in one analytics platform to compare and share?

Millions of companies use Google Analytics, which is a good option for centralized viewing of advertising performance, website behavior, and conversions.

8. Reevaluate How You Report

Have you been using the same performance report for years?

It’s time to reevaluate your essential PPC key metrics and replace or add that data to your reports.

There are two great resources to kick off this exercise:


Your objectives in reevaluating the reporting are:

  • Are we still using this data? Is it still relevant?
  • Is the data we are viewing actionable?
  • What new metrics should we consider adding that we haven't thought about?
  • How often do we need to see this data?
  • Do the stakeholders receiving the report understand what they are looking at (aka data visualization)?

Adding new data should be purposeful, actionable, and helpful in making decisions for the marketing plan. It’s also helpful to decide what type of data is good to see as “deep dives” as needed.

9. Consider Using Scripts

The current ad platforms have plenty of AI recommendations and automated rules, and there is no shortage of third-party tools that can help with optimizations.

Scripts are another way for advertisers with large accounts (or some scripting skills) to automate report generation and repetitive tasks in their Google Ads accounts.

Navigating the world of scripts can seem overwhelming, but a good place to start is a post here on Search Engine Journal that provides use cases and resources to get started with scripts.

Luckily, you don’t need a Ph.D. in computer science — there are plenty of resources online with free or templated scripts.
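To give a flavor, here is a minimal, hypothetical Google Ads script (Ads scripts are written in JavaScript) that logs last week's cost and clicks per enabled campaign; treat it as a sketch rather than a production report:

```javascript
function main() {
  // Iterate over the account's campaigns and log basic stats
  // for the enabled ones.
  var campaigns = AdsApp.campaigns().get();
  while (campaigns.hasNext()) {
    var campaign = campaigns.next();
    if (!campaign.isEnabled()) continue; // skip paused/removed campaigns
    var stats = campaign.getStatsFor('LAST_7_DAYS');
    Logger.log(campaign.getName() + ': cost ' + stats.getCost() +
               ', clicks ' + stats.getClicks());
  }
}
```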

10. Seek Collaboration

Another effective planning tactic is to seek out friendly resources and second opinions.


Much of the skill and science of PPC management is unique to the individual or agency, so there is no shortage of ideas to share with one another.

You can visit the Paid Search Association, a resource for paid ad managers worldwide, to make new connections and find industry events.

Preparing For Paid Media Success

Strategies should be based on clear and measurable business goals. Then, you can evaluate the current status of your campaigns based on those new targets.

Your paid media strategy should also be built with an eye for both past performance and future opportunities. Look backward and reevaluate your existing assumptions and systems while investigating new platforms, topics, audiences, and technologies.

Also, stay current with trends and keep learning. Check out ebooks, social media experts, and industry publications for resources and motivational tips.

More resources: 


Featured Image: Vanatchanan/Shutterstock


SEO

Google Limits News Links In California Over Proposed ‘Link Tax’ Law


Google announced that it plans to reduce access to California news websites for a portion of users in the state.

The decision comes as Google prepares for the potential passage of the California Journalism Preservation Act (CJPA), a bill requiring online platforms like Google to pay news publishers for linking to their content.

What Is The California Journalism Preservation Act?

The CJPA, introduced in the California State Legislature, aims to support local journalism by creating what Google refers to as a “link tax.”

If passed, the Act would force companies like Google to pay media outlets when sending readers to news articles.

However, Google believes this approach is misguided and could harm, rather than help, the news industry.


Jaffer Zaidi, Google’s VP of Global News Partnerships, stated in a blog post:

“It would favor media conglomerates and hedge funds—who’ve been lobbying for this bill—and could use funds from CJPA to continue to buy up local California newspapers, strip them of journalists, and create more ghost papers that operate with a skeleton crew to produce only low-cost, and often low-quality, content.”

Google’s Response

To assess the potential impact of the CJPA on its services, Google is running a test with a percentage of California users.

During this test, Google will remove links to California news websites that the proposed legislation could cover.

Zaidi states:

“To prepare for possible CJPA implications, we are beginning a short-term test for a small percentage of California users. The testing process involves removing links to California news websites, potentially covered by CJPA, to measure the impact of the legislation on our product experience.”

Google Claims Only 2% of Search Queries Are News-Related

Zaidi highlighted people's changing news consumption habits and their effect on Google search queries (emphasis mine):

“It’s well known that people are getting news from sources like short-form videos, topical newsletters, social media, and curated podcasts, and many are avoiding the news entirely. In line with those trends, just 2% of queries on Google Search are news-related.”

Despite the low percentage of news queries, Google wants to continue helping news publishers gain visibility on its platforms.


However, the “CJPA as currently constructed would end these investments,” Zaidi says.

A Call For A Different Approach

In its current form, Google maintains that the CJPA undermines news in California and could leave all parties worse off.

The company urges lawmakers to consider alternative approaches supporting the news industry without harming smaller local outlets.

Google argues that, over the past two decades, it’s done plenty to help news publishers innovate:

“We’ve rolled out Google News Showcase, which operates in 26 countries, including the U.S., and has more than 2,500 participating publications. Through the Google News Initiative we’ve partnered with more than 7,000 news publishers around the world, including 200 news organizations and 6,000 journalists in California alone.”

Zaidi suggested that a healthy news industry in California requires support from the state government and a broad base of private companies.

As the legislative process continues, Google is willing to cooperate with California publishers and lawmakers to explore alternative paths that would allow it to continue linking to news.


Featured Image: Ismael Juan/Shutterstock

