A Complete Google Search Console Guide For SEO Pros

Google Search Console provides data necessary to monitor website performance in search and improve search rankings, information that is available exclusively through Search Console.

This makes it indispensable for online businesses and publishers keen to maximize success.

Taking control of your search presence is easier when using the free tools and reports.

What Is Google Search Console?

Google Search Console is a free web service hosted by Google that provides a way for publishers and search marketing professionals to monitor their overall site health and performance relative to Google search.

It offers an overview of metrics related to search performance and user experience to help publishers improve their sites and generate more traffic.

Search Console also provides a way for Google to communicate when it discovers security issues (like hacking vulnerabilities) and if the search quality team has imposed a manual action penalty.

Important features:

  • Monitor indexing and crawling.
  • Identify and fix errors.
  • Overview of search performance.
  • Request indexing of updated pages.
  • Review internal and external links.

It’s not necessary to use Search Console to rank better, nor is it a ranking factor.

However, the usefulness of the Search Console makes it indispensable for helping improve search performance and bringing more traffic to a website.

How To Get Started

The first step to using Search Console is to verify site ownership.

Google provides several different ways to accomplish site verification, depending on whether you’re verifying a website, a domain, a Google site, or a Blogger-hosted site.

Domains registered with Google Domains are automatically verified by adding them to Search Console.

The majority of users will verify their sites using one of four methods:

  1. HTML file upload.
  2. Meta tag.
  3. Google Analytics tracking code.
  4. Google Tag Manager.

Some site hosting platforms limit what can be uploaded and require a specific way to verify site owners.

But, that’s becoming less of an issue as many hosted site services have an easy-to-follow verification process, which will be covered below.

How To Verify Site Ownership

There are two standard ways to verify site ownership with a regular website, like a standard WordPress site.

  1. HTML file upload.
  2. Meta tag.

When verifying a site using either of these two methods, you’ll be choosing the URL-prefix properties process.

Let’s stop here and acknowledge that the phrase “URL-prefix properties” means absolutely nothing to anyone but the Googler who came up with that phrase.

Don’t let that make you feel like you’re about to enter a labyrinth blindfolded. Verifying a site with Google is easy.

HTML File Upload Method

Step 1: Go to the Search Console and open the Property Selector dropdown that’s visible in the top left-hand corner on any Search Console page.

Screenshot by author, May 2022

Step 2: In the pop-up labeled Select Property Type, enter the URL of the site, then click the Continue button.

Screenshot by author, May 2022

Step 3: Select the HTML file upload method and download the HTML file.

Step 4: Upload the HTML file to the root of your website.

Root means https://example.com/. So, if the downloaded file is called verification.html, then the uploaded file should be located at https://example.com/verification.html.
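For reference, the downloaded verification file itself is just a one-line token. A hypothetical example of its contents (your file name and token will differ):

```
google-site-verification: google1234567890abcdef.html
```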

Step 5: Finish the verification process by clicking Verify back in the Search Console.

Verification of a standard website with its own domain in website platforms like Wix and Weebly is similar to the above steps, except that you’ll be adding a verification meta tag to your site instead.
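The verification meta tag is a single line placed in the head section of your home page. A minimal sketch with a placeholder token (Search Console generates a unique value for each property):

```html
<head>
  <!-- Placeholder token; copy the exact tag Search Console gives you -->
  <meta name="google-site-verification" content="your-unique-token" />
</head>
```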

Duda has a simple approach that uses a Search Console app to easily verify the site and get its users started.

Troubleshooting With GSC

Ranking in search results depends on Google’s ability to crawl and index webpages.

The Search Console URL Inspection Tool warns of any issues with crawling and indexing before they become major problems and pages start dropping from the search results.

URL Inspection Tool

The URL inspection tool shows whether a URL is indexed and is eligible to be shown in a search result.

For each submitted URL a user can:

  • Request indexing for a recently updated webpage.
  • View how Google discovered the webpage (sitemaps and referring internal pages).
  • View the last crawl date for a URL.
  • Check if Google is using a declared canonical URL or is using another one.
  • Check mobile usability status.
  • Check enhancements like breadcrumbs.

Coverage

The coverage section shows Discovery (how Google discovered the URL), Crawl (shows whether Google successfully crawled the URL and if not, provides a reason why), and Enhancements (provides the status of structured data).

The coverage section can be reached from the left-hand menu:

Screenshot by author, May 2022

Coverage Error Reports

While these reports are labeled as errors, it doesn’t necessarily mean that something is wrong. Sometimes it just means that indexing can be improved.

For example, in the following screenshot, Google is showing a 403 Forbidden server response to nearly 6,000 URLs.

The 403 error response means that the server is telling Googlebot that it is forbidden from crawling these URLs.

Coverage report showing 403 server error responses. Screenshot by author, May 2022

The above errors are happening because Googlebot is blocked from crawling the member pages of a web forum.

Every member of the forum has a member page that has a list of their latest posts and other statistics.

The report provides a list of URLs that are generating the error.

Clicking on one of the listed URLs reveals a menu on the right that provides the option to inspect the affected URL.

There’s also a contextual menu to the right of the URL itself, in the form of a magnifying glass icon, that provides the same Inspect URL option.

Inspect URL. Screenshot by author, May 2022

Clicking Inspect URL reveals how the page was discovered.

It also shows the following data points:

  • Last crawl.
  • Crawled as.
  • Crawl allowed?
  • Page fetch (if failed, provides the server error code).
  • Indexing allowed?

There is also information about the canonical used by Google:

  • User-declared canonical.
  • Google-selected canonical.

For the forum website in the above example, the important diagnostic information is located in the Discovery section.

This section tells us which pages are showing Googlebot the links to member profiles.

With this information, the publisher can now code a PHP statement that hides the links to the member pages whenever a search engine bot comes crawling, as sketched below.
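A minimal sketch of that approach, assuming a simple user-agent check (the crawler list and the member-page URL are illustrative assumptions, not the forum’s actual code):

```php
<?php
// Detect common search engine crawlers by user agent (illustrative list).
$userAgent = $_SERVER['HTTP_USER_AGENT'] ?? '';
$isBot = (bool) preg_match('/Googlebot|bingbot/i', $userAgent);

// Only render member-profile links for human visitors,
// so crawlers never discover the forbidden URLs.
if (!$isBot) {
    echo '<a href="/members/example-user">example-user</a>';
}
```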

Another way to fix the problem is to write a new entry to the robots.txt to stop Google from attempting to crawl these pages.
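For example, assuming the member pages all live under a /members/ path (a hypothetical structure), the robots.txt entry could be as simple as:

```
# Keep crawlers out of member profile pages
User-agent: *
Disallow: /members/
```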

By making this 403 error go away, we free up crawling resources for Googlebot to index the rest of the website.

Google Search Console’s coverage report makes it possible to diagnose Googlebot crawling issues and fix them.

Fixing 404 Errors

The coverage report can also alert a publisher to 404 and 500 series error responses, as well as communicate that everything is just fine.

A 404 server response is called an error only because the request itself was made in error: the browser or crawler asked for a webpage that does not exist.

It doesn’t mean that your site is in error.

If another site (or an internal link) links to a page that doesn’t exist, the coverage report will show a 404 response.

Clicking on one of the affected URLs and selecting the Inspect URL tool will reveal what pages (or sitemaps) are referring to the non-existent page.

From there you can decide if the link is broken and needs to be fixed (in the case of an internal link) or redirected to the correct page (in the case of an external link from another website).

Or, it could be that the webpage never existed and whoever is linking to that page made a mistake.

If the page doesn’t exist anymore or it never existed at all, then it’s fine to show a 404 response.

Taking Advantage Of GSC Features

The Performance Report

The top part of the Search Console Performance Report provides multiple insights on how a site performs in search, including in search features like featured snippets.

There are four search types that can be explored in the Performance Report:

  1. Web.
  2. Image.
  3. Video.
  4. News.

Search Console shows the web search type by default.

Change which search type is displayed by clicking the Search Type button:

Default search type. Screenshot by author, May 2022

A menu pop-up will display allowing you to change which kind of search type to view:

Search Types menu. Screenshot by author, May 2022

A useful feature is the ability to compare the performance of two search types within the graph.

Four metrics are prominently displayed at the top of the Performance Report:

  1. Total Clicks.
  2. Total Impressions.
  3. Average CTR (click-through rate).
  4. Average position.

Top section of the Performance Report. Screenshot by author, May 2022

By default, the Total Clicks and Total Impressions metrics are selected.

By clicking within the tabs dedicated to each metric, one can choose to see those metrics displayed on the bar chart.

Impressions

Impressions are the number of times a website appeared in the search results. As long as a user doesn’t have to click a link to see the URL, it counts as an impression.

Additionally, if a URL is ranked at the bottom of the page and the user doesn’t scroll to that section of the search results, it still counts as an impression.

High impressions are great because they mean that Google is showing the site in the search results.

But the impressions metric only becomes meaningful when considered together with the Clicks and Average Position metrics.

Clicks

The clicks metric shows how often users clicked from the search results to the website. A high number of clicks in addition to a high number of impressions is good.

A low number of clicks and a high number of impressions is less good but not bad. It means that the site may need improvements to gain more traffic.

The clicks metric is more meaningful when considered with the Average CTR and Average Position metrics.

Average CTR

The average CTR is a percentage representing how often users clicked through from the search results to the website. It is calculated by dividing clicks by impressions: for example, 50 clicks from 1,000 impressions is a CTR of 5%.

A low CTR means that something needs improvement in order to increase visits from the search results.

A higher CTR means the site is performing well.

This metric gains more meaning when considered together with the Average Position metric.

Average Position

Average Position shows the position in the search results where the website tends to appear.

An average in positions one to 10 is great.

An average position in the twenties (20 – 29) means that the site is appearing on page two or three of the search results. This isn’t too bad. It simply means that the site needs additional work to give it that extra boost into the top 10.

An average position beyond 30 could (in general) mean that the site may benefit from significant improvements.

Or, it could be that the site ranks for a large number of keyword phrases that rank low and a few very good keywords that rank exceptionally high.

In either case, it may mean taking a closer look at the content. It may be an indication of a content gap on the website, where the content that ranks for certain keywords isn’t strong enough and may need a dedicated page devoted to that keyword phrase to rank better.

All four metrics (Impressions, Clicks, Average CTR, and Average Position), when viewed together, present a meaningful overview of how the website is performing.

The big takeaway about the Performance Report is that it is a starting point for quickly understanding website performance in search.

It’s like a mirror that reflects back how well or poorly the site is doing.

Performance Report Dimensions

Scrolling down to the second part of the Performance page reveals what are called the Dimensions of a website’s performance data.

There are six dimensions:

1. Queries: Shows the top search queries and the number of clicks and impressions associated with each keyword phrase.

2. Pages: Shows the top-performing web pages (plus clicks and impressions).

3. Countries: Top countries (plus clicks and impressions).

4. Devices: Shows the top devices, segmented into mobile, desktop, and tablet.

5. Search Appearance: This shows the different kinds of rich results that the site was displayed in. It also tells if Google displayed the site using Web Light results and video results, plus the associated clicks and impressions data. Web Light results are results that are optimized for very slow devices.

6. Dates: The dates tab organizes the clicks and impressions by date. The clicks and impressions can be sorted in descending or ascending order.

Keywords

Keywords are displayed in Queries, one of the dimensions of the Performance Report (as noted above). The Queries report shows the top 1,000 search queries that resulted in traffic.

Of particular interest are the low-performing queries.

Some of those queries display low quantities of traffic because they are rare, what is known as long-tail traffic.

But others are search queries generated by webpages that could need improvement: perhaps the page needs more internal links, or the keyword phrase deserves a dedicated webpage of its own.

It’s always a good idea to review the low-performing keywords because some of them may be quick wins that, when the issue is addressed, can result in significantly increased traffic.

Links

Search Console offers a list of all links pointing to the website.

However, it’s important to point out that the links report does not represent links that are helping the site rank.

It simply reports all links pointing to the website.

This means that the list includes links that are not helping the site rank. That explains why the report may show links that have a nofollow link attribute on them.

The Links report is accessible from the bottom of the left-hand menu:

Links report. Screenshot by author, May 2022

The Links report has two columns: External Links and Internal Links.

External Links are the links from outside the website that point to the website.

Internal Links are links that originate within the website and link to somewhere else within the website.

The External links column has three reports:

  1. Top linked pages.
  2. Top linking sites.
  3. Top linking text.

The Internal Links report lists the Top Linked Pages.

Each report (top linked pages, top linking sites, etc.) has a link to more results that can be clicked to view and expand the report for each type.

For example, the expanded report for Top Linked Pages shows Top Target pages, which are the pages from the site that are linked to the most.

Clicking a URL will change the report to display all the external domains that link to that one page.

The report shows the domain of the external site but not the exact page that links to the site.

Sitemaps

A sitemap is generally an XML file that is a list of URLs that helps search engines discover the webpages and other forms of content on a website.

Sitemaps are especially helpful for large sites, sites that are difficult to crawl, and sites that add new content on a frequent basis.

Crawling and indexing are not guaranteed. Things like page quality, overall site quality, and links can have an impact on whether a site is crawled and pages indexed.

Sitemaps simply make it easy for search engines to discover those pages and that’s all.
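For reference, a minimal XML sitemap looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2022-05-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/sample-page</loc>
    <lastmod>2022-05-15</lastmod>
  </url>
</urlset>
```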

Creating a sitemap is easy because most are automatically generated by the CMS, plugins, or the website platform where the site is hosted.

Some hosted website platforms generate a sitemap for every site hosted on its service and automatically update the sitemap when the website changes.

Search Console offers a sitemap report and provides a way for publishers to upload a sitemap.

To access this function, click the Sitemaps link in the left-side menu.

The sitemap section will report on any errors with the sitemap.

Search Console can be used to remove a sitemap from the reports. However, it’s important to also remove the sitemap from the website itself; otherwise, Google may remember it and visit it again.

Once submitted and processed, the Coverage report will populate a sitemap section that will help troubleshoot any problems associated with URLs submitted through the sitemaps.

Search Console Page Experience Report

The Page Experience report offers data related to the user experience on the website, particularly site speed.

Search Console displays information on Core Web Vitals and Mobile Usability.

This is a good starting place for getting an overall summary of site speed performance.

Rich Result Status Reports

Search Console offers feedback on rich results through the Performance Report. It’s one of the six dimensions below the graph that’s displayed at the top of the page, labeled Search Appearance.

Selecting the Search Appearance tab reveals clicks and impressions data for the different kinds of rich results shown in the search results.

This report communicates how important rich results traffic is to the website and can help pinpoint the reason for specific website traffic trends.

The Search Appearance report can help diagnose issues related to structured data.

For example, a downturn in rich results traffic could be a signal that Google changed structured data requirements and that the structured data needs to be updated.

It’s a starting point for diagnosing a change in rich results traffic patterns.

Search Console Is Good For SEO

In addition to the above benefits of Search Console, publishers and SEOs can also upload link disavow reports, resolve penalties (manual actions), and security events like site hackings, all of which contribute to a better search presence.

It is a valuable service that every web publisher concerned about search visibility should take advantage of.

Featured Image: bunny pixar/Shutterstock






LinkedIn Rolls Out New Newsletter Tools


LinkedIn is launching several new features for people who publish newsletters on its platform.

The professional networking site wants to make it easier for creators to grow their newsletter audiences and engage readers.

More People Publishing Newsletters On LinkedIn

The company says the number of LinkedIn members publishing newsletter articles has increased by 59% over the past year.

Engagement on these creator-hosted newsletters is also up 47%.

With this growing interest, LinkedIn is updating its newsletter tools.

A New Way To View & Comment

One of the main changes is an updated reading experience that displays comments alongside the newsletter articles.

This allows readers to view and participate in discussions more easily while consuming the content.

See an example of the new interface below.

Screenshot from: linkedin.com, June 2024.

Design Your Own Cover Images

You can now use Microsoft’s AI-powered Designer tool to create custom cover images for your newsletters.

The integration provides templates, size options, and suggestions to help design visually appealing covers.

More Subscriber Notifications

LinkedIn is improving the notifications sent to newsletter subscribers to drive more readership.

When a new issue is published, subscribers will receive email alerts and in-app messages. LinkedIn will also prompt your followers to subscribe.

Mention Other Profiles In Articles

You can now embed links to other LinkedIn profiles and pages directly into your newsletter articles.

This lets readers click through and learn more about the individuals or companies mentioned.

In the example below, you can see it’s as easy as adding a link.

Screenshot from: linkedin.com, June 2024.

Preview Links Before Publishing

Lastly, LinkedIn allows you to access a staging link that previews the newsletter URL before hitting publish.

This can help you share and distribute your content more effectively.

Why SEJ Cares

As LinkedIn continues to lean into being a publishing platform for creators and thought leaders, updates that enhance the newsletter experience are noteworthy for digital marketers and industry professionals looking to build an audience.

The new tools are part of LinkedIn’s broader effort to court creators publishing original content on its platform amid rising demand for newsletters and knowledge-sharing.

How This Can Help You

If you publish a newsletter on LinkedIn, these new tools can help you design more visually appealing content, grow your subscriber base, interact with your audience through comments, and preview your content before going live.


Featured Image: Tada Images/Shutterstock

Source link



The 6 Biggest SEO Challenges You’ll Face in 2024


Seen any stressed-out SEOs recently? If so, that’s because they’ve got their work cut out this year.

Between navigating Google’s never-ending algorithm updates, fighting off competitors, and getting buy-in for projects, there are many significant SEO challenges to consider.

So, which ones should you focus on? Here are the six biggest ones I think you should pay close attention to.

Make no mistake—Google’s algorithm updates can make or break your site.

Core updates, spam updates, helpful content updates—you name it, they can all impact your site’s performance.

As we can see below, the frequency of Google updates has increased in recent years, meaning that the likelihood of being impacted by a Google update has also increased.

How to deal with it:

Recovering from a Google update isn’t easy—and sometimes, websites that get hit by updates may never fully recover.

For the reasons outlined above, most businesses try to stay on the right side of Google and avoid incurring Google’s wrath.

SEOs do this by following Google’s Search Essentials and SEO best practices, and by avoiding risky black hat SEO tactics. But sadly, even if you think you’ve done this, there is no guarantee that you won’t get hit.

If you suspect a website has been impacted by a Google update, the fastest way to check is to plug the domain into Ahrefs’ Site Explorer.

Ahrefs Site Explorer screenshot

Here’s an example of a website likely affected by Google’s August 2023 Core Update. The traffic drop started on the update’s start date.

Website impacted by Google's August 2023 Core Update
Hover over the G circles on the X axis to get information about each update.

From this screen, you can see if a drop in traffic correlates with a Google update. If there is a strong correlation, then that update may have hit the site. To remedy it, you will need to understand the update and take action accordingly.

Follow SEO best practices

It’s important your website follows SEO best practices so you can understand why it has been affected and determine what you need to do to fix things.

For example, you might have missed significant technical SEO issues impacting your website’s traffic. To rule this out, it’s worth using Site Audit to run a technical crawl of your website.

Site Audit screenshot, via Ahrefs Site Audit

Monitor the latest SEO news

In addition to following best practices, it’s a good idea to monitor the latest SEO news. You can do this through various social media channels like X or LinkedIn, but I find the two websites below to be some of the most reliable sources of SEO news.

Even if you escape Google’s updates unscathed, you’ve still got to deal with your competitors vying to steal your top-ranking keywords from right under your nose.

This may sound grim, but it’s a mistake to underestimate them. Most of the time, they’ll be trying to improve their website’s SEO just as much as you are.

And these days, your competitors will:

How to deal with it:

If you want to stay ahead of your competitors, you need to do these two things:

Spy on your competitors and monitor their strategy

Ok, so you don’t have to be James Bond, but by using a tool like Ahrefs Site Explorer and our Google Looker Studio Integration (GLS), you can extract valuable information and keep tabs on your competitors, giving you a competitive advantage in the SERPs.

Using a tool like Site Explorer, you can use the Organic Competitors report to understand the competitor landscape:

Organic competitors screenshot, via Ahrefs' Site Explorer

You can check out their Organic traffic performance across the years:

Year on Year comparison of organic traffic, via Ahrefs' Site Explorer

You can use Calendar to see on which days changes in Positions, Pages, Referring domains, and Backlinks occurred:

Screenshot of Ahrefs' Calendar, via Ahrefs' Site Explorer

You can see their Top pages’ organic traffic and Organic keywords:

Top pages report, via Ahrefs' Site Explorer

And much, much more.

If you want to monitor your most important competitors more closely, you can even create a dashboard using Ahrefs’ GLS integration.

Google Looker Studio integration screenshot

Acquire links and create content that your competitors can’t recreate easily

Once you’ve done enough spying, it’s time to take action.

Links and content are the bread and butter for many SEOs. But a lot of the time the links that are acquired and the content that is created just aren’t that great.

So, to stand the best chance of maintaining your rankings, you need to work on getting high-quality backlinks and producing high-quality content that your competitors can’t easily recreate.

It’s easy to say this, but what does it mean in practice?

The best way to create this type of content is to create deep content.

At Ahrefs, we do this by running surveys, getting quotes from industry experts, running data studies, creating unique illustrations or diagrams, and generally fine-tuning our content until it is the best it can be.

As if competing against your competitors wasn’t enough, you must also compete against Google for clicks.

As Google not-so-subtly transitions from a search engine to an answer engine, it’s becoming more common for it to supply the answer to search queries—rather than the search results themselves.

The result is that even the once top-performing organic search websites have a lower click-through rate (CTR) because they’re further down the page—or not on the first page.

Whether you like it or not, Google is reducing traffic to your website through two mechanisms:

  • AI overviews – Where Google generates an answer based on sources on the internet
  • Zero-click searches – Where Google shows the answer in the search results

With AI overviews, we can see that the traditional organic search results are not visible.

And with zero-click searches, Google supplies the answer directly in the SERP, so the user doesn’t have to click anything unless they want to know more.

Zero Click searches example, via Google.com

These features have one thing in common: They are pushing the organic results further down the page.

Even when links are included in AI Overviews, Kevin Indig’s AI Overviews traffic impact study suggests that they will still reduce organic clicks.

In the example below, shared by Aleyda, we can see that even ranking organically in the number one position doesn’t mean much when ads and an AI Overview sit above you and the AI Overview answer contains no links; it simply perpetuates the zero-click model through the AI Overview format.

How to deal with it:

You can’t control how Google changes the SERPs, but you can do two things:

Make your website the best it can be

If you focus on making your website the best it can be, it will naturally become more authoritative over time. This isn’t a guarantee that your website will be included in the AI overview, but it’s better than doing nothing.

Prevent Google from showing your website in an AI Overview

If you want to be excluded from Google’s AI Overviews, Google says you can add nosnippet to prevent your content from appearing in them, as shown below.

Nosnippet code explanation screenshot, via Google's documentation
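A minimal sketch of the two documented options: a robots meta tag that blocks snippets for an entire page, and a data-nosnippet attribute that excludes just one passage:

```html
<!-- Block all snippets for this page -->
<meta name="robots" content="nosnippet">

<!-- Or exclude only a specific passage from snippets -->
<p>Intro text. <span data-nosnippet>This passage won't be shown as a snippet.</span></p>
```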

One of the reasons marketers gravitated towards Google in the early days was that it was relatively easy to set up a website and get traffic.

Recently, there have been a few high-profile examples of smaller websites that have been impacted by Google:

Apart from the algorithmic changes, I think there are two reasons for this:

  • Large authoritative websites with bigger budgets and SEO teams are more likely to rank well in today’s Google
  • User-generated content sites like Reddit and Quora have been given huge traffic boosts from Google, which has displaced smaller sites from the SERPs that used to rank for these types of keyword queries

Here’s Reddit’s traffic increase over the last year:

Reddit's organic traffic increase, via Ahrefs Site Explorer

And here’s Quora’s traffic increase:

Quora's organic traffic increase, via Ahrefs Site Explorer

How to deal with it:

There are three key ways I would deal with this issue in 2024:

Focus on targeting the right keywords using keyword research

Knowing which keywords to target is really important for smaller websites. Sadly, you can’t just write about a big term like “SEO” and expect to rank for it in Google.

Use a tool like Keywords Explorer to do a SERP analysis for each keyword you want to target. Use the effort-to-reward ratio to ensure you are picking the right keyword battles:

Effort to reward ratio illustration

If you’re concerned about Reddit, Quora, or other UGC sites stealing your clicks, you can also use Keywords Explorer to target SERPs where these websites aren’t present.

To do this:

  • Enter your keyword in the search bar and head to the matching terms report
  • Click on the SERP features drop-down box
  • Select Not on SERP and select Discussions and forums
Example of removing big UGC sites from keyword searches using filters in Ahrefs' Keywords Explorer

This method can help you find SERPs where these types of sites are not present.

Build more links to become more authoritative

Another approach you could take is to double down on the SEO basics and start building more high-quality backlinks.

Write deep content

Most SEOs are not churning out 500-word blog posts and hoping for the best; equally, the content they’re creating is often not deep or the best it can possibly be.

This is often due to time constraints, budget, and inclination. But to be competitive in the AI era, deep content is exactly what you should be creating.

As your website grows, the challenge of maintaining the performance of your content portfolio gets increasingly more difficult.

And what may have been an “absolute banger” of an article in 2020 might not be such a great article now—so you’ll need to update it to keep the clicks rolling in.

So how can you ensure that your content is the best it can be?

How to deal with it:

Here’s the process I use:

Steal this content updating framework

And here’s a practical example of this in action:

Use Page Inspect with Overview to identify pages that need updating

Here’s an example of an older article Michal Pecánek wrote that I recently updated. Using Page Inspect, we can pinpoint the exact date of the update, May 10, 2024, with no other major changes in the last year.

Ahrefs Page Inspect screenshot, via Ahrefs' Site Explorer

According to Ahrefs, this update almost doubled the page’s organic traffic, underlining the value of updating old content. Before the update, the content had reached its lowest performance ever.

Example of a content update and the impact on organic traffic, via Ahrefs' Site Explorer

So, what changed to casually double the traffic? Clicking on Page Inspect gives us our answer.

Page Inspect detail screenshot, via Ahrefs' Site Explorer

I was focused on achieving three aims with this update:

  • Keeping Michal’s original framework for the post intact
  • Making the content as concise and readable as it can be
  • Refreshing the template (the main draw of the post) and explaining how to use the updated version in a beginner-friendly way to match the search intent

Getting buy-in for SEO projects has never been easy compared to other channels. Unfortunately, this meme perfectly describes my early days of agency life.

SEO meme, SEO vs PPC budgets

SEO is not an easy sell—either internally or externally to clients.

With companies hiring fewer SEO roles this year, the appetite for risk seems lower than in previous years.

SEO can also be slow to show impact, meaning getting buy-in for projects is harder than for other channels.

How long does SEO take illustration

How to deal with it:

My colleague Despina Gavoyannis has written a fantastic article about how to get SEO buy-in; here is a summary of her top tips:

  • Find key influencers and decision-makers within the organization, starting with cross-functional teams before approaching executives. (And don’t forget the people who’ll actually implement your changes—developers.)
  • Adapt your language and communicate the benefits of SEO initiatives in terms that resonate with different stakeholders’ priorities.
  • Highlight the opportunity costs of not investing in SEO by showing the potential traffic and revenue being missed out on using metrics like Ahrefs’ traffic value.
  • Collaborate cross-functionally by showing how SEO can support other teams’ goals, e.g. helping the editorial team create content that ranks for commercial queries.

And perhaps most important of all: build better business cases and SEO opportunity forecasts.

If you just want to show the short-term trend for a keyword, you can use Keywords Explorer:

Forecasting feature for keywords, via Ahrefs' Keywords Explorer
The forecasted trend is shown in orange as a dotted line.

If you want to show the Traffic potential of a particular keyword, you can use our Traffic potential metric in SERP overview to gauge this:

Traffic potential example, via Ahrefs' Site Explorer

And if you want to go the whole hog, you can create an SEO forecast. You can use a third-party tool to create a forecast, but I recommend you use Patrick Stox’s SEO forecasting guide.

Final thoughts

Of all the SEO challenges mentioned above, the one keeping SEOs awake at night is AI.

It’s swept through our industry like a hurricane, presenting SEOs with many new challenges. The SERPs are changing, competitors are using AI tools, and the bar for creating basic content has been lowered, all thanks to AI.

If you want to stay competitive, you need to arm yourself with the best SEO tools and search data on the market—and for me, that always starts with Ahrefs.

Got questions? Ping me on X.






Why Now’s The Time To Adopt Schema Markup


There is no better time for organizations to prioritize Schema Markup.

Why is that so, you might ask?

First of all, Schema Markup (aka structured data) is not new.

Google has been awarding sites that implement structured data with rich results. If you haven’t taken advantage of rich results in search, it’s time to gain a higher click-through rate from these visual features in search.

Secondly, now that search is primarily driven by AI, helping search engines understand your content is more important than ever.

Schema Markup allows your organization to clearly articulate what your content means and how it relates to other things on your website.

The final reason to adopt Schema Markup is that, when done correctly, you can build a content knowledge graph, which is a critical enabler in the age of generative AI. Let’s dig in.

Schema Markup For Rich Results

Schema.org has been around since 2011. Back then, Google, Bing, Yahoo, and Yandex worked together to create the standardized Schema.org vocabulary to enable website owners to translate their content to be understood by search engines.

Since then, Google has incentivized websites to implement Schema Markup by awarding rich results to websites with certain types of markup and eligible content.

Websites that achieve these rich results tend to see higher click-through rates from the search engine results page.

In fact, Schema Markup is one of the most well-documented SEO tactics that Google explicitly tells you to use. With so many things in SEO that have to be reverse-engineered, this one is straightforward and highly recommended.

You might have delayed implementing Schema Markup due to the lack of applicable rich results for your website. That might have been true at one point, but I’ve been doing Schema Markup since 2013, and the number of rich results available is growing.

Even though Google deprecated how-to rich results and changed the eligibility of FAQ rich results in August 2023, it introduced several new rich results in the months following – the most new rich results introduced in a year!

These rich results include vehicle listing, course info, profile page, discussion forum, organization, vacation rental, and product variants.

There are now 35 rich results that you can use to stand out in search, and they apply to a wide range of industries such as healthcare, finance, and tech.

Here are some widely applicable rich results you should consider utilizing:

  • Breadcrumb.
  • Product.
  • Reviews.
  • JobPosting.
  • Video.
  • Profile Page.
  • Organization.

With so many opportunities to take control of how you appear in search, it’s surprising that more websites haven’t adopted it.

A statistic from Web Data Commons’ October 2023 Extractions Report showed that only 50% of pages had structured data.

Of the pages with JSON-LD markup, these were the top types of entities found.

  • http://schema.org/ListItem (2,341,592,788 Entities)
  • http://schema.org/ImageObject (1,429,942,067 Entities)
  • http://schema.org/Organization (907,701,098 Entities)
  • http://schema.org/BreadcrumbList (817,464,472 Entities)
  • http://schema.org/WebSite (712,198,821 Entities)
  • http://schema.org/WebPage (691,208,528 Entities)
  • http://schema.org/Offer (623,956,111 Entities)
  • http://schema.org/SearchAction (614,892,152 Entities)
  • http://schema.org/Person (582,460,344 Entities)
  • http://schema.org/EntryPoint (502,883,892 Entities)

(Source: October 2023 Web Data Commons Report)

Most of the types on the list are related to the rich results mentioned above.

For example, ListItem and BreadcrumbList are required for the Breadcrumb Rich Result, SearchAction is required for Sitelink Search Box, and Offer is required for the Product Rich Result.

This tells us that most websites are using Schema Markup for rich results.
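For instance, a Breadcrumb rich result pairs the BreadcrumbList and ListItem types like this (the names and URLs are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "SEO", "item": "https://example.com/seo/" },
    { "@type": "ListItem", "position": 2, "name": "Schema Markup", "item": "https://example.com/seo/schema-markup/" }
  ]
}
</script>
```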

Even though these Schema.org types can help your site achieve rich results and stand out in search, they don’t necessarily tell search engines what each page is about in detail and help your site be more semantic.

Help AI Search Engines Understand Your Content

Have you ever seen your competitor’s sites using specific Schema.org Types that are not found in Google’s structured data documentation (i.e. MedicalClinic, IndividualPhysician, Service, etc)?

The Schema.org vocabulary has over 800 types and properties to help websites explain what the page is about. However, Google’s structured data features only require a small subset of these properties for websites to be eligible for a rich result.

Many websites that solely implement Schema Markup to get rich results tend to be less descriptive with their Schema Markup.

AI search engines now look at the meaning and intent behind your content to provide users with more relevant search results.

Therefore, organizations that want to stay ahead should use more specific Schema.org types and leverage appropriate properties to help search engines better understand and contextualize their content. You can be descriptive with your content while still achieving rich results.

For example, each type (e.g. Article, Person, etc.) in the Schema.org vocabulary has 40 or more properties to describe the entity.

The properties are there to help you fully describe what the page is about and how it relates to other things on your website and the web. In essence, it’s asking you to describe the entity or topic of the page semantically.
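As an illustration of this more semantic approach (all names, URLs, and property values here are hypothetical), an Article’s markup can connect the page to its topic, its author, and the site it belongs to:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "@id": "https://example.com/blog/schema-markup-guide/#article",
  "headline": "A Guide To Schema Markup",
  "about": { "@type": "Thing", "name": "Schema Markup" },
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "worksFor": { "@type": "Organization", "name": "Example Co" }
  },
  "isPartOf": { "@type": "WebSite", "@id": "https://example.com/#website" }
}
</script>
```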

The word ‘semantic’ is about understanding the meaning of language.

Note that the word “understanding” is part of the definition. Funny enough, in October 2023, John Mueller at Google released a Search Update video. In this six-minute video, he leads with an update on Schema Markup.

For the first time, Mueller described Schema Markup as “a code you can add to your web pages, which search engines can use to better understand the content.”

While Mueller has historically spoken a lot about Schema Markup, he typically talked about it in the context of rich result eligibility. So, why the change?

This shift in thinking about Schema Markup for enhanced search engine understanding makes sense. With AI’s growing role and influence in search, we need to make it easy for search engines to consume and understand the content.

Take Control Of AI By Shaping Your Data With Schema Markup

Now, if being understood and standing out in search is not a good enough reason to get started, then doing it to help your enterprise take control of your content and prepare it for artificial intelligence is.

In February 2024, Gartner published a report on “30 Emerging Technologies That Will Guide Your Business Decisions,” highlighting generative AI and knowledge graphs as critical emerging technologies companies should invest in within the next 0-1 years.

Knowledge graphs are collections of relationships between entities defined using a standardized vocabulary that enables new knowledge to be gained by way of inferencing.

Good news! When you implement Schema Markup to define and connect the entities on your site, you are creating a content knowledge graph for your organization.

Thus, your organization gains a critical enabler for generative AI adoption while reaping its SEO benefits.

Learn more about building content knowledge graphs in my article, Extending Your Schema Markup From Rich Results to Knowledge Graphs.

We can also look at other experts in the knowledge graph field to understand the urgency of implementing Schema Markup.

In his LinkedIn post, Tony Seale, Knowledge Graph Architect at UBS in the UK, said,

“AI does not need to happen to you; organizations can shape AI by shaping their data.

It is a choice: We can allow all data to be absorbed into huge ‘data gravity wells’ or we can create a network of networks, each of us connecting and consolidating our data.”

The “network of networks” Seale refers to is the concept of knowledge graphs – the same knowledge graphs that can be built from your web data using semantic Schema Markup.

The AI revolution has only just begun, and there is no better time than now to shape your data, starting with your web content through the implementation of Schema Markup.

Use Schema Markup As The Catalyst For AI

In today’s digital landscape, organizations must invest in new technology to keep pace with the evolution of AI and search.

Whether your goal is to stand out on the SERP or ensure your content is understood as intended by Google and other search engines, the time to implement Schema Markup is now.

With Schema Markup, SEO pros can become heroes, enabling generative AI adoption through content knowledge graphs while delivering tangible benefits, such as increased click-through rates and improved search visibility.



Featured Image by author

