A Complete Guide To Local Markup & Rich Results

How important is schema markup for local search engine optimization (SEO)?

Most local SEO experts and webmasters are familiar with the impact of having well-optimized SEO elements on their landing pages, such as optimized title tags, well-written content, and more.

However, what exactly can you accomplish by applying schema markup to your local business website?

Quite a bit, actually.

When it comes to organic search, there are several reasons why having a proper and thorough schema applied to your website is a substantial competitive advantage.

In fact, Google has reiterated time and time again that schema helps search crawlers do their job more effectively by helping them comprehend a landing page and deliver relevant information in the SERPs.

In this post, we will share a few recommendations to help your local business get the most out of using schema to boost your local SEO.

First, let’s start with defining what exactly schema markup is.

The Difference Between Schema, Structured Data & Rich Results

The terms “structured data” and “schema” are often used interchangeably in webmaster and SEO circles.

However, before we dive into the recommendations, it’s helpful to know the semantic differences between these terms.

Structured Data

Google defines structured data as “a standardized format for providing information about a page and classifying the page content.”

To put it simply, this format was developed to help search engines accurately understand a webpage to properly display snippets of information in the search results pages.

Schema

Schema is a form of structured data that was officially launched via schema.org.

Schema was created in 2011 as a collaborative project among the major search engines (Google, Yahoo, Bing, and Yandex).

Utilizing the markup available on schema.org enables a landing page to be eligible for rich results.

Rich Results

Rich results (formerly called rich snippets) are any extra information you see in the search engine results pages (SERPs) beyond the typical blue title tag and meta description (breadcrumbs, review stars, sitelinks, etc.).

Google provides two tools to audit structured data on your website: the Schema Markup Validator and the Rich Results Test.

Below are a few examples of local businesses that are benefitting from rich results:

Review Rich Results Example

Image from Google, May 2022

Breadcrumb Rich Results Example

Image from Google, May 2022

Sitelink Rich Results Example

Image from Google, May 2022

FAQ Rich Results Example

Image from Google, May 2022

Is Structured Data A Local Ranking Signal?

There has been much debate over the years about whether or not structured data in itself is a search engine ranking signal.

Google Search Advocate John Mueller has stated more than once that structured data by itself is not a direct search engine ranking signal.

However, structured data indirectly improves search engine visibility through the following means.

Structured Data Helps Search Engine Crawlers Better Comprehend Landing Pages

Properly and thoroughly implemented structured data makes the search crawler’s job easier.

A good analogy would be comparing website properties (content, images, media files, etc.) to a garage full of various boxes and items (snow shovel for the winter, inflatable pool for the summer, etc.).

Let’s say you are having a garage sale and you want visitors (i.e. more website visitors).

It’s Google’s job to advertise your garage sale on the search results pages.

For most websites, Google provides the bare minimum: a blue title tag and meta description.

However, if your website is properly marked up with structured data, then Google may very well reward your website with a bigger advertisement (i.e., rich results) about your garage sale.

Structured data essentially puts labels on the different objects in your garage, making the Google search crawler’s job easier.

Structured Data Improves The Possibility Of Obtaining Rich Results, Which Improves Click-Through Rates

A rich result is much more eye-catching in the search results and will most likely improve your click-through rate (CTR).

The CTR boost can vary depending on what kind of rich result is obtained; FAQ rich results, for example, do very well.

This means your landing page is receiving more traffic because users are seeing relevant snippets about what it contains.

There is also some debate over whether increased CTR might be a positive SEO signal in itself (signaling more engagement and relevancy).

Either way, having an improved CTR means more traffic wherever your website ranks.

What Structured Data Is Recommended For Local Business Websites?

Most local websites have at least some basic structured data enabled.

However, the more thorough and detailed your properly applied structured data is, the better.

Next, we’ll offer some step-by-step recommendations for how to properly apply structured data:

Select The Best Schema.org Category

Schema.org provides several different schema property options that are uniquely relevant for local businesses.

In order to have the necessary local business schema properties (discussed in further detail below), it is imperative to select the most relevant schema category for your local business.

For example, if you are promoting an ice cream chain, the most relevant category is schema.org/IceCreamShop.

If you are trying to promote a local hardware chain then you’d select schema.org/HardwareStore.

Relevant schema categories will help Google better topically understand your website.

What If There Are No Relevant Schema Categories For My Local Business?

If you can’t find a schema.org category that is relevant for your business then the default category should be schema.org/LocalBusiness.

If you’re technically inclined, it is possible to post new schema category recommendations on the schema.org GitHub forum.

The schema.org developers respond to detailed recommendations on this forum and occasionally create new schema.org properties.

I Selected The Most Accurate Category So What Should I Implement?

After you’ve selected the appropriate category for your business, you must include the schema.org properties below to ensure your markup validates.

Errors could disqualify you from obtaining rich results.

The below schema properties are required for validation:

  • Url: The URL of the associated landing page.
  • Name: The name of the business.
  • OpeningHours: The opening and closing hours of the business.
  • Telephone: The contact telephone number for the business.
  • Image: This can be any relevant image file on your landing page. It is recommended to use a storefront image if one is available.
  • Logo: A link to your business logo image.
  • Address: The business address, which should be visible on the landing page.
  • Geo: The geographic coordinates of your business location.
  • AreaServed: It is recommended to use a ZIP code for this property.
  • MainContentOfPage: The main body content of your landing page.
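
To make this concrete, below is a minimal JSON-LD sketch of these required properties applied to the ice cream shop example from earlier. All business details (names, URLs, address, coordinates) are hypothetical placeholders, so swap in your own values and validate before deploying. Note that in JSON-LD the property names are lowerCamelCase (url, openingHours, areaServed, and so on), and that mainContentOfPage is omitted here because it belongs on a WebPage item rather than on the business itself.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "IceCreamShop",
  "url": "https://www.example.com/locations/springfield/",
  "name": "Example Ice Cream Co.",
  "openingHours": "Mo-Su 11:00-21:00",
  "telephone": "+1-555-555-0199",
  "image": "https://www.example.com/images/springfield-storefront.jpg",
  "logo": "https://www.example.com/images/logo.png",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  },
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": 39.7817,
    "longitude": -89.6501
  },
  "areaServed": "62701"
}
</script>

This script block typically goes in the <head> of the associated landing page.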

Common schema properties that are highly recommended:

  • Review: A review of your local business.
  • AggregateRating: The overall rating of the business, based on a collection of reviews or ratings. Make sure to follow Google’s rules on review rich results here.
  • FAQPage: If you have an FAQ page, it is imperative to add this specialty schema. Make sure to follow Google’s rules and guidelines.
  • AlternateName: Businesses commonly have related names, e.g., Acme Stores vs. Acme Inc. The alternateName property marks up other well-known corporate name variations (including abbreviations).
  • SameAs: References to third-party websites related to the business’s identity, i.e., Facebook pages, YouTube channel pages, Wikipedia pages, etc.
  • HasMap: A URL to the map of your local business.
  • Breadcrumb: This schema marks up the existing breadcrumb navigation structure on your website. It is highly recommended because it often appears in the SERPs as a rich result.
  • Department: Many chain retailers have internal departments (e.g., pharmacies inside grocery stores). This specialty schema helps mark up these in-store departments.
  • PriceRange: The price range of the business, for example, $$$.
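
Most of these properties slot straight into the business item from the previous sketch. Here is how a few of them might look, again with hypothetical values; in practice you would merge these keys into your single existing JSON-LD object rather than publish a second one:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "IceCreamShop",
  "name": "Example Ice Cream Co.",
  "alternateName": "Example Creamery",
  "priceRange": "$$",
  "hasMap": "https://www.google.com/maps?q=Example+Ice+Cream+Co.+Springfield",
  "sameAs": [
    "https://www.facebook.com/exampleicecream",
    "https://www.youtube.com/@exampleicecream"
  ],
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "152"
  }
}
</script>

Keep in mind that Google’s review rich results rules require ratings like these to reflect genuine reviews visible on the page, and that FAQPage and BreadcrumbList are separate item types with their own required fields rather than properties of the business.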

More advanced schema types:

  • Sitelinks Search Box: A sitelinks search box is a quick way for users to do an internal search on your website directly from the Google SERP instead of visiting your website first.
  • AdditionalType: This is a specialty schema that helps Google understand what your website is topically related to. This can be accomplished by using Wikipedia categories as values for this property. For example, if a local business sells sporting gear, it is recommended to use https://en.wikipedia.org/wiki/Sports_equipment as the additionalType value.
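
For the sitelinks search box, Google’s documented approach is WebSite markup with a potentialAction of type SearchAction. Below is a sketch assuming a hypothetical internal search URL; the urlTemplate must match how your site’s own search actually works:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": {
      "@type": "EntryPoint",
      "urlTemplate": "https://www.example.com/search?q={search_term_string}"
    },
    "query-input": "required name=search_term_string"
  }
}
</script>

The additionalType property, by contrast, is just a single line on the business item itself, e.g. "additionalType": "https://en.wikipedia.org/wiki/Sports_equipment".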

How Do You Make Sure Your Structured Data Is Validated?

It is very important to make sure your structured data is properly validated.

If it’s not then your landing page will most likely not qualify for rich results.

Google specifically says that if there are errors, the rich result “cannot appear in Google Search as a rich result.”

As mentioned earlier, there are two different tools to make sure your schema is properly validated: the Schema Markup Validator and the Rich Results Test.

Google Search Console also provides enhancement reports on structured data which will be explained in further detail below.

Schema Markup Validator

The Schema Markup Validator enables you to get into the details of structured data itself.

It shows both errors and warnings.

It also allows you to test structured data before it’s enabled on your webpages by pasting code directly into the tool.

Example Of Schema Markup Validator Result

Image from Schema Markup Validator, May 2022

Note that while it’s imperative to correct structured data errors, you will also often see structured data “warnings.”

These warnings are of much lesser concern, and Google’s John Mueller has even mentioned that you don’t have to fix all warnings.

A lot of sites have warnings with structured data, and that’s perfectly fine.

Rich Results Test

The Rich Results Test is Google’s official tool to see which rich results can be generated by structured data.

This tool also lets you preview how rich results will look in Google SERPs.

Example Of Rich Results Test Preview

Image from the Rich Results Test tool, May 2022

The Rich Results Test tool will report structured data errors and warnings as well.

As mentioned earlier, warnings are common and won’t prevent rich results from appearing.

However, structured data errors must be resolved to qualify for rich results.

Structured Data Monitoring Via Google Search Console

Google also offers sitewide structured data monitoring via Google Search Console.

It is highly recommended to have a verified Google Search Console account for your local business website to enable monitoring.

Google Search Console will provide sitewide enhancement reports on how many webpages have validated structured data, warnings, and errors.

Google also sends notification emails if there are issues with structured data on your local business website.

It is recommended to pay attention to these notifications.

Example Of Sitewide Structured Data Report

Image from Google Search Console, May 2022

How Can I Tell How Many Rich Results My Website Is Getting In The SERPs?

Besides spot-checking rich results, it would be ideal to see how well a local business website is performing across all the Google SERPs.

There are a few third-party SEO tools that scrape Google SERPs and provide reports.

One notable tool, Semrush, has a “SERP Feature” report that shows how many aggregate rich results your website is getting.

Example Of Semrush SERP Feature Report

Image from Semrush, May 2022

Is There Anything I Should Avoid When Using Structured Data?

Structured data is meant to be code that labels, or marks up, existing properties on your local business website.

Google explicitly requires that your structured data matches what is on the associated landing page.

However, structured data spam does exist and Google can apply manual penalties if they believe a webmaster is egregiously breaking the rules.

Make sure to follow Google’s structured data guidelines carefully.

Conclusion

There is no drawback to applying properly formatted and relevant structured data to your local business website.

Also, schema.org is continually coming out with new schema properties along with more integration via Google Search Console.

Most common SEO strategies (meta tag optimization, custom copywriting, design changes, etc.) usually require significant effort and visible on-page website updates.

In comparison, structured data updates are invisible to users visiting your website.

They also don’t require any direct changes to anything on your website besides adding a new script to your source code.

They also have great potential to substantially improve visibility in the Google SERPs via rich results.

If you’re a local business looking to further optimize your website, make sure to visit schema.org and work with a webmaster to start applying structured data.

Featured Image: Hangouts Vector Pro/Shutterstock


How We Built A Strong $10 Million Agency: A Proven Framework

Building a successful agency can be a daunting task in today’s ever-evolving space. Do you know the secrets to succeeding with yours?

Watch this informative, on-demand webinar, where link building expert Jon Ball reveals the closely guarded secrets that have propelled Page One Power to become a highly successful $10 million agency.

You’ll learn:

  • The foundational principles on which to build your business to succeed.
  • The importance of delegation, market positioning, and staffing.
  • More proven lessons learned from 14 years of experience.

With Jon, we’ll provide you with actionable insights that you can use to take your business to the next level, using foundational principles that have contributed to Page One Power’s success.

If you’re looking to establish yourself as a successful entrepreneur or grow your agency in the constantly evolving world of SEO, this webinar is for you.

Learn the secrets of establishing a thriving agency in an increasingly competitive SEO space.

View the slides below or check out the full webinar for all the details.

Join Us For Our Next Webinar!

How An Enterprise Digital PR Firm Earns 100’s Of Links In 30 Days

Join us as we explore how to scale the very time-consuming and complicated process of earning links from digital PR, with proven case studies showing how you can earn hundreds of links in 30 days.


SEO Woe or a Load of Baloney?

Toxic backlinks are links that some SEO tools say could hurt your website’s Google rankings. The implication is that you should disavow them to keep your site safe.

But there’s some disagreement and confusion among SEOs as to whether “toxic” links are actually a thing and what, if anything, you should do about them. 

If you believe Google’s John Mueller, they’re not: 

Yet, according to my poll, the majority (just!) of SEOs think they are: 

So… what’s the deal here? Are toxic backlinks actually a thing? Are they hurting your site? And if so, what should you be doing about them? 

Before we can answer those questions, we need to understand the terminology… 

Every website has some spammy backlinks that just don’t make sense. But that doesn’t necessarily make them manipulative or “toxic.”

For example, here are a couple of obviously spammy links to our site: 

Example of spammy links, via Ahrefs' Site Explorer

We didn’t build or buy either of these, so they’re not “manipulative” by definition. They’re just low-quality links we’ve attracted over time because the internet is rife with spammers. 

If you study Google’s link spam documentation carefully, you’ll see that, in theory, these aren’t the kind of spammy links they have a problem with. They warn only against the ill effects of spam links intended to manipulate rankings. 

Google uses links as an important factor in determining the relevancy of web pages. Any links that are intended to manipulate rankings in Google Search results may be considered link spam. This includes any behavior that manipulates links to your site or outgoing links from your site. 

Here are the examples Google gives of these manipulative links: 

What Google says are manipulative links

As for “toxic backlinks,” this is just a term made up by certain SEO tools to describe backlinks they think could hurt your rankings based on several so-called “markers.”

Key takeaway

  • Spammy links are low-quality links that every site attracts through no fault of their own. 
  • Manipulative links are links built or bought solely to improve Google rankings. 
  • Toxic links are links that certain SEO tools say could hurt your website’s rankings. 

If you asked this question before September 2016, the answer would have likely been “yes.”

So what changed? 

Penguin 4.0.

With this algorithm update, Google switched from demoting pages to a system that tries to ignore bad links.

Penguin is now more granular. Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site. 

Since then, Google’s stance has been that you can ignore spammy backlinks. 

If you’re seeing individual links that pop up and you say, “oh this looks like a spammer dropped the link” or whatever, I would completely ignore those. […] because these spammy links happen to every website and Google’s system has seen them so many times over the years that we’re very good at just ignoring them. 

John Mueller

But is this true? Is Google really as good at ignoring low-level spam as we’re made to believe? 

Judging by my colleague Chris’s recent poll on LinkedIn, a good chunk of SEOs (38%) don’t think so, as they’re still disavowing them. 

Most SEOs either disavow or do nothing about spammy backlinks

Does that mean they’re right to do so? Not necessarily. It just means they don’t fully trust Google’s claim that these links won’t do any harm. They’re being careful.

Personally, the person I trust most to answer this question in 2024 is Dr. Marie Haynes. I don’t think anyone’s done more research into this than her. She’s spent well over a decade working to understand Google’s search algorithms and auditing link profiles on behalf of business owners. 

Now, the interesting part of that statement (and why I actually trust her!) is the obvious conflict of interest. Until fairly recently, she made her living selling link audit and disavow file creation services—and for a pretty hefty sum at that! 

Pricing from Marie’s link audit services page in March 2023

Clearly, it would be good news for Marie if Google were still terrible at ignoring spammy backlinks because she could sell more link audits! 

Yet, these days, she no longer appears to offer such services. In fact, she’s actually been warning folks against the need to disavow low-quality, spammy backlinks for a few years. 

Here’s a quote from a 2022 blog post of hers:

While there is no harm in disavowing low quality spammy links, it likely does not help improve rankings. We believe that Google’s algorithms are already ignoring these links. […]. When we do see improvements these days after disavowing, it is always in sites where we have disavowed links that were purposely made for SEO and very little else. 

Marie Haynes

It’s clear that Marie is being cautious with her words here. But overall, her opinion after digging into this for many years seems to be that, yes, Google is now pretty good at ignoring most low-quality spammy links. 

Does that mean they’re perfect? No. But it does mean that worrying about obvious low-quality link spam is probably a waste of time for most people.

If you’re buying or building the types of links that Google classifies as “link spam” then, yes, they can absolutely hurt your rankings.

But before you panic about that link exchange you did with your best friend’s wife’s brother, Google is likely looking for patterns of manipulation here. In other words, manipulative link profiles rather than manipulative individual links: 

Danny Richman, founder of Richman SEO Training, agrees: 

Here’s a bit more context from Danny: 

As for Marie Haynes, she echoes a similar sentiment in this post. She states that manual actions aside, she would only recommend a client disavow links if they have “a very large number of links that [they] feel the webspam team would consider to be ‘manipulative.’ ”

In these cases, Google often slaps the worst offenders with an unnatural links manual action. If you get one of those, that’s Google telling you, “Hey… you’re being demoted in search because we think you’ve been trying to game the system with manipulative links.” 

But this doesn’t have to happen for manipulative links to be a problem. It’s possible for Google to algorithmically demote a site if they detect a large volume of spammy and manipulative links, at least according to John Mueller.

If we see a very strong pattern [of spammy links] there, then it can happen that our algorithms say well, we really have kind of lost trust with this website and at the moment based on the bigger picture on the web, we kind of need to be more on almost a conservative side when it comes to understanding this website’s content and ranking it in the search results. And then you can see kind of a drop in the visibility there. 

John Mueller

Either way, the point remains: it’s patterns of manipulation that are likely to hurt rankings. There’s very little chance that you need to worry about the odd potentially dodgy link here and there. 

While it might be tempting to use an SEO tool that finds “toxic backlinks” for you, I’d seriously urge you to reconsider. Trusting these can do more harm than good. Way more. 

Just look at this unfortunate Redditor’s reply to John Mueller: 

Someone on Reddit's traffic tanked 60% after disavowing "toxic" backlinks in one SEO tool
A 60% drop in traffic! That’s no joke! 

Even if this is an extreme case, worrying about these links likely only wastes time because, according to Marie Haynes, they’re rarely truly toxic: 

I find that the truly toxic links…the ones that could have the potential to harm your site algorithmically (although you’d have to really overdo it, as I’ll describe below), are rarely returned by an SEO tool. 

Marie Haynes

Sam McRoberts, CEO of VUVU Marketing, seems to agree: 

So… how do you find truly toxic backlinks that are likely to be hurting your site? 

The truth? You might not even need to look for them. If you haven’t built or bought links that Google considers link spam at any reasonable scale, chances are you’re good. 

If you’re not confident about that, do a manual backlink audit with a tool like Ahrefs’ Site Explorer.

The Anchors report is a good starting point if you’ve never done this. It shows you the words and phrases people use when linking to you. If they look unnatural or over-optimized (lots of exact matches of keywords you’re trying to rank for), that could be a sign you have paid or other links intended to manipulate rankings. 

Example of keyword-rich anchors, which are often a sign of paid backlinks

If things look fishy there, use the Backlinks report to dig deeper and check the context of those links. It’s usually quite easy to spot paid and unnatural ones. 

The Backlinks report in Ahrefs' Site Explorer showing the context of the backlink

Just remember that you’re looking for patterns of unnatural links, not just one or two. 

WARNING

If you’re not 100% sure what you’re looking for when doing a backlink audit, hire someone who knows what they’re doing. You need to be confident that the links are truly “toxic.”

If you have a manual action for unnatural links or a bunch of what you believe to be truly toxic backlinks, yes. Google’s advice is to disavow them (assuming you can’t get the links removed). 

You should disavow backlinks only if: 

You have a considerable number of spammy, artificial, or low-quality links pointing to your site, 

AND

The links have caused a manual action, or likely will cause a manual action, on your site. 

Marie Haynes advises the same: 

There are two situations where we will recommend to our clients a thorough link audit followed by filing a disavow: 

  1. The site has a manual action for unnatural links in GSC. 
  2. The site has a very large number of links that we feel the webspam team would consider to be “manipulative”.
Marie Haynes
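
If your site does fall into one of those two buckets, the disavow itself is just a plain text file uploaded through Google Search Console's disavow links tool: one URL or domain per line, a domain: prefix to disavow an entire domain, and lines starting with # treated as comments. A short sketch, with hypothetical domains:

# Paid links flagged in the unnatural links manual action
http://spam.example-blog.com/buy-links/page.html
domain:spammy-directory.example.com
domain:link-farm.example.net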

If you just have a bunch of spammy backlinks that most sites naturally attract or the odd paid backlink, probably not. Google probably ignores most, if not all, of these links, so disavowing them is likely a waste of time. 

While there is no harm in disavowing these links other than the time spent analyzing them, there is likely no benefit either. 

Marie Haynes

But what about negative SEO?

Being the victim of a negative SEO attack is indeed the possible exception here. This is when a competitor sends a load of spammy or toxic backlinks your way to try to get your site penalized. 

Google remains adamant that it basically never works, but it really comes down to what you believe. 

[I’ve] looked at hundreds of supposed cases of negative SEO, but none have actually been the real reason a website was hurt. […] While it’s easier to blame negative SEO, typically the culprit of a traffic drop is something else you don’t know about–perhaps an algorithm update or an issue with their website. 

Gary Illyes

If you see a traffic drop after an influx of backlinks in Site Explorer, I’d say that it’s at least worth a bit more investigation. 

Site with traffic drop coinciding with an influx of backlinks
This site experienced a traffic drop coinciding with an influx of referring domains. Maybe there’s benefit to disavowing here… and maybe it’s something else!

As Gary said above, something else could be to blame—but you never know. There’s always a chance that Google’s algorithms rule it was you who built or bought those backlinks to try to manipulate rankings and penalize you for it. 

If you just found a bunch of so-called “toxic backlinks” in an SEO tool, probably not. Again, most of these are probably just link spam Google already ignores. 

Here’s yet another quote from Marie Haynes backing this up: 

While there is probably no harm in disavowing [links reported as toxic in SEO tools], you are not likely to see any improvement as a result. Disavowing is meant for sites trying to remove a manual action and for those who have been actively building links for the purpose of improving rankings. 

Marie Haynes

There’s also the risk that you could end up disavowing links that are actually helping you… 

Patrick showed further evidence that this can absolutely happen when he experimented with disavowing links to the Ahrefs blog. Traffic dipped, then went back up after he removed the disavow. 

The impact of disavowing links to the Ahrefs blog

Final thoughts

“Toxic backlinks” is a term made up by certain SEO tools to scare you. That’s not to say bad links can’t hurt your site. They absolutely can. But fortunately for most site owners, it’s rarely a problem worth worrying all that much about. 

Got questions? Disagree? Ping me on X (formerly Twitter).


On-Page SEO Checklist for 2024: A Comprehensive Guide

Want to make your pages rank high on Google? You won’t be able to do that if you don’t know where or how to start your on-page SEO — and with each Google update, this pillar of SEO gets more and more complicated. To keep you updated with the best and most relevant practices when it comes to this aspect of your website, I have prepared an on-page SEO checklist for 2024. 

On-Page SEO Factors

On-page SEO, in simple terms, covers all the optimizations that take place on your website itself. Tweaking certain elements of your pages, when done right, can enable them to climb the rankings very quickly. These elements include essentially everything you can see on your webpage, like its title tags, headers, and images.

Webmaster’s Note: This is part two of our SEO checklist series. Part one covers our technical SEO checklist, so go back if you haven’t seen that yet. I also do deep dives into other aspects of on-page SEO in other articles, like the best content strategy for SEO, how to hack on-page factors, and ways to dominate niche keywords in your industry.

1. Identify Your Target Keyword

This is where any SEO effort should start. Identify which basic keywords you would like each page to rank for. From there, you can expand into common phrases, questions, and related words people use to find pages like yours through keyword research. 

Key Aspects of Keyword Optimization:

  • Keyword Research: Identifying the right keywords that your target audience is searching for.
  • Keyword Placement: Sensibly incorporating keywords in titles, headings, the first paragraph, and throughout the content.
  • Searcher Intent: Catering to why someone is performing a search, whether it’s to find information, make a purchase, etc.

Effective keyword optimization allows you to create pages that best meet user intent. This boosts your chances of ranking highly for your chosen keywords. 

Using a Keyword Research Tool for On-Page SEO

I have a longer guide on the types of keywords you should look at, and another on how to do keyword research, which you can follow for this step.

2. High-Quality Content Creation

Quality content is the keystone of on-page SEO. It is, after all, fundamental to Google's selling point: being the go-to place to find answers to your questions. It's why Google pushes Helpful Content Updates every so often.

So, your content must meet Google’s standards of quality in order to make it to the top. To do that, your content must be authoritative, valuable to the reader, and deliver on the promises made by your meta tags and headings.

What Constitutes Quality Content:

  • Originality: Your content must be unique and offer fresh insights.
  • Relevancy: It should align with your target user’s intent and be updated regularly.
  • Engagement: Content must encourage users to spend time on your site and interact with your offerings.

Creating content that exceeds user expectations can dramatically bolster your SEO as it can directly affect user engagement metrics and boost the credibility of your site. 

Webmaster’s Note: Beyond making sure all new content is high-quality, however, is ensuring all of your existing content is also up to par. I’ll be covering that in part four of this series, so keep an eye out for that. 

3. URL Structure

URLs are not only a ranking factor but also enhance the user experience when structured logically. 

Features of an Effective URL Structure:

  • Concise and Descriptive: A URL should be concise and explain your page content. No stop words.
  • Keyword Inclusion: A relevant keyword can enhance a URL’s performance.
  • Use Hyphens instead of Underscores: Conventional use dictates using hyphens to separate words.

A clear URL helps users and search engines make sense of the page’s content before they even reach it.

Here’s an example of a bad URL slug. 

Example of Bad URL Structure

And here’s an example of a good, optimized one.

Example of Good URL Structure
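
To illustrate the same idea in text (both URLs hypothetical):

Bad:  https://www.example.com/index.php?p=8827&cat=2_17
Good: https://www.example.com/blog/on-page-seo-checklist/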

4. Title Tag and Headings

I find that certain practices for these two elements give the most benefit to a page’s SEO. 

Best Practices for Title Tag and Heading Optimization:

  • Use a Keyword-First Approach: Place keywords first in your title tag, as uninterrupted by stop-words as possible.
  • Keep it Simple: Title tags should be concise to ensure the entire tag is displayed on the SERPs.
  • Same Keyword, Different Phrasing: Use the same keyword in your title tag and heading 1. However, use different phrasing or wording for each. 
  • Insert Related Keywords: Do this for your heading 2, 3, and so on, where it makes sense.
  • Avoid Duplicates: Use different title tags and headings for every unique page.
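
Here is a sketch of how those title tag and heading practices play out in HTML, for a hypothetical page targeting "on-page SEO checklist":

<!-- Title tag: keyword first, concise enough to display fully in the SERPs -->
<title>On-Page SEO Checklist: A Comprehensive Guide for 2024</title>

<!-- Heading 1: same keyword as the title tag, but phrased differently -->
<h1>The Complete On-Page SEO Checklist for Your Website</h1>

<!-- Subheadings: related keywords where they make sense -->
<h2>How To Optimize Title Tags and Meta Descriptions</h2>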

5. Meta Tags Enhancement

Meta tags, such as the meta description, serve as a brief pitch to users on search engine results pages. Other meta tags, along with your image alt text and links, provide important context to both users and search crawlers.

Tips for Enhanced Meta Tags:

  • Compelling Copy: Write title tags and meta descriptions that accurately summarize the page content and entice clicks.
  • Keyword Usage: Insert target keywords and/or related keywords effectively in your meta descriptions, within the character limit.
  • Uniqueness: Each page should have unique meta tags.
  • Be Descriptive: Your image alt text should not only include a related keyword but should also adequately describe what is seen in the image.
  • Add Internal and External Links: Semantic search means Google can use the links in your pages to gain a better understanding of their content. Always add relevant internal links, and only include external links to trusted websites.
  • Use the Noindex Robots Meta Tag: Add this to prevent pages with thin content, or pages with little value and no intent, from appearing in the SERPs.
  • Use the rel="canonical" Link Tag: Use this for any duplicate pages you have on your website. Doing this helps you control which version of the page gets indexed and ranks for your targeted keywords.
  • Set Your Open Graph Meta Tags: These let you optimize how your pages look when they're shared on social media.
  • Set Your Viewport Meta Tag: This configures how your pages are scaled and displayed on different devices and platforms, which is important for user experience (more on that later).
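
Pulled together, here is a sketch of what these tags can look like in a page's <head>. All values are hypothetical, and the noindex and canonical lines belong only on the specific pages that need them, not on every page:

<head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <meta name="description" content="A step-by-step on-page SEO checklist covering keywords, content, URLs, and meta tags.">
  <!-- Only on duplicate pages: point to the preferred version -->
  <link rel="canonical" href="https://www.example.com/on-page-seo-checklist/">
  <!-- Only on thin or low-value pages you want kept out of the SERPs -->
  <meta name="robots" content="noindex">
  <!-- Open Graph tags control how the page looks when shared on social media -->
  <meta property="og:title" content="On-Page SEO Checklist for 2024">
  <meta property="og:image" content="https://www.example.com/images/on-page-seo.png">
</head>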

To get the most out of your SEO, don’t neglect this part of your on-page SEO checklist. The small tweaks here can add up to the big picture. 

Well-crafted meta tags have the potential to increase click-through rates, boost your visibility on organic search and image search, enhance user experience, and also distribute link equity throughout your pages. All these contribute to how well your page ranks. 

6. Internal Linking

Internal linking spreads link equity throughout your site and can help search engines discover new pages. Always link back to pillar content, or other high-value content on your website. 

Benefits of Strategic Internal Linking:

  • Navigation: They guide users through other relevant pages on your website.
  • Page Authority: Anchor text can help to convey what the linked-to page is about, which can aid in ranking for those terms.
  • User Time on Site: Providing relevant links can keep users engaged on your site for longer periods.

Good internal linking can significantly increase your engagement rates and contribute to building a robust site architecture. I have a separate post on how to build topical authority through internal linking you can check out.
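
As a quick illustration (hypothetical URLs), descriptive anchor text does far more for the linked page than a generic anchor:

<!-- Descriptive anchor: tells users and Google what the target page is about -->
<p>Start with our <a href="/keyword-research-guide/">keyword research guide</a>.</p>

<!-- Generic anchor: conveys nothing about the target page -->
<p>To learn keyword research, click <a href="/keyword-research-guide/">here</a>.</p>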

7. User Experience (UX)

User experience affects on-page SEO because search engines favor websites that provide a positive user experience.

UX Factors to Consider in Your Website Design:

  • Mobile-Friendliness: The site must perform well across all devices — but especially on mobile-view, as most users use Google through their phones.
  • Ease of Use: The site should be navigable and logical in its layout. Navigation bars and other menus should be intuitive and prioritize the most important pages of your website.
  • Page Speed: Pages should load quickly to reduce bounce rates. Follow this guide to site speed optimization for this point.

As UX becomes an even more important ranking factor, I find it is necessary to add to this on-page SEO checklist. Sites that deliver a high-quality user experience will dominate search engine results pages.

Key Takeaway

Mastering this pillar of SEO is crucial for achieving high rankings on Google, and staying updated with evolving best practices is essential. But with every update, what works best changes. 

My 2024 on-page SEO checklist provides the most up-to-date practices for the elements on your website. Follow it, and you should be able to boost your website's authority, credibility, and long-term SEO performance.
