

20 Essential Technical SEO Tools For Agencies


Technical SEO tools are plentiful.

However, tools are just part of the equation.

Tools are useless without an experienced technical SEO professional to guide the strategy and ensure successful results.

But in the hands of an experienced professional, tools can do many wondrous things, from scaling a site’s SEO to creating content, making it possible to improve results with less effort rather than more hard work.

Tools can increase efficiencies, including:

  • Identifying on-site issues, like crawling and indexing problems.
  • Diagnosing page speed issues.
  • Identifying missing or duplicate text and other elements.
  • Diagnosing redirect issues.
  • And many others.

Your SEO tools arsenal, and how you use them, can mean the difference between great success and failure.

And indeed, there is no shortage of technical SEO tools for agencies.

Here are a few of the best you should consider.

1. Screaming Frog

Screenshot of Screaming Frog, December 2022

Screaming Frog is the crawler to have.

Any substantial website audit should begin with a crawl using this tool.

Depending on how it is configured, a crawl can introduce false positives or surface errors in an audit that you would otherwise never know about.

Screaming Frog can help you identify the basics like:

  • Missing page titles.
  • Missing meta descriptions.
  • Missing meta keywords.
  • Large images.
  • Errored response codes.
  • Errors in URLs.
  • Errors in canonicals.

Advanced things Screaming Frog can help you do include:

  • Identifying issues with pagination.
  • Diagnosing international SEO implementation issues.
  • Taking a deep dive into a website’s architecture.

2. Google Search Console

Screenshot from Google Search Console, December 2022

The primary tool of any SEO pro should be the Google Search Console (GSC).

This critical tool has recently been overhauled to replace many old features while adding more data, features, and reports.

What makes this tool great for agencies? Setting up a reporting process.

For agencies that do SEO, good reporting is critical. If you have not already set up a reporting process, it is highly recommended that you do so.

This process can save you during website change-overs, when GSC accounts can be wiped out. If your account is wiped out, you can still go back to all of your GSC data, because you have been saving it each month.

Agencies can also use the GSC API to pull this data into other reporting and analysis systems.
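A minimal sketch of that archiving step in Python, assuming you download GSC’s performance report as a CSV each month (the file and folder names here are hypothetical):

```python
import shutil
from datetime import date
from pathlib import Path


def archive_gsc_export(export_csv: str, archive_dir: str) -> Path:
    """Copy a monthly GSC performance export into a dated archive folder,
    so the data survives even if the GSC account is wiped out."""
    src = Path(export_csv)
    dest_dir = Path(archive_dir) / date.today().strftime("%Y-%m")
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / src.name
    shutil.copy2(src, dest)
    return dest
```

Running this against each month’s export builds the historical record described above; the same idea scales up to pulling the data programmatically via the GSC API.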

3. Google Analytics

Where would we be without a solid analytics platform to analyze organic search performance?

While free, Google Analytics provides much in the way of information that can help you identify things like penalties, issues with traffic, and anything else that may come your way.

As with Google Search Console, if you set up Google Analytics correctly, it’s ideal to have a monthly reporting process in place.

This process will help you save data for those situations where something unexpected happens to the client’s Google Analytics access.

At the very least, you won’t find yourself in a situation where you lose all of your clients’ data.

4. Web Developer Toolbar

Screenshot from Web Developer Toolbar extension, December 2022

The Web Developer Toolbar extension for Google Chrome can be downloaded from the Chrome Web Store.

It is an official port of the Firefox web developer extension.

One of the primary uses for this extension is identifying issues with code, specifically JavaScript implementations with menus and the user interface.

Turning off JavaScript and CSS makes it possible to identify where these issues are occurring in the browser.

Your auditing is not just limited to JavaScript and CSS issues.

You can also see alt text, find broken images, and view meta tag information and response headers.

5. WebPageTest

Screenshot of WebPageTest, December 2022

Page speed has been a hot topic in recent years, and many useful tools exist for auditing it.

To that end, WebPageTest is one of those essential SEO tools for your agency.

Cool things that you can do with WebPageTest include:

  • Waterfall speed tests.
  • Competitor speed tests.
  • Competitor speed videos.
  • Identifying how long it takes a site to load fully.
  • Time to first byte.
  • Start render time.
  • Document object model (DOM) elements.

This is useful for determining how a site’s technical elements interact to create the final result or display time.

6. Google PageSpeed Insights

Screenshot of Google PageSpeed Insights, December 2022

Through a combination of speed metrics for both desktop and mobile, Google’s PageSpeed Insights is critical for agencies that want to get their website page speed ducks in a row.

It should not be used as the be-all, end-all of page metrics testing, but it is a good starting point.

Here’s why: PageSpeed Insights does not always report exact page speed; it uses approximations.

While you may get one result with Google PageSpeed, you may also get different results with other tools.

Remember that Google’s PageSpeed provides only part of the picture, and you need more complete data for an effective analysis. Use multiple tools for your analysis to get a full picture of your website’s performance.

7. Google Mobile-Friendly Testing Tool

Screenshot of Google Mobile-Friendly Testing tool, December 2022

Determining a website’s mobile technical aspects is also critical for any website audit.

When putting a website through its paces, Google’s Mobile-Friendly testing tool can give you insights into a website’s mobile implementation.

8. Google’s Rich Results Testing Tool

Screenshot showing the user interface of the Google Rich Results Testing Tool.

The Google Structured Data Testing Tool has been deprecated and replaced by the Google Rich Results Testing Tool.

This tool performs one function, and performs it well: it validates your structured data markup against the Schema.org vocabulary that Google supports.

This is a fantastic way to identify issues with your Schema coding before the code is implemented.

9. GTmetrix Page Speed Report

Screenshot of GTmetrix, December 2022

GTmetrix is a page speed report card that provides a different perspective on page speed.

By diving deep into page requests, the CSS and JavaScript files that must load, and other website elements, it is possible to clean up many of the elements that slow pages down.

10. W3C Validator

Screenshot of W3C Validator, December 2022

You may not normally think of a code validator like W3C Validator as an SEO tool, but it is important just the same.

Be careful! If you don’t know what you are doing, it is easy to misinterpret the results and actually make things worse.

For example, say you are validating code from a site that was developed in XHTML, but the code was ported over to WordPress.

Copying and pasting the entire code into WordPress during development does not change its document type automatically. If during testing, you run across pages that have thousands of errors across the entire document, that is likely why.

A website that was developed in this fashion is more likely to need a complete overhaul with new code, especially if the former code does not exist.

11. Semrush

Screenshot of Semrush, January 2022

Semrush’s greatest claim to fame is accurate data for keyword research and other technical research.

But what makes Semrush so valuable is its competitor analysis data.

You may not normally think of Semrush as a technical analysis tool.

However, if you go deep enough into a competitor analysis, the rankings data and market analysis data can reveal surprising information.

You can use these insights to better tailor your SEO strategy and gain an edge over your competitors.

12. Ahrefs

Screenshot of Ahrefs, January 2022

Many consider Ahrefs a critical component of modern technical link analysis.

By identifying certain patterns in a website’s link profile, you can figure out what a site is doing for its linking strategy.

It is possible to identify anchor text issues that may be impacting a site using its word cloud feature.

Also, you can identify the types of links linking back to the site – whether it’s a blog network, a high-risk link profile with many forum and Web 2.0 links, or other major issues.

Other abilities include identifying when a site’s backlinks started going missing, its linking patterns, and much more.

13. Majestic

Majestic is a long-standing tool in the SEO industry with unique linking insights.

Like Ahrefs, you can identify things like linking patterns by downloading reports of the site’s full link profile.

It is also possible to find things like bad neighborhoods and other domains a website owner owns.

Using this bad neighborhood report, you can diagnose linking issues that arise from the site’s associations with questionable domains.

Like most tools, Majestic has its own values for calculating technical link attributes like Trust Flow, Citation Flow, and other linking elements contributing to trust, relevance, and authority.

It is also possible through its own link graphs to identify any issues occurring with the link profile over time.

Majestic is an exceptional tool in your link diagnostic process.

14. MozBar

It is hard to think of something as whimsical as the MozBar as a serious technical SEO tool. But there are many metrics you can gain from it for detailed analysis.

These include Moz’s Domain Authority and Page Authority, Google caching status, code such as social Open Graph markup, and the page’s meta tags at a glance while in the web browser.

Without diving deep into a crawl, you can also see other advanced elements like rel=”canonical” tags, page load time, Schema Markup, and even the page’s HTTP status.

This is useful for an initial survey of the site before diving deeper into a proper audit, and it can be a good idea to include the findings from this data in an actual audit.

15. Barracuda Panguin

Screenshot of Barracuda Panguin Tool, December 2022

If you are investigating a site for a penalty, the Barracuda Panguin tool should be a part of your workflow.

It works by connecting to the Google Analytics account of the site you are investigating and overlaying the timing of potential penalty events onto your GA traffic data.

Using this overlay, it is possible to easily identify situations where potential penalties occur.

Now, it is important to note that there isn’t an exact science to this, and that correlation isn’t always causation.

It’s important to investigate every avenue where the data suggests something happened, in order to rule out any potential penalty.

Tools like this help you zero in on the approximate timing of events in the data, which is valuable for that investigation.

16. Google Search Console XML Sitemap Report

Screenshot of Google Search Console XML Sitemap report, December 2022

The Google Search Console XML Sitemap Report is one of those technical SEO tools that should be an important part of any agency’s reporting workflow.

Diagnosing sitemap issues is a critical part of any SEO audit, and this report helps you maintain the all-important 1:1 match between the URLs on the site and the URLs in the sitemap.

For those who don’t know, it is considered an SEO best practice to ensure the following:

  • A sitemap should contain only URLs that return a 200 OK status. No 4xx or 5xx URLs should show up in the sitemap.
  • There should be a 1:1 match between the URLs in the sitemap and the URLs on the site. In other words, the sitemap should not contain orphaned pages that do not show up in a Screaming Frog crawl.
  • Parameter-laden URLs should be removed from the sitemap unless they are primary pages. Certain parameters will cause issues with XML sitemap validation, so make sure they are not included in URLs.
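As a sketch of the parameter check, this short Python snippet (standard library only) parses a sitemap and flags URLs that carry query parameters; the example URLs are hypothetical:

```python
from urllib.parse import urlparse
from xml.etree import ElementTree

# XML namespace used by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def flag_parameter_urls(sitemap_xml: str) -> list:
    """Return the sitemap URLs that carry query parameters,
    which are candidates for removal per the best practice above."""
    root = ElementTree.fromstring(sitemap_xml)
    urls = [loc.text for loc in root.iter(SITEMAP_NS + "loc")]
    return [u for u in urls if urlparse(u).query]
```

Combining this with a crawler’s list of live 200 OK URLs would cover the first two checks as well.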

17. BrightLocal

If you are operating a website for a local business, your SEO strategy should involve local SEO for a significant portion of its link acquisition efforts.

This is where BrightLocal comes in.

It is normally not thought of as a technical SEO tool, but its application can help you uncover technical issues with the site’s local SEO profile.

For example, you can audit the site’s local SEO citations with this tool. Then, you can move forward with identifying and submitting your site to the appropriate citations.

It works kind of like Yext, in that it has a pre-populated list of potential citations.

One of BrightLocal’s essential features is that it lets you audit, clean, and build citations on the most common citation sites (and others that are less common).

BrightLocal also includes in-depth auditing of your Google Business Profile presence, including in-depth local SEO audits.

If your agency is heavy into local SEO, this is one of those tools that is a no-brainer, from a workflow perspective.

18. Whitespark

Whitespark is more in-depth when compared to BrightLocal.

Its local citation finder allows you to dive deeper into your site’s local SEO by finding where your site is across the competitor space.

To that end, it also lets you identify all of your competitors’ local SEO citations.

In addition, part of its auditing capabilities allows it to track rankings through detailed reporting focused on distinct Google local positions such as the local pack and local finder, as well as detailed organic rankings reports from both Google and Bing.

19. Botify

Botify is one of the most complete technical SEO tools available.

Its claim to fame is the ability to reconcile search intent and technical SEO through its in-depth keyword analysis tool, tying elements like crawl budget to the technical factors that map to searcher intent.

Not only that, but it’s also possible to identify all the technical SEO factors that are contributing to ranking through Botify’s detailed technical analysis.

In its detailed reporting, you can also use the tool to detect changes in how people are searching, regardless of the industry that you are focused on.

The most powerful part of Botify is its in-depth reports, which tie the data together into information you can actually act on.

20. Excel

Screenshot by author, December 2022

Many SEO pros aren’t aware that Excel can be considered a technical SEO tool.

Surprising, right?

Well, there are a number of Excel super tricks you can use to perform technical SEO audits. Tasks that would otherwise take a long time to do manually can be accomplished much faster.

Let’s look at a few of these “super tricks.”

Super Trick #1: VLOOKUP

With VLOOKUP, it is possible to pull data from multiple sheets based on data that you want to populate in the primary sheet.

This function allows you to do things like perform a link analysis using data gathered from different tools.

If you have gathered linking data from GSC’s “who links to you the most” report, more from Ahrefs, and still more from Moz, you know how difficult it is to reconcile all that information manually.

What if you wanted to determine which internal links are valuable in accordance with a site’s inbound linking strategy?

Using VLOOKUP, you can combine data from GSC’s report with data from Ahrefs’ report to get the entire picture of what’s happening.
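If you ever need the same lookup outside of a spreadsheet, the join VLOOKUP performs can be sketched in a few lines of Python; the column names below are hypothetical stand-ins for whatever your exports contain:

```python
def vlookup_join(primary_rows, lookup_rows, key, fields):
    """Mimic Excel's VLOOKUP: for each primary row, pull the requested
    fields from the lookup table by matching on a shared key column."""
    index = {row[key]: row for row in lookup_rows}
    merged = []
    for row in primary_rows:
        match = index.get(row[key], {})  # empty dict when no match (like #N/A)
        merged.append({**row, **{f: match.get(f) for f in fields}})
    return merged
```

For example, merging a GSC top-linked-pages export with an Ahrefs backlink export on the URL column yields one combined table per URL.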

Super Trick #2: Easy XML Sitemaps

Coding XML Sitemaps manually is a pain, isn’t it?

Not anymore.

With a quick, repeatable process, it is possible to build a sitemap in Excel in a matter of minutes – if you work smart.

See the video I created showing this process.
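The spreadsheet trick boils down to wrapping each URL in `<url><loc>…</loc></url>` tags; here is an equivalent sketch in Python using only the standard library:

```python
from xml.etree import ElementTree


def build_sitemap(urls):
    """Generate minimal XML sitemap markup from a list of canonical URLs."""
    urlset = ElementTree.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for u in urls:
        url_el = ElementTree.SubElement(urlset, "url")
        ElementTree.SubElement(url_el, "loc").text = u
    return ElementTree.tostring(urlset, encoding="unicode")
```

Feed it the list of 200 OK URLs from your crawl and the 1:1 best practice described earlier falls out for free.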

Super Trick #3: Conditional Formatting

Using conditional formatting, it is possible to reconcile long lists of information in Excel.

This is useful in many SEO situations where lists of information are compared daily.
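The same comparison can be sketched in Python: given yesterday’s and today’s URL lists, surface what appeared and what disappeared, much as a conditional-formatting highlight would:

```python
def diff_url_lists(yesterday, today):
    """Return (added, removed) URLs between two daily list snapshots."""
    added = sorted(set(today) - set(yesterday))
    removed = sorted(set(yesterday) - set(today))
    return added, removed
```

This is the kind of check you might run on daily exports of indexed pages, sitemap entries, or ranking URLs.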

Although Tools Create Efficiencies, They Do Not Replace Manual Work

For any SEO agency that wants a competitive edge, SEO tools run the gamut from crawling to auditing, data gathering, analysis, and much more.

You don’t want to leave your results up to chance.

The right tools can provide another dimension to your analysis that standard analysis might otherwise not provide.

They can also give you an edge in creating an output that will delight your clients and keep them coming back for years to come.

Which of these tools will you use to wow your customers?


Featured Image: Paulo Bobita/Search Engine Journal






What SEO Should Know About Brand Marketing With Mordy Oberstein


For the SEO industry, the Google documents leak offered an important view behind the scenes. Although the leak was not a blueprint of how the algorithm worked, there was considerable confirmation that SEO professionals were right about many elements of the algorithm.

From all the analysis and discussion following the leak, the one insight that got my attention was how important the brand is.

Rand Fishkin, who broke the leak, said this:

“Brand matters more than anything else … If there was one universal piece of advice I had for marketers seeking to broadly improve their organic search rankings and traffic, it would be: ‘Build a notable, popular, well-recognized brand in your space, outside of Google search.’”

Mike King echoed this statement with the following observation:

“All these potential demotions can inform a strategy, but it boils down to making stellar content with strong user experience and building a brand, if we’re being honest.”

Mordy Oberstein, who is an advocate for building a brand online, posted on X (Twitter):

“I am SO happy that the SEO conversation has shifted to thinking about ‘brand.’”

It’s not the first time that “brand” has been mentioned in SEO. We began to talk about this around 2012 after the impact of Panda and Penguin when it first became apparent that Google’s aim was to put more emphasis on brand.

Compounding this is the introduction of AI, which has accelerated the importance of taking a more holistic approach to online marketing with less reliance on Google SERPs.

When I spoke to Pedro Dias, he said, “We need to focus more than ever on building our own communities with users aligned to our brands.”

As someone who had 15 years of offline experience in marketing, design, and business before moving into SEO, I have always said that having this wide knowledge allows me to take a holistic view of SEO. So, I welcome the mindset shift towards building a brand online.

As part of his X/Twitter post, Mordy also said:

“I am SO happy that the SEO conversation has shifted to thinking about “brand” (a lot of which is the direct result of @randfish’s & @iPullRank’s great advice following the “Google leaks”).

As someone who has straddled the brand marketing and SEO world for the better part of 10 years – branding is A LOT harder than many SEOs would think and will be a HUGE adjustment for many SEOs.”

Following his X/Twitter post, I reached out to Mordy Oberstein, Head of SEO Brand at Wix, to have a conversation about branding and SEO.

What Do SEO Pros Need To Know About ‘Brand’ To Make The Mindset Shift?

I asked Mordy, “In your opinion, what does brand and building a brand mean, and can SEO pros make this mindset shift?”

Mordy responded, “Brand building basically means creating a connection between one entity and another entity, meaning the company and the audience.

It’s two people meeting, and that convergence is the building of a brand. It’s very much a relationship. And I think that’s what makes it hard for SEOs. It’s a different way of thinking; it’s not linear, and there aren’t always metrics that you can measure it by.

I’m not saying you don’t use data, or you don’t have data, but it’s harder to measure to tell a full story.

You’re trying to pick up on latent signals. A lot of the conversation is unconscious.

It’s all about the micro things that compound. So, you have to think about everything you do, every signal, to ensure that it is aligned with the brand.

For example, a website writes about ‘what is a tax return.’ However, if I’m a professional accountant and I see this on your blog, I might think this isn’t relevant to me because you’re sending me a signal that you’re very basic. I don’t need to know what a tax return is; I have a master’s degree in accounting.

The latent signals that you’re sending can be very subtle, but this is where it is a mindset shift for SEO.”

I recalled a recent conversation with Pedro Dias in which he stressed it was important to put your users front and center and create content that is relevant to them. Targeting high-volume keywords is not going to connect with your audience. Instead, think about what is going to engage, interest, and entertain them.

I went on to say that for some time, the discussion online has been about SEO pros shifting away from the keyword-first approach. However, the consequences of moving away from a focus on traffic and clicks will mean we are likely to experience a temporary decline in performance.

How Does An SEO Professional Sell This To Stakeholders – How Do They Measure Success?

I asked Mordy, “How do you justify this approach to stakeholders – how do they measure success?”

Mordy replied, “I think selling SEO will become harder over time. But, if you don’t consider the brand aspect, then you could be missing the point of what is happening. It’s not about accepting lower volumes of traffic; it’s that traffic will be more targeted.

You might see less traffic right now, but the idea is to gain a digital presence and create digital momentum that will result in more qualified traffic in the long term.”

Mordy went on to say, “It’s going to be a habit to break out of, just like when you have to go on a diet for a long-term health gain.

The ecosystem will change, and it will force change to our approach. SEOs may not have paid attention to the Google leak documents, but I think they will pay attention as the entire ecosystem shifts – they won’t have a choice.

I also think C-level will send a message that they don’t care about overall traffic numbers, but do care about whether a user appreciates what they are producing and that the brand is differentiated in some way.”

How Might The Industry Segment And What Will Be The Important Roles?

I interjected to make the point that it does look a lot like SEO is finally making that shift across marketing.

Technical SEO will always be important, and paid/programmatic will remain important because it is directly attributable.

For the rest of SEO, I anticipate it merges across brand, SEO, and content into a hybrid strategy role that will straddle those disciplines.

What we thought of as “traditional SEO” will fall away, and SEO will become absorbed into marketing.

In response, Mordy agreed, saying that SEO sits within a wider paradigm and will fall under brand and communications.

An SEO pro will function as part of the wider marketing team, thinking about how to drive revenue, how to drive growth, and what kind of growth to drive, using SEO as a vehicle to get there.

The final point I raised was about social media and whether that would become a more combined facet of SEO and overall online marketing.

Mordy likened Google to a moth attracted to the biggest digital light.

He said, “Social media is a huge vehicle for building momentum and the required digital presence.

For example, the more active I am on social media, the more organic branded searches I gain through Google Search. I can see the correlation between the two.

I don’t think that Google is ignoring branded searches, and it makes a semantic connection.”

SEO Will Shift To Include Brand And Marketing

The conversation I had with Mordy raised an interesting perspective that SEO will have to make significant shifts to a brand and marketing mindset.

The full impact of AI on Google SERPs and how the industry might change is yet to be realized. But, I strongly recommend that anyone in SEO consider how they can start to take a brand-first approach to their strategy and the content they create.

I suggest building and measuring relationships with audiences based on how they connect with your brand and moving away from any strategy based on chasing high-volume keywords.

Think about what the user will do once you get the click – that is where the real value lies.

Get ahead of the changes that are coming.

Thank you to Mordy Oberstein for offering his opinion and being my guest on IMHO.



Featured Image: 3rdtimeluckystudio/Shutterstock




4 Ways PPC and SEO Can Work Together (And When They Can’t)


Search engine optimization (SEO) is the process of optimizing your pages to rank in a search engine’s organic results.

Pay-per-click (PPC) is a form of online advertising where advertisers pay a fee each time someone clicks their ad.

There’s no conflict between the two types of marketing. You don’t have to choose one or the other; the best companies use both.

Here’s how they can work together and produce magic:

Creating SEO content is the process of figuring out what your target audience is searching for on Google and aligning your content with their search intent.

To start off, you need to find out what they’re searching for. The easiest way is to use a keyword research tool, like Ahrefs’ Keywords Explorer.

Here’s how you might find keywords for a hypothetical coffee equipment store:

  1. Go to Ahrefs’ Keywords Explorer
  2. Enter a relevant keyword (e.g., “coffee”)
  3. Go to Matching terms

Go through the list and pick out keywords that are relevant to the site. For example, the keyword “how to grind coffee beans” seems like a good keyword to target.

The keyword "how to grind coffee beans" and relevant SEO stats

Once we’ve chosen our keyword, we want to know what searchers are looking for specifically. Sometimes the keyword gives us an idea, but to be sure, we can look at the top-ranking pages.

So, click the SERP button and then click Identify intents to see what searchers are looking for:

The Identify Intents feature in Ahrefs' Keywords Explorer

We can see that searchers are looking for techniques and methods to grind coffee beans at home, and especially without a grinder. If we want to rank high, we’ll likely have to follow suit.

Those are the basics of creating SEO content. But doing just this isn’t enough. After all, as the saying goes, “If a tree falls in a forest and no one hears it, does it make a sound?”

This applies to your content, too. You don’t want to publish into a void; you want people to see and consume your content. This is where PPC comes in. You can run PPC ads to ensure that as many people as possible see your content.

For example, at Ahrefs, we run Facebook ads for our content:

An example of a Facebook ad we ran for our content

We also run ads on Quora:

The Quora ad campaigns we ran for the blog

This way, we make sure that none of our content efforts go to waste.

Links are an important Google ranking factor. Generally speaking, the more links your page has, the more likely it’ll rank high in the search results.

But acquiring links is hard. This is why it’s still a reliable ranking factor. And it’s also why there’s an entire industry behind link building, and tons of tactics you can use, all with varying levels of success.

One way you can consider building links to your pages is to run PPC ads. In fact, we ran an experiment a few years ago to prove that it was possible.

We spent ~$1,245 on Google search ads and acquired a total of 16 backlinks to two different pieces of content. (~$77-78 per backlink.) This is much cheaper than if you had to buy a backlink, which according to our study, costs around $361.44.

(It would be even more expensive if you acquired links via outreach, as you would have to consider additional costs like software, manpower, etc.)
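The per-link figure above is simple division over the experiment’s numbers; a quick sanity check:

```python
spend = 1245           # approximate Google Ads spend from the experiment (USD)
links_acquired = 16    # backlinks acquired across the two pieces of content
cost_per_link = spend / links_acquired
print(cost_per_link)   # 77.8125, i.e. the ~$77-78 per backlink quoted above
```

That sits well under the ~$361 average cost of a bought backlink cited from the study.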

Retargeting allows you to target visitors who have left your website.

Here’s how retargeting works:

  1. A visitor discovers your article on Google
  2. Your ad management software sets a cookie on the visitor’s browser, which allows you to show ads to these visitors
  3. When the visitor leaves your website and surfs the web, you can show ads and persuade them to return to your website

Depending on where they are on the buyer’s journey, you can convince them to take the next step.

The buyer's journey

For example, if someone found your website via your article on the “best espresso machines”, it’s likely they’re looking to buy. So, you can set your retargeting ad to encourage them to visit your espresso machines category page.

On the other hand, if a visitor discovered your website from your “what is a coffee grinder” article, they might still be early on the journey. In that case, it might be prudent to encourage them to sign up for your email list instead.

Every site has important keywords. For example, besides our brand and product terms, critical keywords are “keyword research”, “link building”, and “technical SEO”.

Since these keywords are important, it makes sense to dominate the SERPs for them. You can do this by simultaneously running ads for them while ranking in organic search. For example, Wix ranks for the keyword “create website for free” in both paid and organic SERPs:

Wix ranks for the keyword “create website for free” in both paid and organic SERPs

This is especially useful if you’re a new or smaller site. The keywords that are important to you are likely important to your competitors too. Which means you can’t compete with them overnight.

So, a good strategy is to target those keywords via PPC first, while investing in your SEO strategy. Over time, as you acquire more backlinks and gain more website authority, you’ll be able to compete with your competitors in organic search too.

While both channels are complementary, there are times where it may make more sense to choose one over the other.

When to choose PPC

If you fit these scenarios, it might be a better idea to go for PPC:

  • You’re promoting a limited-time offer, event, or launching a product. According to our poll, SEO takes three to six months to show results. If your event, offer, or launch is shorter than the expected timeframe, it’ll be over even before SEO takes any effect.
  • You need immediate, short-term results. If you need to show some results now, then PPC will be a better choice.
  • You have a disruptive product or service. SEO depends on figuring out what people are already searching for. If your product or service is completely novel, then it’s likely no one is searching for it.
  • You’re in hyper-competitive SERPs. Some niches have competing sites with large SEO teams and deep pockets. Coupled with Google’s preference for known brands, if you’re in these niches, it can be difficult to compete. PPC offers a viable alternative for gaining visibility on the first page.

When to choose SEO

Here are times when it may make better sense to choose SEO:

  • Keywords are too expensive. Some industries, like insurance or finance, have costs per click (CPC) of up to a few hundred dollars. For example, the keyword “direct auto insurance san antonio” has a CPC of $275.
  • Your niche is restricted. Certain industries or niches (e.g., adult, weapons, gambling, etc.) are prohibited or restricted from advertising.
  • You have a limited budget. PPC requires money to begin, whereas SEO can drive traffic to your website at no direct cost per visitor.
  • You’re building an affiliate site. Affiliate sites earn a commission when people buy from their recommendations. While it’s not impossible to build an affiliate site from PPC, it’s difficult to control the return on investment (ROI) since affiliate site owners cannot control sales conversion rates.
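To make that budget comparison concrete, here is a quick back-of-the-envelope sketch in Python. Every figure below (CPC, click volume, retainer cost) is a hypothetical placeholder, not a benchmark; plug in your own niche's numbers:

```python
def cumulative_cost(monthly_cost: float, months: int) -> float:
    """Total spend after a given number of months."""
    return monthly_cost * months

# Hypothetical inputs -- replace with figures from your own market.
cpc = 25.0               # average cost per click, USD
monthly_clicks = 400     # desired monthly visits
seo_retainer = 3_000.0   # monthly content + link-building investment, USD

ppc_monthly = cpc * monthly_clicks            # 10,000 USD per month

print(cumulative_cost(ppc_monthly, 12))       # paid spend over a year
print(cumulative_cost(seo_retainer, 12))      # SEO spend over a year
```

In this toy scenario, buying 400 clicks a month at $25 costs far more over a year than the SEO retainer, which is why expensive-CPC niches often tilt the math toward organic.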

Final thoughts

There are cases where focusing on either SEO or PPC makes sense.

But most of the time, the best companies don’t discriminate between channels. If they produce positive ROI, then you should be using all marketing channels.


The Three Pillars Of SEO: Authority, Relevance, And Experience


If there’s one thing we SEO pros are good at, it’s making things complicated.

That’s not necessarily a criticism.

Search engine algorithms, website coding and navigation, choosing and evaluating KPIs, setting content strategy, and more are highly complex tasks involving lots of specialized knowledge.

But as important as those things all are, at the end of the day, there is really just a small set of things that will make most of the difference in your SEO success.

In SEO, there are really just three things – three pillars – that are foundational to achieving your SEO goals.

  • Authority.
  • Relevance.
  • Experience (of the users and bots visiting the site).

Nutritionists tell us our bodies need protein, carbohydrates, and fats in the right proportions to stay healthy. Neglect any of the three, and your body will soon fall into disrepair.

Similarly, a healthy SEO program involves a balanced application of authority, relevance, and experience.

Authority: Do You Matter?

In SEO, authority refers to the importance or weight given to a page relative to other pages that are potential results for a given search query.

Modern search engines such as Google use many factors (or signals) when evaluating the authority of a webpage.

Why does Google care about assessing the authority of a page?

For most queries, there are thousands or even millions of pages available that could be ranked.

Google wants to prioritize the ones that are most likely to satisfy the user with accurate, reliable information that fully answers the intent of the query.

Google cares about serving users the most authoritative pages for their queries because users that are satisfied by the pages they click through to from Google are more likely to use Google again, and thus get more exposure to Google’s ads, the primary source of its revenue.

Authority Came First

Assessing the authority of webpages was the first fundamental problem search engines had to solve.

Some of the earliest search engines relied on human evaluators, but as the World Wide Web exploded, that quickly became impossible to scale.

Google overtook all its rivals because its creators, Larry Page and Sergey Brin, developed the idea of PageRank, using links from other pages on the web as weighed citations to assess the authoritativeness of a page.

Page and Brin realized that links were an already-existing system of constantly evolving polling, in which other authoritative sites “voted” for pages they saw as reliable and relevant to their users.

Search engines use links much as we treat scholarly citations: the more relevant scholarly papers that cite a source document, the better.

The relative authority and trustworthiness of each of the citing sources come into play as well.

So, of our three fundamental categories, authority came first because it was the easiest to crack, given the ubiquity of hyperlinks on the web.

The other two, relevance and user experience, would be tackled later, as machine learning/AI-driven algorithms developed.

Links Still Primary For Authority

The big innovation that made Google the dominant search engine in a short period was that it used an analysis of links on the web as a ranking factor.

This started with a paper by Larry Page and Sergey Brin called The Anatomy of a Large-Scale Hypertextual Web Search Engine.

The essential insight behind this paper was that the web is built on the notion of documents inter-connected with each other via links.

Since putting a link on your site to a third-party site might cause a user to leave your site, there was little incentive for a publisher to link to another site unless it was really good and of great value to their site’s users.

In other words, linking to a third-party site acts a bit like a “vote” for it, and each vote could be considered an endorsement, endorsing the page the link points to as one of the best resources on the web for a given topic.

Then, in principle, the more votes you get, the better and the more authoritative a search engine would consider you to be, and you should, therefore, rank higher.

Passing PageRank

A significant piece of the initial Google algorithm was based on the concept of PageRank, a system for evaluating which pages are the most important based on scoring the links they receive.

So, a page that has large quantities of valuable links pointing to it will have a higher PageRank and will, in principle, be likely to rank higher in the search results than other pages without as high a PageRank score.

When a page links to another page, it passes a portion of its PageRank to the page it links to.

Thus, pages accumulate more PageRank based on the number and quality of links they receive.
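The mechanics described above can be illustrated with a toy power-iteration sketch. This is a simplified model of the idea in the original paper, not Google's production algorithm; the three-page graph and damping factor are illustrative assumptions:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank via power iteration.

    `links` maps each page to the pages it links to. Each page splits
    its score evenly among its outlinks, mirroring how a link "passes
    a portion of its PageRank" to the page it points to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: redistribute its score evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Three-page toy web: every other page links to "home",
# so "home" accumulates the most rank.
scores = pagerank({"home": ["about"], "about": ["home"], "blog": ["home"]})
print(max(scores, key=scores.get))  # prints: home
```

Note how "blog", which receives no links at all, ends up with only the baseline teleport score, while "home" collects shares from both other pages.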

Not All Links Are Created Equal

So, more votes are better, right?

Well, that’s true in theory, but it’s a lot more complicated than that.

PageRank scores range from a base value of one to values that likely exceed trillions.

Higher PageRank pages can have a lot more PageRank to pass than lower PageRank pages. In fact, a link from one page can easily be worth more than one million times a link from another page.


But the PageRank of the source page of a link is not the only factor in play.

Google also looks at the topic of the linking page and the anchor text of the link, but those have to do with relevance and will be referenced in the next section.

It’s important to note that Google’s algorithms have evolved a long way from the original PageRank thesis.

The way that links are evaluated has changed in significant ways – some of which we know, and some of which we don’t.

What About Trust?

You may hear many people talk about the role of trust in search rankings and in evaluating link quality.

For the record, Google says it doesn’t have a concept of trust it applies to links (or ranking), so you should take those discussions with many grains of salt.

These discussions began because of a Yahoo patent on the concept of TrustRank.

The idea was that if you started with a seed set of hand-picked, highly trusted sites and then counted the number of clicks it took you to go from those sites to yours, the fewer clicks, the more trusted your site was.
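That click-distance idea can be sketched as a breadth-first search over a link graph, starting from the hand-picked seed set. The graph and seed site below are made-up examples:

```python
from collections import deque

def link_distance(graph, seeds):
    """Minimum number of link hops from any trusted seed to each site."""
    dist = {s: 0 for s in seeds}
    queue = deque(seeds)
    while queue:
        site = queue.popleft()
        for linked in graph.get(site, []):
            if linked not in dist:
                dist[linked] = dist[site] + 1
                queue.append(linked)
    return dist

# Toy link graph: trusted.org links to a.com, which links to b.com.
graph = {"trusted.org": ["a.com"], "a.com": ["b.com"], "b.com": []}
print(link_distance(graph, ["trusted.org"]))
# prints: {'trusted.org': 0, 'a.com': 1, 'b.com': 2}
```

Under the TrustRank intuition, fewer hops from a seed means a more trusted site, so a.com would be treated as more trustworthy than b.com.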

Google has long said it doesn’t use this type of metric.

However, in 2013, Google was granted a patent related to evaluating the trustworthiness of links. We should note, though, that the existence of a granted patent does not mean it’s used in practice.

For your own purposes, however, if you want to assess a site’s trustworthiness as a link source, using the concept of trusted links is not a bad idea.

If they do any of the following, then it probably isn’t a good source for a link:

  • Sell links to others.
  • Have less than great content.
  • Otherwise, don’t appear reputable.

Google may not be calculating trust the way you do in your analysis, but chances are good that some other aspect of its system will devalue that link anyway.

Fundamentals Of Earning & Attracting Links

Now that you know that obtaining links to your site is critical to SEO success, it’s time to start putting together a plan to get some.

The key to success is understanding that Google wants this entire process to be holistic.

Google actively discourages, and in some cases punishes, schemes to get links in an artificial way. This means certain practices are seen as bad, such as:

  • Buying links for SEO purposes.
  • Going to forums and blogs and adding comments with links back to your site.
  • Hacking people’s sites and injecting links into their content.
  • Distributing poor-quality infographics or widgets that include links back to your pages.
  • Offering discount codes or affiliate programs as a way to get links.
  • And many other schemes where the resulting links are artificial in nature.

What Google really wants is for you to make a fantastic website and promote it effectively, with the result that you earn or attract links.

So, how do you do that?

Who Links?

The first key insight is understanding who it is that might link to the content you create.

Consider how the major groups of people in any given market space break down along the classic adoption curve (based on research by the University of Oklahoma): innovators, early adopters, the early majority, the late majority, and laggards.

Who do you think are the people that might implement links?

It’s certainly not the laggards, and it’s also not the early or late majority.

It’s the innovators and early adopters. These are the people who write on media sites or have blogs and might add links to your site.

There are also other sources of links, such as locally-oriented sites, such as the local chamber of commerce or local newspapers.

You might also find some opportunities with colleges and universities if they have pages that relate to some of the things you’re doing in your market space.

Relevance: Will Users Swipe Right On Your Page?

You have to be relevant to a given topic.

Think of every visit to a page as an encounter on a dating app. Will users “swipe right” (thinking, “This looks like a good match!”)?

If you have a page about Tupperware, it doesn’t matter how many links you get – you’ll never rank for queries related to used cars.

This defines a limitation on the power of links as a ranking factor, and it shows how relevance also impacts the value of a link.

Consider a page on a site that is selling a used Ford Mustang. Imagine that it gets a link from Car and Driver magazine. That link is highly relevant.

Also, think of this intuitively. Is it likely that Car and Driver magazine has some expertise related to Ford Mustangs? Of course it does.

In contrast, imagine a link to that Ford Mustang from a site that usually writes about sports. Is the link still helpful?

Probably, but not as helpful because there is less evidence to Google that the sports site has a lot of knowledge about used Ford Mustangs.

In short, the relevance of the linking page and the linking site impacts how valuable a link might be considered.

What are some ways that Google evaluates relevance?

The Role Of Anchor Text

Anchor text is another aspect of links that matters to Google.


The anchor text helps Google confirm what the content on the page receiving the link is about.

For example, if the anchor text is the phrase “iron bathtubs” and the page has content on that topic, the anchor text, plus the link, acts as further confirmation that the page is about that topic.

Thus, links help search engines evaluate both the page’s relevance and its authority.

Be careful, though, as you don’t want to go aggressively obtaining links to your page that all use your main keyphrase as the anchor text.

Google also looks for signs that you are manually manipulating links for SEO purposes.

One of the simplest indicators is if your anchor text looks manually manipulated.

Internal Linking

There is growing evidence that Google uses internal linking to evaluate how relevant a site is to a topic.

Properly structured internal links connecting related content are a way of showing Google that you have the topic well-covered, with pages about many different aspects.
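As a starting point for auditing internal links, you can extract a page's anchors and split them into internal and external targets using only the Python standard library. The sample HTML and domain below are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def split_links(html, own_domain):
    """Partition a page's links into internal and external targets."""
    parser = LinkCollector()
    parser.feed(html)
    internal, external = [], []
    for href in parser.links:
        host = urlparse(href).netloc
        # Relative URLs and same-domain URLs count as internal.
        if not host or host == own_domain:
            internal.append(href)
        else:
            external.append(href)
    return internal, external

sample = '<a href="/guides/seo">SEO guide</a> <a href="https://example.org/x">ref</a>'
internal, external = split_links(sample, "mysite.com")
print(internal)  # prints: ['/guides/seo']
print(external)  # prints: ['https://example.org/x']
```

Running this across your site's pages gives you the raw internal link graph, from which you can spot thin coverage of a topic cluster or pages with no inbound internal links.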

By the way, anchor text is as important when creating internal links as it is for external, inbound links.

Your overall site structure is related to internal linking.

Think strategically about where your pages fall in your site hierarchy. If it makes sense for users, it will probably be useful to search engines.

The Content Itself

Of course, the most important indicator of the relevance of a page has to be the content on that page.

Most SEO professionals know that assessing content’s relevance to a query has become way more sophisticated than merely having the keywords a user is searching for.

Due to advances in natural language processing and machine learning, search engines like Google have vastly increased their competence in being able to assess the content on a page.

What are some things Google likely looks for in determining what queries a page should be relevant for?

  • Keywords: While the days of keyword stuffing as an effective SEO tactic are (thankfully) way behind us, having certain words on a page still matters. My company has numerous case studies showing that merely adding key terms that are common among top-ranking pages for a topic is often enough to increase organic traffic to a page.
  • Depth: The top-ranking pages for a topic usually cover the topic at the right depth. That is, they have enough content to satisfy searchers’ queries and/or are linked to/from pages that help flesh out the topic.
  • Structure: Structural elements like H1, H2, and H3, bolded topic headings, and schema-structured data may help Google better understand a page’s relevance and coverage.
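A crude version of that key-terms check can be scripted: count which of a topic's expected terms appear in your page copy. The term list here is an assumption; in practice you would derive it from the top-ranking pages for your topic:

```python
import re

def term_coverage(text, terms):
    """Report which expected terms appear in the page text."""
    words = set(re.findall(r"[a-z']+", text.lower()))

    def present(term):
        # A multi-word term counts if all of its words appear.
        return all(w in words for w in term.lower().split())

    found = [t for t in terms if present(t)]
    missing = [t for t in terms if not present(t)]
    return found, missing

page = "Our guide covers espresso grind size, tamping pressure, and brew ratio."
expected = ["grind size", "brew ratio", "water temperature"]
found, missing = term_coverage(page, expected)
print(found)    # prints: ['grind size', 'brew ratio']
print(missing)  # prints: ['water temperature']
```

This bag-of-words check is deliberately naive (it ignores word order and synonyms), but even this level of gap analysis often surfaces terms worth adding to a page.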

What About E-E-A-T?

E-E-A-T is a Google initialism standing for Experience-Expertise-Authoritativeness-Trustworthiness.

It is the framework of the Search Quality Rater Guidelines, a document used to train Google’s Search Quality Raters.

Search Quality Raters evaluate pages that rank in search for a given topic using defined E-E-A-T criteria to judge how well each page serves the needs of a search user who visits it as an answer to their query.

Those ratings are accumulated in aggregate and used to help tweak the search algorithms. (They are not used to affect the rankings of any individual site or page.)

Of course, Google encourages all site owners to create content that makes a visitor feel that it is authoritative, trustworthy, and written by someone with expertise or experience appropriate to the topic.

The main thing to keep in mind is that the more YMYL (Your Money or Your Life) your site is, the more attention you should pay to E-E-A-T.

YMYL sites are those whose main content addresses things that might have an effect on people’s well-being or finances.

If your site is YMYL, you should go the extra mile in ensuring the accuracy of your content, and displaying that you have qualified experts writing it.

Building A Content Marketing Plan

Last but certainly not least, create a real plan for your content marketing.

Don’t just suddenly start doing a lot of random stuff.

Take the time to study what your competitors are doing so you can invest your content marketing efforts in a way that’s likely to provide a solid ROI.

One approach to doing that is to pull their backlink profiles using tools that can do that.

With this information, you can see what types of links they’ve been getting and, based on that, figure out what links you need to get to beat them.

Take the time to do this exercise and also to map which links are going to which pages on the competitors’ sites, as well as what each of those pages rank for.

Building out this kind of detailed view will help you scope out your plan of attack and give you some understanding of what keywords you might be able to rank for.

It’s well worth the effort!

In addition, study the competitor’s content plans.

Learn what they are doing and carefully consider what you can do that’s different.

Focus on developing a clear differentiation in your content for topics that are in high demand with your potential customers.

This is another investment of time that will be very well spent.

Experience

As we traced above, Google started by focusing on ranking pages by authority, then found ways to assess relevance.

The third evolution of search was evaluating the site and page experience.

This actually has two separate but related aspects: the technical health of the site and the actual user experience.

We say the two are related because a site that is technically sound is going to create a good experience for both human users and the crawling bots that Google uses to explore, understand a site, and add pages to its index, the first step to qualifying for being ranked in search.

In fact, many SEO pros (and I’m among them) prefer to speak of SEO not as Search Engine Optimization but as Search Experience Optimization.

Let’s talk about the human (user) experience first.

User Experience

Google realized that authoritativeness and relevancy, as important as they are, were not the only things users were looking for when searching.

Users also want a good experience on the pages and sites Google sends them to.

What is a “good user experience”? It includes at least the following:

  • The page the searcher lands on is what they would expect to see, given their query. No bait and switch.
  • The content on the landing page is highly relevant to the user’s query.
  • The content is sufficient to answer the intent of the user’s query but also links to other relevant sources and related topics.
  • The page loads quickly, the relevant content is immediately apparent, and page elements settle into place quickly (all aspects of Google’s Core Web Vitals).

In addition, many of the suggestions above about creating better content also apply to user experience.

Technical Health

In SEO, the technical health of a site is how smoothly and efficiently it can be crawled by Google’s search bots.

Broken connections or even things that slow down a bot’s progress can drastically affect the number of pages Google will index and, therefore, the potential traffic your site can qualify for from organic search.
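At its simplest, a crawl-health check is a triage of response codes across your known URLs. Here is a minimal sketch; the status data is hypothetical, and in practice a crawler would collect it for you:

```python
def triage_status_codes(results):
    """Group crawled URLs by the kind of indexing problem they suggest."""
    buckets = {"ok": [], "redirect": [], "broken": [], "server_error": []}
    for url, status in results.items():
        if 200 <= status < 300:
            buckets["ok"].append(url)
        elif 300 <= status < 400:
            buckets["redirect"].append(url)      # chains waste crawl budget
        elif 400 <= status < 500:
            buckets["broken"].append(url)        # dead ends for users and bots
        else:
            buckets["server_error"].append(url)  # can suppress crawling
    return buckets

# Hypothetical crawl results: URL -> HTTP status code.
crawl = {"/": 200, "/old-page": 301, "/missing": 404, "/api": 500}
report = triage_status_codes(crawl)
print(report["broken"])  # prints: ['/missing']
```

Fixing the "broken" and "server_error" buckets first usually gives the fastest crawlability wins, since those URLs are outright dead ends for Googlebot.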

The practice of maintaining a technically healthy site is known as technical SEO.

The many aspects of technical SEO are beyond the scope of this article, but you can find many excellent guides on the topic, including Search Engine Journal’s Advanced Technical SEO.

In summary, Google wants to rank pages that it can easily find, that satisfy the query, and that make it as easy as possible for the searcher to identify and understand what they were searching for.

What About the Google Leak?

You’ve probably heard by now about the leak of Google documents containing thousands of labeled API calls and many thousands of attributes for those data buckets.

Many assume that these documents reveal the secrets of the Google algorithms for search. But is that a warranted assumption?

No doubt, perusing the documents is interesting and reveals many types of data that Google may store or may have stored in the past. But some significant unknowns about the leak should give us pause.

  • As Google has pointed out, we lack context around these documents and how they were used internally, and we don’t know how out of date they may be.
  • It is a huge leap from “Google may collect and store data point x” to “therefore data point x is a ranking factor.”
  • Even if we assume the document does reveal some things that are used in search, we have no indication of how they are used or how much weight they are given.

Given those caveats, it is my opinion that while the leaked documents are interesting from an academic point of view, they should not be relied upon for actually forming an SEO strategy.

Putting It All Together

Search engines want happy users who will come back to them again and again when they have a question or need.

They create and sustain happiness by providing the best possible results that satisfy that question or need.

To keep their users happy, search engines must be able to understand and measure the relative authority of webpages for the topics they cover.

When you create content that is highly useful (or engaging or entertaining) to visitors – and when those visitors find your content reliable enough that they would willingly return to your site or even seek you out above others – you’ve gained authority.

Search engines work hard to continually improve their ability to match the human quest for trustworthy authority.

As we explained above, that same kind of quality content is key to earning the kinds of links that assure the search engines you should rank highly for relevant searches.

That can be either content on your site that others want to link to or content that other quality, relevant sites want to publish, with appropriate links back to your site.

Focusing on these three pillars of SEO – authority, relevance, and experience – will increase the opportunities for your content and make link-earning easier.

You now have everything you need to know for SEO success, so get to work!

