The Complete Guide for Advanced Users (with Checklist)



If you’re running a website, it’s important to make sure that you’re doing everything you can to optimize it for search engines. Technical SEO is one of the most important aspects of this process, and it can be tricky to get right.

In this follow-up to our Learn SEO: The Complete Guide for Beginners, we will walk you through everything you need to know about technical SEO. We’ll cover topics like crawlability, indexing, and page speed optimization. We’ll also provide tips on how to troubleshoot common issues and fix them. 

What is technical SEO?

Technical SEO is the process of optimizing your website for the technical aspects that search engines use to crawl and index it, with the end goal of increasing rankings.

This includes things like your site’s structure, code, and page speed. By improving these technical factors, you can make your site more visible and easier to find in search results.


Why is technical SEO important?

You can have awesome content, but if your website has technical errors, that awesome content you worked hard on will be difficult to find. This is why technical SEO is important.

For one, it can help improve your site’s visibility in search results. If your site is well-optimized, it will be easier for search engines to find and index it. This can lead to higher rankings and more traffic over time.

Additionally, technical SEO can help improve your site’s user experience. Faster loading times and an easy-to-use interface can make a big difference for your visitors.

Is technical SEO difficult?

Technical SEO can seem daunting at first, but it’s actually not that difficult once you get the hang of it.

The most important thing to remember is that technical SEO is all about making your website as easy to find and index as possible.

That means ensuring that your pages are well-structured and free of errors, and providing clear and concise metadata.


It also means taking steps to improve your site’s speed and performance.

While this may sound like a lot of work, the payoff is worth it—technical SEO can help you attract more organic traffic and improve your search engine rankings.

Is it possible to rank a website properly without technical SEO?

In order for a website to rank on the search engines, technical SEO must be implemented.

This means that:

  • The website must be designed in such a way that it can be easily found and indexed by the search engine crawlers.
  • The website must also have fast loading times and be free of errors.
  • The website’s structure and hierarchy should be easy to understand.

So without optimizing the technical factors on your website, it won’t rank well in search engines.

Technical SEO vs on-page SEO

One common misconception is that technical SEO and on-page SEO are the same thing. But there is a big difference between the two.

On-page SEO refers to the optimization of individual pages on your website. This includes things like title tags, meta descriptions, and keyword research.


Technical SEO, on the other hand, refers to the technical aspects of your site that affect its crawlability and indexation. This includes things like site structure, code, and page speed.

So if you’re looking to improve your website’s visibility in search results, be sure to focus on both technical and on-page SEO, and not just one or the other.

Glossary of terms

Before we move forward, here are some terms for quick reference.


Crawling

Search engines detect new and updated content by sending out search spiders, which crawl and store information from websites for indexing and retrieval.

Indexing

Search engines index the content they find during crawling and store it. As soon as a page is indexed, it is eligible to appear when relevant to a user’s search query.

Ranking

Search engines search their index for results relevant to the search query, then retrieve those results and rank them according to what looks to be the most useful to the user.


XML sitemap

An XML sitemap is used to help search engines understand the structure of a website. It’s basically a map that shows search engines what pages are on a site and how they’re related.

This can be especially helpful if a website has a lot of content or if it’s regularly updated.


Robots.txt

The robots.txt is a file used to control which files search engine spiders can access on your website. The file is located in the root directory of your website.

Structured data

Structured data is a specific format for organizing information on a website. This format can be used by search engines to understand the contents of a page, and provide more relevant results to users.

Structured data also makes the SERPs more informative by triggering knowledge panels, featured snippets, and event snippets.

Technical SEO audit tools

Here is a list of the tools you can use for your audit.


Screaming Frog SEO Spider

Technical SEO tool Screaming frog interface

We covered Screaming Frog extensively in our Learn SEO guide. Basically, it crawls your links and gives you an overview of what’s going on in your website: Your 404s, duplicates, missing metadata, and more.

Learn SEO The Complete Guide for Beginners Landing Page

Download: Learn SEO: The Complete Guide for Beginners

SE Ranking

New SE Ranking Website Audit Tool Dashboard 3

A paid tool I’ve found to be useful as well for auditing is SE Ranking. The best part for me about this tool is that the report interface is so clean and intuitive. You can check out our review for SE Ranking’s website audit feature here.


Semrush

Technical SEO tool Semrush site audit

Semrush has a site audit functionality as well which you can use to check orphan pages, core web vitals, and more.

Further reading: Semrush Core Web Vitals Report Review

Google Search Console

Google Search Console Coverage Interface

Google Search Console is an indispensable tool for technical SEO. This is where you check and verify URLs, submit your sitemap, and more.


PageSpeed Insights

Technical SEO tool PageSpeed Insights Analysis

It’s important that you know how fast your website is and what’s causing it to slow down. PageSpeed Insights is the tool you need for that.

Think with Google page speed

No matter how good your content is, if people just leave your website because of technical issues like slowness, your traffic and engagement will remain low.

How to perform a technical SEO audit (with checklist)

One of the most important aspects of technical SEO is auditing your website. This helps you identify any potential issues that could be holding your site back from ranking higher in search results.

How often should I do a technical SEO audit?

There’s no one-size-fits-all answer to this question. The frequency of your technical SEO audits will depend on the size and complexity of your website, as well as how quickly things change on your site.

That being said, we recommend doing even just a short technical SEO audit once a month, and a more in-depth audit every quarter. This will give you a chance to identify and fix any technical issues before they have a chance to hurt your site’s performance.

Let’s begin.


Technical SEO Checklist

Review your site structure

Tools to use: Screaming Frog, Semrush

One of the most important things you can do to improve your website’s SEO is to review its structure. This will help you identify any potential issues that could be hindering your site’s performance in search engines.

Website structure includes things like your site’s hierarchy, URL structure, and internal linking.

For businesses with new sites, planning your website’s structure is the first and most important thing to do.

You have to make sure that both visitors and search engines can easily navigate through your website.

A poorly designed site structure can confuse developers and leave behind orphan pages. When that happens, you will have to take extra time and effort to find these orphan pages and link to them.


To plan your website structure, you can create a simple chart or outline like this example below:

site structure

Go as simple with your site structure as you can, so visitors and search engines can easily navigate between your web pages.

There are articles that argue that the most important information on a website should only be a couple of clicks deep (see: the 3-click rule), while the Nielsen Norman Group argued that it’s an arbitrary rule not backed by data.

But in 2018, Google’s John Mueller said that click depth does matter, so a good rule of thumb is to make sure the most important information is accessible from the homepage.

Look for orphan pages


As mentioned earlier, you will have to look for orphan pages as part of your audit.

Orphan pages are pages of a website that are not internally linked, meaning they have zero links from other pages of your website. This makes it difficult for search engine bots to crawl and index them.


Orphan pages can occur for different reasons: old blog posts, old products that are no longer sold, or old services pages that are no longer offered.

While some pages are purposely left out, such as testing pages and tag pages, it is critical to check whether any orphan pages are still relevant to users.

Does it affect my SEO?

The answer is both yes and no; the effect of orphan pages on a website’s rankings depends on how you look at it. If an orphan page was created to be shown to users and contains content that matters to them, it hurts your SEO: crawlers can’t see the page, so it won’t appear in the search results, and users won’t be able to find it either.

However, if an orphan page was created for purposes unrelated to users, such as testing functionality or trying out a new website design, you can leave it as it is.
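Conceptually, finding orphan pages is just a set difference: the URLs you know exist (from your sitemap or analytics) minus the URLs reachable through internal links. Here’s a minimal sketch in Python; the URLs are made-up sample data, and in practice both sets would come from your crawler’s exports:

```python
# Orphan pages = pages known to exist but never linked internally.
# Hypothetical sample data; in practice, load these from crawl exports.
known_urls = {
    "/", "/blog/", "/blog/old-post/", "/services/", "/services/retired/",
}
internally_linked = {"/", "/blog/", "/services/"}

# Set difference: known pages with zero internal links pointing at them
orphans = known_urls - internally_linked
print(sorted(orphans))  # ['/blog/old-post/', '/services/retired/']
```

Tools like Screaming Frog and Semrush do exactly this comparison for you, using your sitemap, analytics data, and crawl results as the two sets.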

How to find orphan pages using Screaming Frog

To find orphan pages using Screaming Frog, you have to first make sure that your Google Analytics and Google Search Console accounts are connected.


To do that, under Configuration, scroll down to API access and connect Google Analytics and Google Search Console.



Once you’ve connected them, make sure that under the General tab of the API window, you select Crawl New URLs Discovered in Google Analytics.


After connecting your GA and GSC accounts, under Configuration, go to Spider, and check Crawl Linked XML Sitemaps. Then check the option Crawl These Sitemaps: and input the URL of your website’s sitemap.


After setting everything up, you can now start crawling your website. Once the crawl is finished, under Crawl Analysis, click Configure and check the box beside Sitemaps. Screaming Frog will then analyze your website’s crawl log, which will allow you to see the orphan pages.

After the analysis, in the Overview pane under Sitemaps, you can see all the orphan pages that Screaming Frog crawled.

Orphan URLs

How to find orphan pages using Semrush

You can also find orphan pages by setting up Site Audit in Semrush. If you don’t have a project set up for your website yet, create one first and let Semrush crawl your site.

Once the project setup is complete, go to your website’s Site Audit, then go to Issues. Under the Notices tab, scroll down to check whether the orphan pages report is enabled.



If it hasn’t been enabled yet, connect your Google Analytics account in the Site Audit settings. The process is similar to Screaming Frog’s: it will prompt you to log in with your Google account, select the Profile, Property, and View of your chosen website, and click Save.


Once you complete the setup, Semrush will automatically collect data from Google Analytics. Unlike Screaming Frog, you don’t have to connect Google Search Console to get orphan pages data in Semrush.

After a few minutes, refresh your browser and check the Issues tab again. Click the drop down menu Select an Issue and you will find Orphaned Pages (Google Analytics) under Notices.


Optimize or scrap?

Once you’ve collected all the orphan pages, it’s up to you what to do with them. You can list them in a Google Sheet.

  • If a page is still relevant, label it ‘optimize’ and find possible pages to link to it.
  • If a page was relevant but is now outdated, such as an old product or service, you can delete it and leave it as a 404. There’s no need to redirect it, as it doesn’t carry any link value.
  • If a page is purposely left out, you can leave it as it is.

Here’s a sample template that you could use:

Sample orphan pages tracking template

While orphan pages can be harmless to your website’s overall rankings and SEO value, they become a critical issue when important pages are left out. Include orphan page monitoring in your regular website maintenance audits, and keep a healthy site structure and a good flow of link juice by internally linking your pages to each other.

Fix 404 pages

Screaming frog 404

The 404 error is an HTTP status code sent by a web server to the browser when it cannot find the webpage a user wants to access. It usually displays a "Page Not Found" message to users.


To find them, open your Screaming Frog and input your website URL, then start the crawl.

Then click Client Error (4xx) to see the status codes.
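Screaming Frog groups status codes by their hundreds digit, which is what "Client Error (4xx)" means. The same bucketing logic can be sketched in Python (the crawl results below are hypothetical):

```python
def status_bucket(code: int) -> str:
    """Group an HTTP status code into the classes crawlers report."""
    buckets = {
        2: "Success (2xx)",
        3: "Redirection (3xx)",
        4: "Client Error (4xx)",
        5: "Server Error (5xx)",
    }
    return buckets.get(code // 100, "Other")

# Hypothetical crawl results: URL -> status code
crawl = {"/about/": 200, "/old-page/": 404, "/broken/": 500}
for url, code in crawl.items():
    print(url, status_bucket(code))
```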

You can compile these and send them over to your team’s web developers. You can also 301-redirect the URLs you believe are still useful to the new pages created to replace them.

Secure your website

Switch to HTTPS by using a Secure Sockets Layer (SSL). SSL encrypts the link between a web server and a browser.

If you’re building a website from scratch, many hosting providers include an SSL certificate for free, depending on your plan.

If not, you can ask the web developers assigned to your website to purchase the SSL and activate it. You can also see how to do it by yourself through this in-depth guide by HubSpot.


Generate and submit an XML sitemap

Tools to use: Google Search Console

You can check whether your website has a sitemap by entering www.[website]/sitemap.xml in your browser’s address bar.

SEO Hacker Sitemaps

If your website doesn’t have one yet, you can use a plugin such as the Google Sitemap Generator Plugin by Arne Brachhold, or an online sitemap generator.

XML sitemaps

After you generate your sitemap, you need to submit it. To do that, you need to open Google Search Console, then click Sitemaps. You can then add your sitemap URL and submit.

Submit XML sitemap
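For reference, a minimal XML sitemap produced by these generators has the following shape; the URL and date here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/seo-philippines/</loc>
    <lastmod>2022-05-01</lastmod>
  </url>
</urlset>
```

Each <url> entry lists one page; <lastmod> is optional, but it helps search engines prioritize recently updated content.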

Further reading: Ultimate Sitemap SEO Guide

Check your robots.txt

To verify your website’s robots.txt, type www.[website]/robots.txt into your browser’s address bar.



If you don’t have one yet, open your Notepad and copy-paste the following:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.[website]/sitemap.xml

This is the default directive found in a basic robots.txt file.

You can add other rules based on your needs.
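For instance, if you wanted to keep crawlers out of a staging directory or block one specific bot, the extra rules might look like this; the path and bot name are hypothetical examples, not recommendations:

```text
# Keep all crawlers out of a hypothetical staging directory
User-agent: *
Disallow: /staging/

# Block one specific crawler from the whole site
User-agent: BadBot
Disallow: /
```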

Once you’re done, save it as a .txt file and upload it to the root directory of your website. You can ask the web developers to do so or you can contact your web hosting service provider to help you out.


Further reading: The Complete Guide to Robots.txt and Noindex Meta Tag

Check your robots meta tags

Tools to use: Screaming Frog

Robots meta tags are pieces of code used to instruct search engines what to follow, what not to follow, what to index, and what not to index. They are found in the <head> section of your webpage.

They usually look like this:

<meta name="robots" content="noindex">

You can opt to press CTRL+U on an individual page to view the page source and check for the meta tag, or you can use Screaming Frog to check the directives in your website.


Screaming Frog noindex

For example, /category/marketing/ has a directive of noindex, follow. Let’s look at it through the page source:

technical SEO noindex follow

There you go.

Further reading: What Meta Robots Tag Are For

Optimize URL slugs

Next, optimize your URL slugs.

SEO Philippines

When you create a slug, make sure that it contains your keyword, and that it’s as short yet descriptive as possible.

So instead of using /seo-in-the-philippines/ as our slug, we just use /seo-philippines/.


It’s much easier to understand and remember. It’s also a good web accessibility practice.
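The shortening logic is simple enough to sketch in code: lowercase the title, strip punctuation, and optionally drop filler words. A hedged Python sketch, where the stopword list is an illustrative assumption rather than a standard:

```python
import re

# Illustrative filler words to trim from slugs; adjust to taste.
STOPWORDS = {"a", "an", "and", "in", "of", "the", "to"}

def slugify(title: str, drop_stopwords: bool = True) -> str:
    """Turn a page title into a short, hyphenated URL slug."""
    words = re.sub(r"[^a-z0-9\s-]", "", title.lower()).split()
    if drop_stopwords:
        words = [w for w in words if w not in STOPWORDS]
    return "-".join(words)

print(slugify("SEO in the Philippines"))                        # seo-philippines
print(slugify("SEO in the Philippines", drop_stopwords=False))  # seo-in-the-philippines
```

Just be careful not to drop a word that carries meaning for the query you’re targeting.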

If the webpage already exists, you can still change the URL—but make sure to redirect the old URL to the new one. In our case, we use the 301 Redirects plugin, so our redirect settings look like this:

WP 301 Redirects
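If you’re not on WordPress or prefer not to add a plugin, the same 301 redirect can be declared in an Apache .htaccess file; the paths and domain below are placeholders:

```apache
# 301 (permanent) redirect from the old slug to the new one
Redirect 301 /seo-in-the-philippines/ https://www.example.com/seo-philippines/
```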

Provide clear and concise metadata

Meta tags are snippets of information that describe a page’s content; they appear in the source code, not on the page itself. Meta tags are essentially little content descriptors that help tell search engines what a web page is about.

According to WordStream, “The ‘meta’ stands for ‘metadata,’ which is the kind of data these tags provide—data about the data on your page.”

To optimize your meta tags, you can opt to change your title tags and meta descriptions—and of course, include your keyword.

A rule of thumb for title tags is to keep them between 50 and 60 characters.


For meta descriptions, keep them under 155 characters.
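In HTML, both live in the page’s <head>. A sketch with placeholder text, sized to the limits above:

```html
<head>
  <title>Technical SEO: The Complete Guide (with Checklist)</title>
  <meta name="description" content="Learn how to audit crawlability, indexing, and page speed with this step-by-step technical SEO checklist.">
</head>
```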

Technical SEO meta description

When you have a compelling and informative title tag and description, users become more likely to click on your page.

And when your page earns more clicks from the results, Google may treat it as more relevant to the query, which can help it rank higher.

Further reading: INFOGRAPH: 5 Steps to Create the Perfect Meta Description

Optimize site speed and performance

Tools to use: PageSpeed Insights (you can also choose Google Lighthouse (Chrome Extension) or Test My Site)

Another important aspect of technical SEO is page speed optimization. Site speed is a ranking factor for both desktop and mobile searches, so it’s important to make sure your pages load quickly.


Do take note that some websites use the terms site speed and page speed interchangeably, while others treat them as different metrics.

Either way, it’s important that the content on your webpage loads fast. This prevents your site visitors from leaving your website and going to your competitors.

Optimizing images

If you use a large image size, it will greatly affect your site speed. In this case, you need to reduce your image size.

The trouble with reducing your image size is that it can also reduce quality. The good news is that there are plugins and scripts that let you decrease your file size with little visible loss in quality.

To reduce image size without compromising quality, choose the right combination of file format and compression level. As a rough rule, JPEG at a medium compression level works well for photos, while PNG is better suited to graphics and logos with few colors.


You can use the following tools when optimizing images for your webpage:

  • Adobe Photoshop
  • Gimp
  • FileOptimizer

ImageResizer dashboard

Enabling browser caching

The concept of browser caching is straightforward: it takes whatever files you define as files that don’t change often (such as your company logo and website menu) and downloads them once to the visitor’s browser.

This way, they don’t have to be redownloaded every single time they visit your website, making your webpages load much faster.

There are three ways to go about this:

Ask your web hosting provider

You can contact your web hosting provider and have them edit your site’s .htaccess file. That way, you don’t have to touch anything in your website.

Edit the .htaccess file yourself

The header says it all. I wouldn’t recommend this unless you have knowledge on how to troubleshoot in case you make a mistake. If you’re using Yoast, here’s their guide.

Use a plugin

There are various plugins you can choose from that you’ll just download and activate. Quick and easy as pie. Just check with your web developers to confirm if they won’t be causing any issues with other plugins.
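If you or your developers do end up editing .htaccess directly, browser caching on Apache is commonly configured with mod_expires. A hedged sketch; the lifetimes are illustrative and should be tuned to how often each file type actually changes:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Static assets that rarely change can be cached for a long time
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```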

Enable Gzip Compression

When a browser loads your website, it requires downloading all the relevant files stored on your server. If your files like HTML, PHP, CSS, and Javascript are too large to load, chances are it will affect your site speed as well.

This is where Gzip comes into play. It compresses your files, often reducing them to a fraction (sometimes as little as 30%) of their original size, so your webpage loads faster.
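On Apache servers, Gzip-style compression is typically enabled with mod_deflate. A minimal sketch covering common text-based types:

```apache
<IfModule mod_deflate.c>
  # Compress text-based responses before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css text/xml application/javascript application/json
</IfModule>
```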

You can use the Gzip Compression Test tool to check if Gzip is enabled on your website, or if you simply want to check your website’s Gzip compression rate.


A word of caution: Gzip is meant for text-based files, not images. Image formats like JPEG and PNG are already compressed, so Gzipping them gains you little; to shrink images, use the image optimization techniques above instead of over-compressing them into low-quality, pixelated versions.


There are many other aspects you need to check when it comes to optimizing your site speed. I’ve only included three in this post. For our in-depth guide to site-speed optimization, check out:

Further reading: Ultimate Guide to Site-Speed Optimization

Also, keep in mind that consistency is key when it comes to site speed optimization. Keeping your site fast requires adding scripts or plugins on occasion, so it’s best to run a site audit every month to catch issues before they cause downtime.

Site Speed Optimization Package

Fix content issues

If you’re having trouble with technical SEO, there are a few common issues that could be to blame.

One of the most common is duplicate content. This can happen if you have multiple pages with similar or identical content.

To fix this, you’ll need to either apply a 301 redirect or add rel="canonical" tags to your pages.
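A canonical tag is a single line in the <head> of the duplicate page, pointing at the version you want indexed; the URLs here are placeholders:

```html
<!-- On the duplicate page, declare the preferred version -->
<link rel="canonical" href="https://www.example.com/seo-philippines/">
```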


Another common issue is thin content. This happens when a page has very little useful content.

Different websites have their own opinions on how long content should be—Yoast recommends >300 words for regular posts or pages, while HubSpot claims that your content length shouldn’t go lower than 2,100 words.

Which one should you follow? I would argue that it’s not exactly a matter of word count, but of content quality.

  • Does your blog post go in-depth and answer what your target audience needs to know about the topic?
  • Does it match search intent?

I suggest focusing on those instead of word vomiting irrelevant content just to “fix” thin content.

Further reading: How TO NOT Screw up Your Canonical Tags and Search Intent SEO for Beginners

Implement structured data

According to Yoast,

“Where Schema is the language in which you present your content, structured data is the actual data you provide. It describes the content on your page and what actions site visitors can perform with this content. This is the input you give search engines to get a better understanding of your pages.”

Technical SEO schema

Simply put, it helps make your page more understandable for search engines and users.
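As a concrete illustration, structured data is most often embedded as a JSON-LD script in the page’s <head>. The example below describes a hypothetical article; the headline, name, and date are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO: The Complete Guide",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2022-05-01"
}
</script>
```

You can paste markup like this into Google’s Rich Results Test to check whether it parses correctly.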


To implement structured data, there are a number of things you can do, but I would suggest using a plugin like Yoast or WPSSO Core.

It gives you the option to select entity types and automatically applies the schema for you, instead of you having to manually choose a template and input the data.

Schema Markup and Structured data

Further reading: How to Create Structured Data Markup for Rich Snippets

Prioritize mobile-friendliness

Percentage of Global Mobile Traffic

Source: BroadbandSearch

The mobile-friendliness of a website is measured by how well it is designed and optimized for mobile devices such as smartphones and tablets.

Just looking at the graph above, we can confidently say it’s integral to optimize for mobile.


Install an AMP (Accelerated Mobile Pages) plugin to your website

AMP for WordPress

Instead of having to manually work on optimizing your site for mobile, we recommend installing the AMP for WordPress plugin.

This is what we use here in SEO Hacker, and it’s made our job a lot easier when it comes to optimizing for mobile.

  • Install the AMP WordPress plugin.
  • Activate the plugin. It will append /amp to all your page URLs, but it won’t redirect visitors to them.
  • Edit your .htaccess file—you could use an FTP program to do this. I personally use Filezilla.
  • (Optional) Just in case you want to check if your AMP pages are working across the board—in your .htaccess file, paste this code:

RewriteEngine On
RewriteCond %{REQUEST_URI} !/amp$ [NC]
RewriteCond %{HTTP_USER_AGENT} (android|blackberry|googlebot-mobile|iemobile|iphone|ipod|opera mobile|palmos|webos) [NC]
RewriteRule ^([a-zA-Z0-9-]+)([/]*)$ https://www.yourdomain.com/$1/amp [L,R=302]

Note that you have to change the domain in the rule to your own site’s domain name. I explicitly made the redirect a 302 because we don’t want all the link equity to be passed on to your /amp pages, since they’re merely accelerated mobile versions of the originals.

  • Edit the CSS to make your Accelerated Mobile Pages look and feel more like your site. You can edit the CSS via FTP by going to wp-content > plugins > AMP > template.php
  • Add rel="canonical" tags pointing to your original pages, just to be sure to keep anything Panda-related off your back.

That’s it!

SEO Hacker AMP

You can see that SEO Hacker’s mobile version still looks and feels like our desktop page design, minus all the fluff.

How can I make AMP work for my non-WordPress site?

You will have to go to the AMP Project’s site and learn how to integrate it via hard-code, hands-on.


If you want to know more about AMP, Moz’s Whiteboard Friday does a swell job on explaining it further:

Verifying your AMP pages on Google Search Console

Once you’ve set up AMP on your website, Google will start crawling and indexing your AMP pages. In a few days, you should see an AMP section under Enhancements in your Google Search Console.

Technical SEO AMP SEO Hacker GSC

Google Search Console will notify you if there are any errors in your AMP pages. Just like other coverage reports, it is divided into three categories: Error, Valid with Warnings, and Valid. When you first apply AMP to your website, there is a high chance that most of your pages will fall under Valid with Warnings.

Don’t worry, your AMP pages will still be available to users. It just means some data that could further enhance your AMP pages, such as structured data or image sizes, is missing.

Structured Data on AMP pages

Aside from loading speeds and user experience on mobile, adding structured data on your AMP pages also makes them eligible for Rich Results.

If you’re using WordPress, there are plugins that enable structured data on AMP pages. Google highly recommends having structured data on both the original page and the AMP version.


Further reading: The SEO Hacker Mobile Optimization Checklist

How do I know if my technical SEO is working?

There are a number of metrics you can measure to gauge the effectiveness of your technical SEO.

Organic traffic

This measures the number of visitors coming to your site from search engines. If you’re seeing an increase in organic traffic, it’s likely that your technical SEO is working.

Click-through rate (CTR) from SERPs

This measures the percentage of people who click on your listing when it appears in search results. A high CTR indicates that your listing is relevant and appealing to users, and that your technical SEO is working.

You can also check if you’re showing up on the Knowledge Panels and other snippets.

Loading times

If you’ve optimized for page speed correctly, you will notice that your site is a lot faster and smoother than before.


Troubleshooting common technical SEO issues

I asked some folks over at Reddit what they wanted to know about technical SEO, and some of them gave me questions that I wanted to include here.

How to fix unused CSS/Javascripts in WordPress

Reduce unused CSS and javascript

Unused code can make your website load slower, but thankfully there is a way to at least reduce this issue.

Use WP Rocket

WP Rocket is a plugin you can use to remove unused CSS and delay JavaScript execution.

You can check out and follow this really good tutorial from WPBeginner.

How to fix XML sitemap errors

An XML sitemap is a file that lists all the important pages of your website. They ensure that Google is able to crawl and index your web pages.

XML sitemaps also help search engines understand your site structure.


Essentially, a good XML sitemap will greatly benefit your website. However, if it’s not set up correctly, you risk Google not properly recognizing your pages.

Here are some pitfalls that you need to avoid with XML sitemaps:

Submitted URL has crawl issues

This is one of the most common XML sitemap issues you will come across in Google Search Console.

This means that your sitemap has listed a page with a known crawl error, but Google will not tell you exactly what kind of error it was.

You will need to reanalyze your sitemap for any error that is undetected. The most common crawl issue errors are:

  • Robots.txt blocking crawlers
  • Error pages other than 404, such as 403 (Forbidden) and 401 (Unauthorized)
  • Javascript or CSS blocked by search engines

You can address these crawl issues with the 11 steps I outlined earlier.

Then go to Google Search Console and resubmit your sitemap.

Sitemap size error

As we discussed earlier in this post, size matters in SEO. Your sitemap must NOT:

  • Be larger than 50MB (uncompressed)
  • Contain more than 50,000 URLs
  • List more than 1,000 images per URL

If you have a simple site, your sitemap’s size shouldn’t be an issue.

However, if you have (for example) an eCommerce website that’s growing fast, it’s best that you create separate sitemaps for every 10,000 URLs you have.

Fewer URLs mean fewer crawl issues for you.
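Splitting a large URL list into separate sitemaps is a simple chunking job. A Python sketch, using the 10,000-URL batches suggested above (the product URLs are made up):

```python
def split_for_sitemaps(urls, max_per_file=10_000):
    """Split a flat URL list into chunks, one chunk per sitemap file."""
    return [urls[i:i + max_per_file] for i in range(0, len(urls), max_per_file)]

# A hypothetical fast-growing store with 25,000 product URLs
urls = [f"/product/{i}/" for i in range(25_000)]
chunks = split_for_sitemaps(urls)
print(len(chunks), [len(c) for c in chunks])  # 3 [10000, 10000, 5000]
```

Each chunk then becomes its own sitemap file, and a sitemap index file lists them all for search engines.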

I suggest that you take a look at Common XML sitemap errors by Yoast and Polishing a sitemap: fixing errors, weeding out trash pages, and finding hidden gems by SE Ranking for a more in-depth treatment of this issue.

How to handle multiple connected websites

The Redditor who asked this question works as a software developer who put their apps on different <appname>.app websites.

It’s generally not recommended to have multiple domains or websites for the following reasons:

  • They can compete for the same keywords.
  • They can be expensive.
  • It can take a while for you to rank the websites because you’ll be doing a full SEO strategy for each of them.

I recommend first figuring out what the purpose of your website is. For example, is it to showcase your work as a software developer?

In that case, you can put your apps under one domain, then create subdirectories in your website for the various apps you want to feature.

This works especially well if your apps are connected to one another, or if they generally fall under a specific theme (e.g., productivity tools).

HubSpot CRM

For example, under HubSpot you have various software products you can check out, and they’re all under the same domain.

If they are completely different and you want to market them as such (and you have the time and energy to SEO each of them), then you can go ahead and put them in different websites.

Key takeaway

Technical SEO is one of the most important aspects of ranking your website properly. It covers all the behind-the-scenes elements that are necessary for Google and other search engines to understand your site correctly.

Without proper technical SEO, it’s difficult to rank at all, let alone achieve high SERP positions. That’s why we created this guide—to help you audit and troubleshoot any issues on your own website and improve your rankings with these 11 steps.


If you want us to take care of your technical SEO for you, check out our SEO Services Package or contact us today for a free consultation.

Source link

Keep an eye on what we are doing
Be the first to get latest updates and exclusive content straight to your email inbox.
We promise not to spam you. You can unsubscribe at any time.


WordPress Releases A Performance Plugin For “Near-Instant Load Times”





WordPress released an official plugin that adds support for a cutting edge technology called speculative loading that can help boost site performance and improve the user experience for site visitors.

Speculative Loading

Rendering means constructing the entire webpage so that it can be displayed: when your browser downloads the HTML, images, and other resources and puts them together into a webpage, that’s rendering. Prerendering is putting that webpage together (rendering it) in the background before the user navigates to it.

What this plugin does is enable the browser to prerender the entire webpage that a user might navigate to next. The plugin does that by anticipating which webpage the user might visit based on where they are hovering.

Chrome states a preference for prerendering only when there is at least an 80% probability of the user navigating to another webpage. The official Chrome support page for prerendering explains:

“Pages should only be prerendered when there is a high probability the page will be loaded by the user. This is why the Chrome address bar prerendering options only happen when there is such a high probability (greater than 80% of the time).”

There is also a caveat on that same developer page that prerendering may not happen depending on user settings, memory usage, and other scenarios (more details below about how analytics handles prerendering).


The Speculation Rules API solves a problem that previous solutions could not: in the past, they simply prefetched resources like JavaScript and CSS but did not actually prerender the entire webpage.

The official WordPress announcement explains it like this:

“The Speculation Rules API is a new web API that solves the above problems. It allows defining rules to dynamically prefetch and/or prerender URLs of certain structure based on user interaction, in JSON syntax—or in other words, speculatively preload those URLs before the navigation.

This API can be used, for example, to prerender any links on a page whenever the user hovers over them. Also, with the Speculation Rules API, “prerender” actually means to prerender the entire page, including running JavaScript. This can lead to near-instant load times once the user clicks on the link as the page would have most likely already been loaded in its entirety. However that is only one of the possible configurations.”

The new WordPress plugin adds support for the Speculation Rules API. The Mozilla developer pages (MDN), a great resource for web platform documentation, describe it like this:

“The Speculation Rules API is designed to improve performance for future navigations. It targets document URLs rather than specific resource files, and so makes sense for multi-page applications (MPAs) rather than single-page applications (SPAs).

The Speculation Rules API provides an alternative to the widely-available <link rel=”prefetch”> feature and is designed to supersede the Chrome-only deprecated <link rel=”prerender”> feature. It provides many improvements over these technologies, along with a more expressive, configurable syntax for specifying which documents should be prefetched or prerendered.”
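To illustrate the JSON syntax both quotes mention, a document-rule configuration looks roughly like this. The URL patterns below are hypothetical examples for a typical WordPress site, not the exact rules the plugin emits:

```html
<script type="speculationrules">
{
  "prerender": [{
    "where": {
      "and": [
        { "href_matches": "/*" },
        { "not": { "href_matches": "/wp-admin/*" } }
      ]
    },
    "eagerness": "moderate"
  }]
}
</script>
```

Document rules like the "where" clause are the newer part of the API, which is consistent with the plugin requiring a recent Chrome version rather than the original Chrome 108 baseline.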


See also: Are Websites Getting Faster? New Data Reveals Mixed Results

Performance Lab Plugin

The new plugin was developed by the official WordPress performance team which occasionally rolls out new plugins for users to test ahead of possible inclusion into the actual WordPress core. So it’s a good opportunity to be first to try out new performance technologies.

The new WordPress plugin is by default set to prerender “WordPress frontend URLs” which are pages, posts, and archive pages. How it works can be fine-tuned under the settings:

Settings > Reading > Speculative Loading

Browser Compatibility

The Speculation Rules API has been supported since Chrome 108; however, the specific rules used by the new plugin require Chrome 121 or higher. Chrome 121 was released in early 2024.

Browsers that do not support the API will simply ignore the rules, and the plugin will have no effect on the user experience in those browsers.

Check out the new Speculative Loading WordPress plugin developed by the official core WordPress performance team.


How Analytics Handles Prerendering

A WordPress developer asked in the comments how analytics would handle prerendering; another commenter answered that it’s up to the analytics provider to detect a prerender and not count it as a page load or site visit.

Fortunately, both Google Analytics and Google Publisher Tag (GPT) are able to handle prerenders. The Chrome developers support page has a note about how analytics handles prerendering:

“Google Analytics handles prerender by delaying until activation by default as of September 2023, and Google Publisher Tag (GPT) made a similar change to delay triggering advertisements until activation as of November 2023.”
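The deferral Chrome describes relies on a browser flag: a prerendered page reports document.prerendering as true until the user actually navigates to it, at which point the browser fires a "prerenderingchange" event. A minimal sketch of the pattern—the helper name onPageActivated and the feature-detection guard are mine, not from any analytics library:

```javascript
// Run analytics work only once the page is actually shown to the user.
// While a page is being prerendered, document.prerendering is true;
// the browser fires "prerenderingchange" on activation.
function onPageActivated(callback) {
  if (typeof document !== "undefined" && document.prerendering) {
    document.addEventListener("prerenderingchange", callback, { once: true });
  } else {
    // Not prerendering (or API unsupported): the page is already visible.
    callback();
  }
}

// Example: defer a (hypothetical) page-view beacon until activation.
onPageActivated(() => {
  // sendPageView();  // your analytics call would go here
});
```

This way a prerender that the user never clicks through to is never counted as a visit, which is essentially what Google Analytics now does by default.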

Possible Conflict With Ad Blocker Extensions

There are a couple of things to be aware of about this plugin, aside from the fact that it’s an experimental feature that requires Chrome 121 or higher.

A WordPress plugin developer commented that this feature may not work in browsers running the uBlock Origin ad-blocking extension.

Download the plugin:
Speculative Loading Plugin by the WordPress Performance Team

Read the announcement at WordPress
Speculative Loading in WordPress


See also: WordPress, Wix & Squarespace Show Best CWV Rate Of Improvement



10 Paid Search & PPC Planning Best Practices





Whether you are new to paid media or reevaluating your efforts, it’s critical to review your performance and best practices for your overall PPC marketing program, accounts, and campaigns.

Revisiting your paid media plan is an opportunity to ensure your strategy aligns with your current goals.

Reviewing best practices for pay-per-click is also a great way to keep up with trends and improve performance with newly released ad technologies.

As you review, you’ll find new strategies and features to incorporate into your paid search program, too.

Here are 10 PPC best practices to help you adjust and plan for the months ahead.


1. Goals

When planning, it is best practice to define goals for the overall marketing program, ad platforms, and at the campaign level.

Defining primary and secondary goals guides the entire PPC program. For example, your primary conversion may be to generate leads from your ads.

You’ll also want to look at secondary goals, such as brand awareness, which sits higher in the sales funnel and can drive the interest that ultimately produces sales leads.

2. Budget Review & Optimization

Some advertisers get stuck in a rut and forget to review and reevaluate the distribution of their paid media budgets.

To best utilize budgets, consider the following:

  • Reconcile planned vs. actual spend for each account or campaign on a regular basis. Depending on the budget size, monthly, quarterly, or semiannual reviews will work as long as you can hit your budget numbers.
  • Determine if there are any campaigns that should be eliminated at this time to free up the budget for other campaigns.
  • Is there additional traffic available to capture and grow results for successful campaigns? The ad platforms often include a tool that will provide an estimated daily budget with clicks and costs. This is just an estimate to show more click potential if you are interested.
  • If other paid media channels perform mediocrely, does it make sense to shift those budgets to another?
  • For the overall paid search and paid social budget, can your company invest more in the positive campaign results?
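The first bullet above—reconciling planned vs. actual spend—is simple arithmetic that is worth scripting so it happens on schedule. A minimal sketch assuming even daily pacing; the function name and result keys are illustrative, not from any ad platform's API:

```python
from datetime import date

def budget_pace(planned, spent, period_start, period_end, today):
    """Compare actual spend against where spend *should* be at this
    point in the budget period, assuming even daily pacing."""
    total_days = (period_end - period_start).days + 1
    days_elapsed = (today - period_start).days + 1
    expected = planned * days_elapsed / total_days
    return {
        "expected_spend": round(expected, 2),
        # Positive variance = overspending vs. plan so far.
        "variance": round(spent - expected, 2),
        # Naive straight-line projection of end-of-period spend.
        "projected_total": round(spent / days_elapsed * total_days, 2),
    }
```

A positive variance flags overspending early enough to trim daily budgets mid-period, while a projected total well under plan suggests there is headroom to capture more traffic.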

3. Consider New Ad Platforms

If you can shift or increase your budgets, why not test out a new ad platform? Knowing your audience and where they spend time online will help inform your decision when choosing ad platforms.

Go beyond your comfort zone in Google, Microsoft, and Meta Ads.


Here are a few other advertising platforms to consider testing:

  • LinkedIn: Most appropriate for professional and business targeting. LinkedIn audiences can also be reached through Microsoft Ads.
  • TikTok: Younger Gen Z audience (16 to 24), video.
  • Pinterest: Products, services, and consumer goods with a female-focused target.
  • Snapchat: Younger demographic (13 to 35), video ads, app installs, filters, lenses.

Need more detailed information and even more ideas? Read more about the 5 Best Google Ads Alternatives.

4. Top Topics in Google Ads & Microsoft Ads

Recently, trends in search and social ad platforms have presented opportunities to connect with prospects more precisely, creatively, and effectively.

Don’t overlook newer targeting and campaign types you may not have tried yet.

  • Video: Incorporating video into your PPC accounts takes some planning for the goals, ad creative, targeting, and ad types. There is a lot of opportunity here as you can simply include video in responsive display ads or get in-depth in YouTube targeting.
  • Performance Max: This automated campaign type serves across all of Google’s ad inventory. Microsoft Ads recently released PMAX so you can plan for consistency in campaign types across platforms. Do you want to allocate budget to PMax campaigns? Learn more about how PMax compares to search.
  • Automation: While AI can’t replace human strategy and creativity, it can help manage your campaigns more easily. During planning, identify which elements you want to automate, such as automatically created assets and/or how to successfully guide the AI in the Performance Max campaigns.

While exploring new features, check out some hidden PPC features you probably don’t know about.

5. Revisit Keywords

The role of keywords has evolved over the past several years with match types being less precise and loosening up to consider searcher intent.

For example, [exact match] keywords previously would literally match with the exact keyword search query. Now, ads can be triggered by search queries with the same meaning or intent.

A great planning exercise is to lay out keyword groups and evaluate if they are still accurately representing your brand and product/service.


Review search term queries triggering ads to discover trends and behavior you may not have considered. It’s possible this has impacted performance and conversions over time.

Critical to your strategy:

  • Review the current keyword rules and determine if this may impact your account in terms of close variants or shifts in traffic volume.
  • Brush up on how keywords work in each platform because the differences really matter!
  • Review search term reports more frequently for irrelevant queries that may pop up from match type changes. Incorporate these into match type adjustments or negative keyword lists as appropriate.

6. Revisit Your Audiences

Review the audiences you selected in the past, especially given so many campaign types that are intent-driven.

Automated features that expand your audience could be helpful, but keep an eye out for performance metrics and behavior on-site post-click.

Remember, an audience is simply a list of users who are grouped together by interests or behavior online.

Therefore, there are unlimited ways to mix and match those audiences and target per the sales funnel.

Here are a few opportunities to explore and test:

  • LinkedIn user targeting: Besides LinkedIn, this can be found exclusively in Microsoft Ads.
  • Detailed Demographics: Marital status, parental status, home ownership, education, household income.
  • In-market and custom intent: Searches and online behavior signaling buying cues.
  • Remarketing: Your website visitors, people who interacted with your ads, and video/YouTube viewers.

Note: This varies per the campaign type and seems to be updated frequently, so make this a regular check-point in your campaign management for all platforms.

7. Organize Data Sources

You will likely be running campaigns on different platforms with combinations of search, display, video, etc.

Looking back at your goals, what is the important data, and which platforms will you use to review and report? Can you get the majority of data in one analytics platform to compare and share?

Millions of companies use Google Analytics, which is a good option for centralized viewing of advertising performance, website behavior, and conversions.

8. Reevaluate How You Report

Have you been using the same performance report for years?

It’s time to reevaluate your essential PPC key metrics and replace or add that data to your reports.

There are two great resources to kick off this exercise:


Your objectives in reevaluating the reporting are:

  • Are we still using this data? Is it still relevant?
  • Is the data we are viewing actionable?
  • What new metrics should we consider adding that we haven’t thought about?
  • How often do we need to see this data?
  • Do the stakeholders receiving the report understand what they are looking at (aka data visualization)?

Adding new data should be purposeful, actionable, and helpful in making decisions for the marketing plan. It’s also helpful to decide what type of data is good to see as “deep dives” as needed.

9. Consider Using Scripts

The current ad platforms have plenty of AI recommendations and automated rules, and there is no shortage of third-party tools that can help with optimizations.

Scripts are another method for advertisers with large accounts (or some scripting skills) to automate report generation and repetitive tasks in their Google Ads accounts.

Navigating the world of scripts can seem overwhelming, but a good place to start is a post here on Search Engine Journal that provides use cases and resources to get started with scripts.

Luckily, you don’t need a Ph.D. in computer science — there are plenty of resources online with free or templated scripts.

10. Seek Collaboration

Another effective planning tactic is to seek out friendly resources and second opinions.


Much of the skill and science of PPC management is unique to the individual or agency, so there is no shortage of ideas to share between you.

You can visit the Paid Search Association, a resource for paid ad managers worldwide, to make new connections and find industry events.

Preparing For Paid Media Success

Strategies should be based on clear and measurable business goals. Then, you can evaluate the current status of your campaigns based on those new targets.

Your paid media strategy should also be built with an eye for both past performance and future opportunities. Look backward and reevaluate your existing assumptions and systems while investigating new platforms, topics, audiences, and technologies.

Also, stay current with trends and keep learning. Check out ebooks, social media experts, and industry publications for resources and motivational tips.

More resources: 


Featured Image: Vanatchanan/Shutterstock



Google Limits News Links In California Over Proposed ‘Link Tax’ Law





Google announced that it plans to reduce access to California news websites for a portion of users in the state.

The decision comes as Google prepares for the potential passage of the California Journalism Preservation Act (CJPA), a bill requiring online platforms like Google to pay news publishers for linking to their content.

What Is The California Journalism Preservation Act?

The CJPA, introduced in the California State Legislature, aims to support local journalism by creating what Google refers to as a “link tax.”

If passed, the Act would force companies like Google to pay media outlets when sending readers to news articles.

However, Google believes this approach needs to be revised and could harm rather than help the news industry.


Jaffer Zaidi, Google’s VP of Global News Partnerships, stated in a blog post:

“It would favor media conglomerates and hedge funds—who’ve been lobbying for this bill—and could use funds from CJPA to continue to buy up local California newspapers, strip them of journalists, and create more ghost papers that operate with a skeleton crew to produce only low-cost, and often low-quality, content.”

Google’s Response

To assess the potential impact of the CJPA on its services, Google is running a test with a percentage of California users.

During this test, Google will remove links to California news websites that the proposed legislation could cover.

Zaidi states:

“To prepare for possible CJPA implications, we are beginning a short-term test for a small percentage of California users. The testing process involves removing links to California news websites, potentially covered by CJPA, to measure the impact of the legislation on our product experience.”

Google Claims Only 2% of Search Queries Are News-Related

Zaidi highlighted people’s changing news consumption habits and their effect on Google search queries (emphasis mine):

“It’s well known that people are getting news from sources like short-form videos, topical newsletters, social media, and curated podcasts, and many are avoiding the news entirely. In line with those trends, just 2% of queries on Google Search are news-related.”

Despite the low percentage of news queries, Google wants to continue helping news publishers gain visibility on its platforms.


However, the “CJPA as currently constructed would end these investments,” Zaidi says.

A Call For A Different Approach

In its current form, Google maintains that the CJPA undermines news in California and could leave all parties worse off.

The company urges lawmakers to consider alternative approaches supporting the news industry without harming smaller local outlets.

Google argues that, over the past two decades, it’s done plenty to help news publishers innovate:

“We’ve rolled out Google News Showcase, which operates in 26 countries, including the U.S., and has more than 2,500 participating publications. Through the Google News Initiative we’ve partnered with more than 7,000 news publishers around the world, including 200 news organizations and 6,000 journalists in California alone.”

Zaidi suggested that a healthy news industry in California requires support from the state government and a broad base of private companies.

As the legislative process continues, Google is willing to cooperate with California publishers and lawmakers to explore alternative paths that would allow it to continue linking to news.


Featured Image: Ismael Juan/Shutterstock


