How to Do an SEO Log File Analysis [Template Included]

Log files have been receiving increasing recognition from technical SEOs over the past five years, and for good reason.

They’re the most trustworthy source of information for understanding which URLs search engines have crawled, which is critical information when diagnosing technical SEO problems.

Google itself recognizes their importance, releasing new features in Google Search Console and making it easy to see samples of data that would previously only be available by analyzing logs.

Crawl stats report; key data above and line graph showing trend of crawl requests below

In addition, Google Search Advocate John Mueller has publicly stated how much good information log files hold.

With all this hype around the data in log files, you may want to understand logs better, how to analyze them, and whether the sites you’re working on will benefit from them.

This article will answer all of that and more.

First, what is a server log file?

A server log file is a file created and updated by a server that records the activities it has performed. A popular server log file is an access log file, which holds a history of HTTP requests to the server (by both users and bots).

When a non-developer mentions a log file, access logs are the ones they’ll usually be referring to.

Developers, however, find themselves spending more time looking at error logs, which report issues encountered by the server.

The above is important: If you request logs from a developer, the first thing they’ll ask is, “Which ones?”

Therefore, always be specific with log file requests. If you want logs to analyze crawling, ask for access logs.

Access log files contain lots of information about each request made to the server, such as the following:

  • IP addresses
  • User agents
  • URL path
  • Timestamps (when the bot/browser made the request)
  • Request type (GET or POST)
  • HTTP status codes
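
To make these fields concrete, here’s a minimal sketch (in Python) of parsing one line in the common Apache/Nginx “combined” log format. The regex, field names, and sample line are illustrative; real server configs may differ:

```python
import re

# Illustrative pattern for the Apache/Nginx "combined" log format.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

def parse_line(line: str) -> dict:
    """Return the request fields from one access log line (empty dict if unparseable)."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else {}

# A made-up sample line showing a Googlebot request.
sample = ('66.249.66.1 - - [03/Jan/2022:10:15:30 +0000] '
          '"GET /features/feature-page/ HTTP/1.1" 200 5123 '
          '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

fields = parse_line(sample)
```

Each named group maps to one of the bullets above: IP address, timestamp, request type, URL path, status code, and user agent.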

What servers include in access logs varies by server type and by what developers have configured the server to store. Common log file formats include the following:

  • Apache format – This is used by Nginx and Apache servers.
  • W3C format – This is used by Microsoft IIS servers.
  • ELB format – This is used by Amazon Elastic Load Balancing.
  • Custom formats – Many servers support outputting a custom log format.

Other forms exist, but these are the main ones you’ll encounter.

How log files benefit SEO

Now that we’ve got a basic understanding of log files, let’s see how they benefit SEO.

Here are some key ways:

  • Crawl monitoring – You can see the URLs search engines crawl and use this to spot crawler traps, look out for crawl budget wastage, or better understand how quickly content changes are picked up.
  • Status code reporting – This is particularly useful for prioritizing fixing errors. Rather than knowing you’ve got a 404, you can see precisely how many times a user/search engine is visiting the 404 URL.
  • Trends analysis – By monitoring crawling over time to a URL, page type/site section, or your entire site, you can spot changes and investigate potential causes.
  • Orphan page discovery – You can cross-analyze data from log files and a site crawl you run yourself to discover orphan pages.

All sites will benefit from log file analysis to some degree, but the amount of benefit varies massively depending on site size.

This is because log files primarily benefit sites by helping you better manage crawling. Google itself states that managing the crawl budget is something larger-scale or frequently changing sites will benefit from.

Excerpt of Google article

The same is true for log file analysis.

For example, smaller sites can likely use the “Crawl stats” data provided in Google Search Console and receive all of the benefits mentioned above—without ever needing to touch a log file.

Gif of Crawl stats report being scrolled down gradually

Granted, Google won’t provide you with all crawled URLs (as log files do), and trend analysis is limited to three months of data.

However, smaller sites that change infrequently also need less ongoing technical SEO. It’ll likely suffice to have a site auditor discover and diagnose issues.

For example, a cross-analysis from a site crawler, XML sitemaps, Google Analytics, and Google Search Console will likely discover all orphan pages.

You can also use a site auditor to discover error status codes from internal links.

There are a few key reasons I’m pointing this out:

  • Access log files aren’t easy to get a hold of (more on this next).
  • For small sites that change infrequently, the benefit of log files is smaller, so SEO effort is usually better spent elsewhere.

How to access your log files

In most cases, to analyze log files, you’ll first have to request access to log files from a developer.

The developer is then likely going to have a few issues, which they’ll bring to your attention. These include:

  • Partial data – Log files can include partial data scattered across multiple servers. This usually happens when developers use various servers, such as an origin server, load balancers, and a CDN. Getting an accurate picture of all logs will likely mean compiling the access logs from all servers.
  • File size – Access log files for high-traffic sites can end up in terabytes, if not petabytes, making them hard to transfer.
  • Privacy/compliance – Log files include user IP addresses, which are personally identifiable information (PII). User information may need to be removed before logs can be shared with you.
  • Storage history – Due to file size, developers may have configured access logs to be stored for only a few days, making them less useful for spotting trends and issues.

These issues raise the question of whether storing, merging, filtering, and transferring log files are worth the dev effort, especially if developers already have a long list of priorities (which is often the case).

Developers will likely put the onus on the SEO to build a case for why they should invest time in this, and you’ll need to weigh that case against your other SEO priorities.

These issues are precisely why log file analysis doesn’t happen frequently.

Log files you receive from developers are also often formatted in ways that popular log file analysis tools don’t support, making analysis more difficult.

Thankfully, there are software solutions that simplify this process. My favorite is Logflare, a Cloudflare app that can store log files in a BigQuery database that you own.

How to analyze your log files

Now it’s time to start analyzing your logs.

I’m going to show you how to do this in the context of Logflare specifically; however, the tips on how to use log data will work with any logs.

The template I’ll share shortly also works with any logs. You’ll just need to make sure the columns in the data sheets match up.

1. Start by setting up Logflare (optional)

Logflare is simple to set up. And with the BigQuery integration, it stores data long term. You’ll own the data, making it easily accessible for everyone.

There’s one difficulty. You need to swap out your domain name servers to use Cloudflare ones and manage your DNS there.

For most, this is fine. However, if you’re working with a more enterprise-level site, it’s unlikely you can convince the server infrastructure team to change the name servers to simplify log analysis.

I won’t go through every step on how to get Logflare working. But to get started, all you need to do is head to the Cloudflare Apps part of your dashboard.

"Apps" in a sidebar

And then search for Logflare.

"Logflare" appearing in search field on top-right corner, and the app appearing below in the results

The setup past this point is self-explanatory (create an account, give your project a name, choose the data to send, etc.). The only extra part I recommend following is Logflare’s guide to setting up BigQuery.

Bear in mind, however, that BigQuery does have a cost that’s based on the queries you do and the amount of data you store.


 It’s worth noting that one significant advantage of the BigQuery backend is that you own the data. That means you can circumvent PII issues by configuring Logflare not to send PII like IP addresses and delete PII from BigQuery using an SQL query.
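
As an illustration of the “delete PII” step, here’s a simple sketch of redacting IPv4 addresses from raw log text before sharing it. This is a hypothetical regex approach, not a complete PII solution (it ignores IPv6, for instance):

```python
import re

# Matches dotted-quad IPv4 addresses (illustrative; does not handle IPv6).
IPV4 = re.compile(r'\b(?:\d{1,3}\.){3}\d{1,3}\b')

def scrub_ips(log_text: str) -> str:
    """Replace every IPv4 address with a placeholder so the text holds no PII."""
    return IPV4.sub('0.0.0.0', log_text)

# Made-up sample line using a documentation IP range.
line = '203.0.113.7 - - [03/Jan/2022:10:15:30 +0000] "GET / HTTP/1.1" 200 512'
```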

2. Verify Googlebot

We’ve now stored log files (via Logflare or an alternative method). Next, we need to extract logs precisely from the user agents we want to analyze. For most, this will be Googlebot.

Before we do that, we have another hurdle to clear.

Many bots pretend to be Googlebot to get past firewalls (if you have one). In addition, some auditing tools do the same to get an accurate reflection of the content your site returns for the user agent, which is essential if your server returns different HTML for Googlebot, e.g., if you’ve set up dynamic rendering.

I’m not using Logflare

If you aren’t using Logflare, identifying Googlebot will require a reverse DNS lookup to verify the request did come from Google.

Google has a handy guide on validating Googlebot manually here.

Excerpt of Google article

You can do this on a one-off basis, using a reverse IP lookup tool and checking the domain name returned.

However, we need to do this in bulk for all rows in our log files. This also requires you to match IP addresses from a list provided by Google.

The easiest way to do this is by using server firewall rule sets maintained by third parties that block fake bots (resulting in fewer/no fake Googlebots in your log files). A popular one for Nginx is “Nginx Ultimate Bad Bot Blocker.”

Alternatively, something you’ll note on the list of Googlebot IPs is that the IPv4 addresses all begin with “66.”

List of IPv4 addresses

While it won’t be 100% accurate, you can also check for Googlebot by filtering for IP addresses starting with “66.” when analyzing the data within your logs.
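
If you’d rather verify properly than filter by IP prefix, the reverse-then-forward DNS check can be scripted. Here’s a sketch in Python; the resolver functions are injectable only so the logic can be exercised without network access, and the defaults use the standard library:

```python
import socket

def verify_googlebot(ip,
                     reverse=lambda ip: socket.gethostbyaddr(ip)[0],
                     forward=lambda host: socket.gethostbyname(host)):
    """Reverse DNS lookup, check the domain, then forward-confirm it resolves back to the IP."""
    try:
        host = reverse(ip)
    except OSError:
        return False
    # Genuine Googlebot hostnames end in googlebot.com or google.com.
    if not (host.endswith('.googlebot.com') or host.endswith('.google.com')):
        return False
    try:
        return forward(host) == ip
    except OSError:
        return False
```

Run against every distinct IP in your logs (caching results, since crawler IPs repeat heavily), and you can flag the rows that fail as fake bots.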

I’m using Cloudflare/Logflare

Cloudflare’s pro plan (currently $20/month) has built-in firewall features that can block fake Googlebot requests from accessing your site.

Cloudflare pricing

Cloudflare disables these features by default, but you can find them by heading to Firewall > Managed Rules > enabling “Cloudflare Specials” > selecting “Advanced”:

Webpage showing "Managed Rules"

Next, change the search type from “Description” to “ID” and search for “100035.”

List of description IDs

Cloudflare will now present you with a list of options to block fake search bots. Set the relevant ones to “Block,” and Cloudflare will check that all requests from search bot user agents are legitimate, keeping your log files clean.

3. Extract data from log files

Finally, we now have access to log files, and we know the log files accurately reflect genuine Googlebot requests.

I recommend analyzing your log files within Google Sheets/Excel to start with because you’ll likely be used to spreadsheets, and it’s simple to cross-analyze log files with other sources like a site crawl.

There is no one right way to do this.

You can also do this within a Data Studio report. I find Data Studio helpful for monitoring data over time, and Google Sheets/Excel is better for a one-off analysis when technical auditing.

Open BigQuery and head to your project/dataset.

Sidebar showing project dataset

Select the “Query” dropdown and open it in a new tab.

"Query" dropdown showing 2 options: new tab or split tab

Next, you’ll need to write some SQL to extract the data you’ll be analyzing. To make this easier, first copy the contents of the FROM part of the query.

FROM part of the query

And then you can add that within the query I’ve written for you below:

SELECT DATE(timestamp) AS Date,
       req.url AS URL,
       req_headers.cf_connecting_ip AS IP,
       req_headers.user_agent AS User_Agent,
       resp.status_code AS Status_Code,
       resp.origin_time AS Origin_Time,
       resp_headers.cf_cache_status AS Cache_Status,
       resp_headers.content_type AS Content_Type
FROM `[Add Your from address here]`,
     UNNEST(metadata) m,
     UNNEST(m.request) req,
     UNNEST(req.headers) req_headers,
     UNNEST(m.response) resp,
     UNNEST(resp.headers) resp_headers
WHERE DATE(timestamp) >= "2022-01-03"
  AND (req_headers.user_agent LIKE '%Googlebot%' OR req_headers.user_agent LIKE '%bingbot%')
ORDER BY timestamp DESC

This query selects all the columns of data that are useful for log file analysis for SEO purposes. It also only pulls data for Googlebot and Bingbot.


If there are other bots you want to analyze, just add another OR req_headers.user_agent LIKE '%bot_name%' within the WHERE statement. You can also easily change the start date by updating the WHERE DATE(timestamp) >= "2022-01-03" line.

Select “Run” at the top. Then choose to save the results.

Button to "save results"

Next, save the data to a CSV in Google Drive (this is the best option due to the larger file size).

And then, once BigQuery has run the job and saved the file, open the file with Google Sheets.

4. Add to Google Sheets

We’re now going to start with some analysis. I recommend using my Google Sheets template. But I’ll explain what I’m doing, and you can build the report yourself if you want.

Here is my template.

The template consists of two data tabs to copy and paste your data into, which I then use for all other tabs using the Google Sheets QUERY function.


If you want to see how I’ve completed the reports that we’ll run through after setting up, select the first cell in each table.

To start with, copy and paste the output of your export from BigQuery into the “Data — Log files” tab.

Output from BigQuery

Note that there are multiple columns added to the end of the sheet (in darker grey) to make analysis a little easier (like the bot name and first URL directory).
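
For reference, those two derived columns can be computed like this. This is a sketch, not the template’s exact formulas, and the bot list is an assumption:

```python
from urllib.parse import urlparse

def bot_name(user_agent: str) -> str:
    """Label the row by crawler, based on a substring match in the user agent."""
    for bot in ('Googlebot', 'bingbot'):
        if bot.lower() in user_agent.lower():
            return bot
    return 'Other'

def first_directory(url: str) -> str:
    """Return the first path segment of a URL, e.g. /features/ for /features/page/."""
    path = urlparse(url).path
    parts = [p for p in path.split('/') if p]
    return '/' + parts[0] + '/' if parts else '/'
```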

5. Add Ahrefs data

If you have a site auditor, I recommend adding more data to the Google Sheet. Mainly, you should add these:

  • Organic traffic
  • Status codes
  • Crawl depth
  • Indexability
  • Number of internal links

To get this data out of Ahrefs’ Site Audit, head to Page Explorer and select “Manage Columns.”

I then recommend adding the columns shown below:

Columns to add

Then export all of that data.

Options to export to CSV

And copy and paste into the “Data — Ahrefs” sheet.

6. Check for status codes

The first thing we’ll analyze is status codes. This data will answer whether search bots are wasting crawl budget on non-200 URLs.

Note that this doesn’t always point toward an issue.

Sometimes, Google can crawl old 301s for many years. However, it can highlight an issue if you’re internally linking to many non-200 status codes.

The “Status Codes — Overview” tab has a QUERY function that summarizes the log file data and displays the results in a chart.

Pie chart showing summary of log file data for status codes

There is also a dropdown to filter by bot type and see which ones are hitting non-200 status codes the most.

Table showing status codes and corresponding hits; above, dropdown to filter results by bot type

Of course, this report alone doesn’t help us solve the issue, so I’ve added another tab, “URLs — Overview.”

List of URLs with corresponding data like status codes, organic traffic, etc

You can use this to filter for URLs that return non-200 status codes. As I’ve also included data from Ahrefs’ Site Audit, you can see whether you’re internally linking to any of those non-200 URLs in the “Inlinks” column.

If you see a lot of internal links to the URL, you can then use the Internal link opportunities report to spot these incorrect internal links by simply copying and pasting the URL in the search bar with “Target page” selected.

Excerpt of Internal link opportunities report results

7. Detect crawl budget wastage

The best way to highlight crawl budget wastage from log files that isn’t due to crawling non-200 status codes is to find frequently crawled non-indexable URLs (e.g., they’re canonicalized or noindexed).

Since we’ve added data from our log files and Ahrefs’ Site Audit, spotting these URLs is straightforward.

Head to the “Crawl budget wastage” tab, and you’ll find highly crawled HTML files that return a 200 but are non-indexable.

List of URLs with corresponding data like hits, etc

Now that you have this data, you’ll want to investigate why the bot is crawling the URL. Here are some common reasons:

  • It’s internally linked to.
  • It’s incorrectly included in XML sitemaps.
  • It has links from external sites.

It’s common for larger sites, especially those with faceted navigation, to link to many non-indexable URLs internally.

If the hit numbers in this report are very high and you believe you’re wasting your crawl budget, you’ll likely need to remove internal links to the URLs or block crawling with the robots.txt.
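
The logic behind that tab can be sketched as follows. The sample data, field names, and the minimum-hits threshold are all assumptions for illustration:

```python
# Hit counts per URL from the log files (illustrative).
log_hits = {'/tag/red/': 120, '/features/feature-page/': 80, '/about/': 3}

# Per-URL data from a site audit (illustrative field names).
audit = {
    '/tag/red/': {'status': 200, 'content_type': 'text/html', 'indexable': False},
    '/features/feature-page/': {'status': 200, 'content_type': 'text/html', 'indexable': True},
    '/about/': {'status': 200, 'content_type': 'text/html', 'indexable': False},
}

def crawl_budget_wastage(log_hits, audit, min_hits=10):
    """Return highly crawled 200 HTML URLs that are non-indexable, most-crawled first."""
    return sorted(
        (url for url, hits in log_hits.items()
         if hits >= min_hits
         and audit.get(url, {}).get('status') == 200
         and audit.get(url, {}).get('content_type', '').startswith('text/html')
         and not audit.get(url, {}).get('indexable', True)),
        key=lambda url: -log_hits[url],
    )
```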

8. Monitor important URLs

If you have specific URLs on your site that are incredibly important to you, you may want to watch how often search engines crawl them.

The “URL monitor” tab does just that, plotting the daily trend of hits for up to five URLs that you can add.

Line graph showing daily trend of hits for 4 URLs

You can also filter by bot type, making it easy to monitor how often Bing or Google crawls a URL.

URL monitoring with dropdown option to filter by bot type


You can also use this report to check URLs you’ve recently redirected. Simply add the old URL and new URL in the dropdown and see how quickly Googlebot notices the change.

Often, the advice here is that it’s a bad thing if Google doesn’t crawl a URL frequently. That simply isn’t the case.

While Google tends to crawl popular URLs more frequently, it will likely crawl a URL less if it doesn’t change often.

Excerpt of Google article

Still, it’s helpful to monitor URLs like this if you need content changes picked up quickly, such as on a news site’s homepage.

In fact, if you notice Google is recrawling a URL too frequently, I’d advocate helping it better manage crawl rate by doing things like adding <lastmod> to XML sitemaps. Here’s what it looks like:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2022-01-03</lastmod>
  </url>
</urlset>
You can then update the <lastmod> value whenever the content of the page changes, signaling Google to recrawl.
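
If you generate sitemaps programmatically, a minimal sketch looks like this; the URL and date are placeholders:

```python
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace.
NS = 'http://www.sitemaps.org/schemas/sitemap/0.9'

def build_sitemap(pages):
    """Build sitemap XML from (loc, lastmod) pairs, with lastmod in YYYY-MM-DD form."""
    ET.register_namespace('', NS)
    urlset = ET.Element(f'{{{NS}}}urlset')
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, f'{{{NS}}}url')
        ET.SubElement(url, f'{{{NS}}}loc').text = loc
        ET.SubElement(url, f'{{{NS}}}lastmod').text = lastmod
    return ET.tostring(urlset, encoding='unicode')

xml_out = build_sitemap([('https://example.com/', '2022-01-03')])
```

Regenerating the file (and its lastmod dates) whenever pages change gives Google a cheap signal about what needs recrawling.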

9. Find orphan URLs

Another way to use log files is to discover orphan URLs, i.e., URLs that you want search engines to crawl and index but haven’t internally linked to.

We can do this by checking for 200 status code HTML URLs with no internal links found by Ahrefs’ Site Audit.
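
At its core, this is a set difference between the URLs in your logs and the URLs your crawler discovered. A minimal sketch with illustrative data:

```python
# 200-status HTML URLs that appear in the log files (illustrative).
log_urls = {'/old-landing-page/', '/features/feature-page/', '/blog/post-1/'}

# URLs the site crawl discovered via internal links (illustrative).
crawled_urls = {'/features/feature-page/', '/blog/post-1/'}

def orphan_urls(log_urls, crawled_urls):
    """URLs search engines request that the crawler never found, i.e., orphan candidates."""
    return sorted(log_urls - crawled_urls)
```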

You can see the report I’ve created for this named “Orphan URLs.”

List of URLs with corresponding data like hits, etc

There is one caveat here. As Ahrefs hasn’t discovered these URLs but Googlebot has, they may not be URLs you want to link to; some may be non-indexable.

I recommend copying and pasting these URLs using the “Custom URL list” functionality when setting up crawl sources for your Ahrefs project.

Page to set up crawl sources; text field to enter custom URLs

This way, Ahrefs will now consider these orphan URLs found in your log files and report any issues to you in your next crawl:

List of issues

10. Monitor crawling by directory

Suppose you’ve implemented structured URLs that indicate how you’ve organized your site (e.g., /features/feature-page/).

In that case, you can also analyze log files based on the directory to see if Googlebot is crawling certain sections of the site more than others.

I’ve implemented this kind of analysis in the “Directories — Overview” tab of the Google Sheet.

Table showing list of directories with corresponding data like organic traffic, inlinks, etc

You can see I’ve also included data on the number of internal links to the directories, as well as total organic traffic.

You can use this to see whether Googlebot is spending more time crawling low-traffic directories than high-value ones.

But again, bear in mind this may occur, as some URLs within specific directories change more often than others. Still, it’s worth further investigating if you spot an odd trend.

In addition to this report, there is also a “Directories — Crawl trend” report if you want to see the crawl trend per directory for your site.

Line graph showing crawl trend per directory

11. View Cloudflare cache ratios

Head to the “CF cache status” tab, and you’ll see a summary of how often Cloudflare is caching your files on the edge servers.

Bar chart showing how often Cloudflare is caching files on the edge servers

When Cloudflare caches content (HIT in the above chart), the request no longer goes to your origin server and is served directly from its global CDN. This results in better Core Web Vitals, especially for global sites.


 It’s also worth having a caching setup on your origin server (such as Varnish, Nginx FastCGI, or Redis full-page cache). This is so that even when Cloudflare hasn’t cached a URL, you’ll still benefit from some caching.

If you see a large amount of “Miss” or “Dynamic” responses, I recommend investigating further to understand why Cloudflare isn’t caching content. Common causes can be:

  • You’re linking to URLs with parameters in them – Cloudflare, by default, passes these requests to your origin server, as they’re likely dynamic.
  • Your cache expiry times are too low – If you set short cache lifespans, it’s likely more users will receive uncached content.
  • You aren’t preloading your cache – If you need your cache to expire often (as content changes frequently), rather than letting users hit uncached URLs, use a preloader bot that will prime the cache, such as Optimus Cache Preloader.
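
To put a number on cache performance, the ratio summarized in that tab boils down to counting statuses. Here’s a sketch with illustrative data standing in for the Cache_Status column:

```python
from collections import Counter

# Illustrative Cloudflare cache statuses from the log export.
statuses = ['HIT', 'HIT', 'MISS', 'DYNAMIC', 'HIT', 'EXPIRED']

def cache_ratios(statuses):
    """Return each cache status as a share of total requests."""
    counts = Counter(statuses)
    total = len(statuses)
    return {status: count / total for status, count in counts.items()}
```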


 I thoroughly recommend setting up HTML edge-caching via Cloudflare, which significantly reduces TTFB. You can do this easily with WordPress and Cloudflare’s Automatic Platform Optimization.

12. Check which bots crawl your site the most

The final report (found in the “Bots — Overview” tab) shows you which bots crawl your site the most:

Pie chart showing Googlebot crawls site the most, as compared to Bingbot

In the “Bots — Crawl trend” report, you can see how that trend has changed over time.

Stacked bar chart showing how crawl trend changes over time

This report can help check if there’s an increase in bot activity on your site. It’s also helpful when you’ve recently made a significant change, such as a URL migration, and want to see if bots have increased their crawling to collect new data.

Final thoughts

You should now have a good idea of the analysis you can do with your log files when auditing a site. Hopefully, you’ll find it easy to use my template and do this analysis yourself.

Anything unique you’re doing with your log files that I haven’t mentioned? Tweet me.



The Challenges & Opportunities For Marketers

Google’s parent company, Alphabet Inc., reported its fourth straight quarter of declining profits.

It made $76 billion in sales over the past three months, but it wasn’t enough to meet Wall Street’s expectations.

Google’s profit was down 9% compared to last year, and its biggest business, Google Search, saw a 1% drop in revenue. Even YouTube’s advertising sales fell by nearly 8%.

Alphabet has decided to cut its workforce by 12,000 and expects to spend between $1.9 billion and $2.3 billion on employee severance costs.

This latest earnings report shows tech giants like Google are facing challenges in the current digital advertising landscape.

But Google’s CEO, Sundar Pichai, believes that the company’s long-term investments in AI will be a key factor in its future success.

In a press release, Pichai says he expects major AI advancements to be soon revealed in Google search and other areas:

“Our long-term investments in deep computer science make us extremely well-positioned as AI reaches an inflection point, and I’m excited by the AI-driven leaps we’re about to unveil in Search and beyond. There’s also great momentum in Cloud, YouTube subscriptions, and our Pixel devices. We’re on an important journey to re-engineer our cost structure in a durable way and to build financially sustainable, vibrant, growing businesses across Alphabet.”

Alphabet’s CFO, Ruth Porat, reported that their Q4 consolidated revenues were $76 billion, a 1% increase from the previous year. The full year 2022 saw revenues of $283 billion, a 10% increase.

Going forward, Alphabet is changing how it reports on its AI activities.

DeepMind, which used to be reported under “Other Bets,” will now be reported as part of Alphabet’s corporate costs to reflect its increasing integration with Google Services and Google Cloud.

What Does This Mean For Marketing Professionals?

It’s important to stay updated on the latest developments in the tech industry and how they may affect advertising strategies.

Google’s declining profits and decreased revenue in their search and YouTube platforms are reminders that the digital advertising landscape is constantly evolving, and companies must adapt to keep up.

Marketers should consider diversifying their advertising efforts across multiple platforms to minimize the impact of market swings.

Additionally, Google’s focus on AI and its integration with Google Services and Cloud is something to keep an eye on.

As AI advances, it may offer new opportunities for marketers to target and engage with their audience effectively.

By staying informed on the latest tech advancements, marketers can stay ahead of the curve and make the most of these opportunities.

Despite Google’s recent financial setbacks, the tech giant is still a major player in the digital advertising landscape, and its investments in AI show its commitment to continued growth and innovation.

Featured Image: Sergio Photone/Shutterstock

Source: Alphabet



How to Use WordPress in 9 Simple Steps (Beginner’s Guide)

WordPress is the world’s largest content management system (CMS)—around 810 million websites are built on it.

It’s free to use and includes all the features any website owner could need. And if it doesn’t have a feature you want or need, you can have a developer create it for you because it’s built on open-source software.

But with all of these features come some complications. WordPress has a fairly steep learning curve compared to other CMSes like Wix or Squarespace.

I’ve built dozens of websites using WordPress.org (not WordPress.com, which is a totally different beast) and have narrowed down the process to nine simple steps that anyone can follow.

Let’s start with…

Step 1. Get a domain name and hosting

Every website built on WordPress.org needs a domain name (like yourwebsite.com) and a hosting service that stores and displays your website on the internet.

You can buy a domain name for a small fee from a domain name registrar like NameCheap or GoDaddy. However, if you buy your domain name and your hosting from separate companies, you will need to change your website’s Domain Nameservers (DNS) to point your domain name from your registrar to your hosting company.

They look like this:

SiteGround DNS settings example

It’s a little cheaper to do it this way but not worth the hassle in my opinion. Instead, most hosting providers (such as SiteGround or Bluehost) can also sell you a domain name and connect it with your website automatically, allowing you to skip messing with DNS settings.

You can check out this guide to choosing a domain name if you’re not sure what to pick.

Step 2. Install WordPress

Once you purchase hosting, most hosting providers offer a one-click install to set up WordPress on your website; check your host’s documentation for the exact steps.

You can also opt for a faster (but more expensive) dedicated hosting provider like Kinsta or WP Engine. These companies will set up WordPress for you when you buy their hosting.

Step 3. Familiarize yourself with the UI

Now that you have a website with WordPress installed, let’s get into how to use WordPress. You can log in to your WordPress dashboard by going to yourwebsite.com/wp-admin.

Once you log in, your dashboard will look like this (with fewer plugins since you’re on a fresh install):

WordPress user interface

Let me explain the options here:

  • Posts: This is where you’ll create blog posts.
  • Media: You can go here to see all the media on your site, such as images and videos. I typically upload media directly to my posts and pages and don’t visit media often.
  • Pages: This is where you’ll create static pages on your site, such as your homepage, about page, and contact page.
  • Comments: Here is where you’ll moderate any blog comments.
  • Appearance: This is where you’ll customize the appearance of your website, such as your website’s theme, font type, colors, and more.
  • Plugins: A plugin is an add-on to your website that adds functionality, such as custom contact forms or pop-ups on your website. I’ll discuss these in more detail later.
  • Users: Here is where you can add users to your website, such as writers, editors, and administrators.
  • Settings: Pretty straightforward; here is where your general website settings are located.

Now that you know what each option does, let’s get your website settings dialed in.

Step 4. Optimize your settings

Your WordPress website comes with some generic settings that need to be changed, as well as some things I recommend changing to optimize your website for search engines.

Specifically, you should:

  • Change your title, tagline, time zone, and favicon.
  • Change your permalink structure.
  • Configure your reading settings.
  • Delete any unused themes.
  • Change your domain from HTTP to HTTPS.

Let’s walk through each of these steps.

Change your title, tagline, time zone, and favicon

Head to Settings > General to find these settings. Change the title of your website and the tagline, which can appear underneath the title if you choose to display it.

Next, check that the time zone is correct (according to your local time zone) and upload your favicon. A favicon is the little icon that shows up in browser tabs next to the title of the page, like this:

Examples of favicons

You can make a favicon for free with Canva. Just make a 50×50 design with whatever you want your favicon to look like. Check out this guide to learn more. 

Change your permalink structure

Head to Settings > Permalinks. A permalink is the URL structure your blog posts take when you publish them. By default, WordPress displays the date in your URLs, which isn’t great for SEO or readability.

WordPress permalink structure settings

I always change this to the “Post name” option (/sample-post/) to add the title of the post by default. You want to optimize all of your URLs individually when possible, but this setting will make the process easier.

Configure your reading settings

Head over to Settings > Reading to choose whether you want your homepage to be a static page or if you want it to be a feed of your latest blog posts. 

WordPress homepage display settings

Personally, I always create a unique static page to use as my homepage because it gives me more control over the homepage. I like to add internal links to specific pages to help them rank higher on Google, as well as add an email opt-in form on the homepage.

Check out this guide to homepage SEO to learn more.

Delete any unused themes 

By default, you have a few themes installed. Once you choose a theme in step #5 below, you should delete any unused themes to remove vulnerabilities from your site (hackers can attack WordPress websites with outdated themes).

To do that, go to Appearance > Themes, click on the unused theme, then click the red Delete button in the bottom right.

How to delete unused themes on WordPress

Change your domain from HTTP to HTTPS

The “S” in HTTPS stands for secure. It’s enabled by an SSL certificate, and it’s an important step: it means traffic to and from your website is encrypted, making it safer for visitors.

Having HTTPS instead of HTTP gives you the “lock” icon next to your URL—Google (and most internet users) wants to see a secure website.

HTTPS secure "lock" icon

Most hosting providers automatically activate the secure version of your website. But sometimes, it needs to be manually activated by you. Here are guides on how to do this with common hosting providers:

If your host isn’t shown here, just do a Google search for “[your host] SSL encryption.”
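To confirm the redirect is actually in place, you can follow your site's HTTP URL and see where it lands. Here's a small Python sketch (the `example.com` URL is a placeholder for your own domain):

```python
import urllib.request
from urllib.parse import urlparse

def is_https_upgrade(original: str, final: str) -> bool:
    """True if `final` is the HTTPS version of `original` (a www. prefix is fine)."""
    o, f = urlparse(original), urlparse(final)
    same_host = f.hostname in (o.hostname, f"www.{o.hostname}")
    return o.scheme == "http" and f.scheme == "https" and same_host

def check_https(url: str = "http://example.com/") -> bool:
    # urlopen follows redirects; geturl() reports the final URL reached
    final = urllib.request.urlopen(url, timeout=10).geturl()
    return is_https_upgrade(url, final)
```

If `check_https` returns False for your domain, the SSL redirect likely still needs to be activated with your host.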

Step 5. Select and customize your theme

Once you’ve optimized your settings, it’s time to start actually building your website using a WordPress theme. A theme is a customizable template that determines what your website looks like. 

You can browse for themes by going to Appearance > Themes, then clicking the Add new button at the top of the page. 

WordPress theme page

The generic Twenty Twenty-Three theme is actually pretty good. Most WordPress themes these days are optimized to show up in search engines and to meet modern user expectations, such as being mobile-friendly. 

However, some themes have a lot of added bloat that can slow a website down, so choose a theme that only has the features you need without extras you won’t use.

Alternatively, if you don’t like any themes or want something that’s more drag-and-drop, you can use a website builder like Elementor or Thrive Architect. These tools make building a website extremely easy, but they do add bloat that can slow a website down.

I use Elementor to build my websites but only use it for static pages that I want to convert well. Then I use the built-in Gutenberg editor for my blog posts.

If you decide to go with a regular theme rather than a theme builder, you can edit the theme by going to Appearance > Customize. You’ll be taken to the following editor:

WordPress theme customization options

Depending on the theme you installed, you may have more or fewer options than the screenshot above. Rather than trying to cover every option you may encounter, I’ll just recommend that you go through each option to see what it does. 

For the most part, the options are self-explanatory. If you hit a snag, you can always do a Google search for that option in your theme to see forum posts from other users or even the theme’s FAQ or manual.

Step 6. Build your basic pages

After you’ve selected a theme, you can start building your website’s pages. Every website typically needs at least the following pages:

  • A homepage
  • A contact page
  • An about page
  • A privacy policy page
  • A terms of service page

Rather than going through how you should create each of these pages, I’ll refer you to the following guides:

Keep in mind that your privacy policy and terms of service (ToS) pages will vary depending on the country you live in. If you’re in the U.S., you can follow this guide for privacy policies and this guide for ToS pages.

That said, there are some general tips you should follow when building any page on your website. In general, make sure that your font is easy to read and a good visible size (18–20px is typical), your colors match, and you avoid too much clutter.

Here’s a good example of a webpage that is clean, legible, and thought out:

Ahrefs about page example

Here’s an example of a webpage that has too much clutter and displays an ad over half the page, causing confusion:

CNN poor website design

In general, less is more and legibility is better than fancy fonts.

Step 7. Install these essential plugins

One of the best parts of using WordPress is access to its massive library of plugins.

A plugin is a custom piece of code written by a developer that anyone can install on their WordPress website in order to add specific functionality to the site, such as a contact form, extra customization options, or SEO features.

You can install a new plugin one of two ways. Head over to Plugins > Add New. From here, you can either:

  1. Browse the plugins directly on this page, then install and activate them directly.
  2. Download a plugin .zip file from the plugin’s website, then click the Upload plugin button at the top of the screen and upload the .zip file.

How to upload a plugin to your WordPress website

While many plugins are free, some are paid or have a premium paid version. It depends on what you need. However, I always install the following free plugins on my websites:

Rank Math: This plugin makes basic on-page SEO easier. It tells you if you’re missing basic things like metadata, image alt text, and more. It also allows you to create a robots.txt file and a sitemap, which are important for search engines to crawl your website the way you want.

Wordfence: This is a security plugin to help prevent your website from being hacked. I always install some sort of security plugin on my sites.

Insert Headers and Footers: One of the things you’ll often find yourself needing to do is insert code into the header or footer of your pages. You need to do this for everything from setting up Google Analytics and Google Search Console to adding the Facebook Remarketing pixel and more. Having this plugin makes it much easier to add this code.

Keep in mind that installing a lot of plugins on your website can cause code bloat and slow down your loading speeds, so only install plugins that you really need. 
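Once a plugin like Rank Math has generated your robots.txt and sitemap, it's worth confirming they're actually reachable. Here's a minimal stdlib sketch (the sitemap path assumes Rank Math's default `sitemap_index.xml`; adjust for your plugin):

```python
import urllib.request
from urllib.error import HTTPError

def wp_seo_urls(site: str) -> list[str]:
    """Default locations of the files an SEO plugin typically generates."""
    base = site.rstrip("/")
    return [f"{base}/robots.txt", f"{base}/sitemap_index.xml"]

def fetch_statuses(urls: list[str]) -> dict[str, int]:
    statuses = {}
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                statuses[url] = resp.status
        except HTTPError as err:
            statuses[url] = err.code  # e.g. 404 if the file is missing
    return statuses

# Usage: fetch_statuses(wp_seo_urls("https://example.com"))
```

A 200 for both URLs means search engines can find them; a 404 means the plugin feature isn't switched on yet.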

Step 8. Start creating content

Now you know all the basics of how to use WordPress. But there’s one more important thing to cover, and it’s probably why you wanted to start a WordPress website in the first place: how to create content for your blog.

Writing blog posts is an essential part of showing up on search engines like Google, having something to share on social media, and attracting more visitors to your website.

What you write about depends on your goals. I always start with some basic keyword research to figure out what people are searching for on Google that relates to my website.

A quick and easy way to do this is by plugging a broad keyword into Ahrefs’ free keyword generator tool to get some keyword ideas. 

For example, if I’m starting a website about farming, I may type “farm” into the tool. I can see keyword ideas like “farming insurance” and “vertical farming,” which are two potential blog topics I can write about.

Keyword ideas for farming, via Ahrefs' free keyword generator tool

If I want to get a little more specific, I can try a keyword like “how to start a farm.” This gives me ideas like “how to start a farm with no money” and “how to start a farm in texas.”

Keyword ideas for "how to start a farm," via Ahrefs' free keyword generator tool

Try different seed keywords—both broad keywords and more specific ones—to come up with some blog topics. Once you have a few ideas, go ahead and outline the article and then write it and publish it.

Check out our guide to writing a blog post to learn more.

Step 9. Monitor your website for technical issues

A regular part of maintaining your WordPress website is keeping plugins and themes up to date, as well as monitoring your website’s technical health.

WordPress automatically notifies you of updates to your plugins or themes with a red circle next to Dashboard > Updates. Log in to your dashboard at least once a week to update everything.

WordPress updates dashboard

Beyond weekly updates, use the free Ahrefs Webmaster Tools to run a technical audit on your site and see any issues your site may have, such as broken links, missing metadata, or slow loading speeds. 

Ahrefs website audit overview, via AWT

If you click the All issues tab, you can see every issue your site has—with an overview of what the issue is and how to fix it if you click on the ? icon.

All issues report, via AWT

You’ll also get email alerts when anything on your site changes, such as a link breaking or a page returning a 404 code. It’s a helpful tool to automatically monitor your WordPress site.
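Ahrefs Webmaster Tools handles this monitoring for you, but the core idea is simple to sketch: request each known URL and flag anything that responds with an error status. A minimal Python version (the URL list in the usage line is hypothetical):

```python
import urllib.request
from urllib.error import HTTPError

def status_of(url: str) -> int:
    """HEAD-request a URL and return its HTTP status code."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        return err.code

def broken_links(statuses: dict[str, int]) -> list[str]:
    # Anything in the 4xx/5xx range deserves an alert
    return sorted(url for url, status in statuses.items() if status >= 400)

# Usage: broken_links({url: status_of(url) for url in ["https://example.com/"]})
```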

Final thoughts

Congratulations, you now know the basics of using WordPress. It may have a large learning curve, but learning how to use this CMS is one of the most valuable skills you can have in today’s digital age.

You can use your WordPress website to make money blogging, promote your services as a freelancer, or even sell products online. Knowing how to build a website is almost mandatory these days for anyone who wants to start a business.

Top 5 Essential SEO Reporting Tools For Agencies

Your clients trust you to create real results and hit KPIs that drive their businesses forward.

Understanding the intricacies of how that works can be difficult, so it’s essential to demonstrate your progress and efforts.

SEO reporting software showcases important metrics in a digestible, visual way. These tools save guesswork and manual referencing, highlighting achievements over a specified time.

A great tool can also help you formulate action items, gauge the performance of campaigns, and see real results that can help you create new and innovative evaluations.

The latest and allegedly greatest tools hit the market all the time, promising to transform how you conduct reports.

Certainly, you have to weigh a few factors when deciding which software to implement. Price, features, and ease of use are the most important to consider.

A cost-effective tool with a steep learning curve might not be worth it for the features. Similarly, an expensive tool might be more appealing if it is user-friendly but could quickly run up costs.

Just like any transformational business decision, you’ll have to weigh the pros and cons carefully to determine the right one for you.

Key Takeaways

  • Cost, accessibility, and features are the common thread of comparison for SEO reporting tools.
  • To truly get the best use out of an SEO reporting tool for your agency, you’ll need to weigh several details, including scalability, customization, integrations, and access to support.
  • What might be considered a subpar tool could be a game-changer for an agency. Due diligence and research are the keys to knowing what will work for your team.

What To Look For In SEO Reporting Tools

It can be tough to make heads or tails of the available tools and choose which will benefit your agency the most.

Here are the 10 essential requirements of SEO reporting tools.

1. Accurate And Current Regional Data

SEO reporting is all about data. The software must have access to accurate and current data localized to your client’s targeted region.

Search data from the U.S. is meaningless if your client tries to rank for [London plumbing services], so localization matters.

The tool must update data regularly and with reliable accuracy so you can make informed decisions about where your client stands against the competition.

2. Integration With Third-Party Tools

Especially for full-scale digital marketing campaigns, the ability to report on all KPIs in one place is essential.

The more available integrations with third-party tools (e.g., Google Analytics, Google Business Profile, Majestic), the better.

Some tools even allow you to upload custom data sets.

3. Scalability

You don’t want to have to retrain or reinvest in new software every time your agency reaches a new tier.

The right SEO reporting tool should work well for your current business size and leave room for expansion as you onboard more clients.

4. Strong Suite Of Features

A great SEO reporting tool should include:

  • Position tracking.
  • Backlink monitoring.
  • Competitor data.
  • Analytics.

It is a bonus if the tool has reporting features for social media, email marketing, call tracking, and/or paid ads to make it a full-suite digital marketing software.

5. Continually Improving And Updating Features

SEO is constantly evolving, and so should SEO reporting tools.

As we continue the transition from website optimization to web presence optimization, a tool’s ability to integrate new features is essential.

6. Ability To Customize Reports

Each client will have different KPIs, objectives, and priorities.

Presenting the information that clients want to see is paramount to successful campaigns and retention.

Your reporting software of choice should be able to emphasize the correct data at the right times.

7. Client Integration

A good SEO reporting tool must have the client in mind.

It should have a simple bird’s eye overview of the basics but also be easy for clients to dig into the data at a deeper level.

This can mean automated summary reports or 24/7 client access to the dashboard.

8. Ability To White Label Reports

While white labeling is not essential (no client will sniff at receiving a report with a Google logo in the top corner), it helps keep branding consistent and gives a professional sheen to everything you send a client’s way.

9. Access To Support Resources

Quality support resources can help you find a detour when you encounter a roadblock.

Whether it’s detailed support documentation, a chat feature/support desk, or responsive customer support on social media, finding the help you need to solve the issue is important.

10. Cost-To-Value Ratio

With a proper process, time investment, and leveraging support resources, it is possible to get better results from a free reporting tool than one that breaks the bank.


Top 5 SEO Reporting Tools

In evaluating five of the most popular SEO reporting tools, based on the above criteria, here is how they stack up:

1. AgencyAnalytics

My Overall Rating: 4.7/5

Image credit: AgencyAnalytics, December 2022

AgencyAnalytics is a quality introductory/intermediate reporting tool for agencies.

Among the tools on this list, it is one of the easiest to use for small to mid-sized agencies.

It starts at $12 per month, per client, with unlimited staff and client logins, a white-label dashboard, and automated branded reports. The minimum purchase requirements mean the first two tiers work out to $60 per month and $180 per month, respectively. But your ability to change the payment based on the number of clients could help keep costs lean.

AgencyAnalytics comes with 70+ supported third-party data integrations.

However, this reliance on third-party data means you may have incomplete reports when there is an interruption in the transmission.

Though new integrations are always being added, they can be glitchy at first, making them unreliable to share with clients until stabilized.

With the ability for clients to log in and view daily data updates, it provides real-time transparency.

Automated reports can be customized, and the drag-and-drop customized dashboard makes it easy to emphasize priority KPIs.

2. SE Ranking

My Overall Rating: 4.5/5

SE Ranking has plans starting at $39.20 per month, although the $87.20 per month plan is necessary if you need historical data or more than 10 projects.

Setup is a breeze, as the on-screen tutorial guides you through the process.

SE Ranking features a strong collection of SEO-related tools, including current and historical position tracking, competitor SEO research, keyword suggestion, a backlink explorer, and more.

SE Ranking is hooked up with Zapier, which allows users to integrate thousands of apps and provide a high level of automation between apps like Klipfolio, Salesforce, HubSpot, and Google Apps.

SE Ranking is an effective SEO reporting tool at a beginner to intermediate level.

However, you may want to look in a different direction if your agency requires more technical implementations or advanced customization.

3. Semrush

My Overall Rating: 4.4/5

Semrush is one of the most SEO-focused reporting tools on the list, which is reflected in its features.

Starting at $229.95 per month for the agency package, it’s one of the more expensive tools on the list. But Semrush provides a full suite of tools that can be learned at an intermediate level.

A major downside of Semrush, especially for cost-conscious agencies, is that an account comes with only one user login.

Having to purchase individual licenses for each SEO analyst or account manager adds up quickly, and the users you can add are limited by the plan features. This makes scalability an issue.

Semrush has both branded and white-label reports, depending on your subscription level. It uses a proprietary data stream, tracking more than 800 million keywords.

The ever-expanding “projects” feature covers everything from position tracking to backlink monitoring and social media analysis.

Though it doesn’t fall specifically under the scope of SEO reporting, Semrush’s innovation makes it a one-stop shop for many agencies.

Project features include Ad Builder, which helps craft compelling ad text for Google Ads, and Social Media Poster, which allows agencies to schedule client social posts.

Combining such diverse features under the Semrush umbrella offsets its relatively high cost, especially if you can cancel other redundant software.

4. Looker Studio

My Overall Rating: 3.6/5

Screenshot from Looker Studio, December 2022

Formerly known as Google Data Studio, Looker Studio is a Google service that has grown considerably since its initial launch.

Though it is much more technical and requires more time investment to set up than most other tools on this list, it should be intuitive for staff familiar with Google Analytics.

If you’re on the fence, Looker Studio is completely free.

A major upside to this software is superior integration with other Google properties like Analytics, Search Console, Ads, and YouTube.

Like other reporting tools, it also allows third-party data integration, but the ability to query data from databases, including MySQL, PostgreSQL, and Google’s Cloud SQL, sets it apart.

With proper setup, you can customize reports around the KPIs that matter, pulling from lead and customer information. For eCommerce clients, you can even integrate sales data.

Though the initial setup will be much more technical, the ability to import templates saves time and effort.

You can also create your own templates that better reflect your processes and can be shared across clients. Google also has introductory video walk-throughs to help you get started.

5. Authority Labs

My Overall Rating: 3.2/5

Authority Labs ranking report. Image credit: Authority Labs, December 2022

Authority Labs does the job if you’re looking for a straightforward position-tracking tool.

Authority Labs is $49 per month for unlimited users, though you will need to upgrade to the $99 per month plan for white-label reporting.

You can track regional ranking data, get insights into “(not provided)” keywords, track competitor keywords, and schedule automated reporting.

However, lacking other essential features like backlink monitoring or analytic data means you will have to supplement this tool to provide a full SEO reporting picture for clients.


There are many quality SEO reporting tools on the market. What makes them valuable depends on their ability to work for your clients’ needs.

SE Ranking has a fantastic cost-to-value ratio, while Looker Studio has advanced reporting capabilities if you can withstand a higher barrier to entry.

AgencyAnalytics prioritizes client access, which is a big deal if transparency is a core value for your agency.

Authority Labs keeps it lean and clean, while Semrush always adds innovative features.

These five are simply a snapshot of what’s available. New and emerging tools may offer features that appeal more to your current clients, or fill gaps that even a great tool leaves.

Ultimately, you need to consider what matters most to your agency. Is it:

  • Feature depth?
  • Scalability?
  • Cost-to-value ratio?

Once you weigh the factors that matter most for your agency, you can find the right SEO reporting tool. In the meantime, don’t shy away from testing out a few for a trial period.

If you don’t want to sign up for a full month’s usage, you can also explore walkthrough videos and reviews from current users. The most informed decision requires an understanding of the intricate details.

Featured Image: Paulo Bobita/Search Engine Journal
