How to Do an SEO Log File Analysis [Template Included]

Log files have been receiving increasing recognition from technical SEOs over the past five years, and for a good reason.

They’re the most trustworthy source of information to understand the URLs that search engines have crawled, which can be critical information to help diagnose problems with technical SEO.

Google itself recognizes their importance, releasing new features in Google Search Console and making it easy to see samples of data that would previously only be available by analyzing logs.

Crawl stats report; key data above and line graph showing trend of crawl requests below

In addition, Google Search Advocate John Mueller has publicly stated how much good information log files hold.

With all this hype around the data in log files, you may want to understand logs better, how to analyze them, and whether the sites you’re working on will benefit from them.

This article will answer all of that and more.

First, what is a server log file?

A server log file is a file created and updated by a server that records the activities it has performed. A popular server log file is an access log file, which holds a history of HTTP requests to the server (by both users and bots).

When a non-developer mentions a log file, access logs are the ones they’ll usually be referring to.

Developers, however, find themselves spending more time looking at error logs, which report issues encountered by the server.

The above is important: If you request logs from a developer, the first thing they’ll ask is, “Which ones?”

Therefore, always be specific with log file requests. If you want logs to analyze crawling, ask for access logs.

Access log files contain lots of information about each request made to the server, such as the following:

  • IP addresses
  • User agents
  • URL path
  • Timestamps (when the bot/browser made the request)
  • Request type (GET or POST)
  • HTTP status codes
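As a rough illustration, here's how those fields might be pulled out of a line in Apache/Nginx "combined" format with Python. The regex and sample line are my own; real formats vary by server configuration:

```python
import re

# Combined Log Format: IP, identity, user, [timestamp], "request", status, bytes, "referer", "user agent"
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<user_agent>[^"]*)"'
)

def parse_line(line: str) -> dict:
    """Extract the SEO-relevant fields from one access log line."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else {}

line = ('66.249.66.1 - - [03/Jan/2022:10:15:30 +0000] '
        '"GET /blog/ HTTP/1.1" 200 5316 "-" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')
fields = parse_line(line)
print(fields["ip"], fields["path"], fields["status"])  # 66.249.66.1 /blog/ 200
```

Lines that don't match the expected format simply return an empty dict, which makes it easy to spot servers logging in a different format.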

What servers include in access logs varies by the server type and sometimes what developers have configured the server to store in log files. Common formats for log files include the following:

  • Apache format – This is used by Nginx and Apache servers.
  • W3C format – This is used by Microsoft IIS servers.
  • ELB format – This is used by Amazon Elastic Load Balancing.
  • Custom formats – Many servers support outputting a custom log format.

Other forms exist, but these are the main ones you’ll encounter.

How log files benefit SEO

Now that we’ve got a basic understanding of log files, let’s see how they benefit SEO.

Here are some key ways:

  • Crawl monitoring – You can see the URLs search engines crawl and use this to spot crawler traps, look out for crawl budget wastage, or better understand how quickly content changes are picked up.
  • Status code reporting – This is particularly useful for prioritizing fixing errors. Rather than knowing you’ve got a 404, you can see precisely how many times a user/search engine is visiting the 404 URL.
  • Trends analysis – By monitoring crawling over time to a URL, page type/site section, or your entire site, you can spot changes and investigate potential causes.
  • Orphan page discovery – You can cross-analyze data from log files and a site crawl you run yourself to discover orphan pages.

All sites will benefit from log file analysis to some degree, but the amount of benefit varies massively depending on site size.

That's because log files primarily benefit sites by helping you better manage crawling. Google itself states that managing the crawl budget is something larger-scale or frequently changing sites will benefit from.

Excerpt of Google article

The same is true for log file analysis.

For example, smaller sites can likely use the “Crawl stats” data provided in Google Search Console and receive all of the benefits mentioned above—without ever needing to touch a log file.

Gif of Crawl stats report being scrolled down gradually

Admittedly, Google won't show you every URL crawled (as log files do), and trends analysis is limited to three months of data.

However, smaller sites that change infrequently also need less ongoing technical SEO. It’ll likely suffice to have a site auditor discover and diagnose issues.

For example, a cross-analysis from a site crawler, XML sitemaps, Google Analytics, and Google Search Console will likely discover all orphan pages.

You can also use a site auditor to discover error status codes from internal links.

There are a few key reasons I’m pointing this out:

  • Access log files aren’t easy to get a hold of (more on this next).
  • For small sites that change infrequently, the benefits of log files are smaller, so SEO efforts are usually better spent elsewhere.

How to access your log files

In most cases, to analyze log files, you’ll first have to request access to log files from a developer.

The developer is then likely going to have a few issues, which they’ll bring to your attention. These include:

  • Partial data – Log files can include partial data scattered across multiple servers. This usually happens when developers use various servers, such as an origin server, load balancers, and a CDN. Getting an accurate picture of all logs will likely mean compiling the access logs from all servers.
  • File size – Access log files for high-traffic sites can end up in terabytes, if not petabytes, making them hard to transfer.
  • Privacy/compliance – Log files include user IP addresses, which are personally identifiable information (PII). User information may need removing before the logs can be shared with you.
  • Storage history – Due to file size, developers may have configured access logs to be stored for only a few days, making them unhelpful for spotting trends and issues.

These issues call into question whether storing, merging, filtering, and transferring log files is worth the dev effort, especially if developers already have a long list of priorities (which is often the case).

Developers will likely put the onus on the SEO to build a case for why they should invest time in this, and you'll need to weigh that case against your other SEO priorities.

These issues are precisely why log file analysis doesn’t happen frequently.

Log files you receive from developers are also often in formats unsupported by popular log file analysis tools, making analysis more difficult.

Thankfully, there are software solutions that simplify this process. My favorite is Logflare, a Cloudflare app that can store log files in a BigQuery database that you own.

How to analyze your log files

Now it’s time to start analyzing your logs.

I’m going to show you how to do this in the context of Logflare specifically; however, the tips on how to use log data will work with any logs.

The template I’ll share shortly also works with any logs. You’ll just need to make sure the columns in the data sheets match up.

1. Start by setting up Logflare (optional)

Logflare is simple to set up. And with the BigQuery integration, it stores data long term. You’ll own the data, making it easily accessible for everyone.

There’s one difficulty. You need to swap out your domain name servers to use Cloudflare ones and manage your DNS there.

For most, this is fine. However, if you’re working with a more enterprise-level site, it’s unlikely you can convince the server infrastructure team to change the name servers to simplify log analysis.

I won’t go through every step on how to get Logflare working. But to get started, all you need to do is head to the Cloudflare Apps part of your dashboard.

"Apps" in a sidebar

And then search for Logflare.

"Logflare" appearing in search field on top-right corner, and the app appearing below in the results

The setup past this point is self-explanatory (create an account, give your project a name, choose the data to send, etc.). The only extra part I recommend following is Logflare’s guide to setting up BigQuery.

Bear in mind, however, that BigQuery does have a cost that’s based on the queries you do and the amount of data you store.

Sidenote.

 It’s worth noting that one significant advantage of the BigQuery backend is that you own the data. That means you can circumvent PII issues by configuring Logflare not to send PII like IP addresses and delete PII from BigQuery using an SQL query.

2. Verify Googlebot

We've now stored log files (via Logflare or an alternative method). Next, we need to extract only the rows for the user agents we want to analyze. For most, this will be Googlebot.

Before we do that, we have another hurdle to clear.

Many bots pretend to be Googlebot to get past firewalls (if you have one). In addition, some auditing tools do the same to get an accurate reflection of the content your site returns for the user agent, which is essential if your server returns different HTML for Googlebot, e.g., if you’ve set up dynamic rendering.

I’m not using Logflare

If you aren’t using Logflare, identifying Googlebot will require a reverse DNS lookup to verify the request did come from Google.

Google has a handy guide on validating Googlebot manually here.

Excerpt of Google article

You can do this on a one-off basis using a reverse IP lookup tool and checking the domain name returned.

However, we need to do this in bulk for every row in our log files, which also requires matching IP addresses against the list Google provides.
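To give a sense of what that verification involves, here is a sketch of Google's forward-confirmed reverse DNS check in Python. The stub resolvers and hostname at the bottom are made up so the demo runs without network access; in production you'd use the real `socket` functions:

```python
import socket

def is_verified_googlebot(ip: str,
                          reverse=socket.gethostbyaddr,
                          forward=socket.gethostbyname) -> bool:
    """Forward-confirmed reverse DNS check, per Google's guidance:
    1. Reverse-resolve the IP to a hostname.
    2. Check the hostname ends in googlebot.com or google.com.
    3. Forward-resolve that hostname and confirm it maps back to the IP.
    """
    try:
        hostname = reverse(ip)[0]
    except OSError:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return forward(hostname) == ip
    except OSError:
        return False

# Network-free demo with stub resolvers (hostnames here are illustrative):
fake_reverse = lambda ip: ("crawl-66-249-66-1.googlebot.com", [], [ip])
fake_forward = lambda host: "66.249.66.1"
print(is_verified_googlebot("66.249.66.1", fake_reverse, fake_forward))  # True
```

A fake Googlebot fails at step 2 or step 3: its IP either reverse-resolves to a non-Google hostname or doesn't forward-confirm.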

The easiest way to do this is by using server firewall rule sets maintained by third parties that block fake bots (resulting in fewer/no fake Googlebots in your log files). A popular one for Nginx is “Nginx Ultimate Bad Bot Blocker.”

Alternatively, something you'll note on the list of Googlebot IPs is that the IPv4 addresses all begin with "66."

List of IPv4 addresses

While it won't be 100% accurate, you can also check for Googlebot by filtering for IP addresses starting with "66." when analyzing the data within your logs.

I’m using Cloudflare/Logflare

Cloudflare’s pro plan (currently $20/month) has built-in firewall features that can block fake Googlebot requests from accessing your site.

Cloudflare pricing

Cloudflare disables these features by default, but you can find them by heading to Firewall > Managed Rules > enabling "Cloudflare Specials" > selecting "Advanced":

Webpage showing "Managed Rules"

Next, change the search type from “Description” to “ID” and search for “100035.”

List of description IDs

Cloudflare will now present you with a list of options to block fake search bots. Set the relevant ones to "Block," and Cloudflare will check that all requests from search bot user agents are legitimate, keeping your log files clean.

3. Extract data from log files

Finally, we now have access to log files, and we know the log files accurately reflect genuine Googlebot requests.

I recommend analyzing your log files within Google Sheets/Excel to start with because you’ll likely be used to spreadsheets, and it’s simple to cross-analyze log files with other sources like a site crawl.

There is no one right way to do this; any tool that can query and pivot tabular data will work.

You can also do this within a Data Studio report. I find Data Studio helpful for monitoring data over time, while Google Sheets/Excel is better for a one-off analysis when technical auditing.

Open BigQuery and head to your project/dataset.

Sidebar showing project dataset

Select the “Query” dropdown and open it in a new tab.

"Query" dropdown showing 2 options: new tab or split tab

Next, you’ll need to write some SQL to extract the data you’ll be analyzing. To make this easier, first copy the contents of the FROM part of the query.

FROM part of the query

And then you can add that within the query I’ve written for you below:

SELECT
  DATE(timestamp) AS Date,
  req.url AS URL,
  req_headers.cf_connecting_ip AS IP,
  req_headers.user_agent AS User_Agent,
  resp.status_code AS Status_Code,
  resp.origin_time AS Origin_Time,
  resp_headers.cf_cache_status AS Cache_Status,
  resp_headers.content_type AS Content_Type
FROM `[Add Your from address here]`,
  UNNEST(metadata) m,
  UNNEST(m.request) req,
  UNNEST(req.headers) req_headers,
  UNNEST(m.response) resp,
  UNNEST(resp.headers) resp_headers
WHERE
  DATE(timestamp) >= "2022-01-03"
  AND (req_headers.user_agent LIKE '%Googlebot%' OR req_headers.user_agent LIKE '%bingbot%')
ORDER BY timestamp DESC

This query selects all the columns of data that are useful for log file analysis for SEO purposes. It also only pulls data for Googlebot and Bingbot.

Sidenote.

If there are other bots you want to analyze, just add another OR req_headers.user_agent LIKE '%bot_name%' within the WHERE statement. You can also easily change the start date by updating the WHERE DATE(timestamp) >= "2022-01-03" line.

Select “Run” at the top. Then choose to save the results.

Button to "save results"

Next, save the data to a CSV in Google Drive (this is the best option due to the larger file size).

And then, once BigQuery has run the job and saved the file, open the file with Google Sheets.

4. Add to Google Sheets

We’re now going to start with some analysis. I recommend using my Google Sheets template. But I’ll explain what I’m doing, and you can build the report yourself if you want.

Here is my template.

The template consists of two data tabs to copy and paste your data into, which I then use for all other tabs using the Google Sheets QUERY function.

Sidenote.

If you want to see how I’ve completed the reports that we’ll run through after setting up, select the first cell in each table.

To start with, copy and paste the output of your export from BigQuery into the “Data — Log files” tab.

Output from BigQuery

Note that there are multiple columns added to the end of the sheet (in darker grey) to make analysis a little easier (like the bot name and first URL directory).
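If you're building the sheet yourself, helper columns like these can be derived with a few lines of Python. The bot list and sample values are illustrative; extend them to whatever bots you queried for:

```python
from urllib.parse import urlparse

def bot_name(user_agent: str) -> str:
    """Map a raw user-agent string to a short bot label (extend as needed)."""
    ua = user_agent.lower()
    if "googlebot" in ua:
        return "Googlebot"
    if "bingbot" in ua:
        return "Bingbot"
    return "Other"

def first_directory(url: str) -> str:
    """Return the first path segment of a URL, or '/' for the homepage."""
    path = urlparse(url).path
    segments = [s for s in path.split("/") if s]
    return f"/{segments[0]}/" if segments else "/"

print(bot_name("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))
print(first_directory("https://example.com/features/feature-page/"))  # /features/
```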

5. Add Ahrefs data

If you have a site auditor, I recommend adding more data to the Google Sheet. Mainly, you should add these:

  • Organic traffic
  • Status codes
  • Crawl depth
  • Indexability
  • Number of internal links

To get this data out of Ahrefs’ Site Audit, head to Page Explorer and select “Manage Columns.”

I then recommend adding the columns shown below:

Columns to add

Then export all of that data.

Options to export to CSV

And copy and paste into the “Data — Ahrefs” sheet.

6. Check for status codes

The first thing we’ll analyze is status codes. This data will answer whether search bots are wasting crawl budget on non-200 URLs.

Note that this doesn’t always point toward an issue.

Sometimes, Google can crawl old 301s for many years. However, it can highlight an issue if you’re internally linking to many non-200 status codes.

The “Status Codes — Overview” tab has a QUERY function that summarizes the log file data and displays the results in a chart.

Pie chart showing summary of log file data for status codes

There is also a dropdown to filter by bot type and see which ones are hitting non-200 status codes the most.

Table showing status codes and corresponding hits; above, dropdown to filter results by bot type
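If you prefer working outside Sheets, the same summary can be tallied in Python. The rows below are made-up examples of the exported log data:

```python
from collections import Counter

# Each row: (URL, bot, status_code), as exported from BigQuery
rows = [
    ("/", "Googlebot", 200),
    ("/old-page/", "Googlebot", 301),
    ("/missing/", "Googlebot", 404),
    ("/missing/", "Bingbot", 404),
]

def status_summary(rows, bot=None):
    """Count hits per status code, optionally filtered to one bot."""
    return Counter(status for _, b, status in rows if bot is None or b == bot)

print(status_summary(rows))                   # Counter({404: 2, 200: 1, 301: 1})
print(status_summary(rows, bot="Googlebot"))  # Counter({200: 1, 301: 1, 404: 1})
```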

Of course, this report alone doesn’t help us solve the issue, so I’ve added another tab, “URLs — Overview.”

List of URLs with corresponding data like status codes, organic traffic, etc

You can use this to filter for URLs that return non-200 status codes. As I’ve also included data from Ahrefs’ Site Audit, you can see whether you’re internally linking to any of those non-200 URLs in the “Inlinks” column.

If you see a lot of internal links to the URL, you can then use the Internal link opportunities report to spot these incorrect internal links by simply copying and pasting the URL in the search bar with “Target page” selected.

Excerpt of Internal link opportunities report results

7. Detect crawl budget wastage

The best way to highlight crawl budget wastage from log files that isn’t due to crawling non-200 status codes is to find frequently crawled non-indexable URLs (e.g., they’re canonicalized or noindexed).

Since we’ve added data from our log files and Ahrefs’ Site Audit, spotting these URLs is straightforward.

Head to the “Crawl budget wastage” tab, and you’ll find highly crawled HTML files that return a 200 but are non-indexable.

List of URLs with corresponding data like hits, etc
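The underlying filter is simple enough to sketch in Python if you want to reproduce it outside the template. The rows and column order here are illustrative:

```python
# Each row: (url, hits, status_code, content_type, indexable)
rows = [
    ("/blog/", 120, 200, "text/html", True),
    ("/filter?color=blue", 300, 200, "text/html", False),
    ("/old/", 40, 301, "text/html", False),
    ("/tag/shoes/", 90, 200, "text/html", False),
]

def crawl_budget_wastage(rows):
    """Highly crawled HTML URLs that return a 200 but aren't indexable,
    most-crawled first."""
    wasted = [r for r in rows
              if r[2] == 200 and "text/html" in r[3] and not r[4]]
    return sorted(wasted, key=lambda r: r[1], reverse=True)

for url, hits, *_ in crawl_budget_wastage(rows):
    print(url, hits)
# /filter?color=blue 300
# /tag/shoes/ 90
```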

Now that you have this data, you’ll want to investigate why the bot is crawling the URL. Here are some common reasons:

  • It’s internally linked to.
  • It’s incorrectly included in XML sitemaps.
  • It has links from external sites.

It’s common for larger sites, especially those with faceted navigation, to link to many non-indexable URLs internally.

If the hit numbers in this report are very high and you believe you’re wasting your crawl budget, you’ll likely need to remove internal links to the URLs or block crawling with the robots.txt.

8. Monitor important URLs

If you have specific URLs on your site that are incredibly important to you, you may want to watch how often search engines crawl them.

The “URL monitor” tab does just that, plotting the daily trend of hits for up to five URLs that you can add.

Line graph showing daily trend of hits for 4 URLs

You can also filter by bot type, making it easy to monitor how often Bing or Google crawls a URL.

URL monitoring with dropdown option to filter by bot type
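Under the hood, this report is just a daily hit count per monitored URL. A minimal Python sketch, assuming rows of (date, URL) pairs from your log export:

```python
from collections import defaultdict

# Each row: (date, url) from the log export (sample data)
hits = [
    ("2022-01-03", "/pricing/"),
    ("2022-01-03", "/pricing/"),
    ("2022-01-04", "/pricing/"),
    ("2022-01-04", "/features/"),
]

def daily_trend(hits, monitored):
    """Daily hit counts for each monitored URL: {url: {date: hits}}."""
    trend = defaultdict(lambda: defaultdict(int))
    for date, url in hits:
        if url in monitored:
            trend[url][date] += 1
    return {url: dict(days) for url, days in trend.items()}

print(daily_trend(hits, {"/pricing/"}))
# {'/pricing/': {'2022-01-03': 2, '2022-01-04': 1}}
```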

Sidenote.

You can also use this report to check URLs you’ve recently redirected. Simply add the old URL and new URL in the dropdown and see how quickly Googlebot notices the change.

Often, the advice here is that it’s a bad thing if Google doesn’t crawl a URL frequently. That simply isn’t the case.

While Google tends to crawl popular URLs more frequently, it will likely crawl a URL less if it doesn’t change often.

Excerpt of Google article

Still, it’s helpful to monitor URLs like this if you need content changes picked up quickly, such as on a news site’s homepage.

In fact, if you notice Google is recrawling a URL too frequently, I'd advocate trying to help it better manage crawl rate by doing things like adding <lastmod> to XML sitemaps. Here's what it looks like:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.logexample.com/example</loc>
    <lastmod>2022-10-04</lastmod>
  </url>
</urlset>

You can then update the <lastmod> element whenever the content of the page changes, signaling Google to recrawl.

9. Find orphan URLs

Another way to use log files is to discover orphan URLs, i.e., URLs that you want search engines to crawl and index but haven’t internally linked to.

We can do this by checking for 200 status code HTML URLs with no internal links found by Ahrefs’ Site Audit.

You can see the report I’ve created for this named “Orphan URLs.”

List of URLs with corresponding data like hits, etc
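The cross-analysis behind this report is essentially a set difference, which you could sketch in Python like so (the URLs and statuses below are illustrative):

```python
# URLs Googlebot requested (from logs) vs. URLs your own site crawl reached
log_urls = {"/a/", "/b/", "/orphan/"}
crawled_urls = {"/a/", "/b/"}
statuses = {"/a/": 200, "/b/": 200, "/orphan/": 200}

def orphan_candidates(log_urls, crawled_urls, statuses):
    """URLs seen in logs with a 200 status but never reached by the site crawl."""
    return sorted(url for url in log_urls - crawled_urls
                  if statuses.get(url) == 200)

print(orphan_candidates(log_urls, crawled_urls, statuses))  # ['/orphan/']
```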

There is one caveat here. As Ahrefs hasn't discovered these URLs but Googlebot has, some of them may not be URLs we want to link to at all; they may simply be non-indexable.

I recommend copying and pasting these URLs using the “Custom URL list” functionality when setting up crawl sources for your Ahrefs project.

Page to set up crawl sources; text field to enter custom URLs

This way, Ahrefs will now consider these orphan URLs found in your log files and report any issues to you in your next crawl:

List of issues

10. Monitor crawling by directory

Suppose you’ve implemented structured URLs that indicate how you’ve organized your site (e.g., /features/feature-page/).

In that case, you can also analyze log files based on the directory to see if Googlebot is crawling certain sections of the site more than others.

I’ve implemented this kind of analysis in the “Directories — Overview” tab of the Google Sheet.

Table showing list of directories with corresponding data like organic traffic, inlinks, etc

You can see I’ve also included data on the number of internal links to the directories, as well as total organic traffic.

You can use this to see whether Googlebot is spending more time crawling low-traffic directories than high-value ones.

But again, bear in mind this may occur, as some URLs within specific directories change more often than others. Still, it’s worth further investigating if you spot an odd trend.

In addition to this report, there is also a “Directories — Crawl trend” report if you want to see the crawl trend per directory for your site.

Line graph showing crawl trend per directory

11. View Cloudflare cache ratios

Head to the “CF cache status” tab, and you’ll see a summary of how often Cloudflare is caching your files on the edge servers.

Bar chart showing how often Cloudflare is caching files on the edge servers

When Cloudflare caches content (HIT in the above chart), the request no longer goes to your origin server and is served directly from its global CDN. This results in better Core Web Vitals, especially for global sites.
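If you want a quick number rather than a chart, the hit ratio is easy to compute from the Cache_Status column (the sample statuses below are illustrative):

```python
from collections import Counter

cache_statuses = ["HIT", "HIT", "MISS", "DYNAMIC", "HIT", "EXPIRED"]

def cache_ratio(statuses):
    """Share of requests served from Cloudflare's edge cache."""
    counts = Counter(statuses)
    return counts["HIT"] / len(statuses)

print(f"{cache_ratio(cache_statuses):.0%}")  # 50%
```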

Sidenote.

 It’s also worth having a caching setup on your origin server (such as Varnish, Nginx FastCGI, or Redis full-page cache). This is so that even when Cloudflare hasn’t cached a URL, you’ll still benefit from some caching.

If you see a large amount of “Miss” or “Dynamic” responses, I recommend investigating further to understand why Cloudflare isn’t caching content. Common causes can be:

  • You’re linking to URLs with parameters in them – Cloudflare, by default, passes these requests to your origin server, as they’re likely dynamic.
  • Your cache expiry times are too low – If you set short cache lifespans, it’s likely more users will receive uncached content.
  • You aren’t preloading your cache – If you need your cache to expire often (as content changes frequently), rather than letting users hit uncached URLs, use a preloader bot that will prime the cache, such as Optimus Cache Preloader.

Sidenote.

 I thoroughly recommend setting up HTML edge-caching via Cloudflare, which significantly reduces TTFB. You can do this easily with WordPress and Cloudflare’s Automatic Platform Optimization.

12. Check which bots crawl your site the most

The final report (found in the “Bots — Overview” tab) shows you which bots crawl your site the most:

Pie chart showing Googlebot crawls site the most, as compared to Bingbot

In the “Bots — Crawl trend” report, you can see how that trend has changed over time.

Stacked bar chart showing how crawl trend changes over time

This report can help check if there’s an increase in bot activity on your site. It’s also helpful when you’ve recently made a significant change, such as a URL migration, and want to see if bots have increased their crawling to collect new data.

Final thoughts

You should now have a good idea of the analysis you can do with your log files when auditing a site. Hopefully, you’ll find it easy to use my template and do this analysis yourself.

Anything unique you’re doing with your log files that I haven’t mentioned? Tweet me.






How To Build A Diverse & Healthy Link Profile


Search is evolving at an incredible pace and new features, formats, and even new search engines are popping up within the space.

Google’s algorithm still prioritizes backlinks when ranking websites. If you want your website to be visible in search results, you must account for backlinks and your backlink profile.

A healthy backlink profile is, at its core, a diverse backlink profile.

In this guide, we’ll examine how to build and maintain a diverse backlink profile that powers your website’s search performance.

What Does A Healthy Backlink Profile Look Like?

As Google states in its guidelines, it primarily discovers pages through links from other pages that point to yours, acquired through promotion and naturally over time.

In practice, a healthy backlink profile can be divided into three main areas: the distribution of link types, the mix of anchor text, and the ratio of followed to nofollowed links.

Let’s look at these areas and how they should look within a healthy backlink profile.

Distribution Of Link Types

One aspect of your backlink profile that needs to be diversified is link types.

It looks unnatural to Google to have predominantly one kind of link in your profile, and it also indicates that you’re not diversifying your content strategy enough.

Some of the various link types you should see in your backlink profile include:

  • Anchor text links.
  • Image links.
  • Redirect links.
  • Canonical links.

Here is an example of the breakdown of link types at my company, Whatfix (via Semrush):

Screenshot from Semrush, May 2024

Most links should be anchor text links and image links, as these are the most common ways to link on the web, but you should see some of the other types of links as they are picked up naturally over time.

Mix Of Anchor Text

Next, ensure your backlink profile has an appropriate anchor text variance.

Again, if you overoptimize for a specific type of anchor text, it will appear suspicious to search engines like Google and could have negative repercussions.

Here are the various types of anchor text you might find in your backlink profile:

  • Branded anchor text – Anchor text that is your brand name or includes your brand name.
  • Empty – Links that have no anchor text.
  • Naked URLs – Anchor text that is a URL (e.g., www.website.com).
  • Exact match keyword-rich anchor text – Anchor text that exactly matches the keyword the linked page targets (e.g., blue shoes).
  • Partial match keyword-rich anchor text – Anchor text that partially or closely matches the keyword the linked page targets (e.g., “comfortable blue footwear options”).
  • Generic anchor text – Anchor text such as “this website” or “here.”

To maintain a healthy backlink profile, aim for a mix of anchor text within a similar range to this:

  • Branded anchor text – 35-40%.
  • Partial match keyword-rich anchor text – 15-20%.
  • Generic anchor text – 10-15%.
  • Exact match keyword-rich anchor text – 5-10%.
  • Naked URLs – 5-10%.
  • Empty – 3-5%.

This distribution of anchor text represents a natural mix. The majority of anchors tend to be branded or partially branded because most sites that link to you will default to your brand name when linking. It also makes sense that the next most common anchors are partial-match keywords or generic anchor text, as these are natural choices within the context of a web page.

Exact-match anchor text is rare because it only happens when you are the best resource for a specific term, and the site owner knows your page exists.
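If you want to sanity-check your own profile against these ranges programmatically, here's a minimal Python sketch. The target ranges mirror the list above; the sample profile is made up:

```python
# Target ranges from the list above: (min %, max %) per anchor type
TARGETS = {
    "branded": (35, 40),
    "partial match": (15, 20),
    "generic": (10, 15),
    "exact match": (5, 10),
    "naked url": (5, 10),
    "empty": (3, 5),
}

def flag_outliers(profile):
    """Return the anchor types whose share falls outside the target range."""
    outliers = {}
    for anchor_type, share in profile.items():
        low, high = TARGETS[anchor_type]
        if not low <= share <= high:
            outliers[anchor_type] = share
    return outliers

profile = {"branded": 20, "partial match": 18, "generic": 12,
           "exact match": 35, "naked url": 10, "empty": 5}
print(flag_outliers(profile))  # {'branded': 20, 'exact match': 35}
```

Here, an exact-match share of 35% would be flagged as the kind of overoptimization discussed below.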

Ratio Of Followed Vs. Nofollowed Backlinks

Lastly, you should monitor the ratio of followed vs. nofollowed links pointing to your website.

If you need a refresher on what nofollowed backlinks are or why someone might apply the nofollow tag to a link pointing to your site, check out Google’s guide on how to qualify outbound links to Google.

Nofollow attributes should only be applied to paid links or links pointing to a site the linking site doesn’t trust.

While it is not uncommon or suspicious to have some nofollow links (people misunderstand the purpose of the nofollow attribute all the time), a healthy backlink profile will have far more followed links.

You should aim for a ratio of roughly 80:20 or 70:30 in favor of followed links. For example, here is what the followed vs. nofollowed ratio looks like for my company's backlink profile (according to Ahrefs):

Referring domains. Screenshot from Ahrefs, May 2024

You may see links with other rel attributes, such as UGC or Sponsored.

The "UGC" attribute tags links from user-generated content, while the "Sponsored" attribute tags links from sponsored or paid sources. These attributes are slightly different from the nofollow tag, but they work essentially the same way, letting Google know these links aren't trusted or endorsed by the linking site. You can simply group these links with nofollowed links when calculating your ratio.
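Calculating that ratio with UGC/sponsored grouped in with nofollow can be sketched like this (the sample rel values are illustrative):

```python
# rel attribute per backlink; None means a plain followed link
backlinks = [None, None, None, None, None, None, None,
             "nofollow", "ugc", "sponsored"]

def followed_ratio(backlinks):
    """Treat ugc/sponsored like nofollow and return the followed share."""
    not_followed = {"nofollow", "ugc", "sponsored"}
    followed = sum(1 for rel in backlinks if rel not in not_followed)
    return followed / len(backlinks)

print(f"{followed_ratio(backlinks):.0%} followed")  # 70% followed
```

A result around 70-80% followed would sit inside the healthy range described above.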

Importance Of Diversifying Your Backlink Profile

So why is it important to diversify your backlink profile anyway? Well, there are three main reasons you should consider:

  • Avoiding overoptimization.
  • Diversifying traffic sources.
  • Finding new audiences.

Let’s dive into each of these.

Avoiding Overoptimization

First and foremost, diversifying your backlink profile is the best way to protect yourself from overoptimization and the damaging penalties that can come with it.

As SEO pros, our job is to optimize websites to improve performance, but overoptimizing in any facet of our strategy – backlinks, keywords, structure, etc. – can result in penalties that limit visibility within search results.

In the previous section, we covered the elements of a healthy backlink profile. If you stray too far from that model, your site might look suspicious to search engines like Google and you could be handed a manual or algorithmic penalty, suppressing your rankings in search.

Considering how regularly Google updates its search algorithm these days (and how little information surrounds those updates), you could see your performance tank and have no idea why.

This is why it’s so important to keep a watchful eye on your backlink profile and how it’s shaping up.

Diversifying Traffic Sources

Another reason to cultivate a diverse backlink profile is to ensure you’re diversifying your traffic sources.

Google penalties come swiftly and can often be a surprise. If all your traffic eggs are in that one basket, your site will suffer badly and may struggle to recover.

However, diversifying your traffic sources (search, social, email, etc.) will mitigate risk – similar to a stock portfolio – as you’ll have other traffic sources to provide a steady flow of visitors if another source suddenly dips.

Part of building a diverse backlink profile is acquiring a diverse set of backlinks and backlink types, and this strategy will also help you find differing and varied sources of traffic.

Finding New Audiences

Finally, building a diverse backlink profile is essential, as doing so will also help you discover new audiences.

If you only acquire links from the same handful of websites and platforms, you'll struggle to expand your audience and build awareness for your website.

While it’s important to acquire links from sites that cater to your existing audience, you should also explore ways to build links that can tap into new audiences. The best way to do this is by casting a wide net with various link acquisition tactics and strategies.

A diverse backlink profile indicates a varied approach to SEO and marketing that will help bring new visitors and awareness to your site.

Building A Diverse Backlink Profile

Now that you know what a healthy backlink profile looks like and why it's important to diversify, how do you build diversity into your site's backlink profile?

This comes down to your link acquisition strategy and the types of backlinks you actively pursue. To guide your strategy, let’s break link building into three main categories:

  • Foundational links.
  • Content promotion.
  • Community involvement.

Here’s how to approach each area.

Foundational Links

Foundational links are the links your website simply should have. These are opportunities where a backlink would already exist if every relevant site owner knew your site existed.

Some examples of foundational links include:

  • Mentions – Websites that mention your brand in some way (brand name, product, employees, proprietary data, etc.) on their website but don’t link.
  • Partners – Websites that belong to real-world partners or companies you connect with offline and should also connect (link) with online.
  • Associations or groups – Websites for offline associations or groups you belong to where your site should be listed with a link.
  • Sponsorships – Any events or organizations your company sponsors might have websites that could (and should) link to your site.
  • Sites that link to competitors – If a website is linking to a competitor, there is a strong chance it would make sense for them to link to your site as well.

These link opportunities should set the foundation for your link acquisition efforts.

As the baseline for your link building strategy, you should start by exhausting these opportunities first to ensure you’re not missing highly relevant links to bolster your backlink profile.

Content Promotion

Next, consider content promotion as a strategy for building a healthy, diverse backlink profile.

Content promotion is much more proactive than the foundational link acquisition mentioned above. You must create the opportunity by producing link-worthy content rather than simply capitalizing on an existing one.

Some examples of content promotion for links are:

  • Digital PR – Digital PR campaigns have numerous benefits and goals beyond link acquisition, but backlinks should be a primary KPI.
  • Original research – Similar to digital PR, original research should focus on providing valuable data to your audience. Still, you should also make sure any citations or references to your research are correctly linked.
  • Guest content – Whether regular columns or one-off contributions, providing guest content to websites is still a viable link acquisition strategy – when done right. The best way to gauge your guest content strategy is to ask yourself if you would still write the content for a site without guaranteeing a backlink, knowing you’ll still build authority and get your message in front of a new audience.
  • Original imagery – Along with research and data, if your company creates original imagery that offers unique value, you should promote those images and ask for citation links.

Content promotion is a viable avenue for building a healthy backlink profile as long as the content you’re promoting is worthy of links.

Community Involvement

Community involvement is the final piece of your link acquisition puzzle when building a diverse backlink profile.

After pursuing all foundational opportunities and manually promoting your content, you should ensure your brand is active and represented in all the spaces and communities where your audience engages.

In terms of backlinks, this could mean:

  • Wikipedia links – Wikipedia gets over 4 billion monthly visits, so backlinks here can bring significant referral traffic to your site. However, acquiring these links is difficult as these pages are moderated closely, and your site will only be linked if it is legitimately a top resource on the web.
  • Forums (Reddit, Quora, etc.) – Another great place to get backlinks that drive referral traffic is forums like Reddit and Quora. Again, these forums are strictly moderated, and earning link placements on these sites requires a page that delivers significant and unique value to a specific audience.
  • Social platforms – Social media platforms and groups represent communities where your brand should be active and engaged. While these strategies are likely handled by other teams outside SEO and focus on different metrics, you should still be intentional about converting these interactions into links when or where possible.
  • Offline events – While it may seem counterintuitive to think of offline events as a potential source for link acquisition, legitimate link opportunities exist here. After all, most businesses, brands, and people you interact with at these events also have websites, and networking can easily translate to online connections in the form of links.

While most of the link opportunities listed above will have the nofollow link attribute due to the nature of the sites associated with them, they are still valuable additions to your backlink profile as these are powerful, trusted domains.

These links help diversify your traffic sources by bringing substantial referral traffic, and that traffic is highly qualified as these communities share your audience.

How To Avoid Developing A Toxic Backlink Profile

Now that you’re familiar with the link building strategies that can help you cultivate a healthy, diverse backlink profile, let’s discuss what you should avoid.

As mentioned before, over-relying on a single strategy or link type can look suspicious to search engines and earn your site a penalty. So, how do you avoid filling your backlink profile with toxic links?

Remember The “Golden Rule” Of Link Building

One simple way to guide your link acquisition strategy and avoid running afoul of search engines like Google is to follow one “golden rule.”

That rule is to ask yourself: If search engines like Google didn’t exist, and the only way people could navigate the web was through backlinks, would you want your site to have a link on the prospective website?

Thinking this way strips away all the tactical, SEO-focused portions of the equation and only leaves the human elements of linking where two sites are linked because it makes sense and makes the web easier to navigate.

Avoid Private Blog Networks (PBNs)

Another good rule is to avoid looping your site into private blog networks (PBNs). Of course, it’s not always obvious or easy to spot a PBN.

However, there are some common traits or red flags you can look for, such as:

  • The person offering you a link placement mentions they have a list of domains they can share.
  • The prospective linking site has little to no traffic and doesn’t appear to have human engagement (blog comments, social media followers, blog views, etc.).
  • The website features thin content and little investment into user experience (UX) and design.
  • The website covers generic topics and categories, catering to any and all audiences.
  • Pages on the site feature numerous external links but few internal links.
  • The prospective domain’s backlink profile features overoptimization in any of the previously discussed forms (high-density of exact match anchor text, abnormal ratio of nofollowed links, only one or two link types, etc.).

Again, diversification – in both tactics and strategies – is crucial to building a healthy backlink profile, but steering clear of obvious PBNs and remembering the ‘golden rule’ of link building will go a long way toward keeping your profile free from toxicity.
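The red flags above lend themselves to a quick pre-outreach screen. Below is a minimal sketch of that checklist as code; the field names and thresholds are illustrative assumptions, not published criteria from any search engine or SEO tool:

```python
# Hypothetical PBN red-flag checklist as a simple screening function.
# All field names and thresholds are made up for illustration.

def pbn_red_flags(site):
    """Return the red flags a prospective linking site trips."""
    flags = []
    if site.get("offered_from_domain_list"):
        flags.append("seller offered a list of domains")
    if site.get("monthly_traffic", 0) < 100:
        flags.append("little to no traffic")
    if site.get("avg_words_per_page", 0) < 300:
        flags.append("thin content")
    if site.get("external_links", 0) > 3 * max(site.get("internal_links", 1), 1):
        flags.append("external links dwarf internal links")
    if site.get("exact_match_anchor_pct", 0) > 30:
        flags.append("over-optimized anchor text")
    return flags

prospect = {
    "monthly_traffic": 40,
    "avg_words_per_page": 250,
    "external_links": 120,
    "internal_links": 10,
    "exact_match_anchor_pct": 45,
}
flags = pbn_red_flags(prospect)
print(f"{len(flags)} red flags: {flags}")
```

A prospect that trips several flags at once is worth walking away from, even if no single signal is conclusive on its own.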

Evaluating Your Backlink Profile

As you work diligently to build and maintain a diverse, healthy backlink profile, you should also carve out time to evaluate it regularly from a more analytical perspective.

There are two main ways to evaluate the merit of your backlinks: leverage tools to analyze backlinks and compare your backlink profile to the greater competitive landscape.

Leverage Tools To Analyze Your Backlink Profile

There are a variety of third-party tools you can use to analyze your backlink profile.

These tools can provide helpful insights, such as the total number of backlinks and referring domains. You can use these tools to analyze your full profile, broken down by:

  • Followed vs. nofollowed.
  • Authority metrics (Domain Rating, Domain Authority, Authority Score, etc.).
  • Backlink types.
  • Location or country.
  • Anchor text.
  • Top-level domain types.
  • And more.
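To make the breakdown concrete, here's a toy sketch of how a raw backlink export could be sliced along a few of these dimensions; the record fields are assumptions, and real tool exports carry many more columns:

```python
from collections import Counter

# Toy backlink records; field names are illustrative assumptions.
backlinks = [
    {"domain": "blog.example.com", "rel": "follow", "anchor": "seo guide"},
    {"domain": "forum.example.net", "rel": "nofollow", "anchor": "mysite.com"},
    {"domain": "news.example.io", "rel": "follow", "anchor": "this study"},
    {"domain": "blog.example.com", "rel": "follow", "anchor": "seo guide"},
]

# Followed vs. nofollowed split, anchor-text distribution, and
# the count of unique referring domains.
by_rel = Counter(b["rel"] for b in backlinks)
by_anchor = Counter(b["anchor"] for b in backlinks)
referring_domains = len({b["domain"] for b in backlinks})

print(by_rel)
print(by_anchor.most_common(1))  # dominant anchor text
print(referring_domains)
```

Even this tiny slice surfaces the signals discussed earlier, such as a single anchor text starting to dominate the profile.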

You can also use these tools to track new incoming backlinks, as well as lost backlinks, to help you better understand how your backlink profile is growing.
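Tracking new versus lost backlinks is, at its core, a set difference between two snapshots of your profile. A minimal sketch, assuming each snapshot has already been parsed into (source, target) pairs (the URLs are placeholders):

```python
# Compare two backlink-export snapshots to find new and lost links.
last_month = {
    ("https://blog.example.com/roundup", "https://mysite.com/guide"),
    ("https://partner.example.org/links", "https://mysite.com/"),
    ("https://forum.example.net/thread/42", "https://mysite.com/tool"),
}
this_month = {
    ("https://partner.example.org/links", "https://mysite.com/"),
    ("https://forum.example.net/thread/42", "https://mysite.com/tool"),
    ("https://news.example.io/story", "https://mysite.com/research"),
}

new_links = this_month - last_month   # gained since last snapshot
lost_links = last_month - this_month  # dropped since last snapshot

print(f"new: {len(new_links)}, lost: {len(lost_links)}")
```

Lost links are worth reviewing individually: a page that removed your link may simply have been deleted, or it may have swapped your citation for a competitor's.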

Several established third-party SEO platforms offer this kind of backlink analysis.

Many of these tools also have features that estimate how toxic or suspicious your profile might look to search engines, which can help you detect potential issues early.

Compare Your Backlink Profile To The Competitive Landscape

Lastly, you should compare your overall backlink profile to those of your competitors and those competing with your site in the search results.

Again, the previously mentioned tools can help with this analysis – as far as providing you with the raw numbers – but the key areas you should compare are:

  • Total number of backlinks.
  • Total number of referring domains.
  • Breakdown of authority metrics of links (Domain Rating, Domain Authority, Authority Score, etc.).
  • Authority metrics of competing domains.
  • Link growth over the last two years.

Comparing your backlink profile to others within your competitive landscape will help you assess where your domain currently stands and provide insight into how far you must go if you’re lagging behind competitors.
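The comparison can be sketched as a simple gap calculation against the competitor median; every number below is made up for illustration:

```python
# Sketch of a competitive backlink-gap comparison. All figures are
# invented; in practice they'd come from a tool export.
from statistics import median

you = {"backlinks": 3_200, "ref_domains": 410, "avg_dr": 38}
competitors = [
    {"backlinks": 9_800, "ref_domains": 1_250, "avg_dr": 55},
    {"backlinks": 4_100, "ref_domains": 620, "avg_dr": 47},
    {"backlinks": 15_400, "ref_domains": 2_030, "avg_dr": 61},
]

# Positive gap = you trail the competitor median on that metric.
gaps = {}
for metric in you:
    competitor_median = median(c[metric] for c in competitors)
    gaps[metric] = competitor_median - you[metric]

for metric, gap in gaps.items():
    status = "behind" if gap > 0 else "ahead"
    print(f"{metric}: {status} the competitor median by {abs(gap)}")
```

Using the median rather than the mean keeps one outlier competitor with an enormous profile from skewing the benchmark.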

It’s worth noting that it’s not as simple as whoever has the most backlinks will perform the best in search.

These numbers are typically solid indicators of how search engines gauge the authority of your competitors’ domains, and you’ll likely find a correlation between strong backlink profiles and strong search performance.

Approach Link Building With A User-First Mindset

The search landscape continues to evolve at a breakneck pace, and we could see dramatic shifts in how people search within the next five years (or sooner).

However, at this time, search engines like Google still rely on backlinks as part of their ranking algorithms, and you need to cultivate a strong backlink profile to be visible in search.

Furthermore, if you follow the advice in this article as you build out your profile, you’ll acquire backlinks that benefit your site regardless of search algorithms, futureproofing your traffic sources.

Approach link acquisition like you would any other marketing endeavor – with a customer-first mindset – and over time, you’ll naturally build a healthy, diverse backlink profile.


Featured Image: Sammby/Shutterstock


SEO

Google On Traffic Diversity As A Ranking Factor


Google answers the question of whether traffic diversity is a ranking factor for SEO

Google’s SearchLiaison tweeted encouragement to diversify traffic sources, being clear about the reason he was recommending it. Days later, someone followed up to ask if traffic diversity is a ranking factor, prompting SearchLiaison to reiterate that it is not.

What Was Said

The question of whether traffic diversity is a ranking factor arose from an earlier tweet in a discussion about whether a site owner should focus on off-site promotion.

Here’s the question from the original discussion that was tweeted:

“Can you please tell me if I’m doing right by focusing on my site and content – writing new articles to be found through search – or if I should be focusing on some off-site effort related to building a readership? It’s frustrating to see traffic go down the more effort I put in.”

SearchLiaison split the question into component parts and answered each one. When it came to the part about off-site promotion, SearchLiaison (who is Danny Sullivan), shared from his decades of experience as a journalist and publisher covering technology and search marketing.

I’m going to break down his answer so that it’s clearer what he meant

This is the part from the tweet that talks about off-site activities:

“As to the off-site effort question, I think from what I know from before I worked at Google Search, as well as my time being part of the search ranking team, is that one of the ways to be successful with Google Search is to think beyond it.”

What he is saying here is simple: don't limit your thinking about what to do with your site to how to make it appeal to Google.

He next explains that sites that rank tend to be sites that are created to appeal to people.

SearchLiaison continued:

“Great sites with content that people like receive traffic in many ways. People go to them directly. They come via email referrals. They arrive via links from other sites. They get social media mentions.”

What he’s saying there is that you’ll know that you’re appealing to people if people are discussing your site in social media, if people are referring the site in social media and if other sites are citing it with links.

Other ways to know that a site is doing well are when people engage in the comments section, email follow-up questions, and send notes of thanks or anecdotes of their success or satisfaction with a product or advice.

Consider this: fast-fashion site Shein at one point didn't rank for its chosen keyword phrases (I know because I checked out of curiosity). But at the time it was virally popular and making huge amounts of sales by gamifying site interaction and engagement, propelling it to become a global brand. A similar strategy propelled Zappos when it pioneered no-questions-asked returns and cheerful customer service.

SearchLiaison continued:

“It just means you’re likely building a normal site in the sense that it’s not just intended for Google but instead for people. And that’s what our ranking systems are trying to reward, good content made for people.”

SearchLiaison explicitly said that having diversified traffic is not a ranking factor.

He added this caveat to his tweet:

“This doesn’t mean you should get a bunch of social mentions, or a bunch of email mentions because these will somehow magically rank you better in Google (they don’t, from how I know things).”

Despite The Caveat…

A journalist tweeted this:

“Earlier this week, @searchliaison told people to diversify their traffic. Naturally, people started questioning whether that meant diversity of traffic was a ranking factor.

So, I asked @iPullRank what he thought.”

SearchLiaison of course answered that he explicitly said it’s not a ranking factor and linked to his original tweet that I quoted above.

He tweeted:

“I mean that’s not exactly what I myself said, but rather repeat all that I’ll just add the link to what I did say:”

The journalist responded:

“I would say this is calling for publishers to diversify their traffic since you’re saying the great sites do it. It’s the right advice to give.”

And SearchLiaison answered:

“It’s the part of “does it matter for rankings” that I was making clear wasn’t what I myself said. Yes, I think that’s a generally good thing, but it’s not the only thing or the magic thing.”

Not Everything Is About Ranking Factors

There is a longstanding practice by some SEOs to parse everything that Google publishes for clues to how Google’s algorithm works. This happened with the Search Quality Raters guidelines. Google is unintentionally complicit because it’s their policy to (in general) not confirm whether or not something is a ranking factor.

This habit of searching for “ranking factors” leads to misinformation. It takes more acuity to read research papers and patents to gain a general understanding of how information retrieval works, but genuinely trying to understand something is more work than skimming a PDF for ranking factors.

The worst approach to understanding search is to invent hypotheses about how Google works and then pore through a document to confirm those guesses (and falling into the confirmation bias trap).

In the end, it may be more helpful to back off of exclusively optimizing for Google and focus at least as much on optimizing for people (which includes optimizing for traffic). I know it works because I've been doing it for years.

Featured Image by Shutterstock/Asier Romero


SEO

The Complete Guide to Google My Business for Local SEO


What is Google My Business?

Google My Business (GMB) is a free tool that business owners can use to manage their online presence across Google Search and Google Maps.

The profile also displays important business details, such as your address, phone number, and operating hours, making them easily accessible to potential customers.

Google My Business profile shown on Google Maps

When you click on a business listing in the search results, a detailed sidebar opens on the right side of the screen, providing comprehensive information about the business.

This includes popular times, which show when the business is busiest, a Q&A section where potential users can ask questions and receive responses from the business or other customers, and a photos and videos section that showcases products and services. Customer reviews and ratings are also displayed, which are crucial for building trust and credibility.

Business details on Google My Business profile

Using Google My Business for Local SEO

Having an optimized Google Business Profile ensures that your business is visible, searchable, and can attract potential customers who are looking for your products and services.

  • Increased reliance on online discovery: More consumers are going online to search and find local businesses, making it crucial to have a GMB listing.
  • Be where your customers are searching: GMB ensures your business information is accurate and visible on Google Search and Maps, helping you stay competitive.
  • Connect with customers digitally: GMB allows customers to connect with your business through various channels, including messaging and reviews.
  • Build your online reputation: GMB makes it easy for customers to leave reviews, which can improve your credibility and trustworthiness.
  • Location targeting: GMB enables location-based targeting, showing your ads to people searching for businesses in your exact location.
  • Measurable results: GMB provides actionable analytics, allowing you to track your performance and optimize your listing.

How to Set Up Google My Business

If you already have a profile and need help claiming, verifying, and/or optimizing it, skip to the next sections.

If you’re creating a new Google My Business profile, here’s a step-by-step guide:

Access or Create your Google Account

Step 1: Access or Create Your Google Account

If you don’t already have a Google account, follow these steps to create one:

  • Visit the Google Account Sign-up Page: Go to the Google Account sign-up page and click on “Create an account.”
  • Enter Your Information: Fill in the required fields, including your name, email address, and password.
  • Verify Your Account: Google will send a verification email to your email address. Click on the link in the email to confirm your account.

Step 2: Access Google My Business

Go to the Google My Business website and sign in with your Google account.

Business name on Google My Business

Step 3: Enter Your Business Name and Category

  • Type in your exact business name. Google will suggest existing businesses as you type.
  • If your business is not listed, fully type out the name as it appears.
  • Search for and select your primary business category.

Adding business address to Google My Business profile

Step 4: Provide Your Business Address

  • If you have a physical location where customers can visit, select “Yes” and enter your address.
  • If you are a service area business without a physical location, select “No” and enter your service area.

Adding contact information to Google My Business profile

Step 5: Add Your Contact Information

  • Enter your business phone number and website URL.
  • You can also create a free website based on your GMB information.

Complete Your Profile

Step 6: Complete Your Profile

To complete your profile, add the following details:

  • Hours of Operation: Enter your business’s operating hours to help customers plan their visits.
  • Services: List the services your business offers to help customers understand what you do.
  • Description: Write a detailed description of your business to help customers understand your offerings.

Now that you know how to set up your Google My Business account, all that’s left is to verify it. 

Verification is essential for you to manage and update business information whenever you need to, and for Google to show your business profile to the right users and for the right search queries. 

Whether you want to claim an existing listing or are on the last step of setting up your GMB, the next sections will walk you through the verification process to solidify your business's online credibility and visibility.

How to Verify Google My Business

There are several ways you can verify your business, including:

  • Postcard Verification: Google will send a postcard to your business address with a verification code. Enter the code on your GMB dashboard to verify.
  • Phone Verification: Google will call your business phone number and provide a verification code. Enter the code on your GMB dashboard to verify.
  • Email Verification: If you have a business email address, you can use it to verify your listing.
  • Instant Verification: If your business website is already verified with Google Search Console, you may be eligible for instant verification.

How to Claim & Verify an Existing Google My Business Profile

If your business has an existing Google My Business profile, and you want to claim it, then follow these steps:

Sign in to Google Account

Step 1: Sign in to Google My Business

Access Google My Business: Go to the Google My Business website and sign in with your Google account. If you don’t have a Google account, create one by following the sign-up process.

Search for Your Business

Step 2: Search for Your Business

Enter your business name in the search bar to find your listing. If your business is already listed, you will see it in the search results.

Request access to existing Google My Business account

Step 3: Claim Your Listing

If your business is not already claimed, you will see a “Claim this business” button. Click on this button to start the claiming process.

Editing business information on Google My Business

Step 4: Complete Your Profile

Once your listing is verified, you can complete your profile by adding essential business information such as:

  • Business Name: Ensure it matches your real-world business name.
  • Address: Enter your business address accurately.
  • Phone Number: Enter your business phone number.
  • Hours of Operation: Specify your business hours.
  • Categories: Choose relevant categories that describe your business.
  • Description: Write a brief description of your business.

Step 5: Manage Your Listing

Regularly check and update your listing to ensure it remains accurate and up-to-date. Respond to customer reviews and use the performance insights in your Business Profile dashboard to improve your listing.

Unverified Google My Business profile

Step 6: Verification 

Verify your business by postcard, phone, or email, as described above.

Now that you have successfully set up and verified your Google My Business listing, it’s time to optimize it for maximum visibility and effectiveness. By doing this, you can improve your local search rankings, increase customer engagement, and drive more conversions.

How to Optimize Google My Business

Here are the tips that I usually do when I’m optimizing my GMB account: 

  1. Complete Your Profile: Start by ensuring every section applicable to your business is filled out with accurate and up-to-date information. Use your real business name without keyword stuffing to avoid suspension. Ensure your address and phone number are consistent with those on your website and other online directories, and add a link to your website and social media accounts.
  2. Optimize for Keywords: Integrate relevant keywords into your business description, services, and posts. However, avoid stuffing your GMB profile with keywords, as this can appear spammy and reduce readability.
  3. Add Backlinks: Encourage local websites, blogs, and business directories to link to your GMB profile.
  4. Select Appropriate Categories: Choose the most relevant primary category for your business to help Google understand what your business is about. Additionally, add secondary categories that accurately describe your business's offerings to capture more relevant search traffic.
  5. Encourage and Manage Reviews: Ask satisfied customers to leave positive reviews on your profile, as reviews significantly influence potential customers. Respond to all reviews, both positive and negative, in a professional and timely manner. Addressing negative feedback shows that you value customer opinions and are willing to improve.
  6. Add High-Quality Photos and Videos: Use high-quality images for your profile and cover photos that represent your business well. Upload additional photos of your products, services, team, and premises. Adding short, engaging videos can give potential customers a virtual tour or highlight key services, enhancing their interest.

By following this comprehensive guide, you have successfully set up, verified, and optimized your GMB profile. Remember to continuously maintain and update your profile to ensure maximum impact and success.

Key Takeaway: 

With more and more people turning to Google for all their needs, creating, verifying, and optimizing your Google My Business profile is a must if you want your business to be found. 

Follow this guide to Google My Business, and you’re going to see increased online presence across Google Search and Google Maps in no time.
