How to Do an SEO Log File Analysis [Template Included]

Log files have been receiving increasing recognition from technical SEOs over the past five years, and for a good reason.

They’re the most trustworthy source of information to understand the URLs that search engines have crawled, which can be critical information to help diagnose problems with technical SEO.

Google itself recognizes their importance, releasing new features in Google Search Console and making it easy to see samples of data that would previously only be available by analyzing logs.

Crawl stats report; key data above and line graph showing trend of crawl requests below

In addition, Google Search Advocate John Mueller has publicly stated how much good information log files hold.

With all this hype around the data in log files, you may want to understand logs better, how to analyze them, and whether the sites you’re working on will benefit from them.

This article will answer all of that and more.

First, what is a server log file?

A server log file is a file created and updated by a server that records the activities it has performed. A popular server log file is an access log file, which holds a history of HTTP requests to the server (by both users and bots).

When a non-developer mentions a log file, access logs are the ones they’ll usually be referring to.

Developers, however, find themselves spending more time looking at error logs, which report issues encountered by the server.

The above is important: If you request logs from a developer, the first thing they’ll ask is, “Which ones?”

Therefore, always be specific with log file requests. If you want logs to analyze crawling, ask for access logs.

Access log files contain lots of information about each request made to the server, such as the following:

  • IP addresses
  • User agents
  • URL path
  • Timestamps (when the bot/browser made the request)
  • Request type (GET or POST)
  • HTTP status codes
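
To make this concrete, here's what a single (made-up) entry looks like in Apache's "combined" log format, with the IP address, timestamp, request, status code, and user agent all on one line:

66.249.66.1 - - [03/Jan/2022:10:15:32 +0000] "GET /blog/example-page/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"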

What servers include in access logs varies by the server type and sometimes what developers have configured the server to store in log files. Common formats for log files include the following:

  • Apache format – This is used by Nginx and Apache servers.
  • W3C format – This is used by Microsoft IIS servers.
  • ELB format – This is used by Amazon Elastic Load Balancing.
  • Custom formats – Many servers support outputting a custom log format.

Other forms exist, but these are the main ones you’ll encounter.

How log files benefit SEO

Now that we’ve got a basic understanding of log files, let’s see how they benefit SEO.

Here are some key ways:

  • Crawl monitoring – You can see the URLs search engines crawl and use this to spot crawler traps, look out for crawl budget wastage, or better understand how quickly content changes are picked up.
  • Status code reporting – This is particularly useful for prioritizing fixing errors. Rather than knowing you’ve got a 404, you can see precisely how many times a user/search engine is visiting the 404 URL.
  • Trends analysis – By monitoring crawling over time to a URL, page type/site section, or your entire site, you can spot changes and investigate potential causes.
  • Orphan page discovery – You can cross-analyze data from log files and a site crawl you run yourself to discover orphan pages.

All sites will benefit from log file analysis to some degree, but the amount of benefit varies massively depending on site size.

This is because log files primarily benefit sites by helping you better manage crawling. Google itself states that managing crawl budget is something larger-scale or frequently changing sites will benefit from.

Excerpt of Google article

The same is true for log file analysis.

For example, smaller sites can likely use the “Crawl stats” data provided in Google Search Console and receive all of the benefits mentioned above—without ever needing to touch a log file.

Gif of Crawl stats report being scrolled down gradually

Admittedly, Google won't provide you with all URLs crawled (as log files do), and the trend analysis is limited to three months of data.

However, smaller sites that change infrequently also need less ongoing technical SEO. It’ll likely suffice to have a site auditor discover and diagnose issues.

For example, a cross-analysis from a site crawler, XML sitemaps, Google Analytics, and Google Search Console will likely discover all orphan pages.

You can also use a site auditor to discover error status codes from internal links.

There are a few key reasons I’m pointing this out:

  • Access log files aren’t easy to get a hold of (more on this next).
  • For small sites that change infrequently, the benefit of log files isn't as great, meaning SEO efforts are likely better spent elsewhere.

How to access your log files

In most cases, to analyze log files, you’ll first have to request access to log files from a developer.

The developer is then likely going to have a few issues, which they’ll bring to your attention. These include:

  • Partial data – Log files can include partial data scattered across multiple servers. This usually happens when developers use various servers, such as an origin server, load balancers, and a CDN. Getting an accurate picture of all logs will likely mean compiling the access logs from all servers.
  • File size – Access log files for high-traffic sites can end up in terabytes, if not petabytes, making them hard to transfer.
  • Privacy/compliance – Log files include user IP addresses that are personally identifiable information (PII). User information may need removing before it can be shared with you.
  • Storage history – Due to file size, developers may have configured access logs to be stored for a few days only, making them not useful for spotting trends and issues.

These issues raise the question of whether storing, merging, filtering, and transferring log files is worth the dev effort, especially if developers already have a long list of priorities (which is often the case).

Developers will likely put the onus on the SEO to explain and build a case for why they should invest time in this, and you'll need to weigh that case against your other SEO priorities.

These issues are precisely why log file analysis doesn’t happen frequently.

Log files you receive from developers are also often in formats that popular log file analysis tools don't support, making analysis more difficult.

Thankfully, there are software solutions that simplify this process. My favorite is Logflare, a Cloudflare app that can store log files in a BigQuery database that you own.

How to analyze your log files

Now it’s time to start analyzing your logs.

I’m going to show you how to do this in the context of Logflare specifically; however, the tips on how to use log data will work with any logs.

The template I’ll share shortly also works with any logs. You’ll just need to make sure the columns in the data sheets match up.

1. Start by setting up Logflare (optional)

Logflare is simple to set up. And with the BigQuery integration, it stores data long term. You’ll own the data, making it easily accessible for everyone.

There’s one difficulty. You need to swap out your domain name servers to use Cloudflare ones and manage your DNS there.

For most, this is fine. However, if you’re working with a more enterprise-level site, it’s unlikely you can convince the server infrastructure team to change the name servers to simplify log analysis.

I won’t go through every step on how to get Logflare working. But to get started, all you need to do is head to the Cloudflare Apps part of your dashboard.

"Apps" in a sidebar

And then search for Logflare.

"Logflare" appearing in search field on top-right corner, and the app appearing below in the results

The setup past this point is self-explanatory (create an account, give your project a name, choose the data to send, etc.). The only extra part I recommend following is Logflare’s guide to setting up BigQuery.

Bear in mind, however, that BigQuery does have a cost that’s based on the queries you do and the amount of data you store.

Sidenote.

 It’s worth noting that one significant advantage of the BigQuery backend is that you own the data. That means you can circumvent PII issues by configuring Logflare not to send PII like IP addresses and delete PII from BigQuery using an SQL query.
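
For instance, if you materialize the flattened export from step 3 into its own table, scrubbing IP addresses is a single DML statement. This is just a sketch; the project, dataset, table, and column names are placeholders you'd swap for your own:

UPDATE `your_project.your_dataset.logs_flat`
SET IP = NULL  -- remove the PII column once you no longer need it
WHERE IP IS NOT NULL;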

2. Verify Googlebot

We've now stored log files (via Logflare or an alternative method). Next, we need to extract the log entries for precisely the user agents we want to analyze. For most, this will be Googlebot.

Before we do that, we have another hurdle to clear.

Many bots pretend to be Googlebot to get past firewalls (if you have one). In addition, some auditing tools do the same to get an accurate reflection of the content your site returns for the user agent, which is essential if your server returns different HTML for Googlebot, e.g., if you’ve set up dynamic rendering.

I’m not using Logflare

If you aren’t using Logflare, identifying Googlebot will require a reverse DNS lookup to verify the request did come from Google.

Google has a handy guide on validating Googlebot manually here.

Excerpt of Google article

You can do this on a one-off basis, using a reverse IP lookup tool and checking the domain name returned.

However, we need to do this in bulk for all rows in our log files. This also requires you to match IP addresses from a list provided by Google.

The easiest way to do this is by using server firewall rule sets maintained by third parties that block fake bots (resulting in fewer/no fake Googlebots in your log files). A popular one for Nginx is “Nginx Ultimate Bad Bot Blocker.”

Alternatively, something you'll note on the list of Googlebot IPs is that the IPv4 addresses all begin with "66."

List of IPv4 addresses

While it won't be 100% accurate, you can also check for Googlebot by filtering for IP addresses starting with "66." when analyzing the data within your logs.
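
If you're doing that filtering in BigQuery rather than in a spreadsheet, it's a one-line addition to the WHERE clause. Here's a sketch that assumes a flattened table of log rows using the column aliases from the export query later in this guide (the table name is a placeholder):

SELECT Date, URL, Status_Code
FROM `your_project.your_dataset.logs_flat`
WHERE User_Agent LIKE '%Googlebot%'
  AND IP LIKE '66.%'  -- rough check: Googlebot IPv4 addresses start with "66."
ORDER BY Date DESC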

I’m using Cloudflare/Logflare

Cloudflare’s pro plan (currently $20/month) has built-in firewall features that can block fake Googlebot requests from accessing your site.

Cloudflare pricing

Cloudflare disables these features by default, but you can find them by heading to Firewall > Managed Rules, enabling "Cloudflare Specials," and selecting "Advanced":

Webpage showing "Managed Rules"

Next, change the search type from “Description” to “ID” and search for “100035.”

List of description IDs

Cloudflare will now present you with a list of options to block fake search bots. Set the relevant ones to “Block,” and Cloudflare will check all requests from search bot user agents are legitimate, keeping your log files clean.

3. Extract data from log files

Finally, we now have access to log files, and we know the log files accurately reflect genuine Googlebot requests.

I recommend analyzing your log files within Google Sheets/Excel to start with because you’ll likely be used to spreadsheets, and it’s simple to cross-analyze log files with other sources like a site crawl.

There is no one right way to do this; I'll walk through exporting from BigQuery into Google Sheets below.

You can also do this within a Data Studio report. I find Data Studio helpful for monitoring data over time, while Google Sheets/Excel is better for a one-off analysis when technical auditing.

Open BigQuery and head to your project/dataset.

Sidebar showing project dataset

Select the “Query” dropdown and open it in a new tab.

"Query" dropdown showing 2 options: new tab or split tab

Next, you’ll need to write some SQL to extract the data you’ll be analyzing. To make this easier, first copy the contents of the FROM part of the query.

FROM part of the query

And then you can add that within the query I’ve written for you below:

SELECT
  DATE(timestamp) AS Date,
  req.url AS URL,
  req_headers.cf_connecting_ip AS IP,
  req_headers.user_agent AS User_Agent,
  resp.status_code AS Status_Code,
  resp.origin_time AS Origin_Time,
  resp_headers.cf_cache_status AS Cache_Status,
  resp_headers.content_type AS Content_Type
FROM `[Add Your from address here]`,
  UNNEST(metadata) m,
  UNNEST(m.request) req,
  UNNEST(req.headers) req_headers,
  UNNEST(m.response) resp,
  UNNEST(resp.headers) resp_headers
WHERE DATE(timestamp) >= "2022-01-03"
  AND (req_headers.user_agent LIKE '%Googlebot%' OR req_headers.user_agent LIKE '%bingbot%')
ORDER BY timestamp DESC

This query selects all the columns of data that are useful for log file analysis for SEO purposes. It also only pulls data for Googlebot and Bingbot.

Sidenote.

If there are other bots you want to analyze, just add another OR req_headers.user_agent LIKE '%bot_name%' within the WHERE statement. You can also easily change the start date by updating the WHERE DATE(timestamp) >= "2022-01-03" line.
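
For example, here's how that WHERE statement might look with a later start date and DuckDuckBot added as a third bot (DuckDuckBot is just an illustrative choice), swapped in for the original WHERE line in the query above:

WHERE DATE(timestamp) >= "2022-02-01"
  AND (req_headers.user_agent LIKE '%Googlebot%'
    OR req_headers.user_agent LIKE '%bingbot%'
    OR req_headers.user_agent LIKE '%DuckDuckBot%')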

Select “Run” at the top. Then choose to save the results.

Button to "save results"

Next, save the data to a CSV in Google Drive (this is the best option due to the larger file size).

And then, once BigQuery has run the job and saved the file, open the file with Google Sheets.

4. Add to Google Sheets

We’re now going to start with some analysis. I recommend using my Google Sheets template. But I’ll explain what I’m doing, and you can build the report yourself if you want.

Here is my template.

The template consists of two data tabs that you copy and paste your data into; these then feed all of the other tabs via the Google Sheets QUERY function.

Sidenote.

If you want to see how I’ve completed the reports that we’ll run through after setting up, select the first cell in each table.

To start with, copy and paste the output of your export from BigQuery into the “Data — Log files” tab.

Output from BigQuery

Note that there are multiple columns added to the end of the sheet (in darker grey) to make analysis a little easier (like the bot name and first URL directory).

5. Add Ahrefs data

If you have a site auditor, I recommend adding more data to the Google Sheet. Mainly, you should add these:

  • Organic traffic
  • Status codes
  • Crawl depth
  • Indexability
  • Number of internal links

To get this data out of Ahrefs’ Site Audit, head to Page Explorer and select “Manage Columns.”

I then recommend adding the columns shown below:

Columns to add

Then export all of that data.

Options to export to CSV

And copy and paste into the “Data — Ahrefs” sheet.

6. Check for status codes

The first thing we’ll analyze is status codes. This data will answer whether search bots are wasting crawl budget on non-200 URLs.

Note that this doesn’t always point toward an issue.

Sometimes, Google can crawl old 301s for many years. However, it can highlight an issue if you’re internally linking to many non-200 status codes.
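
If you'd like a quick sanity check straight from BigQuery before you open the sheet, a simple aggregation over the exported data gives you the same picture. This sketch assumes you've saved the flattened export as a table (the table name is a placeholder; the column aliases match the export query):

SELECT
  Status_Code,
  COUNT(*) AS Hits
FROM `your_project.your_dataset.logs_flat`
GROUP BY Status_Code
ORDER BY Hits DESC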

The “Status Codes — Overview” tab has a QUERY function that summarizes the log file data and displays the results in a chart.

Pie chart showing summary of log file data for status codes

There is also a dropdown to filter by bot type and see which ones are hitting non-200 status codes the most.

Table showing status codes and corresponding hits; above, dropdown to filter results by bot type

Of course, this report alone doesn’t help us solve the issue, so I’ve added another tab, “URLs — Overview.”

List of URLs with corresponding data like status codes, organic traffic, etc

You can use this to filter for URLs that return non-200 status codes. As I’ve also included data from Ahrefs’ Site Audit, you can see whether you’re internally linking to any of those non-200 URLs in the “Inlinks” column.

If you see a lot of internal links to the URL, you can then use the Internal link opportunities report to spot these incorrect internal links by simply copying and pasting the URL in the search bar with “Target page” selected.

Excerpt of Internal link opportunities report results

7. Detect crawl budget wastage

The best way to highlight crawl budget wastage from log files that isn’t due to crawling non-200 status codes is to find frequently crawled non-indexable URLs (e.g., they’re canonicalized or noindexed).

Since we’ve added data from our log files and Ahrefs’ Site Audit, spotting these URLs is straightforward.

Head to the “Crawl budget wastage” tab, and you’ll find highly crawled HTML files that return a 200 but are non-indexable.

List of URLs with corresponding data like hits, etc
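
If you want to pull the raw "most crawled 200 HTML URLs" list straight from BigQuery (the indexability flag still has to come from the Ahrefs data in the sheet), a query like this sketch works, again assuming a flattened table with the export query's column aliases:

SELECT
  URL,
  COUNT(*) AS Hits
FROM `your_project.your_dataset.logs_flat`
WHERE Status_Code = 200
  AND Content_Type LIKE '%text/html%'
GROUP BY URL
ORDER BY Hits DESC
LIMIT 100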

Now that you have this data, you’ll want to investigate why the bot is crawling the URL. Here are some common reasons:

  • It’s internally linked to.
  • It’s incorrectly included in XML sitemaps.
  • It has links from external sites.

It’s common for larger sites, especially those with faceted navigation, to link to many non-indexable URLs internally.

If the hit numbers in this report are very high and you believe you’re wasting your crawl budget, you’ll likely need to remove internal links to the URLs or block crawling with the robots.txt.

8. Monitor important URLs

If you have specific URLs on your site that are incredibly important to you, you may want to watch how often search engines crawl them.

The “URL monitor” tab does just that, plotting the daily trend of hits for up to five URLs that you can add.

Line graph showing daily trend of hits for 4 URLs

You can also filter by bot type, making it easy to monitor how often Bing or Google crawls a URL.

URL monitoring with dropdown option to filter by bot type
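
Conceptually, this tab is just counting hits per day for each URL you add. If you'd rather run that directly in BigQuery, here's a sketch (the table name and example URL path are placeholders):

SELECT
  Date,
  COUNT(*) AS Hits
FROM `your_project.your_dataset.logs_flat`
WHERE URL = '/an-important-page/'
  AND User_Agent LIKE '%Googlebot%'
GROUP BY Date
ORDER BY Date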

Sidenote.

You can also use this report to check URLs you’ve recently redirected. Simply add the old URL and new URL in the dropdown and see how quickly Googlebot notices the change.

Often, the advice here is that it’s a bad thing if Google doesn’t crawl a URL frequently. That simply isn’t the case.

While Google tends to crawl popular URLs more frequently, it will likely crawl a URL less if it doesn’t change often.

Excerpt of Google article

Still, it’s helpful to monitor URLs like this if you need content changes picked up quickly, such as on a news site’s homepage.

In fact, if you notice Google is recrawling a URL too frequently, I'd advocate for trying to help it better manage crawl rate by doing things like adding <lastmod> to XML sitemaps. Here's what it looks like:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.logexample.com/example</loc>
    <lastmod>2022-10-04</lastmod>
  </url>
</urlset>

You can then update the <lastmod> value whenever the content of the page changes, signaling to Google that it should recrawl the URL.

9. Find orphan URLs

Another way to use log files is to discover orphan URLs, i.e., URLs that you want search engines to crawl and index but haven’t internally linked to.

We can do this by checking for 200 status code HTML URLs with no internal links found by Ahrefs’ Site Audit.

You can see the report I’ve created for this named “Orphan URLs.”

List of URLs with corresponding data like hits, etc

There is one caveat here. As Ahrefs hasn’t discovered these URLs but Googlebot has, these URLs may not be URLs we want to link to because they’re non-indexable.

I recommend copying and pasting these URLs using the “Custom URL list” functionality when setting up crawl sources for your Ahrefs project.

Page to set up crawl sources; text field to enter custom URLs

This way, Ahrefs will now consider these orphan URLs found in your log files and report any issues to you in your next crawl:

List of issues

10. Monitor crawling by directory

Suppose you’ve implemented structured URLs that indicate how you’ve organized your site (e.g., /features/feature-page/).

In that case, you can also analyze log files based on the directory to see if Googlebot is crawling certain sections of the site more than others.

I’ve implemented this kind of analysis in the “Directories — Overview” tab of the Google Sheet.

Table showing list of directories with corresponding data like organic traffic, inlinks, etc

You can see I’ve also included data on the number of internal links to the directories, as well as total organic traffic.

You can use this to see whether Googlebot is spending more time crawling low-traffic directories than high-value ones.

But again, bear in mind this may occur, as some URLs within specific directories change more often than others. Still, it’s worth further investigating if you spot an odd trend.
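
If you'd rather compute the per-directory hit counts directly in BigQuery, you can group on the first URL segment. A sketch, assuming the URL column holds the path and using a placeholder table name:

SELECT
  REGEXP_EXTRACT(URL, r'^/([^/]+)/') AS Directory,
  COUNT(*) AS Hits
FROM `your_project.your_dataset.logs_flat`
GROUP BY Directory
ORDER BY Hits DESC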

In addition to this report, there is also a “Directories — Crawl trend” report if you want to see the crawl trend per directory for your site.

Line graph showing crawl trend per directory

11. View Cloudflare cache ratios

Head to the “CF cache status” tab, and you’ll see a summary of how often Cloudflare is caching your files on the edge servers.

Bar chart showing how often Cloudflare is caching files on the edge servers

When Cloudflare caches content (HIT in the above chart), the request no longer goes to your origin server and is served directly from its global CDN. This results in better Core Web Vitals, especially for global sites.
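
The chart is essentially a share-of-requests breakdown by the Cache_Status column. If you want the same numbers straight from BigQuery, here's a sketch (placeholder table name again):

SELECT
  Cache_Status,
  COUNT(*) AS Requests,
  ROUND(100 * COUNT(*) / SUM(COUNT(*)) OVER (), 1) AS Percentage
FROM `your_project.your_dataset.logs_flat`
GROUP BY Cache_Status
ORDER BY Requests DESC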

Sidenote.

 It’s also worth having a caching setup on your origin server (such as Varnish, Nginx FastCGI, or Redis full-page cache). This is so that even when Cloudflare hasn’t cached a URL, you’ll still benefit from some caching.

If you see a large amount of “Miss” or “Dynamic” responses, I recommend investigating further to understand why Cloudflare isn’t caching content. Common causes can be:

  • You’re linking to URLs with parameters in them – Cloudflare, by default, passes these requests to your origin server, as they’re likely dynamic.
  • Your cache expiry times are too low – If you set short cache lifespans, it’s likely more users will receive uncached content.
  • You aren’t preloading your cache – If you need your cache to expire often (as content changes frequently), rather than letting users hit uncached URLs, use a preloader bot that will prime the cache, such as Optimus Cache Preloader.

Sidenote.

 I thoroughly recommend setting up HTML edge-caching via Cloudflare, which significantly reduces TTFB. You can do this easily with WordPress and Cloudflare’s Automatic Platform Optimization.

12. Check which bots crawl your site the most

The final report (found in the “Bots — Overview” tab) shows you which bots crawl your site the most:

Pie chart showing Googlebot crawls site the most, as compared to Bingbot

In the “Bots — Crawl trend” report, you can see how that trend has changed over time.

Stacked bar chart showing how crawl trend changes over time

This report can help check if there’s an increase in bot activity on your site. It’s also helpful when you’ve recently made a significant change, such as a URL migration, and want to see if bots have increased their crawling to collect new data.
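
If you want the same breakdown without the sheet, a CASE expression over the user agent gets you there. A sketch with a placeholder table name (extend the CASE for any other bots you care about):

SELECT
  CASE
    WHEN User_Agent LIKE '%Googlebot%' THEN 'Googlebot'
    WHEN User_Agent LIKE '%bingbot%' THEN 'Bingbot'
    ELSE 'Other'
  END AS Bot,
  COUNT(*) AS Hits
FROM `your_project.your_dataset.logs_flat`
GROUP BY Bot
ORDER BY Hits DESC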

Final thoughts

You should now have a good idea of the analysis you can do with your log files when auditing a site. Hopefully, you’ll find it easy to use my template and do this analysis yourself.

Anything unique you’re doing with your log files that I haven’t mentioned? Tweet me.





Expert Embedding Techniques for SEO Success

AI Overviews are here, and they’re making a big impact in the world of SEO. Are you up to speed on how to maximize their impact?

Watch on-demand as we dive into the fascinating world of Google AI Overviews and their functionality, exploring the concept of embeddings and demystifying the complex processes behind them.

We covered which measures play a crucial role in how Google AI assesses the relevance of different pieces of content, helping to rank and select the most pertinent information for AI-generated responses.

You’ll see:

  • An understanding of the technical side of embeddings & how they work, enabling efficient information retrieval and comparison.
  • Insights into AI Content curation, including the criteria and algorithms used to rank and choose the most relevant snippets for AI-generated overviews.
  • A visualization of the step-by-step process of how AI overviews are constructed, with a clear perspective on the decision-making process behind AI-generated content.

With Scott Stouffer from Market Brew, we explored their AI Overviews Visualizer, a tool that deconstructs AI Overviews and provides an inside look at how Snippets and AI Overviews are curated. 

If you're looking to clear up misconceptions around AI or take on the challenge of optimizing your own content for the AI Overview revolution, be sure to watch this webinar.

View the slides below, or check out the full presentation for all the details.

Join Us For Our Next Webinar!

[Expert Panel] How Agencies Leverage AI Tools To Drive ROI

Join us as we discuss the importance of AI to your performance as an agency or small business, and how you can use it successfully.


7 Strategies to Lower Cost-Per-Lead

SEO for personal injury law firms is notorious for how expensive and competitive it can be. Even with paid ads, it's common for every click from the ad to your website to cost hundreds of dollars.

When spending this kind of money per click, the cost of gaining new cases can quickly skyrocket. Since SEO focuses on improving your visibility in the unpaid areas of search engines, you can cut costs and get more leads if you’re savvy enough.

Here are the strategies I’ve used to help new and boutique injury and accident law firms compete with the big guns for a fraction of the cost.

Recommendation

If you’re brand new to SEO, check out The Beginner’s Guide to SEO to get familiar with the fundamental concepts of SEO that apply to all websites. 

1. Add reviews, certifications, and contact details to your website

Unlike many other local service businesses, personal injury law firms need to work harder to earn trust and credibility online.

This applies to earning trust from humans and search engines alike. Google has a 170-page document called the Search Quality Rater Guidelines. This document contains two frameworks law firms can use to help Google and website visitors trust them more.

The first is “your money or your life,” or YMYL. Google uses this term to describe topics that may present a high risk of harm to searchers. Generally, any health, finances, safety, or welfare information falls into this category. Legal information is also a YMYL topic since acting on the wrong information could cause serious damage or harm to searchers.

The second framework is EEAT, which stands for experience, expertise, authoritativeness, and trustworthiness.


This framework applies more broadly to all industries and is about sharing genuine information written by experts and authorities for a given topic. Both YMYL and EEAT consider the extent to which content is accurate, honest, safe, and reliable, with the ultimate goal of delivering trustworthy information.

Here are the things I implement for my personal injury clients as a priority to improve the trustworthiness of their online presence:

  1. Prominently display star ratings from third-party platforms, like Google or Facebook reviews.
  2. Show your accreditations, certifications, awards, and the stats on cases you’ve won.
  3. If government-issued ratings or licenses apply to your practice areas, show those too.
  4. Add contact information like your phone number and address in the footer of every page.
  5. Share details of every member of your firm, highlighting their expertise and cases they’ve won.
  6. Add links to your professional profiles online, including social media and law-related listings.
  7. Include photos of your team and offices, results, case studies, and success stories.

2. Create a Google Business profile in every area you have an office

Generally speaking, your Google Business listing can account for over 50% of the leads you get from search engines. That's because it can display prominently in the map pack.

Without a Google Business listing, your firm will not show up there or within Google Maps, since the listing is managed completely separately from your website. Think of your Google listing like a social profile, but optimize it like a website. Make sure you create one of these for each location where you have an on-the-ground presence, ideally an established office.

Take the time to fill out all the details it asks for, especially:

  • Your firm’s name, address, and phone number
  • Your services with a description of each
  • Images of your premises, inside and outside the office

And anything else you see in these sections:

Google Business Listing profile information

Also, make it a regular habit to ask your clients for reviews.

Reviews are crucial for law firms. They are the number one deciding factor when someone is ready to choose a law firm to work with. While you can send automated text messages with a link to your Google profile, you’ll likely have a higher success rate if you ask clients in person while they’re in your office or by calling them.

I’ve also seen success when adding a request for a review on thank you pages.

For instance, if you ever send an electronic contract or invoice out to clients, once they've signed or paid, you can send them to a thank you page that also asks for a review. Here's my favorite example of this from a local accountant. You can emulate this concept for your own website too.

Recommendation

Optimizing your Google listing is part of local SEO. Check out our complete guide to local SEO for insights into how you can rank in more map pack results. 

3. Add a webpage for each location you serve

The most common way that people search for legal services is by searching for things like “personal injury lawyer near me” or “car accident lawyer new york”.

For instance, take a look at the monthly search volume on these “near me” keywords for an injury and accident lawyer:


People also commonly search at a state, city, and even suburb level for many legal services, especially if it’s an area of law that differs based on someone’s location. To optimize your website architecture for location keywords like these, it’s best practice to create dedicated pages for each location and then add sub-pages for each of your practice areas in that location.

For example, here’s what that would look like:

Example of a franchise's site structure with each franchisee having a content hub.

The corresponding URL structure would look like this:

  • /new-york
  • /new-york/car-accident-lawyer
  • /new-york/personal-injury-lawyer
  • /new-york/work-injury-lawyer

Pro Tip:

If you have many locations across the country, you may need to consider additional factors. The greater your number of locations, the more your SEO strategy may need to mimic a franchise’s location strategy.

Check out my guide on franchise SEO for local and national growth strategies if you have many offices nationwide. 

4. Build a topic hub for your core practice areas

A topic hub is a way to organize and link between related articles on a website. It’s sometimes referred to as a topic cluster because it groups together pages that are related to the same subject matter.


If you run a small firm or your marketing budget is tight, I recommend focusing on a single area of law and turning your website into a topical hub. You can do this by publishing different types of content, such as how-to guides, answering common questions, and creating landing pages for each of your services.

For example, if you currently offer services for immigration law, criminal defense, and personal injury compensation, each appeals to very different audience segments. They’re also very competitive when it comes to marketing, so focusing your efforts on one of these is ideal to make your budget go further.

Most areas of law are naturally suited to building out topic clusters. Every practice area tends to follow a similar pattern in how people search at different stages in their journey.

  • Top-of-funnel: When people are very early in their journey, and unaware of what type of lawyer they need, they ask a lot of high-level questions like “what is a car accident attorney”.
  • Mid-funnel: When people are in the middle of their journey, they tend to ask more nuanced questions or look for more detailed information, like “average settlement for neck injury”.
  • Bottom-of-funnel: When people are ready to hire an attorney, they search for the practice area + “attorney” or “lawyer”. Sometimes they include a location but nothing else. For example, “personal injury lawyer”.

This pattern applies to most areas of law. To apply it to your website, enter your main practice area and a few variations into Keywords Explorer:


Make sure to include a few different variations like how I’ve added different ways people search for lawyers (lawyer, attorney, solicitor) and also for other related terms (compensation, personal injury, settlement).

If you check the Matching terms report, you’ll generally get a big list that you’ll need to filter to make it more manageable when turning it into a content plan.

For example, there are 164,636 different keyword variations of how people search for personal injury lawyers. These generate over 2.4 million searches per month in the US.


You can make the list more manageable by removing keywords with no search volume. Just set the minimum volume to 1:


You can also use the include filter to only see keywords containing your location for your location landing pages:


There are also a number of distinct sub-themes relevant to your area of law. To isolate these, you can use the Cluster by Terms side panel. For instance, looking at our list of injury-related keywords, you can easily spot specific body parts that emerge as sub-themes:


Other sub-themes include:

  • How the accident happened (at work, in a car)
  • How much compensation someone can get (compensation, average, settlement)
  • How severe the injury was (traumatic)

Each of these sub-themes can be turned into a cluster. Here’s what it might look like for the topic of neck injuries:

Example of a content hub about neck injury settlements.

5. Create a knowledge hub answering common questions

People tend to ask a lot of questions related to most areas of law. As you go through the exercise of planning out your topic clusters, you should also consider building out a knowledge hub where people can more easily navigate your FAQs and find the answers they’re looking for.

Use the knowledge base exclusively for question-related content. You can find the most popular questions people ask after an accident or injury in the Matching terms > Questions tab:


You can also easily see clusters of keywords for the top-of-funnel and mid-funnel questions people ask by checking the Clusters by Parent Topic report. It groups these keywords into similar themes and each group can likely be covered in a single article.


Here’s an example of how Smith’s Lawyers has created a knowledge base with a search feature and broad categories to allow people to find answers to all their questions more easily.


The easier you make it for people to find answers on your website, the less inclined they are to go back to Google and potentially visit a competitor’s website instead. It also increases their interaction time with your brand, giving you a higher chance of being front-of-mind when they are ready to speak to a lawyer about their case.

6. Use interactive content where applicable

Some areas of law lend themselves to certain types of interactive content. An obvious example is a compensation calculator for injury and accident claims. Doing a very quick search, there are over 1,500 keywords on this topic searched over 44,000 times a month in the US.

The best part is how insanely low the competition is on these keywords:


Keyword difficulty is graded on a 100-point scale, so single-digit figures mean there’s virtually no competition to contend with. It’s not all that hard to create a calculator either.

There are many low-cost, no-code tools on the market, like Outgrow, that allow you to create a simple calculator in no time. Other types of interactive content you could consider are:

  • Quiz-style questionnaires: great for helping people decide if they need a lawyer for their case.
  • Chatbots: to answer people’s questions in real-time.
  • Assessments: to pre-qualify leads before they book a meeting with you.
  • Calendar or countdown clock: to help people keep track of imminent deadlines.

7. Gain links by sharing your expertise with writers and journalists

Backlinks are like the internet’s version of citations. They are typically dark blue, underlined text that connects you to a different page on the internet. In SEO, links play a very important role for a few different reasons:

  1. Links are how search engines discover new content. Your content may not be discovered if you have no links pointing to it.
  2. Links are like votes in a popularity contest. The more you have from authoritative websites in your industry, the more they elevate your brand.
  3. Links also help search engines understand what different websites are about. Getting links from other law-related websites will help build relevancy to your brand.

Think of link building as a scaled-down version of PR. It’s often easier and cheaper to implement. However, it is very time-intensive in most cases. If you’re doing your own SEO, hats off to you!

However, I’d recommend you consider partnering with an agency that specializes in law firm SEO and can handle link building for you. Typically, agencies like these will have existing relationships with law-related websites where they can feature your brand, which will be completely hands-off for you.

For instance, Webris has a database of thousands of legal websites on which they have been able to feature their clients. If you don’t have an existing database to work with and you’re doing SEO yourself, here are some alternative tactics to consider.

Expert quotes

Many journalists and writers benefit from quoting subject-matter experts in their content. You could be such an expert, and every time someone quotes you, ask for a link back to your website. Check out platforms like Muck Rack or SourceBottle, where reporters post callouts for specific experts they’re looking to get quotes from or feature in their articles.


Guest posting

If you like writing content, you can alternatively create content for other people’s websites and include links back to your site. This approach is more time intensive. To make the effort worth it, reach out to websites with an established audience so you get some additional brand exposure too.

Updating outdated content

If you’re checking out other people’s legal content and you ever notice a mistake or outdated information, you could reach out and offer to help them correct it in exchange for a link to your website.

Naturally, you’ll need to recommend updates for sections of content that relate to your practice areas for this to work and for the link to make sense in the context of the content.

Final thoughts

SEO for personal injury lawyers is one of the most competitive niches. High advertising costs and high competition levels make it difficult for new or small firms to compete against industry giants.

As a new or emerging firm, you can take a more nimble approach and outrank the big firms for low competition keywords they haven’t optimized their websites for. It’s all about doing thorough research to uncover these opportunities in your practice area.

Want to know more? Reach out on LinkedIn.


Google Ads To Phase Out Enhanced CPC Bidding Strategy

Google has announced plans to discontinue its Enhanced Cost-Per-Click (eCPC) bidding strategy for search and display ad campaigns.

This change, set to roll out in stages over the coming months, marks the end of an era for one of Google’s earliest smart bidding options.

Dates & Changes

Starting October 2024, new search and display ad campaigns will no longer be able to select Enhanced CPC as a bidding strategy.

However, existing eCPC campaigns will continue to function normally until March 2025.

From March 2025, all remaining search and display ad campaigns using Enhanced CPC will be automatically migrated to manual CPC bidding.

Advertisers who prefer not to change their campaigns before this date will see their bidding strategy default to manual CPC.

Impact On Display Campaigns

No immediate action is required for advertisers running display campaigns with the Maximize Clicks strategy and Enhanced CPC enabled.

These campaigns will automatically transition to the Maximize Clicks bidding strategy in March 2025.

Rationale Behind The Change

Google introduced Enhanced CPC over a decade ago as its first Smart Bidding strategy. The company has since developed more advanced machine learning-driven bidding options, such as Maximize Conversions with an optional target CPA and Maximize Conversion Value with an optional target ROAS.

In an email to affected advertisers, Google stated:

“These strategies have the potential to deliver comparable or superior outcomes. As we transition to these improved strategies, search and display ads campaigns will phase out Enhanced CPC.”

What This Means for Advertisers

This update signals Google’s continued push towards more sophisticated, AI-driven bidding strategies.

In the coming months, advertisers currently relying on Enhanced CPC will need to evaluate their options and potentially adapt their campaign management approaches.

While the change may require some initial adjustments, it also allows advertisers to explore and leverage Google’s more advanced bidding strategies, potentially improving campaign performance and efficiency.


FAQ

What change is Google implementing for Enhanced CPC bidding?

Google will discontinue the Enhanced Cost-Per-Click (eCPC) bidding strategy for search and display ad campaigns.

  • New search and display ad campaigns can’t select eCPC starting October 2024.
  • Existing campaigns will function with eCPC until March 2025.
  • From March 2025, remaining eCPC campaigns will switch to manual CPC bidding.

How will this update impact existing campaigns using Enhanced CPC?

Campaigns using Enhanced CPC will continue as usual until March 2025. After that:

  • Search and display ad campaigns employing eCPC will automatically migrate to manual CPC bidding.
  • Display campaigns with Maximize Clicks and eCPC enabled will transition to the Maximize Clicks strategy in March 2025.

What are the recommended alternatives to Enhanced CPC?

Google suggests using its more advanced, AI-driven bidding strategies:

  • Maximize Conversions – Can include an optional target CPA (Cost Per Acquisition).
  • Maximize Conversion Value – Can include an optional target ROAS (Return on Ad Spend).

These strategies are expected to deliver comparable or superior outcomes compared to Enhanced CPC.

What should advertisers do in preparation for this change?

Advertisers need to evaluate their current reliance on Enhanced CPC and explore alternatives:

  • Assess how newer AI-driven bidding strategies can be integrated into their campaigns.
  • Consider transitioning some campaigns earlier to adapt to the new strategies gradually.
  • Leverage tools and resources provided by Google to maximize performance and efficiency.

This proactive approach will help manage changes smoothly and explore potential performance improvements.


