
Using Python + Streamlit To Find Striking Distance Keyword Opportunities


Python is an excellent tool for automating repetitive tasks and gaining additional insights into data.

In this article, you’ll learn how to build a tool that checks which keywords are close to ranking in positions one to three and advises whether there is an opportunity to naturally work those keywords into the page.

It’s perfect for Python beginners and pros alike and is a great introduction to using Python for SEO.

If you’d just like to get stuck in, there’s a handy Streamlit app available for the code. It’s simple to use and requires no coding experience.

There’s also a Google Colaboratory Sheet if you’d like to poke around with the code. If you can crawl a website, you can use this script!

Here’s an example of what we’ll be making today:

Screenshot from Microsoft Excel, October 2021: An Excel sheet documenting on-page keyword opportunities generated with Python.

These keywords are found in the page title and H1, but not in the copy. Adding these keywords naturally to the existing copy would be an easy way to increase relevancy for these keywords.

By taking the hint from search engines and naturally including any missing keywords a site already ranks for, we increase the confidence of search engines to rank those keywords higher in the SERPs.

This report can be created manually, but it’s pretty time-consuming.

So, we’re going to automate the process using a Python SEO script.

Preview Of The Output

This is a sample of what the final output will look like after running the report:

Screenshot from Microsoft Excel, October 2021: An example of keywords that can be optimized using the striking distance report.

The final output takes the top five opportunities by search volume for each page and neatly lays each one horizontally along with the estimated search volume.

It also shows the total search volume of all keywords a page has within striking distance, as well as the total number of keywords within reach.

The top five keywords by search volume are then checked to see if they are found in the title, H1, or copy, then flagged TRUE or FALSE.

This is great for finding quick wins! Just add the missing keyword naturally into the page copy, title, or H1.

Getting Started

The setup is fairly straightforward. We just need a crawl of the site (ideally with a custom extraction for the copy you’d like to check), and an exported file of all keywords a site ranks for.

This post will walk you through the setup, the code, and will link to a Google Colaboratory sheet if you just want to get stuck in without coding it yourself.

To get started, you will need two exports: a crawl of the site and a keyword report (both covered below).

We’ve named this the Striking Distance Report as it flags keywords that are easily within striking distance.

(We have defined striking distance as keywords that rank in positions four to 20, but have made this a configurable option in case you would like to define your own parameters.)

Striking Distance SEO Report: Getting Started

1. Crawl The Target Website

  • Set a custom extractor for the page copy (optional, but recommended).
  • Filter out pagination pages from the crawl.

2. Export All Keywords The Site Ranks For Using Your Favorite Provider

  • Filter keywords that trigger as a site link.
  • Remove keywords that trigger as an image.
  • Filter branded keywords.
  • Use both exports to create an actionable Striking Distance report from the keyword and crawl data with Python.

Crawling The Site

I’ve opted to use Screaming Frog to get the initial crawl. Any crawler will work, so long as the CSV export uses the same column names or they’re renamed to match.

The script expects to find the following columns in the crawl CSV export:

"Address", "Title 1", "H1-1", "Copy 1", "Indexability"

Crawl Settings

The first thing to do is to head over to the main configuration settings within Screaming Frog:

Configuration > Spider > Crawl

The main settings to use are:

Crawl Internal Links, Canonicals, and the Pagination (Rel Next/Prev) setting.

(The script will work with everything else selected, but the crawl will take longer to complete!)

Screenshot from Screaming Frog, October 2021: Recommended Screaming Frog crawl settings.

Next, it’s on to the Extraction tab.

Configuration > Spider > Extraction

Screenshot from Screaming Frog, October 2021: Recommended Screaming Frog extraction settings.

At a bare minimum, we need to extract the page title and H1, and record whether the page is indexable, as shown below.

Indexability is useful because it’s an easy way for the script to identify which URLs to drop in one go, leaving only keywords that are eligible to rank in the SERPs.

If the script cannot find the indexability column, it’ll still work as normal but won’t differentiate between pages that can and cannot rank.
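That fallback can be sketched with a simple column check. This is a minimal illustration of the idea using a toy DataFrame, not the script’s exact code:

```python
import pandas as pd

# toy crawl data standing in for a real Screaming Frog export
df_crawl = pd.DataFrame({
    "Address": ["https://example.com/a", "https://example.com/b"],
    "Indexability": ["Indexable", "Non-Indexable"],
})

# drop non-indexable rows only if the column made it into the export;
# otherwise all URLs are kept and the script carries on as normal
if "Indexability" in df_crawl.columns:
    df_crawl = df_crawl[df_crawl["Indexability"] != "Non-Indexable"]
```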

Setting A Custom Extractor For Page Copy

In order to check whether a keyword is found within the page copy, we need to set a custom extractor in Screaming Frog.

Configuration > Custom > Extraction

Name the extractor “Copy” as seen below.

Screenshot from Screaming Frog, October 2021: Custom extraction showing the default options for extracting the page copy.

Important: The script expects the extractor to be named “Copy” as above, so please double check!

Lastly, make sure Extract Text is selected to export the copy as text, rather than HTML.

There are many guides on using custom extractors online if you need help setting one up, so I won’t go over it again here.

Once the extraction has been set, it’s time to crawl the site and export the HTML file in CSV format.

Exporting The CSV File

Exporting the CSV file is as easy as changing the drop-down menu displayed underneath Internal to HTML and pressing the Export button.

Internal > HTML > Export

Screenshot from Screaming Frog, October 2021: Export internal HTML settings.

After clicking Export, it’s important to make sure the type is set to CSV format.

The export screen should look like the below:

Screenshot from Screaming Frog, October 2021: Internal HTML CSV export settings.

Tip 1: Filtering Out Pagination Pages

I recommend filtering out pagination pages from your crawl either by selecting Respect Next/Prev under the Advanced settings (or just deleting them from the CSV file, if you prefer).

Screenshot from Screaming Frog, October 2021: Settings to respect rel next/prev.

Tip 2: Saving The Crawl Settings

Once you have set the crawl up, it’s worth just saving the crawl settings (which will also remember the custom extraction).

This will save a lot of time if you want to use the script again in the future.

File > Configuration > Save As

Screenshot from Screaming Frog, October 2021: How to save a configuration file in Screaming Frog.

Exporting Keywords

Once we have the crawl file, the next step is to load your favorite keyword research tool and export all of the keywords a site ranks for.

The goal here is to export all the keywords a site ranks for, filtering out branded keywords and any which triggered as a sitelink or image.

For this example, I’m using the Organic Keyword Report in Ahrefs, but it will work just as well with Semrush if that’s your preferred tool.

In Ahrefs, enter the domain you’d like to check in Site Explorer and choose Organic Keywords.

Screenshot from Ahrefs.com, October 2021: Site Explorer settings.

Site Explorer > Organic Keywords

Screenshot from Ahrefs.com, October 2021: How to export the organic keywords a site ranks for.

This will bring up all keywords the site is ranking for.

Filtering Out Sitelinks And Image Links

The next step is to filter out any keywords triggered as a sitelink or an image pack.

The reason we need to filter out sitelinks is that they have no influence on the parent URL ranking. This is because only the parent page technically ranks for the keyword, not the sitelink URLs displayed under it.

Filtering out sitelinks will ensure that we are optimizing the correct page.

Screenshot from Ahrefs.com, October 2021: Pages ranking for sitelink keywords.

Here’s how to do it in Ahrefs.

Screenshot from Ahrefs.com, October 2021: How to exclude images and sitelinks from a keyword export.

Lastly, I recommend filtering out any branded keywords. You can do this by filtering the CSV output directly, or by pre-filtering in the keyword tool of your choice before the export.

Finally, when exporting make sure to choose Full Export and the UTF-8 format as shown below.

Screenshot from Ahrefs.com, October 2021: How to export keywords as a CSV file in UTF-8 format.

By default, the script works with Ahrefs (v1/v2) and Semrush keyword exports. It can work with any keyword CSV file as long as the column names the script expects are present.

Processing

The following instructions pertain to running a Google Colaboratory sheet to execute the code.

There is now a simpler option for those who prefer it, in the form of a Streamlit app. Simply follow the instructions provided to upload your crawl and keyword files.

Now that we have our exported files, all that’s left to be done is to upload them to the Google Colaboratory sheet for processing.

Select Runtime > Run all from the top navigation to run all cells in the sheet.

Screenshot from Colab.research.google.com, October 2021: How to run the striking distance Python script from Google Colaboratory.

The script will prompt you to upload the keyword CSV from Ahrefs or Semrush first and the crawl file afterward.

Screenshot from Colab.research.google.com, October 2021: How to upload the CSV files to Google Colaboratory.

That’s it! The script will automatically download an actionable CSV file you can use to optimize your site.

Screenshot from Microsoft Excel, October 2021: The striking distance final output.

Once you’re familiar with the whole process, using the script is really straightforward.

Code Breakdown And Explanation

If you’re learning Python for SEO and interested in what the code is doing to produce the report, stick around for the code walkthrough!

Install The Libraries

Let’s install pandas to get the ball rolling.

!pip install pandas

Import The Modules

Next, we need to import the required modules.

import pandas as pd
from pandas import DataFrame, Series
from typing import Union
from google.colab import files

Set The Variables

Now it’s time to set the variables.

The script considers any keywords between positions four and 20 as within striking distance.

Changing the variables here will let you define your own range if desired. It’s worth experimenting with the settings to get the best possible output for your needs.

# set all variables here
min_volume = 10  # set the minimum search volume
min_position = 4  # set the minimum position  / default = 4
max_position = 20 # set the maximum position  / default = 20
drop_all_true = True  # If all checks (h1/title/copy) are true, remove the recommendation (Nothing to do)
pagination_filters = "filterby|page|p="  # filter patterns used to detect and drop paginated pages

Upload The Keyword Export CSV File

The next step is to read in the list of keywords from the CSV file.

It is set up to accept an Ahrefs report (V1 and V2) as well as a Semrush export.

This code reads the CSV file into a pandas DataFrame.

upload = files.upload()
upload = list(upload.keys())[0]
df_keywords = pd.read_csv(
    upload,
    on_bad_lines="skip",  # replaces error_bad_lines, which was removed in pandas 2.0
    low_memory=False,
    encoding="utf8",
    dtype={
        "URL": "str",
        "Keyword": "str",
        "Volume": "str",
        "Position": int,
        "Current URL": "str",
        "Search Volume": int,
    },
)
print("Uploaded Keyword CSV File Successfully!")

If everything went to plan, you’ll see a preview of the DataFrame created from the keyword CSV export. 

Screenshot from Colab.research.google.com, October 2021: DataFrame showing successful upload of the keyword export file.

Upload The Crawl Export CSV File

Once the keywords have been imported, it’s time to upload the crawl file.

This fairly simple piece of code reads in the crawl with some error handling options and creates a pandas DataFrame named df_crawl.

upload = files.upload()
upload = list(upload.keys())[0]
df_crawl = pd.read_csv(
    upload,
    on_bad_lines="skip",  # replaces error_bad_lines, which was removed in pandas 2.0
    low_memory=False,
    encoding="utf8",
    dtype="str",
)
print("Uploaded Crawl Dataframe Successfully!")

Once the CSV file has finished uploading, you’ll see a preview of the DataFrame.

Screenshot from Colab.research.google.com, October 2021: DataFrame of the crawl file uploaded successfully.

Clean And Standardize The Keyword Data

The next step is to rename the column names to ensure standardization between the most common types of file exports.

Essentially, we’re getting the keyword DataFrame into a good state and filtering using cutoffs defined by the variables.

df_keywords.rename(
    columns={
        "Current position": "Position",
        "Current URL": "URL",
        "Search Volume": "Volume",
    },
    inplace=True,
)

# keep only the following columns from the keyword dataframe
cols = "URL", "Keyword", "Volume", "Position"
df_keywords = df_keywords.reindex(columns=cols)

try:
    # clean the data. (v1 of the ahrefs keyword export combines strings and ints in the volume column)
    df_keywords["Volume"] = df_keywords["Volume"].str.replace("0-10", "0")
except AttributeError:
    pass

# clean the keyword data
df_keywords = df_keywords[df_keywords["URL"].notna()]  # remove any missing values
df_keywords = df_keywords[df_keywords["Volume"].notna()]  # remove any missing values
df_keywords = df_keywords.astype({"Volume": int})  # change data type to int
df_keywords = df_keywords.sort_values(by="Volume", ascending=False)  # sort by highest vol to keep the top opportunity

# make new dataframe to merge search volume back in later
# (one row per keyword, so the later merges don't duplicate rows)
df_keyword_vol = df_keywords[["Keyword", "Volume"]].drop_duplicates(subset="Keyword")

# drop rows below the minimum search volume
df_keywords = df_keywords[df_keywords["Volume"] >= min_volume]

# keep only keywords ranking within the striking distance range
# (inclusive, matching the stated positions 4 - 20 default)
df_keywords = df_keywords[df_keywords["Position"] >= min_position]
df_keywords = df_keywords[df_keywords["Position"] <= max_position]

Clean And Standardize The Crawl Data

Next, we need to clean and standardize the crawl data.

Essentially, we use reindex to only keep the “Address,” “Indexability,” “Title 1,” “H1-1,” and “Copy 1” columns, discarding the rest.

We use the handy “Indexability” column to only keep rows that are indexable. This will drop canonicalized URLs, redirects, and so on. I recommend enabling this option in the crawl.

Lastly, we standardize the column names so they’re a little nicer to work with.

# keep only the following columns from the crawl dataframe
cols = "Address", "Indexability", "Title 1", "H1-1", "Copy 1"
df_crawl = df_crawl.reindex(columns=cols)
# drop non-indexable rows
df_crawl = df_crawl[~df_crawl["Indexability"].isin(["Non-Indexable"])]
# standardise the column names
df_crawl.rename(columns={"Address": "URL", "Title 1": "Title", "H1-1": "H1", "Copy 1": "Copy"}, inplace=True)
df_crawl.head()
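Note that the pagination_filters variable defined earlier isn’t applied in the snippets above. If you’d rather drop paginated URLs in the script instead of in Screaming Frog, a hedged sketch (shown here with a toy DataFrame standing in for the real crawl data) might look like this:

```python
import pandas as pd

pagination_filters = "filterby|page|p="  # regex alternation of pagination URL patterns

# toy crawl data standing in for the real df_crawl
df_crawl = pd.DataFrame({"URL": [
    "https://example.com/widgets",
    "https://example.com/widgets?page=2",
    "https://example.com/widgets?filterby=price",
]})

# keep only rows whose URL does not match any pagination pattern
df_crawl = df_crawl[~df_crawl["URL"].str.contains(pagination_filters, case=False, na=False)]
```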

Group The Keywords

As we approach the final output, it’s necessary to group our keywords together to calculate the total opportunity for each page.

Here, we’re calculating how many keywords are within striking distance for each page, along with the combined search volume.

# groups the URLs (remove the dupes and combines stats)
# make a copy of the keywords dataframe for grouping - this ensures stats can be merged back in later from the OG df
df_keywords_group = df_keywords.copy()
df_keywords_group["KWs in Striking Dist."] = 1  # used to count the number of keywords in striking distance
df_keywords_group = (
    df_keywords_group.groupby("URL")
    .agg({"Volume": "sum", "KWs in Striking Dist.": "count"})
    .reset_index()
)
df_keywords_group.head()

Screenshot from Colab.research.google.com, October 2021: DataFrame showing how many keywords were found within striking distance.

Once complete, you’ll see a preview of the DataFrame.

Display Keywords In Adjacent Rows

We use the grouped data as the basis for the final output, using pandas unstack to reshape the DataFrame so the keywords are displayed in the style of a GrepWords export.

Screenshot from Colab.research.google.com, October 2021: DataFrame showing a GrepWords-style view of keywords laid out horizontally.

# create a new df, combine the merged data with the original data. display in adjacent rows ala grepwords
df_merged_all_kws = df_keywords_group.merge(
    df_keywords.groupby("URL")["Keyword"]
    .apply(lambda x: x.reset_index(drop=True))
    .unstack()
    .reset_index()
)

# sort by biggest opportunity
df_merged_all_kws = df_merged_all_kws.sort_values(
    by="KWs in Striking Dist.", ascending=False
)

# reindex the columns to keep just the top five keywords
cols = "URL", "Volume", "KWs in Striking Dist.", 0, 1, 2, 3, 4
df_merged_all_kws = df_merged_all_kws.reindex(columns=cols)

# rename the columns
df_striking = df_merged_all_kws.rename(
    columns={
        "Volume": "Striking Dist. Vol",
        0: "KW1",
        1: "KW2",
        2: "KW3",
        3: "KW4",
        4: "KW5",
    }
)

# merge the striking distance df with the crawl df to bring in the title, h1 and copy
df_striking = pd.merge(df_striking, df_crawl, on="URL", how="inner")

Set The Final Column Order And Insert Placeholder Columns

Lastly, we set the final column order, which also inserts blank placeholder columns for the keyword checks.

There are a lot of columns to sort and create!

# set the final column order (the keyword data is merged back in afterward)

cols = [
    "URL",
    "Title",
    "H1",
    "Copy",
    "Striking Dist. Vol",
    "KWs in Striking Dist.",
    "KW1",
    "KW1 Vol",
    "KW1 in Title",
    "KW1 in H1",
    "KW1 in Copy",
    "KW2",
    "KW2 Vol",
    "KW2 in Title",
    "KW2 in H1",
    "KW2 in Copy",
    "KW3",
    "KW3 Vol",
    "KW3 in Title",
    "KW3 in H1",
    "KW3 in Copy",
    "KW4",
    "KW4 Vol",
    "KW4 in Title",
    "KW4 in H1",
    "KW4 in Copy",
    "KW5",
    "KW5 Vol",
    "KW5 in Title",
    "KW5 in H1",
    "KW5 in Copy",
]

# re-index the columns to place them in a logical order + inserts new blank columns for kw checks.
df_striking = df_striking.reindex(columns=cols)

Merge In The Keyword Data For Each Column

This code merges the keyword volume data back into the DataFrame. It’s more or less the equivalent of an Excel VLOOKUP function.

# merge in keyword volume data for each keyword column (KW1 - KW5)
for i in range(1, 6):
    df_striking = pd.merge(df_striking, df_keyword_vol, left_on=f"KW{i}", right_on="Keyword", how="left")
    df_striking[f"KW{i} Vol"] = df_striking["Volume"]
    df_striking.drop(["Keyword", "Volume"], axis=1, inplace=True)

Clean The Data Some More

The data requires additional cleaning to replace empty values (NaNs) with empty strings. This improves the readability of the final output by creating blank cells instead of cells populated with the string “NaN.”

Next, we convert the columns to lowercase so that they match when checking whether a target keyword is featured in a specific column.

# replace nan values with empty strings
df_striking = df_striking.fillna("")
# convert the title, h1 and copy to lowercase so keywords can be matched against them
# (keyword exports are typically lowercase already)
df_striking["Title"] = df_striking["Title"].str.lower()
df_striking["H1"] = df_striking["H1"].str.lower()
df_striking["Copy"] = df_striking["Copy"].str.lower()

Check Whether The Keyword Appears In The Title/H1/Copy and Return True Or False

This code checks if the target keyword is found in the page title/H1 or copy.

It’ll flag true or false depending on whether a keyword was found within the on-page elements.

# flag whether each keyword (KW1 - KW5) is found in the title, h1 and copy
for i in range(1, 6):
    for element in ("Title", "H1", "Copy"):
        df_striking[f"KW{i} in {element}"] = df_striking.apply(
            lambda row: row[f"KW{i}"] in row[element], axis=1
        )
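One caveat: the `in` operator does plain substring matching, so a short keyword like “car” would be flagged TRUE against copy containing “carpet.” If that becomes a problem, a hedged alternative is whole-word matching with a regular expression (a sketch, not part of the original script; `kw_in_text` is a hypothetical helper name):

```python
import re

def kw_in_text(keyword: str, text: str) -> bool:
    # require word boundaries around the keyword to avoid substring false positives
    if not keyword:
        return False
    return re.search(r"\b" + re.escape(keyword) + r"\b", text) is not None
```

You could then swap the `row["KW1"] in row["Title"]` style checks for `kw_in_text(row["KW1"], row["Title"])`.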

Delete True/False Values If There Is No Keyword

This will delete true/false values when there is no keyword adjacent.

# delete true / false values if there is no keyword
for i in range(1, 6):
    df_striking.loc[
        df_striking[f"KW{i}"] == "",
        [f"KW{i} in Title", f"KW{i} in H1", f"KW{i} in Copy"],
    ] = ""
df_striking.head()

Drop Rows If All Values == True

This configurable option is really useful for reducing QA time: if a keyword is already found in the title, H1, and copy, the opportunity is dropped from the final output because there is nothing left to do.

def true_dropper(col1, col2, col3):
    drop = df_striking.drop(
        df_striking[
            (df_striking[col1] == True)
            & (df_striking[col2] == True)
            & (df_striking[col3] == True)
        ].index
    )
    return drop

if drop_all_true:
    for i in range(1, 6):
        df_striking = true_dropper(f"KW{i} in Title", f"KW{i} in H1", f"KW{i} in Copy")

Download The CSV File

The last step is to download the CSV file and start the optimization process.

df_striking.to_csv('Keywords in Striking Distance.csv', index=False)
files.download("Keywords in Striking Distance.csv")

Conclusion

If you are looking for quick wins for any website, the striking distance report is a really easy way to find them.

Don’t let the number of steps fool you. It’s not as complex as it seems. It’s as simple as uploading a crawl and keyword export to the supplied Google Colab sheet or using the Streamlit app.

The results are definitely worth it!


Featured Image: aurielaki/Shutterstock



15 Unique Ways to Check Competitor Website Traffic


You only need three tools to get sixteen highly actionable data points on your competitors’ traffic.

Before we dive in, let’s set the right expectations: no tool will give you your competitor’s exact traffic data. However, the estimates are still good enough to see what works for them, copy their best ideas, or set realistic benchmarks.

We’ll cover:

  • Types of data you can access, such as traffic volume, trends, organic and paid keywords, and audience insights.
  • Practical use cases, including benchmarking, tracking progress, identifying content gaps, boosting your SEO and SEM, and negotiating budgets.
  • Last but not least, how this data is gathered and its reliability.

With these tools and insights, you’ll be well-equipped to understand and outperform your competitors’ website traffic.

We’ll start with organic search traffic — the source on which you’ll get the most data.

How to analyze competitor organic search traffic

Organic search traffic refers to the clicks a site gets from search engines, excluding search ads.

There’s a lot you can tell about your competitors’ organic traffic and a lot you can tell from it. Here are my favorite twelve use cases with detailed instructions.

You can check a competitor’s organic search traffic in seconds, for free, right now:

The tool will also show you where in the world the traffic is coming from, some of the top pages and keywords, and traffic value (i.e., the value of the organic search traffic, if it were to be acquired via PPC in Google Ads).

Organic competitors are the sites that compete with you for the same organic keywords in search engines.

Typically, you’ll have more organic competitors than your regular direct business competitors. For example, a 3D printer manufacturer may be competing for a fair share of keywords with a 3D printing magazine — completely different businesses, same keywords.

So by rounding up your top organic competitors, you gain a bigger pool of keyword ideas you can potentially target. Much bigger than if you’d just take into account your direct competitors.

Here’s how to identify all organic competitors.

  1. Open Ahrefs’ Site Explorer and enter your domain.
  2. Go to the Organic competitors report.
Organic competitors report.

From there, you can look at the common keywords to see where they outrank you or click on Competitor’s keywords to see keywords you don’t rank for but they do (a.k.a. your content gap).

Top competing domains report showing keyword intersect.

If your competitor is doing SEO, typically their blog will attract most of their organic traffic. But this is not always the case. They may have found other ways of getting clicks from Google, like free tools or free resources, and you could do the same.

  1. Open Site Explorer and enter your competitor’s domain.
  2. Go to the Site structure report.
Site structure report.

For example, someone analyzing our site could see that our free writing tools get more organic traffic than years of writing on the blog.

Free writing tools get more organic traffic than years of writing on the blog.

To see your competitor’s top performing pages:

  1. Go to Site Explorer and enter your competitor’s domain.
  2. Go to the Top pages report.
Top pages report.

The first use case here is targeting the same keywords as their top pages to channel some of that traffic your way.

Top keyword column in the Top pages report.

There’s more. You can use the report to see which pages contributed to an uptrend or downtrend in your competitor’s traffic.

Analyzing changes in traffic with the Top pages report.

Or, focus on top-performing pages and use the Compare pages view to see when those pages started to pick up traffic.

Comparing pages in the Top pages report.

Now, to see what a competitor did to improve a page, click the caret next to the page and then click Inspect.

Accessing the Inspect tool contextually.

Then choose the date on the calendar and view changes made to the text in that time.

Calendar tool in Ahrefs.

If you’re already doing SEO or considering it, seeing a list of your competitors’ keywords is almost like they’ve shared their keyword research with you.

You can use keyword data to find:

  • Top-performing keywords and “steal” some of their traffic with your own content.
  • Top-performing keywords in specific countries.
  • Keywords with specific terms to find content ideas around certain topics or phrases.
  • Low-difficulty keywords (typically, faster to rank).

To see your competitors’ keywords:

  1. Go to Site Explorer and enter your competitor’s domain.
  2. Go to the Organic keywords report.
  3. Use the filters to find what you need. For instance, use the KD filter to find low-competition keywords.
Organic keywords report in Ahrefs.

For example, you can track the ranking history of your competitor’s top traffic-generating keywords. If you see sudden spikes, it likely means they’ve updated the content to increase ranking. By using the calendar feature mentioned above, you can learn how they did it.

SERP history.

One of the best ways to find organic traffic you’re potentially missing out on is to do a content gap analysis. In SEO, it means identifying the keywords that your competitors rank for but you don’t. Some of those keywords can make perfect topics for you to cover.

In Ahrefs, you can do a content gap analysis automatically:

  1. Go to Ahrefs’ Competitive Analysis tool.
  2. Enter your domain in the Target section.
  3. Enter your competitors’ domains in the Competitors section.
  4. Hit “Compare”.
  5. Click the Content Gap report.
Ahrefs' Competitive Analysis tool.

Toggle Main positions to exclude your competitors’ rankings in SERP features like “Top stories” and “Image packs.”

Toggling the "Main positions only" feature.

Now look through the report and identify keywords that are relevant for your site. The volume column will show you which keywords are likely to send the most traffic.

More than 60,000 potential keyword opportunities via Ahrefs' Content Gap report.
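If you prefer working with exported data, the same gap logic is easy to reproduce in a few lines of Python. This is a minimal sketch, not Ahrefs' actual method: it assumes keyword exports loaded into DataFrames with "Keyword" and "Volume" columns (those column names are an assumption; rename them to match whatever your tool's CSV exports actually use).

```python
import pandas as pd

def content_gap(your_keywords, competitor_frames, min_volume=100):
    """Keywords competitors rank for but you don't, sorted by volume.

    your_keywords: iterable of keywords your site already ranks for.
    competitor_frames: list of DataFrames with 'Keyword' and 'Volume'
    columns (assumed column names; adjust to your export format).
    """
    yours = {k.lower() for k in your_keywords}
    combined = pd.concat(competitor_frames, ignore_index=True)
    combined["Keyword"] = combined["Keyword"].str.lower()
    # Keep only keywords you don't already rank for.
    gap = combined[~combined["Keyword"].isin(yours)]
    # Deduplicate across competitors, filter low-volume terms, rank by volume.
    return (gap.groupby("Keyword", as_index=False)["Volume"].max()
               .query("Volume >= @min_volume")
               .sort_values("Volume", ascending=False)
               .reset_index(drop=True))
```

Feed it one DataFrame per competitor export, and the volume column tells you which gaps are likely to send the most traffic, just as in the report above.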

Short-term organic traffic performance can inform you of the latest developments in your competitors’ rankings (say, within the last 24 hours to a couple of weeks).

For example, you can observe the impact of the latest Google Update on their site, see how much traffic they gained or lost last month, or check if any of their newly launched pages are already picking up traffic.

To see short-term organic traffic performance:

  1. Go to Site Explorer and enter your competitor’s domain.
  2. In the Overview report, choose a timeframe in the Changes mode.
Choosing a short-term data timeframe in Overview report.

This will adjust the top-level metrics and traffic by location panel and show you the changes over the specified period.


You can go as deep as day-to-day traffic changes — a very helpful thing if you want to see Google’s update impact on your competitors’ traffic.

Traffic performance graph showing exact day of a Google update.

Date comparison is available in multiple tools and reports across Ahrefs.

As for long-term traffic performance, it lets you set a traffic goal to match or overtake your competitor, plan your budget based on their performance, and even forecast their future traffic.

To see long-term traffic performance:

  1. Go to Site Explorer and enter your competitor’s domain.
  2. Turn on the Years mode in the traffic graph.
  3. Adjust the time frame and export the data if needed.
Choosing a long-term data timeframe in Overview report.

Seeing multiple sites on one graph is useful when you want to identify the leader in your niche, compare your site against several competitors at once, and determine whether you're catching up to the leader (or whether someone is catching up to you).

Here’s how:

  1. Go to Site Explorer and enter your domain.
  2. Add competitors using the Competitors tab.
Zoho Desk's traffic (green) is catching up to Intercom (blue).

Organic share of voice (SOV) is an SEO metric that shows how much traffic goes to your pages compared to competitors’.

In other words, if you want to see your overall organic search traffic share in the market, and eventually increase it, this is the metric you’d want to use.

SOV is based on tracked keywords, so you first need to add them to the tool. These can be keywords you target on your blog, your product pages, or even all of your important keywords together.

  • Go to Ahrefs’ Rank Tracker.
  • Start a New project.
  • Select keywords to track. You can use the filters to refine the list suggested by the tool and add some keywords later on. Make sure to choose only important locations for your site.
Adding keywords to track in Ahrefs Rank Tracker.
  • Add competitors. You can add specific sites or choose from the ones suggested by the tool. Notice the keyword intersect — the higher the number, the “closer” the competitor.
Adding competitors to analyze in Rank Tracker.

Once you finish the setup, you'll be able to see and regularly track SOV in the Competitors Overview section of Rank Tracker.

Share of voice metric in Rank Tracker.
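Under the hood, organic SOV is essentially your site's estimated clicks divided by the estimated clicks for everyone ranking on your tracked keywords. Here's a minimal sketch of that calculation, assuming a simplified click-through-rate curve by position; real rank trackers use far more granular CTR models, so treat this as an illustration of the idea, not the tool's actual formula.

```python
# Rough CTR-by-position curve (assumed values for illustration only).
CTR_BY_POSITION = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.02, 9: 0.02, 10: 0.01}

def share_of_voice(rankings, site):
    """Estimate organic share of voice for `site`.

    rankings: list of dicts like
      {"keyword": "crm", "volume": 1000, "positions": {"a.com": 1, "b.com": 3}}
    Returns the site's estimated clicks as a share of all estimated clicks.
    """
    site_clicks = total_clicks = 0.0
    for row in rankings:
        for domain, pos in row["positions"].items():
            clicks = row["volume"] * CTR_BY_POSITION.get(pos, 0.0)
            total_clicks += clicks
            if domain == site:
                site_clicks += clicks
    return site_clicks / total_clicks if total_clicks else 0.0
```

Because the same CTR curve is applied to every site, any inaccuracy in the curve cancels out to a degree, which is why SOV is best read as a relative, not absolute, metric.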

One of the ways your competitors could be getting traffic is from links from other sites (a.k.a. referral traffic).

Knowing who links to your competitors allows you to pursue the same or similar links, which can help you not only get more referral traffic but also boost your SEO and increase your brand awareness.

To find pages with a high probability of sending traffic to your competitors, look for backlinks from pages with significant organic traffic. Here’s how:

  1. Go to Site Explorer and enter your competitor’s domain.
  2. Open Backlinks report. Pages with the most traffic will be displayed on top by default.
Backlinks report in Ahrefs.

From there you can use the Referring page title filter to see only reviews or rankings where you could be listed, too. Simply add in words like “vs, review, tool, tools, top” as a way to identify these pages.

Using the referring page title filter to see only reviews or rankings where you could be listed, too.
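You can apply the same title filter offline to a backlink export. A hedged sketch: it assumes the export has a "Referring page title" column (an assumed column name; rename it to match your CSV) and matches the terms suggested above.

```python
import re
import pandas as pd

# Terms that typically flag review/listicle-style pages.
REVIEW_TERMS = re.compile(r"\b(vs|review|tools?|top)\b", re.IGNORECASE)

def review_style_backlinks(df):
    """Filter a backlink export down to review- or ranking-style
    referring pages where your brand could be listed, too."""
    titles = df["Referring page title"].fillna("")
    return df[titles.str.contains(REVIEW_TERMS)].reset_index(drop=True)
```

Run it over the exported Backlinks report and you get the same shortlist of "vs", "review", and "top tools" pages as the in-app filter.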


Another way to analyze your competitors’ traffic is to treat them as one entity. This allows you to:

  • Benchmark your site traffic trend to your competitors as a market segment.
  • Identify broader industry trends and seasonal patterns in traffic.
  • Assess the collective impact of major events, such as changes in search engine algorithms or economic shifts.
  • Monitor the overall health and growth rate of your industry.

For this, use the Portfolios feature in Ahrefs. The image below shows aggregated data for four sites, including organic traffic and paid traffic (from Google Search Ads).

Example portfolio of sites.

Here’s how to set it up:

  • Go to your Dashboard and click Create > Portfolio.
How to create a new portfolio.
  • Fill in the URLs you want to track. Note the URL mode selector. Use “Domain” to track the entire domain with subdomains, “Path” for folders, and “Exact URL” for single pages.
Filling details of a site portfolio.

How to analyze competitor paid search traffic

Paid search traffic refers to the clicks a site gets from search ads on search engine result pages. Here’s how to check your competitors’ paid search traffic and how to use that knowledge to your advantage.

If you’re running search ads, checking out your competitors’ paid keywords can give you ready-made keyword research. This lets you see which keywords are working for them and helps you fine-tune your own ad strategy to target those high-performing keywords.

What’s more, you can reveal paid search data Google Keyword Planner hides by default: search volume for a particular keyword instead of a search volume range for a group of keywords.

And even if you’re not investing in ads, this info can still be super useful. If your competitors keep bidding on certain keywords, it’s because they know those keywords bring in customers. Chances are, these keywords could be important for your business, too.

To find your competitors’ paid keywords:

  1. Go to Site Explorer and enter your competitor’s domain.
  2. Open Paid keywords report.
Paid keywords report in Intercom.

From here, you can use filters to find keywords that meet your CPC, traffic, or relevance criteria, and sort the data to see the keywords that bring the most traffic.

Filters in paid keywords report.

Notice the Paid/organic traffic share bar. If you see both blue and yellow, your competitor has invested in the keyword twice (through content and ads) and is trying to claim as much SERP real estate as possible. Consider pursuing these keywords as well.

Paid traffic/organic traffic share.

Another way to gauge a keyword’s importance is to look at its ad position history. A long and consistent history suggests it’s likely a valuable ‘money’ keyword, while a short history might indicate your competitor is just experimenting with it.

Ad history report.

Want to check out their ad copy and landing pages? Head to the Ads report. You can set the location where your competitor runs their ads and see the landing pages and keywords associated with each ad.

Ads report in Ahrefs.

Interested to see how much your competitors spend to get all of that paid traffic?

  1. Go to Site Explorer.
  2. Enter your competitor’s domain.
  3. Open Paid pages report.
  4. Set the preferred location to see the budget per country (leave it set to all locations to see the total ad spend).
  5. Set the Performance report to Paid traffic cost and adjust the timeframe.
Paid pages report in Ahrefs.

Use this data to set a benchmark for traffic performance relative to ad spend and to negotiate the budget for your campaigns.

How to analyze other traffic sources

If you’re interested in the overall competitor traffic performance, here’s where to look.

To get a quick answer to how much traffic your competitors get overall (from all traffic sources), you can get that information for free with Similarweb.

Once you set up a free account, simply go to Website analysis > Website performance report.

Website performance report in Similarweb.

Arguably, the best way to use Similarweb is in comparison mode. This approach ensures that the data is directionally accurate: whether the data is overestimated or underestimated, it is consistently so across all sites. By comparing your traffic with your competitors, you can identify the relative differences that set you apart.

Comparing websites in Similarweb.

Similarweb is not the only tool with general traffic insights. Another one is Sparktoro, an audience research tool.

What’s great about Sparktoro is that its data and functionality revolve around the users behind the clicks. So you can use Similarweb to understand how popular the site is and then Sparktoro to get to know the people who visit it. Take that data and use it for persona development, fine-tuning your messaging, and looking up influencers to partner with or sites to advertise on.

Simply set up an account at Sparktoro and type your competitor’s domain in the search bar. Make sure the “Visit the website” mode is on.

Overview report in Sparktoro.

From there go to:

  • Social networks: scroll down a bit to see which social networks the brand uses most. This not only tells you which networks likely send the most traffic but also which have proved the most engaging.
  • Demographics tab: see data like gender, age, geography, and interests. What’s unique about this data is that it comes from social media profiles.
  • Social accounts tab: see which social media accounts site visitors are likely to follow and engage with. This is a great source of potential influencers to work with.
  • YouTube channels, Reddit, and Podcasts tabs: see where you’re most likely to reach your competitors’ (and possibly your own) audience.

Where does the data come from? Is it accurate?

Depending on the tool, the data on your competitors comes from a variety of third-party sources rather than from the competitors themselves.

This means that, in most cases, the data is estimated rather than actual figures taken from your competitors and handed over to you.

So when it comes to accuracy, expect a blend of estimated and directional accuracy. Despite best efforts, the data will be approximate; it’s designed to give you an idea of relative performance, because there’s no other way to get it.

This also means that if you’re interested in a particular type of traffic, say traffic from search engines, it’s probably best to get a dedicated tool for that. You’ll get access to bigger data sets and more capable functionality, allowing you to do more.

Final thoughts

Want to go deeper into competitor analysis? Check out our other guides to go beyond traffic data:

Got questions or comments? Let me know on X or LinkedIn.




SEO

The Top 10 Content Marketing Skills You Need


Want to reach more of your target audience, connect with them, and have meaningful interactions?

Quality content marketing may be the ideal solution for you.

But gone are the days of simply writing and releasing content.

Effective content marketing requires various skills and strategies if you want to get it right.

If you’re looking to breathe new life into your brand and generate more interest in your target audience, here are the top 10 skills and strategies you’ll need.

1. Know Your Audience And Target Them Effectively

Ask anyone about content and content marketing, and chances are that audience targeting is one of the first suggestions.

But what does audience targeting actually mean? And why is it an essential content marketing skill?

First, understand who your audience is, what their day is like, their priorities, and what they’re doing or intending to do while they consume content.

Then, use that information to craft content that counts on a platform and in a format that suits your audience.

Take the Shoe Snob Blog as an example.

The content is photo-centric. The page has few distractions, and the storytelling and text are dense and chunked.

The topics range from stories of shoemakers, care tips, and all the insider info a lover of bespoke and top-of-the-line men’s shoes, shoe designer, or shoemaker could want to know about the objects of their obsessions.

These features tell us a lot about the blog’s readers.

Shoe Snob Blog readers are likely visual, busy, and view reading the blog’s content as almost a secret pleasure they indulge in while waiting in line for an expensive coffee.

The blog doesn’t have content on saving money, getting things for less, building shoes more cheaply, or reviews of shoes you’d find in your local department store.

Why? That’s not what the blog’s target audience is interested in. In fact, those topics would likely chase readers away.

For Justin FitzPatrick, the blog’s author, it’s about the luxury, the emotional connection and passion for the brands, and the smaller details most of us wouldn’t likely notice about a man’s dress shoe – in language that matches the audience’s expertise.

You might be tempted to skip audience exploration and targeting to this degree, particularly if you’re a B2B brand or sell something non-visual like insurance.

But this could be a fatal mistake for your content marketing.

Even if you’re selling to another company, that company is driven and shaped by humans you’ll need to get attention from.

2. Understand How Brand Strategy Influences Content

Content and content marketing could do more harm than good if they fail to blend seamlessly with a brand strategy.

So, if you’re looking to build content marketing skills, ensure you understand how brand strategy influences effective content.

Solid brand-driven content strategies consist of six core elements when it comes to content:

  • Brand foundations – What matters to the company, such as the image it wishes to project, etc.
  • Audience discovery and brand position – How the brand fits within the market.
  • Keywords and language – How the company wants people to find its brand, and what language it will use.
  • Authority building – Looking like an expert and a leader on a chosen topic.
  • Content creation – Any content strategy must be manageable, affordable, sustainable, scalable, and effective.
  • Organization – Utilizing an editorial and publishing calendar and post-publishing tracking and measurement to maintain and guide your content strategy.

3. Consider SEO, Search, And Search Engines

SEO and search are essential for getting found, gaining traffic, building authority, and overall growth.

If you want your content marketing to work, you can’t afford to skip this skill just because you’re not an expert.

  • Users make 1.2 trillion searches on Google per year.
  • 93% of all web traffic comes from a search engine.
  • 46% of searches are made to look for something local.

In January 2023, searches for phrases that included “gifts” increased 45%, while searches that included “presents” increased 15% over 2022. This equated to $47 billion in the two weeks following Christmas.

So, search is growing and becoming more important – not declining.

If you want to take advantage of search traffic, you need to ensure you’re considering several aspects of SEO when developing your content marketing skills, including:

  • Keyword research.
  • AI and how to humanize your content.
  • Link building.
  • Building authority.
  • Topic relevance and expertise.
  • Site structure, website performance, and analytics.

4. Humanize Your Content

Once you get started with content marketing, you’ll realize pretty quickly that AI-generated content is highly problematic.

You need to follow basic SEO formulas to get your content to rank, another set of formulas to make it interesting and catchy for readers, and yet another to maximize its usability.

However, you also need to ensure you stand out from the crowd and surpass your competitors.

To make your content more human-friendly, learn how to:

  • Create content that supports a user journey rather than search engines or sections of a funnel.
  • Utilize customer communications and social channels to understand and connect with your audience. Then, use it to market your content.
  • Make use of internal experts. Not only is looking in-house a way to make excellent content more affordable, but audiences also love to see your brand’s passion for what it does.
  • Take a smart angle, get personal, and have an attitude. Personality and branding are vital, but so is the information you provide. Ensure it is something of value to your readers, and don’t be afraid to tell stories to build emotional connections.
  • Add personal videos to top-performing articles.

One of the best examples of all these tips for humanized content in action is the annual Christmas content campaign from WestJet.

5. Engage By Storytelling And Creative Writing

If you want to capture attention and use content to connect with your audience, you need to be able to tell a good story.

Stories make content emotionally engaging but also make it possible for readers to experience what it would be like if they purchased your product or service.

Want to strengthen your content marketing with storytelling?

  • Create relatable, believable content. To do this, know your audience, understand their experiences, and create content that aligns with this knowledge.
  • Have a clear message. Like an ad, every story or piece of content needs a goal and a clear message you want to convey to your audience.
  • Choose the right type of story. Do you need to make an emotional connection? Compel a reader to act? Convey values, a feature, or a concept? Build community?
  • Select the right platform and medium. If you want to share several statistics, video might not be the best option. Selling vacations? YouTube or TikTok might perform better than Reddit or a blog.
  • Know where to start and stop. Your content needs to appear at the right point in the customer journey and push readers to the next step. What should readers do next?
  • Organize and structure. Plan your content ahead of time. Make sure your stories have an arc, make sense, and take readers or views through an experience.

6. Do Your Research

The best content provides an audience with information or a look at something they normally don’t have access to.

To find this information, you must be prepared for deep research – and that means a lot more than just finding a statistic.

Find the original source or study. Ensure the number you’ve found is still relevant and accurate. Consider the source of the statistic and how they arrived at that number. What did the study not consider when finding their statistic?

To build additional authority, you may consider interviewing the source of a statistic or a subject area expert.

7. Improve Your Interviewing Skills

While it helps if you deeply understand the subject matter, it isn’t all lost if you’re new to the topic.

In fact, being a newbie to a topic can have advantages because you can see the topic with a fresh perspective.

One thing you must be knowledgeable about, however, is interviews. Interviewing is an essential content marketing skill.

Here are some tips:

Prepare

Arrive at the interview with an understanding of the topic. Know the pains and challenges individuals interested in the topic face.

Understand your priorities for your readers, the industry, and the individual you’re interviewing.

Have a list of questions that are thoughtful and organized, and work toward answering a single question or reaching a specific goal.

Set Interview Goal

Are you trying to get tips from an expert? A day in the life of? Solve or bring light to a certain issue? Make a human connection?

Choose a goal for your interview, organize it into an outline, and remove any question or information that doesn’t help you move toward that goal.

Be Personable And Make The Interviewee Comfortable

Awkward silences, a lack of rapport, nervousness, and other social aspects can interfere with an otherwise excellent interview and affect the information you collect.

You may want to consider using cognitive interview techniques, which have been adapted from criminal investigation for journalism.

Record Your Conversation

As humans, our brains prioritize stimuli to determine what is important and what we should pay attention to and remember.

This attentional filtering becomes more severe when you’re making notes, thinking about the technical aspects of an interview, and nervous. As a result, it’s easy to miss important details or implications.

So, save some time and improve your accuracy and insights into the information provided during the interview by making a recording that you can refer to as often as necessary.

Be Precise And Ask For Clarification

Some people love raisins in cinnamon buns. Others do not. And just like the raisins debate, how you define a word or concept may vary greatly from someone else.

So, if the information you collect during an interview seems vague, or you’re unsure of something the interviewee says, ask.

The worst thing you can do is assume you know what they meant, or inadvertently distort the meaning of someone’s words.

8. Measure And Track Everything

Measuring something is generally easy. The difficult part of measurement and tracking is measuring and tracking the right things.

SEJ’s annual State of SEO Report reveals that SEO professionals often have a mismatch between their goals, the methods and strategies they use to reach them, and the variables they measure.

Content marketers and marketing are no exception.

Let’s say you want to use content marketing to increase conversions. So, you create a video for your hot tub company.

In this instance, tracking and analyzing traffic data to the video would be a mistake. Those numbers are only part of the story.

Instead, track clicks and use traffic data to better understand who clicks through to your content and where viewers go after they consume it.

And this is vital: Don’t stop your analysis at the click.

Every visit from a viewer is only one step in a larger journey – and this journey matters.

Returning to the previous example, your video might have generated fewer clicks and conversions overall.

Dig a little deeper, however, and you might discover that those few conversions were of much higher value than average, and the viewers return to your site more often than your average site viewer.

In this instance, while traffic numbers might make it look like your video failed, analysis of the customer journey reveals that your video was actually a big success, attracting a more qualified, valuable, and engaged audience.
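A minimal sketch of this kind of per-source analysis, using hypothetical visit records (the field names and values are assumptions for illustration; in practice this data would come from your analytics platform):

```python
def channel_value(visits):
    """Compare traffic sources by revenue per visit, not raw clicks.

    visits: list of dicts like
      {"source": "video", "converted": True, "value": 450.0}
    Returns per-source totals, including revenue per visit.
    """
    stats = {}
    for v in visits:
        s = stats.setdefault(v["source"],
                             {"visits": 0, "conversions": 0, "revenue": 0.0})
        s["visits"] += 1
        if v.get("converted"):
            s["conversions"] += 1
            s["revenue"] += v.get("value", 0.0)
    for s in stats.values():
        # The metric that exposes "few clicks, high value" channels.
        s["revenue_per_visit"] = s["revenue"] / s["visits"]
    return s and stats
```

A channel with a fraction of the visits can still top this table on revenue per visit, which is exactly the hot tub video scenario described above.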

9. Repackage Content With Purpose

You invest a lot of resources in creating amazing content. Don’t simply publish it in one format and waste the rest of its potential.

Before creating content, consider all the different formats and ways you can share it to get attention.

By planning, you can collect images, video footage, sound bites, expert quotes, and everything you’ll need to share and market your content in various ways to maximize your return on investment (ROI).

But refrain from repackaging content with the sole purpose of spreading it everywhere. Carefully plan your content to appear when and where you need to.

Search Engine Journal, for example, uses the data gathered for its State of SEO Report to create:

  • White paper reports.
  • Podcast.
  • Articles on data not included in the main reports.
  • Infographics.
  • Carousels for social media.
  • Video clips.

Some of these are released before the main report is published to help spread the word and generate interest while sharing interesting insights about the SEO industry.

Then, when the report is released, it is followed by additional content to help generate interest and links around the findings.

Therefore, instead of a week of interest, the reports generate traffic and attention while informing readers for months without significantly increasing the original investment.

10. Stand Out While Blending In

One of the more common pieces of advice is to copy successful content and do what others are doing.

Makes sense, right?

After all, SEO, good writing, and other skills all have best practices you need to follow. Your audience also has preferences, expectations, and requirements.

Your content needs to look like everyone else’s to some degree.

But here’s the problem with this advice: No one stands out if everyone does things the same way.

Therefore, learning how to blend in while standing out is an essential skill for content marketing.

So, instead of mimicking or copying successful content, collect several examples that have worked on a specific platform or for a specific audience and investigate to find out why they’re effective.

Then, you can use these insights to create and test your own content that allows you to stand out, be unique, and fulfill the needs of your target audience.

Conclusion

Effective marketing is more than choosing the right topic or quality writing.

By strengthening and utilizing these 10 content marketing skills, your content will help you generate the right traffic and connect with your audience in a way that will have you dominating the competition.

More resources:


Featured Image: Viktoria Kurpas/Shutterstock


SEO

Google Documents Leaked & SEOs Are Making Some Wild Assumptions


You’ve probably heard about the recent Google documents leak. It’s on every major site and all over social media.

Where did the docs come from?

My understanding is that a bot called yoshi-code-bot leaked docs related to the Content API Warehouse on GitHub on March 13th, 2024. They may have appeared earlier in other repos, but this is where they were first discovered.

They were discovered by an anonymous ex-Googler who shared the info with Erfan Azimi who shared it with Rand Fishkin who shared it with Mike King. The docs were removed on May 7th.

I appreciate all involved for sharing their findings with the community.

Google’s response

There was some debate about whether the documents were real, but they mention a lot of internal systems and link to internal documentation, so they certainly appear to be genuine.

A Google spokesperson released the following statement to Search Engine Land:

We would caution against making inaccurate assumptions about Search based on out-of-context, outdated, or incomplete information. We’ve shared extensive information about how Search works and the types of factors that our systems weigh, while also working to protect the integrity of our results from manipulation.

SEOs interpret things based on their own experiences and bias

Many SEOs are saying that the ranking factors leaked. I haven’t seen any code or weights, just what appear to be descriptions and storage info. Unless one of the descriptions says an item is used for ranking, I think it’s dangerous for SEOs to assume that all of these are used in ranking.

Having some features or information stored does not mean they’re used in ranking. For our search engine, Yep.com, we store all kinds of things that might be used for crawling, indexing, ranking, personalization, testing, or feedback. We even store things that we aren’t doing anything with yet.

What is more likely is that SEOs are making assumptions that favor their own opinions and biases.

It’s the same for me. I may not have full context or knowledge and may have inherent biases that influence my interpretation, but I try to be as fair as I can be. If I’m wrong, it means that I will learn something new and that’s a good thing! SEOs can, and do, interpret things differently.

Gael Breton said it well:

I’ve been around long enough to see many SEO myths created over the years and I can point you to who started many of them and what they misunderstood. We’ll likely see a lot of new myths from this leak that we’ll be dealing with for the next decade or longer.

Let’s look at a few things that in my opinion are being misinterpreted or where conclusions are being drawn where they shouldn’t be.

SiteAuthority

As much as I want to be able to say Google has a Site Authority score that it uses for ranking, like DR, that field specifically sits under compressed quality metrics and talks about quality.

I believe DR is more an effect that happens as you have a lot of pages with strong PageRank, not that it’s necessarily something Google uses. Lots of pages with higher PageRank that internally link to each other means you’re more likely to create stronger pages.

  • Do I believe that PageRank could be part of what Google calls quality? Yes.
  • Do I think that’s all of it? No.
  • Could Site Authority be something similar to DR? Maybe. It fits in the bigger picture.
  • Can I prove that or even that it’s used in rankings? No, not from this.
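The mechanism described above, where interlinked strong pages reinforce each other, falls straight out of the standard PageRank power iteration. A toy sketch; the graph, damping factor, and iteration count are illustrative only and are not anything Google is known to use:

```python
# Toy power-iteration PageRank. Pages that link to each other heavily
# accumulate more score than a page with no inbound links.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    scores = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page gets the baseline "teleport" share.
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * scores[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # Dangling page: distribute its score evenly.
                for p in pages:
                    new[p] += damping * scores[page] / n
        scores = new
    return scores

# Three pages that interlink heavily vs. one page that only links out.
links = {
    "a": ["b", "c"],
    "b": ["a", "c"],
    "c": ["a", "b"],
    "lonely": ["a"],
}
scores = pagerank(links)
# The interlinked pages end up with more score than the lonely page.
assert scores["a"] > scores["lonely"]
```

Aggregate those per-page scores to the host level and you get something that behaves a lot like DR, which is why a domain-level number can look real as an effect without being a stored ranking input.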

From some of the Google testimony to the US Department of Justice, we found out that quality is often measured with an Information Satisfaction (IS) score from the raters. This isn’t directly used in rankings, but is used for feedback, testing, and fine-tuning models.

We know the quality raters have the concept of E-E-A-T, but again that’s not exactly what Google uses. They use signals that align to E-E-A-T.

Some of the E-E-A-T signals that Google has mentioned are:

  • PageRank
  • Mentions on authoritative sites
  • Site queries. This could be “site:http://ahrefs.com E-E-A-T” or searches like “ahrefs E-E-A-T”

So could some kind of PageRank scores extrapolated to the domain level and called Site Authority be used by Google and be part of what makes up the quality signals? I’d say it’s plausible, but this leak doesn’t prove it.

I can recall three patents from Google about quality scores. One of them aligns with the site-query signal above.

I should point out that just because something is patented doesn’t mean it is used. The patent around site queries was written in part by Navneet Panda. Want to guess who the quality-related Panda algorithm was named after? I’d say there’s a good chance this one is being used.

The other two were around n-gram usage, which seemed intended to calculate a quality score for a new website, and one also mentioned time on site.

Sandbox

I think this has been misinterpreted as well. The document has a field called hostAge and refers to a sandbox, but it specifically says it’s used “to sandbox fresh spam in serving time.”

To me, that doesn’t confirm the existence of a sandbox in the way that SEOs see it where new sites can’t rank. To me, it reads like a spam protection measure.

Clicks

Are clicks used in rankings? Well, yes, and no.

We know Google uses clicks for things like personalization, timely events, testing, feedback, etc. We know they have models upon models trained on the click data, including NavBoost. But is that click data being accessed directly and used in rankings? Nothing I saw confirms that.

The problem is that SEOs are interpreting this as CTR being a ranking factor. NavBoost is made to predict which pages and features will be clicked. It’s also used to cut down on the number of returned results, which we learned from the DOJ trial.

As far as I know, nothing confirms that it takes the click data of individual pages into account to re-order the results, or that getting more people to click on your individual results would make your rankings go up.

That should be easy enough to prove if it were the case, and it’s been tried many times. I tried it years ago using the Tor network. My friend Russ Jones (may he rest in peace) tried using residential proxies.

I’ve never seen a successful version of this and people have been buying and trading clicks on various sites for years. I’m not trying to discourage you or anything. Test it yourself, and if it works, publish the study.

Rand Fishkin’s tests for searching and clicking a result at conferences years ago showed that Google used click data for trending events, and they would boost whatever result was being clicked. After the experiments, the results went right back to normal. That’s not the same as using clicks for normal rankings.

Authors

We know Google matches authors with entities in the knowledge graph and that they use them in Google news.

There seems to be a decent amount of author info in these documents, but nothing about them confirms that they’re used in rankings as some SEOs are speculating.

Was Google lying to us?

What I do disagree with wholeheartedly is SEOs being angry with the Google Search Advocates and calling them liars. They’re nice people who are just doing their job.

If they told us something wrong, it’s likely because they don’t know, they were misinformed, or they’ve been instructed to obfuscate something to prevent abuse. They don’t deserve the hate that the SEO community is giving them right now. We’re lucky that they share information with us at all.

If you think something they said is wrong, go and run a test to prove it. Or if there’s a test you want me to run, let me know. Just being mentioned in the docs is not proof that a thing is used in rankings.

Final Thoughts

While I may agree or disagree with the interpretations of other SEOs, I respect everyone who is willing to share their analysis. It’s not easy to put yourself or your thoughts out there for public scrutiny.

I also want to reiterate that unless these fields specifically say they are used in rankings, the information could just as easily be used for something else. We definitely don’t need any posts about Google’s 14,000 ranking factors.

If you want my thoughts on a particular thing, message me on X or LinkedIn.


