
Using Python + Streamlit To Find Striking Distance Keyword Opportunities


Python is an excellent tool to automate repetitive tasks as well as gain additional insights into data.

In this article, you’ll learn how to build a tool that checks which keywords are close to ranking in positions one to three and flags whether there is an opportunity to naturally work those keywords into the page.

It’s perfect for Python beginners and pros alike and is a great introduction to using Python for SEO.

If you’d just like to get stuck in, there’s a handy Streamlit app available for the code. It’s simple to use and requires no coding experience.

There’s also a Google Colaboratory Sheet if you’d like to poke around with the code. If you can crawl a website, you can use this script!

Here’s an example of what we’ll be making today:

Screenshot from Microsoft Excel, October 2021: An Excel sheet documenting on-page keyword opportunities generated with Python.

These keywords are found in the page title and H1, but not in the copy. Adding these keywords naturally to the existing copy would be an easy way to increase relevancy for these keywords.

By taking the hint from search engines and naturally including any missing keywords a site already ranks for, we increase the confidence of search engines to rank those keywords higher in the SERPs.

This report can be created manually, but it’s pretty time-consuming.

So, we’re going to automate the process using a Python SEO script.

Preview Of The Output

This is a sample of what the final output will look like after running the report:

Screenshot from Microsoft Excel, October 2021: Excel sheet showing an example of keywords that can be optimized using the striking distance report.

The final output takes the top five opportunities by search volume for each page and neatly lays each one horizontally along with the estimated search volume.

It also shows the total search volume of all keywords a page has within striking distance, as well as the total number of keywords within reach.

The top five keywords by search volume are then checked to see if they are found in the title, H1, or copy, then flagged TRUE or FALSE.

This is great for finding quick wins! Just add the missing keyword naturally into the page copy, title, or H1.

Getting Started

The setup is fairly straightforward. We just need a crawl of the site (ideally with a custom extraction for the copy you’d like to check), and an exported file of all keywords a site ranks for.

This post will walk you through the setup, the code, and will link to a Google Colaboratory sheet if you just want to get stuck in without coding it yourself.

To get started, you will need two things: a crawl of the target website and an export of all the keywords the site ranks for (both are covered below).

We’ve named this the Striking Distance Report as it flags keywords that are easily within striking distance.

(We have defined striking distance as keywords that rank in positions four to 20, but have made this a configurable option in case you would like to define your own parameters.)

Striking Distance SEO Report: Getting Started

1. Crawl The Target Website

  • Set a custom extractor for the page copy (optional, but recommended).
  • Filter out pagination pages from the crawl.

2. Export All Keywords The Site Ranks For Using Your Favorite Provider

  • Filter keywords that trigger as a site link.
  • Remove keywords that trigger as an image.
  • Filter branded keywords.
  • Use both exports to create an actionable Striking Distance report from the keyword and crawl data with Python.

Crawling The Site

I’ve opted to use Screaming Frog to get the initial crawl. Any crawler will work, so long as the CSV export uses the same column names or they’re renamed to match.

The script expects to find the following columns in the crawl CSV export:

"Address", "Title 1", "H1-1", "Copy 1", "Indexability"

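If you’re using a different crawler, a quick rename before uploading is usually all that’s needed. Here’s a minimal sketch (the left-hand column names and the crawl_export.csv filename are hypothetical placeholders, not part of the original script):

# hypothetical example: map another crawler's export onto the column names the script expects
# the left-hand names and the filename are placeholders - swap in your own export's headers
import pandas as pd

df_check = pd.read_csv("crawl_export.csv", dtype="str")
df_check.rename(
    columns={
        "URL": "Address",         # your crawler's URL column
        "Page Title": "Title 1",  # your crawler's title column
        "First H1": "H1-1",       # your crawler's H1 column
        "Body Text": "Copy 1",    # your custom extraction column
    },
    inplace=True,
)

# sanity check that everything the script needs is now present
missing = {"Address", "Title 1", "H1-1", "Copy 1", "Indexability"} - set(df_check.columns)
print("Missing columns:", missing or "none")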
Crawl Settings

The first thing to do is to head over to the main configuration settings within Screaming Frog:

Configuration > Spider > Crawl

The main settings to use are:

Crawl Internal Links, Canonicals, and the Pagination (Rel Next/Prev) setting.

(The script will work with everything else selected, but the crawl will take longer to complete!)

Screenshot from Screaming Frog, October 2021: Recommended Screaming Frog crawl settings.

Next, it’s on to the Extraction tab.

Configuration > Spider > Extraction

Screenshot from Screaming Frog, October 2021: Recommended Screaming Frog extraction settings.

At a bare minimum, we need to extract the page title and H1, and determine whether the page is indexable, as shown in the extraction settings above.

Indexability is useful because it’s an easy way for the script to identify which URLs to drop in one go, leaving only keywords that are eligible to rank in the SERPs.

If the script cannot find the indexability column, it’ll still work as normal but won’t differentiate between pages that can and cannot rank.

Setting A Custom Extractor For Page Copy

In order to check whether a keyword is found within the page copy, we need to set a custom extractor in Screaming Frog.

Configuration > Custom > Extraction

Name the extractor “Copy” as seen below.

Screenshot from Screaming Frog, October 2021: Custom extraction showing the default options for extracting the page copy.

Important: The script expects the extractor to be named “Copy” as above, so please double check!

Lastly, make sure Extract Text is selected to export the copy as text, rather than HTML.

There are many guides on using custom extractors online if you need help setting one up, so I won’t go over it again here.

Once the extraction has been set, it’s time to crawl the site and export the Internal HTML report in CSV format.

Exporting The CSV File

Exporting the CSV file is as easy as changing the drop-down menu displayed underneath Internal to HTML and pressing the Export button.

Internal > HTML > Export

Screenshot from Screaming Frog, October 2021: Export Internal HTML settings.

After clicking Export, it’s important to make sure the type is set to CSV format.

The export screen should look like the below:

Screenshot from Screaming Frog, October 2021: Internal HTML CSV export settings.

Tip 1: Filtering Out Pagination Pages

I recommend filtering out pagination pages from your crawl either by selecting Respect Next/Prev under the Advanced settings (or just deleting them from the CSV file, if you prefer).

Screenshot from Screaming Frog, October 2021: Settings to Respect Rel Next/Prev.

Tip 2: Saving The Crawl Settings

Once you have set the crawl up, it’s worth just saving the crawl settings (which will also remember the custom extraction).

This will save a lot of time if you want to use the script again in the future.

File > Configuration > Save As

Screenshot from Screaming Frog, October 2021: How to save a configuration file in Screaming Frog.

Exporting Keywords

Once we have the crawl file, the next step is to load your favorite keyword research tool and export all of the keywords a site ranks for.

The goal here is to export all the keywords a site ranks for, filtering out branded keywords and any which triggered as a sitelink or image.

For this example, I’m using the Organic Keyword Report in Ahrefs, but it will work just as well with Semrush if that’s your preferred tool.

In Ahrefs, enter the domain you’d like to check in Site Explorer and choose Organic Keywords.

Screenshot from Ahrefs.com, October 2021: Ahrefs Site Explorer settings.

Site Explorer > Organic Keywords

Screenshot from Ahrefs.com, October 2021: How to export the organic keywords a site ranks for.

This will bring up all keywords the site is ranking for.

Filtering Out Sitelinks And Image Links

The next step is to filter out any keywords triggered as a sitelink or an image pack.

The reason we need to filter out sitelinks is that they have no influence on the parent URL ranking. This is because only the parent page technically ranks for the keyword, not the sitelink URLs displayed under it.

Filtering out sitelinks will ensure that we are optimizing the correct page.

Screenshot from Ahrefs.com, October 2021: Pages ranking for sitelink keywords.

Here’s how to do it in Ahrefs.

Screenshot from Ahrefs.com, October 2021: How to exclude images and sitelinks from a keyword export.

Lastly, I recommend filtering out any branded keywords. You can do this by filtering the CSV output directly, or by pre-filtering in the keyword tool of your choice before the export.

Finally, when exporting make sure to choose Full Export and the UTF-8 format as shown below.

Screenshot from Ahrefs.com, October 2021: How to export keywords in UTF-8 format as a CSV file.

By default, the script works with Ahrefs (v1/v2) and Semrush keyword exports. It can work with any keyword CSV file as long as the column names the script expects are present.
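If your keyword tool is neither Ahrefs nor Semrush, you can check your export against the two naming conventions the script accepts (based on the rename step later in this walkthrough). A small sketch, assuming a placeholder filename:

# minimal sketch: check a keyword CSV against the column names the script can handle
# "keyword_export.csv" is a placeholder filename
import pandas as pd

df_check = pd.read_csv("keyword_export.csv", nrows=5)
accepted = [
    {"Keyword", "Volume", "Position", "URL"},                        # Ahrefs-style naming
    {"Keyword", "Search Volume", "Current position", "Current URL"}, # Semrush-style naming
]
ok = any(cols <= set(df_check.columns) for cols in accepted)
print("Looks compatible!" if ok else "Rename your columns to match one of the accepted sets.")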

Processing

The following instructions pertain to running a Google Colaboratory sheet to execute the code.

There is now a simpler option for those who prefer it, in the form of a Streamlit app. Simply follow the instructions provided to upload your crawl and keyword file.

Now that we have our exported files, all that’s left to be done is to upload them to the Google Colaboratory sheet for processing.

Select Runtime > Run all from the top navigation to run all cells in the sheet.

Screenshot from Colab.research.google.com, October 2021: How to run the striking distance Python script from Google Colaboratory.

The script will prompt you to upload the keyword CSV from Ahrefs or Semrush first and the crawl file afterward.

Screenshot from Colab.research.google.com, October 2021: How to upload the CSV files to Google Colaboratory.

That’s it! The script will automatically download an actionable CSV file you can use to optimize your site.

Screenshot from Microsoft Excel, October 2021: The striking distance final output.

Once you’re familiar with the whole process, using the script is really straightforward.

Code Breakdown And Explanation

If you’re learning Python for SEO and interested in what the code is doing to produce the report, stick around for the code walkthrough!

Install The Libraries

Let’s install pandas to get the ball rolling.

!pip install pandas
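Colab ships with pandas preinstalled, so this line mostly confirms it’s available. If you run the script locally, it’s worth checking the version, because the on_bad_lines option used in the read_csv calls below needs pandas 1.3 or newer:

import pandas as pd
print(pd.__version__)  # on_bad_lines in read_csv requires pandas >= 1.3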

Import The Modules

Next, we need to import the required modules.

import pandas as pd
from pandas import DataFrame, Series
from typing import Union
from google.colab import files

Set The Variables

Now it’s time to set the variables.

The script considers any keywords between positions four and 20 as within striking distance.

Changing the variables here will let you define your own range if desired. It’s worth experimenting with the settings to get the best possible output for your needs.

# set all variables here
min_volume = 10  # set the minimum search volume
min_position = 4  # set the minimum position  / default = 4
max_position = 20 # set the maximum position  / default = 20
drop_all_true = True  # If all checks (h1/title/copy) are true, remove the recommendation (Nothing to do)
pagination_filters = "filterby|page|p="  # filter patterns used to detect and drop paginated pages

Upload The Keyword Export CSV File

The next step is to read in the list of keywords from the CSV file.

It is set up to accept an Ahrefs report (V1 and V2) as well as a Semrush export.

This code reads the CSV file into a Pandas DataFrame.

upload = files.upload()
upload = list(upload.keys())[0]
df_keywords = pd.read_csv(
    upload,
    on_bad_lines="skip",  # replaces the deprecated error_bad_lines argument (removed in pandas 2.0)
    low_memory=False,
    encoding="utf8",
    dtype={
        "URL": "str",
        "Keyword": "str",
        "Volume": "str",
        "Position": int,
        "Current URL": "str",
        "Search Volume": int,
    },
)
print("Uploaded Keyword CSV File Successfully!")

If everything went to plan, you’ll see a preview of the DataFrame created from the keyword CSV export. 

Screenshot from Colab.research.google.com, October 2021: DataFrame showing a successful upload of the keyword export file.

Upload The Crawl Export CSV File

Once the keywords have been imported, it’s time to upload the crawl file.

This fairly simple piece of code reads in the crawl CSV with some basic error handling and creates a Pandas DataFrame named df_crawl.

upload = files.upload()
upload = list(upload.keys())[0]
df_crawl = pd.read_csv(
    upload,
    on_bad_lines="skip",  # replaces the deprecated error_bad_lines argument (removed in pandas 2.0)
    low_memory=False,
    encoding="utf8",
    dtype="str",
)
print("Uploaded Crawl Dataframe Successfully!")

Once the CSV file has finished uploading, you’ll see a preview of the DataFrame.

Screenshot from Colab.research.google.com, October 2021: DataFrame showing the crawl file uploaded successfully.

Clean And Standardize The Keyword Data

The next step is to rename the columns to ensure standardization between the most common types of file exports.

Essentially, we’re getting the keyword DataFrame into a good state and filtering using cutoffs defined by the variables.

df_keywords.rename(
    columns={
        "Current position": "Position",
        "Current URL": "URL",
        "Search Volume": "Volume",
    },
    inplace=True,
)

# keep only the following columns from the keyword dataframe
cols = "URL", "Keyword", "Volume", "Position"
df_keywords = df_keywords.reindex(columns=cols)

try:
    # clean the data. (v1 of the ahrefs keyword export combines strings and ints in the volume column)
    df_keywords["Volume"] = df_keywords["Volume"].str.replace("0-10", "0")
except AttributeError:
    pass

# clean the keyword data
df_keywords = df_keywords[df_keywords["URL"].notna()]  # remove any missing values
df_keywords = df_keywords[df_keywords["Volume"].notna()]  # remove any missing values
df_keywords = df_keywords.astype({"Volume": int})  # change data type to int
df_keywords = df_keywords.sort_values(by="Volume", ascending=False)  # sort by highest vol to keep the top opportunity

# make new dataframe to merge search volume back in later
df_keyword_vol = df_keywords[["Keyword", "Volume"]]

# drop rows if minimum search volume doesn't match specified criteria
df_keywords.loc[df_keywords["Volume"] < min_volume, "Volume_Too_Low"] = "drop"
df_keywords = df_keywords[~df_keywords["Volume_Too_Low"].isin(["drop"])]

# drop rows ranking better than the minimum position (positions 1-3 by default)
# strict comparisons keep the configured bounds (positions 4 and 20 by default) within striking distance
df_keywords.loc[df_keywords["Position"] < min_position, "Position_Too_High"] = "drop"
df_keywords = df_keywords[~df_keywords["Position_Too_High"].isin(["drop"])]
# drop rows ranking beyond the maximum position (positions 21 and beyond by default)
df_keywords.loc[df_keywords["Position"] > max_position, "Position_Too_Low"] = "drop"
df_keywords = df_keywords[~df_keywords["Position_Too_Low"].isin(["drop"])]

Clean And Standardize The Crawl Data

Next, we need to clean and standardize the crawl data.

Essentially, we use reindex to keep only the “Address,” “Indexability,” “Title 1,” “H1-1,” and “Copy 1” columns, discarding the rest.

We use the handy “Indexability” column to only keep rows that are indexable. This will drop canonicalized URLs, redirects, and so on. I recommend enabling this option in the crawl.

Lastly, we standardize the column names so they’re a little nicer to work with.

# keep only the following columns from the crawl dataframe
cols = "Address", "Indexability", "Title 1", "H1-1", "Copy 1"
df_crawl = df_crawl.reindex(columns=cols)
# drop non-indexable rows
df_crawl = df_crawl[~df_crawl["Indexability"].isin(["Non-Indexable"])]
# standardise the column names
df_crawl.rename(columns={"Address": "URL", "Title 1": "Title", "H1-1": "H1", "Copy 1": "Copy"}, inplace=True)
df_crawl.head()
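One thing to note: the pagination_filters variable set earlier isn’t applied anywhere in the snippets shown here. If you didn’t filter pagination out at crawl time, one way it could be applied to the cleaned crawl DataFrame (my assumption, not necessarily the exact line from the full script) is:

# drop paginated URLs using the regex patterns defined in the variables cell
# illustrative sketch only - the full script may apply this filter differently
df_crawl = df_crawl[~df_crawl["URL"].str.contains(pagination_filters, na=False)]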

Group The Keywords

As we approach the final output, it’s necessary to group our keywords together to calculate the total opportunity for each page.

Here, we’re calculating how many keywords are within striking distance for each page, along with the combined search volume.

# groups the URLs (remove the dupes and combines stats)
# make a copy of the keywords dataframe for grouping - this ensures stats can be merged back in later from the OG df
df_keywords_group = df_keywords.copy()
df_keywords_group["KWs in Striking Dist."] = 1  # used to count the number of keywords in striking distance
df_keywords_group = (
    df_keywords_group.groupby("URL")
    .agg({"Volume": "sum", "KWs in Striking Dist.": "count"})
    .reset_index()
)
df_keywords_group.head()

Screenshot from Colab.research.google.com, October 2021: DataFrame showing how many keywords were found within striking distance.

Once complete, you’ll see a preview of the DataFrame.

Display Keywords In Adjacent Rows

We use the grouped data as the basis for the final output, reshaping it with pandas’ unstack() method to display the keywords in the style of a GrepWords export.

Screenshot from Colab.research.google.com, October 2021: DataFrame showing a GrepWords-style view of keywords laid out horizontally.

# create a new df, combine the merged data with the original data. display in adjacent rows ala grepwords
df_merged_all_kws = df_keywords_group.merge(
    df_keywords.groupby("URL")["Keyword"]
    .apply(lambda x: x.reset_index(drop=True))
    .unstack()
    .reset_index()
)

# sort by biggest opportunity
df_merged_all_kws = df_merged_all_kws.sort_values(
    by="KWs in Striking Dist.", ascending=False
)

# reindex the columns to keep just the top five keywords
cols = "URL", "Volume", "KWs in Striking Dist.", 0, 1, 2, 3, 4
df_merged_all_kws = df_merged_all_kws.reindex(columns=cols)

# create union and rename the columns
df_striking: Union[Series, DataFrame, None] = df_merged_all_kws.rename(
    columns={
        "Volume": "Striking Dist. Vol",
        0: "KW1",
        1: "KW2",
        2: "KW3",
        3: "KW4",
        4: "KW5",
    }
)

# merge the striking distance df with the crawl df to pull in the title, h1 and copy
df_striking = pd.merge(df_striking, df_crawl, on="URL", how="inner")

Set The Final Column Order And Insert Placeholder Columns

Lastly, we set the final column order and insert placeholder columns for the keyword checks; the keyword volume data is merged in during the next step.

There are a lot of columns to sort and create!

# set the final column order and merge the keyword data in

cols = [
    "URL",
    "Title",
    "H1",
    "Copy",
    "Striking Dist. Vol",
    "KWs in Striking Dist.",
    "KW1",
    "KW1 Vol",
    "KW1 in Title",
    "KW1 in H1",
    "KW1 in Copy",
    "KW2",
    "KW2 Vol",
    "KW2 in Title",
    "KW2 in H1",
    "KW2 in Copy",
    "KW3",
    "KW3 Vol",
    "KW3 in Title",
    "KW3 in H1",
    "KW3 in Copy",
    "KW4",
    "KW4 Vol",
    "KW4 in Title",
    "KW4 in H1",
    "KW4 in Copy",
    "KW5",
    "KW5 Vol",
    "KW5 in Title",
    "KW5 in H1",
    "KW5 in Copy",
]

# re-index the columns to place them in a logical order + inserts new blank columns for kw checks.
df_striking = df_striking.reindex(columns=cols)

Merge In The Keyword Data For Each Column

This code merges the keyword volume data back into the DataFrame. It’s more or less the equivalent of an Excel VLOOKUP function.

# merge in keyword data for each keyword column (KW1 - KW5)
df_striking = pd.merge(df_striking, df_keyword_vol, left_on="KW1", right_on="Keyword", how="left")
df_striking['KW1 Vol'] = df_striking['Volume']
df_striking.drop(['Keyword', 'Volume'], axis=1, inplace=True)
df_striking = pd.merge(df_striking, df_keyword_vol, left_on="KW2", right_on="Keyword", how="left")
df_striking['KW2 Vol'] = df_striking['Volume']
df_striking.drop(['Keyword', 'Volume'], axis=1, inplace=True)
df_striking = pd.merge(df_striking, df_keyword_vol, left_on="KW3", right_on="Keyword", how="left")
df_striking['KW3 Vol'] = df_striking['Volume']
df_striking.drop(['Keyword', 'Volume'], axis=1, inplace=True)
df_striking = pd.merge(df_striking, df_keyword_vol, left_on="KW4", right_on="Keyword", how="left")
df_striking['KW4 Vol'] = df_striking['Volume']
df_striking.drop(['Keyword', 'Volume'], axis=1, inplace=True)
df_striking = pd.merge(df_striking, df_keyword_vol, left_on="KW5", right_on="Keyword", how="left")
df_striking['KW5 Vol'] = df_striking['Volume']
df_striking.drop(['Keyword', 'Volume'], axis=1, inplace=True)
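The five near-identical blocks above can also be written as a loop. The sketch below has the same behavior, just with less repetition:

# equivalent loop over the five keyword slots - same merge / copy / drop pattern as above
for i in range(1, 6):
    df_striking = pd.merge(df_striking, df_keyword_vol, left_on=f"KW{i}", right_on="Keyword", how="left")
    df_striking[f"KW{i} Vol"] = df_striking["Volume"]
    df_striking.drop(["Keyword", "Volume"], axis=1, inplace=True)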

Clean The Data Some More

The data requires additional cleaning to populate empty values (NaNs) as empty strings. This improves the readability of the final output by creating blank cells instead of cells populated with NaN string values.

Next, we convert the columns to lowercase so that they match when checking whether a target keyword is featured in a specific column.

# replace nan values with empty strings
df_striking = df_striking.fillna("")
# convert the title, h1 and copy to lower case so kws can be matched against them
df_striking["Title"] = df_striking["Title"].str.lower()
df_striking["H1"] = df_striking["H1"].str.lower()
df_striking["Copy"] = df_striking["Copy"].str.lower()

Check Whether The Keyword Appears In The Title/H1/Copy and Return True Or False

This code checks if the target keyword is found in the page title/H1 or copy.

It’ll flag true or false depending on whether a keyword was found within the on-page elements.

df_striking["KW1 in Title"] = df_striking.apply(lambda row: row["KW1"] in row["Title"], axis=1)
df_striking["KW1 in H1"] = df_striking.apply(lambda row: row["KW1"] in row["H1"], axis=1)
df_striking["KW1 in Copy"] = df_striking.apply(lambda row: row["KW1"] in row["Copy"], axis=1)
df_striking["KW2 in Title"] = df_striking.apply(lambda row: row["KW2"] in row["Title"], axis=1)
df_striking["KW2 in H1"] = df_striking.apply(lambda row: row["KW2"] in row["H1"], axis=1)
df_striking["KW2 in Copy"] = df_striking.apply(lambda row: row["KW2"] in row["Copy"], axis=1)
df_striking["KW3 in Title"] = df_striking.apply(lambda row: row["KW3"] in row["Title"], axis=1)
df_striking["KW3 in H1"] = df_striking.apply(lambda row: row["KW3"] in row["H1"], axis=1)
df_striking["KW3 in Copy"] = df_striking.apply(lambda row: row["KW3"] in row["Copy"], axis=1)
df_striking["KW4 in Title"] = df_striking.apply(lambda row: row["KW4"] in row["Title"], axis=1)
df_striking["KW4 in H1"] = df_striking.apply(lambda row: row["KW4"] in row["H1"], axis=1)
df_striking["KW4 in Copy"] = df_striking.apply(lambda row: row["KW4"] in row["Copy"], axis=1)
df_striking["KW5 in Title"] = df_striking.apply(lambda row: row["KW5"] in row["Title"], axis=1)
df_striking["KW5 in H1"] = df_striking.apply(lambda row: row["KW5"] in row["H1"], axis=1)
df_striking["KW5 in Copy"] = df_striking.apply(lambda row: row["KW5"] in row["Copy"], axis=1)
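As with the volume merges, these checks can be collapsed into a loop if you prefer. A sketch with identical behavior:

# loop version of the keyword presence checks above
for i in range(1, 6):
    for element in ["Title", "H1", "Copy"]:
        df_striking[f"KW{i} in {element}"] = df_striking.apply(
            lambda row, kw=f"KW{i}", el=element: row[kw] in row[el], axis=1
        )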

Delete True/False Values If There Is No Keyword

This will delete true/false values when there is no keyword adjacent.

# delete true / false values if there is no keyword
df_striking.loc[df_striking["KW1"] == "", ["KW1 in Title", "KW1 in H1", "KW1 in Copy"]] = ""
df_striking.loc[df_striking["KW2"] == "", ["KW2 in Title", "KW2 in H1", "KW2 in Copy"]] = ""
df_striking.loc[df_striking["KW3"] == "", ["KW3 in Title", "KW3 in H1", "KW3 in Copy"]] = ""
df_striking.loc[df_striking["KW4"] == "", ["KW4 in Title", "KW4 in H1", "KW4 in Copy"]] = ""
df_striking.loc[df_striking["KW5"] == "", ["KW5 in Title", "KW5 in H1", "KW5 in Copy"]] = ""
df_striking.head()

Drop Rows If All Values == True

This configurable option is really useful for reducing the amount of QA time required: if a keyword is already found in the title, H1, and copy, that opportunity is dropped from the final output.

def true_dropper(col1, col2, col3):
    drop = df_striking.drop(
        df_striking[
            (df_striking[col1] == True)
            & (df_striking[col2] == True)
            & (df_striking[col3] == True)
        ].index
    )
    return drop

if drop_all_true:
    df_striking = true_dropper("KW1 in Title", "KW1 in H1", "KW1 in Copy")
    df_striking = true_dropper("KW2 in Title", "KW2 in H1", "KW2 in Copy")
    df_striking = true_dropper("KW3 in Title", "KW3 in H1", "KW3 in Copy")
    df_striking = true_dropper("KW4 in Title", "KW4 in H1", "KW4 in Copy")
    df_striking = true_dropper("KW5 in Title", "KW5 in H1", "KW5 in Copy")
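Those five calls can likewise be generated in a loop, which produces the same result:

# same logic as the five explicit calls above
if drop_all_true:
    for i in range(1, 6):
        df_striking = true_dropper(f"KW{i} in Title", f"KW{i} in H1", f"KW{i} in Copy")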

Download The CSV File

The last step is to download the CSV file and start the optimization process.

df_striking.to_csv('Keywords in Striking Distance.csv', index=False)
files.download("Keywords in Striking Distance.csv")
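If you’d rather open the results straight in Excel without the CSV import step, pandas can also write an .xlsx file. This needs the openpyxl package (preinstalled in Colab); the filename here is just an example:

# optional: write an Excel file instead of (or as well as) the CSV
df_striking.to_excel("Keywords in Striking Distance.xlsx", index=False)
files.download("Keywords in Striking Distance.xlsx")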

Conclusion

If you are looking for quick wins for any website, the striking distance report is a really easy way to find them.

Don’t let the number of steps fool you. It’s not as complex as it seems. It’s as simple as uploading a crawl and keyword export to the supplied Google Colab sheet or using the Streamlit app.

The results are definitely worth it!

Featured Image: aurielaki/Shutterstock


ChatGPT Plus Upgrades Paused; Waitlisted Users Receive Invites


ChatGPT Plus subscriptions and upgrades remain paused after a surge in demand for new features created outages.

Some users who signed up for the waitlist have received invites to join ChatGPT Plus.

Screenshot from Gmail, December 2023.

This has resulted in a few shares of the invite link, which is accessible to everyone. For now.

RELATED: GPT Store Set To Launch In 2024 After ‘Unexpected’ Delays

In addition to the invites, signs that more people are getting access to GPTs include an introductory screen popping up on free ChatGPT accounts.

Screenshot from ChatGPT, December 2023.

Unfortunately, they still aren’t accessible without a Plus subscription.

Screenshot from ChatGPT, December 2023.

You can sign up for the waitlist by clicking on the option to upgrade in the left sidebar of ChatGPT on a desktop browser.

Screenshot from ChatGPT, December 2023.

OpenAI also suggests ChatGPT Enterprise for those who need more capabilities, as outlined in the pricing plans below.

Screenshot from OpenAI, December 2023.

Why Are ChatGPT Plus Subscriptions Paused?

According to a post on X by OpenAI’s CEO Sam Altman, the recent surge in usage following the DevDay developers conference has led to capacity challenges, resulting in the decision to pause ChatGPT Plus signups.

The decision to pause new ChatGPT signups follows a week in which OpenAI services – including ChatGPT and the API – experienced a series of outages related to high demand and DDoS attacks.

Demand for ChatGPT Plus resulted in eBay listings supposedly offering one or more months of the premium subscription.

When Will ChatGPT Plus Subscriptions Resume?

So far, we don’t have any official word on when ChatGPT Plus subscriptions will resume. We know the GPT Store is set to open early next year after recent boardroom drama led to “unexpected delays.”

Therefore, we hope that OpenAI will onboard waitlisted users in time to try out all of the GPTs created by OpenAI and community builders.

What Are GPTs?

GPTs allow users to create one or more personalized ChatGPT experiences based on a specific set of instructions, knowledge files, and actions.

Search marketers with ChatGPT Plus can try GPTs for helpful content assessment and learning SEO.

There are also GPTs for analyzing Google Search Console data.

And GPTs that will let you chat with analytics data from 20 platforms, including Google Ads, GA4, and Facebook.

Google search has indexed hundreds of public GPTs. According to an alleged list of GPT statistics in a GitHub repository, DALL-E, the top GPT from OpenAI, has received 5,620,981 visits since its launch last month. Included in the top 20 GPTs is Canva, with 291,349 views.

 

Weighing The Benefits Of The Pause

Ideally, this means that developers working on building GPTs and using the API should encounter fewer issues (like being unable to save GPT drafts).

But it could also mean a temporary decrease in new users of GPTs since they are only available to Plus subscribers – including the ones I tested for learning about ranking factors and gaining insights on E-E-A-T from Google’s Search Quality Rater Guidelines.

Screenshot from ChatGPT, November 2023: Custom GPTs for SEO.

Featured image: Robert Way/Shutterstock




The Best Times To Post On Social Media In 2024


Marketers worldwide know the importance of having a solid social media marketing strategy – and a key part of this is finding the best times to post on social media.

The old adage ‘timing is everything’ holds especially true in the world of social media, where the difference between a post that fades into obscurity and one that goes viral can often be just a matter of when it was shared.

With an always-growing array of social platforms hosting billions of users worldwide, it has never been more challenging to stand above the noise and make your voice heard on social.

To determine the best times to post on social media in 2024, we reviewed original data from leading social media management tools.

It’s important to note that the data from these sources present a variety of findings and suggestions, which underscore the fact that social media is an ever-evolving landscape. The most crucial thing is understanding the behavior of your own target audience.

Let’s dive in.

The Best Times To Post On Social Media

Source / Day Of Week / Time To Post:

  • Sprout Social: Tuesday and Wednesday, 9 a.m. – 2 p.m. Local.
  • Hootsuite: Monday, 12 p.m. EST.
  • CoSchedule: Friday, Wednesday, and Monday (in that order), 7 p.m. Local.

  • Best times to post on social media: 9 a.m. – 2 p.m.
  • Best days to post on social media: Monday and Wednesday.
  • Worst days to post on social media: Saturday and Sunday.

Determining an ideal time for posting on social media in general is complicated, as each platform is different, with unique users, features, and communities.

When deciding which social media platforms to focus on, you should think carefully about your brand’s target audience and overarching goals.

If you’re looking to reach a network of professionals, LinkedIn might be a good fit; if your brand is hoping to speak to Gen Z consumers, you might consider TikTok or Snapchat.

This explains why – when analyzing data from Sprout Social, Hootsuite, and CoSchedule on the best overall times to post on social media – we can draw some similarities but also see a variety of recommendations.

Weekdays emerge as a clear winner. CoSchedule and Sprout Social both highlight Wednesday as a good day, with Hootsuite and CoSchedule also highlighting Mondays as a strong day for engagement.

The most common time range among the sources is in the morning to mid-afternoon, with CoSchedule providing some very specific suggestions for post-timing.

Both CoSchedule and Sprout Social agree on avoiding Saturdays and Sundays.

The Best Times To Post On Facebook

Source / Day Of Week / Time To Post:

  • Sprout Social: Monday to Thursday, 8 a.m. – 1 p.m. Local.
  • Hootsuite: Monday and Tuesday, 1 p.m. EST.
  • CoSchedule: Friday, Wednesday, and Monday (in that order), 9 a.m. Local.

  • Best times to post on Facebook: 8 a.m. – 1 p.m.
  • Best days to post on Facebook: Weekdays.
  • Worst day to post on Facebook: Sunday.

Facebook remains the most used social media platform in the world, with the largest advertising market share (16%).

While it’s experienced a shift in user demographics over recent years – now catering to older users – its popularity continues to climb, and its potential as a brand marketing tool cannot be disputed.

Regarding the best times to post on Facebook, all of our sources agree that weekdays are best. Sprout Social, Hootsuite, and CoSchedule all name Monday as a great day to engage on Facebook, along with various other days of the week.

There is a general consensus that Sundays should be avoided.

The sources vary in their suggestions for optimal time slots, but generally speaking, early to mid-morning seems to be the most popular selection.

The Best Times To Post On YouTube

Source / Day Of Week / Time To Post:

  • SocialPilot: Sunday, 2-4 p.m. EST.
  • HubSpot: Friday and Saturday, 6-9 p.m. Local.

  • Best times to post on YouTube: 2-4 p.m. on weekdays and 9-11 a.m. on weekends.
  • Best days to post on YouTube: Friday, Saturday, and Sunday.
  • Worst day to post on YouTube: Tuesday.

As the second most visited site in the world and the second most used social platform globally, YouTube offers an unparalleled opportunity for brands and individuals to connect with audiences through video.

And with its continued expansion – by introducing features like YouTube Shorts, initiatives like expanding the ways creators can get paid on the platform, and its increasing popularity as a search engine – the platform shows no signs of slowing.

YouTube is no longer just a video-sharing site; it’s a robust marketing tool that empowers businesses to raise brand awareness and drive meaningful engagement.

Finding recent data on the best times to post on YouTube proved harder than for some other channels, so these recommendations should be taken with a grain of salt.

While HubSpot suggests Friday and Saturday are the strongest days to publish on YouTube, SocialPilot specifically calls out Sunday as the most engaging day – so it’s worth experimenting with all three.

SocialPilot doesn’t specifically name the worst day, but according to HubSpot, you’d be wise to steer clear of Tuesday.

Both sources suggest the afternoon as an effective time for posting during the week. SocialPilot specifies that publishing in the mornings on weekends (9-11 a.m.) is effective, so this is important to bear in mind.

The Best Times To Post On Instagram

Source / Day Of Week / Time To Post:

  • Sprout Social: Tuesday and Wednesday, 9 a.m. – 1 p.m. Local.
  • Hootsuite: Wednesday, 2 p.m. EST.
  • HubSpot: Saturday, 6-9 p.m. Local.
  • CoSchedule: Wednesday, Friday, and Tuesday (in that order), 9 a.m. Local.
  • Later: Monday, 4 a.m. Local.

  • Best times to post on Instagram: 8 a.m. to 1 p.m.
  • Best day to post on Instagram: Wednesday.
  • Worst day to post on Instagram: Sunday.

From its origins as a photo-sharing platform, Instagram has evolved into one of the most popular social media networks in the world – and an indispensable marketing tool.

With billions of users – 90% of whom are following at least one business – Instagram has become a powerful engine for ecommerce, brand awareness, and community-building.

As a leader in the social media space, Instagram constantly provides new formats and features for users to try out – from Reels to Stories, user quizzes and polls, and more.

We consulted a handful of sources to determine the top posting times for Instagram and came away with a mixed bag of answers.

Wednesday appears to take the cake as the most consistently recommended day, with CoSchedule, Sprout Social, and Hootsuite all suggesting it.

Generally, our sources seem to lean towards weekdays as being strongest for Instagram engagement – with the exception of HubSpot, which recommends Saturday.

In terms of timing, the morning to midday hours seem to be your best bet, especially around 8 a.m. through 1 p.m. HubSpot and Later provide times that significantly differ from other sources, which suggests that effectiveness can vary based on audience and content type.

The Best Times To Post On TikTok

Source / Day Of Week / Time To Post:

  • Sprout Social: Tuesday and Wednesday, 2-6 p.m. Local.
  • Hootsuite: Thursday, 10 p.m. EST.
  • SocialPilot: Tuesday and Thursday, 2 a.m. and 9 a.m. EST.
  • HubSpot: Friday, 6-9 p.m. Local.

  • Best time to post on TikTok: Inconclusive.
  • Best day to post on TikTok: Tuesday.
  • Worst day to post on TikTok: Inconclusive.

While it’s a relative newcomer to the fold, TikTok has quickly become one of the most beloved social platforms worldwide – and is drawing brands in increasing numbers.

With the average user spending nearly 54 minutes on the app daily, it’s hard to beat the hold that TikTok has among audiences. By optimizing your presence there, you can stand to generate some impressive returns on your marketing efforts.

So, what’s the best time to post on TikTok? The jury is out on this one – and it may take extra experimentation on your part to find the sweet spot that engages your audience.

Tuesday seems to rise to the top among the sources we consulted, with Wednesdays and Thursdays also getting recommendations. Generally speaking, it looks like midweek is a good time to test out your TikTok content, but there are plenty of discrepancies in the data.

While HubSpot named Friday as the best day, it also highlighted that Saturdays and Thursdays are strong for B2B brands, and Saturdays and Sundays work well for B2C brands.

Sprout Social found Sunday to be the worst performing day, while Monday and Tuesday are the worst days, according to HubSpot.

We also found a mix of recommended time slots, ranging from early morning to mid-afternoon and evening.

The Best Times To Post On Snapchat

Snapchat, the pioneer of ephemeral social media content (and the inspiration behind Instagram Stories), provides unique opportunities to reach younger demographics.

It differs from other platforms in how it works and the type of content that engages there. Snapchat typically centers around showcasing real-time experiences and authentic behind-the-scenes content versus polished marketing content.

This makes Snapchat an advantageous yet often underutilized tool in digital marketing. But it should not be overlooked, especially given that the platform continues to innovate.

While we have seen 10 a.m. – 1 p.m. cited as the best times to post on Snapchat in various secondary sources around the internet, we have found no recent original data to either confirm or refute this.

Given this, we would recommend testing out different times and days based on the behaviors and lifestyles of your target audience and then iterating based on your results (which is what you should be doing across the board, regardless!)

The Best Times To Post On Pinterest

Source / Day Of Week / Time To Post:

  • Sprout Social: Wednesday to Friday, 1-3 p.m. Local.
  • HubSpot: Friday, 3-6 p.m. Local.
  • CoSchedule: Sunday, Monday, and Tuesday (in that order), 8 p.m. Local.

  • Best times to post on Pinterest: 3-6 p.m.
  • Best day to post on Pinterest: Friday.
  • Worst day to post on Pinterest: Sunday.

Pinterest, once thought of as a simple inspiration board-style site, has today become a crucial player in the world of ecommerce.

Businesses can leverage Pinterest to showcase their products and drive conversions, but also to grow and expand brand awareness and sentiment.

Success on Pinterest can be found through sharing brand-specific imagery, optimizing for mobile, and appealing to your audience’s sense of aspiration and inspiration.

Friday, alongside other weekdays, is consistently mentioned as a strong day among our sources. On the other end, Sunday is commonly named as the least effective day for posting on Pinterest.

When it comes to the most fruitful posting time on the platform, it appears that the late afternoon to early evening, specifically around 3-6 p.m., is optimal for best engagement.

The Best Times To Post On X (Twitter)

Source / Day Of Week / Time To Post:

  • Sprout Social: Tuesday to Thursday, 9 a.m. – 2 p.m. Local.
  • Hootsuite: Monday and Wednesday, 10 a.m. – 1 p.m. EST.
  • CoSchedule: Wednesday, Tuesday, and Friday (in that order), 9 a.m. Local.
  • HubSpot: Friday and Wednesday (in that order), 9 a.m. to 12 p.m. Local.

  • Best times to post on X (Twitter): 9 a.m. to 12 p.m.
  • Best days to post on X (Twitter): Wednesday and Friday.
  • Worst day to post on X (Twitter): Sunday.

X (formerly known as Twitter) has long been a place for marketers to connect and engage with their audience, join trending conversations, and build community.

The real-time nature of X (Twitter) differentiates it from other social platforms and allows for spur-of-the-moment and reactionary marketing moves. And with CEO Elon Musk’s big plans for the app, it’s undoubtedly a space to watch.

When looking for the top days to post among the sources we consulted, Wednesday and Friday are most often mentioned – with Sprout Social specifying Tuesday through Thursday.

Hootsuite nominates Monday and Wednesday as the top days, proving that weekdays reign supreme on X (Twitter).

Like many other platforms, Sunday seems to be the least effective day for post-engagement.

Looking for the best times to post on X (Twitter)?

Late morning, from around 9 a.m. to noon, seems to be the most recommended time – though, as always, this will differ based on your specific audience and the type of content you are sharing.

We always recommend testing and experimenting to see what works for you.

The Best Times To Post On LinkedIn

Source / Day Of Week / Time To Post:

  • Sprout Social: Tuesday to Thursday, 10 a.m. – 12 p.m. Local.
  • Hootsuite: Monday, 4 p.m. EST.
  • CoSchedule: Thursday, Tuesday, and Wednesday (in that order), 10 a.m. Local.
  • HubSpot: Monday, Wednesday, and Tuesday (in that order), 9 a.m. – 12 p.m. Local.

  • Best times to post on LinkedIn: 10 a.m. – 3 p.m.
  • Best days to post on LinkedIn: Tuesday, Wednesday, and Thursday.
  • Worst days to post on LinkedIn: Weekends.

Though first and foremost a platform for professionals, LinkedIn has picked up steam in recent years, becoming a hub of engagement and a frontrunner among social media networks.

It’s also an essential tool for businesses that want to reach business executives and decision-makers, as well as potential candidates.

Done right, LinkedIn content can go a long way in building a public perception of your brand and providing deep value to your target audience.

Digging into the data, we can see that weekdays provide the biggest opportunities for engagement on LinkedIn, which is hardly surprising. Tuesdays through Thursdays are often mentioned as the top days, with Mondays also highlighted by Hootsuite and HubSpot.

All of our sources agree that weekends are less effective for LinkedIn posts.

If you’re searching for the right time, you might try your hand at posting from late morning to mid-afternoon, based on what these sources discovered.

But (and not to sound like a broken record) your results may differ based on your brand, niche, target audience, and content.

What Is The Best Time For You To Post On Social Media?

Finding the best times to post on social media requires a delicate blend of testing, experimentation, and personal analytics.

And it never hurts to start your journey with industry insights like the ones we’ve covered in this article.

By aligning your content strategy with your target audience and trying out different posting strategies – taking into account these recommended time slots – you will be able to determine what works best for you and significantly enhance your social media presence and engagement.

Sources of data, November 2023.

All data above was taken from the sources below.

Each platform conducted its own extensive research, analyzing millions of posts across various social networks to find the times when users are most engaged.

Sources:

  • Sprout Social analyzed nearly 2 billion engagements across 400,000 social profiles.
  • Hootsuite analyzed thousands of social media posts using an audience of 8 million followers. For its Instagram updates, it analyzed over 30,000 posts.
  • CoSchedule analyzed more than 35 million posts from more than 30,000 organizations.
  • SocialPilot studied over 50,000 YouTube accounts and over 50,000 TikTok accounts to compile its data. 
  • Later analyzed over 11 million Instagram posts.
  • HubSpot surveyed over 1,000 global marketers to discern the best times to post on social media. For its Instagram-specific data, it partnered with Mention to analyze over 37 million posts.

Featured Image: Kaspars Grinvalds/Shutterstock


Google Updating Cryptocurrency Advertising Policy For 2024


Google has published an announcement of upcoming changes to its cryptocurrency advertising policies, advising advertisers to make themselves aware of the changes and prepare to comply with the new requirements.

The upcoming updates are to Google’s Cryptocurrencies and related products policy for the advertisement of Cryptocurrency Coin Trusts. The changes are set to take effect on January 29th, 2024.

Cryptocurrency Coin Trusts are financial products that enable investors to trade shares in trusts holding substantial amounts of digital currency. These trusts provide investors with equity in cryptocurrencies without having direct ownership. They are also an option for creating a more diversified portfolio.

The policy updates coming in 2024 aim to clarify the scope and requirements for advertising Cryptocurrency Coin Trusts. Advertisers targeting the United States will be able to promote these products and services as long as they abide by the specific policies outlined in the updated requirements and obtain certification from Google.

The updated policy changes are not limited to the United States. They will apply globally to all accounts advertising Cryptocurrency Coin Trusts.

Google’s announcement also reminded advertisers of their obligation to comply with local laws in the areas where their ads are targeted.

Google’s approach for violations of the new policy will be to first give a warning before imposing an account suspension.

Advertisers that fail to comply with the updated policy will receive a warning at least seven days before a potential account suspension. This period gives advertisers an opportunity to fix any issues and get back into compliance with the revised guidelines.

Advertisers are encouraged to refer to Google’s documentation on “About restricted financial products certification.”

The policy change takes effect on January 29th, 2024. Advertisers of Cryptocurrency Coin Trusts will need to pay close attention to the updated policies to ensure compliance.

Read Google’s announcement:

Updates to Cryptocurrencies and related products policy (December 2023)
