How To Use IndexNow API With Python For Bulk Indexing

IndexNow is a protocol developed by Microsoft Bing and adopted by Yandex that enables webmasters and SEO pros to easily notify search engines when a webpage has been updated via an API.

And today, Microsoft announced that it is making the protocol easier to implement by ensuring that submitted URLs are shared between search engines.

Given its positive implications and the promise of a faster indexing experience for publishers, the IndexNow API should be on every SEO professional’s radar.

Using Python for automating URL submission to the IndexNow API or making an API request to the IndexNow API for bulk URL indexing can make managing IndexNow more efficient for you.

In this tutorial, you’ll learn how to do just that, with step-by-step instructions for using the IndexNow API to submit URLs to Microsoft Bing in bulk with Python.

Note: The IndexNow API is similar to Google’s Indexing API, with one difference: the Google Indexing API only supports job postings and web pages that contain broadcast video content.

Google announced that it would test the IndexNow protocol but hasn’t provided an update since.

Bulk Indexing Using IndexNow API with Python: Getting Started

Below are the prerequisites for understanding and implementing this IndexNow API tutorial.

Below are the Python packages and libraries that will be used for the tutorial.

  • Advertools (required).
  • Pandas (required).
  • Requests (required).
  • Time (optional).
  • JSON (optional).

Before getting started, reading the basics can help you understand this IndexNow API and Python tutorial better. We will be using an API Key and a .txt file for authentication, along with specific HTTP headers.

IndexNow API Usage Steps With Python

1. Import The Python Libraries

To use the necessary Python libraries, we will use the “import” command.

  • Advertools will be used for sitemap URL extraction.
  • Requests will be used for making the GET and POST requests.
  • Pandas will be used for taking the URLs in the sitemap into a list object.
  • The “time” module will be used to prevent a “Too Many Requests” (429) error with the “sleep()” method.
  • JSON is for possibly modifying the POST JSON object if needed.

Below, you will find all of the necessary import lines for the IndexNow API tutorial.

import advertools as adv
import pandas as pd
import requests
import json
import time

2. Extracting The Sitemap URLs With Python

To extract the URLs from a sitemap file, different web scraping methods and libraries can be used such as Requests or Scrapy.

But to keep things simple and efficient, I will use my favorite Python SEO package – Advertools.

With only a single line of code, all of the URLs within a sitemap can be extracted.

sitemap_urls = adv.sitemap_to_df("https://www.example.com/sitemap_index.xml")

The “sitemap_to_df” method of Advertools can extract all the URLs and other sitemap-related tags such as “lastmod” or “priority.”

Below, you can see the output of the “adv.sitemap_to_df” command.

Sitemap URL extraction can be done via Advertools’ “sitemap_to_df” method.

All of the URLs and dates are specified within the “sitemap_urls” variable.

Since sitemaps are useful sources for search engines and SEOs, Advertools’ sitemap_to_df method can be used for many different tasks including a Sitemap Python Audit.

But that’s a topic for another time.

3. Take The URLs Into A List Object With “to_list()”

Python’s Pandas library has a method for taking a data frame column (data series) into a list object: to_list().

Below is an example usage:

sitemap_urls["loc"].to_list()

Below, you can see the result:

Pandas’ “to_list” method can be used with Advertools for listing the URLs.

All URLs within the sitemap are in a Python list object.
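Since the sitemap data frame also carries the “lastmod” tag, you can filter the list down to recently changed URLs before submitting, which fits IndexNow’s advice to submit only new or changed pages. Below is a hedged sketch: the “loc” and “lastmod” column names follow Advertools’ output, but the data frame here is built by hand purely for illustration.

```python
import pandas as pd

# Stand-in for Advertools' sitemap_to_df output; "loc" and "lastmod"
# are the column names that method returns.
sitemap_urls = pd.DataFrame({
    "loc": ["https://www.example.com/a", "https://www.example.com/b"],
    "lastmod": pd.to_datetime(["2024-05-01", "2022-01-01"], utc=True),
})

# Keep only URLs modified after a cutoff date, since IndexNow should
# only receive new or changed URLs.
cutoff = pd.Timestamp("2024-01-01", tz="UTC")
recent_urls = sitemap_urls[sitemap_urls["lastmod"] >= cutoff]["loc"].to_list()
```

In a real run, you would replace the hand-built data frame with the result of “adv.sitemap_to_df”.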

4. Understand The URL Syntax Of Microsoft Bing’s IndexNow API

Let’s take a look at the URL syntax of the IndexNow API.

Here’s an example:

https://<searchengine>/indexnow?url=url-changed&key=your-key

The URL syntax follows the RFC 3986 standard and defines the variables and their relation to each other.

  • The <searchengine> represents the search engine that you will use the IndexNow API for.
  • The “?url=” parameter specifies the URL that will be submitted to the search engine via the IndexNow API.
  • The “&key=” parameter is the API Key that will be used with the IndexNow API.
  • The “&keyLocation=” parameter points to the key file that proves you own the website the IndexNow API will be used for.

The “&keyLocation=” parameter brings us to the API Key and its “.txt” version.
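To see how these parameters fit together, here is a small sketch that assembles the GET endpoint with Python’s standard urllib, which percent-encodes the submitted URL per RFC 3986. The function name is my own, not part of any IndexNow library.

```python
from urllib.parse import urlencode

def build_indexnow_url(page_url, key, key_location, engine="www.bing.com"):
    # urlencode percent-escapes reserved characters (":", "/", etc.)
    # in the submitted URL, as RFC 3986 requires.
    params = {"url": page_url, "key": key, "keyLocation": key_location}
    return f"https://{engine}/indexnow?{urlencode(params)}"
```

For example, building the endpoint for a single page gives you a ready-to-request URL where the page address is safely escaped.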

5. Gather The API Key For IndexNow And Upload It To The Root

You’ll need a valid key to use the IndexNow API.

Use Microsoft Bing’s key generation page to generate the IndexNow API Key.

There is no limit on generating IndexNow API Keys.

Clicking the “Generate” button creates an IndexNow API Key.

When you click on the download button, it will download the “.txt” version of the IndexNow API Key.

An IndexNow API Key can be generated at the address stated by Microsoft Bing.
The downloaded IndexNow API Key as a .txt file.

The API key value serves as both the file name and the content of the text file.

The name of the .txt file and the API Key value inside it should both match the actual API Key.
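If you would rather create the key file yourself instead of downloading it, a minimal sketch is below; the key value is just the example used throughout this tutorial.

```python
from pathlib import Path

key = "22bc7c564b334f38b0b1ed90eec8f2c5"

# The file must be named "<key>.txt" and contain only the key value;
# it then gets uploaded to the web server's document root.
key_file = Path(f"{key}.txt")
key_file.write_text(key)
```
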

The next step is uploading this TXT file to the root of the website’s server.

Since I use FileZilla for my FTP, I have uploaded it easily to my web server’s root.

Putting the .txt file into the web server’s root folder completes the IndexNow API setup.

The next step is writing a simple for loop to submit all of the URLs within the sitemap.

6. Submit The URLs Within The Sitemap With Python To IndexNow API

To submit a single URL to IndexNow, you can use a single “requests.get()” call. But to make it more useful, we will use a for loop.

To submit URLs in bulk to the IndexNow API with Python, follow the steps below:

  1. Create a key variable with the IndexNow API Key value.
  2. Replace the <searchengine> section with the search engine that you want to submit URLs to (Microsoft Bing or Yandex, for now).
  3. Assign all of the URLs from the sitemap within a list to a variable.
  4. Assign the URL of the “txt” key file within the root of the web server to a variable.
  5. Place the URL, key, and key location URL within an f-string.
  6. Start your for loop, and use “requests.get()” for all of the URLs within the sitemap.

Below, you can see the implementation:

key = "22bc7c564b334f38b0b1ed90eec8f2c5"
location = "https://www.example.com/22bc7c564b334f38b0b1ed90eec8f2c5.txt"
url = sitemap_urls["loc"].to_list()
for i in url:
    endpoint = f"https://www.bing.com/indexnow?url={i}&key={key}&keyLocation={location}"
    response = requests.get(endpoint)
    print(i)
    print(endpoint)
    print(response.status_code, response.content)
    # time.sleep(5)  # uncomment to wait between requests

If you’re concerned about sending too many requests to the IndexNow API, you can use the Python time module to make the script wait between every request.
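Beyond a fixed sleep, you can also back off and retry whenever the API returns HTTP 429 (“Too Many Requests”). The helper below is a sketch of my own, not part of the requests library; it accepts any zero-argument callable that returns a status code, so it stays independent of how you make the request.

```python
import time

def submit_with_backoff(request_fn, max_retries=3, base_delay=5):
    # Retry on HTTP 429 ("Too Many Requests"), doubling the wait each
    # time: base_delay, 2 * base_delay, 4 * base_delay seconds, ...
    status = request_fn()
    for attempt in range(max_retries):
        if status != 429:
            break
        time.sleep(base_delay * (2 ** attempt))
        status = request_fn()
    return status
```

With requests, you could pass something like “lambda: requests.get(endpoint).status_code” as the callable.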

Here you can see the output of the script:

An empty string as the request’s response body represents success, according to Microsoft Bing’s IndexNow documentation.

The 200 Status Code means that the request was successful.

With this for loop, I have submitted 194 URLs to Microsoft Bing.

According to the IndexNow Documentation, the HTTP 200 Response Code signals that the search engine is aware of the change in the content or the new content. But it doesn’t necessarily guarantee indexing.

For instance, I have used the same script for another website. After 120 seconds, Microsoft Bing says that 31 results are found. And conveniently, it shows four pages.

The only problem is that the first page shows only two results, and it says that the URLs are blocked by robots.txt, even though the block was removed before submission.

This can happen if the robots.txt file was changed to unblock some URLs just before using the IndexNow API, because it seems that Bing does not re-check the robots.txt right away.

Thus, if you previously blocked the URLs, Bing tries to index your website but still uses the previous version of the robots.txt file.

What happens if you use the IndexNow API while blocking Bingbot via robots.txt.

On the second page, there is only one result:

Microsoft Bing might use a different indexation and pagination method than Google. The second page shows only one of the 31 results.

On the third page, there is no result, and it shows the Microsoft Bing Translate for translating the string within the search bar.

Sometimes, Microsoft Bing interprets the “site:” search operator as part of the query.

When I checked Google Analytics, it showed that Bing still hadn’t crawled or indexed the website. I know this is true as I also checked the log files.

Below, you will see the Bing Webmaster Tools report for the example website:

Bing Webmaster Tools Report

It says that I submitted 38 URLs.

The next step will involve the bulk request with the POST Method and a JSON object.

7. Perform An HTTP POST Request To The IndexNow API

To perform an HTTP POST request to the IndexNow API for a set of URLs, a JSON object with specific properties should be used.

  • The “host” property represents your website’s hostname.
  • “key” represents the API Key.
  • “keyLocation” represents the location of the API Key’s txt file within the web server.
  • “urlList” represents the set of URLs that will be submitted to the IndexNow API.
  • Headers represent the POST request headers that will be used, which are “Content-type” and “charset.”

Since this is a POST request, the “requests.post” will be used instead of the “requests.get().”

Below, you will find an example of a set of URLs submitted to Microsoft Bing’s IndexNow API.

data = {
  "host": "www.example.com",
  "key": "22bc7c564b334f38b0b1ed90eec8f2c5",
  "keyLocation": "https://www.example.com/22bc7c564b334f38b0b1ed90eec8f2c5.txt",
  "urlList": [
    "https://www.example.com/technical-seo/http-header/",
    "https://www.example.com/python-seo/nltk/lemmatize",
    "https://www.example.com/pagespeed/broser-hints/preload",
    "https://www.example.com/python-seo/nltk/stemming",
    "https://www.example.com/python-seo/categorize-queries/",
    "https://www.example.com/python-seo/nltk/tokenization",
    "https://www.example.com/review/oncrawl/",
    "https://www.example.com/technical-seo/hreflang/",
    "https://www.example.com/technical-seo/multilingual-seo/"
  ]
}
headers = {"Content-type": "application/json", "charset": "utf-8"}
r = requests.post("https://www.bing.com/indexnow", data=json.dumps(data), headers=headers)
r.status_code, r.content

In the example above, we have performed a POST Request to index a set of URLs.

We have used the “data” object for the “data” parameter of “requests.post,” and the headers object for the “headers” parameter.

Since we POST a JSON object, the request headers should include “Content-type: application/json” and “charset: utf-8.”
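Because a single POST request can carry at most 10,000 URLs, larger URL lists need to be split into batches. Here is a sketch of that batching; the helper name and structure are mine, while the payload fields follow the JSON object described above.

```python
import json

MAX_URLS_PER_POST = 10000  # IndexNow's per-request URL limit

def build_payloads(host, key, key_location, urls):
    # Split a URL list into JSON payloads of at most 10,000 URLs each,
    # ready to be sent one by one with requests.post(..., data=payload).
    payloads = []
    for start in range(0, len(urls), MAX_URLS_PER_POST):
        payloads.append(json.dumps({
            "host": host,
            "key": key,
            "keyLocation": key_location,
            "urlList": urls[start:start + MAX_URLS_PER_POST],
        }))
    return payloads
```

Each element of the returned list is a serialized JSON body you can POST with the same headers shown above.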

About 135 seconds after I made the POST request, my live log file analysis dashboard started to show immediate hits from Bingbot.

Bingbot Log File Analysis

8. Create A Custom Function For The IndexNow API To Save Time

Creating a custom function for IndexNow API is useful to decrease the time that will be spent on the code preparation.

Thus, I have created two different custom Python functions to use the IndexNow API for bulk requests and individual requests.

Below, you will find an example for only the bulk requests to the IndexNow API.

The custom function for bulk requests is called “submit_url_set.”

As long as you fill in the parameters, you will be able to use it properly.

def submit_url_set(set_: list, key, location, host="https://www.bing.com", headers={"Content-type": "application/json", "charset": "utf-8"}):
    data = {
        "host": "www.example.com",  # your website's hostname
        "key": key,
        "keyLocation": location,
        "urlList": set_
    }
    r = requests.post(host + "/indexnow", data=json.dumps(data), headers=headers)
    return r.status_code

An explanation of this custom function:

  • The “set_” parameter is to provide a list of URLs.
  • “Key” parameter is to provide an IndexNow API Key.
  • “Location” parameter is to provide the location of the IndexNow API Key’s txt file within the web server.
  • “Host” is to provide the search engine host address.
  • “Headers” is to provide the headers that are necessary for the IndexNow API.

I have defined some of the parameters with default values such as “host” for Microsoft Bing. If you want to use it for Yandex, you will need to state it while calling the function.

Below is an example usage:

submit_url_set(set_=sitemap_urls["loc"].to_list(), key="22bc7c564b334f38b0b1ed90eec8f2c5", location="https://www.example.com/22bc7c564b334f38b0b1ed90eec8f2c5.txt")

If you want to extract sitemap URLs with a different method, or if you want to use the IndexNow API for a different URL set, you will need to change the “set_” parameter value.

Below, you will see an example of the Custom Python function for the IndexNow API for only individual requests.

def submit_url(url, location, key="22bc7c564b334f38b0b1ed90eec8f2c5"):
    endpoint = f"https://www.bing.com/indexnow?url={url}&key={key}&keyLocation={location}"
    response = requests.get(endpoint)
    print(url)
    print(endpoint)
    print(response.status_code, response.content)
    return response.status_code

With this function, you can submit URLs one by one. The search engine can prioritize these types of requests differently.

Since bulk requests often include unimportant URLs, individual requests might be treated as more reasonable.

If you want to include the sitemap URL extraction within the function, you should incorporate Advertools into the function itself.

Tips For Using The IndexNow API With Python

An Overview of How The IndexNow API Works, Capabilities & Uses

  • The IndexNow API doesn’t guarantee that your website or the URLs that you submitted will be indexed.
  • You should only submit URLs that are new or for which the content has changed.
  • The IndexNow API impacts the crawl budget.
  • Microsoft Bing has a threshold for the URL Content Quality and Calculation of the Crawl Need for a URL. If the submitted URL is not good enough, they may not crawl it.
  • You can submit up to 10,000 URLs per POST request.
  • The IndexNow API suggests submitting URLs even if the website is small.
  • Submitting the same pages many times within a day can cause the IndexNow API to stop crawling the redundant URLs or the source.
  • The IndexNow API is useful for sites where the content changes frequently, like every 10 minutes.
  • IndexNow API is useful for pages that are gone and are returning a 404 response code. It lets the search engine know that the URLs are gone.
  • IndexNow API can be used for notifying of new 301 or 302 redirects.
  • The 200 Status Response Code means that the search engine is aware of the submitted URL.
  • The 429 Status Code means that you made too many requests to the IndexNow API.
  • If you put a “txt” file that contains the IndexNow API Key into a subfolder, the IndexNow API can be used only for that subfolder.
  • If you have two different CMSes, you can use two different IndexNow API Keys for the two different site sections.
  • Subdomains need to use a different IndexNow API key.
  • Even if you already use a sitemap, using IndexNow API is useful because it efficiently tells the search engines of website changes and reduces unnecessary bot crawling.
  • All search engines that adopt the IndexNow API (Microsoft Bing and Yandex) share the URLs that are submitted between each other.
IndexNow API documentation and usage tips can be found above.

In this IndexNow API tutorial and guideline with Python, we have examined a new search engine technology.

Instead of waiting to be crawled, publishers can notify the search engines to crawl when there is a need.

IndexNow reduces the use of search engine data center resources, and now you know how to use Python to make the process more efficient, too.

More resources:

An Introduction To Python & Machine Learning For Technical SEO

How to Use Python to Monitor & Measure Website Performance

Advanced Technical SEO: A Complete Guide


Featured Image: metamorworks/Shutterstock




Google Answers Question About Toxic Link Sabotage

Google’s Gary Illyes answered a question about how to notify Google that someone is poisoning their backlink profile with “toxic links” which is a problem that many people have been talking about for at least fifteen years.

Question About Alerting Google To Toxic Links

Gary narrated the question:

“Someone’s asking, how to alert Google of sabotage via toxic links?”

And this is Gary’s answer:

I know what I would do: I’d ignore those links.

Generally Google is really, REALLY good at ignoring links that are irrelevant to the site they’re pointing at. If you feel like it, you can always disavow those “toxic” links, or file a spam report.

Disavow Links If You Feel Like It

Gary linked to Google’s explainer about disavowing links where it’s explained that the disavow tool is for a site owner to tell Google about links that they are responsible for in some way, like paid links or some other link scheme.

This is what it advises:

“If you have a manual action against your site for unnatural links to your site, or if you think you’re about to get such a manual action (because of paid links or other link schemes that violate our quality guidelines), you should try to remove the links from the other site to your site. If you can’t remove those links yourself, or get them removed, then you should disavow the URLs of the questionable pages or domains that link to your website.”

Google suggests that a link disavow is only necessary when two conditions are met:

  1. “You have a considerable number of spammy, artificial, or low-quality links pointing to your site,
    AND
  2. The links have caused a manual action, or likely will cause a manual action, on your site.”

Both of the above conditions must be met in order for a link disavow to be valid.

Origin Of The Phrase Toxic Links

As Google became better at penalizing sites for low quality links and paid links, some in the highly competitive gambling industry started creating low quality links to sabotage their competitors. The practice was called negative SEO.

The phrase “toxic links” was virtually unheard of before the Penguin link updates of 2012, which required penalized sites to remove all the paid and low-quality links they created and then disavow the rest. An industry grew around disavowing links, and it was that industry that invented the phrase “toxic links” for use in its marketing.

Confirmation That Google Is Able To Ignore Links

I have shared this anecdote before and I’ll share it here again. Someone I knew contacted me and said that their site lost rankings from negative SEO links. I took a look and their site had a ton of really nasty looking links. So out of curiosity (and because I knew that the site was this person’s main income), I emailed someone at Google Mountain View headquarters about it. That person checked it and replied that the site didn’t lose rankings because of the links. They lost rankings because of a Panda update related content issue.

That was around 2012, and it showed me how good Google was at ignoring links. If Google was that good at ignoring really bad links back then, it’s probably even better at it now, twelve years later, with the SpamBrain AI.

Listen to the question and answer at the 8:22 minute mark:

Featured Image by Shutterstock/New Africa

How To Build A Diverse & Healthy Link Profile


Search is evolving at an incredible pace and new features, formats, and even new search engines are popping up within the space.

Google’s algorithm still prioritizes backlinks when ranking websites. If you want your website to be visible in search results, you must account for backlinks and your backlink profile.

A healthy backlink profile requires a diverse backlink profile.

In this guide, we’ll examine how to build and maintain a diverse backlink profile that powers your website’s search performance.

What Does A Healthy Backlink Profile Look Like?

As Google states in its guidelines, it primarily discovers pages through links from other pages that point to yours, acquired through promotion and earned naturally over time.

In practice, a healthy backlink profile can be divided into three main areas: the distribution of link types, the mix of anchor text, and the ratio of followed to nofollowed links.

Let’s look at these areas and how they should look within a healthy backlink profile.

Distribution Of Link Types

One aspect of your backlink profile that needs to be diversified is link types.

It looks unnatural to Google to have predominantly one kind of link in your profile, and it also indicates that you’re not diversifying your content strategy enough.

Some of the various link types you should see in your backlink profile include:

  • Anchor text links.
  • Image links.
  • Redirect links.
  • Canonical links.

Here is an example of the breakdown of link types at my company, Whatfix (via Semrush):

Screenshot from Semrush, May 2024

Most links should be anchor text links and image links, as these are the most common ways to link on the web, but you should see some of the other types of links as they are picked up naturally over time.

Mix Of Anchor Text

Next, ensure your backlink profile has an appropriate anchor text variance.

Again, if you overoptimize for a specific type of anchor text, it will appear suspicious to search engines like Google and could have negative repercussions.

Here are the various types of anchor text you might find in your backlink profile:

  • Branded anchor text – Anchor text that is your brand name or includes your brand name.
  • Empty – Links that have no anchor text.
  • Naked URLs – Anchor text that is a URL (e.g., www.website.com).
  • Exact match keyword-rich anchor text – Anchor text that exactly matches the keyword the linked page targets (e.g., blue shoes).
  • Partial match keyword-rich anchor text – Anchor text that partially or closely matches the keyword the linked page targets (e.g., “comfortable blue footwear options”).
  • Generic anchor text – Anchor text such as “this website” or “here.”

To maintain a healthy backlink profile, aim for a mix of anchor text within a similar range to this:

  • Branded anchor text – 35-40%.
  • Partial match keyword-rich anchor text – 15-20%.
  • Generic anchor text – 10-15%.
  • Exact match keyword-rich anchor text – 5-10%.
  • Naked URLs – 5-10%.
  • Empty – 3-5%.

This distribution of anchor text represents a natural mix of differing anchor texts. It is common for the majority of anchors to be branded or partially branded because most sites that link to your site will default to your brand name when linking. It also makes sense that the following most common anchors would be partial-match keywords or generic anchor text because these are natural choices within the context of a web page.

Exact-match anchor text is rare because it only happens when you are the best resource for a specific term, and the site owner knows your page exists.
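To compare your own profile against these target ranges, you can compute each category’s share from a list of labeled anchors. A small sketch using Python’s Counter (the function name and category labels are mine, for illustration):

```python
from collections import Counter

def anchor_mix(anchor_categories):
    # Percentage share of each anchor-text category in a backlink
    # profile, given one category label per backlink.
    counts = Counter(anchor_categories)
    total = len(anchor_categories)
    return {category: round(100 * n / total, 1)
            for category, n in counts.items()}
```

Feeding in one label per backlink (e.g., “branded”, “partial”, “exact”) returns the percentage breakdown to check against the ranges above.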

Ratio Of Followed Vs. Nofollowed Backlinks

Lastly, you should monitor the ratio of followed vs. nofollowed links pointing to your website.

If you need a refresher on what nofollowed backlinks are, or why someone might apply the nofollow attribute to a link pointing to your site, check out Google’s guide on how to qualify outbound links.

Nofollow attributes should only be applied to paid links or links pointing to a site the linking site doesn’t trust.

While it is not uncommon or suspicious to have some nofollow links (people misunderstand the purpose of the nofollow attribute all the time), a healthy backlink profile will have far more followed links.

You should aim for a ratio of 80%:20% or 70%:30% in favor of followed links. For example, here is what the followed vs. nofollowed ratio looks like for my company’s backlink profile (according to Ahrefs):

Screenshot from Ahrefs, May 2024

You may see links with other rel attributes, such as UGC or Sponsored.

The “UGC” attribute tags links from user-generated content, while the “Sponsored” attribute tags links from sponsored or paid sources. These attributes are slightly different than the nofollow tag, but they essentially work the same way, letting Google know these links aren’t trusted or endorsed by the linking site. You can simply group these links in with nofollowed links when calculating your ratio.
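That grouping can be expressed directly in code. Below is a sketch (names are mine) where each backlink is represented by its rel attribute string, with an empty string meaning a plain followed link:

```python
def followed_share(rel_attributes):
    # "ugc" and "sponsored" links are grouped with "nofollow", since
    # none of them pass endorsement; everything else counts as followed.
    not_followed = {"nofollow", "ugc", "sponsored"}
    followed = sum(1 for rel in rel_attributes if rel not in not_followed)
    return followed / len(rel_attributes)
```

A profile of ten links with three nofollow-type attributes would score 0.7, right at the 70%:30% ratio suggested above.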

Importance Of Diversifying Your Backlink Profile

So why is it important to diversify your backlink profile anyway? Well, there are three main reasons you should consider:

  • Avoiding overoptimization.
  • Diversifying traffic sources.
  • And finding new audiences.

Let’s dive into each of these.

Avoiding Overoptimization

First and foremost, diversifying your backlink profile is the best way to protect yourself from overoptimization and the damaging penalties that can come with it.

As SEO pros, our job is to optimize websites to improve performance, but overoptimizing in any facet of our strategy – backlinks, keywords, structure, etc. – can result in penalties that limit visibility within search results.

In the previous section, we covered the elements of a healthy backlink profile. If you stray too far from that model, your site might look suspicious to search engines like Google and you could be handed a manual or algorithmic penalty, suppressing your rankings in search.

Considering how regularly Google updates its search algorithm these days (and how little information surrounds those updates), you could see your performance tank and have no idea why.

This is why it’s so important to keep a watchful eye on your backlink profile and how it’s shaping up.

Diversifying Traffic Sources

Another reason to cultivate a diverse backlink profile is to ensure you’re diversifying your traffic sources.

Google penalties come swiftly and can often be a surprise. If you have all your eggs in that basket when it comes to traffic, your site will suffer badly and may struggle to recover.

However, diversifying your traffic sources (search, social, email, etc.) will mitigate risk – similar to a stock portfolio – as you’ll have other traffic sources to provide a steady flow of visitors if another source suddenly dips.

Part of building a diverse backlink profile is acquiring a diverse set of backlinks and backlink types, and this strategy will also help you find differing and varied sources of traffic.

Finding New Audiences

Finally, building a diverse backlink profile is essential, as doing so will also help you discover new audiences.

If you acquire links from the same handful of websites and platforms, you will struggle to expand your audience and build awareness for your website.

While it’s important to acquire links from sites that cater to your existing audience, you should also explore ways to build links that can tap into new audiences. The best way to do this is by casting a wide net with various link acquisition tactics and strategies.

A diverse backlink profile indicates a varied approach to SEO and marketing that will help bring new visitors and awareness to your site.

Building A Diverse Backlink Profile

Now that you know what a healthy backlink profile looks like and why it’s important to diversify, how do you build diversity into your site’s backlink profile?

This comes down to your link acquisition strategy and the types of backlinks you actively pursue. To guide your strategy, let’s break link building into three main categories:

  • Foundational links.
  • Content promotion.
  • Community involvement.

Here’s how to approach each area.

Foundational Links

Foundational links represent those links that your website simply should have. These are opportunities where a backlink would exist if all sites were known to all site owners.

Some examples of foundational links include:

  • Mentions – Websites that mention your brand in some way (brand name, product, employees, proprietary data, etc.) on their website but don’t link.
  • Partners – Websites that belong to real-world partners or companies you connect with offline and should also connect (link) with online.
  • Associations or groups – Websites for offline associations or groups you belong to where your site should be listed with a link.
  • Sponsorships – Any events or organizations your company sponsors might have websites that could (and should) link to your site.
  • Sites that link to competitors – If a website is linking to a competitor, there is a strong chance it would make sense for them to link to your site as well.

These link opportunities should set the foundation for your link acquisition efforts.

As the baseline for your link building strategy, you should start by exhausting these opportunities first to ensure you’re not missing highly relevant links to bolster your backlink profile.

Content Promotion

Next, consider content promotion as a strategy for building a healthy, diverse backlink profile.

Content promotion is much more proactive than the foundational link acquisition mentioned above. You must manifest the opportunity by creating link-worthy content rather than simply capitalizing on an existing opportunity.

Some examples of content promotion for links are:

  • Digital PR – Digital PR campaigns have numerous benefits and goals beyond link acquisition, but backlinks should be a primary KPI.
  • Original research – Similar to digital PR, original research should focus on providing valuable data to your audience. Still, you should also make sure any citations or references to your research are correctly linked.
  • Guest content – Whether regular columns or one-off contributions, providing guest content to websites is still a viable link acquisition strategy – when done right. The best way to gauge your guest content strategy is to ask yourself if you would still write the content for a site without guaranteeing a backlink, knowing you’ll still build authority and get your message in front of a new audience.
  • Original imagery – Along with research and data, if your company creates original imagery that offers unique value, you should promote those images and ask for citation links.

Content promotion is a viable avenue for building a healthy backlink profile as long as the content you’re promoting is worthy of links.

Community Involvement

Community involvement is the final piece of your link acquisition puzzle when building a diverse backlink profile.

After pursuing all foundational opportunities and manually promoting your content, you should ensure your brand is active and represented in all the spaces and communities where your audience engages.

In terms of backlinks, this could mean:

  • Wikipedia links – Wikipedia gets over 4 billion monthly visits, so backlinks here can bring significant referral traffic to your site. However, acquiring these links is difficult as these pages are moderated closely, and your site will only be linked if it is legitimately a top resource on the web.
  • Forums (Reddit, Quora, etc.) – Another great place to get backlinks that drive referral traffic is forums like Reddit and Quora. Again, these forums are strictly moderated, and earning link placements on these sites requires a page that delivers significant and unique value to a specific audience.
  • Social platforms – Social media platforms and groups represent communities where your brand should be active and engaged. While these strategies are likely handled by other teams outside SEO and focus on different metrics, you should still be intentional about converting these interactions into links when or where possible.
  • Offline events – While it may seem counterintuitive to think of offline events as a potential source for link acquisition, legitimate link opportunities exist here. After all, most businesses, brands, and people you interact with at these events also have websites, and networking can easily translate to online connections in the form of links.

While most of the link opportunities listed above will have the nofollow link attribute due to the nature of the sites associated with them, they are still valuable additions to your backlink profile as these are powerful, trusted domains.

These links help diversify your traffic sources by bringing substantial referral traffic, and that traffic is highly qualified as these communities share your audience.

How To Avoid Developing A Toxic Backlink Profile

Now that you’re familiar with the link building strategies that can help you cultivate a healthy, diverse backlink profile, let’s discuss what you should avoid.

As mentioned before, if you overoptimize one strategy or link, it can seem suspicious to search engines and cause your site to receive a penalty. So, how do you avoid filling your backlink profile with toxic links?

Remember The “Golden Rule” Of Link Building

One simple way to guide your link acquisition strategy and avoid running afoul of search engines like Google is to follow one “golden rule.”

That rule is to ask yourself: If search engines like Google didn’t exist, and the only way people could navigate the web was through backlinks, would you want your site to have a link on the prospective website?

Thinking this way strips away all the tactical, SEO-focused portions of the equation and only leaves the human elements of linking where two sites are linked because it makes sense and makes the web easier to navigate.

Avoid Private Blog Networks (PBNs)

Another good rule is to avoid looping your site into private blog networks (PBNs). Of course, it’s not always obvious or easy to spot a PBN.

However, there are some common traits or red flags you can look for, such as:

  • The person offering you a link placement mentions they have a list of domains they can share.
  • The prospective linking site has little to no traffic and doesn’t appear to have human engagement (blog comments, social media followers, blog views, etc.).
  • The website features thin content and little investment into user experience (UX) and design.
  • The website covers generic topics and categories, catering to any and all audiences.
  • Pages on the site feature numerous external links but few internal links.
  • The prospective domain’s backlink profile features overoptimization in any of the previously discussed forms (high-density of exact match anchor text, abnormal ratio of nofollowed links, only one or two link types, etc.).
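
Some of these red flags can be checked mechanically. The sketch below scores two of them, high exact-match anchor density and an abnormal nofollow ratio, against a backlink export. The row keys (`anchor`, `rel`) and the thresholds are illustrative assumptions, not values from any particular tool or from search engine guidance.

```python
from collections import Counter

def overoptimization_flags(rows, target_phrase="best running shoes"):
    """Flag simple overoptimization signals in a list of backlink rows.

    Each row is a dict with hypothetical keys: 'anchor' and 'rel'
    ('follow' or 'nofollow'). Thresholds are illustrative only.
    """
    total = len(rows)
    anchors = Counter(r["anchor"].strip().lower() for r in rows)
    exact = anchors[target_phrase.lower()]
    nofollow = sum(1 for r in rows if r["rel"] == "nofollow")
    flags = []
    if total and exact / total > 0.30:     # exact-match anchors dominate
        flags.append("high exact-match anchor density")
    if total and nofollow / total > 0.90:  # almost nothing is followed
        flags.append("abnormal nofollow ratio")
    return flags

# Example: 4 of 10 links use the exact target phrase as anchor text.
links = [{"anchor": "best running shoes", "rel": "follow"}] * 4 \
      + [{"anchor": "Acme", "rel": "follow"}] * 6
print(overoptimization_flags(links))  # ['high exact-match anchor density']
```

A check like this is only a first-pass filter; a human should still review any flagged domain in context.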

Again, diversification – in both tactics and strategies – is crucial to building a healthy backlink profile, but steering clear of obvious PBNs and remembering the "golden rule" of link building will go a long way toward keeping your profile free from toxicity.

Evaluating Your Backlink Profile

As you work diligently to build and maintain a diverse, healthy backlink profile, you should also carve out time to evaluate it regularly from a more analytical perspective.

There are two main ways to evaluate the merit of your backlinks: leverage tools to analyze backlinks and compare your backlink profile to the greater competitive landscape.

Leverage Tools To Analyze Backlink Profile

There are a variety of third-party tools you can use to analyze your backlink profile.

These tools can provide helpful insights, such as the total number of backlinks and referring domains. You can use these tools to analyze your full profile, broken down by:

  • Followed vs. nofollowed.
  • Authority metrics (Domain Rating, Domain Authority, Authority Score, etc.).
  • Backlink types.
  • Location or country.
  • Anchor text.
  • Top-level domain types.
  • And more.
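
If your tool lets you download the raw link data, these breakdowns are a few lines of pandas. The column names below are hypothetical; real exports from Ahrefs, Semrush, and similar tools use their own headers.

```python
import pandas as pd

# Hypothetical backlink export; real column names vary by tool.
links = pd.DataFrame({
    "referring_domain": ["blog.example", "news.example", "forum.example", "blog.example"],
    "rel":    ["follow", "nofollow", "nofollow", "follow"],
    "anchor": ["acme shoes", "acme", "click here", "acme shoes"],
    "tld":    [".com", ".org", ".com", ".com"],
})

print("Total backlinks:", len(links))
print("Referring domains:", links["referring_domain"].nunique())
print(links["rel"].value_counts(normalize=True))   # followed vs. nofollowed share
print(links["anchor"].value_counts())              # anchor-text distribution
print(links["tld"].value_counts())                 # top-level domain breakdown
```

The same `value_counts` pattern extends to any other column the export includes, such as country or link type.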

You can also use these tools to track new incoming backlinks, as well as lost backlinks, to help you better understand how your backlink profile is growing.
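
Tracking new and lost links yourself is a simple set difference between two successive exports. The URL pairs below are hypothetical placeholders.

```python
def link_deltas(last_month, this_month):
    """Diff two sets of (source_url, target_url) pairs from successive exports."""
    new = this_month - last_month    # links that appeared since the last export
    lost = last_month - this_month   # links that disappeared
    return new, lost

# Hypothetical pairs from two monthly exports.
march = {("a.example/post", "mysite.example"), ("b.example/page", "mysite.example")}
april = {("b.example/page", "mysite.example"), ("c.example/news", "mysite.example")}

new, lost = link_deltas(march, april)
print(new)   # {('c.example/news', 'mysite.example')}
print(lost)  # {('a.example/post', 'mysite.example')}
```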

Many of these tools also include features that estimate how toxic or suspicious your profile might look to search engines, which can help you detect potential issues early.

Compare Your Backlink Profile To The Competitive Landscape

Lastly, you should compare your overall backlink profile to those of your competitors and those competing with your site in the search results.

Again, the previously mentioned tools can help with this analysis – as far as providing you with the raw numbers – but the key areas you should compare are:

  • Total number of backlinks.
  • Total number of referring domains.
  • Breakdown of authority metrics of links (Domain Rating, Domain Authority, Authority Score, etc.).
  • Authority metrics of competing domains.
  • Link growth over the last two years.
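
One lightweight way to make the comparison concrete is to rank each site per metric. The figures below are invented for illustration; in practice you would paste in the numbers your backlink tool reports.

```python
import pandas as pd

# Illustrative numbers pulled by hand from a backlink tool; not real data.
profiles = pd.DataFrame(
    {
        "backlinks": [12_400, 48_000, 9_100],
        "referring_domains": [1_150, 3_900, 720],
        "domain_rating": [54, 71, 46],
        "links_gained_24mo": [2_300, 9_800, 1_100],
    },
    index=["yoursite.example", "competitor-a.example", "competitor-b.example"],
)

# Rank each site per metric (1 = strongest) to see where you lag.
ranks = profiles.rank(ascending=False).astype(int)
print(ranks)
```

A table of per-metric ranks makes the gaps obvious at a glance, even if, as noted below, raw link counts alone don't decide search performance.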

Comparing your backlink profile to others within your competitive landscape will help you assess where your domain currently stands and provide insight into how far you must go if you’re lagging behind competitors.

It’s worth noting that it’s not as simple as whoever has the most backlinks will perform the best in search.

That said, these numbers are typically solid indicators of how search engines gauge the authority of your competitors’ domains, and you’ll likely find a correlation between strong backlink profiles and strong search performance.

Approach Link Building With A User-First Mindset

The search landscape continues to evolve at a breakneck pace, and we could see dramatic shifts in how people search within the next five years (or sooner).

However, at this time, search engines like Google still rely on backlinks as part of their ranking algorithms, and you need to cultivate a strong backlink profile to be visible in search.

Furthermore, if you follow the advice in this article as you build out your profile, you’ll acquire backlinks that benefit your site regardless of search algorithms, futureproofing your traffic sources.

Approach link acquisition like you would any other marketing endeavor – with a customer-first mindset – and over time, you’ll naturally build a healthy, diverse backlink profile.

Featured Image: Sammby/Shutterstock


SEO

Google On Traffic Diversity As A Ranking Factor

Google answers the question of whether traffic diversity is a ranking factor for SEO

Google’s SearchLiaison tweeted encouragement to diversify traffic sources, being clear about the reason he was recommending it. Days later, someone followed up to ask if traffic diversity is a ranking factor, prompting SearchLiaison to reiterate that it is not.

What Was Said

The question of whether traffic diversity is a ranking factor arose from a previous tweet, in a discussion about whether a site owner should be focusing on off-site promotion.

Here’s the question from the original discussion that was tweeted:

“Can you please tell me if I’m doing right by focusing on my site and content – writing new articles to be found through search – or if I should be focusing on some off-site effort related to building a readership? It’s frustrating to see traffic go down the more effort I put in.”

SearchLiaison split the question into component parts and answered each one. When it came to the part about off-site promotion, SearchLiaison (who is Danny Sullivan) shared from his decades of experience as a journalist and publisher covering technology and search marketing.

I’m going to break down his answer so that it’s clearer what he meant.

This is the part from the tweet that talks about off-site activities:

“As to the off-site effort question, I think from what I know from before I worked at Google Search, as well as my time being part of the search ranking team, is that one of the ways to be successful with Google Search is to think beyond it.”

What he is saying here is simple: don’t limit your thinking about what to do with your site to how to make it appeal to Google.

He next explains that sites that rank tend to be sites that are created to appeal to people.

SearchLiaison continued:

“Great sites with content that people like receive traffic in many ways. People go to them directly. They come via email referrals. They arrive via links from other sites. They get social media mentions.”

What he’s saying there is that you’ll know you’re appealing to people if they are discussing and referring to your site on social media, and if other sites are citing it with links.

Other ways to know that a site is doing well are when people engage in the comments section, send emails asking follow-up questions, and send emails of thanks sharing anecdotes of their success or satisfaction with a product or advice.

Consider this: fast fashion site Shein at one point didn’t rank for its chosen keyword phrases (I know because I checked out of curiosity). But it was virally popular at the time, making huge amounts of sales by gamifying site interaction and engagement, which propelled it to become a global brand. A similar strategy propelled Zappos when it pioneered no-questions-asked returns and cheerful customer service.

SearchLiaison continued:

“It just means you’re likely building a normal site in the sense that it’s not just intended for Google but instead for people. And that’s what our ranking systems are trying to reward, good content made for people.”

SearchLiaison explicitly said that having diversified traffic sources is not a ranking factor.

He added this caveat to his tweet:

“This doesn’t mean you should get a bunch of social mentions, or a bunch of email mentions because these will somehow magically rank you better in Google (they don’t, from how I know things).”

Despite The Caveat…

A journalist tweeted this:

“Earlier this week, @searchliaison told people to diversify their traffic. Naturally, people started questioning whether that meant diversity of traffic was a ranking factor.

So, I asked @iPullRank what he thought.”

SearchLiaison of course answered that he explicitly said it’s not a ranking factor and linked to his original tweet that I quoted above.

He tweeted:

“I mean that’s not exactly what I myself said, but rather repeat all that I’ll just add the link to what I did say:”

The journalist responded:

“I would say this is calling for publishers to diversify their traffic since you’re saying the great sites do it. It’s the right advice to give.”

And SearchLiaison answered:

“It’s the part of “does it matter for rankings” that I was making clear wasn’t what I myself said. Yes, I think that’s a generally good thing, but it’s not the only thing or the magic thing.”

Not Everything Is About Ranking Factors

There is a longstanding practice by some SEOs to parse everything that Google publishes for clues to how Google’s algorithm works. This happened with the Search Quality Raters guidelines. Google is unintentionally complicit because it’s their policy to (in general) not confirm whether or not something is a ranking factor.

This habit of searching for “ranking factors” leads to misinformation. It takes more acuity to read research papers and patents to gain a general understanding of how information retrieval works, but trying to understand something is more work than skimming a PDF for ranking factors.

The worst approach to understanding search is to invent hypotheses about how Google works and then pore through a document to confirm those guesses, falling into the confirmation bias trap.

In the end, it may be more helpful to back off of exclusively optimizing for Google and focus at least as much on optimizing for people (which includes optimizing for traffic). I know it works because I’ve been doing it for years.

Featured Image by Shutterstock/Asier Romero
