If you have been in SEO for some time, you have likely heard of site taxonomy.
A site's taxonomy refers to the website's structure: how its content is organized and how easily users can navigate it.
Attention to your site taxonomy is a critical skill for SEO professionals to master.
That’s because a site’s taxonomy not only influences its overall organizational structure but also influences how it’s perceived on Google and how users navigate your site.
Because of this, placing your site’s taxonomy optimization in your queue (in a high-priority position, hopefully) is a critical step toward a solid website architecture.
What, Exactly, Is A Site’s Taxonomy?
When one talks about taxonomy, they usually refer to a classification system.
This classification system controls how everything in a site's structure is organized and grouped, based on the semantic characteristics of the content and how the pieces relate to each other.
Your website’s taxonomy can play heavily into how Google crawls your site, as well as how users experience it.
It can also heavily impact search engine rankings. It pays to focus on your website’s taxonomy, how it plays out throughout your site, and how it is set up overall.
Your website’s taxonomy can also play into how your site creates internal links, which can also be a significant boost for your website’s success on Google.
Google Guidelines: Create A Clear Conceptual Page Hierarchy
If you were wondering whether or not this could be a black hat tactic, it’s not.
It’s actually a white hat technique.
Because you are simply focusing on how your content is organized, you are not risking anything that Google could interpret as black hat.
In fact, Google’s Webmaster Guidelines state that you should create a hierarchical taxonomy:
Design your site to have a clear conceptual page hierarchy.
Google prefers a clear conceptual taxonomic structure: top-level categories that match the site’s content types, with related topics organized beneath them.
The Different Types Of Taxonomies
There are a couple of different types of taxonomies that can aid you in creating your taxonomic structure. They include flat taxonomies and faceted taxonomies.
Flat (or hierarchical) taxonomies work well when you have a group of topics whose semantic relationships are already well known.
In a flat taxonomy, entities are classified along a single dimension.
Using a parent-child relationship for these entities can help Google dive deep into a topic and can help organize things in a logical way for users.
You may want to use faceted taxonomies when you have a subject matter with many different dimensions of classification (as opposed to just one).
It’s possible to utilize faceted taxonomies to organize an entire, deep library.
Whether you’re organizing all the different types of dishes in your kitchen, or you are organizing thousands of products with similar and many different dimensions of classification, you may want to use faceted taxonomies.
The interesting thing about faceted taxonomies is that complete knowledge of the semantic relationship between entities is not required.
It’s possible to construct an ad hoc taxonomy that encompasses all of these pieces of content, regardless of where they may fall in the taxonomic spectrum.
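The distinction between the two types can be sketched in code. This is a hypothetical illustration, not a real CMS data model: the flat taxonomy uses a single parent-to-children dimension, while the faceted version classifies each item along several independent dimensions at once.

```python
# Flat (hierarchical) taxonomy: one classification dimension,
# expressed as a parent -> children mapping.
flat_taxonomy = {
    "seo": ["technical-seo", "link-building", "content-writing"],
}

# Faceted taxonomy: each item is classified along several
# independent dimensions (facets) at the same time.
articles = [
    {"name": "Article A", "topic": "technical-seo", "level": "beginner", "format": "guide"},
    {"name": "Article B", "topic": "technical-seo", "level": "advanced", "format": "checklist"},
    {"name": "Article C", "topic": "link-building", "level": "beginner", "format": "guide"},
]

def filter_by_facets(items, **facets):
    """Return items matching every requested facet value."""
    return [i for i in items if all(i.get(k) == v for k, v in facets.items())]

# Faceted classification lets you slice the library along any combination
# of dimensions without knowing the full semantic hierarchy up front.
beginner_guides = filter_by_facets(articles, level="beginner", format="guide")
```

Note that the faceted version never needed a parent-child tree at all, which is exactly why it suits ad hoc collections.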
Okay, I’m Sold On A Site’s Taxonomy. Why Is This So Important?
Creating a well-organized taxonomy can truly improve how users interact with your site, especially when your content is organized logically.
The better a site’s taxonomy, the more reputable a source your users will consider you, and the longer they will stay and read your content.
If a site does not have a specified structure, it will be very difficult for users to understand and consume your content.
Many users will leave a site if it is poorly organized. We want to make sure that users have the easiest time possible when trying to navigate your website.
That’s also critical for SEO because it gives Google a better understanding of your site architecture. Additionally, it provides easier crawling and indexing for bots.
Creating the proper semantic relationships, the kind that map to entities in Google’s Knowledge Graph, also shapes how Google understands your site.
The easier you can make it for Google to analyze and understand your overall site taxonomy, the better your site’s performance in the search engines and for your users.
Let’s examine this in more detail with an example website about search engine optimization (maybe you own ilovedoingseoonallthethingsintheworldsosueme.com).
Say that you have your site targeting a variety of topics within the search engine optimization field. They may include things like:
- Content Writing.
- Content Marketing.
- Link Building.
- Technical SEO.
- Social Media Promotion.
- Pay per click (PPC).
These would all be categories that you can use to organize your content.
If any of your users are looking for topics on SEO, content writing, or content marketing, the taxonomy might look something like this (hypothetical URLs):
- example.com/seo/
- example.com/content-writing/
- example.com/content-marketing/
- And so on.
The first part of the URL (/content-writing/) is the category.
And if someone is looking for something like content writing, they would likely go to this category page, where they can find all the articles on the topic that are organized into this category.
It’s important that closely related topics are organized within this hierarchical navigation.
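As a rough sketch of how such category URLs might be generated, consider the helper below. The domain, function names, and slug rules are illustrative assumptions, not any particular CMS's behavior.

```python
import re

def slugify(text):
    """Lowercase the text and replace runs of non-alphanumerics with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def category_url(domain, category, title=None):
    """Build a /category/ URL, optionally with a post slug beneath it."""
    path = "/" + slugify(category) + "/"
    if title:
        path += slugify(title) + "/"
    return "https://" + domain + path

# Category page and an article filed under it.
category_page = category_url("example.com", "Content Writing")
article_page = category_url("example.com", "Content Writing", "How to Write Better Headlines")
# article_page -> "https://example.com/content-writing/how-to-write-better-headlines/"
```

Keeping the category as the first path segment, as in the example above, is what lets both users and crawlers infer the hierarchy from the URL alone.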
Site Taxonomy: Best Practices For Creating The Navigational Hierarchy
The absolute prime directive here is to ensure that your site’s taxonomy is good for users and search engines.
You want to provide a balance between being easy to use and easy to navigate.
If users can’t navigate the site and find the organized content, you may only get so far in your site’s growth.
That’s why we separate this kind of content into these categories: to better organize and present them to users and bots.
Navigation that is easy for both is a win, from an engineering perspective and a human-factors perspective alike.
Make Sure You Do The Relevant Keyword And Topic Research
A solid foundation for any successful SEO strategy is doing the right keyword research and researching your topics. One cannot exist without the other.
Keyword research is needed to know more about what your audience is searching for online.
Topic research is needed to find out more about your audience’s interests.
The combination will help you organize your taxonomy into useful categories and content written to those categories.
By doing things in this fashion, you don’t miss anything: you hit all the pain points your audience might be experiencing elsewhere and deliver a much higher-quality experience than you otherwise would.
All of these keywords that you research should be related to any content you might produce that will show up on these pages.
You will pick one topic for the taxonomic category. Then, you will choose topics and keywords to cluster underneath this.
That will help you build a relevant topic cluster that will reinforce your topical focus across certain topics on your site.
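One simple way to picture this clustering step: assign each researched keyword to the category whose head term it contains. This is a deliberately naive sketch, and the categories and keywords below are made up; real keyword clustering tools rely on semantic similarity rather than substring matching.

```python
categories = ["content writing", "link building", "technical seo"]

keywords = [
    "how to do link building",
    "content writing tips",
    "technical seo checklist",
    "link building outreach email",
]

def cluster_keywords(keywords, categories):
    """Group keywords under the first category term they contain."""
    clusters = {c: [] for c in categories}
    clusters["uncategorized"] = []
    for kw in keywords:
        for cat in categories:
            if cat in kw:
                clusters[cat].append(kw)
                break
        else:
            # No category term matched this keyword.
            clusters["uncategorized"].append(kw)
    return clusters

clusters = cluster_keywords(keywords, categories)
```

Each non-empty cluster then maps to one taxonomic category, with its keywords informing the content written underneath it.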
However, it’s important to note that you don’t have to optimize things as much as you may have in the old days.
You don’t have to include your target keyword in every single paragraph, sentence, or whatever. Instead, you want to ensure that your content is organized and structured around the topic, and that you write naturally.
Google’s algorithm will help make extrapolations about the meaning and understanding of your content as a result of crawling it.
But, you still want to include keywords. And you still want to optimize based on what software like Frase tells you.
You just don’t want to keyword stuff.
It’s helpful to read about entities also once you grasp keywords. As you create your site taxonomy, it will help inform your topical entity map.
Keep Your Site Taxonomy Simple
Building a taxonomy with hundreds of categories and subcategories is an exercise in futility. You only make things worse for your site in the search engines and make things more difficult for both Google and your users.
The worse you make your site structure, the harder it is for Google to crawl and index – and the longer it takes. It may take your users eons to find what they are looking for.
While it is possible to come up with such a taxonomy structure regardless of your niche, the reality is that this just adds friction between what your users want and what Google wants to see.
The more friction you add, the more difficult it becomes for users and search engines. An ideal site taxonomy is easily navigable, focused on topics, and simple enough for users.
Keeping your taxonomy simple also means having fewer main categories, each of which can contain its own sub-categories.
It’s possible to have a higher-level category that’s focused entirely on on-page SEO, and the content you post in that section will all be about on-page SEO.
There are different ways that you can set up your taxonomy structure.
You can have a pure category structure that’s only focused on organizing pages within that category, or you can have a more granular drill-down structure to organize your topics within a true physical silo.
The possibilities may be endless, but in practice, simpler taxonomies tend to outperform the complexity that hundreds of categories bring.
Don’t Forget About Your Audience When Creating Your Taxonomy
This should be common sense, but more often than not, it’s not so common.
To create the most effective site taxonomy, it’s important to know exactly who your audience is and why they are on your site.
You also need to know their needs and how they typically search. In addition, you may want to figure out how they use websites in general.
This way, you can structure your content within the appropriate taxonomy properly.
Buyer personas are a great tool that you can use to identify these facts.
For example, if your audience searches for SEO, it’s useful to know what they expect regarding that navigation.
You can find this out by looking at already-optimized websites in your niche, or you can use a site like usertesting.com to have real users navigate your site and provide feedback on this.
In addition to resolving how to present information about their main topic, you also want to know what supporting topics they might want to know about and include those in your navigation.
Continuing with our example SEO site: is there anything that can help enhance the topic at hand?
By spending time diving into your users, you can make sure that your overall site is designed accordingly and that it will be able to facilitate their needs much better.
You Also Want To Leave Enough Room For Growing Your Site
If you only have a finite number of categories, and you only deal with those topics, eventually, you will run out of things to talk about.
This is why ensuring that you leave enough room for growth in your site’s taxonomy is critical.
It’s not just about ensuring you have enough topics to discuss, although that’s a large part.
Your taxonomy is likely to change as your overall business grows.
As new types of content are created, you will likely need to move some categories around to ensure that everything is still interrelated.
You also need to make sure that you have room for new content pieces.
For example, say that you have an existing taxonomy that covers certain blog topics.
You hire new team members who are subject matter experts in related topics, but those topics don’t yet exist in your taxonomy. As you expand your team of experts, you will also need to expand the categories on your blog.
It’s also possible that you may change your mind and find that some categories are not quite as strong as you thought initially.
That’s why being open to change, and adapting as your situation changes, is so important.
You don’t want to be so rigid that you’re not open to the possibilities of your audience changing (and they will).
On the other hand, you don’t want to constantly change your site’s taxonomy either because you will lose stability in the search results.
Finding that balance that works for your users and your company’s growth is key.
Consistency Is Always Core To A Successful Strategy
As you get better at creating taxonomies, you will refine your own consistency, which is a very important factor for SEO.
If your site is poorly organized or contains irrelevant content, it may be considered something that’s not of very high quality.
Google is intelligent enough to understand the semantic relationship between your content, and you should ensure that your navigational hierarchy is organized enough to facilitate these taxonomic semantic relationships.
By making sure that you create a consistent, structured taxonomic hierarchy, you create a simple and easy website structure that Google (and your users) can follow.
It aids significantly in content findability and allows you to arrange your content items within that taxonomic structure.
This hierarchical structure is also search-engine-friendly, with plenty of “spider food” to feed Google, so it understands exactly what your entire site contains.
A well-planned taxonomy is also consistent by nature because it lends itself to more consistent topical navigation and ensures you easily present your content to readers.
A navigation menu organized into a well-planned taxonomy makes it much easier to ensure consistent, high-quality content findability.
Your URL Taxonomy Can Significantly Impact Your Site’s Architecture
Making your site easier to navigate for both search engine spiders and your users is the ultimate goal (or should be) of any enterprising SEO professional.
Your URL taxonomy can mean the difference between the success and failure of your site.
By creating a hierarchical taxonomy that includes the full semantic relationship between your topical entities, it’s possible to keep feeding Google the right signals while also making sure that your site is not too difficult to understand.
Let’s take a look at the following examples of taxonomy to further clarify our taxonomic preferences:
Examples Of Bad URL Taxonomy
Like most SEO practices, there are good URL taxonomies and bad URL taxonomies. The worst of the bad look something like these (hypothetical examples):
- example.com/index.php?id=12345&cat=7&ref=page2
- example.com/2023/05/12/blog/seo/technical/site-structure/why-site-structure-matters/
The problem with these URL taxonomies is that they are overly complex and could lead to some devaluation of your site, because Google may not expend the effort to untangle them.
That’s why it’s preferable to always utilize a simple taxonomy, where possible, and not to get overly complex.
In addition, these types of taxonomies do not group everything together properly, nor do they group your blog posts under a single website section.
Also, the URLs themselves do not communicate what relevant content sits behind them.
Examples Of Good URL Taxonomy
A good URL taxonomy, however, creates an easy-to-understand structure that is easy to crawl and easy for users to read. For example (hypothetical):
- example.com/technical-seo/site-structure/
- example.com/content-marketing/topic-clusters/
Good URL taxonomies (such as the ones above) are preferable because they – again – aid in your content findability.
They also help users because if they see your URL in Google’s search results, it’s shorter and more memorable.
They help spiders because shorter, flatter URLs take less effort to crawl and process.
Focusing on ensuring that you stay consistent with a good URL taxonomy makes it possible to cater to both users and search engines.
Creating The Relationship Of Your Content Within The Silo
When you focus on the relationship of your content within a silo, you want to group all of your related pages into an organized silo.
This helps build a better taxonomy foundation for your site.
Organizing your content by taxonomy allows for easier content discovery, especially when pieces are grouped within the proper silo.
When you organize your content based on the relationship of that content within this silo, then you provide Google with a better understanding of your content.
Google will then figure out that any of your content grouped within this silo must be related in some way.
Using this hierarchical structure to organize your content pages in a silo also provides more content discovery ability.
Create Internal Links Across Content Silos
Don’t forget to create internal links across your content silos.
Internal links are a powerful tool for reinforcing your website taxonomy.
Creating internal links across different content silos helps give Google more context about the relationship between different types of content.
You want to, ideally, utilize links with the proper contextual content surrounding them so that you can provide the all-important contextual relevance about that link.
This practice will help aid both users and search engines when it comes to helping them learn more about the relationships between the topics on your site.
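As a sketch of how you might surface cross-silo linking opportunities programmatically, consider the snippet below. The page URLs, silo names, and topic tags are hypothetical; a real site would pull these from its CMS.

```python
pages = [
    {"url": "/technical-seo/site-structure/", "silo": "technical-seo",
     "topics": {"site structure", "crawling"}},
    {"url": "/content-marketing/topic-clusters/", "silo": "content-marketing",
     "topics": {"site structure", "topic clusters"}},
    {"url": "/link-building/internal-links/", "silo": "link-building",
     "topics": {"internal links"}},
]

def cross_silo_link_suggestions(pages):
    """Pairs of pages in different silos that share at least one topic.

    The shared topics are returned too, since they are the natural
    contextual anchor for the internal link.
    """
    suggestions = []
    for i, a in enumerate(pages):
        for b in pages[i + 1:]:
            shared = a["topics"] & b["topics"]
            if a["silo"] != b["silo"] and shared:
                suggestions.append((a["url"], b["url"], sorted(shared)))
    return suggestions

suggestions = cross_silo_link_suggestions(pages)
```

The shared topic in each suggestion tells you what the surrounding contextual content for that link should be about.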
Make Your Site Future-Proof With The Right Taxonomies
Creating the right site taxonomies is something that will help future-proof your site.
Not only will it help with topical relevance and topical focus, but it will also help ensure that search engines discover your content the way you want them to.
In addition, it aids in creating topical authority.
Because your site is organized in this fashion, you also build topical authority through the links and contextually relevant URLs that you create.
Ensuring you create the right taxonomies reinforces your site’s authority on the topic.
You also create an organized, hierarchical taxonomic structure that Google loves and provide a contextual home for all of your content.
What do you plan to do with your next site’s taxonomy implementation?
Featured Image: BestForBest/Shutterstock
How to Execute the Skyscraper Technique (And Get Results)
In 2015, Brian Dean revealed a brand-new link building strategy. He called it the Skyscraper Technique.
With over 10,000 backlinks since the post was published, it’s fair to say that the Skyscraper Technique took the world by storm in 2015. But what is it exactly, how can you implement it, and can you still get results with this technique in 2023?
Let’s get started.
The Skyscraper Technique is a link building strategy where you improve existing popular content and replicate the backlinks.
Brian named it so because in his words, “It’s human nature to be attracted to the best. And what you’re doing here is finding the tallest ‘skyscraper’ in your space… and slapping 20 stories to the top of it.”
Here’s how the technique works. Follow these three steps to execute the Skyscraper Technique:
1. Find relevant content with lots of backlinks
There are three methods to find relevant pages with plenty of links:
Use Site Explorer
Enter a popular site into Ahrefs’ Site Explorer. Next, go to the Best by backlinks report.
This report shows you a list of pages from the site with the highest number of referring domains. If there are content pieces with more than 50 referring domains, they’re likely to be good potential targets.
Ignore homepages and other irrelevant content when eyeballing this report.
Use Content Explorer
Let’s start by entering a broad topic related to your niche into Content Explorer. Next, set a Referring domains filter to a minimum of 50.
We can also add:
- Language filter to get only pages in our target language.
- Exclude homepages to remove homepages from the results.
Eyeball the results to see if there are any potential pieces of content you could beat.
Use Keywords Explorer
Enter a broad keyword into Ahrefs’ Keywords Explorer. Next, go to the Matching terms report and set a Keyword Difficulty (KD) filter to a minimum of 40.
Why filter for KD?
The reason lies in the method we use at Ahrefs to calculate KD: our KD score is computed from a trimmed mean of referring domains (RDs) to the top 10 ranking pages.
In other words, the top-ranking pages for keywords with high KD scores have lots of backlinks on average.
From here, you’ll want to go through the report to find potential topics you could build a better piece of content around.
2. Make it better
The core idea (or assumption) behind the Skyscraper Technique is that people want to see the best.
Once you’ve found the content you want to beat, the next step is to make something even better.
According to Brian, there are four aspects worth improving:
- Length – If the post has 25 tips, list more.
- Freshness – Update any outdated parts of the original article with new images, screenshots, information, stats, etc.
- Design – Make it stand out with a custom design. You could even make it interactive.
- Depth – Don’t just list things. Fill in the details and make them actionable.
3. Reach out to the right people
The key to successfully executing the Skyscraper Technique is email outreach. But instead of spamming everyone you know, you reach out to those who have already linked to the specific content you have improved.
The assumption: Since they’ve already linked to a similar article, they’re more likely to link to one that’s better.
You can find these people by pasting the URL of the original piece into Ahrefs’ Site Explorer and then going to the Backlinks report.
This report shows all the backlinks to the page. In this case, there are 441 groups of links.
But not all of these links will make good prospects. So you’ll likely need to add some filters to clean them up. For example, you can:
- Add a Language filter for the language you’re targeting (e.g., English).
- Switch the tab to Dofollow for equity-passing links.
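If you export that backlink report to work on it locally, the filtering step might look like this minimal sketch. The row format here is an assumption for illustration, not Ahrefs' actual export schema.

```python
# Hypothetical exported backlink rows.
backlinks = [
    {"url": "https://a.com/post", "lang": "en", "rel": "dofollow"},
    {"url": "https://b.fr/article", "lang": "fr", "rel": "dofollow"},
    {"url": "https://c.com/blog", "lang": "en", "rel": "nofollow"},
]

def filter_prospects(rows, language="en"):
    """Keep only equity-passing (dofollow) links from pages in the target language."""
    return [r for r in rows if r["lang"] == language and r["rel"] == "dofollow"]

prospects = filter_prospects(backlinks)
```

The same two filters described above (language, then dofollow) cut the raw list down to the rows actually worth pitching.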
It’s been roughly eight years since Brian shared this link building strategy. Honestly speaking, the technique has been oversaturated. Given its widespread use, its effectiveness may even be limited.
Some SEOs even say they wouldn’t recommend it.
Still, some SEOs and marketers believe the technique works.
According to Aira’s annual State of Link Building report, only 18% of SEOs still use the Skyscraper Technique. It’s not a go-to for many, ranking #20 among the tactics listed. I suspect its popularity has waned because (1) it’s old, and SEOs are looking for newer tactics, and (2) SEOs believe that content is more important than links these days.
Fundamentally, it makes sense that the Skyscraper Technique still works. After all, the principles are the same behind (almost) any link building strategy:
- Create great content
- Reach out to people and promote it
But why do people think it’s no longer effective? There are a few reasons why and knowing them will help you improve your chances of success with the Skyscraper Technique.
Let’s start with:
1. Sending only Brian’s email template
In Brian’s original post, he suggested an email template for his readers to use:
Hey, I found your post: http://post1
It links to this post: http://post2
I made something better: http://post3
Please swap out the link for mine.
Unfortunately, many SEOs decided to use this exact template word for word.
Link building doesn’t exist in a vacuum. If everyone in your niche decides to send this exact template to every possible website, it’ll burn out real fast. And that’s exactly what happened.
Now, if a website owner sees this template, chances are they’ll delete it right away.
Judging by my inbox, there are still people using this exact template. And, like everyone else, I delete the email immediately.
I’m not saying this to disparage templated emails. If you’re sending something at scale, templating is necessary. But move away from this template. Write your own, personalize it as much as possible, and follow the outreach principles here.
Even better, ask yourself:
“What makes my content unique and link-worthy?”
2. Not segmenting your prospects
People link for different reasons, so you shouldn’t send everyone the same pitch.
Consider dividing your list of prospects into segments according to the context in which they linked. You can do this by checking the Anchors report in Site Explorer.
You can clearly see people are linking to different statistics from our SEO statistics post. So, for example, if we were doing outreach for a hypothetical post, we might want to mention to the first group that we have a new statistic for “Over 90% of content gets no traffic from Google.”
Then, to the second group, we’ll mention that we have new statistics for “68% of online experiences.” And so on.
In fact, that’s exactly what we did when we built links to this post. Check out the case study here:
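This segmentation step can be sketched as follows, assuming you have exported prospects with their anchor texts. The site names and matching rules below are hypothetical.

```python
# Hypothetical prospects, each with the anchor text of their existing link.
prospects = [
    {"site": "blog1.com", "anchor": "over 90% of content gets no traffic from Google"},
    {"site": "blog2.com", "anchor": "68% of online experiences begin with a search engine"},
    {"site": "blog3.com", "anchor": "90% of content gets no traffic"},
]

# Segment prospects by which statistic their anchor text references,
# so each segment can receive a pitch mentioning the updated version
# of the statistic they originally linked to.
segments = {"no-traffic stat": [], "online-experiences stat": [], "other": []}
for p in prospects:
    if "no traffic" in p["anchor"]:
        segments["no-traffic stat"].append(p["site"])
    elif "online experiences" in p["anchor"]:
        segments["online-experiences stat"].append(p["site"])
    else:
        segments["other"].append(p["site"])
```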
3. Not reaching out to enough people
Ultimately, link building is still a numbers game. If you don’t reach out to enough people, you won’t get enough links.
Simply put: You need to curate a larger list of link prospects.
So rather than limiting yourself to only replicating the backlinks of the original content, you should replicate the backlinks from other top-ranking pages covering the same topic too.
To find these pages, enter the target keyword into Keywords Explorer and scroll down to the SERP overview.
In this example, most top-ranking pages have tons of links, and all of them (after filtering, of course) could be potential link prospects.
Search for your keyword, set a Referring domains filter, and you’ll see relevant pages where you can “mine” for more skyscraper prospects.
4. Thinking bigger equals better
Someone creates a list with 15 tools. The next person ups it to 30. Another “skyscrapers” it to 50, and the next increases it to 100.
Not only is it a never-ending arms race, there’s also no value for the reader.
No one wants to skim through 5,000 words or hundreds of items just to find what they need. Curation is where the value is.
When considering the four aspects mentioned by Brian, don’t improve things for the sake of improving them. Adding 25 mediocre tips to an existing list of 25 doesn’t make it “better.” Likewise for changing the publish date or adding a few low-quality illustrations.
Take Chris, for example: he differentiated himself through his knowledge and expertise, backed by 10 years of experience in SEO.
So when you’re creating your article, always look at any improvement through the lens of value:
Are you giving more value to the reader?
5. Not considering brand
As Ross Hudgens says, “Better does not occur in a branding vacuum.”
Most of the time, people didn’t read the article. They linked to us because of our brand and reputation—they knew we were publishing great content consistently, and they had confidence that the article we were pitching was great too.
In other words, there are times where no matter how hard you “skyscraper” your content, people just won’t link to it because they don’t know who you are.
Having your own personal brand is important these days. But think about it: What is a “strong brand” if not a consistent output of high-quality work that people enjoy? One lone skyscraper doesn’t make a city; many of them together do.
What I’m saying is this: Don’t be discouraged if your “skyscraper” article gets no results. And don’t be discouraged just because you don’t have a brand right now—you can work on that over time.
Keep on making great content—skyscraper or not—and results will come if you trust the process.
“Rome wasn’t built in a day, but they were laying bricks every hour.”
The Skyscraper Technique is a legitimate link building tactic that works. But that can only happen if you find proven content, make something genuinely better, and reach out to the right people.
Any questions or comments? Let me know on Twitter.
13 Best High Ticket Affiliate Marketing Programs 2023
Are you looking for more ways to generate income for yourself or your business this year?
With high-ticket affiliate marketing programs, you earn money by recommending your favorite products or services to those who need them.
Affiliate marketers promote products through emails, blog posts, social media updates, YouTube videos, podcasts, and other forms of content with proper disclosure.
While not all affiliate marketers make enough to quit their 9-to-5, any additional income in the current economy can come in handy for individuals and businesses.
How To Get Started With Affiliate Marketing
Here’s a simple summary of how to get started with affiliate marketing.
- Build an audience. You need websites with traffic, email lists with subscribers, or social media accounts with followers to promote a product – or ideally, a combination of all three.
- Find products and services you can passionately promote to the audience you have built. The more you love something and believe in its efficacy, the easier it will be to convince someone else to buy it.
- Sign up for affiliate and referral programs. These will be offered directly through the company selling the product or service, or a third-party affiliate platform.
- Fill out your application and affiliate profile completely. Include your niche, monthly website traffic, number of email subscribers, and social media audience size. Companies will use that information to approve or reject your application.
- Get your custom affiliate or referral link and share it with your audience, or the segment of your audience that would benefit most from the product you are promoting.
- Look for opportunities to recommend products to new people. You can be helpful, make a new acquaintance, and earn a commission.
- Monitor your affiliate dashboard and website analytics for insights into your clicks and commissions.
- Adjust your affiliate marketing tactics based on the promotions that generate the most revenue.
Now, continue reading about the best high-ticket affiliate programs you can sign up for in 2023. They offer a high one-time payout, recurring commissions, or both.
The Best High-Ticket Affiliate Marketing Programs
What makes these affiliate marketing programs the “best” is subjective, but I chose them based on their payout amounts, number of customers, and average customer ratings. Customer ratings help determine whether a product is worth recommending. You can also use customer reviews to help you market the products or services: highlight the impressive results customers gained and the features they love most.
Smartproxy allows customers to access business data worldwide for competitor research, search engine results page (SERP) scraping, price aggregation, and ad verification.
836 reviewers gave it an average rating of 4.7 out of five stars.
Earn up to $2,000 per customer that you refer to Smartproxy using its affiliate program.
Thinkific is an online course creation platform used by over 50,000 instructors in over 100 million courses.
669 reviewers gave it an average rating of 4.6 out of five stars.
Earn up to $1,700 per referral per year through the Thinkific affiliate program.
BigCommerce is an ecommerce provider with open SaaS, headless integrations, omnichannel, B2B, and offline-to-online solutions.
648 reviewers gave it an average rating of 8.1 out of 10.
Earn up to $1,500 for new enterprise customers, or 200% of the customer’s first payment by signing up for the BigCommerce affiliate program.
Teamwork, project management software focused on maximizing billable hours, helps everyone in your organization become more efficient – from the founder to the project managers.
1,022 reviewers gave it an average rating of 4.4 out of five stars.
Earn up to $1,000 per new customer referral with the Teamwork affiliate program.
Flywheel provides managed WordPress hosting geared towards agencies, ecommerce, and high-traffic websites.
36 reviewers gave it an average rating of 4.4 out of five stars.
Earn up to $500 per new referral from the Flywheel affiliate program.
Teachable is an online course platform used by over 100,000 entrepreneurs, creators, and businesses of all sizes to create engaging online courses and coaching businesses.
150 reviewers gave it a 4.4 out of five stars.
Earn up to $450 (average partner earnings) per month by joining the Teachable affiliate program.
Shutterstock is a global marketplace for sourcing stock photographs, vectors, illustrations, videos, and music.
507 reviewers gave it an average rating of 4.4 out of five stars.
Earn up to $300 for new customers by signing up for the Shutterstock affiliate program.
HubSpot provides a CRM platform to manage your organization’s marketing, sales, content management, and customer service.
3,616 reviewers gave it an average rating of 4.5 out of five stars.
Sucuri is a cloud-based security platform with experienced security analysts offering malware scanning and removal, protection from hacks and attacks, and better site performance.
251 reviewers gave it an average rating of 4.6 out of five stars.
Earn up to $210 per new sale by joining Sucuri referral programs for the platform, firewall, and agency products.
ADT is a security systems provider for residences and businesses.
588 reviewers gave it an average rating of 4.5 out of five stars.
Earn up to $200 per new customer that you refer through the ADT rewards program.
DreamHost web hosting supports WordPress and WooCommerce websites with basic, managed, and VPS solutions.
3,748 reviewers gave it an average rating of 4.7 out of five stars.
Earn up to $200 per referral and recurring monthly commissions with the DreamHost affiliate program.
Shopify, a top ecommerce solution provider, encourages educators, influencers, review sites, and content creators to participate in its affiliate program. Affiliates can teach others about entrepreneurship and earn a commission for recommending Shopify.
Earn up to $150 per referral and grow your brand as a part of the Shopify affiliate program.
Kinsta is a web hosting provider that offers managed WordPress, application, and database hosting.
529 reviewers gave it an average rating of 4.3 out of five stars.
Earn $50 – $100 per new customer, plus recurring revenue via the Kinsta affiliate program.
Even More Affiliate Marketing Programs
In addition to the high-ticket affiliate programs listed above, you can find more programs to join with a little research.
- Search for affiliate or referral programs for all of the products or services you have a positive experience with, personally or professionally.
- Search for affiliate or referral programs for all of the places you shop online.
- Search for partner programs for products and services your organization uses or recommends to others.
- Search for products and services that match your audience’s needs on affiliate platforms like Shareasale, Awin, and CJ.
- Follow influencers in your niche to see what products and services they recommend. They may have affiliate or referral programs as well.
A key to affiliate marketing success is to diversify the affiliate marketing programs you join.
This ensures that you continue to generate affiliate income, regardless of whether one company changes or shutters its program.
Featured image: Shutterstock/fatmawati achmad zaenuri
The Current State of Google PageRank & How It Evolved
PageRank (PR) is an algorithm that improves the quality of search results by using links to measure the importance of a page. It considers links as votes, with the underlying assumption being that more important pages are likely to receive more links.
PageRank was created by Google co-founders Sergey Brin and Larry Page in 1997 when they were at Stanford University, and the name is a reference to both Larry Page and the term “webpage.”
In many ways, it’s similar to a metric called “impact factor” for journals, where more cited = more important. It differs a bit in that PageRank considers some votes more important than others.
By using links along with content to rank pages, Google delivered better results than its competitors. Links became the currency of the web.
Want to know more about PageRank? Let’s dive in.
In terms of modern SEO, PageRank is one of the algorithms comprising Experience, Expertise, Authoritativeness, and Trustworthiness (E-E-A-T).
Google’s algorithms identify signals about pages that correlate with trustworthiness and authoritativeness. The best known of these signals is PageRank, which uses links on the web to understand authoritativeness.
Source: How Google Fights Disinformation
We’ve also had confirmation from Google reps like Gary Illyes, who said that Google still uses PageRank and that links are used for E-A-T (now E-E-A-T).
When I ran a study to measure the impact of links and effectively removed the links using the disavow tool, the drop was obvious. Links still matter for rankings.
PageRank has also been a confirmed factor when it comes to crawl budget. It makes sense that Google wants to crawl important pages more often.
Crazy fact: The formula published in the original PageRank paper was wrong. Let’s look at why.
PageRank was described in the original paper as a probability distribution—or how likely you were to be on any given page on the web. This means that if you sum up the PageRank for every page on the web together, you should get a total of 1.
Here’s the full PageRank formula from the original paper published in 1997:
PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn))
Simplified a bit and assuming the damping factor (d) is 0.85 as Google mentioned in the paper (I’ll explain what the damping factor is shortly), it’s:
PageRank for a page = 0.15 + 0.85 (a portion of the PageRank of each linking page split across its outbound links)
In the paper, they said that the sum of the PageRank for every page should equal 1. But that’s not possible if you use the formula in the paper. Each page would have a minimum PageRank of 0.15 (1-d). Just a few pages would put the total at greater than 1. You can’t have a probability greater than 100%. Something is wrong!
The formula should actually divide that (1-d) by the number of pages on the internet for it to work as described. It would be:
PageRank for a page = (0.15/number of pages on the internet) + 0.85 (a portion of the PageRank of each linking page split across its outbound links)
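As a sketch, the corrected per-page formula can be written as a small function. The function name and inputs here are made up for illustration; d = 0.85 follows the damping factor mentioned in the paper.

```python
# Minimal sketch of the corrected formula:
# PR(A) = (1-d)/N + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
def pagerank_for_page(incoming, n_pages, d=0.85):
    """incoming: list of (pagerank_of_linking_page, its_outbound_link_count)."""
    base = (1 - d) / n_pages                       # the (1-d) share, split across all N pages
    votes = sum(pr / out for pr, out in incoming)  # each linker's PR split across its outlinks
    return base + d * votes

# One linking page with PageRank 0.2 and 2 outbound links, on a 5-page web:
# (0.15 / 5) + 0.85 * (0.2 / 2) = 0.03 + 0.085 = 0.115
print(pagerank_for_page([(0.2, 2)], n_pages=5))
```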
It’s still complicated, so let’s see if I can explain it with some visuals.
1. A page is given an initial PageRank score based on the links pointing to it. Let’s say I have five pages with no links. Each gets a PageRank of (1/5) or 0.2.
2. This score is then distributed to other pages through the links on the page. If I add some links to the five pages above and calculate the new PageRank for each, then I end up with this:
You’ll notice that the scores are favoring the pages with more links to them.
3. This calculation is repeated as Google crawls the web. If I calculate the PageRank again (called an iteration), you’ll see that the scores change. It’s the same pages with the same links, but the base PageRank for each page has changed, so the resulting PageRank is different.
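The three steps above can be sketched as a toy power-iteration loop. The five-page graph here is invented purely for illustration; a real crawl works on billions of pages.

```python
# Toy iterative PageRank (probability-distribution form of the corrected formula).
def pagerank(links, iterations=50, d=0.85):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1 / n for p in pages}                # step 1: equal initial scores
    for _ in range(iterations):                   # step 3: repeat the calculation
        new = {p: (1 - d) / n for p in pages}
        for page, targets in links.items():
            share = d * pr[page] / len(targets)   # step 2: distribute via links
            for t in targets:
                new[t] += share
        pr = new
    return pr

# Five hypothetical pages; A and B both link to C, so C ends up strongest.
graph = {"A": ["C"], "B": ["C"], "C": ["D"], "D": ["E"], "E": ["A"]}
scores = pagerank(graph)
```

Because every page here has at least one outbound link, the scores keep summing to 1 across iterations, matching the probability-distribution description.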
The PageRank formula also has a so-called “damping factor,” the “d” in the formula, which simulates the probability of a random user continuing to click on links as they browse the web.
Think of it like this: The probability of you clicking a link on the first page you visit is reasonably high. But the likelihood of you then clicking a link on the next page is slightly lower, and so on and so forth.
If a strong page links directly to another page, it’s going to pass a lot of value. If the link is four clicks away, the value transferred from that strong page will be a lot less because of the damping factor.
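A quick bit of arithmetic shows the effect: with d = 0.85, each additional hop multiplies the surviving share by 0.85 again (and that is before accounting for PageRank being split across multiple outbound links at each step).

```python
d = 0.85
# Share of value surviving the damping factor after k hops (link splitting ignored).
for hops in (1, 2, 3, 4):
    print(f"{hops} hop(s): {d ** hops:.3f}")
```

After four hops only about 52% of the damped share remains, which is why value from a strong page drops off quickly with link distance.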
The first PageRank patent was filed on January 9, 1998. It was titled “Method for node ranking in a linked database.” This patent expired on January 9, 2018, and was not renewed.
Google first made PageRank public when the Google Directory launched on March 15, 2000. This was a version of the Open Directory Project but sorted by PageRank. The directory was shut down on July 25, 2011.
It was December 11, 2000, when Google launched PageRank in the Google toolbar, which was the version most SEOs obsessed over.
This is how it looked when PageRank was included in Google’s toolbar.
PageRank in the toolbar was last updated on December 6, 2013, and was finally removed on March 7, 2016.
The PageRank shown in the toolbar was a little different. It used a simple 0–10 scale to represent PageRank. But the underlying scale is logarithmic, so achieving each higher number becomes increasingly difficult.
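As an illustration only: on a logarithmic scale, each toolbar point represents some multiple of the link value of the point below it. The base used here is an assumption; Google never published how raw PageRank mapped to the 0–10 display.

```python
import math

# Hypothetical log-scale mapping; base=8 is an assumption, not Google's value.
def toolbar_score(link_value, base=8):
    return min(10, int(math.log(max(link_value, 1), base)))

# With base 8, each toolbar point needs roughly 8x the link value of the last,
# which is why moving from PR 6 to PR 7 was far harder than from PR 2 to PR 3.
```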
PageRank even made its way into Google Sitemaps (now known as Google Search Console) on November 17, 2005. It was shown in categories of high, medium, low, or N/A. This feature was removed on October 15, 2009.
Over the years, there have been a lot of different ways SEOs have abused the system in the search for more PageRank and better rankings. Google has a whole list of link schemes that include:
- Buying or selling links—exchanging links for money, goods, products, or services.
- Excessive link exchanges.
- Using software to automatically create links.
- Requiring links as part of a terms of service, contract, or other agreement.
- Text ads that don’t use nofollow or sponsored attributes.
- Advertorials or native advertising that includes links that pass ranking credit.
- Articles, guest posts, or blogs with optimized anchor text links.
- Low-quality directories or social bookmark links.
- Keyword-rich, hidden, or low-quality links embedded in widgets that get put on other websites.
- Widely distributed links in footers or templates. For example, hard-coding a link to your website into the WP Theme that you sell or give away for free.
- Forum comments with optimized links in the post or signature.
The systems to combat link spam have evolved over the years. Let’s look at some of the major updates.
On January 18, 2005, Google announced it had partnered with other major search engines to introduce the rel=“nofollow” attribute. It encouraged users to add the nofollow attribute to blog comments, trackbacks, and referrer lists to help combat spam.
Here’s an excerpt from Google’s official statement on the introduction of nofollow:
If you’re a blogger (or a blog reader), you’re painfully familiar with people who try to raise their own websites’ search engine rankings by submitting linked blog comments like “Visit my discount pharmaceuticals site.” This is called comment spam, we don’t like it either, and we’ve been testing a new tag that blocks it. From now on, when Google sees the attribute (rel=“nofollow”) on hyperlinks, those links won’t get any credit when we rank websites in our search results.
Almost all modern systems use the nofollow attribute on blog comment links.
SEOs even began to abuse nofollow—because of course we did. Nofollow was used for PageRank sculpting, where people would nofollow some links on their pages to make other links stronger. Google eventually changed the system to prevent this abuse.
In 2009, Google’s Matt Cutts confirmed that this would no longer work: PageRank would be divided across all links on a page even when some carried a nofollow attribute, but only passed through the followed links.
Google added a couple more link attributes that are more specific versions of the nofollow attribute on September 10, 2019. These included rel=“ugc” meant to identify user-generated content and rel=“sponsored” meant to identify links that were paid or affiliate.
Algorithms targeting link spam
As SEOs found new ways to game links, Google worked on new algorithms to detect this spam.
When the original Penguin algorithm launched on April 24, 2012, it hurt a lot of websites and website owners. Google gave site owners a way to recover later that year by introducing the disavow tool on October 16, 2012.
When Penguin 4.0 launched on September 23, 2016, it brought a welcome change to how link spam was handled by Google. Instead of hurting websites, it began devaluing spam links. This also meant that most sites no longer needed to use the disavow tool.
Google launched its first Link Spam Update on July 26, 2021. This recently evolved, and a Link Spam Update on December 14, 2022, announced the use of an AI-based detection system called SpamBrain to neutralize the value of unnatural links.
The original version of PageRank hasn’t been used since 2006, according to a former Google employee. The employee said it was replaced with another less resource-intensive algorithm.
They replaced it in 2006 with an algorithm that gives approximately-similar results but is significantly faster to compute. The replacement algorithm is the number that’s been reported in the toolbar, and what Google claims as PageRank (it even has a similar name, and so Google’s claim isn’t technically incorrect). Both algorithms are O(N log N) but the replacement has a much smaller constant on the log N factor, because it does away with the need to iterate until the algorithm converges. That’s fairly important as the web grew from ~1-10M pages to 150B+.
Remember those iterations and how PageRank kept changing with each iteration? It sounds like Google simplified that system.
What else has changed?
Some links are worth more than others
Rather than splitting the PageRank equally between all links on a page, some links are valued more than others. There’s speculation from patents that Google switched from a random surfer model (where a user may go to any link) to a reasonable surfer model (where some links are more likely to be clicked than others so they carry more weight).
Some links are ignored
There have been several systems put in place to ignore the value of certain links. We’ve already talked about a few of them, including:
- Nofollow, UGC, and sponsored attributes.
- Google’s Penguin algorithm.
- The disavow tool.
- Link Spam updates.
Google also won’t count any links on pages that are blocked by robots.txt. It won’t be able to crawl these pages to see any of the links. This system was likely in place from the start.
Some links are consolidated
Google has a canonicalization system that helps it determine what version of a page should be indexed and to consolidate signals from duplicate pages to that main version.
Canonical link elements were introduced on February 12, 2009, and allow users to specify their preferred version.
Redirects were originally said to pass the same amount of PageRank as a link. But at some point, this system changed and no PageRank is currently lost.
A bit is still unknown
When pages are marked as noindex, we don’t exactly know how Google treats the links. Even Googlers have conflicting statements.
According to John Mueller, pages that are marked noindex will eventually be treated as noindex, nofollow. This means that the links eventually stop passing any value.
These aren’t necessarily contradictory. But if you go by Gary’s statement, it could be a very long time before Google stops crawling and counting links—perhaps never.
There’s currently no way to see Google’s PageRank.
URL Rating (UR) is a good replacement metric for PageRank because it has a lot in common with the PageRank formula. It shows the strength of a page’s link profile on a 100-point scale. The bigger the number, the stronger the link profile.
Both PageRank and UR account for internal and external links when being calculated. Many of the other strength metrics used in the industry completely ignore internal links. I’d argue link builders should be looking more at UR than metrics like DR, which only accounts for links from other sites.
However, it’s not exactly the same. UR does ignore the value of some links and doesn’t count nofollow links. We don’t know exactly what links Google ignores and don’t know what links users may have disavowed, which will impact Google’s PageRank calculation. We also may make different decisions on how we treat some of the canonicalization signals like canonical link elements and redirects.
So our advice is to use it but know that it may not be exactly like Google’s system.
We also have Page Rating (PR) in Site Audit’s Page Explorer. This is similar to an internal PageRank calculation and can be useful to see what the strongest pages on your site are based on your internal link structure.
Since PageRank is based on links, to increase your PageRank, you need better links. Let’s look at your options.
Redirect broken pages
Redirecting old pages on your site to relevant new pages can help reclaim and consolidate signals like PageRank. Websites change over time, and people don’t seem to like to implement proper redirects. This may be the easiest win, since those links already point to you but currently don’t count for you.
Here’s how to find those opportunities:
I usually sort this by “Referring domains.”
Take those pages and redirect them to the current pages on your site. If you don’t know exactly where they go or don’t have the time, I have an automated redirect script that may help. It looks at the old content from archive.org and matches it with the closest current content on your site. This is where you likely want to redirect the pages.
Backlinks aren’t always within your control. People can link to any page on your site they choose, and they can use whatever anchor text they like.
Internal links are different. You have full control over them.
Internally link where it makes sense. For instance, you may want to link more to pages that are more important to you.
We have a tool within Site Audit called Internal Link Opportunities that helps you quickly locate these opportunities.
This tool works by looking for mentions of keywords that you already rank for on your site. Then it suggests them as contextual internal link opportunities.
For example, the tool shows a mention of “faceted navigation” in our guide to duplicate content. As Site Audit knows we have a page about faceted navigation, it suggests we add an internal link to that page.
You can also get more links from other sites to your own to increase your PageRank. We have a lot of guides around link building already. Some of my favorites are:
Even though PageRank has changed, we know that Google still uses it. We may not know all the details or everything involved, but it’s still easy to see the impact of links.
Also, Google just can’t seem to get away from using links and PageRank. It once experimented with not using links in its algorithm and decided against it.
So we don’t have a version like that that is exposed to the public but we have our own experiments like that internally and the quality looks much much worse. It turns out backlinks, even though there is some noise and certainly a lot of spam, for the most part are still a really really big win in terms of quality of search results.
We played around with the idea of turning off backlink relevance and at least for now backlinks relevance still really helps in terms of making sure that we return the best, most relevant, most topical set of search results.
Source: YouTube (Google Search Central)
If you have any questions, message me on Twitter.