Why Google Seems To Favor Big Brands & Low-Quality Content
Many people are convinced that Google favors big brands and ranks low-quality content, and many feel the problem has grown progressively worse. This may not be a matter of perception; nearly everyone has an anecdote about poor-quality search results, so something is going on. The possible reasons for it are actually quite surprising.
Google Has Shown Favoritism In The Past
This isn’t the first time that Google’s search engine results pages (SERPs) have shown a bias that favored big brand websites. During the early years of Google’s algorithm it was obvious that sites with a lot of PageRank ranked for virtually anything they wanted.
For example, I remember a web design company that built a lot of websites, creating a network of backlinks that raised their PageRank to a remarkable level normally seen only on big corporate sites like IBM’s. As a consequence, they ranked for the two-word keyword phrase “Web Design” and virtually every other variant like “Web Design + [any state in the USA].”
Everyone knew that websites with a PageRank of 10, the highest level shown on Google’s toolbar, practically had a free pass in the SERPs, resulting in big brand sites outranking more relevant webpages. It didn’t go unnoticed when Google eventually adjusted their algorithm to fix this issue.
The point of this anecdote is to show an instance where Google’s algorithm unintentionally created a bias that favored big brands.
Here are other algorithm biases that publishers exploited:
- Top 10 posts
- Longtail “how-to” articles
- Misspellings
- Free widgets in footers that contained links (always free to universities!)
Big Brands And Low Quality Content
There are two things that have been a constant for all of Google’s history:
- Low quality content
- Big brands crowding out small independent publishers
Anyone who’s ever searched for a recipe knows that the more general the recipe, the lower the quality of the recipes that get ranked. Search for something like cream of chicken soup and the main ingredient in nearly every ranked recipe is two cans of chicken soup.
A search for Authentic Mexican Tacos results in recipes with these ingredients:
- Soy sauce
- Ground beef
- “Cooked chicken”
- Taco shells (from the store!)
- Beer
Not all recipe SERPs are bad. But some of the more general recipes Google ranks are so basic that a hobo can cook them on a hotplate.
Robin Donovan (Instagram), a cookbook author and online recipe blogger, observed:
“I think the problem with google search rankings for recipes these days (post HCU) are much bigger than them being too simple.
The biggest problem is that you get a bunch of Reddit threads or sites with untested user-generated recipes, or scraper sites that are stealing recipes from hardworking bloggers.
In other words, content that is anything but “helpful” if what you want is a tested and well written recipe that you can use to make something delicious.”
Explanations For Why Google’s SERPs Are Broken
It’s hard to escape the perception that Google’s rankings for a variety of topics seem to default to big brand websites and low-quality webpages.
Small sites do grow into big brands that dominate the SERPs; it happens. But that’s the thing: even when a small site gets big, it becomes just another big brand dominating the SERPs.
Typical explanations for poor SERPs:
- It’s a conspiracy to drive more ad clicks
- Content itself is low quality across the board these days
- Google doesn’t have anything else to rank
- It’s the fault of SEOs
- It’s the fault of affiliates
- Google promotes big brands because [insert your conspiracy]
So what’s going on?
People Love Big Brands & Garbage Content
The recent Google antitrust lawsuit exposed the importance of Navboost signals as a major ranking factor. Navboost is an algorithm that interprets user engagement signals to understand, among other things, what topics a webpage is relevant for.
The idea of using engagement signals as an indicator of what users expect to see makes sense. After all, Google is user-centric and who better to decide what’s best for users than the users themselves, right?
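To make that reasoning concrete, here’s a minimal toy sketch (in Python) of how engagement signals could be blended into a content-based relevance score. Everything here, the signal names, the weights, and the formula, is invented purely for illustration; it is not Google’s actual Navboost implementation.

```python
from dataclasses import dataclass

@dataclass
class PageStats:
    """Hypothetical per-page stats (illustrative only)."""
    base_relevance: float  # score from content/link signals, 0..1
    impressions: int       # times the page was shown in the SERP
    clicks: int            # times it was clicked
    long_clicks: int       # clicks where the user didn't bounce back

def engagement_boosted_score(page: PageStats, weight: float = 0.5) -> float:
    """Blend a base relevance score with a click-based engagement score.

    A toy model of 'user interaction signals as a ranking factor',
    NOT Google's actual Navboost algorithm.
    """
    if page.impressions == 0:
        return page.base_relevance  # no engagement data yet
    ctr = page.clicks / page.impressions
    long_click_rate = page.long_clicks / page.clicks if page.clicks else 0.0
    engagement = 0.5 * ctr + 0.5 * long_click_rate
    return (1 - weight) * page.base_relevance + weight * engagement

# A familiar big brand with weaker content can outscore a better page
# once user clicks are factored in:
brand = PageStats(base_relevance=0.6, impressions=1000, clicks=500, long_clicks=450)
indie = PageStats(base_relevance=0.8, impressions=1000, clicks=100, long_clicks=80)
print(engagement_boosted_score(brand))  # 0.65
print(engagement_boosted_score(indie))  # 0.625
```

In this toy model the familiar big brand wins even though its content score is lower, simply because more users click it and stick with it. That’s the dynamic this section is about.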
Well, consider that arguably the biggest and most important song of 1991, “Smells Like Teen Spirit” by Nirvana, didn’t make the Billboard Top 100 for that year. Michael Bolton and Rod Stewart made the list twice, with Rod Stewart top-ranked for a song called “The Motown Song” (anyone remember that one?).
Nirvana didn’t make the charts until the next year…
My opinion, given that we know that user interactions are a strong ranking signal, is that Google’s search rankings follow a similar pattern related to users’ biases.
People tend to choose what they know. It’s called Familiarity Bias.
Consumers habitually choose what’s familiar over what’s unfamiliar. This preference shows up, for example, in product choices that favor known brands.
Behavioral scientist Jason Hreha defines Familiarity Bias like this:
“The familiarity bias is a phenomenon in which people tend to prefer familiar options over unfamiliar ones, even when the unfamiliar options may be better. This bias is often explained in terms of cognitive ease, which is the feeling of fluency or ease that people experience when they are processing familiar information. When people encounter familiar options, they are more likely to experience cognitive ease, which can make those options seem more appealing.”
Except for certain queries (like those related to health), I don’t think Google makes an editorial decision to favor certain kinds of websites, like brands.
Google uses many signals for ranking. But Google is strongly user-focused.
I believe it’s possible that strong user preferences can carry more weight than Reviews System signals. How else to explain why big brand websites with fake reviews seemingly rank better than honest independent review sites?
It’s not like Google’s algorithms haven’t created poor search results in the past.
- Google’s Panda algorithm was designed to get rid of a bias for cookie cutter content.
- The Reviews System is a patch to fix Google’s bias for content that’s about reviews but isn’t necessarily a review.
If Google has systems for catching low quality sites that their core algorithm would otherwise rank, why do big brands and poor quality content still rank?
I believe the answer is that users prefer to see those sites, as indicated by user interaction signals.
The big question is whether Google will continue to rank whatever content users’ biases and inexperience register as satisfying. In other words, will Google keep serving the sugar-frosted bon-bons that users crave?
Should Google make the choice to rank quality content at the risk that users find it too hard to understand?
Or should publishers give up and focus on creating for the lowest common denominator, like the biggest pop stars do?