5 Top Enterprise Local SEO Challenges & How To Solve Them
Local SEO can be challenging for enterprise brands because it requires knowing how to do “national” SEO, how to manage Google Business Profile, and how Google handles your priority search queries in various markets.
That means there are an infinite number of challenges in local SEO for enterprise search marketers. So what are the most common challenges in enterprise local SEO? Let’s find out.
1. Knowing When To Prioritize Local vs. National SEO
One of the biggest challenges enterprises face is knowing when to focus on a “local” SEO strategy instead of a “national” SEO strategy and vice versa.
This is understandable as it’s not always immediately apparent if your priorities are better served by one or the other. It can be challenging to tell if your target top keywords have local intent.
But, it’s vital to success with your overall strategy because it will significantly impact how well your initiatives serve your business goals.
Understanding which terms Google regards as local can help you develop your keyword strategy and determine how to approach and support your SEO investment.
You could lose a lot of traffic due to poor site design or keyword strategy.
Understanding Local Search Intent
So, what do we mean by local search intent, exactly?
By understanding search intent, you know what kind of features will appear in search results and what content you should prioritize.
For this discussion, there are four main types of search intents to focus on:
- Search queries with national intent.
- Search queries with semi-national intent.
- Search queries with local intent.
- Search queries with hyper-local intent.
You can tell what type of intent your target search queries fall into by the features shown on the SERPs, for example:
Queries With National Search Intent
SERPs feature no state/city-specific pages and no map pack (example).
The fact that there are no “local” results in this SERP probably means Google sees zero local intent for these queries.
The minute a large portion of searchers starts to redo this query with location info such as “Pleasanton newspaper article,” the SERPs will likely shift to results that have some local results, which brings us to…
Queries With Semi-National Search Intent
SERPs feature no state/city-specific pages but a map pack (example).
Semi-national queries like [bank] might include a map pack because clicks are split roughly evenly between local and national results. This could be because some users are looking for a bank branch close to them, while others are looking for the bank’s main home page.
Queries With Local Search Intent
SERPs feature partial to full state/city-specific pages and a map pack (example).
For a term like [plumber], Google will feature a map pack of nearby plumbers, and the remainder of the page one results are filled with location pages. Google predicts that the user intends to find a plumber near their location.
Queries With Hyper-Local Search Intent
Hyper-local keywords are those where the searcher’s precise location is the dominant factor and significantly impacts SERP results (example).
In the case of hyper-local intent queries, the distance between the user and business matters most. You can see that the map pack dominates the SERP real estate for this query. So, Google likely thinks [Auto insurance near me] requires hyper-local results to be helpful for the user.
How To Identify Search Intent
- Analyze current SERP outcomes across different geos.
- Examine the SERP for a map element.
- Check for state or city-specific pages.
- Review the titles and URLs.
- Analyze consistency and make an intent determination (see the sketch below).
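To make that final determination concrete, here is a minimal sketch of how the checklist could be encoded. The feature names and thresholds are illustrative assumptions, not an official taxonomy.

```python
# Hypothetical sketch: mapping observed SERP features to the four intent
# buckets described above. Thresholds are illustrative assumptions.

def classify_local_intent(has_map_pack: bool, local_page_ratio: float) -> str:
    """Classify a keyword's intent from its SERP features.

    local_page_ratio: fraction of page-one organic results that are
    state/city-specific pages (0.0 to 1.0).
    """
    if not has_map_pack and local_page_ratio == 0:
        return "national"
    if has_map_pack and local_page_ratio == 0:
        return "semi-national"
    if has_map_pack and local_page_ratio < 0.8:
        return "local"
    return "hyper-local"

# Example: a [plumber]-style SERP with a map pack and mostly local pages.
print(classify_local_intent(has_map_pack=True, local_page_ratio=0.6))  # local
```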
How To Build A Strategy For Different Types Of Search Intent
National Strategy
Nationally focused strategies will need a ton of content and authority.
If you’ve determined that your target keywords have little to no local intent, your main website should be where you invest most of your SEO budget, building the authority it needs to rank by generating backlinks.
Semi-Local Strategy
Semi-local keywords still require you to focus on building the content and authority of your main site, with one addition: because semi-local keywords generate a map pack, you must also optimize your Google Business Profile listings.
Local Strategy
Your site structure becomes significantly more important if you’ve determined that Google treats your keyword as local. Creating a directory of state or city pages lets you capture a far greater volume of local queries.
Hyper-Local Strategy
When your priority keywords are hyper-local, it’s preferable to go beyond a directory of state and city-level pages and also optimize for “near me” keywords with dedicated location pages.
The layers will likely look different depending on your vertical, but broadly, they might resemble this (see the URL sketch after the list):
- Locator index page.
- State page.
- City page.
- Location page.
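As a hypothetical illustration, the URL paths behind those layers might look like the sketch below; the path naming is an assumption, so adapt it to your own site structure.

```python
# Illustrative sketch of the URL hierarchy a locator directory might use.
# Path names are placeholders, not a required convention.

def locator_paths(state: str, city: str, store_id: str) -> list[str]:
    """Return the chain of locator pages from the index down to one location."""
    return [
        "/locations/",                             # locator index page
        f"/locations/{state}/",                    # state page
        f"/locations/{state}/{city}/",             # city page
        f"/locations/{state}/{city}/{store_id}/",  # location page
    ]

print(locator_paths("ca", "pleasanton", "store-123"))
```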
2. Having A Single Source Of Truth For Location Data
With the advent of local listings management companies such as Yext and Uberall, this should no longer be a problem.
However, we still run into multi-location businesses that don’t have a “single source of truth” for all of their location information.
If you don’t have this yet, put it in place.
3. Optimizing Store Locators
Many brands outsource their store locators to third-party vendors. There’s nothing wrong with this in theory, but there are a few ways we have seen this go wrong:
Search-Only Store Locators
For SEO, an effective store locator should be a basic linked set of state, city, and location pages that a bot or user can easily click through to reach every page. But many brands build their store locators as a single locator page with a search box for finding your location.
A few years ago, we looked at the locators for the top 100 U.S. retailers and found those with search-only locators ranked for ~50% fewer keywords than those with a linkable state > city > location architecture.
So, make sure your locator architecture is built this way.
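One quick way to audit this is to fetch your locator index and confirm it exposes plain links a bot can follow. Here is a rough sketch using the third-party requests and beautifulsoup4 packages; the URL is a placeholder.

```python
# Sanity-check that a locator page exposes crawlable <a href> links
# rather than relying solely on a search box.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://www.example.com/locations/", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

# Collect links that stay within the locator directory.
links = [a["href"] for a in soup.find_all("a", href=True)
         if "/locations/" in a["href"]]

if links:
    print(f"Found {len(links)} crawlable locator links.")
else:
    print("No plain links found - this may be a search-only locator.")
```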
Location Page Content
Often, brands budget for building a locator on their site but leave nothing for the content.
There’s nothing wrong with a basic location page with the business name, address, phone number, product/service categories, etc. But a location page with unique, beefed-up content relevant to the location and topics you are trying to rank for can improve SEO performance.
This is where your location managers can come in handy. We often see successful brands use surveys of their location managers to get unique local content.
Other sources might include local customer reviews, syndicated local point of interest data, and popular products in the specific market.
Priority Categories
Most ecommerce queries show local results near the top of the SERPs these days.
We often see brands winning in Local Packs linking from their location pages to their key categories.
Think of it as signaling to Google that your locations are relevant for these categories.
4. Google Business Profile Management And Optimization
Google Business Profile (GBP) really shouldn’t be a challenge – I mean, it’s just a simple set of yellow pages listings for your locations – but there are a million ways it can go wrong for businesses.
Here are just a few challenges and opportunities with GBP.
Beware Of Duplicate Listings
Amazingly, duplicate listings are still a thing with GBP. I just talked to a service area business that was having problems ranking, and it was pretty easy to see they had duplicate GBP listings.
The minute they deleted the duplicate listings, their rankings went up by 15 positions for the main keyword they were targeting. So, keep an eye on those.
Monitor Your GBP Listings
Your GBP listings are in a constant state of flux. Users are adding photos and reviews.
Google can overwrite your data if it trusts data from another party more than it trusts you.
GBP is not a “set it and forget it” thing. Create a system to monitor changes to your GBP pages regularly.
While you can see many changes via the GBP Dashboard, it won’t catch everything. That’s one of the reasons we built this free, open-source tool to monitor image changes to your GBP.
Scale GBP Posts
GBP Posts are short announcements you can attach to your GBP. These can be an inexpensive way to generate high-converting visits to your site. Posts can include text, photos, or videos.
The challenge we often see is that businesses are not set up to produce content for each location. If you want to do GBP Posts for multiple locations, implement a system for creating GBP-ready marketing collateral for new promotions so they can be posted.
This often involves creating a GBP-sized version (400 x 300) of approved marketing images and copy for GBP as part of each new promotion.
You’ll also want to ensure you tag links from your GBP posts with tracking parameters to measure performance.
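Here is a hedged sketch of both steps: producing the 400 x 300 image with the third-party Pillow package and tagging the promo link with common UTM parameters. Filenames, the URL, and the campaign values are placeholders.

```python
# Sketch of preparing GBP Post collateral: a GBP-sized image plus a
# tracking-tagged link. All names and values below are placeholders.
from urllib.parse import urlencode, urlparse, urlunparse
from PIL import Image, ImageOps

# Crop/resize the approved marketing image to the GBP post size.
img = Image.open("promo-original.jpg")
ImageOps.fit(img, (400, 300)).save("promo-gbp.jpg")

def tag_link(url: str, campaign: str) -> str:
    """Append UTM parameters so GBP Post clicks are measurable in analytics."""
    parts = urlparse(url)
    query = urlencode({
        "utm_source": "google",
        "utm_medium": "organic",
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=query))

print(tag_link("https://www.example.com/promo", "gbp-post-fall-sale"))
```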
5. Building A Local Search Presence For SABs And Marketplaces
Not every local enterprise brand has locations.
There are plenty of local marketplace brands like Yelp, DoorDash, and Zillow, and service area businesses (SABs) like plumbers and roofers that target local search queries but are not eligible to appear in Local Packs. This is because they have no physical locations in their target markets.
And this means they are missing out on many potential clicks and revenue.
This won’t work for every brand, but for those with a suitable business model, creating a “store within a store” at a partner brand’s location is a great way to get additional local pack visibility.
FedEx OnSite services located in Walgreens stores are a good example of how this can work.
And, of course, if the value of the leads is high enough, you may want to consider opening up physical locations in certain areas to try to rank well in the Local Packs.
As I said at the top, there are an infinite number of local SEO tactics enterprise brands can deploy.
As you deploy new tactics, make sure you test, measure, and iterate like any other marketing channel.
Featured Image: GaudiLab/Shutterstock
How Compression Can Be Used To Detect Low Quality Pages
The concept of Compressibility as a quality signal is not widely known, but SEOs should be aware of it. Search engines can use web page compressibility to identify duplicate pages, doorway pages with similar content, and pages with repetitive keywords, making it useful knowledge for SEO.
Although the following research paper demonstrates a successful use of on-page features for detecting spam, the deliberate lack of transparency by search engines makes it difficult to say with certainty if search engines are applying this or similar techniques.
What Is Compressibility?
In computing, compressibility refers to how much a file (data) can be reduced in size while retaining essential information, typically to maximize storage space or to allow more data to be transmitted over the Internet.
TL;DR Of Compression
Compression replaces repeated words and phrases with shorter references, reducing the file size by significant margins. Search engines typically compress indexed web pages to maximize storage space, reduce bandwidth, and improve retrieval speed, among other reasons.
This is a simplified explanation of how compression works:
- Identify Patterns: A compression algorithm scans the text to find repeated words, patterns, and phrases.
- Replace With Shorter References: The “code” that essentially symbolizes the replaced words and phrases uses less data than the originals.
- Shorter Codes Take Up Less Space: The codes and symbols use less storage space than the original words and phrases, which results in a smaller file size.
A bonus effect of using compression is that it can also be used to identify duplicate pages, doorway pages with similar content, and pages with repetitive keywords.
Research Paper About Detecting Spam
This research paper is significant because it was authored by distinguished computer scientists known for breakthroughs in AI, distributed computing, information retrieval, and other fields.
Marc Najork
One of the co-authors of the research paper is Marc Najork, a prominent research scientist who currently holds the title of Distinguished Research Scientist at Google DeepMind. He’s a co-author of the papers for TW-BERT, has contributed research for increasing the accuracy of using implicit user feedback like clicks, and worked on creating improved AI-based information retrieval (DSI++: Updating Transformer Memory with New Documents), among many other major breakthroughs in information retrieval.
Dennis Fetterly
Another of the co-authors is Dennis Fetterly, currently a software engineer at Google. He is listed as a co-inventor in a patent for a ranking algorithm that uses links, and is known for his research in distributed computing and information retrieval.
Those are just two of the distinguished researchers listed as co-authors of the 2006 Microsoft research paper about identifying spam through on-page content features. Among the several on-page content features the research paper analyzes is compressibility, which they discovered can be used as a classifier for indicating that a web page is spammy.
Detecting Spam Web Pages Through Content Analysis
Although the research paper was authored in 2006, its findings remain relevant today.
Then, as now, people attempted to rank hundreds or thousands of location-based web pages that were essentially duplicate content aside from city, region, or state names. Then, as now, SEOs often created web pages for search engines by excessively repeating keywords within titles, meta descriptions, headings, internal anchor text, and within the content to improve rankings.
Section 4.6 of the research paper explains:
“Some search engines give higher weight to pages containing the query keywords several times. For example, for a given query term, a page that contains it ten times may be higher ranked than a page that contains it only once. To take advantage of such engines, some spam pages replicate their content several times in an attempt to rank higher.”
The research paper explains that search engines compress web pages and use the compressed version to reference the original web page. They note that excessive amounts of redundant words result in a higher level of compressibility. So they set about testing whether there’s a correlation between a high level of compressibility and spam.
They write:
“Our approach in this section to locating redundant content within a page is to compress the page; to save space and disk time, search engines often compress web pages after indexing them, but before adding them to a page cache.
…We measure the redundancy of web pages by the compression ratio, the size of the uncompressed page divided by the size of the compressed page. We used GZIP …to compress pages, a fast and effective compression algorithm.”
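The metric is easy to reproduce. Below is a minimal sketch of the paper’s compression ratio using Python’s built-in gzip module; the sample strings are made up for illustration.

```python
# Compression ratio as the paper defines it: uncompressed size divided by
# compressed size. Higher ratios indicate more redundant content.
import gzip

def compression_ratio(html: str) -> float:
    raw = html.encode("utf-8")
    return len(raw) / len(gzip.compress(raw))

normal = "Our plumbers serve the whole metro area with 24/7 emergency service."
spammy = "best plumber cheap plumber local plumber " * 200

print(round(compression_ratio(normal), 2))  # low ratio
print(round(compression_ratio(spammy), 2))  # far higher ratio, spam-like
```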
High Compressibility Correlates To Spam
The results of the research showed that web pages with a compression ratio of at least 4.0 tended to be low-quality, spammy pages. However, results at the highest rates of compressibility became less consistent because there were fewer data points, making them harder to interpret.
Figure 9: Prevalence of spam relative to compressibility of page.
The researchers concluded:
“70% of all sampled pages with a compression ratio of at least 4.0 were judged to be spam.”
But they also discovered that using the compression ratio by itself still resulted in false positives, where non-spam pages were incorrectly identified as spam:
“The compression ratio heuristic described in Section 4.6 fared best, correctly identifying 660 (27.9%) of the spam pages in our collection, while misidentifying 2,068 (12.0%) of all judged pages.
Using all of the aforementioned features, the classification accuracy after the ten-fold cross validation process is encouraging:
95.4% of our judged pages were classified correctly, while 4.6% were classified incorrectly.
More specifically, for the spam class 1,940 out of the 2,364 pages were classified correctly. For the non-spam class, 14,440 out of the 14,804 pages were classified correctly. Consequently, 788 pages were classified incorrectly.”
The next section describes an interesting discovery about how to increase the accuracy of using on-page signals for identifying spam.
Insight Into Quality Rankings
The research paper examined multiple on-page signals, including compressibility. They discovered that each individual signal (classifier) was able to find some spam, but relying on any one signal on its own resulted in flagging non-spam pages as spam, commonly referred to as false positives.
The researchers made an important discovery that everyone interested in SEO should know, which is that using multiple classifiers increased the accuracy of detecting spam and decreased the likelihood of false positives. Just as important, the compressibility signal only identifies one kind of spam but not the full range of spam.
The takeaway is that compressibility is a good way to identify one kind of spam, but other kinds of spam aren’t caught by this single signal.
This is the part that every SEO and publisher should be aware of:
“In the previous section, we presented a number of heuristics for assaying spam web pages. That is, we measured several characteristics of web pages, and found ranges of those characteristics which correlated with a page being spam. Nevertheless, when used individually, no technique uncovers most of the spam in our data set without flagging many non-spam pages as spam.
For example, considering the compression ratio heuristic described in Section 4.6, one of our most promising methods, the average probability of spam for ratios of 4.2 and higher is 72%. But only about 1.5% of all pages fall in this range. This number is far below the 13.8% of spam pages that we identified in our data set.”
So, even though compressibility was one of the better signals for identifying spam, it still was unable to uncover the full range of spam within the dataset the researchers used to test the signals.
Combining Multiple Signals
The above results indicated that individual signals of low quality are less accurate. So they tested using multiple signals and discovered that combining multiple on-page signals for detecting spam resulted in a better accuracy rate with fewer pages misclassified as spam.
The researchers explained that they tested the use of multiple signals:
“One way of combining our heuristic methods is to view the spam detection problem as a classification problem. In this case, we want to create a classification model (or classifier) which, given a web page, will use the page’s features jointly in order to (correctly, we hope) classify it in one of two classes: spam and non-spam.”
These are their conclusions about using multiple signals:
“We have studied various aspects of content-based spam on the web using a real-world data set from the MSNSearch crawler. We have presented a number of heuristic methods for detecting content based spam. Some of our spam detection methods are more effective than others, however when used in isolation our methods may not identify all of the spam pages. For this reason, we combined our spam-detection methods to create a highly accurate C4.5 classifier. Our classifier can correctly identify 86.2% of all spam pages, while flagging very few legitimate pages as spam.”
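To illustrate the combined-classifier idea: the paper used C4.5, and while scikit-learn’s DecisionTreeClassifier implements the related (but not identical) CART algorithm, a toy sketch conveys the approach. The features and values below are fabricated for demonstration only.

```python
# Toy sketch: combining several on-page signals in one decision-tree
# classifier. Data is synthetic; this is not the paper's actual model.
from sklearn.tree import DecisionTreeClassifier

# Features per page: [compression_ratio, keyword_density, avg_word_length]
X = [
    [2.1, 0.03, 5.1],  # typical page
    [2.4, 0.05, 4.8],  # typical page
    [4.6, 0.22, 4.2],  # redundant, keyword-stuffed page
    [5.3, 0.31, 3.9],  # redundant, keyword-stuffed page
]
y = [0, 0, 1, 1]  # 0 = non-spam, 1 = spam

clf = DecisionTreeClassifier().fit(X, y)
print(clf.predict([[4.8, 0.25, 4.0]]))  # likely flagged as spam -> [1]
```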
Key Insight:
Misidentifying “very few legitimate pages as spam” was a significant breakthrough. The important insight that everyone involved with SEO should take away from this is that one signal by itself can result in false positives. Using multiple signals increases the accuracy.
What this means is that SEO tests of isolated ranking or quality signals will not yield reliable results that can be trusted for making strategy or business decisions.
Takeaways
We don’t know for certain whether search engines use compressibility, but it’s an easy-to-use signal that, combined with others, could catch simple kinds of spam, like thousands of city-name doorway pages with similar content. Yet even if search engines don’t use this signal, it shows how easy it is to catch that kind of search engine manipulation and that it’s something search engines are well able to handle today.
Here are the key points of this article to keep in mind:
- Doorway pages with duplicate content are easy to catch because they compress at a higher ratio than normal web pages.
- Groups of web pages with a compression ratio above 4.0 were predominantly spam.
- Negative quality signals used by themselves to catch spam can lead to false positives.
- In this particular test, they discovered that on-page negative quality signals only catch specific types of spam.
- When used alone, the compressibility signal only catches redundancy-type spam, fails to detect other forms of spam, and leads to false positives.
- Combining quality signals improves spam detection accuracy and reduces false positives.
- Search engines today have a higher accuracy of spam detection with the use of AI like Spam Brain.
Read the research paper, which is linked from the Google Scholar page of Marc Najork:
Detecting spam web pages through content analysis
Featured Image by Shutterstock/pathdoc
New Google Trends SEO Documentation
Google Search Central published new documentation on Google Trends, explaining how to use it for search marketing. This guide serves as an easy to understand introduction for newcomers and a helpful refresher for experienced search marketers and publishers.
The new guide has six sections:
- About Google Trends
- Tutorial on monitoring trends
- How to do keyword research with the tool
- How to prioritize content with Trends data
- How to use Google Trends for competitor research
- How to use Google Trends for analyzing brand awareness and sentiment
The section about monitoring trends explains that there are two kinds of rising trends, general and specific, both of which can be useful for developing content to publish on a site.
Using the Explore tool, you can leave the search box empty and view the current rising trends worldwide, or use a drop-down menu to focus on trends in a specific country. Users can further filter rising trends by time period, category, and type of search. The results show rising trends by topic and by keyword.
To search for specific trends, users just need to enter their queries and then filter them by country, time, category, and type of search.
The section called Content Calendar describes how to use Google Trends to understand which content topics to prioritize.
Google explains:
“Google Trends can be helpful not only to get ideas on what to write, but also to prioritize when to publish it. To help you better prioritize which topics to focus on, try to find seasonal trends in the data. With that information, you can plan ahead to have high quality content available on your site a little before people are searching for it, so that when they do, your content is ready for them.”
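One way to automate that seasonal check is with the unofficial pytrends package, a third-party wrapper rather than a Google-supported API; the keyword below is a placeholder.

```python
# Sketch: pull several years of interest data and average it by month
# to see when searches usually spike, so content can be published early.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US")
pytrends.build_payload(kw_list=["snow boots"], timeframe="today 5-y")
df = pytrends.interest_over_time()

monthly = df.groupby(df.index.month)["snow boots"].mean()
print(monthly.sort_values(ascending=False).head(3))  # peak months first
```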
Read the new Google Trends documentation:
Get started with Google Trends
Featured Image by Shutterstock/Luis Molinero
All the best things about Ahrefs Evolve 2024
Hey all, I’m Rebekah and I am your Chosen One to “do a blog post for Ahrefs Evolve 2024”.
What does that entail exactly? I don’t know. In fact, Sam Oh asked me yesterday what the title of this post would be. “Is it like…Ahrefs Evolve 2024: Recap of day 1 and day 2…?”
Even as I nodded, I couldn’t get over how absolutely boring that sounded. So I’m going to do THIS instead: a curation of all the best things YOU loved about Ahrefs’ first conference, lifted directly from X.
Let’s go!
OUR HUGE SCREEN
The largest presentation screen I’ve ever seen! #ahrefsevolve pic.twitter.com/oboiMFW1TN
— Patrick Stox (@patrickstox) October 24, 2024
This is the biggest presentation screen I ever seen in my life. It’s like iMax for SEO presentations. #ahrefsevolve pic.twitter.com/sAfZ1rtePx
— Suganthan Mohanadasan (@Suganthanmn) October 24, 2024
CONFERENCE VENUE ITSELF
It was recently named the best new skyscraper in the world, by the way.
The Ahrefs conference venue feels like being in inception. #AhrefsEvolve pic.twitter.com/18Yjai1Cej
— Suganthan Mohanadasan (@Suganthanmn) October 24, 2024
I’m in Singapore for @ahrefs Evolve this week. Keen to connect with people doing interesting work on the future of search / AI #ahrefsevolve pic.twitter.com/s00UkIbxpf
— Alex Denning (@AlexDenning) October 23, 2024
OUR AMAZING SPEAKER LINEUP – SUPER INFORMATIVE, USEFUL TALKS!
A super insightful explanation of how Google Search Ranking works #ahrefsevolve pic.twitter.com/Cd1VSET2Aj
— Amanda Walls (@amandajwalls) October 24, 2024
“would I even do this if Google didn’t exist?” – what a great question to assess if you actually have the right focus when creating content amazing presentation from @amandaecking at #AhrefsEvolve pic.twitter.com/a6OKbKxwiS
— Aleyda Solis ️ (@aleyda) October 24, 2024
Attending @CyrusShepard ‘s talk on WTF is Helpful Content in Google’s algorithm at #AhrefsEvolve
“Focus on people first content”
Super relevant for content creators who want to stay ahead of the ever evolving Google search curve! #SEOTalk #SEO pic.twitter.com/KRTL13SB0g
— Parth Suba (@parthsuba77) October 24, 2024
This is the first time I am listening to @aleyda and it is really amazing. Lot of insights and actionable information.
Thank you #aleyda for power packed presentation.#AhrefsEvolve @ahrefs #seo pic.twitter.com/Xe3A9MGfrr
— Jignesh Gohel (@jigneshgohel) October 25, 2024
@thinking_slows thoughts on AI content – “it’s very good if you want to be average”.
We can do a lot better and Ryan explains how. Love it @ahrefs #AhrefsEvolve pic.twitter.com/qFqWs6QBH5
— Andy Chadwick (@digitalquokka) October 24, 2024
GREAT MUSIC
First time I’ve ever Shazam’d a track during SEO conference ambience…. and the track wasn’t even Shazamable! #AhrefsEvolve @ahrefs pic.twitter.com/ZDzJOZMILt
— Lily Ray (@lilyraynyc) October 24, 2024
AMAZING GOODIES
Ahrefs Evolveきました!@ahrefs @AhrefsJP #AhrefsEvolve pic.twitter.com/33EiejQPdX
— さくらぎ (@sakuragi_ksy) October 24, 2024
Aside from the very interesting topics, what makes this conference even cooler are the ton of awesome freebies
Kudos for making all of these happen for #AhrefsEvolve @ahrefs team pic.twitter.com/DGzk5FSTN8
— Krista Melgarejo (@kimelgarejo) October 24, 2024
Content Goblin and SEO alligator party stickers are definitely going on my laptop. @ahrefs #ahrefsevolve pic.twitter.com/QBsBuY5Yix
— Patrick Stox (@patrickstox) October 24, 2024
This is one of the best swag bags I’ve received at any conference!
Either @ahrefs actually cares or the other conference swag bags aren’t up to par w Ahrefs!#AhrefsEvolve pic.twitter.com/Yc9e6wZPHn
— Moses Sanchez (@SanchezMoses) October 25, 2024
SELFIE BATTLE
Some background: Tim and Sam have a challenge going on to see who can take the most selfies with all of you. Last I heard, Sam was winning – but there is room for a comeback yet!
Got the rare selfie with both @timsoulo and @samsgoh #AhrefsEvolve
— Bernard Huang (@bernardjhuang) October 24, 2024
THAT BELL
Everybody’s just waiting for this one.
@timsoulo @ahrefs #AhrefsEvolve pic.twitter.com/6ypWaTGDDP
— Jinbo Liang (@JinboLiang) October 24, 2024
STICKER WALL
Viva la vida, viva Seo!
Awante Argentina loco!#AhrefsEvolve pic.twitter.com/sfhbI2kWSH
— Gaston Riera. (@GastonRiera) October 24, 2024
AND, OF COURSE…ALL OF YOU!
#AhrefsEvolve let’s goooooooooooo!!! pic.twitter.com/THtdvdtUyB
— Tim Soulo (@timsoulo) October 24, 2024
–
There’s a TON more content on LinkedIn – click here – but I have limited time to get this post up and can’t quite figure out how to embed LinkedIn posts so…let’s stop here for now. I’ll keep updating as we go along!