How To See Google Search Results And Rankings For Different Locations
Every time someone enters a query into Google, the search engine applies a complex algorithm to determine precisely what the searcher is looking for.
All sorts of factors feed into this algorithm, but one of the most important is the searcher’s location. Google has been very clear about the emphasis it puts on local search.
And that’s great if you’re a neighborhood mom and pop serving a small geographical area.
But what happens if you have multiple locations? Or if you operate on a national or even international scale? Can you still rank as highly as competitors that are local to the searcher?
For the answers to these questions and more, read on.
Why Do Search Results Vary By Location?
The reason Google factors location into its rankings is pretty obvious when you think about it: In many cases, local means more relevant.
For example, if you are hankering for a mocha latte, a Google search that directs you to a coffee shop on the other side of the country isn’t very helpful; even a result from the other side of town isn’t as useful as one just around the corner.
Google’s locating capabilities are very accurate, using several sources to estimate where you are. Depending on what’s available, it considers:
- Your device location (via Wi-Fi location, cell phone triangulation, or GPS, which can pinpoint your location to around 20 meters).
- Your labeled places (i.e., the names that show up next to markers on Google Maps).
- The home address that is linked to your Google account.
- Previous activity across Google products.
- Your IP address.
Working together, these allow Google to determine where you are – and what’s within your local search radius. And this means you and someone living a block away could get different search results for the same exact query.
Now consider that 25% of people click on the very first search result, and the vast majority never make it off the first page, and you can begin to understand why ranking locally is crucial.
To accomplish this, your local search engine optimization must be on point, particularly if your business depends on physical traffic.
But how can you tell if yours is working? You could hop in your car and drive all over town (or the country) performing searches in various locations to check your ranking, but that would take forever.
Luckily Google lets you see how you rank without leaving the comfort of your desk. Here’s how to do it:
Add A Local Parameter To Your Search
Google provides a handy way to check the local map pack in specific locations. Simply perform your search in Google, then add “&near=cityname” at the end of the URL in the search bar.
For example, let’s imagine you’re doing SEO for a coffee shop with branches throughout the Pacific Northwest, but you live in Kansas City. Let’s call this imaginary business “Jitters.”
You want to see how Jitters stacks up to the competition in Seattle, so you navigate to Google and type in [coffee shops near me].
When the results page pops up, go to the end of the long URL and add “&near=Seattle.”
Press enter, and voila: You have performed a local search from 1,800 miles away.
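If you check rankings in more than one city on a regular basis, you can build these URLs programmatically instead of editing the address bar each time. Here’s a minimal sketch in Python, assuming the standard google.com/search?q= URL format and the “near” parameter described above:

```python
# A minimal sketch: build a Google search URL with the "near" parameter.
# The city value is a plain place name, URL-encoded if it contains spaces.
from urllib.parse import urlencode

def local_search_url(query: str, city: str) -> str:
    params = urlencode({"q": query, "near": city})
    return f"https://www.google.com/search?{params}"

print(local_search_url("coffee shops near me", "Seattle"))
# https://www.google.com/search?q=coffee+shops+near+me&near=Seattle
```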
Change Your Regional Settings
You can manually change your regional settings if you’re looking for a higher-level view of search results for a given location.
This gives you search results on a country level rather than providing results from your IP address or other sources.
To do this, click Settings in the bottom-right corner of Google.com and select Search Settings. This will take you to the Search Settings page (obviously).
Scroll to the bottom, and you’ll see a list of Region Settings.
Choose the region you want to use for search and save the settings. You’ll now see search results from the country you chose.
Continuing our coffee shop example, let’s say Jitters just opened a location in Lisbon. You’ll select “Portugal” as your region, allowing you to check the rankings of the new Portuguese beanery.
Note: If you don’t add the local parameter discussed earlier to the search URL, you’ll continue to see results based on your current location.
Manage Your Work & Home Locations On Google Maps
One of the great things about Google’s local search is its machine learning capabilities.
It automatically identifies places you visit often, including your home and work. And because it understands your commuting habits, it can save you lots of time and deliver more relevant search results.
Of course, it’s not perfect. Sometimes, it doesn’t realize that you left that job at the cracker factory months ago. But setting your work and home locations is easy.
Open Google Maps, click Menu, then Your Places, and choose Locations. Pick Work or Home and enter the address. Click Save, and you’re all set.
Now you can perform local searches from either location by adding the modifier [near home] or [near work] to your query.
Delete Location History In Your Google Account Activity Controls
Some people find it a little Big Brother-ish, but Google tracks your location, even when you’re not actively using a specific product from the search engine.
It does this because your location history helps it improve the accuracy and relevance of your results.
For example, if it notes you repeatedly visit a martial arts gym, it’s more likely to respond to queries about boxing with pugilism sites rather than blogs about cardboard boxes.
This is useful in many ways, but it can complicate the process of examining search results from different locations.
In 2020, Google announced it would delete users’ location history after 18 months, but if you can’t wait that long, deleting it or turning the service off is easy.
Go to the “Location History” section of your Google account, and you can toggle it on and off.
If you want to use location history on one device but not another, you can change that from this page. You can manually delete all or some of your location history from your browser or Google Maps.
You should be aware that if you delete this information, you’ll lose some personalized information like recommendations based on places you’ve been, traffic reports, and automatically generated Google Photo albums.
Override Your Location With Google Chrome Developer Tools
If you’re more tech-savvy, you can also check search engine results by overriding your location using developer tools in the Chrome browser.
To do this, open Chrome DevTools, then open the Command Menu (Ctrl + Shift + P, or Cmd + Shift + P on Mac). Type “sensors,” select Show Sensors, and press Enter.
From the “Geolocations” list, choose one of the preset cities or choose “Custom Location.” If you opt for the latter, you can enter longitude and latitude coordinates for precise positioning.
You can also select “Location Unavailable” to see how your site works when a user’s location is unknown.
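If you find yourself doing this regularly, the same override can be scripted. Here’s a rough sketch using Selenium with Chrome, which sends the same DevTools command (Emulation.setGeolocationOverride) that the Sensors panel uses; the coordinates below are approximate values for Seattle and are only an example:

```python
# A sketch assuming Selenium 4 and ChromeDriver are installed.
# Emulation.setGeolocationOverride is the DevTools command behind the Sensors panel.
from selenium import webdriver

driver = webdriver.Chrome()

# Approximate coordinates for downtown Seattle; swap in any latitude/longitude.
driver.execute_cdp_cmd(
    "Emulation.setGeolocationOverride",
    {"latitude": 47.6062, "longitude": -122.3321, "accuracy": 100},
)

driver.get("https://www.google.com/search?q=coffee+shops+near+me")
# Inspect the results page here, then clean up.
driver.quit()
```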
Change Location Settings On Your Device
Some mobile devices allow users to change their location under the Settings tab. Others require you to be a little cleverer.
The easiest way to check search results from a different location is to use a GPS-changing app.
Several of these are available on both the App Store and Google Play. Most work by overriding the GPS location your device reports, letting you perform searches from your location of choice.
Experiment With The Google Ad Preview And Diagnosis Tool
Google’s Ad Preview and Diagnosis tool is a great way to see how your paid ads appear in local searches, but did you know you can also use it to view Google searches from different locations?
Open the tool and select Location in the dropdown menu. Now enter your desired location. You can view by country, city, or zip code, so you can get a high-level or precise view, depending on your needs.
You can also change the type of device to check if you’re performing as well on mobile as you are on desktop.
View Local Search Results With Valentin.app
Valentin.app is a free online tool that lets you check search engine rankings from an exact location without any additional tools or data sources.
It’s extremely easy to use. Type in your keywords, select your region and language, and enter an address.
The address you enter is then converted into a geolocation and sent to Google along with your other inputs.
Valentin.app then opens a localized search engine results page from Google in a new tab.
Use A VPN To Change Your Location
Another way to remove your true location from the search equation is by masking your IP address.
One of the most common and simplest ways to do this is with a virtual private network (VPN).
Long used by pirates (the virtual kind, not the swashbuckling ones), VPNs mask your IP address by routing your traffic through secure servers located elsewhere. (Please note: Search Engine Journal in no way condones intellectual property theft or breaking the law in any way, so don’t call us if you need bail money).
VPNs have legitimate uses, of course, including protecting you from hackers, securing your data, and circumventing those annoying YouTube blockers that restrict certain videos in your country. And they’re also an excellent way to get search results from a different location.
The drawback to doing this is that most VPNs only have a handful of IP locations to choose from. So, if you’re looking to see exactly how your coffee shop ranks in Vancouver-based searches, you may be out of luck.
Automate With Local Rank Checking Tools
Tracking local search results pages for a business with two locations is quite manageable on your own. But what if our pretend coffee company gets acquired by a company that wants to take Jitters global?
You’ll drive yourself nuts trying to manage local searches at each of the company’s 315 worldwide locations. No need to worry – platforms exist to solve just this problem.
Called rank checking tools, they can automate local searches and generate reports so you can decide where your SEO efforts can be best applied.
You may already be familiar with some of these tools.
Location Is Everything
Google results are different for different people in different locations on different devices. And this means it’s incredibly difficult to take a one-size-fits-all approach to search engine optimization.
With Google’s emphasis on local search, it’s crucial that you’re showing up to people in the neighborhood, whether you’re managing a single location, doing SEO remotely, or running a website for a business with multiple locations.
Luckily, you don’t have to actually be in that neighborhood to see what local searchers are getting on search engine results pages. There are several ways you can see how you’re ranking from different locations, each with its own advantages and drawbacks.
No matter which one you feel is best for your needs, the ability to adjust your SEO to target customers within a specific area is something you can’t afford to neglect.
Featured Image: Antonio Guillem/Shutterstock
8% Of Automattic Employees Choose To Resign
WordPress co-founder and Automattic CEO Matt Mullenweg announced today that he offered Automattic employees the chance to resign with severance pay, and 8.4 percent of them took him up on it. Mullenweg offered $30,000 or six months of salary, whichever is higher, and a total of 159 people accepted the offer.
Reactions Of Automattic Employees
Given the recent controversies surrounding Mullenweg, one might be tempted to view the walkout as a vote of no confidence. But that would be a mistake: some of the departing employees praised Mullenweg or simply announced their resignations without comment, while many others tweeted about how happy they are to stay at Automattic.
One former employee tweeted that he was sad about recent developments but also praised Mullenweg and Automattic as an employer.
He shared:
“Today was my last day at Automattic. I spent the last 2 years building large scale ML and generative AI infra and products, and a lot of time on robotics at night and on weekends.
I’m going to spend the next month taking a break, getting married, and visiting family in Australia.
I have some really fun ideas of things to build that I’ve been storing up for a while. Now I get to build them. Get in touch if you’d like to build AI products together.”
Another former employee, Naoko Takano, spent 14 years at Automattic as an organizer of WordCamp conferences in Asia, a full-time WordPress contributor, and Open Source Project Manager. She announced on X (formerly Twitter) that today was her last day at the company.
She tweeted:
“Today was my last day at Automattic.
I’m actively exploring new career opportunities. If you know of any positions that align with my skills and experience!”
Naoko’s role at WordPress was working with the global WordPress community to improve contributor experiences through the Five for the Future and Mentorship programs. Five for the Future is an important WordPress program that encourages organizations to donate 5% of their resources back to WordPress. It is also one of the issues Mullenweg raised against WP Engine, asserting that the company didn’t donate enough back to the community.
Mullenweg himself found it bittersweet to see those employees go, writing in a blog post:
“It was an emotional roller coaster of a week. The day you hire someone you aren’t expecting them to resign or be fired, you’re hoping for a long and mutually beneficial relationship. Every resignation stings a bit.
However now, I feel much lighter. I’m grateful and thankful for all the people who took the offer, and even more excited to work with those who turned down $126M to stay. As the kids say, LFG!”
Read the entire announcement on Mullenweg’s blog.
Featured Image by Shutterstock/sdx15
YouTube Extends Shorts To 3 Minutes, Adds New Features
YouTube expands Shorts to 3 minutes, adds templates, AI tools, and the option to show fewer Shorts on the homepage.
- YouTube Shorts will allow 3-minute videos.
- New features include templates, enhanced remixing, and AI-generated video backgrounds.
- YouTube is adding a Shorts trends page and comment previews.
How To Stop Filter Results From Eating Crawl Budget
Today’s Ask An SEO question comes from Michal in Bratislava, who asks:
“I have a client who has a website with filters based on a map locations. When the visitor makes a move on the map, a new URL with filters is created. They are not in the sitemap. However, there are over 700,000 URLs in the Search Console (not indexed) and eating crawl budget.
What would be the best way to get rid of these URLs? My idea is keep the base location ‘index, follow’ and newly created URLs of surrounded area with filters switch to ‘noindex, no follow’. Also mark surrounded areas with canonicals to the base location + disavow the unwanted links.”
Great question, Michal, and good news! The answer is an easy one to implement.
First, let’s look at what you’re trying to do and apply it to other situations, like ecommerce and publishers, so more people can benefit. Then we’ll go through the strategies you listed above and end with the solution.
What Crawl Budget Is And How Parameters Are Created That Waste It
If you’re not sure what Michal means by crawl budget, it’s a term some SEO pros use to describe the fact that Google and other search engines will only crawl so many pages on your website before they stop.
If your crawl budget is used on low-value, thin, or non-indexable pages, your good pages and new pages may not be found in a crawl.
If they’re not found, they may not get indexed or refreshed. If they’re not indexed, they cannot bring you SEO traffic.
This is why optimizing a crawl budget for efficiency is important.
Michal shared an example of how URLs that are “thin” from an SEO point of view get created as customers use filters.
The experience for the user is value-adding, but from an SEO standpoint, a location-based page would be better. This applies to ecommerce and publishers, too.
Ecommerce stores will have searches for colors like red or green and products like t-shirts and potato chips.
These create URLs with parameters just like a filter search for locations. They could also be created by using filters for size, gender, color, price, variation, compatibility, etc. in the shopping process.
The filtered results help the end user but compete directly with the collection page, and the collection would be the “non-thin” version.
Publishers have the same. Someone might be on SEJ looking for SEO or PPC in the search box and get a filtered result. The filtered result will have articles, but the category of the publication is likely the best result for a search engine.
These filtered results can get indexed because they are shared on social media, or because someone adds one as a link in a blog comment or forum post, creating a crawlable backlink. They might also be linked by a customer service employee responding to a question on the company blog, or discovered in any number of other ways.
The goal now is to make sure search engines don’t spend time crawling the “thin” versions so you can get the most from your crawl budget.
The Difference Between Indexing And Crawling
There’s one more thing to learn before we go into the proposed ideas and solutions – the difference between indexing and crawling.
- Crawling is the discovery of new pages within a website.
- Indexing is adding the pages that are worth showing to searchers to the search engine’s database of pages.
Pages can get crawled but not indexed. Indexed pages have likely been crawled and will likely get crawled again to look for updates and server responses.
But not all indexed pages will bring in traffic or hit the first page because they may not be the best possible answer for queries being searched.
Now, let’s go into making efficient use of crawl budgets for these types of solutions.
Using Meta Robots Or X Robots
The first solution Michal pointed out was an “index,follow” directive. This tells a search engine to index the page and follow the links on it. This is a good idea, but only if the filtered result is the ideal experience.
From what I can see, this would not be the case, so I would recommend making it “noindex,follow.”
Noindex would say, “This is not an official page, but hey, keep crawling my site, you’ll find good pages in here.”
And if you have your main menu and navigational internal links done correctly, the spider will hopefully keep crawling them.
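How you deliver that directive depends on your stack: a robots meta tag in the page’s <head>, or an X-Robots-Tag HTTP header on the response. Here’s a rough sketch assuming a Flask app where the filtered results live under a /search route; the framework and route are purely illustrative, not something from Michal’s site:

```python
# A sketch assuming Flask; any framework can add the same header to
# responses for filtered-result URLs.
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/search")
def filtered_results():
    html = (
        "<html><head>"
        # Option 1: a robots meta tag inside the page itself.
        '<meta name="robots" content="noindex, follow">'
        "</head><body>...filtered results...</body></html>"
    )
    resp = make_response(html)
    # Option 2: the same directive as an HTTP header (also works for non-HTML files).
    resp.headers["X-Robots-Tag"] = "noindex, follow"
    return resp
```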
Canonicals To Solve Wasted Crawl Budget
Canonical links are used to help search engines know what the official page to index is.
If a product exists in three categories on three separate URLs, only one should be “the official” version, so the two duplicates should have a canonical pointing to the official version. The official one should have a canonical link that points to itself. This applies to the filtered locations.
If the location search would result in multiple city or neighborhood pages, the result would likely be a duplicate of the official one you have in your sitemap.
Have the filtered results point a canonical back to the main page of filtering instead of being self-referencing if the content on the page stays the same as the original category.
If the content pulls in your localized page with the same locations, point the canonical to that page instead.
In most cases, the filtered version inherits the page you searched or filtered from, so that is where the canonical should point to.
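In code, that usually amounts to stripping the filter parameters from the current URL and using whatever is left as the canonical target. Here’s a rough sketch in Python; the parameter names are placeholders for whatever your filters actually append:

```python
# A sketch: derive a canonical target by dropping known filter parameters.
# FILTER_PARAMS is illustrative; list the parameters your site really uses.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

FILTER_PARAMS = {"filter", "lat", "lng", "color", "size", "price"}

def canonical_for(url: str) -> str:
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FILTER_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_for("https://example.com/coffee-shops?filter=location&lat=47.6&lng=-122.3"))
# https://example.com/coffee-shops
```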
If you use both noindex and a self-referencing canonical, which is overkill, you send conflicting signals.
The same applies to when someone searches for a product by name on your website. The search result may compete with the actual product or service page.
With that setup, you’re telling the spider not to index the page because it isn’t worth indexing, while also telling it the page is the official version. That doesn’t make sense.
Instead, use a canonical link, as I mentioned above, or noindex the result and point the canonical to the official version.
Disavow To Increase Crawl Efficiency
Disavowing doesn’t have anything to do with crawl efficiency unless the search engine spiders are finding your “thin” pages through spammy backlinks.
The disavow tool from Google is a way to say, “Hey, these backlinks are spammy, and we don’t want them to hurt us. Please don’t count them towards our site’s authority.”
In most cases, it doesn’t matter, as Google is good at detecting spammy links and ignoring them.
You do not want to add your own site and your own URLs to the disavow tool. You’re telling Google your own site is spammy and not worth anything.
Plus, submitting backlinks to the disavow tool won’t prevent a spider from crawling pages you don’t want crawled; it only tells Google that a link from another site is spammy.
Disavowing won’t help with crawl efficiency or saving crawl budget.
How To Make Crawl Budgets More Efficient
The answer is robots.txt. This is how you tell specific search engines and spiders what to crawl.
You can include the folders you want them to crawl by marking them as “allow,” and you can block filtered results by disallowing the “?” or “&” symbol, or whichever character your filter URLs use.
If some of those parameters should be crawled, add the main word like “?filter=location” or a specific parameter.
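As an illustration, the rules might look something like the following. The paths and the “filter” parameter name are placeholders for whatever your site actually uses, and Google honors the * wildcard shown here:

```python
# A sketch of robots.txt rules that keep crawlers on the real pages and away
# from filtered duplicates. "filter" is a placeholder parameter name.
ROBOTS_TXT = """\
User-agent: *
# Let crawlers reach the real category and location pages.
Allow: /coffee-shops/

# Block the parameterized, filtered duplicates of those pages.
Disallow: /*?filter=
Disallow: /*&filter=
"""

# Serve this as https://example.com/robots.txt (here it is just written to disk).
with open("robots.txt", "w", encoding="utf-8") as fh:
    fh.write(ROBOTS_TXT)
```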
Robots.txt is how you define crawl paths and work on crawl efficiency. Once you’ve optimized that, look at your internal links: the links from one page on your site to another.
These help spiders find your most important pages while learning what each is about.
Internal links include:
- Breadcrumbs.
- Menu navigation.
- Links within content to other pages.
- Sub-category menus.
- Footer links.
You can also use a sitemap if you have a large site and the spiders are not finding the pages you want them to prioritize.
I hope this helps answer your question. It is one I get a lot – you’re not the only one stuck in that situation.
Featured Image: Paulo Bobita/Search Engine Journal