Local SEO Strategies For Plumbers And Other Trades

Plumbers and other tradespeople are essential in modern society, yet they face one huge barrier to entering new markets: peeling customers away from the competition.

Many homeowners have established relationships with plumbers and contractors, making breaking into certain markets difficult.

Thankfully, plumbers have ways to excel in a new market, particularly when capturing new homeowners or performing emergency services where other plumbers are unavailable.

The best place to seize these opportunities is Google Search, and you can improve your visibility there using local SEO.

In this guide, you’ll find tips and tactics to help plumbers, electricians, contractors, and other tradespeople break into a new market, build a customer base, and expand their brands.

Off-Site SEO Essentials

One of the biggest sources of traffic for all businesses is ‘near me’ searches, particularly on mobile phones.

By optimizing your Google Business Profile listing and third-party directory listings, you can build your business’s exposure and increase the number of incoming calls to your business.

Google Business Profile Optimization

Your Google Business Profile shows in local search pack results and is displayed in Google Maps searches.

As a result, optimizing your Google Business Profile listing enables customers to call your business, drive to its location, or visit its website with just one click.

Screenshot from Google, July 2022

To optimize your Google Business Profile listing correctly, follow these tips and dive deeper with this guide.

  • Ensure consistent NAP information (i.e., name, address, and phone number).
  • Verify your business on Google Maps.
  • Respond to customer reviews on your profile with helpful advice or kind responses.
  • Write a description of your business and its services.
  • Use high-resolution and relevant photos to showcase your brand and company.

Local Directories

Next, you’ll want to optimize your brand’s business profile on other third-party sites that customers frequently use to find plumbers, including:

  • The Better Business Bureau.
  • Yellow Pages.
  • Angie’s List.
  • Yelp.
  • Houzz.
  • Home Advisor.
  • LinkedIn.
  • Iambuilders.com.
  • Blue Book.

While some sites like Yelp have waned over the years, these directory sites are still important business referral sources.

Some sites provide ranked lists of different businesses based on customer reviews, which can furnish social proof and trust for your business (if you get enough positive reviews).

Moz offers tools for local citation building, or you can claim each listing manually, following the tips above to optimize each profile.

Build A Social Media Presence

While engaging with customers on social media is not critical for many contractors or trades, platforms like Facebook can be valuable for customers looking for special announcements, business hours, and reviews.

Build business pages on Facebook and LinkedIn, providing high-resolution photos and clear NAP information for easy contact.

Managing Reviews

Finally, you’ll need to manage reviews on external third-party websites to build customer trust.

Generally, you should analyze each third-party site at least once a month, if not more, to see what people are saying about your business and how you can improve.

Think of reviews like personal referrals – which are already a massive source of revenue for your business.

According to one survey, 84% of customers of service businesses and tradespersons said reviews are ‘important’ or ‘very important’ in their decision-making process.

Follow these tips to manage online reviews for your plumbing business:

  • Encourage customers to leave a review after a positive service or engagement.
  • Encourage customers to leave reviews on your website and all marketing materials.
  • Respond to positive reviews.
  • Respond to negative reviews with solutions or an apology.

Don’t worry too much about negative reviews, as most customers are equally put off by overly harsh reviewers.

However, responding to negative reviews with a positive service engagement could build more customer trust over time.

It also helps you show off your customer service skills.

On-Page SEO Essentials

Now that you’ve optimized your business listing on strategic third-party websites, it’s time to optimize your website for local SEO results.

Local Keyword Research

To begin, you’ll need to conduct local keyword research to see which terms drive the most qualified traffic to your website.

Open up a free Google Ads account and use the Keyword Planner tool to search for keywords in your area.

For example, if you operate in Houston, you could use “plumbers houston” as your seed keyword and filter your search for Houston, Texas, to uncover further ideas:

Screenshot from Google Ads, July 2022

Based on this list, “plumber houston tx” and “houston plumbing services” have less competition and may be easier to win in local search results.

You can also filter this list by “top of page bid” and look at the highest bids for the most commercially relevant keywords.

You can also take a competitor from your initial keyword list and plug in their URL to see which keywords they rank for.

You can add a semantic filter to adjust your results for strictly plumbing-related keywords.

Screenshot from Google Ads, July 2022

Building a list of these keywords will be critical for optimizing meta tags on top-level pages and developing content ideas.

Meta Tag Optimization

Now that we have our list of seed keywords, we need to apply them to our web pages.

The first area will be your homepage, where you can customize the title tag to include your brand name and a seed keyword, such as “Matt’s Plumbing Company | Plumbers Houston.”

You will need to optimize the metadata on each page with relevant keyword data to make pages more likely to rank for search results.

This metadata will include the following (a sample markup sketch appears after the list):

  • Title Tag/H1: The primary keyword related to a page and the page’s topic. Title tags should be between 50–60 characters, or they will be truncated in SERPs (search engine results pages).
  • Meta Description: A brief description of your webpage, which includes your seed keyword and a call-to-action to read or find out more. Meta descriptions are ideally between 145–160 characters.
  • Header Tags: The subtopics or dividing headers across each page. Each header should include a relevant long-tail keyword.
  • URLs: Keep URLs simple: your site name followed by the page title.
  • Keyword Usage: Use seed keywords in the webpage’s introduction and at a density of roughly 1–5% throughout the rest of the copy.
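
For illustration, here’s a minimal sketch of how these tags might come together on a Houston plumber’s homepage. The business name, domain, and copy are placeholder examples, not recommendations for your specific market.

<head>
  <!-- Placeholder domain and copy; keep the title under roughly 60 characters so it isn't truncated -->
  <title>Matt's Plumbing Company | Plumbers Houston</title>
  <meta name="description" content="Licensed plumbers in Houston, TX, offering emergency plumbing, drain cleaning, and water heater repair. Call today for a free estimate.">
  <link rel="canonical" href="https://www.example-plumbing.com/plumbing-services/">
</head>
<body>
  <h1>Houston Plumbing Services</h1>
  <h2>24/7 Emergency Plumbing In Houston, TX</h2>
</body>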

Local Schema

While local keyword research will certainly help Google or Bing index your website for local search results, nothing is guaranteed.

To help search engines index your website properly, use schema markup on web pages to properly label and index them.

While schema markup can be complicated, Google’s Structured Data Tool simplifies the task.

Insert your URL and add the appropriate schema markup to ensure each page on your website is properly indexed by Google.

Some common schema markup properties that apply to a plumbing business include the following (a sample JSON-LD sketch appears after the list):

  • Geo.
  • Type.
  • Opening Hours.
  • Telephone.
  • Address.
  • Review.
  • Price.
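
As an illustration, here’s a minimal JSON-LD sketch using schema.org’s Plumber type, which covers most of the properties above. The NAP details, coordinates, and opening hours are placeholders you would replace with your real business data.

<!-- Placeholder business details; swap in your real NAP, coordinates, and hours -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Plumber",
  "name": "Matt's Plumbing Company",
  "url": "https://www.example-plumbing.com/",
  "telephone": "+1-713-555-0100",
  "priceRange": "$$",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Houston",
    "addressRegion": "TX",
    "postalCode": "77002",
    "addressCountry": "US"
  },
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": 29.7604,
    "longitude": -95.3698
  },
  "openingHours": "Mo-Sa 07:00-19:00"
}
</script>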

See our Complete Guide to Local Schema for more useful tips.

Mobile Responsiveness

Another big component of local search is mobile search.

Many local searches for your business will be conducted via smartphones, so you can’t ignore the importance of mobile.

Thankfully, most modern CMS options come with responsive web design.

However, to ensure your website runs smoothly on mobile, consider the following tips (a brief markup sketch appears after the list):

  • Compress all images.
  • Reduce clicks and leverage scrolling.
  • Keep webpages short and simple.
  • Insert click-to-call buttons and icons.
  • Limit the amount of JavaScript.
  • Avoid large videos (leverage YouTube instead!)
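
As a quick reference, here’s a small markup sketch showing a responsive viewport tag and a compressed, lazy-loaded image; the file path and dimensions are placeholders.

<!-- Placeholder image path and dimensions -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<img src="/images/water-heater-repair-800w.webp" alt="Water heater repair in Houston"
     width="800" height="600" loading="lazy">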

Site Speed Optimization

Fortunately, by optimizing for mobile, you’ll also be optimizing for page speed.

To increase page speeds, consider the following tips:

  • Minify CSS.
  • Enable file compression.
  • Use browser caching.
  • Clean up redirects.

You should still check for page speed issues using Google’s PageSpeed Insights, which offers more detailed recommendations.

Creating Consistent NAP

As a final tip for this section, it’s crucial to ensure that all contact information (name, address, phone number = NAP) and branding are consistent across all your pages.

For example, inserting a click-to-call button and your address in the top corner of each page (or the footer) will ensure customers can contact you whenever they’re ready.
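
One simple way to keep NAP consistent is to hard-code it into a shared footer template so every page pulls the same details. The markup below is a placeholder sketch with an example address and phone number.

<!-- Placeholder NAP details; reuse this block site-wide -->
<footer>
  <p class="nap">
    Matt's Plumbing Company<br>
    123 Example St, Houston, TX 77002<br>
    <a href="tel:+17135550100">Call Now: (713) 555-0100</a>
  </p>
</footer>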

Content Essentials

With your website in place and ready to rock, it’s time to build landing pages for your most important services and service regions.

Service Pages

Ideally, your site should keep navigation very simple, with a main page targeting your primary keyword (such as “Plumbers Houston”) and all auxiliary services as secondary service pages.

These pages could include services, such as:

  • Emergency Plumbing.
  • Toilet Repairs.
  • Pipe Leaks.
  • Garbage Disposal Repair.
  • Water Heater Services.
  • Sewage.
  • Drain Cleaning.
  • Gas Piping.

These should all be located under a general Plumbing Services top-navigation page where users can explore different services, find out pricing, repair specifics, etc.

Regional Pages

Another important consideration for plumbers and contractors is whether your business serves a large metro area or different locations.

For example, if you provide plumbing services to most of New York City, you could create regional pages for Manhattan, Brooklyn, and Queens with a list of various services.

Furthermore, if there is enough keyword volume to warrant those pages, you could create regional pages for neighborhoods in New York, such as the Upper East Side, Upper West Side, and SoHo.

These pages may also rank for ‘near me’ searches in those neighborhoods.

Creating A Blog

Finally, you could also consider starting a blog if you feel it will give you a leg up on the competition.

Blogs provide several benefits for local businesses, including plumbers, such as:

  • Providing customers with easy DIY tips and repairs.
  • Separating your brand from other plumbers as a local authority.
  • Providing you with engaging content to share on your social channels.
  • Ranking for long-tail keywords relevant to your business to drive traffic.

You can also consider sharing DIY and tutorial videos on platforms like YouTube, which link back to your website.

Video content is highly shareable and easy to consume, giving your business much-needed exposure.

Link Building Strategies

While we discussed ways to drive traffic from third-party websites, link building can be an exceptionally useful tool in your arsenal to give your business a leg-up on the competition.

Link building – whether through standard followed links or rel=nofollow links – directs qualified traffic to your site and customers to your business.

If you don’t have the money to spend on a tool like Ahrefs, here are some free local link building tips to help you drive traffic to your website:

  • Reach out to newspapers in your town that list local contractors and ask for a link to your site.
  • Reach out to local bloggers who have interviewed local businesses in your area to contribute a quote or interview.
  • Write guest posts on websites with local influencers to contribute some DIY repair tips and other related content.
  • Sponsor a local team, volunteer, or host an event that gives journalists and bloggers a reason to write about your business.

These tips will help give your brand exposure, which, in turn, will drive more customers to your business over time.

Digital Advertising

Online advertising can also be very effective if you want to drive quick exposure and calls to your business.

For example, advertising on Google Ads allows you to bid on the keywords you’re trying to rank for, so your website shows up above the local search pack.

Screenshot from Google Ads, July 2022

While I suggest enlisting help or taking a course to begin advertising on Google, here are a few helpful tips to help you promote your plumbing business on Google:

  • Use geotargeting to narrow your audience to a specific region.
  • Insert negative keywords to reduce ad spend.
  • Leverage location-specific keywords to compete with “near me” and organic results.
  • Enable ad extensions that provide easy call options and list your business address.
  • Create landing page copy that’s relevant to your ad and includes relevant keywords, high-quality images, and an easy way to contact your business.
  • A/B test ads to see which ad copy generates the best performance.
  • Use longer tail keywords with less competition or change your bid strategy to limit CPC (cost per click).

In addition, Facebook provides sophisticated audience research tools that allow you to advertise to customers based on demographic information, such as whether they’re a homeowner, over a certain age, or own residential/commercial real estate.

Email Marketing

Finally, I want to mention email marketing, as it can be a powerful tool for local businesses.

Plumbers can benefit from email marketing by sending re-engagement emails that remind previous customers of your business, especially if you work with both residential and commercial clients.

You can send emails to advertise local promotions in your area, such as discounted inspections or preventative maintenance.

However, email marketing can be expensive, especially for a trade that relies heavily on emergency repairs.

For this reason, email marketing is not truly necessary for plumbers unless you actively create content, engage with the community, or run promotions.

Conclusion

SEO for tradespeople and their businesses involves many of the same strategies as for other local businesses.

However, tradespeople require special consideration for their unique business model.

Some forms of marketing, such as social media and email marketing, may not be as effective as advertising or reputation management.

Hopefully, by following these tips, you can establish a positive web presence for your business and start getting more phone calls.

Featured Image: Khosro/Shutterstock



How to Block ChatGPT From Using Your Website Content


There is concern about the lack of an easy way to opt out of having one’s content used to train large language models (LLMs) like ChatGPT. There is a way to do it, but it’s neither straightforward nor guaranteed to work.

How AIs Learn From Your Content

Large Language Models (LLMs) are trained on data that originates from multiple sources. Many of these datasets are open source and are freely used for training AIs.

Some of the sources used are:

  • Wikipedia
  • Government court records
  • Books
  • Emails
  • Crawled websites

There are entire portals and websites that offer datasets, giving away vast amounts of information.

One of the portals is hosted by Amazon, offering thousands of datasets at the Registry of Open Data on AWS.

Screenshot from Amazon, January 2023

The Amazon portal, with its thousands of datasets, is just one of many portals that contain even more datasets.

Wikipedia lists 28 portals for downloading datasets, including Google Dataset Search and Hugging Face, where you can find thousands of datasets.

Datasets of Web Content

OpenWebText

A popular dataset of web content is called OpenWebText. OpenWebText consists of content from URLs found in Reddit posts that had at least three upvotes.

The idea is that these URLs are trustworthy and will contain quality content. I couldn’t find information about a user agent for their crawler; it may simply identify itself as Python, but I’m not sure.

Nevertheless, we do know that if your site is linked from Reddit with at least three upvotes, there’s a good chance that your site is in the OpenWebText dataset.

More information about OpenWebText is here.

Common Crawl

One of the most commonly used datasets for Internet content is offered by a non-profit organization called Common Crawl.

Common Crawl data comes from a bot that crawls the entire Internet.

The data is downloaded by organizations wishing to use the data and then cleaned of spammy sites, etc.

The name of the Common Crawl bot is CCBot.

CCBot obeys the robots.txt protocol, so it is possible to block Common Crawl with robots.txt and prevent your website data from making it into another dataset.

However, if your site has already been crawled then it’s likely already included in multiple datasets.

Nevertheless, by blocking Common Crawl it’s possible to opt out your website content from being included in new datasets sourced from newer Common Crawl data.

The CCBot User-Agent string is:

CCBot/2.0

Add the following to your robots.txt file to block the Common Crawl bot:

User-agent: CCBot
Disallow: /

An additional way to confirm whether a CCBot user agent is legitimate is to check that it crawls from Amazon AWS IP addresses.

CCBot also obeys the nofollow robots meta tag directives.

Use this in your robots meta tag:

<meta name="robots" content="nofollow">

Blocking AI From Using Your Content

Search engines allow websites to opt out of being crawled. Common Crawl also allows opting out. But there is currently no way to remove one’s website content from existing datasets.

Furthermore, research scientists don’t seem to offer website publishers a way to opt out of being crawled.

The article, Is ChatGPT Use Of Web Content Fair? explores the topic of whether it’s even ethical to use website data without permission or a way to opt out.

Many publishers may appreciate it if, in the near future, they are given more say in how their content is used, especially by AI products like ChatGPT.

Whether that will happen is unknown at this time.

Featured image by Shutterstock/ViDI Studio



Google’s Mueller Criticizes Negative SEO & Link Disavow Companies


John Mueller recently made strong statements against SEO companies that provide negative SEO and other agencies that provide link disavow services outside of the tool’s intended purpose, saying that they are “cashing in” on clients who don’t know better.

Many frequently say that Mueller and other Googlers are ambiguous, even on the topic of link disavows.

The fact, however, is that Mueller and other Googlers have consistently recommended against using the link disavow tool.

This may be the first time Mueller actually portrayed SEOs who liberally recommend link disavows in a negative light.

What Led to John Mueller’s Rebuke

The context of Mueller’s comments about negative SEO and link disavow companies started with a tweet by Ryan Jones (@RyanJones).

Ryan tweeted that he was shocked at how many SEOs regularly offer disavowing links.

He tweeted:

“I’m still shocked at how many seos regularly disavow links. Why? Unless you spammed them or have a manual action you’re probably doing more harm than good.”

The reason Ryan is shocked is that Google has consistently recommended the tool only for disavowing paid or spammy links that the site owners (or their SEOs) are responsible for.

And yet, here we are, eleven years later, and SEOs are still misusing the tool to remove other kinds of links.

Here’s the background information about that.

Link Disavow Tool

In the mid-2000s, prior to the Penguin Update in April 2012, there was a thriving open market for paid links. The commerce in paid links was staggering.

I knew of one publisher with around fifty websites who received a $30,000 check every month for hosting paid links on his site.

Even though I advised my clients against it, some of them still purchased links because they saw everyone else was buying them and getting away with it.

The Penguin Update caused the link-selling boom to collapse.

Thousands of websites lost rankings.

SEOs and affected websites strained under the burden of having to contact all the sites from which they purchased paid links to ask to have them removed.

So some in the SEO community asked Google for a more convenient way to disavow the links.

Months went by and after resisting the requests, Google relented and released a disavow tool.

Google cautioned from the very beginning to only use the tool for disavowing links that the site publishers (or their SEOs) are responsible for.

The first paragraph of Google’s October 2012 announcement of the link disavow tool leaves no doubt on when to use the tool:

“Today we’re introducing a tool that enables you to disavow links to your site.

If you’ve been notified of a manual spam action based on ‘unnatural links’ pointing to your site, this tool can help you address the issue.

If you haven’t gotten this notification, this tool generally isn’t something you need to worry about.”

The message couldn’t be clearer.

But at some point in time, link disavowing became a service applied to random and “spammy looking” links, which is not what the tool is for.

Link Disavow Takes Months To Work

There are many anecdotes about link disavows that helped sites regain rankings.

They aren’t lying; I know credible and honest people who have made this claim.

But here’s the thing: John Mueller has confirmed that the link disavow process takes months to work its way through Google’s algorithm.

Sometimes unrelated things simply happen around the same time – there’s no correlation; it just looks that way.

John shared how long it takes for a link disavow to work in a Webmaster Hangout:

“With regards to this particular case, where you’re saying you submitted a disavow file and then the ranking dropped or the visibility dropped, especially a few days later, I would assume that that is not related.

So in particular with the disavow file, what happens is we take that file into account when we reprocess the links kind of pointing to your website.

And this is a process that happens incrementally over a period of time where I would expect it would have an effect over the course of… I don’t know… maybe three, four, five, six months …kind of step by step going in that direction.

So if you’re saying that you saw an effect within a couple of days and it was a really strong effect then I would assume that this effect is completely unrelated to the disavow file. …it sounds like you still haven’t figured out what might be causing this.”

John Mueller: Negative SEO and Link Disavow Companies are Making Stuff Up

Context is important to understand what was said.

So here’s the context for John Mueller’s remark.

An SEO responded to Ryan’s tweet about being shocked at how many SEOs regularly disavow links.

The person responding to Ryan tweeted that disavowing links was still important, that agencies provide negative SEO services to take down websites and that link disavow is a way to combat the negative links.

The SEO (SEOGuruJaipur) tweeted:

“Google still gives penalties for backlinks (for example, 14 Dec update, so disavowing links is still important.”

SEOGuruJaipur next began tweeting about negative SEO companies.

Negative SEO companies are those that will build spammy links to a client’s competitor in order to make the competitor’s rankings drop.

SEOGuruJaipur tweeted:

“There are so many agencies that provide services to down competitors; they create backlinks for competitors such as comments, bookmarking, directory, and article submission on low quality sites.”

SEOGuruJaipur continued discussing negative SEO link builders, saying that only high trust sites are immune to the negative SEO links.

He tweeted:

“Agencies know what kind of links hurt the website because they have been doing this for a long time.

It’s only hard to down for very trusted sites. Even some agencies provide a money back guarantee as well.

They will provide you examples as well with proper insights.”

John Mueller tweeted his response to the above tweets:

“That’s all made up & irrelevant.

These agencies (both those creating, and those disavowing) are just making stuff up, and cashing in from those who don’t know better.”

Then someone else joined the discussion:

Mueller tweeted a response:

“Don’t waste your time on it; do things that build up your site instead.”

Unambiguous Statement on Negative SEO and Link Disavow Services

A statement by John Mueller (or anyone) can appear to conflict with prior statements when taken out of context.

That’s why I placed his statements not only into their original context but also into the eleven-year history that is part of that discussion.

It’s clear that John Mueller feels that those selling negative SEO services and those providing disavow services outside of the intended use are “making stuff up” and “cashing in” on clients who might not “know better.”

Featured image by Shutterstock/Asier Romero



Source Code Leak Shows New Ranking Factors to Consider


January 25, 2023, the day that Yandex—Russia’s search engine—was hacked. 

Its complete source code was leaked online. It might not be the first time we’ve seen hacking happen in this industry, but it is one of the most intriguing, groundbreaking events in years.

But Yandex isn’t Google, so why should we care? Here’s why we do: these two search engines are very similar in how they process technical elements of a website, and this leak just showed us the 1,922 ranking factors Yandex uses in its algorithm. 

Simply put, this information is something that we can use to our advantage to get more traffic from Google.

Yandex vs Google

As I said, a lot of these ranking factors are possibly quite similar to the signals that Google uses for search.

Yandex’s algorithm shows a RankBrain analog: MatrixNet. It also seems that they are using PageRank (almost the same way as Google does), and a lot of their text algorithms are the same. Interestingly, there are also a lot of ex-Googlers working at Yandex.

So, reviewing these factors and understanding how they play into search rankings and traffic will provide some very useful insights into how search engines like Google work. No doubt, this new trove of information will greatly influence the SEO market in the months to come. 

That said, Yandex isn’t Google. The chances of Google having the exact same list of ranking factors are low – and Google may not give a particular signal the same amount of weight that Yandex does.

Still, it’s information that potentially will be useful for driving traffic, so make sure to take a look at them here (before it’s scrubbed from the internet forever).

An early analysis of ranking factors

Many of their ranking factors are as expected. These include:

  • Many link-related factors (e.g., age, relevancy, etc.).
  • Content relevance, age, and freshness.
  • Host reliability.
  • End-user behavior signals.

Some sites also get preference (such as Wikipedia). FI_VISITS_FROM_WIKI even shows that sites that are referenced by Wikipedia get plus points. 

These are all things that we already know.

But something interesting: there were several factors that I and other SEOs found unusual, such as PageRank being the 17th highest weighted factor in Yandex, and the 19th highest weighted factor being query-document relevance (in other words, how close they match thematically). There’s also karma for likely spam hosts, based on Whois information.

Other interesting factors are the average domain ranking across queries, percent of organic traffic, and the number of unique visitors.

You can also use this Yandex Search Ranking Factor Explorer, created by Rob Ousbey, to search through the various ranking factors.

The possible negative ranking factors:

Here are my thoughts on the Yandex factors that I found interesting:

FI_ADV: -0.2509284637 — this factor means having tons of adverts scattered around your page and buying PPC can affect rankings. 

FI_DATER_AGE: -0.2074373667 — this one evaluates content age, and whether your article is more than 10 years old, or if there’s no determinable date. Date metadata is important. 

FI_COMM_LINKS_SEO_HOSTS: -0.1809636391 — this can be a negative factor if you have too much commercial anchor text, particularly if the proportion of such links goes above 50%. Pay attention to anchor text distribution. I’ve written a guide on how to effectively use anchor texts if you need some help on this. 

FI_RANK_ARTROZ — outdated, poorly written text will bring your rankings down. Go through your site and give your content a refresh. FI_WORD_COUNT also shows that the number of words matters, so avoid having low-content pages.

FI_URL_HAS_NO_DIGITS, FI_NUM_SLASHES, FI_FULL_URL_FRACTION — URLs shouldn’t contain digits or too many slashes (too much hierarchy) and should, of course, contain your targeted keyword.

FI_NUM_LINKS_FROM_MP — always interlink your main pages (such as your homepage or landing pages) to any other important content you want to rank. Otherwise, it can hurt your content.

FI_HOPS — reduce the crawl depth for any pages that matter to you. No important pages should be more than a few clicks away from your homepage. I recommend keeping it to two clicks, at most. 

FI_IS_UNREACHABLE — likewise, avoid making any important page an orphan page. If it’s unreachable from your homepage, it’s as good as dead in the eyes of the search engine.

The possible positive ranking factors:

FI_IS_COM: +0.2762504972 — .com domains get a boost in rankings.

FI_YABAR_HOST_VISITORS — the more traffic you get, the more ranking power your site has. The strategy of targeting smaller, easier keywords first to build up an audience before targeting harder keywords can help you build traffic.

FI_BEAST_HOST_MEAN_POS — the average position of the host for keywords affects your overall ranking. This factor and the previous one clearly show that being smart with your keyword and content planning matters. If you need help with that, check out these 5 ways to build a solid SEO strategy.

FI_YABAR_HOST_SEARCH_TRAFFIC — this might look bad but shows that having other traffic sources (such as social media, direct search, and PPC) is good for your site. Yandex uses this to determine if a real site is being run, not just some spammy SEO project.

This one includes a whole host of CTR-related factors. 

CTR ranking factors from Yandex

It’s clear that having searchable and interesting titles that drive users to check your content out is something that positively affects your rankings.

Google is rewarding sites that help end a user’s search journey (as we know from the latest mobile search updates and even the Helpful Content update). Do what you can to answer the query early on in your article. The factor “FI_VISITORS_RETURN_MONTH_SHARE” also shows that it helps to encourage users to return to your site for more information on the topics they’re interested in. Email marketing is a handy tool here.

FI_GOOD_RATIO and FI_MANY_BAD — the percentage of “good” and “bad” backlinks on your site. Getting your backlinks from high-quality websites with traffic is important for your rankings. The factor FI_LINK_AGE also shows that adding a link-building strategy to your SEO as early as possible can help with your rankings.

FI_SOCIAL_URL_IS_VERIFIED — that little blue check has actual benefits now. Links from verified accounts have more weight.

Key Takeaway

Because Yandex and Google are, in theory, so similar to each other, this data leak is something we must pay attention to.

Several of these factors may already be common knowledge amongst SEOs, but having them confirmed by another search engine reinforces how important they are for your strategy.

These initial findings, and understanding what it might mean for your website, can help you identify what to improve, what to scrap, and what to focus on when it comes to your SEO strategy. 
