
SEO Woe or a Load of Baloney?


Toxic backlinks are links that some SEO tools say could hurt your website’s Google rankings. The implication is that you should disavow them to keep your site safe.

But there’s some disagreement and confusion among SEOs as to whether “toxic” links are actually a thing and what, if anything, you should do about them. 

If you believe Google’s John Mueller, they’re not: 

Yet, according to my poll, the majority (just!) of SEOs think they are: 

So… what’s the deal here? Are toxic backlinks actually a thing? Are they hurting your site? And if so, what should you be doing about them? 

Before we can answer those questions, we need to understand the terminology… 

Every website has some spammy backlinks that just don’t make sense. But that doesn’t necessarily make them manipulative or “toxic.”

For example, here are a couple of obviously spammy links to our site: 

Example of spammy links, via Ahrefs' Site Explorer

We didn’t build or buy either of these, so they’re not “manipulative” by definition. They’re just low-quality links we’ve attracted over time because the internet is rife with spammers. 

If you study Google’s link spam documentation carefully, you’ll see that, in theory, these aren’t the kind of spammy links they have a problem with. They warn only against the ill effects of spam links intended to manipulate rankings. 

Google uses links as an important factor in determining the relevancy of web pages. Any links that are intended to manipulate rankings in Google Search results may be considered link spam. This includes any behavior that manipulates links to your site or outgoing links from your site. 

Here are the examples Google gives of these manipulative links: 

What Google says are manipulative links

As for “toxic backlinks,” this is just a term made up by certain SEO tools to describe backlinks they think could hurt your rankings based on several so-called “markers.”

Key takeaway

  • Spammy links are low-quality links that every site attracts through no fault of their own. 
  • Manipulative links are links built or bought solely to improve Google rankings. 
  • Toxic links are links that certain SEO tools say could hurt your website’s rankings. 

If you asked this question before September 2016, the answer would have likely been “yes.”

So what changed? 

Penguin 4.0.

With this algorithm update, Google switched from demoting pages to a system that tries to ignore bad links.

Penguin is now more granular. Penguin now devalues spam by adjusting ranking based on spam signals, rather than affecting ranking of the whole site. 

Since then, Google’s stance has been that you can ignore spammy backlinks. 

If you’re seeing individual links that pop up and you say, “oh this looks like a spammer dropped the link” or whatever, I would completely ignore those. […] because these spammy links happen to every website and Google’s system has seen them so many times over the years that we’re very good at just ignoring them. 

John Mueller

But is this true? Is Google really as good at ignoring low-level spam as we’re made to believe? 

Judging by my colleague Chris’s recent poll on LinkedIn, a good chunk of SEOs (38%) don’t think so, as they’re still disavowing them. 

Most SEOs either disavow or do nothing about spammy backlinks

Does that mean they’re right to do so? Not necessarily. It just means they don’t fully trust Google’s claim that these links do no harm. They’re being careful. 

Personally, the person I trust most to answer this question in 2024 is Dr. Marie Haynes. I don’t think anyone’s done more research into this than her. She’s spent well over a decade working to understand Google’s search algorithms and auditing link profiles on behalf of business owners. 

Now, the interesting part of that statement (and why I actually trust her!) is the obvious conflict of interest. Until fairly recently, she made her living selling link audit and disavow file creation services—and for a pretty hefty sum at that! 

Pricing from Marie’s link audit services page in March 2023

Clearly, it would be good news for Marie if Google were still terrible at ignoring spammy backlinks because she could sell more link audits! 

Yet, these days, she no longer appears to offer such services. In fact, she’s actually been warning folks against the need to disavow low-quality, spammy backlinks for a few years. 

Here’s a quote from a 2022 blog post of hers:

While there is no harm in disavowing low quality spammy links, it likely does not help improve rankings. We believe that Google’s algorithms are already ignoring these links. […]. When we do see improvements these days after disavowing, it is always in sites where we have disavowed links that were purposely made for SEO and very little else. 

Marie Haynes

It’s clear that Marie is being cautious with her words here. But overall, her opinion after digging into this for many years seems to be that, yes, Google is now pretty good at ignoring most low-quality spammy links. 

Does that mean they’re perfect? No. But it does mean that worrying about obvious low-quality link spam is probably a waste of time for most people.

If you’re buying or building the types of links that Google classifies as “link spam,” then yes, they can absolutely hurt your rankings.

But before you panic about that link exchange you did with your best friend’s wife’s brother, Google is likely looking for patterns of manipulation here. In other words, manipulative link profiles rather than manipulative individual links: 

Danny Richman, founder of Richman SEO Training, agrees: 

Here’s a bit more context from Danny: 

As for Marie Haynes, she echoes a similar sentiment in this post. She states that manual actions aside, she would only recommend a client disavow links if they have “a very large number of links that [they] feel the webspam team would consider to be ‘manipulative.’ ”

In these cases, Google often slaps the worst offenders with an unnatural links manual action. If you get one of those, that’s Google telling you, “Hey… you’re being demoted in search because we think you’ve been trying to game the system with manipulative links.” 

But this doesn’t have to happen for manipulative links to be a problem. It’s possible for Google to algorithmically demote a site if they detect a large volume of spammy and manipulative links, at least according to John Mueller.

If we see a very strong pattern [of spammy links] there, then it can happen that our algorithms say well, we really have kind of lost trust with this website and at the moment based on the bigger picture on the web, we kind of need to be more on almost a conservative side when it comes to understanding this website’s content and ranking it in the search results. And then you can see kind of a drop in the visibility there. 

John Mueller

Either way, the point remains: it’s patterns of manipulation that are likely to hurt rankings. There’s very little chance that you need to worry about the odd potentially dodgy link here and there. 

While it might be tempting to use an SEO tool that finds “toxic backlinks” for you, I’d seriously urge you to reconsider. Trusting these can do more harm than good. Way more. 

Just look at this unfortunate Redditor’s reply to John Mueller: 

Someone on Reddit's traffic tanked 60% after disavowing "toxic" backlinks in one SEO tool
A 60% drop in traffic! That’s no joke! 

Even if this is an extreme case, worrying about these links likely only wastes time because, according to Marie Haynes, they’re rarely truly toxic: 

I find that the truly toxic links…the ones that could have the potential to harm your site algorithmically (although you’d have to really overdo it, as I’ll describe below), are rarely returned by an SEO tool. 

Marie Haynes

Sam McRoberts, CEO of VUVU Marketing, seems to agree: 

So… how do you find truly toxic backlinks that are likely to be hurting your site? 

The truth? You might not even need to look for them. If you haven’t built or bought links that Google considers link spam at any reasonable scale, chances are you’re good. 

If you’re not confident about that, do a manual backlink audit with a tool like Ahrefs’ Site Explorer.

The Anchors report is a good starting point if you’ve never done this. It shows you the words and phrases people use when linking to you. If they look unnatural or over-optimized (lots of exact matches of keywords you’re trying to rank for), that could be a sign you have paid or other links intended to manipulate rankings. 

Example of keyword-rich anchors, which are often a sign of paid backlinks
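If you prefer to sanity-check the numbers rather than scroll the report, a rough sketch like this can summarize an exported anchors list. The `(anchor_text, link_count)` tuple shape is an assumption; adapt it to whatever columns your tool's CSV export actually has:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Return each normalized anchor text's share of total links.

    `anchors` is a list of (anchor_text, link_count) pairs -- roughly
    the shape of a backlink tool's anchors export (columns vary).
    """
    counts = Counter()
    for text, links in anchors:
        counts[text.strip().lower()] += links
    total = sum(counts.values())
    return {text: links / total for text, links in counts.items()}

# Hypothetical profile: a single exact-match money anchor dominating
# the profile is the pattern worth digging into, not one odd link.
sample = [
    ("best seo tool", 120),
    ("https://example.com/", 40),
    ("example.com", 30),
    ("click here", 10),
]
for anchor, share in sorted(anchor_distribution(sample).items(),
                            key=lambda kv: -kv[1]):
    print(f"{share:5.0%}  {anchor}")
```

A branded domain or bare URL dominating is normal; an exact-match commercial keyword at 60% of the profile is the kind of over-optimization the Anchors report is meant to surface.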

If things look fishy there, use the Backlinks report to dig deeper and check the context of those links. It’s usually quite easy to spot paid and unnatural ones. 

The Backlinks report in Ahrefs' Site Explorer showing the context of the backlink

Just remember that you’re looking for patterns of unnatural links, not just one or two. 

WARNING

If you’re not 100% sure what you’re looking for when doing a backlink audit, hire someone who knows what they’re doing. You need to be confident that the links are truly “toxic.”

If you have a manual action for unnatural links or a bunch of what you believe to be truly toxic backlinks, yes. Google’s advice is to disavow them (assuming you can’t get the links removed). 

You should disavow backlinks only if: 

You have a considerable number of spammy, artificial, or low-quality links pointing to your site, 

AND

The links have caused a manual action, or likely will cause a manual action, on your site. 
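If you do go down the disavow route, the file Google's disavow tool accepts is a plain UTF-8 text file with one URL or `domain:` rule per line; lines starting with `#` are comments. A sketch (the hostnames here are made up):

```txt
# Two specific pages to disavow
https://spam.example.com/stuff/comments.html
https://spam.example.com/stuff/paid-links.html

# Disavow every link from an entire domain
domain:shadyseo.example.com
```

Using `domain:` rules for clearly manipulative sites is usually safer than listing individual URLs, since spam networks tend to link from many pages of the same domain.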

Marie Haynes advises the same: 

There are two situations where we will recommend to our clients a thorough link audit followed by filing a disavow: 

  1. The site has a manual action for unnatural links in GSC. 
  2. The site has a very large number of links that we feel the webspam team would consider to be “manipulative”.
Marie Haynes

If you just have a bunch of spammy backlinks that most sites naturally attract or the odd paid backlink, probably not. Google probably ignores most, if not all, of these links, so disavowing them is likely a waste of time. 

While there is no harm in disavowing these links other than the time spent analyzing them, there is likely no benefit either. 

Marie Haynes

But what about negative SEO?

Being the victim of a negative SEO attack is indeed the possible exception here. This is when a competitor sends a load of spammy or toxic backlinks your way to try to get your site penalized. 

Google remains adamant that it basically never works, but it really comes down to what you believe. 

[I’ve] looked at hundreds of supposed cases of negative SEO, but none have actually been the real reason a website was hurt. […] While it’s easier to blame negative SEO, typically the culprit of a traffic drop is something else you don’t know about–perhaps an algorithm update or an issue with their website. 

Gary Illyes

If you see a traffic drop after an influx of backlinks in Site Explorer, I’d say that it’s at least worth a bit more investigation. 

Site with traffic drop coinciding with an influx of backlinks
This site experienced a traffic drop coinciding with an influx of referring domains. Maybe there’s benefit to disavowing here… and maybe it’s something else!

As Gary said above, something else could be to blame—but you never know. There’s always a chance that Google’s algorithms rule it was you who built or bought those backlinks to try to manipulate rankings and penalize you for it. 

If you just found a bunch of so-called “toxic backlinks” in an SEO tool, probably not. Again, most of these are probably just link spam Google already ignores. 

Here’s yet another quote from Marie Haynes backing this up: 

While there is probably no harm in disavowing [links reported as toxic in SEO tools], you are not likely to see any improvement as a result. Disavowing is meant for sites trying to remove a manual action and for those who have been actively building links for the purpose of improving rankings. 

Marie Haynes

There’s also the risk that you could end up disavowing links that are actually helping you… 

Patrick showed further evidence that this can absolutely happen when he experimented with disavowing links to the Ahrefs blog. Traffic dipped, then went back up after he removed the disavow. 

The impact of disavowing links to the Ahrefs blog

Final thoughts

“Toxic backlinks” is a term made up by certain SEO tools to scare you. That’s not to say bad links can’t hurt your site. They absolutely can. But fortunately for most site owners, it’s rarely a problem worth worrying all that much about. 

Got questions? Disagree? Ping me on X.





Google Cautions On Blocking GoogleOther Bot


Google’s Gary Illyes answered a question about the non-search features that the GoogleOther crawler supports, then added a caution about the consequences of blocking GoogleOther.

What Is GoogleOther?

GoogleOther is a generic crawler created by Google for the various purposes that fall outside of those of bots that specialize for Search, Ads, Video, Images, News, Desktop and Mobile. It can be used by internal teams at Google for research and development in relation to various products.

The official description of GoogleOther is:

“GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.”

Perhaps surprisingly, there are actually three kinds of GoogleOther crawlers.

Three Kinds Of GoogleOther Crawlers

  1. GoogleOther
    Generic crawler for public URLs
  2. GoogleOther-Image
    Optimized to crawl public image URLs
  3. GoogleOther-Video
    Optimized to crawl public video URLs

All three GoogleOther crawlers can be used for research and development purposes. That’s the one purpose Google publicly acknowledges for all three versions of GoogleOther.

What Non-Search Features Does GoogleOther Support?

Google doesn’t say which specific non-search features GoogleOther supports, probably because it doesn’t really “support” a specific feature. It exists for research and development crawling, which could be in support of a new product or an improvement to a current one. It’s a deliberately open-ended, generic purpose.

This is the question Gary read out:

“What non-search features does GoogleOther crawling support?”

Gary Illyes answered:

“This is a very topical question, and I think it is a very good question. Besides what’s in the public I don’t have more to share.

GoogleOther is the generic crawler that may be used by various product teams for fetching publicly accessible content from sites. For example, it may be used for one-off crawls for internal research and development.

Historically Googlebot was used for this, but that kind of makes things murky and less transparent, so we launched GoogleOther so you have better controls over what your site is crawled for.

That said GoogleOther is not tied to a single product, so opting out of GoogleOther crawling might affect a wide range of things across the Google universe; alas, not Search, search is only Googlebot.”

It Might Affect A Wide Range Of Things

Gary is clear that blocking GoogleOther wouldn’t have an effect on Google Search because Googlebot is the crawler used for indexing content. So if a site owner wants to block any of the three versions of GoogleOther, it should be possible to do so without a negative effect on search rankings.

But Gary also cautioned about the outcome of blocking GoogleOther, saying that it would have an effect on other products and services across Google. He didn’t state which products could be affected, nor did he elaborate on the pros and cons of blocking GoogleOther.
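For site owners who do decide to opt out, GoogleOther is blocked like any other crawler, via its user-agent token in robots.txt. The three tokens match the crawler names above; the rules here are an illustrative sketch:

```txt
# Block the generic GoogleOther crawler site-wide.
# Googlebot (and therefore Search) is unaffected.
User-agent: GoogleOther
Disallow: /

# The image and video variants have their own tokens
# and can share one rule group.
User-agent: GoogleOther-Image
User-agent: GoogleOther-Video
Disallow: /
```

A partial opt-out is also possible, e.g. `Disallow: /private/` instead of `/`, if the concern is only about specific sections of the site.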

Pros And Cons Of Blocking GoogleOther

Whether or not to block GoogleOther doesn’t necessarily have a straightforward answer. There are several considerations to whether doing that makes sense.

Pros

Inclusion in research for a future Google product that’s related to search (maps, shopping, images, a new feature in search) could be useful. Being one of the few sites chosen to test such a feature could, for example, translate into increased earnings.

Another consideration is that blocking GoogleOther to save on server resources is not necessarily a valid reason, because GoogleOther doesn’t seem to crawl often enough to make a noticeable impact.

If blocking Google from using site content for AI is a concern then blocking GoogleOther will have no impact on that at all. GoogleOther has nothing to do with crawling for Google Gemini apps or Vertex AI, including any future products that will be used for training associated language models. The bot for that specific use case is Google-Extended.

Cons

On the other hand it might not be helpful to allow GoogleOther if it’s being used to test something related to fighting spam and there’s something the site has to hide.

It’s possible that a site owner might not want to participate if GoogleOther comes crawling for market research or for training machine learning models (for internal purposes) that are unrelated to public-facing products like Gemini and Vertex.

Allowing GoogleOther to crawl a site for unknown purposes is like giving Google a blank check to use your site data in any way it sees fit, outside of training public-facing LLMs or the purposes tied to named bots like Googlebot.

Takeaway

Should you block GoogleOther? It’s a coin toss. There are potential benefits, but in general there isn’t enough information to make an informed decision.

Listen to the Google SEO Office Hours podcast at the 1:30 mark:

Featured Image by Shutterstock/Cast Of Thousands



AI Search Boosts User Satisfaction


A new study finds that despite concerns about AI in online services, users are more satisfied with search engines and social media platforms than before.

The American Customer Satisfaction Index (ACSI) conducted its annual survey of search and social media users, finding that satisfaction has either held steady or improved.

This comes at a time when major tech companies are heavily investing in AI to enhance their services.

Search Engine Satisfaction Holds Strong

Google, Bing, and other search engines have rapidly integrated AI features into their platforms over the past year. While critics have raised concerns about potential negative impacts, the ACSI study suggests users are responding positively.

Google maintains its position as the most satisfying search engine with an ACSI score of 81, up 1% from last year. Users particularly appreciate its AI-powered features.

Interestingly, Bing and Yahoo! have seen notable improvements in user satisfaction, notching 3% gains to reach scores of 77 and 76, respectively. These are their highest ACSI scores in over a decade, likely due to their AI enhancements launched in 2023.

The study hints at the potential of new AI-enabled search functionality to drive further improvements in the customer experience. Bing has seen its market share improve by small but notable margins, rising from 6.35% in the first quarter of 2023 to 7.87% in Q1 2024.

Customer Experience Improvements

The ACSI study shows improvements across nearly all benchmarks of the customer experience for search engines. Notable areas of improvement include:

  • Ease of navigation
  • Ease of using the site on different devices
  • Loading speed performance and reliability
  • Variety of services and information
  • Freshness of content

These improvements suggest that AI enhancements positively impact various aspects of the search experience.

Social Media Sees Modest Gains

For the third year in a row, user satisfaction with social media platforms is on the rise, increasing 1% to an ACSI score of 74.

TikTok has emerged as the new industry leader among major sites, edging past YouTube with a score of 78. This underscores the platform’s effective use of AI-driven content recommendations.

Meta’s Facebook and Instagram have also seen significant improvements in user satisfaction, showing 3-point gains. While Facebook remains near the bottom of the industry at 69, Instagram’s score of 76 puts it within striking distance of the leaders.

Challenges Remain

Despite improvements, the study highlights ongoing privacy and advertising challenges for search engines and social media platforms. Privacy ratings for search engines remain relatively low but steady at 79, while social media platforms score even lower at 73.

Advertising experiences emerge as a key differentiator between higher- and lower-satisfaction brands, particularly in social media. New ACSI benchmarks reveal user concerns about advertising content’s trustworthiness and personal relevance.

Why This Matters For SEO Professionals

This study provides an independent perspective on how users are responding to the AI push in online services. For SEO professionals, these findings suggest that:

  1. AI-enhanced search features resonate with users, potentially changing search behavior and expectations.
  2. The improving satisfaction with alternative search engines like Bing may lead to a more diverse search landscape.
  3. The continued importance of factors like content freshness and site performance in user satisfaction aligns with long-standing SEO best practices.

As AI becomes more integrated into our online experiences, SEO strategies may need to adapt to changing user preferences.


Featured Image: kate3155/Shutterstock



Google To Upgrade All Retailers To New Merchant Center By September


Google has announced plans to transition all retailers to its updated Merchant Center platform by September.

This move will affect e-commerce businesses globally and comes ahead of the holiday shopping season.

The Merchant Center is a tool for online retailers to manage how their products appear across Google’s shopping services.

Key Changes & Features

The new Merchant Center includes several significant updates.

Product Studio

An AI-powered tool for content creation. Google reports that 80% of current users view it as improving efficiency.

This feature allows retailers to generate tailored product assets, animate still images, and modify existing product images to match brand aesthetics.

It also simplifies tasks like background removal and image resolution enhancement.

Centralized Analytics

A new tab consolidating various business insights, including pricing data and competitive analysis tools.

Retailers can access pricing recommendations, competitive visibility reports, and retail-specific search trends, enabling them to make data-driven decisions and capitalize on popular product categories.

Redesigned Navigation

Google claims the new interface is more intuitive and cites increased setup success rates for new merchants.

The platform now offers simplified website verification processes and can pre-populate product information during setup.

Initial User Response

According to Google, early adopters have shown increased engagement with the platform.

The company reports a 25% increase in omnichannel merchants adding product offers in the new system. However, these figures have yet to be independently verified.

Jeff Harrell, Google’s Senior Director of Merchant Shopping, states in an announcement:

“We’ve seen a significant increase in retention and engagement among existing online merchants who have moved to the new Merchant Center.”

Potential Challenges and Support

While Google emphasizes the upgrade’s benefits, some retailers, particularly those comfortable with the current version, may face challenges adapting to the new system.

The upgrade’s mandatory nature could raise concerns among users who prefer the existing interface or have integrated workflows based on the current system.

To address these concerns, Google has stated that it will provide resources and support to help with the transition. This includes tutorial videos, detailed documentation, and access to customer support teams for troubleshooting.

Industry Context

This update comes as e-commerce platforms evolve, with major players like Amazon and Shopify enhancing their seller tools. Google’s move is part of broader efforts to maintain competitiveness in the e-commerce services sector.

The upgrade could impact consumers by improving product listings and providing more accurate information across Google’s shopping services.

For the e-commerce industry as a whole, it signals a continued push towards AI-driven tools and data-centric decision-making.

Transition Timeline

Google states that retailers who have not yet transitioned will be automatically upgraded by September.

The company advises users to familiarize themselves with the new features before the busy holiday shopping period.


Featured Image: BestForBest/Shutterstock
