High-Quality Links vs. Low-Quality Links: What’s the Difference?

We spend a lot of time as SEO professionals going after links.

They are often seen as the most powerful way to rank a site.

But not every link is created equal.

Over time, the search engines have adapted their algorithms to account for links in different ways, refining how links are used to judge whether a webpage is a suitable answer to a search query.

In this post, you will learn what makes a high-quality link, where to find opportunities to build them, and how to evaluate whether a link is worth the budget and effort to get it.

How Do Search Engines Use Links?

Search engines use links pointing to a webpage to both discover its existence and also determine information about it.

Google mentions in its help documentation,

“Google interprets a link from page A to page B as a vote by page A for page B. Votes cast by pages that are themselves “important” weigh more heavily and help to make other pages “important.”

Bing states in its Webmaster Help and How-To guide,

“Bing prefers to see links built organically. This essentially means the links are built by people linking to your content because they find value in your content. This is an important signal to a search engine because it is seen as a vote of confidence in the content.”

What Is Valuable About a Link?

We know that Google uses links like votes.

A link from a well-regarded website will carry more clout than one from a lesser-regarded website.

Authority

This is often discussed as “authority.”

Many SEO tools will try to assign an authority metric to a website or webpage in an attempt to quantify the value of a link from them.

An authoritative webpage linking to your webpage can be a strong signal that it is itself an authoritative source.

In essence, an authoritative website is one that is considered by the search engines to be a reputable source of information about a subject – an authority in it.

Google will, in part, look at that site’s backlinks to determine its expertise and trustworthiness in a subject.

For instance, imagine a website that is considered an expert in interior design links to a lesser-known website about interior design.

The expert site is confident enough in the lesser-known site’s content that it is willing to send its own visitors there.

That’s a good, impartial way for the search engines to determine the reputation of a site and its authority on a subject.

Relevance

Authority isn’t everything, however.

Think of it like this… you’re going on holiday to a city you’ve never visited.

Who would you rather ask for restaurant recommendations: your friend who lives in the city, or a tour guide for a city 5 hours away from it?

Your friend who lives in the city is likely more of a relevant source of information on the restaurants in the area than the tour guide who doesn’t serve that area.

You might perceive a tour guide to be more knowledgeable about good restaurants in general, but not if the city in question isn’t their area of expertise.

In a similar way, the search engines will understand the value of a website in your industry linking to your webpage.

A website that reviews restaurants will be considered a more relevant source of information about restaurants than a local community group who had an outing to a restaurant.

Both sites may have a page talking about the “best sushi restaurant in New York,” but the restaurant review website will be more relevant in helping the search engines determine what to serve as an answer for “sushi restaurant in New York.”

Authority & Relevance

The best source of a link is a website that is both considered authoritative and relevant to your website.

What Makes a Link Low-Quality?

If we think of a quality link as one that is both relevant and authoritative, then it makes sense that the lowest quality link is one that is both irrelevant and not authoritative.

These sorts of links are usually easy to come by and can be self-created or requested.

For instance, a website that allows anyone to submit a link is unlikely to have the kind of highly curated content that would make it authoritative.

The fact that anyone can add a link to the site means it isn’t likely to be particularly relevant to one industry or niche.

Links to your site from a website like this will be low-quality and generally useless.

At best, these links might have a small positive impact on your search rankings; at worst, they could be perceived as part of a manipulative linking scheme.

Google has strict guidelines on what is considered a manipulative link.

You might want to familiarize yourself with Bing and Yandex’s definitions, too.

A Word About Paid Links

We all know by now that paying for links to aid rankings is against the guidelines of most big search engines.

In a best-case scenario, the link won’t be identified as having been paid for and you won’t see a penalty from it.

However, if Google detects that you’ve acquired links from websites that sell links, you may find the webpage they point to penalized.

There are legitimate reasons why links might be placed on websites for a fee.

It’s common practice to utilize banner advertising and affiliate marketing on the internet, for example.

In these instances, Google recommends that webmasters declare the links as sponsored using the rel="sponsored" attribute.

This indicates to Googlebot that the link has been paid for and should not be used for calculating PageRank.
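As a minimal, hypothetical sketch (the URL and anchor text are invented), a paid or affiliate link declared this way might look like the following in a page’s HTML:

    <!-- Paid/affiliate link, declared so it is not counted toward PageRank -->
    <a href="https://example.com/partner-offer" rel="sponsored">Partner offer</a>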

These sorts of links have their own value for marketing and should not be discounted simply because they will not necessarily aid in search rankings.

A Word About NoFollow Links

Before Google introduced the rel="sponsored" attribute, it and other search engines relied on the rel="nofollow" attribute.

Putting a rel="nofollow" attribute into the HTML for a link tells search bots not to follow the link to its destination.

Publishers use it to stop the search engines from visiting the linked page and ascribing any benefit from the link.

So, if a high-quality page links to your webpage with a link containing a rel="nofollow" attribute, you won’t see any ranking benefit from that link.
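As a minimal, hypothetical sketch (invented URL and anchor text), such a link looks like this in the publisher’s HTML:

    <!-- The publisher does not want to pass any ranking benefit through this link -->
    <a href="https://example.com/your-page" rel="nofollow">your page</a>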

Google recently announced that it now treats this attribute as a hint, meaning it may choose to ignore it.

Even so, a “nofollow” link remains of little use for SEO link-building purposes, as link equity will generally not pass through it.

However, if people are following the link and discovering your webpage, I would argue it’s not useless at all!

What Do High-Quality Links Look Like?

Low-quality links are usually those that are either:

  • Irrelevant in helping the search engines determine your site’s authority on a subject.
  • Or actually harmful.

I’m not addressing link penalties here, or even the sorts of link-building practices that will land you in hot water. For more information on that, see Chuck Price’s article on manual actions.

The low-quality links we’re talking about here are ones that you may well be going after but aren’t benefiting your site.

High-quality links are the Holy Grail of link-building.

They’re the links you show off in your “Team Wins” Slack channel and on Twitter.

They are hard to earn.

I also want to show you some “medium-quality” links.

These are the types of links that are good to get but perhaps won’t move the needle as much as you would like.

They form a part of a healthy backlink profile but aren’t worth your whole content marketing budget to land.

Low Quality: Low Authority/Low Relevance

The sorts of links you are likely to gain that are low-quality and low-relevance are ones that require no real effort to get.

For example, simply sourcing the links and asking for them or, in some cases, adding the link yourself.

Open Directories

These directory sites are very obviously low quality when you visit them. Typically they only offer one service – advertise your website here!

You do not need to pay for a link and everyone and their dog has taken advantage of this.

Such a directory will contain links to websites in all sorts of industries, with very little rhyme or reason as to why it exists.

Do note, however, that there are reputable local business directories that can help with verifying your business’s physical address and contact details—Yelp, for instance.

These listings are useful for local citations but are unlikely to really aid in boosting your site’s rankings.

The difference between reputable local directories and generic open directories is quite obvious when you visit them.

Comment Links

Forums and blogs can be very relevant to a particular industry.

However, due to the ease with which anyone can add content to a forum page or blog comments, any links in that user-generated content are usually discounted by the search engines.

In recent SEO history, blog and forum comments were easy targets for squeezing in a link to a site.

The search engines became wise to this and started devaluing those links.

Alongside the rel="sponsored" attribute, Google released rel="ugc".

This is a way for webmasters to indicate that the links within their forums are user-generated.
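As a minimal, hypothetical sketch (invented URL and anchor text), a link left in a comment might be marked up like this; rel values can also be combined, such as rel="ugc nofollow":

    <!-- Link added by a commenter, flagged as user-generated content -->
    <a href="https://example.com/commenters-site" rel="ugc">my website</a>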

Low Quality: Low Effort & No Follow

Social Media Posts

Most large social media sites apply "nofollow" attributes to the links users post.

However, Google did recently say that "nofollow" attributes would be treated as hints rather than strict directives.

Despite this, social media sites are not the place to go looking for backlinks to help your rankings.

Although social media sites themselves are often authoritative, they are full of uncurated content.

Businesses can set up their own social media pages with links back to their websites. They can talk about their sites in their posts.

These links are not unbiased. Due to this, they are largely ignored by search engines.

Medium Quality: Low Authority but High Relevancy

Small Industry Blogs

Most industries have a proliferation of blogs: sites run by companies or individuals who want to share their knowledge and build their profiles.

There are some highly relevant, niche blogs that might not be well-known enough to attract the kind of backlinks that would boost their own authority metrics.

They are, however, full of decent content and very relevant to the website you are trying to grow.

Small industry blog writers are often less overrun with requests to share content and add links than the well-known ones.

They are, however, keen to write and build community.

A smaller blog featuring your site is still a good reinforcement of your relevance to your industry.

This can help enormously with showing your relevance to search topics associated with that industry.

Small Industry Brands

There will be some staple brands in your industry that aren’t necessarily competitors but are tangentially related.

Think of a paper manufacturer in relation to your office supply store, for example.

A link from the paper manufacturer showing your store as their distributor can help show your authority in the industry.

Medium Quality: Medium Authority & Medium/Low Relevancy

Local News Sites

Your local news site may report on anything to do with your community, or they might be more discerning.

Regardless, doing something considered locally newsworthy can get you featured far more easily than on a national news website.

These are especially good links to get if you are trying to boost your local SEO efforts.

A link from a website known as a source of reliable local information could help the search engines to see your relevance to that physical area.

High Quality: High Authority but Medium/Low Relevancy

Some sites are extremely authoritative and hard to get a link from. Links from them tend to be beneficial to your SEO efforts.

These sorts of links might not be highly relevant, however.

Although you will see a benefit to your search visibility, it may not help solidify your relevance for particular topics.

National News Sites

There are some national and international newspapers with extremely high authority websites. A link from these sites is worth the effort.

However, journalists are inundated with hundreds of press releases and article ideas every day.

It can be incredibly difficult to get featured, especially with a link.

The best way to get coverage in a national newspaper is to do something newsworthy.

Bringing it to the attention of the site’s journalists might help you get it covered, hopefully with a link back to your site.

High Quality: Medium Authority but High Relevancy

Big Industry Blogs

This is the website everyone in your industry goes to for news; your friends and family may not have heard of it, but your colleagues definitely have.

It’s likely to be a medium authority site according to authority metrics but it’s a leader in your industry.

It’s also very relevant to the website you’re promoting.

A link from a site like this will go a long way in showing your site’s expertise.

High Quality: High Authority & High Relevancy

Big Industry Brands

These are household names; the companies everyone in your industry (and possibly their families) know of.

These sites are likely to be rated medium to high authority by the tools, but they are definitely leaders in your industry.

If you are linked to as a supplier or distributor, or even just mentioned in a favorable review, you are likely to see the ranking benefit.

Conclusion

A wide and varied link profile is good for SEO.

If you are actively looking to increase links to your site in an organic manner, it’s imperative you know how to generate high-quality links.

Don’t waste your time going for easy links on unrelated and low-quality sites.

Instead, focus your energy and budget on creating truly newsworthy content and bringing it to the attention of authoritative and relevant publishers.


Google Search Leak: Conflicting Signals, Unanswered Questions


An apparent leak of Google Search API documentation has sparked intense debate within the SEO community, with some claiming it proves Google’s dishonesty and others urging caution in interpreting the information.

As the industry grapples with the allegations, a balanced examination of Google’s statements and the perspectives of SEO experts is crucial to understanding the whole picture.

Leaked Documents Vs. Google’s Public Statements

Over the years, Google has consistently maintained that specific ranking signals, such as click data and user engagement metrics, aren’t used directly in its search algorithms.

In public statements and interviews, Google representatives have emphasized the importance of relevance, quality, and user experience while denying the use of specific metrics like click-through rates or bounce rates as ranking-related factors.

However, the leaked API documentation appears to contradict these statements.

It contains references to features like “goodClicks,” “badClicks,” “lastLongestClicks,” impressions, and unicorn clicks, tied to systems called Navboost and Glue, which Google VP Pandu Nayak confirmed in DOJ testimony are parts of Google’s ranking systems.

The documentation also alleges that Google calculates several metrics using Chrome browser data on individual pages and entire domains, suggesting the full clickstream of Chrome users is being leveraged to influence search rankings.

This contradicts past Google statements that Chrome data isn’t used for organic searches.

The Leak’s Origins & Authenticity

Erfan Azimi, CEO of digital marketing agency EA Eagle Digital, alleges he obtained the documents and shared them with Rand Fishkin and Mike King.

Azimi claims to have spoken with ex-Google Search employees who confirmed the authenticity of the information but declined to go on record due to the situation’s sensitivity.

While the leak’s origins remain somewhat ambiguous, several ex-Googlers who reviewed the documents have stated they appear legitimate.

Fishkin states:

“A critical next step in the process was verifying the authenticity of the API Content Warehouse documents. So, I reached out to some ex-Googler friends, shared the leaked docs, and asked for their thoughts.”

Three ex-Googlers responded, with one stating, “It has all the hallmarks of an internal Google API.”

However, without direct confirmation from Google, the authenticity of the leaked information is still debatable. Google has not yet publicly commented on the leak.

It’s important to note that, according to Fishkin’s article, none of the ex-Googlers confirmed that the leaked data was from Google Search. Only that it appears to have originated from within Google.

Industry Perspectives & Analysis

Many in the SEO community have long suspected that Google’s public statements don’t tell the whole story. The leaked API documentation has only fueled these suspicions.

Fishkin and King argue that if the information is accurate, it could have significant implications for SEO strategies and website search optimization.

Key takeaways from their analysis include:

  • Navboost and the use of clicks, CTR, long vs. short clicks, and user data from Chrome appear to be among Google’s most powerful ranking signals.
  • Google employs safelists for sensitive topics like COVID-19, elections, and travel to control what sites appear.
  • Google uses Quality Rater feedback and ratings in its ranking systems, not just as a training set.
  • Click data influences how Google weights links for ranking purposes.
  • Classic ranking factors like PageRank and anchor text are losing influence compared to more user-centric signals.
  • Building a brand and generating search demand is more critical than ever for SEO success.

However, just because something is mentioned in API documentation doesn’t mean it’s being used to rank search results.

Other industry experts urge caution when interpreting the leaked documents.

They point out that Google may use the information for testing purposes or apply it only to specific search verticals rather than use it as active ranking signals.

There are also open questions about how much weight these signals carry compared to other ranking factors. The leak doesn’t provide the full context or algorithm details.

Unanswered Questions & Future Implications

As the SEO community continues to analyze the leaked documents, many questions still need to be answered.

Without official confirmation from Google, the authenticity and context of the information are still a matter of debate.

Key open questions include:

  • How much of this documented data is actively used to rank search results?
  • What is the relative weighting and importance of these signals compared to other ranking factors?
  • How have Google’s systems and use of this data evolved?
  • Will Google change its public messaging and be more transparent about using behavioral data?

As the debate surrounding the leak continues, it’s wise to approach the information with a balanced, objective mindset.

Unquestioningly accepting the leak as gospel truth or completely dismissing it are both shortsighted reactions. The reality likely lies somewhere in between.

Potential Implications For SEO Strategies and Website Optimization

It would be highly inadvisable to act on information shared from this supposed ‘leak’ without confirming whether it’s an actual Google search document.

Further, even if the content originates from search, the information is a year old and could have changed. Any insights derived from the leaked documentation should not be considered actionable now.

With that in mind, while the full implications remain unknown, here’s what we can glean from the leaked information.

1. Emphasis On User Engagement Metrics

If click data and user engagement metrics are direct ranking factors, as the leaked documents suggest, it could place greater emphasis on optimizing for these metrics.

This means crafting compelling titles and meta descriptions to increase click-through rates, ensuring fast page loads and intuitive navigation to reduce bounces, and strategically linking to keep users engaged on your site.
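As a simple illustration (the page and wording are invented), a title and meta description written with click-throughs in mind might look something like this:

    <head>
      <title>Best Sushi Restaurants in New York: A Local's Guide</title>
      <meta name="description" content="A local's picks for the best sushi restaurants in New York, with neighborhoods, price ranges, and what to order at each.">
    </head>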

Driving traffic through other channels like social media and email can also help generate positive engagement signals.

However, it’s important to note that optimizing for user engagement shouldn’t come at the expense of creating reader-focused content. Gaming engagement metrics is unlikely to be a sustainable, long-term strategy.

Google has consistently emphasized the importance of quality and relevance in its public statements, and based on the leaked information, this will likely remain a key focus. Engagement optimization should support and enhance quality content, not replace it.

2. Potential Changes To Link-Building Strategies

The leaked documents contain information about how Google treats different types of links and their impact on search rankings.

This includes details about the use of anchor text, the classification of links into different quality tiers based on traffic to the linking page, and the potential for links to be ignored or demoted based on various spam factors.

If this information is accurate, it could influence how SEO professionals approach link building and the types of links they prioritize.

Links that drive real click-throughs may carry more weight than links on rarely visited pages.

The fundamentals of good link building still apply—create link-worthy content, build genuine relationships, and seek natural, editorially placed links that drive qualified referral traffic.

The leaked information doesn’t change this core approach but offers some additional nuance to be aware of.

3. Increased Focus On Brand Building and Driving Search Demand

The leaked documents suggest that Google uses brand-related signals and offline popularity as ranking factors. This could include metrics like brand mentions, searches for the brand name, and overall brand authority.

As a result, SEO strategies may emphasize building brand awareness and authority through both online and offline channels.

Tactics could include:

  • Securing brand mentions and links from authoritative media sources.
  • Investing in traditional PR, advertising, and sponsorships to increase brand awareness.
  • Encouraging branded searches through other marketing channels.
  • Optimizing for higher search volumes for your brand vs. unbranded keywords.
  • Building engaged social media communities around your brand.
  • Establishing thought leadership through original research, data, and industry contributions.

The idea is to make your brand synonymous with your niche and build an audience that seeks you out directly. The more people search for and engage with your brand, the stronger those brand signals may become in Google’s systems.

4. Adaptation To Vertical-Specific Ranking Factors

Some leaked information suggests that Google may use different ranking factors or algorithms for specific search verticals, such as news, local search, travel, or e-commerce.

If this is the case, SEO strategies may need to adapt to each vertical’s unique ranking signals and user intents.

For example, local search optimization may focus more heavily on factors like Google My Business listings, local reviews, and location-specific content.
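One established way to expose that kind of location data to search engines is schema.org LocalBusiness markup. The sketch below is purely illustrative, with an invented business, and is not something the leaked documents themselves describe:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Sushi Bar",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example Ave",
        "addressLocality": "New York",
        "addressRegion": "NY",
        "postalCode": "10001"
      },
      "telephone": "+1-212-555-0100",
      "url": "https://www.example.com"
    }
    </script>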

Travel SEO could emphasize collecting reviews, optimizing images, and directly providing booking/pricing information on your site.

News SEO requires focusing on timely, newsworthy content and optimized article structure.

While the core principles of search optimization still apply, understanding your particular vertical’s nuances, based on the leaked information and real-world testing, can give you a competitive advantage.

The leaks suggest a vertical-specific approach to SEO could give you an advantage.

Conclusion

The Google API documentation leak has created a vigorous discussion about Google’s ranking systems.

As the SEO community continues to analyze and debate the leaked information, it’s important to remember a few key things:

  1. The information isn’t fully verified and lacks context. Drawing definitive conclusions at this stage is premature.
  2. Google’s ranking algorithms are complex and constantly evolving. Even if entirely accurate, this leak only represents a snapshot in time.
  3. The fundamentals of good SEO – creating high-quality, relevant, user-centric content and promoting it effectively – still apply regardless of the specific ranking factors at play.
  4. Real-world testing and results should always take precedence over theorizing based on incomplete information.

What To Do Next

As an SEO professional, the best course of action is to stay informed about the leak.

Because details about the document remain unknown, it’s not a good idea to consider any takeaways actionable.

Most importantly, remember that chasing algorithms is a losing battle.

The only winning strategy in SEO is to make your website the best result for your message and audience. That’s Google’s endgame, and that’s where your focus should be, regardless of what any particular leaked document suggests.




Google’s AI Overviews Shake Up Ecommerce Search Visibility


An analysis of 25,000 ecommerce queries by Bartosz Góralewicz, founder of Onely, reveals the impact of Google’s AI overviews on search visibility for online retailers.

The study found that 16% of ecommerce queries now return an AI overview in search results, accounting for 13% of total search volume in this sector.

Notably, 80% of the sources listed in these AI overviews do not rank organically for the original query.

“Ranking #1-3 gives you only an 8% chance of being a source in AI overviews,” Góralewicz stated.

Shift Toward “Accelerated” Product Experiences

International SEO consultant Aleyda Solis analyzed the disconnect between traditional organic ranking and inclusion in AI overviews.

According to Solis, for product-related queries, Google is prioritizing an “accelerated” approach over summarizing currently ranking pages.

She commented on Góralewicz’s findings, stating:

“… rather than providing high level summaries of what’s already ranked organically below, what Google does with e-commerce is “accelerate” the experience by already showcasing what the user would get next.”

Solis explains that for queries where Google previously ranked category pages, reviews, and buying guides, it’s now bypassing this level of results with AI overviews.

Assessing AI Overview Traffic Impact

To help retailers evaluate their exposure, Solis has shared a spreadsheet that analyzes the potential traffic impact of AI overviews.

As Góralewicz notes, this could be an initial rollout, speculating that “Google will expand AI overviews for high-cost queries when enabling ads” based on data showing they are currently excluded for high cost-per-click keywords.

An in-depth report across ecommerce and publishing is expected soon from Góralewicz and Onely, with additional insights into this search trend.

Why SEJ Cares

AI overviews represent a shift in how search visibility is achieved for ecommerce websites.

With most overviews currently pulling product data from non-ranking sources, the traditional connection between organic rankings and search traffic is being disrupted.

Retailers may need to adapt their SEO strategies for this new search environment.

How This Can Benefit You

While unsettling for established brands, AI overviews create new opportunities for retailers to gain visibility without competing for the most commercially valuable keywords.

Ecommerce sites can potentially circumvent traditional ranking barriers by optimizing product data and detail pages for Google’s “accelerated” product displays.
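One common, documented way to make product data machine-readable is schema.org Product markup. Whether Google’s AI overviews draw on it specifically isn’t established by this analysis, so treat the snippet below as an illustrative sketch with invented values:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Wireless Headphones",
      "description": "Over-ear wireless headphones with a 30-hour battery life.",
      "image": "https://www.example.com/images/headphones.jpg",
      "offers": {
        "@type": "Offer",
        "price": "129.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>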

The detailed assessment framework provided by Solis enables merchants to audit their exposure and prioritize optimization needs accordingly.


FAQ

What are the key findings from the analysis of AI overviews & ecommerce queries?

Góralewicz’s analysis of 25,000 ecommerce queries found:

  • 16% of ecommerce queries now return an AI overview in the search results.
  • 80% of the sources listed in these AI overviews do not rank organically for the original query.
  • Ranking in positions #1-3 provides only an 8% chance of being a source in AI overviews.

These insights reveal significant shifts in how ecommerce sites need to approach search visibility.

Why are AI overviews pulling product data from non-ranking sources, and what does this mean for retailers?

Google’s AI overviews prioritize “accelerated” experiences over summarizing currently ranked pages for product-related queries.

This shift focuses on showcasing directly what users seek instead of traditional organic results.

For retailers, this means:

  • A need to optimize product pages beyond traditional SEO practices, catering to the data requirements of AI overviews.
  • Opportunities to gain visibility without necessarily holding top organic rankings.
  • Potential to bypass traditional ranking barriers by focusing on enhanced product data integration.

Retailers must adapt quickly to remain competitive in this evolving search environment.

What practical steps can retailers take to evaluate and improve their search visibility in light of AI overview disruptions?

Retailers can take several practical steps to evaluate and improve their search visibility:

  • Utilize the spreadsheet provided by Aleyda Solis to assess the potential traffic impact of AI overviews.
  • Optimize product and detail pages to align with the data and presentation style preferred by AI overviews.
  • Continuously monitor changes and updates to AI overviews, adapting strategies based on new data and trends.

These steps can help retailers navigate the impact of AI overviews and maintain or improve their search visibility.



Google’s AI Overviews Go Viral, Draw Mainstream Media Scrutiny


Google’s rollout of AI-generated overviews in US search results is taking a disastrous turn, with mainstream media outlets like The New York Times, BBC, and CNBC reporting on numerous inaccuracies and bizarre responses.

On social media, users are sharing endless examples of the feature’s nonsensical and sometimes dangerous output.

From recommending non-toxic glue on pizza to suggesting that eating rocks provides nutritional benefits, the blunders would be amusing if they weren’t so alarming.

Mainstream Media Coverage

As reported by The New York Times, Google’s AI overviews struggle with basic facts, claiming that Barack Obama was the first Muslim president of the United States and stating that Andrew Jackson graduated from college in 2005.

These errors undermine trust in Google’s search engine, which more than two billion people rely on for authoritative information worldwide.

Manual Removal & System Refinements

As reported by The Verge, Google is now scrambling to manually remove the bizarre AI-generated responses and improve its systems.

A Google spokesperson confirmed that the company is taking “swift action” to remove problematic responses and using the examples to refine its AI overview feature.

Google’s Rush To AI Integration

The flawed rollout of AI overviews isn’t an isolated incident for Google.

As CNBC notes in its report, Google made several missteps in a rush to integrate AI into its products.

In February, Google was forced to pause its Gemini chatbot after it generated inaccurate images of historical figures and refused to depict white people in most instances.

Before that, the company’s Bard chatbot faced ridicule for sharing incorrect information about outer space, leading to a $100 billion drop in Google’s market value.

Despite these setbacks, industry experts cited by The New York Times suggest that Google has little choice but to continue advancing AI integration to remain competitive.

However, the challenges of taming large language models, which ingest false information and satirical posts, are now more apparent.

The Debate Over AI In Search

The controversy surrounding AI overviews adds fuel to the debate over the risks and limitations of AI.

While the technology holds potential, these missteps remind everyone that more testing is needed before unleashing it on the public.

The BBC notes that Google’s rivals face similar backlash over their attempts to cram more AI tools into their consumer-facing products.

The UK’s data watchdog is investigating Microsoft after it announced a feature that would take continuous screenshots of users’ online activity.

At the same time, actress Scarlett Johansson criticized OpenAI for using a voice likened to her own without permission.

What This Means For Websites & SEO Professionals

Mainstream media coverage of Google’s erroneous AI overviews brings the issue of declining search quality to public attention.

As the company works to address inaccuracies, the incident serves as a cautionary tale for the entire industry.

Important takeaway: Prioritize responsible use of AI technology to ensure the benefits outweigh its risks.


