GOOGLE

Google: Low Traffic Does Not Always Mean Low Quality

Google’s John Mueller answered a question about what to do with pages that have poor search visibility and minimal traffic. He acknowledged that there could be quality issues but also noted that low traffic in itself does not mean the pages themselves are low quality.

John Mueller offered solutions to the problem of low traffic web pages.

What to Do About Low Traffic Pages?

The person asking the question was concerned about hundreds of thousands of web pages that are indexed but have minimal search visibility.

He suggested that perhaps these pages lacked authority and asked whether he should de-index or canonicalize them, because he was concerned about the website’s quality score.

How Does Google Treat Low Traffic Pages in Terms of Quality?

This is the question asked:

“We have a site that has a hub and spoke architecture.

A hub page might be Eric Clapton and the spokes are what guitars he uses, and each of those pages are relatively small.

The value from them is from embedded videos or pictures with relatively little unique text content.

Over time those pages have become the majority of our indexed pages, with well over a hundred thousand.

But only a third of those are getting traffic through search.

In the past I’ve heard you say that [pages like these] can affect your website’s quality score, so we were considering de-indexing those pages …the pages that are not getting traffic…

However, we were also considering canonicalizing these.

So I was curious how Google would treat that from a quality score perspective.”
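
For readers unfamiliar with the two options being weighed, here is a minimal, hypothetical sketch (example URLs only) of what “de-indexing” a page and declaring a canonical actually look like in the page markup.

```python
# Minimal sketch of the two mechanics mentioned in the question.
# The URLs are hypothetical and only illustrate the markup involved.

def noindex_tag() -> str:
    """Robots meta tag asking search engines to drop this page from their index."""
    return '<meta name="robots" content="noindex">'

def canonical_tag(target_url: str) -> str:
    """Link element declaring target_url as the canonical version of this page."""
    return f'<link rel="canonical" href="{target_url}">'

if __name__ == "__main__":
    # Option 1: remove a thin "spoke" page from the index entirely.
    print(noindex_tag())
    # Option 2: point the thin page at a stronger "hub" page instead.
    print(canonical_tag("https://example.com/eric-clapton"))
```

Either snippet goes in the head of the thin page; Mueller’s answer below explains how Google treats each choice.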

Google Does Not Have a Quality Score for Organic Search

Many in the search industry and Google discuss site quality. Web pages, groups of web pages and entire websites can be judged to be of low quality.

But Google does not have a “quality score” for the organic search results. John Mueller affirmed this important point.

Google’s John Mueller first addressed the issue of the quality score by noting that Google does not give sites a quality score.

Mueller:

“We don’t really have a quality score, in that sense.

I think that’s something that comes from the ad side.

So that’s one thing to keep in mind there.”

How to Deal with Low Quality Web Pages

Mueller next discussed the different approaches to dealing with pages that have low search visibility.

John Mueller continued:

“I think there are multiple things to think about here.

On the one hand, I would consider taking some action if you feel that these pages are low quality.

Taking action could be something like removing those pages, improving those pages, combining those kinds of pages together.

Anything along those lines could be something that you could do if these are low quality pages.”

Low Traffic is Not a Signal of Low Quality

John Mueller next offered the insight that low search visibility is not necessarily a symptom of low quality.

The question of quality is a good one, so it’s always useful to hear what John Mueller or any other Googler has to say about page and site quality.

Mueller offered the following insights:

“If these are pages that tend not to get a lot of traffic but they’re actually useful on their own then I wouldn’t necessarily see them as low quality. That’s one thing to keep in mind.

On some websites, pages that get low traffic are often almost like correlated with low quality as well but that doesn’t have to be the case.

On other websites it might just be that a lot of traffic goes to the head pages and the tail pages are just as useful, but they’re useful for a much smaller audience.

So they get barely any traffic.

From our point of view, those websites are still useful and it’s still high quality.

I wouldn’t remove it just because it doesn’t get traffic.”

How to Fix Low Quality Pages at Scale

Mueller next discussed the difficult issue of dealing with low quality pages at scale, in this case hundreds of thousands of pages.

Mueller offered these suggestions:

“With regards to the different kinds of approaches there, when I ask the search quality teams about this, usually they say well, you should just improve the quality of your pages, which kind of makes sense…

But at the same time if you’re talking about hundreds of thousands of pages that’s really hard to do at scale.

So sometimes people do opt for removing the pages or combining the pages.

The thing to keep in mind with using a canonical to combine pages is that we only take into account the canonical page then.

So if you have one page for example about Eric Clapton’s guitars and another page about Eric Clapton’s shoes, and you say that the guitar page is the canonical for the shoes page then we wouldn’t have that shoe page or any of its content in our index anymore. We would essentially just focus on the guitars.

So if someone were searching for Eric Clapton shoes, they wouldn’t be able to find those pages at all.

So that’s (kind of) with the different approaches, something to keep in mind, so that in a case like that, what I would do is take the content from the page that you want to kind of remove or clean up and include that into kind of a bigger page and make that bigger page stronger.

And by that you’re also making sure that you still have that content indexable somewhere.”
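
Mueller’s suggestion amounts to folding a thin page’s content into a stronger hub page and making sure the old URL still leads somewhere useful. As a purely illustrative sketch (the URLs and the Apache-style redirect rules are assumptions, not something Google prescribes), a consolidation plan might be expressed like this:

```python
# Hypothetical mapping of thin "spoke" URLs to the hub pages that absorb
# their content. After the content is merged into the hub page, each old
# URL can 301-redirect to its hub so nothing useful is lost.

CONSOLIDATION_MAP = {
    "/eric-clapton/shoes": "https://example.com/eric-clapton",
    "/eric-clapton/amplifiers": "https://example.com/eric-clapton/guitars",
}

def apache_redirect_rules(mapping: dict[str, str]) -> str:
    """Emit Apache mod_alias 'Redirect 301' rules, one per consolidated page."""
    return "\n".join(
        f"Redirect 301 {old_path} {target}" for old_path, target in mapping.items()
    )

if __name__ == "__main__":
    print(apache_redirect_rules(CONSOLIDATION_MAP))
```

Unlike a cross-page canonical, this approach keeps the merged content indexable, because it now lives on the hub page that the old URL redirects to.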

Identifying Quality Issues and Traffic Issues

In a way, this question was really about two topics.

One topic was about content quality. The other concern was search traffic.

If one decouples the issue of “quality” from the concern about pages lacking search traffic, then the answer to the question of what to do with the pages becomes a little clearer.

The question becomes, “What can I do to make these pages perform better in search?”

Google’s John Mueller suggested combining the pages to make stronger pages out of hundreds of weaker pages, if the content itself is useful.

But of course, if the content is inherently useless, it’s possible to rewrite it to make it more useful, get rid of it, or redirect it to a better page on a similar topic.

Citation

Pages with Low Traffic Aren’t Always Low Quality

Watch John Mueller answer the question at the 40-second mark of the video on Searchenginejournal.com.

AI

Exploring the Evolution of Language Translation: A Comparative Analysis of AI Chatbots and Google Translate

According to an article on PCMag, while Google Translate makes translating sentences into over 100 languages easy, regular users acknowledge that there’s still room for improvement.

In theory, large language models (LLMs) such as ChatGPT are expected to bring about a new era in language translation. These models consume vast amounts of text-based training data and real-time feedback from users worldwide, enabling them to quickly learn to generate coherent, human-like sentences in a wide range of languages.

However, despite the anticipation that ChatGPT would revolutionize translation, previous experience has shown that such expectations are often inaccurate, and translation accuracy remains a challenge. To put these claims to the test, PCMag conducted a blind test, asking fluent speakers of eight non-English languages to evaluate the translation results from various AI services.

The test compared ChatGPT (both the free and paid versions) to Google Translate, as well as to other competing chatbots such as Microsoft Copilot and Google Gemini. The evaluation involved comparing the translation quality for two test paragraphs across different languages, including Polish, French, Korean, Spanish, Arabic, Tagalog, and Amharic.

In the first test conducted in June 2023, participants consistently favored AI chatbots over Google Translate. ChatGPT, Google Bard (now Gemini), and Microsoft Bing outperformed Google Translate, with ChatGPT receiving the highest praise. ChatGPT demonstrated superior performance in converting colloquialisms, while Google Translate often provided literal translations that lacked cultural nuance.

For instance, ChatGPT accurately translated colloquial expressions like “blow off steam,” whereas Google Translate produced more literal translations that failed to resonate across cultures. Participants appreciated ChatGPT’s ability to maintain consistent levels of formality and its consideration of gender options in translations.
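
To illustrate the kind of prompting involved, here is a minimal sketch; it is not PCMag’s methodology, and the model name, prompt wording, and use of the OpenAI Python SDK are assumptions. The point is that a chatbot can be asked explicitly to preserve idiom and register, which is exactly where testers felt the chatbots pulled ahead.

```python
# Minimal sketch of prompting an LLM for a nuance-preserving translation.
# Assumes the openai Python package (v1.x) and an OPENAI_API_KEY environment
# variable; the model name below is only an example.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Translate the following into French. Keep the colloquial tone, render "
    "the idiom naturally rather than literally, and use an informal register:\n\n"
    "I just need a weekend to blow off steam."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name, not the one PCMag tested
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```

A conventional translation tool receives only the sentence itself, which is one reason literal renderings of idioms like “blow off steam” can slip through.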

The success of AI chatbots like ChatGPT can be attributed to reinforcement learning with human feedback (RLHF), which allows these models to learn from human preferences and produce culturally appropriate translations, particularly for non-native speakers. However, it’s essential to note that while AI chatbots outperformed Google Translate, they still had limitations and occasional inaccuracies.

In a subsequent test, PCMag evaluated different versions of ChatGPT, including the free and paid versions, as well as language-specific AI agents from OpenAI’s GPTStore. The paid version of ChatGPT, known as ChatGPT Plus, consistently delivered the best translations across various languages. However, Google Translate also showed improvement, performing surprisingly well compared to previous tests.

Overall, while ChatGPT Plus emerged as the preferred choice for translation, Google Translate demonstrated notable improvement, challenging the notion that AI chatbots are always superior to traditional translation tools.


Source: https://www.pcmag.com/articles/google-translate-vs-chatgpt-which-is-the-best-language-translator

GOOGLE

Google Implements Stricter Guidelines for Mass Email Senders to Gmail Users

Beginning in April, senders of unwanted mass email to Gmail users will see a growing share of their messages rejected unless they comply with Google’s new email sender guidelines, the company cautions.

New Guidelines for Sending Mass Email to Gmail Inboxes

An article on Forbes highlighted that new rules are being introduced to shield Gmail users from the flood of unsolicited mass email. Initially, there were reports of certain marketers receiving error notifications for messages sent to Gmail accounts. However, a Google representative clarified that these specific errors, labeled 550-5.7.56, weren’t new but stemmed from existing authentication requirements.

Moreover, Google has confirmed that starting in April it will begin “the rejection of a portion of non-compliant email traffic, progressively escalating the rejection rate over time.” Google explains that if, for instance, 75% of a sender’s traffic meets the new email sender authentication requirements, then a portion of the remaining non-compliant 25% will face rejection. The exact proportion remains undisclosed. Google does say that the new rules will be rolled out in a “step-by-step fashion.”

This gradual approach appears to have already begun, with temporary errors affecting a “fraction of their non-compliant email traffic” coming into play this month. Additionally, Google says that bulk senders have until June 1 to implement “one-click unsubscribe” in all commercial or promotional messages.
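
For bulk senders wondering what “one-click unsubscribe” looks like at the message level, the sketch below shows the RFC 8058 headers involved, using Python’s standard library. The addresses and unsubscribe URL are hypothetical placeholders, and this is an illustration rather than Google’s own documentation.

```python
# Minimal sketch of a bulk message carrying RFC 8058 one-click unsubscribe
# headers. All addresses and URLs are hypothetical placeholders.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "newsletter@example.com"
msg["To"] = "subscriber@gmail.com"
msg["Subject"] = "March product update"

# One-click unsubscribe: an HTTPS endpoint plus the POST marker header.
msg["List-Unsubscribe"] = "<https://example.com/unsubscribe?user=12345>"
msg["List-Unsubscribe-Post"] = "List-Unsubscribe=One-Click"

msg.set_content("Hello! Here is this month's update...")

print(msg)  # prints the headers and body that would be handed to the mail server
```

The receiving provider can then honor an unsubscribe request with a single POST to the listed URL, without sending the recipient to a preferences page. Sender authentication itself (SPF, DKIM, and DMARC) is configured in DNS and at the sending infrastructure rather than in individual messages.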

Only Personal Gmail Accounts Subject to the Rejections

These changes apply only to bulk email sent to personal Gmail accounts. Bulk senders, defined as those sending at least 5,000 messages a day to Gmail accounts, will be required to authenticate their outgoing email and to refrain from sending unsolicited email. The 5,000-message threshold is calculated from emails sent from the same primary domain, regardless of whether subdomains are used. Once the threshold is reached, the domain is classified as a permanent bulk sender.

These requirements do not extend to messages sent to Google Workspace accounts, although all senders, including those using Google Workspace, must meet the updated criteria.

Stronger Security and More Control for Gmail Users

A Google spokesperson emphasized that these requirements are being rolled out to “fortify sender-side security and augment user control over inbox contents even further.” For recipients, this means greater confidence that an email really comes from the sender it claims to, reducing the risk of falling prey to phishing attempts, a tactic frequently used by malicious actors who exploit authentication gaps. “If anything,” the spokesperson concludes, “meeting these stipulations should facilitate senders in reaching their intended recipients more efficiently, with reduced risks of spoofing and hijacking by malicious actors.”

GOOGLE

Google’s Next-Gen AI Chatbot, Gemini, Faces Delays: What to Expect When It Finally Launches

In an unexpected turn of events, Google has chosen to postpone the much-anticipated debut of its revolutionary generative AI model, Gemini. Initially poised to make waves this week, the unveiling has now been rescheduled for early next year, specifically in January.

Gemini is set to redefine the landscape of conversational AI, representing Google’s most potent endeavor in this domain to date. Positioned as a multimodal AI chatbot, Gemini boasts the capability to process diverse data types. This includes a unique proficiency in comprehending and generating text, images, and various content formats, even going so far as to create an entire website based on a combination of sketches and written descriptions.

Originally, Google had planned an elaborate series of launch events spanning California, New York, and Washington. Regrettably, these events have been canceled due to concerns about Gemini’s responsiveness to non-English prompts. According to anonymous sources cited by The Information, Google’s Chief Executive, Sundar Pichai, personally decided to postpone the launch, acknowledging the importance of global support as a key feature of Gemini’s capabilities.

Gemini is expected to surpass the renowned ChatGPT, powered by OpenAI’s GPT-4 model, and preliminary private tests have shown promising results. Trained with significantly more computing power, measured in FLOPS (floating point operations per second), Gemini has reportedly outperformed GPT-4, owing to Google’s access to a multitude of high-end AI accelerators through the Google Cloud platform.

SemiAnalysis, a research firm that publishes on Substack, wrote in an August blog post that Gemini appears poised to “blow OpenAI’s model out of the water.” The extensive compute power at Google’s disposal has evidently contributed to Gemini’s superior performance.

Google’s Vice President and Manager of Bard and Google Assistant, Sissie Hsiao, offered insights into Gemini’s capabilities, citing examples like generating novel images in response to specific requests, such as illustrating the steps to ice a three-layer cake.

While Google’s current generative AI offering, Bard, has showcased noteworthy accomplishments, it has struggled to achieve the same level of consumer awareness as ChatGPT. Gemini, with its unparalleled capabilities, is expected to be a game-changer, demonstrating impressive multimodal functionalities never seen before.

During the initial announcement at Google’s I/O developer conference in May, the company emphasized Gemini’s multimodal prowess and its developer-friendly nature. An application programming interface (API) is under development, allowing developers to seamlessly integrate Gemini into third-party applications.

As the world awaits the delayed unveiling of Gemini, the stakes are high, with Google aiming to revolutionize the AI landscape and solidify its position as a leader in generative artificial intelligence. The postponed launch only adds to the anticipation surrounding Gemini’s eventual debut in the coming year.
