Google On Translated Content & Garbage Parameters In URLs

You may end up confusing Google when you have “garbage” parameters trailing in your URLs, especially when it comes to translated content parameters. There is an interesting conversation about a large multilingual site that found its translated content excluded from Google Search with a “Crawled – currently not indexed” status in Google Search Console.

The SEO seemed very knowledgeable, and he did his homework before coming to John Mueller of Google for help. John basically said this might be related to the parameter at the end with the language code. John said, “what can happen is that when we recognize that there are a lot of these parameters there that lead to the same content, then our systems can kind of get stuck in a situation of, well, maybe this parameter is not very useful and we should just ignore it.”

John then gave some tips on how to use the URL parameter tool in Search Console to help Google know that those URLs should be indexed. He also suggested using redirects and clean URLs to reinforce that when Google crawls those URLs.

Here is the video, it starts at the 53:14 mark:

Here is the transcript:

Question:

I work on a fairly large multilingual site, and in April last year, all in one go, all of our translated content moved from valid to excluded (Crawled – currently not indexed), and there it has stayed since April. Because it happened all at once, we thought maybe there was some systemic change on our side: a massive change to our hosting platform, content management system, etc. We combed through the code extensively; we can’t find anything, we can’t find any change to content, and we don’t see any notes in the Google Search release notes that look like they’d be affecting us, as far as we can tell. We’ve also been pretty thorough going through and just doing best-practice checks with Search Console. We’ve cleaned up our hreflang, canonicals, URL parameters, manual actions, and every other tool that’s listed on developers.google.com/search. I’m just about out of ideas. I don’t know what’s happened or what to do next to try to fix the issue, but I’d really like to get our translated content back in the index.


Answer:

I took a look at that briefly before and passed some of that on to the team here as well. One of the things that I think is sometimes tricky is you have the parameter at the end with the language code, I think hl equals whatever. From our point of view, what can happen is that when we recognize that there are a lot of these parameters there that lead to the same content, then our systems can kind of get stuck in a situation of, well, maybe this parameter is not very useful and we should just ignore it. And to me it sounds a lot like something along those lines happened.

And partially you can help this with the URL parameter tool in Search Console, to make sure that parameter is actually set to “I do want to have everything indexed.”

Partially, what you could also do is maybe crawl a portion of your website with, I don’t know, a local crawler, to see what kind of parameter URLs actually get picked up, and then double-check that those pages actually have useful content for those languages. In particular, a common one that I’ve seen on sites is maybe you have all languages linked up, and the Japanese version says, oh, we don’t have a Japanese version, here’s our English one instead. Then our systems could say, well, the Japanese version is the same as the English version; maybe there are some other languages the same as the English version; we should just ignore them.
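John's crawl-and-compare suggestion here can be approximated with a simple duplicate check. This is a minimal sketch that assumes the page bodies for each hl value have already been fetched; the function name and the pages mapping are illustrative, not anything from the video:

```python
import hashlib

def find_duplicate_languages(pages: dict, base_lang: str = "en") -> list:
    """Flag language versions whose body is identical to the base language's.

    ``pages`` maps a language code (the hl value) to the page body fetched
    for that value; a real check would crawl a sample of parameterized URLs
    first and feed the bodies in here.
    """
    def digest(text: str) -> str:
        # Hash whitespace-normalized text so trivial formatting
        # differences don't mask true duplicates.
        return hashlib.sha256(" ".join(text.split()).encode()).hexdigest()

    base = digest(pages[base_lang])
    return [lang for lang, body in pages.items()
            if lang != base_lang and digest(body) == base]
```

Any language code this flags is serving the same body as the base-language page, which is exactly the fallback pattern John warns can teach Google to ignore the parameter.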

And sometimes this is from links within the website, and sometimes it’s also external links, people who are linking to your site. If the parameter is at the end of your URL, then it’s very common that there’s some kind of garbage attached to the parameter as well. And if we crawl all of those URLs with that garbage and we say, oh well, this is not a valid language, here’s the English version, then it again kind of reinforces that loop where systems say, well, maybe this parameter is not so useful.

So the cleaner approach there would be, if you have kind of garbage parameters, to redirect to the cleaner ones. Or to maybe even show a 404 page and say, well, we don’t know what you’re talking about with this URL. And to really cleanly make sure that, whichever URLs we find, we actually get some useful content that is not the same as other content which we’ve already seen.
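The redirect-or-404 handling John describes could be sketched server-side along these lines. This is a minimal example under stated assumptions: the SUPPORTED_LANGS whitelist and the resolve_hl helper are hypothetical names for illustration, and a real site would wire this into its routing layer:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical whitelist -- substitute the languages your site actually serves.
SUPPORTED_LANGS = {"en", "de", "fr", "ja"}

def resolve_hl(url: str):
    """Decide how to respond to a URL carrying an ``hl`` language parameter.

    Returns a (status, location) tuple:
      (200, url)        -- hl is valid; serve the page as-is
      (301, clean_url)  -- hl carries trailing garbage; redirect to the clean form
      (404, None)       -- hl is unrecognizable; refuse rather than fall back to English
    """
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    hl = query.get("hl", "")
    if hl in SUPPORTED_LANGS:
        return 200, url
    # Strip trailing junk (e.g. an encoded quote or stray punctuation) and retry.
    stripped = hl[:2].lower()
    if stripped in SUPPORTED_LANGS:
        query["hl"] = stripped
        clean = urlunsplit(parts._replace(query=urlencode(query)))
        return 301, clean
    return 404, None
```

With this shape, a garbage-suffixed link like `?hl=ja%22` gets a 301 to the clean `?hl=ja` URL, so crawlers converge on one indexable URL per language instead of reinforcing the "maybe this parameter is not useful" loop, and genuinely bogus codes get a 404 rather than a duplicate English page.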

Forum discussion at YouTube Community.


Google Won’t Change The 301 Signals For Ranking & SEO

Gary Illyes from Google said on stage at the SERP conference last week that there is no way that Google would change how the 301 redirect signal works for SEO or search rankings. Gary added that it’s a very reliable signal.

Nikola Minkov quoted Gary Illyes as saying, “It is a very reliable signal, and there is no way we could change that signal,” when asked if a 301 redirect not working is a myth. Honestly, I am not sure of the context of this question, as it is not clear from the post on X, but here it is:

We’ve covered 301 redirects here countless times – but I never saw a myth that Google does not use 301 redirects as a signal for canonicalization or for passing signals from an old URL to the redirected URL.

Forum discussion at X.

Note: This was pre-written and scheduled to be posted today, I am currently offline for Passover.




Google Again Says Ignore Link Spam Especially To 404 Pages

I am not sure how many times Google has said that you do not need to disavow spammy links, that you can ignore link spam attacks, and that links pointing to pages that return a 404/410 do not count. But John Mueller from Google said it again.

In a thread on X, John Mueller from Google wrote, “if the links are going to URLs that 404 on your site, they’re already dropped.” “They do nothing,” he added. “If there’s no indexable destination URL, there’s no link.”

John then added, “I’d generally ignore link-spam, and definitely ignore link-spam to 404s.”

When asked if it would hurt to disavow anyway, John wrote:

It will do absolutely nothing. I would take the time to rework a holistic & forward-looking strategy for the site overall instead of working on incremental tweaks (other tweaks might do something, but you probably need real change, not tweaks).

Earlier this year, tons of SEOs noticed spammy links pointing to 404 error pages, and John said to ignore them. In 2021, Google said links to 404 pages do not count; Google also said that in 2012 and many other times.

Plus, outside of links to 404 pages, Google has said to ignore spammy links time and time again, even the toxic links: ignore them. The messaging around this changed in 2016, when Penguin 4.0 was released and Google began devaluing spammy links rather than demoting sites over them.

Here are those new posts in context:

And in general, Google says it ignores spammy links, so you should too (that part is not new), but this post from John Mueller is:

John also wrote about a similar situation on Mastodon: “Google has 2 decades of practice of ignoring spammy links. There’s no need to do anything for those links.”

Forum discussion at X.

Note: This was pre-written and scheduled to be posted today, I am currently offline for Passover.





Google Needs Very Few Links To Rank Pages; Links Are Less Important

Gary Illyes from Google spoke at the SERP Conf on Friday, and he repeated what he has said numerous times before: that Google values links a lot less today than it did in the past. He added that Google Search “needs very few links to rank pages.”

Gary reportedly said, “We need very few links to rank pages… Over the years we’ve made links less important.”

I am quoting Patrick Stox, who is quoting what he heard Gary say on stage at the event. Here is Patrick’s post, to which Gary gave a rare reply:

Gary said this a year ago, also in 2022, and other times as well. We previously covered that Google said links would likely become even less important in the future. And even Matt Cutts, the former Googler, said something similar about eight years ago; the truth is, links are weighted a lot less than they were eight years ago, and that trend continues. A couple of years ago, Google said links are not the most important Google search ranking factor.

Of course, many SEOs think Google lies about this.

Judith Lewis interviewed Gary Illyes at the SERP Conf this past Friday.

Forum discussion at X and image credit to @n_minkov.


