SEO
Site Dependence On JavaScript A Problem For Googlebot?
In a recent Google Search Central SEO office-hours hangout, a question was submitted to Google’s Search Advocate John Mueller asking if it’s bad for a website to be dependent on JavaScript for basic functionality.
Might this have a negative effect on Googlebot when it comes to crawling and indexing?
Mueller observed that it’s probably fine but also suggested things to do to make sure that both Google and users have no problems with the site.
Site Is Not User Friendly Without JavaScript
The person asking the question noted that a great deal of functionality of the site depended on JavaScript and was concerned about the impact on both user-friendliness and SEO-friendliness.
This is the question:
“Our website is not very user friendly if JavaScript is turned off.
Most of the images are not loaded. Our flyout menu can’t be opened.
However the Chrome Inspect feature, in there all menu links are there in the source code.
Might our dependence on JavaScript still be a problem for Googlebot?”
What the person means by the “Chrome Inspect feature” is probably the View Page Source tool built into Chrome.
In other words, although the links are not accessible when JavaScript is turned off in the browser, they are still present in the HTML source code.
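This is the classic progressive-enhancement pattern. The sketch below is hypothetical markup (not the asker’s actual site) showing how a flyout menu can depend on JavaScript for interactivity while its links remain in the server-rendered HTML, where View Page Source, and any crawler reading raw HTML, can still find them:

```html
<!-- Hypothetical flyout menu: the links exist in the server-rendered HTML,
     so they appear in View Page Source even with JavaScript disabled.
     JavaScript only toggles visibility; it does not inject the links. -->
<nav>
  <button id="menu-toggle" aria-expanded="false">Menu</button>
  <ul id="flyout" hidden>
    <li><a href="/products">Products</a></li>
    <li><a href="/about">About</a></li>
  </ul>
</nav>
<script>
  const btn = document.getElementById('menu-toggle');
  const menu = document.getElementById('flyout');
  btn.addEventListener('click', () => {
    const open = menu.hidden;
    menu.hidden = !open;
    btn.setAttribute('aria-expanded', String(open));
  });
</script>
```

With this structure, a browser with JavaScript disabled loses the open/close behavior but the link markup is still crawlable.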
Mueller Recommends Site Testing
Mueller’s answer acknowledged that Google could probably handle the site.
But what was left unspoken is the fact that the functionality of many sites depends on JavaScript and that the experience of the person asking the question is pretty much normal.
Visit almost any site with JavaScript turned off in a browser and many images won’t load, the layout may break, and some of the menus won’t work.
Below is a screenshot of SearchEngineJournal as viewed with JavaScript disabled:
While Mueller hinted at this in his answer, it’s worth stating up front: most sites are user-unfriendly without JavaScript enabled in a browser, so the experience the person asking the question describes is not out of the ordinary. It is, in fact, quite common.
Mueller acknowledged that everything would probably be fine.
He said:
“And, from my point of view …I would test it.
So probably everything will be okay.
And probably, I would assume if you’re using JavaScript in a reasonable way, if you’re not doing anything special to block the JavaScript on your pages, then probably it will just work.”
Test To See How Site Performs
Mueller next encouraged the person to run tests to make sure the site is functioning optimally, and mentioned that “we” have tools, though he didn’t name specific ones.
Presumably he’s speaking of tools available on Google Search Console that can provide feedback on whether Google is able to crawl pages and images.
Mueller continued his answer:
“But you’re much better off not just believing me, but rather using a testing tool to try it out.
And the testing tools that we have available are quite well documented.
There are lots of …variations on things that we recommend with regards to improving things if you run into problems.
So I would double-check our guides on JavaScript and SEO and think about maybe, …trying things out, making sure that they actually work the way that you want and then taking that to improve your website overall.”
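Beyond Google’s own testing tools, a rough first check is simply whether the important links survive in the raw HTML before any JavaScript runs, which is essentially what the question’s “links are there in the source code” observation was getting at. The sketch below is an illustrative helper (not one of Google’s tools); `extractLinks` uses a naive regex for demonstration, whereas a production check would use a real HTML parser:

```javascript
// Minimal sketch: check whether links are present in raw HTML
// without executing any JavaScript (roughly what View Page Source shows).
// extractLinks() is a simple regex-based helper for illustration only.
function extractLinks(html) {
  const links = [];
  const re = /<a\s[^>]*href="([^"]+)"/g;
  let match;
  while ((match = re.exec(html)) !== null) {
    links.push(match[1]);
  }
  return links;
}

// Example: HTML as served, before any JavaScript runs.
const rawHtml = `
  <ul id="flyout" hidden>
    <li><a href="/products">Products</a></li>
    <li><a href="/about">About</a></li>
  </ul>`;

console.log(extractLinks(rawHtml)); // → ["/products", "/about"]
```

If the links show up here, they are in the initial HTML payload and do not depend on JavaScript execution to exist.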
User Friendly Site Experiences
Mueller next discussed the issue of user-friendliness because the person asking the question mentioned that the site is user-unfriendly with JavaScript turned off.
The overwhelming majority of sites on the Internet use JavaScript; W3Techs reports that 97.9% of sites use it.
HTTP Archive, which uses real-world Chrome data from opted-in users, notes in its annual report on JavaScript use that the median webpage loads 20 JavaScript resources on mobile, rising to 33 first-party and 34 third-party scripts at the 90th percentile of websites.
HTTP Archive further points out that for the median website, 36.2% of the JavaScript forced onto a visitor’s browser goes unused; it’s just wasted bandwidth.
As you can see, the real issue is not users visiting a site with JavaScript turned off, which is what the person asking the question was concerned about. That concern was misplaced.
The real problem centers on users encountering a site that is forcing too much JavaScript on site visitors and thereby creating a poor user experience.
Mueller didn’t mention the nuance of how the person’s concerns were misplaced. But he did recommend useful ways to figure out if users are having a negative experience due to JavaScript issues.
Mueller continued his answer:
“And you mentioned user-friendly with regards to JavaScript, so from our point of view, the guidance that we have is essentially very technical in the sense that we need to make sure that Googlebot can see the content from a technical point of view, and that it can see the links on your pages from a technical point of view.
It doesn’t primarily care about user-friendliness.
But of course your users care about user-friendliness.
And that’s something where maybe it makes sense to do a little bit more so that your users are really for sure having a good experience on your pages.
And this is often something that isn’t just a matter of a simple testing tool.
But rather something where maybe you have to do a small user study or kind of interview some users or at least do a survey on your website to understand where do they get stuck, what kind of problems are they facing.
Is it because of these …you mentioned the fly-out menus. Or is it something maybe completely different where they’re seeing problems, that maybe the text is too small, or they can’t click the buttons properly, those kinds of things which don’t really align with technical problems but are more, kind of, user-side things that if you can improve those and if you can make your users happier, they’ll stick around and they’ll come back and they’ll invite more people to visit your website as well.”
Test For Users And Google
Mueller didn’t explicitly reference any tools for carrying out any of the recommended tests. It’s fairly obvious that Search Console is the best tool to diagnose crawling issues with Google. Search Console alerts publishers of how many URLs are discovered, for example.
As for user experience tools, one of the best is the free Microsoft Clarity user experience analytics tool. This GDPR-compliant analytics tool provides insights into how users experience your site and can signal when they’re having a bad user experience.
So it can be very useful for diagnosing possible site issues that John Mueller discussed.
Citation
Watch John Mueller at the 10:23 minute mark:
Featured Image: Elle Aon/Shutterstock
Google On Hyphens In Domain Names
Google’s John Mueller answered a question on Reddit about why people don’t use hyphens with domains and if there was something to be concerned about that they were missing.
Domain Names With Hyphens For SEO
I’ve been working online for 25 years and I remember when using hyphens in domains was something that affiliates did for SEO when Google was still influenced by keywords in the domain, URL, and basically keywords anywhere on the webpage. It wasn’t something that everyone did, it was mainly something that was popular with some affiliate marketers.
Another reason for choosing domain names with keywords in them was that site visitors tended to convert at a higher rate because the keywords essentially prequalified the site visitor. I know from experience how useful two-keyword domains (and one word domain names) are for conversions, as long as they didn’t have hyphens in them.
A consideration that caused hyphenated domain names to fall out of favor is that they have an untrustworthy appearance and that can work against conversion rates because trustworthiness is an important factor for conversions.
Lastly, hyphenated domain names look tacky. Why go with tacky when a brandable domain is easier for building trust and conversions?
Domain Name Question Asked On Reddit
This is the question asked on Reddit:
“Why don’t people use a lot of domains with hyphens? Is there something concerning about it? I understand when you tell it out loud people make miss hyphen in search.”
And this is Mueller’s response:
“It used to be that domain names with a lot of hyphens were considered (by users? or by SEOs assuming users would? it’s been a while) to be less serious – since they could imply that you weren’t able to get the domain name with fewer hyphens. Nowadays there are a lot of top-level-domains so it’s less of a thing.
My main recommendation is to pick something for the long run (assuming that’s what you’re aiming for), and not to be overly keyword focused (because life is too short to box yourself into a corner – make good things, course-correct over time, don’t let a domain-name limit what you do online). The web is full of awkward, keyword-focused short-lived low-effort takes made for SEO — make something truly awesome that people will ask for by name. If that takes a hyphen in the name – go for it.”
Pick A Domain Name That Can Grow
Mueller is right about picking a domain name that won’t lock your site into one topic. When a site grows in popularity, the natural growth path is to expand the range of topics the site covers. But that’s hard to do when the domain is locked into one rigid keyword phrase. That’s one of the downsides of picking a “Best + keyword + reviews” domain, too. Those domains can’t grow bigger, and they look tacky, too.
That’s why I’ve always recommended brandable domains that are memorable and encourage trust in some way.
Read the post on Reddit:
Read Mueller’s response here.
Featured Image by Shutterstock/Benny Marty
Reddit Post Ranks On Google In 5 Minutes
Google’s Danny Sullivan disputed the assertions made in a Reddit discussion that Google is showing a preference for Reddit in the search results. But a Redditor’s example proves that it’s possible for a Reddit post to rank in the top ten of the search results within minutes and to actually improve rankings to position #2 a week later.
Discussion About Google Showing Preference To Reddit
A Redditor (gronetwork) complained that Google is sending so many visitors to Reddit that the server is struggling with the load and shared an example that proved that it can only take minutes for a Reddit post to rank in the top ten.
That post was part of a 79-post Reddit thread where many in the r/SEO subreddit were complaining about Google allegedly giving too much preference to Reddit over legitimate sites.
The person who did the test (gronetwork) wrote:
“…The website is already cracking (server down, double posts, comments not showing) because there are too many visitors.
…It only takes few minutes (you can test it) for a post on Reddit to appear in the top ten results of Google with keywords related to the post’s title… (while I have to wait months for an article on my site to be referenced). Do the math, the whole world is going to spam here. The loop is completed.”
Reddit Post Ranked Within Minutes
Another Redditor asked if they had tested if it takes “a few minutes” to rank in the top ten and gronetwork answered that they had tested it with a post titled, Google SGE Review.
gronetwork posted:
“Yes, I have created for example a post named “Google SGE Review” previously. After less than 5 minutes it was ranked 8th for Google SGE Review (no quotes). Just after Washingtonpost.com, 6 authoritative SEO websites and Google.com’s overview page for SGE (Search Generative Experience). It is ranked third for SGE Review.”
It’s true: not only does that specific post (Google SGE Review) rank in the top 10, it started out in position 8 and actually improved its ranking, and is currently listed beneath the number one result for the search query “SGE Review”.
Screenshot Of Reddit Post That Ranked Within Minutes
Anecdotes Versus Anecdotes
Okay, the above is just one anecdote. But it’s a heck of an anecdote because it proves that it’s possible for a Reddit post to rank within minutes and get stuck in the top of the search results over other possibly more authoritative websites.
hankschrader79 shared that Reddit posts outrank Toyota Tacoma forums for a phrase related to mods for that truck.
Google’s Danny Sullivan responded to that post and the entire discussion, disputing the claim that Reddit is always prioritized over other forums.
Danny wrote:
“Reddit is not always prioritized over other forums. [super vhs to mac adapter] I did this week, it goes Apple Support Community, MacRumors Forum and further down, there’s Reddit. I also did [kumo cloud not working setup 5ghz] recently (it’s a nightmare) and it was the Netgear community, the SmartThings Community, GreenBuildingAdvisor before Reddit. Related to that was [disable 5g airport] which has Apple Support Community above Reddit. [how to open an 8 track tape] — really, it was the YouTube videos that helped me most, but it’s the Tapeheads community that comes before Reddit.
In your example for [toyota tacoma], I don’t even get Reddit in the top results. I get Toyota, Car & Driver, Wikipedia, Toyota again, three YouTube videos from different creators (not Toyota), Edmunds, a Top Stories unit. No Reddit, which doesn’t really support the notion of always wanting to drive traffic just to Reddit.
If I guess at the more specific query you might have done, maybe [overland mods for toyota tacoma], I get a YouTube video first, then Reddit, then Tacoma World at third — not near the bottom. So yes, Reddit is higher for that query — but it’s not first. It’s also not always first. And sometimes, it’s not even showing at all.”
hankschrader79 conceded that they were generalizing when they wrote that Google always prioritizes Reddit. But they also insisted that this didn’t diminish what they said is a fact: Google’s “prioritization” of forum content has benefited Reddit more than actual forums.
Why Is The Reddit Post Ranked So High?
It’s possible that Google “tested” that Reddit post in position 8 within minutes and that user interaction signals indicated to Google’s algorithms that users prefer to see it. If that’s the case, then it’s not a matter of Google showing preference to the Reddit post but rather users showing the preference and the algorithm responding to it.
Nevertheless, an argument can be made that user preferences for Reddit can be a manifestation of Familiarity Bias. Familiarity Bias is when people show a preference for things that are familiar to them. If a person is familiar with a brand because of all the advertising they were exposed to then they may show a bias for the brand products over unfamiliar brands.
Users who are familiar with Reddit may choose Reddit because they don’t know the other sites in the search results or because they have a bias that Google ranks spammy and optimized websites and feel safer reading Reddit.
Google may be picking up on those user interaction signals that indicate a preference and satisfaction with the Reddit results but those results may simply be biases and not an indication that Reddit is trustworthy and authoritative.
Is Reddit Benefiting From A Self-Reinforcing Feedback Loop?
It may very well be that Google’s decision to prioritize user-generated content started a self-reinforcing pattern: users are drawn to Reddit through the search results, the answers seem plausible, so those users begin to prefer Reddit results. As they’re exposed to more Reddit posts, familiarity bias kicks in and they show an even stronger preference for Reddit. In other words, users and Google’s algorithm may be creating a self-reinforcing feedback loop.
Is it possible that Google’s decision to show more user generated content has kicked off a cycle where more users are exposed to Reddit which then feeds back into Google’s algorithm which in turn increases Reddit visibility, regardless of lack of expertise and authoritativeness?
Featured Image by Shutterstock/Kues
WordPress Releases A Performance Plugin For “Near-Instant Load Times”
WordPress released an official plugin that adds support for a cutting-edge technology called speculative loading, which can help boost site performance and improve the user experience for site visitors.
Speculative Loading
Rendering is the process of constructing a webpage so it can be displayed: the browser downloads the HTML, images, and other resources and puts them together into the visible page. Prerendering is doing that same work in the background, before the user navigates to the page.
What this plugin does is to enable the browser to prerender the entire webpage that a user might navigate to next. The plugin does that by anticipating which webpage the user might navigate to based on where they are hovering.
Chrome prefers to prerender only when there is at least an 80% probability that the user will navigate to the page. The official Chrome support page for prerendering explains:
“Pages should only be prerendered when there is a high probability the page will be loaded by the user. This is why the Chrome address bar prerendering options only happen when there is such a high probability (greater than 80% of the time).”
There is also a caveat in that same developer page that prerendering may not happen based on user settings, memory usage and other scenarios (more details below about how analytics handles prerendering).
The Speculation Rules API solves a problem that previous solutions could not: in the past, they simply prefetched resources like JavaScript and CSS but did not actually prerender the entire webpage.
The official WordPress page about this new functionality describes it:
“The Speculation Rules API is a new web API… It allows defining rules to dynamically prefetch and/or prerender URLs of certain structure based on user interaction, in JSON syntax—or in other words, speculatively preload those URLs before the navigation.
This API can be used, for example, to prerender any links on a page whenever the user hovers over them. Also, with the Speculation Rules API, “prerender” actually means to prerender the entire page, including running JavaScript. This can lead to near-instant load times once the user clicks on the link as the page would have most likely already been loaded in its entirety. However that is only one of the possible configurations.”
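For context, the Speculation Rules API is configured with a JSON block embedded in the page. The example below is illustrative, not necessarily the exact rules the WordPress plugin outputs: it asks the browser to prerender same-site links, with the `"moderate"` eagerness level telling Chrome to start roughly when the user hovers over a link, matching the hover-based behavior described above:

```html
<script type="speculationrules">
{
  "prerender": [
    {
      "where": { "href_matches": "/*" },
      "eagerness": "moderate"
    }
  ]
}
</script>
```

Because unsupported browsers ignore unknown script types, this block degrades gracefully where the API is unavailable.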
The new WordPress plugin adds support for the Speculation Rules API. The Mozilla developer pages, a great resource for technical documentation, describe it like this:
“The Speculation Rules API is designed to improve performance for future navigations. It targets document URLs rather than specific resource files, and so makes sense for multi-page applications (MPAs) rather than single-page applications (SPAs).
The Speculation Rules API provides an alternative to the widely-available <link rel="prefetch"> feature and is designed to supersede the Chrome-only deprecated <link rel="prerender"> feature. It provides many improvements over these technologies, along with a more expressive, configurable syntax for specifying which documents should be prefetched or prerendered.”
See also: Are Websites Getting Faster? New Data Reveals Mixed Results
Performance Lab Plugin
The new plugin was developed by the official WordPress performance team, which occasionally rolls out new plugins for users to test ahead of possible inclusion in the WordPress core. So it’s a good opportunity to be among the first to try out new performance technologies.
The new WordPress plugin is by default set to prerender “WordPress frontend URLs” which are pages, posts, and archive pages. How it works can be fine-tuned under the settings:
Settings > Reading > Speculative Loading
Browser Compatibility
The Speculation Rules API is supported by Chrome 108; however, the specific rules used by the new plugin require Chrome 121 or higher. Chrome 121 was released in early 2024.
Browsers that do not support the API will simply ignore the rules, which will have no effect on the user experience.
Check out the new Speculative Loading WordPress plugin developed by the official core WordPress performance team.
How Analytics Handles Prerendering
A WordPress developer commented with a question asking how Analytics would handle prerendering and someone else answered that it’s up to the Analytics provider to detect a prerender and not count it as a page load or site visit.
Fortunately, both Google Analytics and Google Publisher Tag (GPT) are able to handle prerenders. The Chrome developers support page has a note about how analytics handles prerendering:
“Google Analytics handles prerender by delaying until activation by default as of September 2023, and Google Publisher Tag (GPT) made a similar change to delay triggering advertisements until activation as of November 2023.”
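For analytics providers that don’t delay automatically, the browser exposes `document.prerendering` and the `prerenderingchange` event, which fires when a prerendered page is actually shown. The sketch below is a hypothetical pattern (with `sendPageview` standing in for whatever analytics call you use) for deferring a pageview until activation:

```javascript
// Sketch: count a pageview only once the page is actually shown.
// In a prerendered page, document.prerendering is true until the user
// navigates to it; the 'prerenderingchange' event fires on activation.
// sendPageview() is a hypothetical stand-in for your analytics call.
function trackOnActivation(doc, sendPageview) {
  if (doc.prerendering) {
    // Page is being prerendered: wait for the user to actually arrive.
    doc.addEventListener('prerenderingchange', () => sendPageview(), { once: true });
    return 'deferred';
  }
  sendPageview(); // Normal load: count the view immediately.
  return 'sent';
}
```

In a browser you would call `trackOnActivation(document, sendPageview)`, so prerendered pages never inflate your traffic counts.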
Possible Conflict With Ad Blocker Extensions
There are a couple of things to be aware of about this plugin, aside from the fact that it’s an experimental feature that requires Chrome 121 or higher.
A WordPress plugin developer commented that this feature may not work in browsers running the uBlock Origin ad-blocking extension.
Download the plugin:
Speculative Loading Plugin by the WordPress Performance Team
Read the announcement at WordPress
Speculative Loading in WordPress
See also: WordPress, Wix & Squarespace Show Best CWV Rate Of Improvement