Google’s John Mueller responded to a question on Reddit asking, “Anyone know of any case studies looking at switching from dynamic rendering to server side rendering?” The poster added, “This will be a heavy lift that was struck down before, so I’m hoping to find good data.”
John responded right away by saying, “There are no SEO ranking bonuses for implementing it one way or another.” He added, “they’re just different ways of making the content indexable (as is client side rendering).”
Here is the full block of what he said:
There are no SEO ranking-bonuses for implementing it one way or another – they’re just different ways of making the content indexable (as is client side rendering). The differences between dynamic rendering and server side rendering from my POV are more in terms of practical infrastructure setup & maintenance (it can also affect speed, depending on how you have things set up). There’s no rush to switch away from dynamic rendering, it’s not going to become unsupported or cause issues from Google. The change over time is just that nowadays, if you have a JS-based site, there are better options (either good CSR or SSR) available, so doing things dynamically based on the user agent is often not the most efficient approach.
If you’re doing dynamic rendering now, it’s fine to look at the options and write up the pros & cons for you, of course. I imagine most won’t be convincing for a stretched engineering team. However, if you’re planning on doing a rebuild of the site, let them know that they don’t need to spend too much time on dynamically rendering the content. At the same time, know what to watch out for too :-). This is where knowing some JS as an SEO really pays out – you don’t have to do the coding, but JS is a part of all modern websites, and it’s up to you to be able to figure out if there are issues with how it’s implemented.
So yea, if you have this all implemented today, on your next redesign, go without dynamic rendering. If you have the resources to change it out, then it might make sense to do that. Otherwise, add it to the queue and do it when you can.
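To make the “dynamically based on the user agent” point concrete, here is a minimal sketch of the check at the heart of a dynamic rendering setup: bot crawlers get prerendered HTML while regular browsers get the client-side JavaScript app. The bot list and function names here are illustrative assumptions, not Google’s actual detection logic.

```typescript
// Illustrative user-agent patterns a dynamic rendering setup might sniff for.
const BOT_PATTERNS: RegExp[] = [/googlebot/i, /bingbot/i, /twitterbot/i, /linkedinbot/i];

function isBot(userAgent: string): boolean {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent));
}

// Dynamic rendering branches the response on who is asking. This branching
// (and keeping the prerendered copy in sync) is the extra infrastructure
// Mueller notes that good CSR or SSR lets you avoid entirely.
function chooseResponse(userAgent: string): "prerendered-html" | "client-side-app" {
  return isBot(userAgent) ? "prerendered-html" : "client-side-app";
}
```

The maintenance cost Mueller alludes to comes from this fork: two render paths must be kept equivalent, and a stale or mismatched bot list silently changes what gets indexed.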
Forum discussion at Reddit.
Google Says They Have Algorithms To Detect & Demote AI Plagiarized Content
Duy Nguyen from Google’s search quality team said in a Google office hours video that Google has “algorithms to go after” those who post AI-plagiarized content, and that those algorithms can “demote site scraping content from other sites.”
The question, asked at the 9:19 mark, was: “How should content creators respond to sites that use AI to plagiarize the content, modify it, and then outrank them in search results?”
Duy Nguyen said, “Scraping content, even with some modification, is against our spam policy.” Duy added that Google has “many algorithms to go after such behaviors and demote site scraping content from other sites.”
And if Google’s systems miss something: “if you come across sites that repeatedly scrape content, that perform well on Search, please feel free to report them to us, with our spam report form so that we can further improve our systems, both in detecting the spam and also ranking overall,” he added.
Later on in the video, at the 17:05 mark, a similar question was asked and answered by Duy:

Kunal asked why Google is not taking action on copied or spun web stories, and whether Google can check on Discover.
Thank you for the report. We are aware of these attempts and we are looking into them. In general, sites with spammy scraped content violate our spam policy, and our algorithms do a pretty good job of demoting them in search results.
Forum discussion at Twitter.