This week’s news about the purchase of TV-ratings giant Nielsen is spotlighting ongoing problems with its products and raising questions about the company’s value.
What happened. On Tuesday, Nielsen announced it was selling itself to a consortium led by Elliott Management Corp.’s private-equity arm and Brookfield Asset Management Inc. for about $10 billion. That is close to Nielsen’s current market cap, but the market cap reflects a stock price that has jumped more than 20% since news of the deal broke. The consortium’s offer of $28 per share is a 60% premium over the pre-jump price. The group will also be taking on the company’s $5.7 billion in debt, bringing the deal’s total cost to about $16 billion. Last year Nielsen reported $894 million in revenue and a 23.94% net profit margin.
“It is remarkable however that this is a 60% premium to Nielsen’s recent stock price,” said Todd Krizelman, CEO and co-founder of MediaRadar. “This implies the future buyer has some very specific path to improve performance, or perhaps to break up the company further to unlock value.”
The problem … part 1. Nielsen’s primary product is TV ratings, and there are serious concerns about the reliability of those numbers. Last April a trade group representing the major television networks said Nielsen undercounted TV viewers during the pandemic because technicians were unable to get into panelists’ homes to fix devices. The Media Rating Council, which enforces measurement standards in media, confirmed this. In May Nielsen reported that it had been undercounting out-of-home viewership, blaming a software error. Four months later, the ratings council suspended Nielsen’s accreditation.
While networks have long complained about Nielsen’s ratings, they have had to use them because there wasn’t another source. This is beginning to change: NBCUniversal recently announced it is moving to iSpot.tv for national ratings.
Even without the accuracy concerns, ongoing trends in viewership have made Nielsen’s ratings less valuable. Since 2011 major network broadcast ratings have dropped more than 80%, according to SpoilerTV. Further, the cord-cutting trend continues apace. The share of Americans who say they watch television via cable or satellite has plunged from 76% in 2015 to 56% last year, according to a Pew survey.
The problem … part 2. The criticism of Nielsen’s digital ratings is even harsher than that of its TV numbers.
“On the digital side, Facebook canceled its partnership with Nielsen around digital ad rating early last year,” Mike Woosley, COO for data solutions company Lotame said in a statement. “Since the cancellation — for over a year — Nielsen hasn’t been able to explain to Lotame how its product worked. For Lotame, there is a wide perception that the product is now less accurate than the system we use it to benchmark, creating chaos with marketers, customers, and other partners.”
Nielsen, Elliott Management and Brookfield Asset Management all failed to respond to requests for comment.
Why we care. While traditional TV viewership is declining, it continues to be a very significant part of the advertising ecosystem. Inaccurate ratings are a huge problem that must be addressed. Further, because of Nielsen’s status, its issues cast a shadow on other measurement companies.
About The Author
Constantine von Hoffman is managing editor of MarTech. A veteran journalist, Con has covered business, finance, marketing and tech for CBSNews.com, Brandweek, CMO, and Inc. He has been city editor of the Boston Herald, news producer at NPR, and has written for Harvard Business Review, Boston Magazine, Sierra, and many other publications. He has also been a professional stand-up comedian, given talks at anime and gaming conventions on everything from My Neighbor Totoro to the history of dice and boardgames, and is author of the magical realist novel John Henry the Revelator. He lives in Boston with his wife, Jennifer, and either too many or too few dogs.
Nike.com uses infinite scrolling to load more products on its category pages. And because of that, Nike risks its loaded content not getting indexed.
For the sake of testing, I visited one of its category pages and scrolled down to a product that loads only after scrolling. Then I used the “site:” command to check whether the URL is indexed in Google. As you can see in the screenshot below, this URL is impossible to find on Google:
Of course, Google can still reach your products through sitemaps. However, finding your content in any other way than through links makes it harder for Googlebot to understand your site structure and dependencies between the pages.
To make this even more apparent, think about all the products on Nike.com that are visible only after you scroll for them. If there’s no link for bots to follow, they will see only the initial 24 products on a given category page. Of course, for users’ sake, Nike can’t serve its entire catalog in one viewport. Still, there are better ways to optimize infinite scrolling so that it is both comfortable for users and accessible to bots.
Unlike Nike, Douglas.de uses a more SEO-friendly way of serving its content on category pages.
They provide bots with page navigation based on <a href> links to enable crawling and indexing of the next paginated pages. As you can see in the source code below, there’s a link to the second page of pagination included:
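This is essentially what a crawler does when it discovers paginated pages: it parses the static HTML and follows plain `<a href>` links. The sketch below uses Python’s standard-library HTML parser and an illustrative markup snippet (not Douglas.de’s actual source code) to show how every paginated page becomes reachable:

```python
# Sketch: how a bot discovers paginated category pages from plain <a href> links.
# The markup below is a hypothetical example, not any real site's source code.
from html.parser import HTMLParser

CATEGORY_HTML = """
<nav class="pagination">
  <a href="/c/makeup?page=1">1</a>
  <a href="/c/makeup?page=2">2</a>
  <a href="/c/makeup?page=3">3</a>
</nav>
"""

class LinkExtractor(HTMLParser):
    """Collects every href found in anchor tags, like a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

parser = LinkExtractor()
parser.feed(CATEGORY_HTML)
print(parser.links)  # each paginated page is reachable via a plain link
```

If the “next page” links were instead generated only by JavaScript on scroll, this parse would return an empty list, and a bot that doesn’t execute JavaScript would never see pages 2 and 3.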
Moreover, the paginated navigation may be even more user-friendly than infinite scrolling. The numbered list of category pages may be easier to follow and navigate, especially on large e-commerce websites. Just think how long the viewport would be on Douglas.de if they used infinite scrolling on the page below:
Let’s check if that’s the case here. Again, I used the “site:” command and typed the title of one of Otto.de’s product carousels:
As you can see, Google couldn’t find that product carousel in its index. And the fact that Google can’t see that element means that accessing additional products will be more complex. Also, if you prevent crawlers from reaching your product carousels, you’ll make it more difficult for them to understand the relationship between your pages.
To find out, check what the HTML version of the page looks like for bots by analyzing the cache version.
While scrolling through the cached version, you’ll see that the links to related products can be found there as well. If you can see them here, bots don’t struggle to find them, either.
However, keep in mind that the links to the exact products you can see in the cache may differ from the ones on the live version of the page. It’s normal for the products in the carousels to rotate, so you don’t need to worry about discrepancies in specific links.
But what exactly does Target.com do differently? They take advantage of dynamic rendering. They serve the initial HTML, including the links to products in the carousels, as static HTML that bots can process.
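In essence, dynamic rendering means serving a prerendered HTML snapshot to known crawlers while everyone else gets the JavaScript app. The minimal sketch below illustrates the idea with hypothetical user-agent signatures and markup; it is not Target.com’s implementation:

```python
# Sketch of dynamic rendering: bots get prerendered static HTML,
# browsers get the JS app shell. All names and markup are hypothetical.
BOT_SIGNATURES = ("googlebot", "bingbot", "duckduckbot")

PRERENDERED_HTML = "<ul><li><a href='/p/123'>Related product</a></li></ul>"
APP_SHELL_HTML = "<div id='root'></div><script src='/app.js'></script>"

def render_page(user_agent: str) -> str:
    """Serve static HTML to crawlers, the JS app shell to everyone else."""
    ua = user_agent.lower()
    if any(bot in ua for bot in BOT_SIGNATURES):
        return PRERENDERED_HTML  # carousel links visible without running JS
    return APP_SHELL_HTML        # browsers render the carousel client-side

print(render_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
```

The trade-off is that you now maintain two rendering paths, which is exactly the complexity mentioned below.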
However, you must remember that dynamic rendering adds an extra layer of complexity that may quickly get out of hand with a large website. I recently wrote an article about dynamic rendering that’s a must-read if you are considering this solution.
Also, the fact that crawlers can access the product carousels doesn’t guarantee these products will get indexed. However, it will significantly help them flow through the site structure and understand the dependencies between your pages.
It’s impossible to fully evaluate a website without a proper site crawl. But looking at its robots.txt file can already allow you to identify any critical content that’s blocked.
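You can audit robots.txt rules locally before (or instead of) a full crawl. The sketch below uses Python’s standard-library `urllib.robotparser` with made-up rules, not any real site’s file, to check whether critical URLs are crawlable:

```python
# Sketch: auditing robots.txt rules locally with the standard library.
# The rules and URLs below are hypothetical examples.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Check whether critical templates are crawlable before they go live.
for url in ("https://example.com/product/123",
            "https://example.com/search?q=shoes",
            "https://example.com/cart/"):
    print(url, "->", rp.can_fetch("Googlebot", url))
```

Running a list of your key templates (category, product, article pages) through such a check makes it easy to spot a disallow rule that accidentally blocks critical content.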
Misusing the disallow directive this way may result in rendering problems across your entire website.
To check if it applies in this case, I used Google’s Mobile-Friendly Test. This tool can help you navigate rendering issues by giving you insight into the rendered source code and the screenshot of a rendered page on mobile.
But let’s find out if those rendering problems affected the website’s indexing. I used the “site:” command to check if the main content (product description) of the analyzed page is indexed on Google. As you can see, no results were found:
The layout is essential for Google to understand the context of your page. If you’d like to know more about this crossroads of web technology and layout, I highly recommend looking into a new field of technical SEO called rendering SEO.
Lidl.de proves that a well-organized robots.txt file can help you control your website’s crawling. The crucial thing is to use the disallow directive consciously.
On a large e-commerce website, you may easily lose track of all the added directives. Always include as much of the URL path you want to block from crawling as possible; this will help you avoid blocking crucial pages by mistake.
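The risk of a too-short disallow path is that it matches far more URLs than intended. The sketch below contrasts a broad and a specific rule using hypothetical paths:

```python
# Sketch: why broad disallow rules are risky. A short path prefix can
# block far more than intended. All paths here are hypothetical.
from urllib.robotparser import RobotFileParser

def can_crawl(robots_txt: str, path: str) -> bool:
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", "https://example.com" + path)

broad = "User-agent: *\nDisallow: /de"         # meant to block /deals/ only
specific = "User-agent: *\nDisallow: /deals/"  # blocks exactly what's intended

print(can_crawl(broad, "/design-ideas/"))      # blocked by accident
print(can_crawl(specific, "/design-ideas/"))   # crawlable, as intended
```

Because disallow rules are prefix matches, `Disallow: /de` silently blocks `/design-ideas/`, `/delivery/`, and anything else starting with `/de`.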
Will users persist in hunting for that particular product on Walmart.com? Some may, but others will simply head to any other store selling the same item.
To fix this problem, Walmart has two solutions:
Implementing dynamic rendering (prerendering) which is, in most cases, the easiest from an implementation standpoint.
IKEA proves that you can present your main content in a way that is accessible for bots and interactive for users.
On IKEA.com’s product pages, the product descriptions are served behind clickable panels. When you click one, the content dynamically appears on the right-hand side of the viewport.
Take care of your indexing pipeline and check if: