Core Web Vitals: What Next?




The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.

The promised page experience updates from Google that caused such a stir last year are far from being done, so Tom takes a look at where we are now and what happens next for the algorithm’s infamous Core Web Vitals.


Video Transcription

Happy Friday, Moz fans, and welcome to another Whiteboard Friday. This week’s video is about Core Web Vitals. Before you immediately close the tab, pause the video, or press Back: no, we haven’t got stuck in time. This isn’t a video from 2020. This is looking forwards. I’m not going to cover what the basic metrics are or how they work in this video. There is a very good Whiteboard Friday from about a year ago by Cyrus about all of those things, which hopefully will be linked below. What this video is going to look at is where we are now and what happens next, because the page experience updates from Google are very much not done. They are still coming. This is still ongoing. This is probably going to get more important over time, not less, even though the hype has subsided a little.

Historical context

So, firstly, I want to look at some of the historical context in terms of how we got where we are. I’ve got this timeline on this side of the board. You can see that in May 2020, which is nearly two years ago now, Google first announced this. That is an extraordinarily long time in SEO and Google update terms. They announced it, and then we had these two delays, and it felt like it was taking forever. I think there are some important implications here. My theory, which I’ve written about before (hopefully that will also be linked below), is that the reason for the delays was that too few pages would have been getting a boost if they had rolled out when they originally intended to: partly because too few sites had actually improved their performance, and partly because Google is getting its data from real users of Chrome, via the Chrome User Experience Report (CrUX).

For lots of pages, for a long time, including now really, they didn’t have a good enough sample size to draw conclusions; the coverage is not comprehensive. So initially, when they had even less data, they were in an even worse position to roll something out. They don’t want to make a change to their algorithm that rewards a small number of pages disproportionately, because that would just distort their results. It would make their results worse for users, which is not what they’re aiming for with their own metrics.

So because of these delays, we were held up until June last year. But what I’ve just explained, this system of only having a sufficient sample size for more heavily visited pages, is important for webmasters, not just Google. We’ll come back to it later when we talk about what’s going to happen next. But this is why, whenever we display Core Web Vitals data in Moz Pro and whenever we talk about it publicly, we encourage you to look at your highest-traffic, most important, or highest-ranking pages, rather than just your slowest pages. You need to prioritize and triage, so we encourage you to sort by traffic and look at that alongside performance.
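That triage logic is simple enough to sketch. Here is a minimal Python example; the field names and sample data are invented for illustration, standing in for an export from your crawl or analytics tool:

```python
# Sketch: triage Core Web Vitals work by traffic, not by slowness alone.
# The page records below are hypothetical; in practice they would come
# from a crawl or analytics export.

def triage(pages, top_n=3):
    """Return the top-N pages by traffic that fail the LCP 'good' threshold."""
    failing = [p for p in pages if p["lcp_seconds"] > 2.5]
    failing.sort(key=lambda p: p["monthly_visits"], reverse=True)
    return [p["url"] for p in failing[:top_n]]

pages = [
    {"url": "/pricing", "monthly_visits": 40_000, "lcp_seconds": 3.1},
    {"url": "/old-blog-post", "monthly_visits": 120, "lcp_seconds": 9.8},
    {"url": "/home", "monthly_visits": 90_000, "lcp_seconds": 2.1},
    {"url": "/features", "monthly_visits": 25_000, "lcp_seconds": 2.9},
]

# Note the slowest page is NOT first in the work queue: traffic wins.
print(triage(pages))  # ['/pricing', '/features', '/old-blog-post']
```

The slowest page on the site (`/old-blog-post`) ends up last in the queue because almost nobody visits it, which is exactly the prioritization described above.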

So anyway, June 2021, we did start having this rollout, and it was all rolled out within two or three months. But it wasn’t quite what we expected or what we were told to expect. 

What happened after the rollout? 

In the initial FAQ and the initial documentation from Google, they talked about sites getting a boost if they passed a certain threshold for all three of the new metrics they were introducing. Although they started to become more ambiguous about that over time, that definitely isn’t what happened with the rollout.
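For reference, the “good” thresholds Google published for the three original metrics are LCP ≤ 2.5 seconds, FID ≤ 100 ms, and CLS ≤ 0.1. A small sketch of the pass-count bucketing (the helper and field names are my own):

```python
# Google's published "good" thresholds for the original Core Web Vitals.
THRESHOLDS = {
    "lcp_seconds": 2.5,   # Largest Contentful Paint
    "fid_ms": 100,        # First Input Delay
    "cls": 0.1,           # Cumulative Layout Shift
}

def passes(metrics: dict) -> int:
    """Count how many of the three thresholds a page's field data passes."""
    return sum(1 for name, limit in THRESHOLDS.items() if metrics[name] <= limit)

# Fast paint and input, but the layout jumps around: passes 2 of 3.
print(passes({"lcp_seconds": 2.1, "fid_ms": 80, "cls": 0.25}))  # 2
# Slow on every front: passes 0 of 3.
print(passes({"lcp_seconds": 4.0, "fid_ms": 300, "cls": 0.6}))  # 0
```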

We tracked this with MozCast data. Between the start and the end of the period when Google said they were rolling it out, we looked at the pages ranking in the top 20 in MozCast that passed zero, one, two, or three of the metrics against the thresholds Google published.

[Whiteboard chart: average ranking across pages that passed between 0 and all 3 Core Web Vitals metrics.]

Now one thing worth noticing about this chart, before you even look at it any more closely, is that all of these lines trend downwards. That’s because of what I was talking about with sample sizes increasing, with Google getting data on more pages over time. As they got more pages, they started incorporating more low-traffic, in other words lower-ranking, pages into the CrUX data, which meant the average rank of a page that has CrUX data went down. When we first started looking at this, even though these are top 20 rankings for competitive keywords, only about 30% of them had CrUX data at all. Coverage has gone up a lot since then, so it now includes more low-ranking pages. That’s why there’s this general downwards shift.

So the thing to notice here is that the pages passing all three thresholds, the ones Google said were going to get a big boost, went down by 0.2, which is about the same as the pages passing one or two thresholds. So I’m going to go out on a limb and say that that was just the general downwards shift caused by incorporating more pages into the CrUX data.

The really noticeable thing was the pages that passed zero thresholds: they went down by 1.1 positions. So instead of it being pass all three and get a boost, it’s more like pass zero and get a penalty. Or you could phrase the same thing positively: pass at least one and get a boost relative to the pages that are falling off a cliff and dropping more than a full ranking position.

So there was a big impact it seems from the rollout, but not necessarily the one that we were told to expect, which is interesting. I suspect that’s because Google perhaps was more confident about the data on the sites performing very badly than about the data on the sites performing very well.

What happens next? 

Desktop rollout

Now, in terms of what happens next, I think this is relevant because in February and March, probably as you’re watching this video, Google have said they’re going to be rolling out this same page experience update on desktop. We assume it will work the same way, so what you’ve seen here on smartphones only will be replicated on desktop at the start of this year. You’ll probably see something very similar. If you’re watching this video as it comes out, very poorly performing sites have little or no time left to get this fixed before they see a ranking drop. Of course, if one of those sites is a competitor of yours, that could be good news.

But I don’t think it will stop there. There are two other things I expect to happen. 

Increased impact

So one is that you might remember, with the HTTPS update and particularly with Mobilegeddon, we expected a really big, seismic change. But what actually happened was that when the update rolled out, it was very toned down. Not much noticeably shifted. Then, over time, Google quietly turned up the wick. These days, we would all expect a very mobile-unfriendly site to perform very poorly in search, even though the initial impact of that algorithm update was very minor. I think something similar will happen here. The slower sites will feel a bigger and bigger penalty gradually building. I don’t mean a manual penalty, but a growing disadvantage, until in a few years’ time we would all intuitively understand that a site that doesn’t pass the thresholds is going to perform horribly.

New metrics

The last change I’m expecting to see, which Google hinted at initially, is new metrics. They said early on that they would probably update these annually, and you can already see that Google is talking about a couple of new metrics: smoothness and responsiveness. Smoothness is to do with the frames per second of animations on the page. When you’re scrolling up and down, is it more like a slideshow or a fluid video? Responsiveness is how quickly the page responds to your interactions. One of the current metrics, First Input Delay, already covers some of this, but, as the name says, it only measures the first input. So I’m expecting the new metric to care more about things that happen further into your browsing experience on the page.
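The difference is easy to see in a toy example. The aggregation below (taking the worst interaction across the whole session) is my own illustrative assumption, not Google’s published definition of the responsiveness metric:

```python
# Illustrative only: FID measures just the first interaction's delay, so a
# page can pass FID while feeling sluggish later in the session. A
# whole-session responsiveness metric would consider later interactions too.

def first_input_delay(latencies_ms):
    """FID looks only at the very first interaction."""
    return latencies_ms[0]

def session_responsiveness(latencies_ms):
    """A naive whole-session view: the worst interaction latency."""
    return max(latencies_ms)

# The first tap was fast, but the page bogged down mid-session:
session = [40, 60, 350, 900, 120]

print(first_input_delay(session))       # 40 ms: looks fine
print(session_responsiveness(session))  # 900 ms: the real problem
```

A page like this passes the current FID threshold comfortably while delivering a poor experience overall, which is exactly the gap a whole-session metric would close.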

So these are things I think you have to think about going forwards through 2022 and beyond for Core Web Vitals. I think the main lesson to take away is you don’t want to over-focus on the three metrics we have now, because if you just leave your page that’s currently having a terrible user experience but somehow sort of wiggling its way through these three metrics, that’s only going to punish you in the long run. It will be like the old-school link builders that are just constantly getting penalized as they find their way around every new update rather than finding a more sustainable technique. I think you have to do the same. You have to aim for a genuinely good user experience or this isn’t going to work out for you.

Anyway, that’s all for today. Hope the rest of your Friday is enjoyable. See you next time.



Is Twitter Still a Thing for Content Marketers in 2023?




The world survived the first three months of Elon Musk’s Twitter takeover.

But what are marketers doing now? Did your brand follow the shift Dennis Shiao made for his personal brand? As he recently shared, he switched his primary platform from Twitter to LinkedIn after the 2022 ownership change. (He still uses Twitter but posts less frequently.)

Are those brands that altered their strategy after the new ownership maintaining that plan? What impact do Twitter’s service changes (think Twitter Blue subscriptions) have?

We took those questions to the marketing community. No big surprise? Most still use Twitter. But from there, their responses vary from doing nothing to moving away from the platform.

Lowest points

At the beginning of the Elon era, more than 500 big-name advertisers stopped buying ads on the platform. Some (like Amazon and Apple) resumed their buys before the end of 2022. Brand accounts’ organic activity followed a similar pattern.

In November, Emplifi research found a 26% dip in organic posting by U.S. and Canadian brands in the week following a significant spike in negative sentiment around an Elon Musk tweet. But that drop in posting wasn’t a one-time thing.

Kyle Wong, chief strategy officer at Emplifi, shares a longer analysis of well-known fast-food brands. When comparing December 2021 to December 2022 activity, the brands posted 74% less, and December was the least active month of 2022.


When Emplifi analyzed brand accounts across industries (2,330 from U.S. and Canada and 6,991 elsewhere in the world), their weekly Twitter activity also fell to low points in November and December. But by the end of the year, their activity was inching up.

“While the percentage of brands posting weekly is on the rise once again, the number is still lower than the consistent posting seen in earlier months,” Kyle says.

Quiet-quitting Twitter

Lacey Reichwald, marketing manager at Aha Media Group, says the company has been quiet-quitting Twitter for two months, simply monitoring and posting the occasional link. “It seems like the turmoil has settled down, but the overall impact of Twitter for brands has not recovered,” she says.


She points to her firm’s experience as a potential explanation. Though they haven’t been posting, their follower count has gone up, and many of those new follower accounts either don’t seem relevant to their topic or look like bots. At the same time, Aha Media saw engagement and follows from active accounts in the customer segment drop.

Blue bonus

One change at Twitter has piqued some brands’ interest in the platform, says Dan Gray, CEO of Vendry, a platform for helping companies find agency partners to help them scale.

“Now that getting a blue checkmark is as easy as paying a monthly fee, brands are seeing this as an opportunity to build thought leadership quickly,” he says.

Though it remains to be seen if that strategy is viable in the long term, some companies, particularly those in the SaaS and tech space, are reallocating resources to energize their previously dormant accounts.


These reenergized accounts also are seeing an increase in followers, though Dan says it’s difficult to tell if it’s an effect of the blue checkmark or their renewed emphasis on content. “Engagement is definitely up, and clients and agencies have both noted the algorithm seems to be favoring their content more,” he says.

New horizon

Faizan Fahim, marketing manager at Breeze, is focused on the future. They’re producing videos for small screens as part of their Twitter strategy. “We are guessing soon Elon Musk is going to turn Twitter into TikTok/YouTube to create more buzz,” he says. “We would get the first-mover advantage in our niche.”

He’s not the only one who thinks video is Twitter’s next bet. Bradley Thompson, director of marketing at DigiHype Media and marketing professor at Conestoga College, thinks video content will be the next big thing. Until then, text remains king.

“The approach is the same, which is a focus on creating and sharing high-quality content relevant to the industry,” Bradley says. “Until Twitter comes out with drastically new features, marketing and managing brands on Twitter will remain the same.”

James Coulter, digital marketing director at Sole Strategies, says, “Twitter definitely still has a space in the game. The question is can they keep it, or will they be phased out in favor of a more reliable platform.”

Interestingly, given the thoughts of Faizan and Bradley, James sees businesses turning to video as they limit their reliance on Twitter and diversify their social media platforms. They are now willing to invest in the resource-intensive format given the exploding popularity of TikTok, Instagram Reels, and other short-form video content.

“We’ve seen a really big push on getting vendors to help curate video content with the help of staff. Requesting so much media requires building a new (social media) infrastructure, but once the expectations and deliverables are in place, it quickly becomes ingrained in the weekly workflow,” James says.

What now

“We are waiting to see what happens before making any strong decisions,” says Baruch Labunski, CEO at Rank Secure. But they aren’t sitting idly by. “We’ve moved a lot of our social media efforts to other platforms while some of these things iron themselves out.”

What is your brand doing with Twitter? Are you stepping up, stepping out, or standing still? I’d love to know. Please share in the comments.



Cover image by Joseph Kalinowski/Content Marketing Institute



45 Free Content Writing Tools to Love [for Writing, Editing & Content Creation]




Creating content isn’t always a walk in the park. (In fact, it can sometimes feel more like trying to swim against the current.)

While other parts of business and marketing are becoming increasingly automated, content creation is still a very manual job. (more…)



How data clean rooms might help keep the internet open




Are data clean rooms the solution to what IAB CEO David Cohen has called the “slow-motion train wreck” of addressability? Voices at the IAB will tell you that they have a big role to play.

“The issue with addressability is that once cookies go away, and with the loss of identifiers, about 80% of the addressable market will become unknown audiences which is why there is a need for privacy-centric consent and a better consent-value exchange,” said Jeffrey Bustos, VP, measurement, addressability and data at the IAB.

“Everyone’s talking about first-party data, and it is very valuable,” he explained, “but most publishers who don’t have sign-on have first-party data for only about 3% to 10% of their readership.” First-party data, from the perspective of advertisers who want to reach relevant audiences and publishers who want to offer valuable inventory, just isn’t enough.

Why we care. Two years ago, who was talking about data clean rooms? The surge of interest is recent and significant, according to the IAB. DCRs have the potential, at least, to keep brands in touch with their audiences on the open internet; to maintain viability for publishers’ inventories; and to provide sophisticated measurement capabilities.

How data clean rooms can help. DCRs are a type of privacy-enhancing technology that allows data owners (including brands and publishers) to share customer first-party data in a privacy-compliant way. Clean rooms are secure spaces where first-party data from a number of sources can be resolved to the same customer’s profile while that profile remains anonymized.

In other words, a DCR is a kind of Switzerland — a space where a truce is called on competition while first-party data is enriched without compromising privacy.

“The value of a data clean room is that a publisher is able to collaborate with a brand across both their data sources and the brand is able to understand audience behavior,” said Bustos. For example, a brand selling eyeglasses might know nothing about its customers except basic transactional data, and that they wear glasses. Matching profiles with a publisher’s behavioral data provides enrichment.

“If you’re able to understand behavioral context, you’re able to understand what your customers are reading, what they’re interested in, what their hobbies are,” said Bustos. Armed with those insights, a brand has a better idea of what kind of content they want to advertise against.

The publisher does need a certain level of first-party data for the matching to take place, even if it doesn’t have a universal sign-in requirement like The New York Times. A publisher may be able to match only a small percentage of the eyeglass vendor’s customers, but if those customers like reading the sports and arts sections, that at least gives some directional guidance as to which audience the vendor should target.
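The core matching idea can be sketched in a few lines of Python. This is deliberately simplified with invented data: a bare SHA-256 hash stands in for the much stronger protections (salted or keyed hashing, private set intersection) that real clean rooms use, but it shows why neither side ever sees the other’s raw customer list:

```python
import hashlib

def pseudonymize(emails):
    """Hash normalized emails so raw identifiers never leave either party.
    (Real clean rooms use salted/keyed hashing or private set intersection;
    bare SHA-256 here is only illustrative.)"""
    return {hashlib.sha256(e.strip().lower().encode()).hexdigest() for e in emails}

# Invented example lists: the brand's customers and the publisher's readers.
brand_customers = ["ann@example.com", "Bob@Example.com ", "cy@example.com"]
publisher_readers = ["bob@example.com", "cy@example.com", "dee@example.com"]

brand_ids = pseudonymize(brand_customers)
publisher_ids = pseudonymize(publisher_readers)

# Profiles resolve to the same pseudonymous ID, so only hashes are compared.
matched = brand_ids & publisher_ids
match_rate = len(matched) / len(brand_ids)

print(f"match rate: {match_rate:.0%}")  # match rate: 67%
```

Note that normalization (trimming, lowercasing) happens before hashing; without it, “Bob@Example.com ” and “bob@example.com” would produce different hashes and the match would silently fail, which is one mundane reason real-world match rates fall short of 100%.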

Dig deeper: Why we care about data clean rooms

What counts as good matching? In its “State of Data 2023” report, which focuses almost exclusively on data clean rooms, the IAB expresses concern that DCR efficacy might be threatened by poor match rates. Average match rates hover around 50% (less for some types of DCR).

Bustos is keen to put this into context. “When you are matching data from a cookie perspective, match rates are usually about 70-ish percent,” he said, so 50% isn’t terrible, although there’s room for improvement.

One obstacle is a persistent lack of interoperability between identity solutions — although it does exist; LiveRamp’s RampID is interoperable, for example, with The Trade Desk’s UID2.

Nevertheless, said Bustos, “it’s incredibly difficult for publishers. They have a bunch of identity pixels firing for all these different things. You don’t know which identity provider to use. Definitely a long road ahead to make sure there’s interoperability.”

Maintaining an open internet. If DCRs can contribute to solving the addressability problem they will also contribute to the challenge of keeping the internet open. Walled gardens like Facebook do have rich troves of first-party and behavioral data; brands can access those audiences, but with very limited visibility into them.

“The reason CTV is a really valuable proposition for advertisers is that you are able to identify the user 1:1 which is really powerful,” Bustos said. “Your standard news or editorial publisher doesn’t have that. I mean, the New York Times has moved to that and it’s been incredibly successful for them.” In order to compete with the walled gardens and streaming services, publishers need to offer some degree of addressability — and without relying on cookies.

But DCRs are a heavy lift. Data maturity is an important qualification for getting the most out of a DCR. The IAB report shows that, of the brands evaluating or using DCRs, over 70% have other data-related technologies like CDPs and DMPs.

“If you want a data clean room,” Bustos explained, “there are a lot of other technological solutions you have to have in place before. You need to make sure you have strong data assets.” He also recommends starting out by asking what you want to achieve, not what technology would be nice to have. “The first question is, what do you want to accomplish? You may not need a DCR. ‘I want to do this,’ then see what tools would get you to that.”

Understand also that implementation is going to require talent. “It is a demanding project in terms of the set-up,” said Bustos, “and there’s been significant growth in consulting companies and agencies helping set up these data clean rooms. You do need a lot of people, so it’s more efficient to hire outside help for the set up, and then just have a maintenance crew in-house.”

Underuse of measurement capabilities. One key finding in the IAB’s research is that DCR users are exploiting the audience matching capabilities much more than realizing the potential for measurement and attribution. “You need very strong data scientists and engineers to build advanced models,” Bustos said.

“A lot of brands that look into this say, ‘I want to be able to do a predictive analysis of my high lifetime value customers that are going to buy in the next 90 days.’ Or ‘I want to be able to measure which channels are driving the most incremental lift.’ It’s very complex analyses they want to do; but they don’t really have a reason as to why. What is the point? Understand your outcome and develop a sequential data strategy.”

Trying to understand incremental lift from your marketing can take a long time, he warned. “But you can easily do a reach and frequency and overlap analysis.” That will identify wasted investment in channels and as a by-product suggest where incremental lift is occurring. “There’s a need for companies to know what they want, identify what the outcome is, and then there are steps that are going to get you there. That’s also going to help to prove out ROI.”
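The reach, frequency, and overlap analysis Bustos describes is the simpler end of the spectrum, and can be sketched directly. The per-channel exposure lists below are invented for illustration; real inputs would come out of the clean room’s matched, pseudonymized data:

```python
# Sketch: reach, frequency, and cross-channel overlap from exposure logs,
# where each entry is a pseudonymous user ID exposed to the campaign.

exposures = {
    "ctv":    ["u1", "u2", "u2", "u3"],
    "social": ["u2", "u3", "u3", "u4", "u4"],
}

def reach(impressions):
    """Unique users exposed on a channel."""
    return len(set(impressions))

def avg_frequency(impressions):
    """Average exposures per unique user on a channel."""
    return len(impressions) / reach(impressions)

# Users exposed on both channels: candidate wasted spend, and the place
# to look for incremental lift.
overlap = set(exposures["ctv"]) & set(exposures["social"])

print(reach(exposures["ctv"]))                        # 3 unique users on CTV
print(round(avg_frequency(exposures["social"]), 2))   # 1.67 exposures per user
print(sorted(overlap))                                # ['u2', 'u3']
```

Even this toy version surfaces the by-product Bustos mentions: the users hit on both channels are where duplicated investment, and potential incremental lift, sits.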

Dig deeper: Failure to get the most out of data clean rooms is costing marketers money
