The deprecation of Google Analytics (as we’ve known it)



It’s time to be excited about the great migration.

The biggest shake-up in the marketing analytics world is that Google Analytics as we know it is being sunset and will stop processing data in July 2023 (October 2023 for GA360 customers). The responses were mixed, to say the least – conflicting tweets, memes and disappointed forum posts were the first reactions to the news – and they show just how drastic, and how overdue, this move was.

As more practitioners and marketers adopt the new Google Analytics 4 (GA4), the benefits are starting to flip the mood from nervousness to excitement.

The version of GA that’s been around for over a decade, Universal Analytics, is hard to leave behind since it’s such an embedded part of web measurement. GA4 was announced in October 2020 but wasn’t met with the widespread eagerness you would expect for a new, robust product. To be fair, there were quite a few other things going on in the world at that time. In any case, marketers weren’t rushing to make the switch, and the industry seems to be going through the stages of grief for the familiar product:

  • Denial – “I don’t need to change platforms, so I will ignore GA4 for now.”
  • Anger – “How could Google get rid of Universal Analytics?”
  • Bargaining – “What if the deadline is extended? Can we ask for more time?”
  • Sadness – “It will take so much effort to migrate and learn a new tool.”
  • Acceptance – “This is more advanced and helps with the cookieless future that I keep getting asked about. I’m in.”


It may be a challenging road ahead, but the move to Google Analytics 4 shouldn’t be considered bad news. It comes with new features, a new tracking methodology, a lower price point for 360 and perks for users on the free version. Most importantly, it fits the current app landscape and is built to handle compliance and privacy changes, both those already in effect and those still coming.

Ultimately, GA4 is built for the internet of today, not the internet of a decade ago.

Understanding the past, understanding the future

The news is disorienting, so the timeline should be put in perspective.

Google acquired a product called Urchin in 2005. Before GA, web analytics was based on server log files, and it was not as intuitive or marketer-friendly. Relics of that era survive in UTM parameters (Urchin Tracking Module) and in the property IDs themselves: the “UA” in an ID like UA-12345-1 doesn’t stand for Universal Analytics – it stands for Urchin Analytics. Since then, there have been several iterations of GA for the web, each with its own tracking library (urchin.js, ga.js, analytics.js and now gtag.js).
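
Those Urchin-era UTM parameters still work exactly as they did then: they are ordinary query-string parameters appended to a landing-page URL. A minimal sketch in Python (the campaign names here are made up for illustration):

```python
from urllib.parse import urlencode

def tag_url(base_url, source, medium, campaign):
    """Append the classic Urchin-era UTM parameters to a landing-page URL."""
    params = urlencode({
        "utm_source": source,     # where the traffic comes from
        "utm_medium": medium,     # the marketing channel
        "utm_campaign": campaign, # the specific campaign name
    })
    return f"{base_url}?{params}"

print(tag_url("https://example.com/landing", "newsletter", "email", "spring_launch"))
# https://example.com/landing?utm_source=newsletter&utm_medium=email&utm_campaign=spring_launch
```

Both Universal Analytics and GA4 still parse these parameters for attribution, which is why a tag scheme from 2005 survives the migration untouched.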

There’s one thing these tracking methods from 2005 to 2022 have in common: all of them still process data and show up in reports, no matter which tracking library you’re using. Google is, in other words, still processing data from an era when the internet looked very different.

It’s been 10 years since the release of Universal Analytics. In 2012, Google Tag Manager had yet to be released, and mobile-first web design was a new concept. App tracking was still in beta, and it would be six years before DoubleClick products would evolve to become part of the new Google Marketing Platform. We’ve come a long way, so tracking had to be completely rebuilt, and the Google Analytics from 2005 to 2020 will be taken away and put on a shelf next to Google+ and Google Local.

The UA version of Google Analytics was designed to embrace multi-device behavior, collect more user data, and allow offline and cross-channel measurement. However, culture is no longer multi-screen – it’s multi-multi-multi-screen. The average number of connected devices per person in North America alone will reach 13 in 2023. Universal Analytics cannot easily track different platforms together, and it was not meant to do so. Now that we’re in a more app-centric phase of connectivity, GA4 is a better solution since it was built for that type of analysis. Instead of gathering more data, the goal is to use data that is modeled and as anonymous as possible.

Universal Analytics will disappear at around the same time as the death of the third-party cookie. The hyperconnected landscape called for a pivot: users need more control over their data, more privacy considerations and more transparent analytics practices. Google Analytics 4 answers that call with a variety of customizations and settings to establish trust with your visitors while continuing to activate on rich data. User tracking will now be supplemented with machine learning baked right into Google Analytics 4: current trends in user behavior will be automatically analyzed to predict future behavior and provide modeled conversions. The privacy-centric features are a core component, but there are other reasons to embrace the change.

What to get excited about

In addition to being the first Google Analytics product with the built-in capability to collect data from multiple sources, GA4 is a better fit for enterprise-level organizations while also offering more to small- and mid-size businesses.

The free version of GA has turned into a freemium product. Standard non-paying users now have access (although limited) to tools like BigQuery, Google Marketing Platform integrations, more unsampled data and advanced visualizations through Exploration reports (formerly called Advanced Analysis).

For Google Analytics 360 customers, those features are much less limited, and some of the additional perks are:

  • Enterprise-level data and user governance through roll-up and sub-properties.
  • More control over data retention.
  • Streaming and nearly unlimited BigQuery exports.
  • Quicker processing, even for large data sets in the billions.
  • The ability to use up to 400 advanced audiences to pass to marketing platforms.
  • Unsampled custom reports, explorations, and the ability to use longer date ranges in advanced reports.
  • Higher level of custom data collection for events, conversions, custom dimensions, and user properties.

Migrations were strongly suggested throughout these iterations but never forced (except for the Google Analytics app tracking SDK). However, older versions of tracking will not be as useful in 2023. It’s symbolic that even the echo of Urchin Analytics in those “UA-12345-1” properties is gone for good and replaced with Measurement IDs and data streams.

Deadlines and timelines

As a reminder, Universal Analytics will officially sunset in July 2023 for those on the free version and October 2023 for GA360 users. This means that properties will be read-only, and data sent to Google will not be processed. There won’t be exceptions, so migrating will be the top priority for everyone. Even if you’re not currently using the platform but have used it in the past, it’s still a time for action. We’re not just moving on. We’re also moving out – historical data will eventually be erased, so data must be saved and exported. The deletion won’t happen until at least six months after the sunset date, but it’s a crucial step in the migration process.

All web and app data should be 100% in Google Analytics 4 by the shutoff date, but ideally sooner. Parallel tracking should be in place and refined now so that data can be available on both platforms. The GA4 numbers won’t match 1:1 to Universal Analytics. Having year-over-year reports comparing UA to GA4 may be misleading, and reports will not be able to use the same data source. With GA4 tracking in parallel, next year’s reports will be comparing apples to apples. Depending on your organization, seasonality can guide how quickly to ramp up and set priorities for the most critical metrics and events. Whether it’s higher education enrollment, holiday e-commerce, or tax season, yearly activity is a consideration for building as much parity as possible between UA and GA4.
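
One practical way to monitor parity during parallel tracking is to compute the percentage gap between the two platforms’ numbers for the same metric and date range, and watch that gap stay stable. A minimal sketch of the idea, with hypothetical metric values (GA4 and UA define sessions and conversions differently, so some gap is expected):

```python
def parity_gap(ua_value, ga4_value):
    """Percentage difference of the GA4 number relative to the UA number
    for the same metric and date range."""
    if ua_value == 0:
        raise ValueError("UA value must be non-zero")
    return (ga4_value - ua_value) / ua_value * 100

# Hypothetical monthly figures pulled from both platforms during parallel tracking
metrics = {"sessions": (120_000, 113_400), "conversions": (2_400, 2_310)}
for name, (ua, ga4) in metrics.items():
    print(f"{name}: GA4 differs from UA by {parity_gap(ua, ga4):+.1f}%")
```

A steady gap of a few percent is normal; a gap that suddenly widens usually signals a tagging problem on one platform rather than a real change in behavior.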

Next steps

The first step is to get GA4 on your websites and apps. It’s not too late to get started on a new strategy to fit the new tracking method and create your Google Analytics 4 properties, but delaying parallel tracking may cause reporting, remarketing, and compliance difficulties. After that, learning about how you can take advantage of the durable Google Analytics 4 should spark ideas and conversations beyond migration.

Opinions expressed in this article are those of the guest author and not necessarily MarTech.

About The Author

Samantha has been working with web analytics and implementation for over 10 years. She is a data advocate and consultant for companies ranging from small businesses to Fortune 100 corporations. As a trainer, she has led courses for more than 1,000 attendees across the United States over the past six years. Whether it’s tag management, analytics strategy, data visualization or coding, she loves the excitement of developing bespoke solutions across a wide variety of verticals.



Is Twitter Still a Thing for Content Marketers in 2023?




The world survived the first three months of Elon Musk’s Twitter takeover.

But what are marketers doing now? Did your brand follow the shift Dennis Shiao made for his personal brand? As he recently shared, he switched his primary platform from Twitter to LinkedIn after the 2022 ownership change. (He still uses Twitter but posts less frequently.)

Are those brands that altered their strategy after the new ownership maintaining that plan? What impact do Twitter’s service changes (think Twitter Blue subscriptions) have?

We took those questions to the marketing community. No big surprise? Most still use Twitter. But from there, their responses vary from doing nothing to moving away from the platform.

Lowest points

At the beginning of the Elon era, more than 500 big-name advertisers stopped buying from the platform. Some (like Amazon and Apple) resumed their buys before the end of 2022. Brand accounts’ organic activity followed a similar trajectory.

In November, Emplifi research found a 26% dip in organic posting by U.S. and Canadian brands in the week after an Elon Musk tweet triggered a significant spike in negative sentiment. But that drop in posting wasn’t a one-time thing.

Kyle Wong, chief strategy officer at Emplifi, shares a longer analysis of well-known fast-food brands. When comparing December 2021 to December 2022 activity, the brands posted 74% less, and December was the least active month of 2022.


When Emplifi analyzed brand accounts across industries (2,330 from U.S. and Canada and 6,991 elsewhere in the world), their weekly Twitter activity also fell to low points in November and December. But by the end of the year, their activity was inching up.

“While the percentage of brands posting weekly is on the rise once again, the number is still lower than the consistent posting seen in earlier months,” Kyle says.

Quiet-quitting Twitter

Lacey Reichwald, marketing manager at Aha Media Group, says the company has been quiet-quitting Twitter for two months, simply monitoring and posting the occasional link. “It seems like the turmoil has settled down, but the overall impact of Twitter for brands has not recovered,” she says.


She points to her firm’s experience as a potential explanation. Though they haven’t been posting, their follower count has gone up, and many of those new follower accounts seem irrelevant to their topic or appear to be bots. At the same time, Aha Media saw engagement and follows from active accounts in its customer segment drop.

Blue bonus

One change at Twitter has piqued some brands’ interest in the platform, says Dan Gray, CEO of Vendry, a platform for helping companies find agency partners to help them scale.

“Now that getting a blue checkmark is as easy as paying a monthly fee, brands are seeing this as an opportunity to build thought leadership quickly,” he says.

Though it remains to be seen if that strategy is viable in the long term, some companies, particularly those in the SaaS and tech space, are reallocating resources to energize their previously dormant accounts.


These reenergized accounts also are seeing an increase in followers, though Dan says it’s difficult to tell if it’s an effect of the blue checkmark or their renewed emphasis on content. “Engagement is definitely up, and clients and agencies have both noted the algorithm seems to be favoring their content more,” he says.

New horizon

Faizan Fahim, marketing manager at Breeze, is focused on the future. They’re producing videos for small screens as part of their Twitter strategy. “We are guessing soon Elon Musk is going to turn Twitter into TikTok/YouTube to create more buzz,” he says. “We would get the first-mover advantage in our niche.”

He’s not the only one who thinks video is Twitter’s next bet. Bradley Thompson, director of marketing at DigiHype Media and marketing professor at Conestoga College, thinks video content will be the next big thing. Until then, text remains king.

“The approach is the same, which is a focus on creating and sharing high-quality content relevant to the industry,” Bradley says. “Until Twitter comes out with drastically new features, then marketing and managing brands on Twitter will remain the same.”

James Coulter, digital marketing director at Sole Strategies, says, “Twitter definitely still has a space in the game. The question is can they keep it, or will they be phased out in favor of a more reliable platform.”

Interestingly, given the thoughts of Faizan and Bradley, James sees businesses turning to video as they limit their reliance on Twitter and diversify their social media platforms. They are now willing to invest in the resource-intensive format, given the exploding popularity of TikTok, Instagram Reels and other short-form video content.

“We’ve seen a really big push on getting vendors to help curate video content with the help of staff. Requesting so much media requires building a new (social media) infrastructure, but once the expectations and deliverables are in place, it quickly becomes engrained in the weekly workflow,” James says.

What now

“We are waiting to see what happens before making any strong decisions,” says Baruch Labunski, CEO at Rank Secure. But they aren’t sitting idly by. “We’ve moved a lot of our social media efforts to other platforms while some of these things iron themselves out.”

What is your brand doing with Twitter? Are you stepping up, stepping out, or standing still? I’d love to know. Please share in the comments.



Cover image by Joseph Kalinowski/Content Marketing Institute




45 Free Content Writing Tools to Love [for Writing, Editing & Content Creation]




Creating content isn’t always a walk in the park. (In fact, it can sometimes feel more like trying to swim against the current.)

While other parts of business and marketing are becoming increasingly automated, content creation is still a very manual job.



How data clean rooms might help keep the internet open




Are data clean rooms the solution to what IAB CEO David Cohen has called the “slow-motion train wreck” of addressability? Voices at the IAB will tell you that they have a big role to play.

“The issue with addressability is that once cookies go away, and with the loss of identifiers, about 80% of the addressable market will become unknown audiences which is why there is a need for privacy-centric consent and a better consent-value exchange,” said Jeffrey Bustos, VP, measurement, addressability and data at the IAB.

“Everyone’s talking about first-party data, and it is very valuable,” he explained, “but most publishers who don’t have sign-on, they have about 3 to 10% of their readership’s first-party data.” First-party data, from the perspective of advertisers who want to reach relevant audiences and publishers who want to offer valuable inventory, just isn’t enough.

Why we care. Two years ago, who was talking about data clean rooms? The surge of interest is recent and significant, according to the IAB. DCRs have the potential, at least, to keep brands in touch with their audiences on the open internet; to maintain viability for publishers’ inventories; and to provide sophisticated measurement capabilities.

How data clean rooms can help. DCRs are a type of privacy-enhancing technology that allows data owners (including brands and publishers) to share customer first-party data in a privacy-compliant way. Clean rooms are secure spaces where first-party data from a number of sources can be resolved to the same customer’s profile while that profile remains anonymized.

In other words, a DCR is a kind of Switzerland — a space where a truce is called on competition while first-party data is enriched without compromising privacy.

“The value of a data clean room is that a publisher is able to collaborate with a brand across both their data sources, and the brand is able to understand audience behavior,” said Bustos. For example, a brand selling eyeglasses might know nothing about its customers except basic transactional data – and that they wear glasses. Matching profiles with a publisher’s behavioral data provides enrichment.

“If you’re able to understand behavioral context, you’re able to understand what your customers are reading, what they’re interested in, what their hobbies are,” said Bustos. Armed with those insights, a brand has a better idea of what kind of content they want to advertise against.

The publisher does need a certain level of first-party data for the matching to take place, even if, unlike The New York Times, it doesn’t universally require sign-ins. A publisher may be able to match only a small percentage of the eyeglass vendor’s customers, but if those customers like reading the sports and arts sections, that at least gives directional guidance as to which audience the vendor should target.

Dig deeper: Why we care about data clean rooms

What counts as good matching? In its “State of Data 2023” report, which focuses almost exclusively on data clean rooms, the IAB expresses concern that DCR efficacy might be threatened by poor match rates. Average match rates hover around 50% (less for some types of DCR).

Bustos is keen to put this into context. “When you are matching data from a cookie perspective, match rates are usually about 70-ish percent,” he said, so 50% isn’t terrible, although there’s room for improvement.
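
Under the hood, clean-room matching typically joins the two parties’ customer lists on a hashed version of a normalized identifier (often an email address), and the match rate is simply the overlap relative to the brand’s list. A simplified sketch of that idea – not any specific DCR vendor’s API, and with toy data:

```python
import hashlib

def normalize_and_hash(email):
    """Normalize an email (trim, lowercase) and hash it with SHA-256,
    a common way to match identifiers without exchanging raw PII."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

def match_rate(brand_emails, publisher_emails):
    """Share of the brand's identifiers also present in the publisher's data."""
    brand = {normalize_and_hash(e) for e in brand_emails}
    publisher = {normalize_and_hash(e) for e in publisher_emails}
    return len(brand & publisher) / len(brand)

brand = ["A@example.com", "b@example.com ", "c@example.com", "d@example.com"]
publisher = ["a@example.com", "B@example.com", "x@example.com"]
print(f"Match rate: {match_rate(brand, publisher):.0%}")  # 50%
```

The normalization step matters in practice: inconsistent casing or whitespace between the two datasets is one of the mundane reasons real-world match rates fall short.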

One obstacle is a persistent lack of interoperability between identity solutions — although it does exist; LiveRamp’s RampID is interoperable, for example, with The Trade Desk’s UID2.

Nevertheless, said Bustos, “it’s incredibly difficult for publishers. They have a bunch of identity pixels firing for all these different things. You don’t know which identity provider to use. Definitely a long road ahead to make sure there’s interoperability.”

Maintaining an open internet. If DCRs can contribute to solving the addressability problem they will also contribute to the challenge of keeping the internet open. Walled gardens like Facebook do have rich troves of first-party and behavioral data; brands can access those audiences, but with very limited visibility into them.

“The reason CTV is a really valuable proposition for advertisers is that you are able to identify the user 1:1 which is really powerful,” Bustos said. “Your standard news or editorial publisher doesn’t have that. I mean, the New York Times has moved to that and it’s been incredibly successful for them.” In order to compete with the walled gardens and streaming services, publishers need to offer some degree of addressability — and without relying on cookies.

But DCRs are a heavy lift. Data maturity is an important qualification for getting the most out of a DCR. The IAB report shows that, of the brands evaluating or using DCRs, over 70% have other data-related technologies like CDPs and DMPs.

“If you want a data clean room,” Bustos explained, “there are a lot of other technological solutions you have to have in place before. You need to make sure you have strong data assets.” He also recommends starting out by asking what you want to achieve, not what technology would be nice to have. “The first question is, what do you want to accomplish? You may not need a DCR. ‘I want to do this,’ then see what tools would get you to that.”

Understand also that implementation is going to require talent. “It is a demanding project in terms of the set-up,” said Bustos, “and there’s been significant growth in consulting companies and agencies helping set up these data clean rooms. You do need a lot of people, so it’s more efficient to hire outside help for the set up, and then just have a maintenance crew in-house.”

Underuse of measurement capabilities. One key finding in the IAB’s research is that DCR users are exploiting the audience matching capabilities much more than realizing the potential for measurement and attribution. “You need very strong data scientists and engineers to build advanced models,” Bustos said.

“A lot of brands that look into this say, ‘I want to be able to do a predictive analysis of my high lifetime value customers that are going to buy in the next 90 days.’ Or ‘I want to be able to measure which channels are driving the most incremental lift.’ It’s very complex analyses they want to do; but they don’t really have a reason as to why. What is the point? Understand your outcome and develop a sequential data strategy.”

Trying to understand incremental lift from your marketing can take a long time, he warned. “But you can easily do a reach and frequency and overlap analysis.” That will identify wasted investment in channels and as a by-product suggest where incremental lift is occurring. “There’s a need for companies to know what they want, identify what the outcome is, and then there are steps that are going to get you there. That’s also going to help to prove out ROI.”

Dig deeper: Failure to get the most out of data clean rooms is costing marketers money


