MARKETING
What is a CRO Test? [+ the 5 Steps to Perform Them Yourself]
Looking for a way to supercharge your marketing campaigns and boost conversions? Well, then it’s time to start running a conversion rate optimization test.
It’s an incredibly powerful toolset that can help marketers unlock valuable insights from user behavior – and significantly optimize their campaigns in the process.
In this blog post, we’ll explain what a CRO test is and the steps to run one for maximum impact.
What is a CRO test?
A conversion rate optimization (CRO) test is an experiment designed to test strategies in an effort to maximize your conversion rate.
CRO tests involve adding, re-arranging, and redesigning elements on your website. They can focus on optimizing the copy, design, or placement of your CTAs, or the length of your headlines, among other elements.
When done right, a CRO test will help you identify where to make improvements and maximize the return on your investment.
At worst, this test will serve as a gut check to confirm your current path is already optimized; at best, it will unlock new opportunities.
How to Perform CRO Tests
1. Research.
One step marketers often skip before running a CRO test is research: they jump straight from the idea to the test itself.
Once you have an idea for a test, you’ll first need to validate it through research. This can be both internal – reviewing past experiments, user research data, and analytics insights – and external by reviewing your competitors’ strategies.
The goal is to discover what has resonated with your audience in the past and if your suggested test aligns with that.
2. Design your experiment.
While you’re in the planning stage, it’s helpful to write an experiment doc.
It should include:
- Your objective – What do you aim to achieve with this CRO test?
- Your hypothesis – What do you anticipate will happen with this test? Be as specific as possible by stating the current state, what you want to test, the metric you’re measuring, and your anticipated outcome.
- Your design – This is where all the details of your experiment will live, such as:
- The type of test it is (e.g., A/B, A/B/n, multivariate)
- The pages on which the test will run
- The control and variant groups
- Estimated duration
- Primary and secondary metrics
- Predicted impact
- Special considerations.
- Results – Once your test is complete, you can drop details of its performance in the document.
This document will serve as your source of truth for your CRO test and keep stakeholders in the know. Plus, you can reference it for future CRO tests.
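If it helps to keep the doc structured and reusable, here’s a minimal sketch of how those fields could be captured in code. This is purely illustrative, not a standard schema; the `ExperimentDoc` class and the example values are assumptions.

```python
from dataclasses import dataclass, field

# Illustrative experiment doc template; field names mirror the sections above.
@dataclass
class ExperimentDoc:
    objective: str                      # what the CRO test aims to achieve
    hypothesis: str                     # current state, change, metric, expected outcome
    test_type: str                      # e.g. "A/B", "A/B/n", "multivariate"
    pages: list = field(default_factory=list)       # pages the test will run on
    control: str = "control"
    variants: list = field(default_factory=list)
    estimated_duration_days: int = 14
    primary_metric: str = ""
    secondary_metrics: list = field(default_factory=list)
    predicted_impact: str = ""
    special_considerations: str = ""
    results: str = ""                   # filled in once the test is complete

# Hypothetical example, loosely modeled on a form-design test.
doc = ExperimentDoc(
    objective="Increase form submission CVR on content offer pages",
    hypothesis="Redesigning the form will improve clarity and lift CVR by ~5%",
    test_type="A/B",
    pages=["/offers/example-ebook"],
    variants=["B"],
    primary_metric="form submission CVR",
)
print(doc)
```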
3. Design your variants and build the test.
Now that you have all your ducks in a row, you can get started with building your experiment.
This step will likely take the most time, as it often requires collaboration across your team, designers, and developers.
Timeline-wise, it can look something like this:
- Work with designers to develop the look and feel of the test.
- Develop copy, if necessary.
- Create tickets and assign them to team members.
- Work with developers, if applicable, to determine dev work and timeline.
- Set up the experiment in your testing tool (like HotJar or Convert) and the analytics to track results.
- Perform quality assurance (QA) tests to ensure it’s working as expected.
Once these steps are complete, you’re ready for launch.
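Most testing tools handle traffic splitting for you, but as a rough illustration of what’s happening under the hood, here’s a minimal sketch of deterministic variant assignment by visitor ID. The `assign_variant` helper and the even split are assumptions for illustration, not any particular tool’s implementation.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str, variants=("control", "variant_b")) -> str:
    """Deterministically bucket a visitor into a variant.

    Hashing (experiment + visitor_id) means the same visitor always sees
    the same variant for a given experiment, with no state to store.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)   # roughly even split across variants
    return variants[bucket]

# The same visitor always lands in the same bucket for this experiment.
print(assign_variant("visitor-123", "content-offer-form"))
print(assign_variant("visitor-123", "content-offer-form"))
```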
4. Launch your test.
Once your experiment is live, the first thing you’ll want to do is QA it to ensure it’s still working as expected.
Even if you did this pre-launch, it’s not uncommon to catch bugs once the test is live. You’ll also want to check your analytics page to ensure your tracking is set up correctly.
Once that’s done, alert your stakeholders. Your test may impact other teams and their metrics so it’s important to let them know.
This also gives you an extra set of eyes who can report any issues they spot.
5. Review results.
Once your test has reached statistical significance, you can confidently review the results.
How were your metrics impacted? Was your hypothesis satisfied? What insights did you learn?
If your variation won, you can then work on implementing it. If it didn’t, there’s still opportunity there.
Even if your test produced negative results – i.e. your conversion rate decreased – you’re still gaining valuable insights about your audience.
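As a rough illustration of that significance check, here’s a minimal two-proportion z-test in Python. The visitor and conversion counts are invented; in practice, your testing tool reports significance for you.

```python
from math import sqrt
from statistics import NormalDist

def conversion_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for a difference in conversion rate."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-tailed
    return p_a, p_b, z, p_value

# Hypothetical numbers: 120/1,000 control conversions vs. 150/1,000 for the variant.
p_a, p_b, z, p = conversion_significance(120, 1000, 150, 1000)
print(f"control {p_a:.1%}, variant {p_b:.1%}, z = {z:.2f}, p = {p:.3f}")
```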
Now that we’ve covered the steps to running a CRO test, let’s look at a few brand examples.
CRO Test Examples
HubSpot’s Content Offer Form Design
The purpose of this experiment was to see whether altering the submission form design affected user behavior.
The hypothesis was that redesigning the forms would improve the user experience and increase clarity; in turn, the form submission conversion rate (CVR) would increase. The primary metric measured was form submission CVR.
The test featured four variations of the sign-up form plus the control, i.e., an A/B/C/D/E design. The image below is the control variant.
The results were significant: variations B and D outperformed the control at 96% and 100% confidence, respectively.
The image below shows variation B on the left and variation D on the right.
This suggests that blog conversions could increase in the future if the winning form designs were applied to blog posts.
Optimizely’s Landing Page Headline
Optimizely was running a few PPC ads with several different types of messaging, all pointing to one landing page. The landing page did not use the same terminology as the ads – instead, it read “Try it Out for Free.”
So Optimizely decided to test the following hypothesis: aligning the landing page copy with the ad copy will result in more leads (i.e., a higher conversion rate).
It worked! While the control had a 12% conversion rate, the variation led to a 39.1% increase in conversions.
HubSpot Blog’s Slide-In CTAs
Most successful blogs include a call-to-action at the end of their blog posts. It’s usually full-width – large enough for people to notice the offer and hopefully convert on it.
But are people noticing that CTA, or are they learning to tune it out?
Here at HubSpot, we were curious if our readers were developing static CTA blindness. So, we decided to run a test to see if we could increase our CTA clickthrough and conversion rates.
To accomplish this goal, we tested slide-in CTAs that would appear halfway to three-quarters of the way through a blog post.
Here’s an example of the slide-in:
To test this out, we added slide-in CTAs to 10 of HubSpot’s highest-traffic blog posts. After reaching statistically significant results, we looked at the following stats for the slide-in CTA and the static CTA at the end of the post:
- Clickthrough rate (CTR) – What percentage of visitors clicked each CTA?
- Conversion rate (CVR) – What percentage of those visitors who clicked ultimately converted on the landing page form?
- Submissions – How many total leads did each CTA ultimately generate?
In this test, the slide-in CTA had a 192% higher CTR and generated 27% more submissions – mission accomplished.
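For reference, the lift math behind figures like these is straightforward. The numbers in the sketch below are invented purely to show the arithmetic; they aren’t the actual test data.

```python
def lift(variant_value: float, control_value: float) -> float:
    """Relative lift of the variant over the control, as a percentage."""
    return (variant_value - control_value) / control_value * 100

# Hypothetical values chosen only to illustrate the calculation.
static_ctr, slide_in_ctr = 0.006, 0.0175            # clicks / views
static_submissions, slide_in_submissions = 100, 127

print(f"CTR lift: {lift(slide_in_ctr, static_ctr):.0f}%")                          # ~192%
print(f"Submission lift: {lift(slide_in_submissions, static_submissions):.0f}%")   # 27%
```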
Sidekick’s Landing Page Design
This test was done many moons ago, when HubSpot Sales was still called Sidekick, but the lesson still holds.
Back then, Sidekick was a Chrome extension, and the original landing page listed all of the software’s features:
- See Who Opens & Clicks on Your Emails
- Schedule Emails to be Sent Later
- Access Valuable Information About Your Contacts
But the team was curious to know if those details actually mattered. For a product as low-touch as a Chrome extension, do consumers need a laundry list of features to convert?
To answer this question, the experiment involved replacing the feature list with user testimonials.
The testimonial beat out the feature list by 28%.
The team’s theory on why? The feature list didn’t make people curious enough to click through to the Chrome extension installation page.
Another theory is that consumers wanted more social proof before downloading a new tool into their browser.
There you have it – a rundown of all things CRO testing. If you want more details on how to run a test of your own, check out our A/B test kit below.
MARKETING
YouTube Ad Specs, Sizes, and Examples [2024 Update]
Introduction
With billions of users each month, YouTube is the world’s second largest search engine and top website for video content. This makes it a great place for advertising. To succeed, advertisers need to follow the correct YouTube ad specifications. These rules help your ad reach more viewers, increasing the chance of gaining new customers and boosting brand awareness.
Types of YouTube Ads
Video Ads
- Description: These play before, during, or after a YouTube video on computers or mobile devices.
- Types:
- In-stream ads: Can be skippable or non-skippable.
- Bumper ads: Non-skippable, short ads that play before, during, or after a video.
Display Ads
- Description: These appear in different spots on YouTube and usually use text or static images.
- Note: YouTube does not support display image ads directly on its app, but these can be targeted to YouTube.com through Google Display Network (GDN).
Companion Banners
- Description: Appear to the right of the YouTube player on desktop.
- Requirement: Must be purchased alongside In-stream ads, Bumper ads, or In-feed ads.
In-feed Ads
- Description: Resemble videos with images, headlines, and text. They link to a public or unlisted YouTube video.
Outstream Ads
- Description: Mobile-only video ads that play outside of YouTube, on websites and apps within the Google video partner network.
Masthead Ads
- Description: Premium, high-visibility banner ads displayed at the top of the YouTube homepage for both desktop and mobile users.
YouTube Ad Specs by Type
Skippable In-stream Video Ads
- Placement: Before, during, or after a YouTube video.
- Resolution:
- Horizontal: 1920 x 1080px
- Vertical: 1080 x 1920px
- Square: 1080 x 1080px
- Aspect Ratio:
- Horizontal: 16:9
- Vertical: 9:16
- Square: 1:1
- Length:
- Awareness: 15-20 seconds
- Consideration: 2-3 minutes
- Action: 15-20 seconds
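If you’re producing creative in batches, a simple pre-flight check against the resolution and aspect-ratio pairs above can catch mismatched files before upload. The sketch below is illustrative; the spec table simply mirrors the values listed for skippable in-stream ads.

```python
from typing import Optional

# Resolution / aspect-ratio pairs for skippable in-stream ads, per the specs above.
SKIPPABLE_INSTREAM_SPECS = {
    "horizontal": {"resolution": (1920, 1080), "aspect_ratio": (16, 9)},
    "vertical":   {"resolution": (1080, 1920), "aspect_ratio": (9, 16)},
    "square":     {"resolution": (1080, 1080), "aspect_ratio": (1, 1)},
}

def matching_orientation(width: int, height: int) -> Optional[str]:
    """Return the orientation whose listed resolution matches (width, height), else None."""
    for name, spec in SKIPPABLE_INSTREAM_SPECS.items():
        if (width, height) == spec["resolution"]:
            return name
    return None

print(matching_orientation(1920, 1080))  # "horizontal"
print(matching_orientation(1280, 720))   # None: correct 16:9 ratio, but not the listed resolution
```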
Non-skippable In-stream Video Ads
- Description: Must be watched completely before the main video.
- Length: 15 seconds (or 20 seconds in certain markets).
- Resolution:
- Horizontal: 1920 x 1080px
- Vertical: 1080 x 1920px
- Square: 1080 x 1080px
- Aspect Ratio:
- Horizontal: 16:9
- Vertical: 9:16
- Square: 1:1
Bumper Ads
- Length: Maximum 6 seconds.
- File Format: MP4, Quicktime, AVI, ASF, Windows Media, or MPEG.
- Resolution:
- 16:9 (horizontal): 640 x 360px
- 4:3: 480 x 360px
In-feed Ads
- Description: Appear alongside YouTube content, such as in search results or the Home feed.
- Resolution:
- Horizontal: 1920 x 1080px
- Vertical: 1080 x 1920px
- Square: 1080 x 1080px
- Aspect Ratio:
- Horizontal: 16:9
- Square: 1:1
- Length:
- Awareness: 15-20 seconds
- Consideration: 2-3 minutes
- Headline/Description:
- Headline: Up to 2 lines, 40 characters per line
- Description: Up to 2 lines, 35 characters per line
Display Ads
- Description: Static images or animated media that appear on YouTube next to video suggestions, in search results, or on the homepage.
- Image Size: 300×60 pixels.
- File Type: GIF, JPG, PNG.
- File Size: Max 150KB.
- Max Animation Length: 30 seconds.
Outstream Ads
- Description: Mobile-only video ads that appear on websites and apps within the Google video partner network, not on YouTube itself.
- Logo Specs:
- Square: 1:1 (200 x 200px).
- File Type: JPG, GIF, PNG.
- Max Size: 200KB.
Masthead Ads
- Description: High-visibility ads at the top of the YouTube homepage.
- Resolution: 1920 x 1080 or higher.
- File Type: JPG or PNG (without transparency).
Conclusion
YouTube offers a variety of ad formats to reach audiences effectively in 2024. Whether you want to build brand awareness, drive conversions, or target specific demographics, YouTube provides a dynamic platform for your advertising needs. Always follow Google’s advertising policies and the technical ad specs to ensure your ads perform their best. Ready to start using YouTube ads? Contact us today to get started!
MARKETING
Why We Are Always ‘Clicking to Buy’, According to Psychologists
Amazon pillows.
MARKETING
A deeper dive into data, personalization and Copilots
Salesforce launched a collection of new, generative AI-related products at Connections in Chicago this week. They included new Einstein Copilots for marketers and merchants and Einstein Personalization.
To better understand not only the potential impact of the new products but also the evolving Salesforce architecture, we sat down with Bobby Jania, CMO of Marketing Cloud.
Dig deeper: Salesforce piles on the Einstein Copilots
Salesforce’s evolving architecture
It’s hard to deny that Salesforce likes coming up with new names for platforms and products (what happened to Customer 360?) and this can sometimes make the observer wonder if something is brand new, or old but with a brand new name. In particular, what exactly is Einstein 1 and how is it related to Salesforce Data Cloud?
“Data Cloud is built on the Einstein 1 platform,” Jania explained. “The Einstein 1 platform is our entire Salesforce platform and that includes products like Sales Cloud, Service Cloud — that it includes the original idea of Salesforce not just being in the cloud, but being multi-tenancy.”
Data Cloud — not an acquisition, of course — was built natively on that platform. It was the first product built on Hyperforce, Salesforce’s new cloud infrastructure architecture. “Since Data Cloud was on what we now call the Einstein 1 platform from Day One, it has always natively connected to, and been able to read anything in Sales Cloud, Service Cloud [and so on]. On top of that, we can now bring in, not only structured but unstructured data.”
That’s a significant progression from the position, several years ago, when Salesforce had stitched together a platform around various acquisitions (ExactTarget, for example) that didn’t necessarily talk to each other.
“At times, what we would do is have a kind of behind-the-scenes flow where data from one product could be moved into another product,” said Jania, “but in many of those cases the data would then be in both, whereas now the data is in Data Cloud. Tableau will run natively off Data Cloud; Commerce Cloud, Service Cloud, Marketing Cloud — they’re all going to the same operational customer profile.” They’re not copying the data from Data Cloud, Jania confirmed.
Another thing to know is that it’s possible for Salesforce customers to import their own datasets into Data Cloud. “We wanted to create a federated data model,” said Jania. “If you’re using Snowflake, for example, we more or less virtually sit on your data lake. The value we add is that we will look at all your data and help you form these operational customer profiles.”
Let’s learn more about Einstein Copilot
“Copilot means that I have an assistant with me in the tool where I need to be working that contextually knows what I am trying to do and helps me at every step of the process,” Jania said.
For marketers, this might begin with a campaign brief developed with Copilot’s assistance, the identification of an audience based on the brief, and then the development of email or other content. “What’s really cool is the idea of Einstein Studio where our customers will create actions [for Copilot] that we hadn’t even thought about.”
Here’s a key insight (back to nomenclature). We reported on Copilot for marketers, Copilot for merchants, Copilot for shoppers. It turns out, however, that there is just one Copilot, Einstein Copilot, and these are use cases. “There’s just one Copilot, we just add these for a little clarity; we’re going to talk about marketing use cases, about shoppers’ use cases. These are actions for the marketing use cases we built out of the box; you can build your own.”
It’s surely going to take a little time for marketers to learn to work easily with Copilot. “There’s always time for adoption,” Jania agreed. “What is directly connected with this is, this is my ninth Connections and this one has the most hands-on training that I’ve seen since 2014 — and a lot of that is getting people using Data Cloud, using these tools rather than just being given a demo.”
What’s new about Einstein Personalization
Salesforce Einstein has been around since 2016 and many of the use cases seem to have involved personalization in various forms. What’s new?
“Einstein Personalization is a real-time decision engine and it’s going to choose next-best-action, next-best-offer. What is new is that it’s a service now that runs natively on top of Data Cloud.” A lot of real-time decision engines need their own set of data, which might actually be just a subset of your data. “Einstein Personalization is going to look holistically at a customer and recommend a next-best-action that could be natively surfaced in Service Cloud, Sales Cloud or Marketing Cloud.”
Finally, trust
One feature of the presentations at Connections was the reassurance that, although public LLMs like ChatGPT could be selected for application to customer data, none of that data would be retained by the LLMs. Is this just a matter of written agreements? No, not just that, said Jania.
“In the Einstein Trust Layer, all of the data, when it connects to an LLM, runs through our gateway. If there was a prompt that had personally identifiable information — a credit card number, an email address — at a minimum, all that is stripped out. The LLMs do not store the output; we store the output for auditing back in Salesforce. Any output that comes back through our gateway is logged in our system; it runs through a toxicity model; and only at the end do we put PII data back into the answer. There are real pieces beyond a handshake that this data is safe.”
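Conceptually, the flow Jania describes maps to something like the sketch below. This is not Salesforce’s implementation; `mask_pii`, `call_llm`, and `toxicity_score` are hypothetical stand-ins for the gateway steps he outlines (strip PII, call the LLM, log the output for auditing, run a toxicity check, then restore PII only at the end).

```python
import re

AUDIT_LOG = []  # stand-in for an audit store kept inside the platform

def mask_pii(prompt: str):
    """Replace obvious PII (emails here, as a toy example) with placeholder tokens."""
    replacements = {}
    def _swap(match):
        token = f"<PII_{len(replacements)}>"
        replacements[token] = match.group(0)
        return token
    masked = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", _swap, prompt)
    return masked, replacements

def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; a real gateway would route to the selected model."""
    return f"Draft reply for: {prompt}"

def toxicity_score(text: str) -> float:
    """Hypothetical toxicity model; returns a score in [0, 1]."""
    return 0.0

def trusted_generate(prompt: str) -> str:
    masked_prompt, pii_map = mask_pii(prompt)                        # 1. strip PII before the LLM sees it
    output = call_llm(masked_prompt)                                 # 2. the LLM does not store the exchange
    AUDIT_LOG.append({"prompt": masked_prompt, "output": output})    # 3. log the output for auditing
    if toxicity_score(output) > 0.5:                                 # 4. toxicity check on the way back
        return "Response blocked by toxicity check."
    for token, original in pii_map.items():                          # 5. re-insert PII only at the end
        output = output.replace(token, original)
    return output

print(trusted_generate("Email jane@example.com about her contract renewal."))
```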