
11 A/B Testing Examples From Real Businesses


Whether you’re looking to increase revenue, sign-ups, social shares, or engagement, A/B testing and optimization can help you get there.

But for many marketers out there, the tough part about A/B testing is often finding the right test to drive the biggest impact — especially when you’re just getting started. So, what’s the recipe for high-impact success?

Free Download: A/B Testing Guide and Kit

Truthfully, there is no one-size-fits-all recipe. What works for one business won’t work for another — and finding the right metrics and timing to test can be a tough problem to solve. That’s why you need inspiration from A/B testing examples.

In this post, let’s review how a hypothesis will get you started with your testing, and check out excellent examples from real businesses using A/B testing. While the same tests may not get you the same results, they can help you run creative tests of your own.

A/B Testing Hypothesis Examples

A hypothesis can make or break your experiment, especially when it comes to A/B testing. When creating your hypothesis, you want to make sure that it’s:

  1. Focused on one specific problem you want to solve or understand
  2. Able to be proven or disproven
  3. Focused on making an impact (bringing higher conversion rates, lower bounce rate, etc.)

When creating a hypothesis, following the “If, then” structure can be helpful, where if you changed a specific variable, then a particular result would happen.

Here are some examples of what that would look like in an A/B testing hypothesis:

  • Shortening contact submission forms to only contain required fields would increase the number of sign-ups.
  • Changing the call-to-action text from “Download now” to “Download this free guide” would increase the number of downloads.
  • Reducing the frequency of mobile app notifications from five times per day to two times per day will increase mobile app retention rates.
  • Using featured images that are more contextually related to our blog posts will contribute to a lower bounce rate.
  • Greeting customers by name in emails will increase the total number of clicks.

Let’s go over some real-life examples of A/B testing to prepare you for your own.

A/B Testing Examples

Website A/B Testing Examples

1. HubSpot Academy’s Homepage Hero Image

Most websites have a homepage hero image that inspires users to engage and spend more time on the site. This A/B testing example shows how hero image changes can impact user behavior and conversions.

Problem

Based on previous data, HubSpot Academy found that out of more than 55,000 page views, only 0.9% of visitors watched the video on the homepage. Of those viewers, almost 50% watched the full video.

Chat transcripts also highlighted the need for clearer messaging for this useful and free resource.

That’s why the HubSpot team decided to test how clear value propositions could improve user engagement and delight.

A/B Test Method

HubSpot used three variants for this test, using HubSpot Academy conversion rate (CVR) as the primary metric. Secondary metrics included CTA clicks and engagement.

Variant A was the control.

A/B testing examples: HubSpot Academy's Homepage Hero

For variant B, the team added more vibrant images and colorful text and shapes. It also included an animated “typing” headline.

A/B testing examples: HubSpot Academy's Homepage Hero

Variant C also added color and movement, as well as animated images on the right-hand side of the page.

A/B testing examples: HubSpot Academy's Homepage Hero

Results

As a result, HubSpot found that variant B outperformed the control by 6%. In contrast, variant C underperformed the control by 1%. From those numbers, HubSpot projected that using variant B would lead to about 375 more sign-ups each month.
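
The article doesn’t show the math behind that projection, but the general approach is simple: apply the observed relative lift to your baseline conversion volume. Here’s a minimal sketch in Python; the monthly traffic figure echoes the page views mentioned above, while the baseline conversion rate is a hypothetical placeholder, not HubSpot’s actual number.

```python
# Minimal sketch: turning an observed A/B lift into a monthly projection.
# The baseline conversion rate below is a hypothetical placeholder.

def projected_extra_signups(monthly_visitors: int,
                            baseline_cvr: float,
                            relative_lift: float) -> float:
    """Estimate additional monthly sign-ups if the winning variant ships."""
    baseline_signups = monthly_visitors * baseline_cvr
    return baseline_signups * relative_lift

# ~55,000 monthly page views (from the article), an assumed 11% baseline CVR,
# and variant B's observed 6% relative lift.
extra = projected_extra_signups(55_000, baseline_cvr=0.11, relative_lift=0.06)
print(f"Projected extra sign-ups per month: {extra:.0f}")  # ~363 with these inputs
```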

2. FSAstore.com’s Site Navigation

Every marketer will have to focus on conversion at some point. But building a website that converts is tough.

Problem

FSAstore.com is an ecommerce company supplying home goods for Americans with a flexible spending account.

This useful site could help the 35 million+ Americans who have an FSA. But the website funnel was overwhelming, with too many options, especially on category pages. The team felt that customers weren’t making purchases because of this.

A/B Test Method

To figure out how to appeal to its customers, this company tested a simplified version of its website. The current site included an information-packed subheader in the site navigation.

To test the hypothesis, this A/B testing example compared the current site to an update without the subheader.

A/B testing examples: FSAstore.com

Results

This update showed a clear boost in conversions, and FSAstore.com saw a 53.8% increase in revenue per visitor.

3. Expoze’s Web Page Background

The visuals on your web page are important because they help users decide whether they want to spend more time on your site.

In this A/B testing example, Expoze.io decided to test the background on its homepage.

Problem

The website home page was difficult for some users to read because of low contrast. The team also needed to figure out how to improve page navigation while still representing the brand.

A/B Test Method

First, the team did some research and created several different designs. The goals of the redesign were to improve the visuals and increase attention to specific sections of the home page, like the video thumbnail.

A/B testing examples: Expoze.io

They used AI-generated eye tracking as they designed to find the best designs before A/B testing. Then they ran an A/B heatmap test to see whether the new or current design got the most attention from visitors.

A/B testing examples: Expoze.io heatmaps

Results

The new design showed a big increase in attention, with version B bringing over 40% more attention to the desired sections of the home page.

This design change also brought a 25% increase in CTA clicks. The team believes this is due to the added contrast on the page bringing more attention to the CTA button, which was not changed.

4. Thrive Themes’ Sales Page Optimization

Many landing pages showcase testimonials. That’s valuable content and it can boost conversion.

That’s why Thrive Themes decided to test a new feature on its landing pages — customer testimonials.

Problem

In the control, Thrive Themes had been using a banner that highlighted product features, but not how customers felt about the product.

The team decided to test whether adding testimonials to a sales landing page could improve conversion rates.

A/B Test Method

In this A/B test example, the team ran a 6-week test with the control against an updated landing page with testimonials.

A/B testing examples: Thrive Themes

Results

This change netted a 13% increase in sales. The control page had a 2.2% conversion rate, but the new variant showed a 2.75% conversion rate.

Email A/B Testing Examples

5. HubSpot’s Email Subscriber Experience

Getting users to engage with email isn’t an easy task. That’s why HubSpot decided to A/B test how alignment impacts CTA clicks.

Problem

HubSpot decided to change text alignment in the weekly emails for subscribers to improve the user experience. Ideally, this improved experience would result in a higher click rate.

A/B Test Method

For the control, HubSpot sent centered email text to users.

A/B test examples: HubSpot, centered text alignment

For variant B, HubSpot sent emails with left-justified text.

A/B test examples: HubSpot, left-justified text alignment

Results

HubSpot found that emails with left-aligned text got fewer clicks than the control: fewer than 25% of the left-justified sends got more clicks than the centered version.

6. Neurogan’s Deal Promotion

Making the most of email promotion is important for any company, especially those in competitive industries.

This example uses the power of current customers for increasing email engagement.

Problem

Neurogan wasn’t always offering the right content to its audience and it was having a hard time competing with a flood of other new brands.

A/B Test Method

An email agency audited this brand’s email marketing, then focused efforts on segmentation. This A/B testing example starts with creating product-specific offers. Then, this team used testing to figure out which deals were best for each audience.

Results

These changes brought higher revenue for promotions and higher click rates. They also led to a new workflow with a 37% average open rate and a 3.85% click rate.

For more on how to run A/B testing for your campaigns, check out this free A/B testing kit.

Social Media A/B Testing Examples

7. Vestiaire’s TikTok Awareness Campaign

A/B testing examples like the one below can help you think creatively about what to test and when. This is extra helpful if your business is working with influencers and doesn’t want to impact their process while working toward business goals.

Problem

Fashion brand Vestiaire wanted help growing the brand on TikTok. It was also hoping to increase awareness with Gen Z audiences for its new direct shopping feature.

A/B Test Method

Vestiaire’s influencer marketing agency asked eight influencers to create content with specific CTAs to meet the brand’s goals. Each influencer had extensive creative freedom and created a range of different social media posts.

Then, the agency used A/B testing to choose the best-performing content and promoted this content with paid advertising.

A/B testing examples: Vestiaire

Results

This testing example generated over 4,000 installs. It also decreased the cost per install by 50% compared to the brand’s existing presence on Instagram and YouTube.

8. Underoutfit’s Promotion of User-Generated Content on Facebook

Paid advertising is getting more expensive, and clickthrough rates decreased through the end of 2022.

To make the most of social ad spend, marketers are using A/B testing to improve ad performance. This approach helps them test creative content before launching paid ad campaigns, like in the examples below.

Problem

Underoutfit wanted to increase brand awareness on Facebook.

A/B Test Method

To meet this goal, it decided to try adding branded user-generated content. This brand worked with an agency and several creators to create branded content to drive conversion.

Then, Underoutfit ran split testing between product ads and the same ads combined with the new branded content ads. Both groups in the split test contained key marketing messages and clear CTA copy.

The brand and agency also worked with Meta Creative Shop to make sure the videos met best practice standards.

A/B testing examples: Underoutfit

Results

The test showed impressive results for the branded content variant, including a 47% higher clickthrough rate and 28% higher return on ad spend.

9. Databricks’ Ad Performance on LinkedIn

Pivoting to a new strategy quickly can be difficult for organizations. This A/B testing example shows how you can use split testing to figure out the best new approach to a problem.

Problem

Databricks, a cloud software tool, needed to raise awareness for an event that was shifting from in-person to online.

A/B Test Method

To connect with a large group of new people in a personalized way, the team decided to create a LinkedIn Message Ads campaign. To make sure the messages were effective, it used A/B testing to tweak the subject line and message copy.

Results

A/B testing examples: Databricks

The third variant of the copy featured a hyperlink in the first sentence of the invitation. Compared to the other two variants, this version got nearly twice as many clicks and conversions.

Mobile A/B Testing Examples

10. HubSpot’s Mobile Calls-to-Action

On this blog, you’ll notice anchor text in the introduction, a graphic CTA at the bottom, and a slide-in CTA when you scroll through the post. Once you click on one of these offers, you’ll land on a content offer page.

While many users access these offers from a desktop or laptop computer, many others plan to download these offers to mobile devices.

Problem

But on mobile, users weren’t finding the CTA buttons as quickly as they could on a computer. That’s why HubSpot tested mobile design changes to improve the user experience.

Previous A/B tests revealed that HubSpot’s mobile audience was 27% less likely to click through to download an offer. Also, less than 75% of mobile users were scrolling down far enough to see the CTA button.

A/B Test Method

So, HubSpot decided to test different versions of the offer page CTA, using conversion rate (CVR) as the primary metric. For secondary metrics, the team measured CTA clicks for each CTA, as well as engagement.

HubSpot used four variants for this test.

For variant A, the control, the traditional placement of CTAs remained unchanged.

For variant B, the team redesigned the hero image and added a sticky CTA bar.

A/B testing examples: HubSpot mobile, A & B

For variant C, the redesigned hero was the only change.

For variant D, the team redesigned the hero image and repositioned the slider.

A/B testing examples: HubSpot mobile, C & D

Results

All variants outperformed the control for the primary metric, CVR. Variant C saw a 10% increase, variant B saw a 9% increase, and variant D saw an 8% increase.

From those numbers, HubSpot was able to project that using variant C on mobile would lead to about 1,400 more content leads and almost 5,700 more form submissions each month.

11. Hospitality.net’s Mobile Booking

Businesses need to keep up with quick shifts in mobile devices to create a consistently strong customer experience.

A/B testing examples like the one below can help your business streamline this process.

Problem

Hospitality.net offered both simplified and dynamic mobile booking experiences. The simplified experience, designed for smaller screens, showed a limited number of available dates. The dynamic experience, built for larger mobile screens, showed a wider range of dates and prices.

But the brand wasn’t sure which mobile optimization strategy would be better for conversion.

A/B Test Method

This brand believed that customers would prefer the dynamic experience and that it would get more conversions. But it chose to test these ideas with a simple A/B test. Over 34 days, it sent half of the mobile visitors to the simplified mobile experience, and half to the dynamic experience, with over 100,000 visitors total.
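
The case study doesn’t say how visitors were routed, but a common way to run this kind of 50/50 split is deterministic bucketing: hash a stable visitor ID so the groups stay roughly even and each visitor always sees the same experience. A generic sketch (not Hospitality.net’s actual setup):

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "mobile-booking") -> str:
    """Deterministically assign a visitor to the simplified or dynamic experience.

    Hashing a stable visitor ID keeps the split close to 50/50 across a large
    sample and guarantees a visitor sees the same variant for the whole test.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return "simplified" if int(digest, 16) % 2 == 0 else "dynamic"

print(assign_variant("visitor-12345"))  # the same ID always returns the same variant
```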

A/B testing examples: Hospitality.net

Results

This A/B testing example showed a 33% improvement in conversion. It also helped confirm the brand’s educated guesses about mobile booking preferences.

A/B Testing Takeaways for Marketers

A lot of different factors can go into A/B testing, depending on your business needs. But there are a few key things to keep in mind:

  • Every A/B test should start with a hypothesis focused on one specific problem that you can test.
  • Make sure you’re testing a control variable (your original version) and a treatment variable (a new version that you think will perform better).
  • You can test various things, like landing pages, CTAs, emails, or mobile app designs.
  • To know whether your results actually mean something, check the statistical significance of your test (see the sketch after this list).
  • There are a variety of goals to focus on for A/B testing (increased site traffic, lower bounce rates, etc.), but every goal should tie back to a hypothesis you can test and either support or disprove.
  • When testing, split your sample groups equally and randomly so your results are valid and not due to chance.
  • Take action based on the results you observe.
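
To make the statistical-significance point above concrete, here is a minimal two-proportion z-test sketch. The visitor and conversion counts are placeholders; swap in the numbers from your own test.

```python
# Minimal sketch: two-proportion z-test for an A/B test (placeholder numbers).
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) comparing two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

z, p = two_proportion_z_test(conv_a=550, n_a=25_000, conv_b=640, n_b=25_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests the lift is unlikely to be chance
```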

Start Your Next A/B Test Today

You can see amazing results from the A/B testing examples above. These businesses were able to take action on goals because they started testing. If you want to get great results, you’ve got to get started, too.

Editor’s note: This post was originally published in October 2014 and has been updated for comprehensiveness.




A deeper dive into data, personalization and Copilots


Salesforce launched a collection of new, generative AI-related products at Connections in Chicago this week. They included new Einstein Copilots for marketers and merchants and Einstein Personalization.

To better understand not only the potential impact of the new products but also the evolving Salesforce architecture, we sat down with Bobby Jania, CMO of Marketing Cloud.

Dig deeper: Salesforce piles on the Einstein Copilots

Salesforce’s evolving architecture

It’s hard to deny that Salesforce likes coming up with new names for platforms and products (what happened to Customer 360?) and this can sometimes make the observer wonder if something is brand new, or old but with a brand new name. In particular, what exactly is Einstein 1 and how is it related to Salesforce Data Cloud?

“Data Cloud is built on the Einstein 1 platform,” Jania explained. “The Einstein 1 platform is our entire Salesforce platform and that includes products like Sales Cloud, Service Cloud — that it includes the original idea of Salesforce not just being in the cloud, but being multi-tenancy.”

Data Cloud — not an acquisition, of course — was built natively on that platform. It was the first product built on Hyperforce, Salesforce’s new cloud infrastructure architecture. “Since Data Cloud was on what we now call the Einstein 1 platform from Day One, it has always natively connected to, and been able to read anything in Sales Cloud, Service Cloud [and so on]. On top of that, we can now bring in, not only structured but unstructured data.”

That’s a significant progression from the position, several years ago, when Salesforce had stitched together a platform around various acquisitions (ExactTarget, for example) that didn’t necessarily talk to each other.

“At times, what we would do is have a kind of behind-the-scenes flow where data from one product could be moved into another product,” said Jania, “but in many of those cases the data would then be in both, whereas now the data is in Data Cloud. Tableau will run natively off Data Cloud; Commerce Cloud, Service Cloud, Marketing Cloud — they’re all going to the same operational customer profile.” They’re not copying the data from Data Cloud, Jania confirmed.

Another thing to know is that it’s possible for Salesforce customers to import their own datasets into Data Cloud. “We wanted to create a federated data model,” said Jania. “If you’re using Snowflake, for example, we more or less virtually sit on your data lake. The value we add is that we will look at all your data and help you form these operational customer profiles.”

Let’s learn more about Einstein Copilot

“Copilot means that I have an assistant with me in the tool where I need to be working that contextually knows what I am trying to do and helps me at every step of the process,” Jania said.

For marketers, this might begin with a campaign brief developed with Copilot’s assistance, the identification of an audience based on the brief, and then the development of email or other content. “What’s really cool is the idea of Einstein Studio where our customers will create actions [for Copilot] that we hadn’t even thought about.”

Here’s a key insight (back to nomenclature). We reported on Copilot for marketers, Copilot for merchants and Copilot for shoppers. It turns out, however, that there is just one Copilot, Einstein Copilot, and these are use cases. “There’s just one Copilot, we just add these for a little clarity; we’re going to talk about marketing use cases, about shoppers’ use cases. These are actions for the marketing use cases we built out of the box; you can build your own.”

It’s surely going to take a little time for marketers to learn to work easily with Copilot. “There’s always time for adoption,” Jania agreed. “What is directly connected with this is, this is my ninth Connections and this one has the most hands-on training that I’ve seen since 2014 — and a lot of that is getting people using Data Cloud, using these tools rather than just being given a demo.”

What’s new about Einstein Personalization

Salesforce Einstein has been around since 2016 and many of the use cases seem to have involved personalization in various forms. What’s new?

“Einstein Personalization is a real-time decision engine and it’s going to choose next-best-action, next-best-offer. What is new is that it’s a service now that runs natively on top of Data Cloud.” A lot of real-time decision engines need their own set of data that might actually be a subset of data. “Einstein Personalization is going to look holistically at a customer and recommend a next-best-action that could be natively surfaced in Service Cloud, Sales Cloud or Marketing Cloud.”

Finally, trust

One feature of the presentations at Connections was the reassurance that, although public LLMs like ChatGPT could be selected for application to customer data, none of that data would be retained by the LLMs. Is this just a matter of written agreements? No, not just that, said Jania.

“In the Einstein Trust Layer, all of the data, when it connects to an LLM, runs through our gateway. If there was a prompt that had personally identifiable information — a credit card number, an email address — at a minimum, all that is stripped out. The LLMs do not store the output; we store the output for auditing back in Salesforce. Any output that comes back through our gateway is logged in our system; it runs through a toxicity model; and only at the end do we put PII data back into the answer. There are real pieces beyond a handshake that this data is safe.”



Why The Sales Team Hates Your Leads (And How To Fix It)


You ask the head of marketing how the team is doing and get a giant thumbs up. 👍

“Our MQLs are up!”

“Website conversion rates are at an all-time high!”

“Email click rates have never been this good!”

But when you ask the head of sales the same question, you get the response that echoes across sales desks worldwide — the leads from marketing suck. 

If you’re in this boat, you’re not alone. “The leads from marketing suck” is a common complaint in most organizations. In a HubSpot survey, only 9.1% of salespeople said the leads they received from marketing were of very high quality.

Why do sales teams hate marketing-generated leads? And how can marketers help their sales peers fall in love with their leads? 

Let’s dive into the answers to these questions. Then, I’ll give you my secret lead gen kung-fu to ensure your sales team loves their marketing leads. 

Marketers Must Take Ownership

“I’ve hit the lead goal. If sales can’t close them, it’s their problem.”

How many times have you heard one of your marketers say something like this? When your teams are heavily siloed, it’s not hard to see how they get to this mindset — after all, if your marketing metrics look strong, they’ve done their part, right?

Not necessarily. 

The job of a marketer is not to drive traffic or even leads. The job of the marketer is to create messaging and offers that lead to revenue. Marketing is not a 100-meter sprint — it’s a relay race. The marketing team runs the first leg and hands the baton to sales to sprint to the finish.


To make leads valuable beyond the vanity metric of watching your MQLs tick up, you need to segment and nurture them. Screen the leads to see if they meet the parameters of your ideal customer profile. If yes, nurture them to find out how close their intent is to a sale. Only then should you pass the leads to sales. 

Lead Quality Control is a Bitter Pill that Works

Tighter quality control might reduce your overall MQLs. Still, it will ensure only the relevant leads go to sales, which is a win for your team and your organization.

This will require a mindset shift for your marketing team: instead of living and dying by the sheer number of MQLs, you need to create a collaborative culture between sales and marketing. Reinforce that “strong” marketing metrics that result in poor leads going to sales aren’t really strong at all.

When you foster this culture of collaboration and accountability, it will be easier for the marketing team to receive feedback from sales about lead quality without getting defensive. 

Remember, the sales team is only holding marketing accountable so the entire organization can achieve the right results. It’s not sales vs marketing — it’s sales and marketing working together to get a great result. Nothing more, nothing less. 

We’ve identified the problem and where we need to go. So, how do you get there?

Fix #1: Focus On High ROI Marketing Activities First

What is more valuable to you:

  • One more blog post for a few more views? 
  • One great review that prospective buyers strongly relate to?

Hopefully, you’ll choose the latter. After all, talking to customers and getting a solid testimonial can help your sales team close leads today.  Current customers talking about their previous issues, the other solutions they tried, why they chose you, and the results you helped them achieve is marketing gold.

On the other hand, even the best blog content will take months to gain enough traction to impact your revenue.

Still, many marketers who say they want to prioritize customer reviews focus all their efforts on blog content and other “top of the funnel” (Awareness, Acquisition, and Activation) efforts. 

The bottom half of the growth marketing funnel (Retention, Reputation, and Revenue) often gets ignored, even though it’s where you’ll find some of the highest ROI activities.


Most marketers know retaining a customer is easier than acquiring a new one. But knowing this and working with sales on retention and account expansion are two different things. 

When you start focusing on retention, upselling, and expansion, your entire organization will feel it, from sales to customer success. These happier customers will increase your average account value and drive awareness through strong word of mouth, giving you one heck of a win/win.

Winning the Retention, Reputation, and Referral game also helps feed your Awareness, Acquisition, and Activation activities:

  • Increasing customer retention means more dollars stay within your organization to help achieve revenue goals and fund lead gen initiatives.
  • A fully functioning referral system lowers your customer acquisition cost (CAC) because these leads are already warm coming in the door.
  • Case studies and reviews are powerful marketing assets for lead gen and nurture activities as they demonstrate how you’ve solved identical issues for other companies.

Remember that the bottom half of your marketing and sales funnel is just as important as the top half. After all, there’s no point pouring leads into a leaky funnel. Instead, you want to build a frictionless, powerful growth engine that brings in the right leads, nurtures them into customers, and then delights those customers to the point that they can’t help but rave about you.

So, build a strong foundation and start from the bottom up. You’ll find a better return on your investment. 

Fix #2: Join Sales Calls to Better Understand Your Target Audience

You can’t market well what you don’t know how to sell.

Your sales team speaks directly to customers, understands their pain points, and knows the language they use to talk about those pains. Your marketing team needs this information to craft the perfect marketing messaging your target audience will identify with.

When marketers join sales calls or speak to existing customers, they get firsthand introductions to these pain points. Often, marketers realize that customers’ pain points and reservations are very different from those they address in their messaging. 

Once you understand your ideal customers’ objections, anxieties, and pressing questions, you can create content and messaging to remove some of these reservations before the sales call. This effort removes a barrier for your sales team, resulting in more SQLs.

Fix #3: Create Collateral That Closes Deals

One-pagers, landing pages, PDFs, decks — sales collateral could be anything that helps increase the chance of closing a deal. Let me share an example from Lean Labs. 

Our webinar page has a CTA form that allows visitors to talk to our team. Instead of a simple “get in touch” form, we created a drop-down segmentation based on the user’s challenge and need. This step helps the reader feel seen, gives them hope that they’ll receive real value from the interaction, and provides unique content to users based on their selection.


So, if they select I need help with crushing it on HubSpot, they’ll get a landing page with HubSpot-specific content (including a video) and a meeting scheduler. 

Speaking directly to your audience’s needs and pain points through these steps dramatically increases the chances of them booking a call. Why? Because instead of trusting that a generic “expert” will be able to help them with their highly specific problem, they can see through our content and our form design that Lean Labs can solve their most pressing pain point. 

Fix #4: Focus On Reviews and Create an Impact Loop

A lot of people think good marketing is expensive. You know what’s even more expensive? Bad marketing.

To get the best ROI on your marketing efforts, you need to create a marketing machine that pays for itself. When you create this machine, you need to think about two loops: the growth loop and the impact loop.

  • Growth loop — Awareness ➡ Acquisition ➡ Activation ➡ Revenue ➡ Awareness: This is where most marketers start. 
  • Impact loop — Results ➡ Reviews ➡ Retention ➡ Referrals ➡ Results: This is where great marketers start. 

Most marketers start with their growth loop and then hope that traction feeds into their impact loop. In reality, starting with your impact loop is far more likely to set your marketing engine up for success.

Let me share a client story to show you what this looks like in real life.

Client Story: 4X Website Leads In A Single Quarter

We partnered with a health tech startup looking to grow their website leads. One way to grow website leads is to boost organic traffic, of course, but any organic play is going to take time. If you’re playing the SEO game alone, quadrupling conversions can take up to a year or longer.

But we did it in a single quarter. Here’s how.

We realized that the startup’s demos were converting lower than industry standards. A little more digging showed us why: our client was new enough to the market that the average person didn’t trust them enough yet to want to invest in checking out a demo. So, what did we do?

We prioritized the last part of the funnel: reputation.

We ran a 5-star reputation campaign to collect reviews. Once we had the reviews we needed, we showcased them at critical parts of the website and then made sure those same reviews were posted and shown on other third-party review platforms. 

Remember that reputation plays are vital, and they’re one of the plays startups often neglect at best and ignore at worst. What others say about your business is ten times more important than what you say about yourself.

By providing customer validation at critical points in the buyer journey, we were able to 4X the website leads in a single quarter!


So, when you talk to customers, always look for opportunities to drive review/referral conversations and use them in marketing collateral throughout the buyer journey. 

Fix #5: Launch Phantom Offers for Higher Quality Leads 

You may be reading this post thinking, okay, my lead magnets and offers might be way off the mark, but how will I get the budget to create a new one that might not even work?

It’s an age-old issue: marketing teams invest way too much time and resources into creating lead magnets that fail to generate quality leads.

One way to improve your chances of success, stay nimble, and stay aligned with your audience without breaking the bank is to create phantom offers, i.e., gauge audience interest in a lead magnet before you actually create it.

For example, if you want to create a “World Security Report” for Chief Security Officers, don’t do all the research and complete the report as Step One. Instead, tease the offer to your audience before you spend time making it. Put an offer on your site asking visitors to join the waitlist for this report. Then wait and see how that phantom offer converts. 

This is precisely what we did for a report by Allied Universal that ended up generating 80 conversions before its release.


The best thing about a phantom offer is that it’s a win/win scenario: 

  • Best case: You get conversions even before you create your lead magnet.
  • Worst case: You save resources by not creating a lead magnet no one wants.  

Remember, You’re On The Same Team 

We’ve talked a lot about the reasons your marketing leads might suck. However, remember that it’s not all on marketers, either. At the end of the day, marketing and sales professionals are on the same team. They are not in competition with each other. They are allies working together toward a common goal. 

Smaller companies — or anyone under $10M in net new revenue — shouldn’t even separate sales and marketing into different departments. These teams need to be so in sync with one another that your best bet is to align them into a single growth team, one cohesive front with a single goal: profitable customer acquisition.

Interested in learning more about the growth marketing mindset? Check out the Lean Labs Growth Playbook that’s helped 25+ B2B SaaS marketing teams plan, budget, and accelerate growth.

