

11 A/B Testing Examples From Real Businesses




Whether you’re looking to increase revenue, sign-ups, social shares, or engagement, A/B testing and optimization can help you get there.

But for many marketers out there, the tough part about A/B testing is often finding the right test to drive the biggest impact — especially when you’re just getting started. So, what’s the recipe for high-impact success?

Free Download: A/B Testing Guide and Kit

Truthfully, there is no one-size-fits-all recipe. What works for one business won’t work for another — and finding the right metrics and timing to test can be a tough problem to solve. That’s why you need inspiration from A/B testing examples.

In this post, let’s review how a hypothesis will get you started with your testing, and check out excellent examples from real businesses using A/B testing. While the same tests may not get you the same results, they can help you run creative tests of your own.

A/B Testing Hypothesis Examples

A hypothesis can make or break your experiment, especially when it comes to A/B testing. When creating your hypothesis, you want to make sure that it’s:

  1. Focused on one specific problem you want to solve or understand
  2. Able to be proven or disproven
  3. Focused on making an impact (bringing higher conversion rates, lower bounce rate, etc.)

When creating a hypothesis, following the “If, then” structure can be helpful, where if you changed a specific variable, then a particular result would happen.

Here are some examples of what that would look like in an A/B testing hypothesis:

  • Shortening contact submission forms to only contain required fields would increase the number of sign-ups.
  • Changing the call-to-action text from “Download now” to “Download this free guide” would increase the number of downloads.
  • Reducing the frequency of mobile app notifications from five times per day to two times per day will increase mobile app retention rates.
  • Using featured images that are more contextually related to our blog posts will contribute to a lower bounce rate.
  • Greeting customers by name in emails will increase the total number of clicks.

Let’s go over some real-life examples of A/B testing to prepare you for your own.

A/B Testing Examples

Website A/B Testing Examples

1. HubSpot Academy’s Homepage Hero Image

Most websites have a homepage hero image that inspires users to engage and spend more time on the site. This A/B testing example shows how hero image changes can impact user behavior and conversions.


Based on previous data, HubSpot Academy found that out of more than 55,000 page views, only 0.9% of those users watched the video on the homepage. Of those viewers, almost 50% watched the full video.

Chat transcripts also highlighted the need for clearer messaging for this useful and free resource.

That’s why the HubSpot team decided to test how clear value propositions could improve user engagement and delight.

A/B Test Method

HubSpot used three variants for this test, using HubSpot Academy conversion rate (CVR) as the primary metric. Secondary metrics included CTA clicks and engagement.

Variant A was the control.

A/B testing examples: HubSpot Academy's Homepage Hero

For variant B, the team added more vibrant images and colorful text and shapes. It also included an animated “typing” headline.

A/B testing examples: HubSpot Academy's Homepage Hero

Variant C also added color and movement, as well as animated images on the right-hand side of the page.

A/B testing examples: HubSpot Academy's Homepage Hero


As a result, HubSpot found that variant B outperformed the control by 6%. In contrast, variant C underperformed the control by 1%. From those numbers, HubSpot projected that using variant B would lead to about 375 more sign-ups each month.

2. An FSA Ecommerce Store’s Site Navigation

Every marketer will have to focus on conversion at some point. But building a website that converts is tough.

This ecommerce company supplies home goods for Americans with a flexible spending account (FSA).

This useful site could help the 35 million+ customers who have an FSA. But the website funnel was overwhelming. It had too many options, especially on category pages. The team felt that customers weren’t making purchases because of that issue.

A/B Test Method

To figure out how to appeal to its customers, this company tested a simplified version of its website. The current site included an information-packed subheader in the site navigation.

To test the hypothesis, this A/B testing example compared the current site to an update without the subheader.



This update showed a clear boost in conversions, with a 53.8% increase in revenue per visitor.

3. Expoze’s Web Page Background

The visuals on your web page are important because they help users decide whether they want to spend more time on your site.

In this A/B testing example, Expoze decided to test the background on its homepage.


The website home page was difficult for some users to read because of low contrast. The team also needed to figure out how to improve page navigation while still representing the brand.

A/B Test Method

First, the team did some research and created several different designs. The goals of the redesign were to improve the visuals and increase attention to specific sections of the home page, like the video thumbnail.


They used AI-generated eye tracking during the design process to identify the strongest candidates before A/B testing. Then they ran an A/B heatmap test to see whether the new or the current design got more attention from visitors.

A/B testing examples: heatmaps


The new design showed a big increase in attention, with version B bringing over 40% more attention to the desired sections of the home page.

This design change also brought a 25% increase in CTA clicks. The team believes this is due to the added contrast on the page bringing more attention to the CTA button, which was not changed.

4. Thrive Themes’ Sales Page Optimization

Many landing pages showcase testimonials. That’s valuable content, and it can boost conversions.

That’s why Thrive Themes decided to test a new feature on its landing pages — customer testimonials.


In the control, Thrive Themes had been using a banner that highlighted product features, but not how customers felt about the product.

The team decided to test whether adding testimonials to a sales landing page could improve conversion rates.

A/B Test Method

In this A/B test example, the team ran a 6-week test with the control against an updated landing page with testimonials.

A/B testing examples: Thrive Themes


This change netted a 13% increase in sales. The control page had a 2.2% conversion rate, but the new variant showed a 2.75% conversion rate.

Email A/B Testing Examples

5. HubSpot’s Email Subscriber Experience

Getting users to engage with email isn’t an easy task. That’s why HubSpot decided to A/B test how alignment impacts CTA clicks.


HubSpot decided to change text alignment in the weekly emails for subscribers to improve the user experience. Ideally, this improved experience would result in a higher click rate.

A/B Test Method

For the control, HubSpot sent centered email text to users.

A/B test examples: HubSpot, centered text alignment

For variant B, HubSpot sent emails with left-justified text.

A/B test examples: HubSpot, left-justified text alignment


HubSpot found that emails with left-aligned text got fewer clicks than the control. Of all the left-justified emails sent, fewer than 25% got more clicks than the control.

6. Neurogan’s Deal Promotion

Making the most of email promotion is important for any company, especially those in competitive industries.

This example uses the power of current customers to increase email engagement.


Neurogan wasn’t always offering the right content to its audience, and it was having a hard time competing with a flood of new brands.

A/B Test Method

An email agency audited this brand’s email marketing, then focused efforts on segmentation. This A/B testing example starts with creating product-specific offers. Then, this team used testing to figure out which deals were best for each audience.


These changes brought higher revenue for promotions and higher click rates. It also led to a new workflow with a 37% average open rate and a click rate of 3.85%.

For more on how to run A/B testing for your campaigns, check out this free A/B testing kit.

Social Media A/B Testing Examples

7. Vestiaire’s TikTok Awareness Campaign

A/B testing examples like the one below can help you think creatively about what to test and when. This is extra helpful if your business is working with influencers and doesn’t want to impact their process while working toward business goals.


Fashion brand Vestiaire wanted help growing its presence on TikTok. It was also hoping to increase awareness of its new direct shopping feature with Gen Z audiences.

A/B Test Method

Vestiaire’s influencer marketing agency asked eight influencers to create content with specific CTAs to meet the brand’s goals. Each influencer had extensive creative freedom and created a range of different social media posts.

Then, the agency used A/B testing to choose the best-performing content and promoted this content with paid advertising.

A/B testing examples: Vestiaire


This testing example generated over 4,000 installs. It also decreased the cost per install by 50% compared to the brand’s existing presence on Instagram and YouTube.

8. Underoutfit’s Promotion of User-Generated Content on Facebook

Paid advertising is getting more expensive, and clickthrough rates decreased through the end of 2022.

To make the most of social ad spend, marketers are using A/B testing to improve ad performance. This approach helps them test creative content before launching paid ad campaigns, like in the examples below.


Underoutfit wanted to increase brand awareness on Facebook.

A/B Test Method

To meet this goal, it decided to try adding branded user-generated content. The brand worked with an agency and several creators to produce branded content designed to drive conversions.

Then, Underoutfit ran split testing between product ads and the same ads combined with the new branded content ads. Both groups in the split test contained key marketing messages and clear CTA copy.

The brand and agency also worked with Meta Creative Shop to make sure the videos met best practice standards.

A/B testing examples: Underoutfit


The test showed impressive results for the branded content variant, including a 47% higher clickthrough rate and 28% higher return on ad spend.

9. Databricks’ Ad Performance on LinkedIn

Pivoting to a new strategy quickly can be difficult for organizations. This A/B testing example shows how you can use split testing to figure out the best new approach to a problem.


Databricks, a cloud software tool, needed to raise awareness for an event that was shifting from in-person to online.

A/B Test Method

To connect with a large group of new people in a personalized way, the team decided to create a LinkedIn Message Ads campaign. To make sure the messages were effective, it used A/B testing to tweak the subject line and message copy.


A/B testing examples: Databricks

The third variant of the copy featured a hyperlink in the first sentence of the invitation. Compared to the other two variants, this version got nearly twice as many clicks and conversions.

Mobile A/B Testing Example

10. HubSpot’s Mobile Calls-to-Action

On this blog, you’ll notice anchor text in the introduction, a graphic CTA at the bottom, and a slide-in CTA when you scroll through the post. Once you click on one of these offers, you’ll land on a content offer page.

While many users access these offers from a desktop or laptop computer, many others plan to download these offers to mobile devices.


But on mobile, users weren’t finding the CTA buttons as quickly as they could on a computer. That’s why HubSpot tested mobile design changes to improve the user experience.

Previous A/B tests revealed that HubSpot’s mobile audience was 27% less likely to click through to download an offer. Also, fewer than 75% of mobile users were scrolling down far enough to see the CTA button.

A/B Test Method

So, HubSpot decided to test different versions of the offer page CTA, using conversion rate (CVR) as the primary metric. For secondary metrics, the team measured CTA clicks for each CTA, as well as engagement.

HubSpot used four variants for this test.

For variant A, the control, the traditional placement of CTAs remained unchanged.

For variant B, the team redesigned the hero image and added a sticky CTA bar.

A/B testing examples: HubSpot mobile, A & B

For variant C, the redesigned hero was the only change.

For variant D, the team redesigned the hero image and repositioned the slider.

A/B testing examples: HubSpot mobile, C & D


All variants outperformed the control for the primary metric, CVR. Variant C saw a 10% increase, variant B saw a 9% increase, and variant D saw an 8% increase.

From those numbers, HubSpot was able to project that using variant C on mobile would lead to about 1,400 more content leads and almost 5,700 more form submissions each month.

11. A Booking Brand’s Mobile Experience

Businesses need to keep up with quick shifts in mobile devices to create a consistently strong customer experience.

A/B testing examples like the one below can help your business streamline this process.

This brand offered both simplified and dynamic mobile booking experiences. The simplified experience showed a limited number of available dates, with a design built for smaller screens. The dynamic experience, built for larger mobile screens, showed a wider range of dates and prices.

But the brand wasn’t sure which mobile optimization strategy would be better for conversion.

A/B Test Method

This brand believed that customers would prefer the dynamic experience and that it would get more conversions. But it chose to test these ideas with a simple A/B test. Over 34 days, it sent half of the mobile visitors to the simplified mobile experience, and half to the dynamic experience, with over 100,000 visitors total.



This A/B testing example showed a 33% improvement in conversion. It also helped confirm the brand’s educated guesses about mobile booking preferences.

A/B Testing Takeaways for Marketers

A lot of different factors can go into A/B testing, depending on your business needs. But there are a few key things to keep in mind:

  • Every A/B test should start with a hypothesis focused on one specific problem that you can test.
  • Make sure you’re testing a control variable (your original version) and a treatment variable (a new version that you think will perform better).
  • You can test various things, like landing pages, CTAs, emails, or mobile app designs.
  • The best way to understand if your results mean something is to figure out the statistical significance of your test.
  • There are a variety of goals to focus on for A/B testing (increased site traffic, lower bounce rates, etc.), but you should be able to test, support, prove, and disprove your hypothesis.
  • When testing, make sure you’re splitting your sample groups equally and randomly, so your data is viable and not due to chance.
  • Take action based on the results you observe.
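The statistical significance mentioned in the takeaways above can be estimated with a two-proportion z-test, a common way to compare conversion rates between a control and a variant. This is a minimal sketch, not any vendor's implementation; the function name and visitor counts are illustrative (the 2.2% vs. 2.75% rates echo the Thrive Themes example earlier in this post):

```python
from math import erf, sqrt

def ab_test_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test comparing a control (a) against a variant (b)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both versions convert equally
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative numbers: 2.2% vs. 2.75% conversion, 10,000 visitors per group
z, p = ab_test_significance(220, 10_000, 275, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below 0.05 is the conventional bar for significance at the 95% confidence level; with these illustrative numbers, the variant clears it.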

Start Your Next A/B Test Today

You can see amazing results from the A/B testing examples above. These businesses were able to take action on goals because they started testing. If you want to get great results, you’ve got to get started, too.

Editor’s note: This post was originally published in October 2014 and has been updated for comprehensiveness.

The Ultimate A/B Testing Kit




The Complete Guide to Becoming an Authentic Thought Leader




Introduce your processes: If you’ve streamlined a particular process, share it. It could be the solution someone else is looking for.

Jump on trends and news: If there’s a hot topic or emerging trend, offer your unique perspective.

Share industry insights: Attended a webinar or podcast that offered valuable insights? Summarize the key takeaways and how they can be applied.

Share your successes: Write about strategies that have worked exceptionally well for you. Your audience will appreciate the proven advice. For example, I shared the process I used to help a former client rank for a keyword with over 2.2 million monthly searches.

Question outdated strategies: If you see a strategy that’s losing steam, suggest alternatives based on your experience and data.

5. Establish communication channels (How)

Once you know who your audience is and what they want to hear, the next step is figuring out how to reach them. Here’s how:

Choose the right platforms: You don’t need to have a presence on every social media platform. Pick two platforms where your audience hangs out and create content for that platform. For example, I’m active on LinkedIn and X because my target audience (SEOs, B2B SaaS, and marketers) is active on these platforms.

Repurpose content: Don’t limit yourself to just one type of content. Consider repurposing your content on Quora, Reddit, or even in webinars and podcasts. This increases your reach and reinforces your message.

Follow your audience: Go where your audience goes. If they’re active on X, that’s where you should be posting. If they frequent industry webinars, consider becoming a guest on those webinars.

Daily vs. in-depth content: Balance is key. Use social media for daily tips and insights, and reserve your blog for more comprehensive guides and articles.

Network with influencers: Your audience is likely following other experts in the field. Engaging with these influencers puts your content in front of a like-minded audience. I try to spend 30 minutes to an hour daily engaging with content on X and LinkedIn. This is the best way to build a relationship so you’re not a complete stranger when you DM privately.

6. Think of thought leadership as part of your content marketing efforts

As with other content efforts, thought leadership doesn’t exist in a vacuum. It thrives when woven into a cohesive content marketing strategy. By aligning individual authority with your brand, you amplify the credibility of both.

Think of it as top-of-the-funnel content to:

  • Build awareness about your brand

  • Highlight the problems you solve

  • Demonstrate expertise by platforming experts within the company who deliver solutions

Consider the user journey. An individual enters at the top through a social media post, podcast, or blog post. Intrigued, they want to learn more about you and either search your name on Google or social media. If they like what they see, they might visit your website, and if the information fits their needs, they move from passive readers to active prospects in your sales pipeline.




How to Increase Survey Completion Rate With 5 Top Tips




Collecting high-quality data is crucial to making strategic observations about your customers. Researchers have to consider the best ways to design their surveys, and then how to increase survey completion, because higher completion makes the data more reliable.

→ Free Download: 5 Customer Survey Templates [Access Now]

I’m going to explain how survey completion plays into the reliability of data. Then, we’ll get into how to calculate your survey completion rate versus the number of questions you ask. Finally, I’ll offer some tips to help you increase survey completion rates.

My goal is to make your data-driven decisions more accurate and effective. And just for fun, I’ll use cats in the examples because mine won’t stop walking across my keyboard.

Why Measure Survey Completion

Let’s set the scene: We’re inside a laboratory with a group of cat researchers. They’re wearing little white coats and goggles — and they desperately want to know what other cats think of various fish.

They’ve written up a 10-question survey and invited 100 cats from all socioeconomic rungs — rough and hungry alley cats all the way up to the ones that thrice daily enjoy their Fancy Feast from a crystal dish.

Now, survey completion rates are measured with two metrics: response rate and completion rate. Combining those metrics determines what percentage, out of all 100 cats, finished the entire survey. If all 100 give their full report on how delicious fish is, you’d achieve 100% survey completion and know that your information is as accurate as possible.

But the truth is, nobody achieves 100% survey completion, not even golden retrievers.

With this in mind, here’s how it plays out:

  • Let’s say 10 cats never show up for the survey because they were sleeping.
  • Of the 90 cats that started the survey, only 25 got through a few questions. Then, they wandered off to knock over drinks.
  • Thus, 90 cats gave some level of response, and 65 completed the survey (90 – 25 = 65).
  • Unfortunately, those 25 cats who only partially completed the survey had important opinions — they like salmon way more than any other fish.

The cat researchers achieved 72% survey completion (65 divided by 90), but their survey will not reflect the 25% of cats — a full quarter! — that vastly prefer salmon. (The other 65 cats had no statistically significant preference, by the way. They just wanted to eat whatever fish they saw.)

Now, the Kitty Committee reviews the research and decides, well, if they like any old fish they see, then offer the least expensive ones so they get the highest profit margin.

CatCorp, their competitors, ran the same survey; however, they offered all 100 participants their own glass of water to knock over — with a fish inside, even!

Only 10 of their 100 cats started but did not finish the survey. And the same 10 lazy cats from the other survey didn’t show up to this one, either.

So, there were 90 respondents and 80 completed surveys. CatCorp achieved an 88% completion rate (80 divided by 90), which revealed that most cats don’t care, but some really want salmon. CatCorp made salmon available and enjoyed higher profits than the Kitty Committee.

So you see, the higher your survey completion rates, the more reliable your data is. From there, you can make solid, data-driven decisions that are more accurate and effective. That’s the goal.

We measure the completion rates to be able to say, “Here’s how sure we can feel that this information is accurate.”

And if there’s a Maine Coon tycoon looking to invest, will they be more likely to do business with a cat food company whose decision-making metrics are 72% accurate or 88%? I suppose it could depend on who’s serving salmon.

While math was not my strongest subject in school, I had the great opportunity to take several college-level research and statistics classes, and the software we used did the math for us. That’s why I used 100 cats — to keep the math easy so we could focus on the importance of building reliable data.

Now, we’re going to talk equations and use more realistic numbers. Here’s the formula:

Completion rate equals the # of completed surveys divided by the # of survey respondents.

So, we need to take the number of completed surveys and divide that by the number of people who responded to at least one of your survey questions. Even just one question answered qualifies them as a respondent (versus nonrespondent, i.e., the 10 lazy cats who never show up).

Now, you’re running an email survey for, let’s say, Patton Avenue Pet Company. We’ll guess that the email list has 5,000 unique addresses to contact. You send out your survey to all of them.

Your analytics data reports that 3,000 people responded to one or more of your survey questions. Then, 1,200 of those respondents actually completed the entire survey.

3,000/5,000 = 0.6 = 60% — that’s your pool of survey respondents who answered at least one question. That sounds pretty good! But some of them didn’t finish the survey. You need to know the percentage of people who completed the entire survey. So here we go:

Completion rate equals the # of completed surveys divided by the # of survey respondents.

Completion rate = (1,200/3,000) = 0.40 = 40%

Voila, 40% of your respondents did the entire survey.

Response Rate vs. Completion Rate

Okay, so we know why the completion rate matters and how we find the right number. But did you also hear the term response rate? They are completely different figures based on separate equations, and I’ll show them side by side to highlight the differences.

  • Completion Rate = # of Completed Surveys divided by # of Respondents
  • Response Rate = # of Respondents divided by Total # of surveys sent out

Here are examples using the same numbers from above:

Completion Rate = (1,200/3,000) = 0.40 = 40%

Response Rate = (3,000/5,000) = 0.60 = 60%
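Both formulas are easy to sanity-check in a few lines of code. This is a minimal sketch reusing the Patton Avenue Pet Company numbers from above (the function name is just for illustration):

```python
def survey_rates(surveys_sent, respondents, completed):
    """Response rate and completion rate, using the definitions above."""
    response_rate = respondents / surveys_sent
    completion_rate = completed / respondents
    return response_rate, completion_rate

# 5,000 emails sent, 3,000 people answered at least one question,
# 1,200 finished the whole survey
response, completion = survey_rates(5_000, 3_000, 1_200)
print(f"response rate = {response:.0%}, completion rate = {completion:.0%}")
# → response rate = 60%, completion rate = 40%
```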

So, they are different figures that describe different things:

  • Completion rate: The percentage of your respondents that completed the entire survey. As a result, it indicates how sure we are that the information we have is accurate.
  • Response rate: The percentage of people who responded in any way to our survey questions.

The follow-up question is: How can we make this number as high as possible in order to be closer to a truer and more complete data set from the population we surveyed?

There’s more to learn about response rates and how to bump them up as high as you can, but we’re going to keep trucking with completion rates!

What’s a good survey completion rate?

That is a heavily loaded question. People in our industry have to say, “It depends,” far more than anybody wants to hear it, but it depends. Sorry about that.

There are lots of factors at play, such as what kind of survey you’re doing, what industry you’re doing it in, if it’s an internal or external survey, the population or sample size, the confidence level you’d like to hit, the margin of error you’re willing to accept, etc.

But you can’t really get a high completion rate unless you increase response rates first.

So instead of focusing on what’s a good completion rate, I think it’s more important to understand what makes a good response rate. Aim high enough, and survey completions should follow.

I checked in with the Qualtrics community and found this discussion about survey response rates:

“Just wondering what are the average response rates we see for online B2B CX surveys? […]

Current response rates: 6%–8%… We are looking at boosting the response rates but would first like to understand what is the average.”

The best answer came from a government service provider that works with businesses. The poster notes that their service is free to use, so they get very high response rates.

“I would say around 30–40% response rates to transactional surveys,” they write. “Our annual pulse survey usually sits closer to 12%. I think the type of survey and how long it has been since you rendered services is a huge factor.”

Since this conversation, “Delighted” (the Qualtrics blog) reported some fresher data:

survey completion rate vs number of questions new data, qualtrics data


The takeaway here is that response rates vary widely depending on the channel you use to reach respondents. On the upper end, the Qualtrics blog reports that customers had 85% response rates for employee email NPS surveys and 33% for email NPS surveys.

A good response rate, the blog writes, “ranges between 5% and 30%. An excellent response rate is 50% or higher.”

This echoes reports from Customer Thermometer, which marks a response rate of 50% or higher as excellent. Response rates between 5% and 30% are much more typical, the report notes. High response rates are driven by a strong motivation to complete the survey or a personal relationship between the brand and the customer.

If your business does little person-to-person contact, you’re out of luck. Customer Thermometer says you should expect responses on the lower end of the scale. The same goes for surveys distributed from unknown senders, which typically yield the lowest level of responses.

According to SurveyMonkey, surveys where the sender has no prior relationship have response rates of 20% to 30% on the high end.

Whatever numbers you do get, keep making those efforts to bring response rates up. That way, you have a better chance of increasing your survey completion rate. How, you ask?

Tips to Increase Survey Completion

If you want to boost survey completions among your customers, try the following tips.

1. Keep your survey brief.

We shouldn’t cram lots of questions into one survey, even if it’s tempting. Sure, it’d be nice to have more data points, but random people will probably not hunker down for 100 questions when we catch them during their half-hour lunch break.

Keep it short. Pare it down in any way you can.

Survey completion rate versus number of questions is a correlative relationship — the more questions you ask, the fewer people will answer them all. If you have the budget to pay the respondents, it’s a different story — to a degree.

“If you’re paying for survey responses, you’re more likely to get completions of a decently-sized survey. You’ll just want to avoid survey lengths that might tire, confuse, or frustrate the user. You’ll want to aim for quality over quantity,” says Pamela Bump, Head of Content Growth at HubSpot.

2. Give your customers an incentive.

For instance, if they’re cats, you could give them a glass of water with a fish inside.

Offer incentives that make sense for your target audience. If they feel like they are being rewarded for giving their time, they will have more motivation to complete the survey.

This can even accomplish two things at once — if you offer promo codes, discounts on products, or free shipping, it encourages them to shop with you again.

3. Keep it smooth and easy.

Keep your survey easy to read. Simplifying your questions has at least two benefits: People will understand the question better and give you the information you need, and people won’t get confused or frustrated and just leave the survey.

4. Know your customers and how to meet them where they are.

Here’s an anecdote about understanding your customers and learning how best to meet them where they are.

Early on in her role, Pamela Bump, HubSpot’s Head of Content Growth, conducted a survey of HubSpot Blog readers to learn more about their expertise levels, interests, challenges, and opportunities. Once published, she shared the survey with the blog’s email subscribers and a top reader list she had developed, aiming to receive 150+ responses.

“When the 20-question survey was getting a low response rate, I realized that blog readers were on the blog to read — not to give feedback. I removed questions that wouldn’t serve actionable insights. When I reshared a shorter, 10-question survey, it passed 200 responses in one week,” Bump shares.

5. Gamify your survey.

Make it fun! Brands have started turning surveys into eye candy with entertaining interfaces so they’re enjoyable to interact with.

Your respondents could unlock micro incentives as they answer more questions. You can word your questions in a fun and exciting way so it feels more like a BuzzFeed quiz. Someone saw the opportunity to make surveys into entertainment, and your imagination — well, and your budget — is the limit!

Your Turn to Boost Survey Completion Rates

Now, it’s time to start surveying. Remember to keep your user at the heart of the experience. Value your respondents’ time, and they’re more likely to give you compelling information. Creating short, fun-to-take surveys can also boost your completion rates.

Editor’s note: This post was originally published in December 2010 and has been updated for comprehensiveness.





Take back your ROI by owning your data




Other brands can copy your style, tone and strategy — but they can’t copy your data.

Your data is your competitive advantage in an environment where enterprises are working to grab market share by designing can’t-miss, always-on customer experiences. Your marketing tech stack enables those experiences. 

Join ActionIQ and Snowplow to learn the value of composing your stack – decoupling the data collection and activation layers to drive more intelligent targeting.

Register and attend “Maximizing Marketing ROI With a Composable Stack: Separating Reality from Fallacy,” presented by Snowplow and ActionIQ.

Click here to view more MarTech webinars.

About the author

Cynthia Ramsaran

Cynthia Ramsaran is director of custom content at Third Door Media, publishers of Search Engine Land and MarTech. A multi-channel storyteller with over two decades of editorial/content marketing experience, Cynthia’s expertise spans the marketing, technology, finance, manufacturing and gaming industries. She was a writer/producer for and produced thought leadership for KPMG. Cynthia hails from Queens, NY and earned her Bachelor’s and MBA from St. John’s University.

