A/B Testing: What, Why, and How

BDO Digital Demand Generation Group


Marketers frequently talk about A/B testing. Although it sounds like a fancy marketing term, it’s a very straightforward method of honing your message and developing a better understanding of your target audience’s pain points, preferences, and behavior. What is it exactly, and how do you do it right?

This post takes a look at the what and why of A/B testing, and provides some recommendations for getting it right.

What is A/B testing and why should you do it?

Essentially, A/B testing is the process of sending out two versions of the same piece of marketing content with only one variable changed. An example would be a marketing email with two different subject lines. The idea is to run the test on a percentage of your total send list (your test group). Once the winner is determined, the remaining recipients get the winning email in order to optimize conversion rates.

For instance, say you’re sending out an email to 15,000 people and want to test two messages. You decide to use 30% of your list as your test group, so the remaining 70% will get the winning email. The first email’s subject line would highlight the first message (A) and the second email’s subject line would focus on the second message (B):

A: Learn the best way to perform A/B testing

B: Mistakes to avoid when performing A/B testing

While A focuses on educating recipients on best practices, B focuses on helping them avoid mistakes. Although both are accurate messages for the blog post, they appeal to the audience in different ways. Is your audience more concerned with learning best practices, or are they more worried about making mistakes?

To perform the A/B test, you'd split your 4,500-person test group in half, sending Email A to 2,250 recipients and Email B to the other 2,250, then track open rates to see which subject line performed best. When the results come in, you can analyze your findings. If the two subject lines performed about the same, then you can surmise that both messages are relevant to the audience and should be used, perhaps in alternation, to drive interest and conversions. However, if one performs substantially better than the other, you can send the winning subject line to the remaining 70% of your send list.
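The split described above is simple arithmetic, but it's worth sanity-checking before you hit send. Here's a minimal sketch of the calculation using the example's numbers (15,000 recipients, a 30% test group); the open counts are hypothetical, invented purely to illustrate picking a winner:

```python
# Split a send list for an A/B test, then pick a winner by open rate.
# Numbers match the example in the text: 15,000 recipients, 30% test group.
total_recipients = 15_000
test_fraction = 0.30

test_group = int(total_recipients * test_fraction)   # 4,500 in the test
remainder = total_recipients - test_group            # 10,500 get the winner
per_variant = test_group // 2                        # 2,250 per subject line

# Hypothetical open counts observed for each subject line.
opens_a, opens_b = 540, 630
rate_a = opens_a / per_variant
rate_b = opens_b / per_variant

winner = "A" if rate_a >= rate_b else "B"
print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  -> send {winner} to {remainder:,} recipients")
```

Most marketing automation platforms handle this split for you; the sketch just makes the mechanics explicit.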

In addition to testing messages, A/B testing enables you to try out different marketing tactics to see which ones get your target prospects’ attention. For example, when you’re A/B testing CTA buttons, you can try using certain words or phrases to see if they generate more click-throughs. The example below shows an A/B test for a CTA button:

A: Read the blog post

B: Learn about A/B testing

A is more direct and tells the reader to complete an action, whereas B hints at the benefits of reading the blog post. If one approach gets substantially more click-throughs, you’ll have a better idea of how to compel your audience to read other content in the future.

Many marketing teams use A/B testing across various types of content, such as blog posts, social media posts, social ads, and other deliverables. A/B testing is also quite useful for website copy and design, helping teams to make data-driven decisions about how to optimize landing pages or sections of a website for maximum engagement and interaction. HubSpot used A/B testing, for example, to discover the optimal number of form fields for their prospects. You can use A/B testing to test out new website features, as well. In fact, Netflix recently shared a series of articles on how they use A/B testing to “deliver a joyful customer experience.”

Analyzing the data around how prospects and customers respond to various messages and approaches in your content and marketing outreach eliminates guesswork and reduces the risk of spending time and money on content that doesn’t hit the mark.

8 best practices for doing A/B testing the right way

Done correctly, A/B testing can provide actionable insights that help you reduce bounce rates and increase conversions. Here are some best practices we recommend:

  1. Decide what to test. If you’re just getting started with A/B testing, start small. Try it out with a CTA button or blog post title. Once you get the hang of it, you can advance to something more sophisticated, such as two variations of a landing page.
  2. Decide what to measure. The metrics you track will vary depending on what you’re testing. For an email, you will likely be interested in open rates and click-throughs. For a web page, you’ll track time on page and bounce rates. Set your goals and KPIs prior to designing your test.
  3. Determine the variables. What’s going to move the needle the most? Is it a phrase? The length of a piece of content? A design element? Choose the element you think will make the biggest difference if you change it.
  4. Play up contrast. Changing a single word in a CTA button is a meaningful change, but in a blog post title, one word rarely makes a difference. Consider what you're testing, and make sure the variable in your B version is substantially different from your original version.
  5. Keep everything but the variable the same. Just like in a controlled scientific experiment, it’s only possible to measure the impact of a change if everything else stays constant. For example, if you’re sending out an email with two different subject lines, be sure to send them at the exact same time, since time of day, day of the week, and other time-related variables can have a big impact on visibility and engagement.
  6. Get the right tools and expertise in place. To properly analyze your data, you’ll need an analytics solution, such as Google Analytics, that’s intuitive and easy to use. Although you probably won’t need the help of a data scientist, enlisting someone with A/B testing expertise to help you analyze your results is a good idea, if you don’t have that capability in-house.
  7. Make sure the size of your test groups is statistically significant. In other words, make sure your test groups are large enough that the difference you observe is unlikely to be random noise. HubSpot offers a downloadable calculator to help you determine whether your test result is statistically significant.
  8. Let time pass before sending the winning email. When testing emails, be patient before sending out the winner. Waiting is important because you want to give your recipients time to react to the asset (open the email, click the CTA button, etc.). We recommend waiting at least four hours (most marketing automation platforms can send the winner automatically for you, but you’ll need to set the wait time). In some cases, you might even wait up to a day before sending your winner.
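To make best practice #7 concrete: a common way to check whether one variant "substantially" beat the other is a two-proportion z-test on the observed open rates. This is a generic statistical sketch (not HubSpot's calculator) using only Python's standard library, with hypothetical open counts for the 2,250/2,250 split from the earlier example:

```python
import math

def two_proportion_z_test(opens_a, n_a, opens_b, n_b):
    """Two-sided z-test for a difference between two open rates.
    Returns (z, p_value); a p-value below 0.05 suggests the difference
    is unlikely to be due to chance alone."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    pooled = (opens_a + opens_b) / (n_a + n_b)          # combined open rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal-distribution tail probability via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: 540 opens for A, 630 for B, 2,250 sends each.
z, p = two_proportion_z_test(540, 2_250, 630, 2_250)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers the p-value comes out well below 0.05, so you'd confidently send B to the remaining 70% of the list; if the p-value were large, the honest conclusion would be "no clear winner," per the advice above about both messages being usable.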

Getting started with A/B testing can be a little overwhelming if you’ve never done it before. With our expert Campaign Services, BDO Digital is here to help. We provide a complete, cost-efficient Demand Center to extend your team’s reach and take care of any and all of your campaign needs, including campaign creation and execution, A/B testing, list management, and reporting.

The post A/B Testing: What, Why, and How appeared first on DemandGen.
