
In the past, some marketers may have shuddered at hearing the word “statistics.” But today, if you’re not incorporating statistics and analytics into your marketing strategy, you’re going to be left behind.

The good news is that companies like Minitab make analyzing data easy. As you’ll see in the A/B testing example below, dipping your toe into the statistical “pool” is as easy as 1-2-3.

## What is A/B Testing?

A/B testing, also known as split testing or bucket testing, refers to a randomized experimentation process wherein two or more versions of a variable (a web page, page element, email, etc.) are shown to different audiences at the same time to determine which version drives the most impact. Examples include sending multiple emails to see which one generates more engagement, using different advertisements to measure click-through rates, or split URL testing, which splits website traffic between a control group (often the original web page) and a variation (such as a new design).

## A Simple Method for Simple A/B Testing

Most marketers use some form of A/B testing. Unfortunately, despite best efforts to determine the content that drives the highest engagement, most marketers do not run a hypothesis test to determine whether their results are actually statistically significant.

There are different types of A/B testing, including multivariate testing, which alters multiple components at the same time. Today we’re looking at simple A/B testing comparing two groups. For this purpose, the most appropriate tool is an easy-to-use hypothesis test called a Two-Sample Proportions Test.

## Example 1: Running a Proportion Test for A/B Testing of an Email Campaign

Imagine a marketer runs a monthly email promoting training courses. The goal of these emails is to generate customer interest and awareness about training. She’s had reasonably good success in the past, but still wants to see if she can improve her performance.

She decides to run an A/B test targeting 2,329 customers, splitting them into two groups: Email 1 and Email 2. For Email 1, she sticks with the same email she sent last month and sends it to approximately 50% of her customers. The other 50% receive Email 2.

She sends the emails. Email 1 gets a 12% open rate (140 opens) and Email 2 gets a 9.8% open rate (115 opens). Most marketers would then say Email 1 performed better, correct?

We can leverage analytics to decide whether this 2.2% (12% − 9.8%) difference is statistically significant. Many marketers skip this step simply because they don’t know of a quick, easy way to answer the question. In fact, with three clicks and four numbers, you can know whether the difference is statistically significant!

The method is a Two-Sample Proportions Test in Minitab. It will tell you whether these two proportions are statistically different, based on the sample sizes, the level of confidence, and the observed open rates. Going to Stat > Basic Statistics > 2 Proportions opens a dialog window for a quick calculation on summarized data. Simply plug in your numbers (with events corresponding to opens and sample corresponding to the number of emails sent) and Minitab does the rest!
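If you don’t have Minitab handy, the same kind of calculation can be sketched in a few lines of Python using only the standard library. This is a stand-in, not Minitab’s exact output: it uses the standard pooled two-proportion z-test, and it assumes the 2,329 customers were split roughly evenly (1,165 and 1,164 emails sent):

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sample proportions z-test (pooled). Returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    # Pooled proportion under the null hypothesis that p1 == p2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Email 1: 140 opens of ~1,165 sent; Email 2: 115 opens of ~1,164 sent
z, p = two_proportion_z_test(140, 1165, 115, 1164)
print(f"z = {z:.3f}, p-value = {p:.3f}")
```

With these counts, the p-value comes out around 0.10 — above the conventional 0.05 cutoff for a 95% confidence level, so the test fails to reject the hypothesis that the two open rates are equal.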

## Conclusion

Looking at the results below, I see that the 95% confidence interval for the true difference between the groups includes 0 as well as .022 (or 2.2%). In stats speak, this means we “fail to reject the null hypothesis, which states that the two proportions are equal.” In other words, the two groups are NOT statistically different at the 95% confidence level. Despite one email getting a 12% open rate and the other a 9.8% open rate, there is not a statistical difference between the performance of the emails. Concluding that these emails result in different open rates could be a mistake.
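That confidence interval can also be sketched directly. The snippet below computes the standard (Wald) 95% interval for the difference between two proportions — again a rough stand-in for Minitab’s output, using the same assumed split of emails sent as above:

```python
import math

def diff_in_proportions_ci(x1, n1, x2, n2, z_crit=1.96):
    """Wald 95% confidence interval for the difference p1 - p2."""
    p1, p2 = x1 / n1, x2 / n2
    # Unpooled standard error, as is standard for the confidence interval
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p1 - p2
    return diff - z_crit * se, diff + z_crit * se

lo, hi = diff_in_proportions_ci(140, 1165, 115, 1164)
print(f"95% CI for the difference in open rates: ({lo:.4f}, {hi:.4f})")
```

The interval spans from slightly below zero to a few percentage points above it. Because 0 is inside the interval, the data are consistent with the two emails having identical true open rates.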

Comparing two proportions is a useful method to ensure you fully benefit from your A/B testing strategy. By incorporating analytics into your marketing work and improving the data literacy of your team, you can ultimately drive better decision making.