Review Your Email A/B Test Results

Overview

A/B testing campaigns can provide valuable insight into your audience. Klaviyo makes it simple to determine what changes to make to your sending strategy: you’ll see not only the winning variation for an A/B test, but also other important details, like the win confidence percentage and winning metric. In this article, we’ll go over what you can find on the A/B test results page and what each of these indicators means.

A/B Test Results

To see the results of an A/B test, navigate to the campaign you are testing. Next, select the A/B Test Results tab of the main campaign report.

This page is broken into two parts:

  1. Test Overview
  2. Variations

The sections below break each of these down.

Test Overview

In the Test Overview section, you’ll see a snapshot of the test results, including the:

  • Winning variation, win confidence, and statistical significance
  • Status
  • Winning metric
  • Test size

Winning Variation, Win Confidence, and Statistical Significance

On the left-hand side, you’ll see the winning variation if the test is finished, or the variation currently in the lead if the test is still running.

Below this, you’ll see the win confidence. This is how likely it is that the winning variation truly had the highest campaign open or click rate after taking random chance into account.

Say, for instance, you want to see if adding emojis to your subject line impacts your open rate with your newsletter list. If you only send the test to two subscribers and the emojis have a better open rate, you wouldn’t say conclusively that the rest of your newsletter list would feel the same way. If instead you tested with 100 people in your newsletter list and found that emojis have a substantially higher open rate, you would feel much more confident that using emojis in your subject lines is better for your audience.

Under the win confidence, you’ll see whether the variation shown is statistically significant. For Klaviyo campaigns, a variation is deemed statistically significant when it has a 90% or greater chance of winning; in this case, a green tag saying Statistically significant will appear. If the variation is not statistically significant, you will see a gray tag saying Not statistically significant. Nothing will appear if the results are inconclusive. For details on how the statistical significance of a campaign is decided, read this article on how statistical significance is determined.
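
To build intuition for how win confidence and the 90% threshold relate, here is a minimal Python sketch that estimates the probability that one variation’s true open rate beats another’s, using a simple Beta-Binomial simulation with made-up numbers. Klaviyo’s actual calculation is covered in the linked article on statistical significance and may differ from this sketch.

    import numpy as np

    # Hypothetical results (illustrative numbers only): opens and recipients per variation
    opens_a, sent_a = 220, 1000   # Variation A
    opens_b, sent_b = 180, 1000   # Variation B

    rng = np.random.default_rng(0)
    draws = 100_000

    # Model each open rate with a Beta posterior (uniform prior), then
    # estimate how often Variation A's true rate beats Variation B's.
    rate_a = rng.beta(opens_a + 1, sent_a - opens_a + 1, draws)
    rate_b = rng.beta(opens_b + 1, sent_b - opens_b + 1, draws)
    win_confidence = (rate_a > rate_b).mean()

    print(f"Win confidence for Variation A: {win_confidence:.1%}")
    print("Statistically significant" if win_confidence >= 0.90
          else "Not statistically significant")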

Status

The Test Overview section will also show the status of the test, i.e., whether it is still in progress or complete. If the test is complete, the date and time when it finished is also displayed.

Winning Metric

The winning metric, which is the metric you choose when you start an A/B test, is shown here as well. This area will also show the percent difference between the leading and next-best variation.
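
For example, if the leading variation’s open rate is 22% and the next-best is 18%, the difference shown would be about 22% on a relative basis. Whether the page reports a relative or an absolute difference isn’t spelled out in this article, so the calculation below is only an illustrative assumption.

    # Hypothetical open rates (illustrative numbers only)
    leading_rate = 0.22     # leading variation
    next_best_rate = 0.18   # next-best variation

    # Relative percent difference (assumed formula)
    percent_difference = (leading_rate - next_best_rate) / next_best_rate
    print(f"Leading variation is ahead by {percent_difference:.0%}")  # ~22%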

Test Size

As the name suggests, the test size is the size of the test group for each variation. You’ll see this as both a percentage and the actual number of recipients.

Variations

In the Variations card, you’ll see the following (from left to right):

  • The variation subject line, preview text, and sender name and email address
  • Number of recipients
  • Open rate and number of opens
  • Click rate and number of clicks
  • Chosen conversion metric and number of recipients who converted via that metric

For example, the default conversion metric is Placed Order. If you use this metric, the card will show the placed order rate, the number of recipients who placed an order, and the average order value.
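
To make the relationship between these counts and rates concrete, here is a minimal Python sketch with made-up numbers. The exact definitions Klaviyo uses (for example, unique versus total opens) aren’t covered in this article, so treat the formulas as assumptions.

    # Hypothetical per-variation results (illustrative numbers only)
    recipients = 500
    unique_opens = 110
    unique_clicks = 35
    orders_placed = 12
    order_revenue = 780.00  # total value of orders attributed to this variation

    open_rate = unique_opens / recipients              # 22.0%
    click_rate = unique_clicks / recipients            # 7.0%
    placed_order_rate = orders_placed / recipients     # 2.4%
    average_order_value = order_revenue / orders_placed  # $65.00

    print(f"Open rate: {open_rate:.1%}, click rate: {click_rate:.1%}")
    print(f"Placed order rate: {placed_order_rate:.1%}, AOV: ${average_order_value:.2f}")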

12 hours after the campaign sends — or, if you chose to run an A/B test with a test period, 12 hours after the test period ends — you’ll see a trophy icon next to the subject line of the winning variation.

Additional Resources

Check out this strategy guide: Best Practices for A/B Testing

Learn about other A/B tests you can run:
