A/B Testing a Campaign

Overview

Klaviyo's A/B testing feature for campaigns allows you to easily test different subject lines and content to help you refine your messaging and optimize your send times.

This article covers how to set up and run an A/B test on a campaign, how to read the results of an A/B test, and a few use cases for A/B testing campaigns.

Create an A/B Testing Campaign

A/B testing works by creating multiple versions of an email message within a single campaign.

  1. Start by creating a new campaign. After you've created the first version of your email, an Add Variation link will appear.

    Note

    This option will not appear if you have not yet configured any content for an initial variation.
    campaignsAB_AddVariation.png
  2. Click Add Variation and Klaviyo will automatically create a second, identical variation of your campaign and bring you to an A/B testing page.

    Note

    The A/B testing page shows you a list of all your email variations, along with the settings box. You can always return to this page by clicking Message Content in the progress bar at the top right side of the screen.
    campaignsAB_ReturnToVariationPage.png
  3. To create additional variations, click the Actions dropdown menu for any existing variation and select Duplicate Variation. You can create up to 7 different variations.

Configure A/B Test Settings

After creating your content, decide the size of your testing pool and the length of your testing period.

  • Edit the size of your testing pool by adjusting the slider. The default testing pool size is 20% of your send list.
    campaignsAB_SampleSize.png
    If you want to test variations across your full list -- and not schedule a testing period -- you can slide the scale to 100%. In this case, your send list will be divided into equal groups, with each receiving a different variation.
  • Decide the length of your testing period using the dropdown menu. The default testing period is 6 hours. This means that the variations will send to your chosen percentage of the send list over the amount of time you specify, after which point the winning variation will be sent to the remainder of the send list. For this reason, you should plan to send an A/B test in advance of when you would like the majority of your campaign to be sent.
    campaignsAB_TestingWindow.png
    You may wish to shorten your testing period from the default six hours. However, too short a testing period will diminish the statistical significance of your results.
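
To see why a short testing window weakens your results, consider a two-proportion z-test on open rates, sketched below. This is purely illustrative arithmetic, not something Klaviyo exposes; the function name and the send/open counts are made up.

```python
from math import erf, sqrt

def two_proportion_z_test(opens_a, sends_a, opens_b, sends_b):
    """Two-sided z-test for the difference between two open rates."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    # Pooled open rate under the null hypothesis of no difference
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# With only a small testing pool, a real difference can look inconclusive:
z, p = two_proportion_z_test(opens_a=30, sends_a=1000, opens_b=45, sends_b=1000)
# p lands above the usual 0.05 threshold here, so this window is too short
```

A longer testing period accumulates more opens and clicks, shrinking the standard error and making a genuine difference between variations easier to detect.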

At the end of the testing period, the test automatically selects a winner based on the data received from the testing pool. In the event that there is no data (for example, if no users in the testing pool open or click any of your variations), then Variation A is automatically selected as the winner and sent to the remainder of your send list.
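
The split described above is simple arithmetic. As a hypothetical sketch (the helper function and recipient counts are invented for illustration; Klaviyo does this internally):

```python
def split_test_pool(list_size, num_variations, pool_pct=20):
    """Divide a send list into a testing pool (split equally among the
    variations) and a remainder that later receives the winning variation."""
    pool = list_size * pool_pct // 100
    per_variation = pool // num_variations
    remainder = list_size - per_variation * num_variations
    return per_variation, remainder

# Default 20% pool, two variations, 10,000 recipients:
per_variation, remainder = split_test_pool(10_000, 2)
# -> 1,000 recipients per variation; 8,000 wait for the winner

# At 100%, the whole list is divided into equal groups and no one waits:
full = split_test_pool(10_000, 2, pool_pct=100)
# -> 5,000 recipients per variation; remainder of 0
```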

Choosing an A/B Test Winner

You can select a method for choosing the winner of your A/B test.

  • You can choose to have Klaviyo automatically select a winner based on Open Rate or Click Rate (Unique clicks). Select the metric you want to use to determine the winner.
    campaignsAB_ChooseWinner.png
  • You can manually choose a winner at any point during an A/B test's initial testing period. When you do this, the testing period ends immediately and your chosen winner is sent to the remaining recipients.
    camps_abTestTimes.png
    Use the following steps to choose a winner manually:
    1. Schedule a longer-than-desired testing period.
    2. Navigate to the campaign while the test is still running and click the Variations tab.
      campaignsAB_ChooseVariation.png
    3. Click View > Select as Winner for the variation you would like to send to the rest of the campaign recipients.

Note

Manually choosing a winner is only possible while an A/B test is still underway. It is not possible to configure an A/B test so that it stops automatically at a certain point and prompts you to choose your own winner.
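
The automatic-selection rules above (highest open or click rate, with Variation A as the fallback when the testing pool produces no data) can be sketched as follows. The function and the sample counts are hypothetical, not Klaviyo code:

```python
def pick_winner(results, metric="open_rate"):
    """Return the name of the winning variation by the chosen metric.
    Falls back to Variation "A" when no one opened or clicked anything,
    mirroring the no-data behavior described above."""
    if all(v["opens"] == 0 and v["clicks"] == 0 for v in results.values()):
        return "A"
    key = "opens" if metric == "open_rate" else "clicks"
    return max(results, key=lambda name: results[name][key] / results[name]["sends"])

# Hypothetical results from a 2,000-recipient testing pool:
results = {
    "A": {"sends": 1000, "opens": 180, "clicks": 40},
    "B": {"sends": 1000, "opens": 210, "clicks": 35},
}
```

Note that the same data can produce different winners depending on the metric: here B wins on open rate, while A wins on unique click rate.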

A/B Test Send Times

When testing send times, your test includes 100% of your list by default, and there is no testing period. This is because the send time itself is what's being tested. In the Manage Variations window for a send time test, the only setting you can adjust is the send time for each variation.

Note

When A/B testing send times, we recommend keeping the subject line and content the same for each variation. This minimizes other variables that may influence opens and clicks, isolating send time as the variable being tested.
  1. Toggle your view by clicking the "switch to test send times" link at the very bottom of the Test Settings window.
  2. Select the number of variations in your test.
  3. Select the date/send time for each variation.
    campaignsAB_SendTime.png

Review A/B Test Results

  1. Navigate to the campaign you are testing.
  2. Select the Variations tab of the main campaign report.
    camps_abTestShowWinner.png
  3. Select Conversion Metric. The last column of this report is adjustable and can feature the conversion metric of your choice. To switch the conversion metric visible in this last column, navigate to the Overview tab and click to change the conversion metric as shown below.
    campaignsAB_ConvoMetric.png

Run an A/B Test Using a Control Group

You may want to run an A/B test on only a portion of your audience -- a sample of your newsletter list, for example. To do this, take a random sample of your newsletter list (or segment) using the Sample List Members tool found in the Manage List dropdown. Here, you can select the size of the sample you would like to use as your test group. This will create a new list.

2017-10-05_11-14-00.png

Next, schedule your A/B test campaign to send to this sample group. Then schedule the control email to send to the entire newsletter list, excluding your sample list. This ensures that no one who received the test campaign also receives the second, control campaign.

2017-10-05_11-46-54.png
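
As a rough illustration of the sample-and-exclude logic above (the helper function and email addresses are invented; in practice Klaviyo's Sample List Members tool and campaign exclusions handle this for you):

```python
import random

def sample_list_members(members, sample_size, seed=None):
    """Draw a random sample to serve as the A/B test group -- the same
    idea as Klaviyo's Sample List Members tool, which creates a new list."""
    rng = random.Random(seed)
    return set(rng.sample(sorted(members), sample_size))

newsletter = {f"subscriber{i}@example.com" for i in range(1_000)}
test_group = sample_list_members(newsletter, 200, seed=7)

# The control campaign goes to the full list minus the test group,
# so no one receives both the test and the control email:
control_group = newsletter - test_group
```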

A/B Test Sending Cadence

You can use a method similar to the one outlined above to test your sending cadence. For example, you may be wondering if you'll see higher open rates if you only send your newsletter out twice per week rather than three times per week. 

First, take a sample of your list or segment (for example, your newsletter list) by navigating to Manage List > Sample List Members. You can then test a different sending cadence with this sample list.

Be sure to exclude the sample list from the campaigns you send to your main newsletter list to ensure that they're not receiving duplicate emails.

Other A/B Testing Examples

Here are a few examples of how you can use A/B testing.

  • Testing multiple subject lines for every campaign to understand whether customers like short or long subjects
  • Offering different discounts (10%, 20%, no offer) in different variations to see how discounts may/may not drive increased conversions
  • Figuring out whether plain-text or graphically rich HTML emails perform better in terms of click-through rates and conversions

A/B Test Exceeded Account Sending Limits

Klaviyo will automatically cancel A/B test campaigns when the testing pool exceeds an account's monthly sending limits. If you receive an in-app notification that reads: "Your A/B test campaign CAMPAIGN_NAME was automatically canceled for exceeding account sending limits," you will need to upgrade to a higher plan in order to resend the campaign.

You can view your account's monthly sending limit in your Account Overview area. Here, you can also upgrade your plan. Once you have done so, navigate to your Campaigns tab, find the canceled campaign, and click Clone. From there, you'll be able to configure and send your new campaign.

2017-11-30_15-14-22.png
