Guide to A/B Testing

Klaviyo's A/B testing feature for campaigns lets you test different subject lines and different email content, helping you refine your messaging and optimize your send times.

This article covers how to set up and run an A/B test on a campaign, how to read the results of an A/B test, and a few use cases for A/B testing campaigns.

Create an A/B Testing Campaign

A/B testing works by creating multiple versions of an email message within a single campaign.

  1. Start by creating a new campaign. After you've created the first version of your email, an Add Variation link will appear.

    Note

    This option will not appear if you have not yet configured any content for an initial variation.
    camps_addvariation.png
  2. Click the Add Variation link and Klaviyo will automatically create a second, identical variation of your campaign and take you to the main A/B testing page.

    Note

    The A/B testing page shows you a list of all your email variations, along with the settings box. You can always return to this page by clicking Content in the progress bar at the top right side of the screen.
    camps_messageContent.png
  3. To create additional variations, click the Actions dropdown menu for any existing variation and select Duplicate Variation. You can create up to 7 different variations.
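
Conceptually, the variations in a campaign behave like a short list that starts with one message and grows by duplication, capped at seven entries. The Python sketch below only illustrates that idea; the function, field names, and sample content are hypothetical and do not represent Klaviyo's data model or API.

    MAX_VARIATIONS = 7  # a campaign can hold at most 7 variations

    def duplicate_variation(variations, index=0):
        """Copy an existing variation, mirroring the Duplicate Variation action."""
        if len(variations) >= MAX_VARIATIONS:
            raise ValueError("a campaign can have at most 7 variations")
        variations.append(dict(variations[index]))  # the copy starts out identical
        return variations

    campaign = [{"subject": "Spring sale starts now", "content": "<p>...</p>"}]
    duplicate_variation(campaign)  # variation B, identical to A until you edit it
    print(len(campaign))           # 2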

Configure A/B Test Settings

After creating your content, decide the size of your testing pool and the length of your testing period.

  • Decide the size of your testing pool by adjusting the slider. The default testing pool size is 20% of your target send list.
    camps_testingpool.png
    If you want to test your variations across your full list, without scheduling a testing period at all, slide the scale to 100%. In that case, your send list will be divided into equal groups, with each group receiving a different variation (illustrated in the sketch after this list).
  • Decide the length of your testing period using the dropdown menu. The default testing period is 6 hours.
    camps_testtimes.png
    Keep in mind that a testing period that is too short will diminish the statistical significance of your results.
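
To picture how the testing pool and variation groups relate, here is a minimal Python sketch. It is illustrative only, not Klaviyo's implementation; the recipient list, the three variations, and the 20% pool size are assumptions for the example.

    import random

    def split_for_ab_test(recipients, num_variations, test_pool_pct):
        """Divide a send list into equal per-variation test groups plus a holdout.

        The holdout receives the winning variation after the testing period.
        """
        shuffled = recipients[:]
        random.shuffle(shuffled)

        pool_size = int(len(shuffled) * test_pool_pct / 100)
        test_pool, holdout = shuffled[:pool_size], shuffled[pool_size:]

        # Deal the testing pool into equal groups, one per variation.
        groups = [test_pool[i::num_variations] for i in range(num_variations)]
        return groups, holdout

    # Example: 1,000 recipients, 3 variations, default 20% testing pool.
    recipients = [f"user{i}@example.com" for i in range(1000)]
    groups, holdout = split_for_ab_test(recipients, num_variations=3, test_pool_pct=20)
    print([len(g) for g in groups], len(holdout))  # [67, 67, 66] 800

Setting test_pool_pct to 100 reproduces the full-list case described above: the holdout is empty and every recipient lands in one of the equal variation groups.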

Choosing an A/B Test Winner

You can select a method for choosing the winner of your A/B test.

  • An Automatic winner is based on Open Rate or Click Rate. Use the dropdown menu to select the metric that should determine the winner (see the sketch at the end of this section).
    camps_choosingwinner.png
  • A Manual winner can be chosen at any point during an A/B test's initial testing period. When you do this, the testing period ends immediately and the remaining recipients receive your chosen winner.
    camps_selectwinner.png
    Use the following steps to choose a winner manually:
    1. Schedule a testing period that is longer than you expect to need.
    2. Partway through the testing period, navigate to the campaign and click the Variations tab.
      camps_manualwinnerselect.png
    3. Click on View > Select as Winner for the variation you would like to send to the rest of the campaign recipients.

Note

Manually choosing a winner is only possible when an A/B test is still underway. It is not possible to configure an A/B test so that it stops automatically at a certain point and prompts you to choose your own winner.
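
As a mental model for the automatic option, the winner is presumably the variation with the best value of your chosen metric at the end of the testing period. The Python sketch below shows that comparison using made-up per-variation counts; the field names and numbers are assumptions, not Klaviyo data.

    # Hypothetical per-variation counts; the numbers are made up for the example.
    variations = {
        "A": {"delivered": 67, "opens": 25, "clicks": 6},
        "B": {"delivered": 67, "opens": 31, "clicks": 9},
        "C": {"delivered": 66, "opens": 22, "clicks": 4},
    }

    def rate(stats, metric):
        # metric is "opens" or "clicks"; the rate is relative to delivered emails
        return stats[metric] / stats["delivered"] if stats["delivered"] else 0.0

    metric = "opens"  # corresponds to choosing Open Rate in the dropdown
    winner = max(variations, key=lambda name: rate(variations[name], metric))
    print(winner)  # "B" -- the highest open rate in this sample data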

A/B Test Send Times

When testing send times, your test includes 100% of your list by default, and there is no separate testing period: the send time itself is what you are testing. In the Manage Variations window for A/B testing send times, you can therefore adjust only the send time (see the sketch after the steps below).

Note

When A/B testing send times, we recommend keeping the subject line and content the same for each variation.
  1. Toggle your view by clicking the "switch to test send times" link at the very bottom of the Test Settings window.
  2. Select the number of variations in your test.
  3. Select the date and send time for each variation.
    camps_abSelectTestTime.png
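
Conceptually, a send-time test assigns each equal slice of the full list a different scheduled time while keeping the message itself constant. Here is a minimal Python sketch of that idea, with hypothetical recipients and times.

    from datetime import datetime

    # Hypothetical recipients and send times, purely for illustration.
    recipients = [f"user{i}@example.com" for i in range(1000)]
    send_times = [
        datetime(2024, 5, 7, 9, 0),   # variation 1: morning send
        datetime(2024, 5, 7, 13, 0),  # variation 2: early afternoon
        datetime(2024, 5, 7, 18, 0),  # variation 3: evening
    ]

    # Each equal slice of the full list gets a different send time; the
    # subject line and content stay the same across all three slices.
    schedule = {t: recipients[i::len(send_times)] for i, t in enumerate(send_times)}
    for send_time, group in schedule.items():
        print(send_time, len(group))  # three groups of roughly 333 recipients each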

Review A/B Test Results

  1. Navigate to the campaign you are testing.
  2. Select the Variations tab of the main campaign report.
    camps_abTestShowWinner.png
  3. Select a conversion metric. The last column of this report is adjustable and can show the conversion metric of your choice. To switch the conversion metric shown in this column, navigate to the Overview tab and click to change the conversion metric as shown below.
    camps_convometric.png
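
The adjustable last column is simply a per-variation rate for whichever conversion metric you choose. The sketch below computes such a rate from made-up counts; the "placed_order" field and the numbers are assumptions for illustration.

    # Hypothetical report rows; "placed_order" and the counts are made up.
    report = {
        "A": {"delivered": 67, "opens": 25, "placed_order": 3},
        "B": {"delivered": 67, "opens": 31, "placed_order": 5},
    }

    for name, row in report.items():
        conversion_rate = row["placed_order"] / row["delivered"]
        print(f"Variation {name}: {conversion_rate:.1%} conversion rate")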

A/B Testing Examples

Here are a few examples of how you can use A/B testing.

  • Testing multiple subject lines for every campaign to understand whether customers like short or long subjects
  • Offering different discounts (10%, 20%, no offer) in different variations to see whether, and how much, discounts drive increased conversions
  • Figuring out whether plain-text or graphically rich HTML emails perform better in terms of click-through rates and conversions