Klaviyo's A/B testing feature for campaigns allows you to easily test different subject lines and different email content to help you refine your messaging and optimize your send times.
This article covers how to set up and run an A/B test on a campaign, how to read the results of an A/B test, and a few use cases for A/B testing campaigns.
Create an A/B Testing Campaign
A/B testing works by creating multiple versions of an email message within a single campaign.
- Start by creating a new campaign. After you've created the first version of your email, an Add Variation link will appear.
Note: This option will not appear if you have not yet configured any content for an initial variation.
- Click the Add Variation link. Klaviyo will automatically create a second, identical variation of your campaign and bring you to the main A/B testing page.
Note: The A/B testing page shows a list of all your email variations, along with the settings box. You can return to this page at any time by clicking Message Content in the progress bar at the top right of the screen.
- To create additional variations, click the Actions dropdown menu for any existing variation and select Duplicate Variation. You can create up to 7 different variations.
Configure A/B Test Settings
After creating your content, decide the size of your testing pool and the length of your testing period.
- Decide the size of your testing pool by adjusting the slider. The default testing pool size is 20% of your target send list.
If you want to test your variations across your full list - and not schedule a testing period at all - you can slide the scale to 100%. In that case, your send list will be divided into equal groups, with each group receiving a different variation.
- Decide the length of your testing period using the dropdown menu. The default testing period is 6 hours.
Keep in mind that too short a testing period will diminish the statistical significance of your results.
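The pool-and-split behavior described above can be sketched in a few lines of generic Python. This is an illustration of the concept only, not Klaviyo's implementation; the function and parameter names (`split_for_test`, `pool_pct`) are hypothetical.

```python
import random

def split_for_test(recipients, num_variations, pool_pct, seed=0):
    """Illustrative sketch: carve a testing pool out of the send list,
    deal it into equal variation groups, and hold back the remainder
    to receive the winning variation later."""
    rng = random.Random(seed)          # seeded for a repeatable example
    shuffled = recipients[:]
    rng.shuffle(shuffled)
    pool_size = int(len(shuffled) * pool_pct)
    pool, holdout = shuffled[:pool_size], shuffled[pool_size:]
    # Deal the pool round-robin into equal-as-possible groups.
    groups = [pool[i::num_variations] for i in range(num_variations)]
    return groups, holdout

recipients = [f"user{i}@example.com" for i in range(1000)]
groups, holdout = split_for_test(recipients, num_variations=2, pool_pct=0.20)
print([len(g) for g in groups], len(holdout))  # two groups of 100; 800 held back
```

With `pool_pct=1.0` (the 100% case above), the holdout is empty and the entire list is divided evenly among the variations.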
Choosing an A/B Test Winner
You can select a method for choosing the winner of your A/B test.
- An Automatic winner is based on Open Rate or Click Rate. Use the dropdown menu to select the metric that will determine the winner.
- A Manual winner can be chosen at any point during an A/B test's initial testing period. When you do this, the testing period will immediately end, and the winning variation will be sent to the remaining recipients.
Use the following steps to choose a winner manually:
- Schedule a testing period that is longer than you actually need.
- Navigate to the campaign part-way through the testing period and click on the Variations tab.
- Click on View > Select as Winner for the variation you would like to send to the rest of the campaign recipients.
Note: Manually choosing a winner is only possible while an A/B test is still underway. It is not possible to configure an A/B test so that it stops automatically at a certain point and prompts you to choose your own winner.
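Automatic winner selection boils down to comparing one rate across variations. The sketch below shows that logic in generic Python; the `stats` structure and `pick_winner` function are hypothetical names for illustration, not Klaviyo's API.

```python
def pick_winner(stats, metric="open_rate"):
    """Illustrative sketch of automatic-winner logic: the variation with
    the highest open (or click) rate wins."""
    def rate(s):
        numerator = s["opens"] if metric == "open_rate" else s["clicks"]
        return numerator / s["delivered"]
    return max(stats, key=lambda v: rate(stats[v]))

# Hypothetical per-variation results gathered during the testing period.
stats = {
    "A": {"opens": 52, "clicks": 9, "delivered": 200},
    "B": {"opens": 61, "clicks": 7, "delivered": 200},
}
print(pick_winner(stats))                 # "B" wins on open rate (30.5% vs 26%)
print(pick_winner(stats, "click_rate"))   # "A" wins on click rate (4.5% vs 3.5%)
```

Note how the winner can differ by metric, which is why choosing Open Rate vs. Click Rate up front matters.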
A/B Test Send Times
When testing send times, your test includes 100% of your list by default and there is no separate testing period; the send time itself is what's being tested. In the Manage Variations window for A/B testing send times, you can therefore only adjust the time.
Note: When A/B testing send times, we recommend keeping the subject line and content the same for each variation.
- Toggle your view by clicking the "switch to test send times" link at the very bottom of the Test Settings window.
- Select the number of variations in your test.
- Select the date and send time for each variation.
Review A/B Test Results
- Navigate to the campaign you are testing.
- Select the Variations tab of the main campaign report.
- Select Conversion Metric - The last column of this report is adjustable and can feature the conversion metric of your choice. To switch the conversion metric visible in this last column, navigate to the Overview tab and change the conversion metric there.
Run an A/B Test Using a Control Group
You may only want to run an A/B test on a portion of your audience -- your newsletter list, let's say. To do this, first take a random sample of your newsletter list (or segment) using the Sample List Members tool, found in the Manage List dropdown. Here, you can select the size of the sample you would like to use as your test group. This will create a new list.
Next, schedule your A/B test campaign to send to this group. Then schedule the control email to send to the entire newsletter list, excluding your sample list. This ensures that no one who received the test campaign also receives the second, control campaign.
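The sample-then-exclude setup above amounts to drawing a random sample and sending the control to everyone else. Here is a minimal generic sketch of that idea; `sample_and_exclude` and its parameters are hypothetical names, not a Klaviyo feature.

```python
import random

def sample_and_exclude(full_list, sample_size, seed=0):
    """Illustrative sketch of the control-group setup: draw a random
    sample (the test group) and return the remainder (the full list with
    the sample excluded) as the control group."""
    rng = random.Random(seed)          # seeded for a repeatable example
    sample = set(rng.sample(full_list, sample_size))
    control = [r for r in full_list if r not in sample]
    return sorted(sample), control

newsletter = [f"user{i}@example.com" for i in range(500)]
test_group, control_group = sample_and_exclude(newsletter, sample_size=100)
print(len(test_group), len(control_group))  # 100 in the test group, 400 in the control
```

The key property is that the two groups never overlap, which is exactly what excluding the sample list from the control send guarantees.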
A/B Test Sending Cadence
You can use a method similar to the one outlined above to test your sending cadence. For example, you may be wondering if you'll see higher open rates if you only send your newsletter out twice per week rather than three times per week.
First, take a sample of your list or segment (let's say your newsletter list) by navigating to Manage List > Sample List Members. You can then test a different sending cadence with this sample list -- just be sure to exclude the sample list from the campaigns you send to your main newsletter list to ensure that they're not receiving duplicate emails.
Other A/B Testing Examples
Here are a few examples of how you can use A/B testing.
- Testing multiple subject lines for every campaign to understand whether customers like short or long subjects
- Offering different discounts (10%, 20%, no offer) in different variations to see whether, and by how much, discounts drive increased conversions
- Figuring out whether plain-text or graphically rich HTML emails perform better in terms of click-through rates and conversions