Klaviyo's A/B testing feature for campaigns allows you to easily test different subject lines and content to help you refine your messaging and optimize your send times. A/B testing is currently only available for emails and cannot be used for SMS messages.
This article covers how to set up and run an A/B test on a campaign, how to read the results of an A/B test, and a few use cases for A/B testing campaigns.
Create an A/B Test for a Campaign
A/B testing works by creating multiple versions of an email message within a single campaign.
- Start by creating a new campaign. After you've created the first version of your email, an Add Variation link will appear.
Note: This option will not appear if you have not yet configured any content for an initial variation.
- Click Add Variation and Klaviyo will automatically create a second, identical variation of your campaign and bring you to an A/B testing page.
The A/B testing page shows you a list of all your email variations, along with the settings box. You can always return to this page by clicking Message Content in the progress bar at the top right side of the screen.
- To create additional variations, click the Actions dropdown menu for any existing variation and select Duplicate Variation. You can create up to seven different variations.
Configure A/B Test Settings
After creating your content, decide the size of your testing pool and the length of your testing period.
- Edit the size of your testing pool by adjusting the slider. The default testing pool size is 20% of your send list.
If you want to test variations across your full list, without scheduling a testing period, you can slide the scale to 100%. In this case, your send list will be divided into equal groups, with each group receiving a different variation.
- Decide the length of your testing period using the dropdown menu. The default testing period is six hours. This means that the variations will send to your chosen percentage of your send list over the amount of time you specify, after which the winning variation will be sent to the remainder of the send list. For this reason, plan to send an A/B test in advance of when you would like the majority of your campaign to go out.
You may wish to shorten your testing period from the six-hour default. However, too short a testing period will diminish the statistical significance of your results.
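The equal-group split used when the testing pool is set to 100% can be sketched as follows. This is an illustrative Python sketch of the splitting behavior described above, not Klaviyo's actual implementation, and the function name is hypothetical.

```python
import random

def split_into_equal_groups(recipients, num_variations):
    """Shuffle the send list and deal it into near-equal groups,
    one group per variation (illustrative only)."""
    shuffled = recipients[:]
    random.shuffle(shuffled)
    # Deal recipients round-robin so group sizes differ by at most one.
    return [shuffled[i::num_variations] for i in range(num_variations)]

# Example: 10 recipients split across 3 variations.
groups = split_into_equal_groups(
    [f"user{i}@example.com" for i in range(10)], 3
)
print([len(g) for g in groups])  # group sizes differ by at most one
```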
At the end of the testing period, the test automatically selects a winner based on the data received from the testing pool. If there is no data for the winning metric, Variation A is automatically selected as the winner and sent to the remainder of your send list. For instance, if you chose click rate as the winning metric but every variation has a 0% click rate, Variation A wins by default, even if another variation had a higher open rate.
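The fallback behavior described above can be sketched in a few lines. This is an illustrative sketch of the selection rule, not Klaviyo's code; the function name and data layout are hypothetical.

```python
def select_winner(results, metric="click_rate"):
    """Pick the variation with the highest value for the winning metric.
    If every variation scored zero (no data for the metric), fall back
    to Variation A, mirroring the default described above."""
    if all(stats.get(metric, 0) == 0 for stats in results.values()):
        return "A"  # no data for the winning metric: Variation A wins by default
    return max(results, key=lambda v: results[v].get(metric, 0))

results = {
    "A": {"open_rate": 0.30, "click_rate": 0.0},
    "B": {"open_rate": 0.45, "click_rate": 0.0},
}
# Zero clicks everywhere, so "A" wins even though "B" had a higher open rate.
print(select_winner(results, "click_rate"))
```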
Choosing an A/B Test Winner
You can select a method for choosing the winner of your A/B test.
- You can choose to have Klaviyo automatically select a winner based on open rate or click rate (unique clicks). Klaviyo will recommend a metric based on what differs between the variations. For example, if the subject lines are different, open rate will be the recommended metric. However, you can select either metric.
- You can manually choose a winner at any point during an A/B test's initial testing period. When you do this, the testing period ends immediately, and the winner you chose sends to the remaining recipients.
- Use the following steps to choose a winner manually:
- Schedule a longer-than-desired testing period.
- Navigate to the campaign while the test is still running and click the Variations tab.
- Click View > Select as Winner for the variation you would like to send to the rest of the campaign recipients.
Manually choosing a winner is only possible while an A/B test is still underway. It is not possible to configure an A/B test so that it stops automatically at a certain point and prompts you to choose your own winner.
A/B Testing Send Times
When testing send times, your test includes 100% of your list by default and there is no testing period, because the send time itself is what's being tested. In the Manage Variations window for send-time tests, you can therefore only adjust the time.
- Toggle your view by clicking the "switch to test send times" link at the very bottom of the Test Settings window.
- Select the number of variations in your test.
- Select the date and send time for each variation.
Review A/B Test Results
- Navigate to the campaign you are testing.
- Select the A/B Test Results tab of the main campaign report.
- Select Conversion Metric. The last column of this report is adjustable and can display the conversion metric of your choice. To switch the conversion metric shown in this column, navigate to the Overview tab and click to change the conversion metric.
Running an A/B Test Using a Control Group
You may want to run an A/B test on only a portion of your audience, such as your newsletter list. To do this, take a random sample of your newsletter list (or segment) using the Sample List Members tool, found in the Manage List dropdown. Here, you can select the size of the sample you would like to use as your test group. This will create a new list.
Next, schedule your A/B test campaign to send to this group. Then, schedule the control email to send to the entire newsletter list, excluding your sample list. This ensures that no one who received the test campaign also receives the second, control campaign.
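The sample-and-exclude process above can be sketched as follows. This is an illustrative Python sketch of the manual steps, not a Klaviyo feature; the function name is hypothetical.

```python
import random

def build_control_split(full_list, sample_fraction=0.5, seed=None):
    """Randomly sample a test group from the full list; everyone else
    forms the control audience (i.e., send to the full list while
    excluding the sample). Illustrative only."""
    rng = random.Random(seed)
    sample_size = int(len(full_list) * sample_fraction)
    test_group = set(rng.sample(full_list, sample_size))
    control_group = [r for r in full_list if r not in test_group]
    return sorted(test_group), control_group

# Example: carve a 30% test group out of a 100-person list.
test_group, control_group = build_control_split(
    [f"user{i}@example.com" for i in range(100)], 0.3, seed=42
)
print(len(test_group), len(control_group))  # 30 70
```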
A/B Testing Sending Cadence
You can use a method similar to the one outlined above to test your sending cadence. For example, you may be wondering if you'll see higher open rates if you only send your newsletter out twice per week rather than three times per week.
First, take a sample of your list or segment (for example, your newsletter list) by navigating to Manage List > Sample List Members. You can then test a different sending cadence with this sample list.
Be sure to exclude the sample list from the campaigns you send to your main newsletter list so that the sample group doesn't receive duplicate emails.
Other A/B Testing Examples
Here are a few examples of how you can use A/B testing.
- Testing multiple subject lines for every campaign to understand whether customers like short or long subjects
- Offering different discounts (10%, 20%, no offer) in different variations to see how discounts may/may not drive increased conversions
- Determining whether plain-text or graphically rich HTML emails perform better in terms of click-through rates and conversions
- Using URL parameters to test how different links drive customers to the same content. In Klaviyo, you can accomplish this by appending distinct parameters to each variation's links.
A/B Test Exceeded Account Sending Limits
Klaviyo will automatically cancel A/B test campaigns when the testing pool exceeds an account's monthly sending limits. If you receive an in-app notification that reads: "Your A/B test campaign CAMPAIGN_NAME was automatically cancelled for exceeding account sending limits," you will need to upgrade to a higher plan in order to resend the campaign.
You can view your account's monthly sending limit in your Account Overview area. Here, you can also upgrade your plan. Once you have done so, navigate to your Campaigns tab, find the cancelled campaign, and click Clone. From there, you'll be able to configure and send your new campaign.