How to A/B test a sign-up form


You will learn

Learn how to use A/B testing to measure how different elements of your sign-up forms perform, so that you can ensure they resonate with your audience and acquire subscribers.

Creating an A/B test

There are 3 different ways to create an A/B test for your sign-up form:

  • From the left-side menu on the sign-up form page
  • By clicking into the analytics for a live form
  • From inside of the form builder

No matter where you enter from, the steps are the same:

  1. In the sign-up form editor, select Create A/B Test to get started.
  2. Next, name your test something descriptive. If you’re testing colors of the buttons on your newsletter popup, you could name the test Newsletter - Color of CTA.
  3. After you’ve named your A/B test, give your sign-up form variations different names by selecting the pencil icon by your form’s current name.
    • By default, it will come with the name Copy # of Name of Form. With the different color CTA example from before, you might name your forms Newsletter - Blue and Newsletter - Gray.
    • Here, you can also add up to 7 variations to your A/B test. Keep in mind that the more variations you add, the longer you’ll need to run your test. An ideal A/B test has between two and four variations. In the example below, there are three variations: the original form and 2 additional variants.
  4. After you’ve configured your test, select Create Test.
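The warning above about variation count can be made concrete: each extra variation is one more comparison against the original, which raises the number of views you need per variation before a winner is clear. Below is a rough back-of-the-envelope sketch using the standard two-proportion sample-size formula with a Bonferroni correction. This is not Klaviyo's actual model, and the rates are invented for illustration.

```python
import math
from statistics import NormalDist

def views_per_variation(base_rate, lift, variations, alpha=0.05, power=0.8):
    """Rough per-variation sample size to detect a relative lift in submit
    rate (normal-approximation formula, with a Bonferroni correction for
    comparing each variant against the original)."""
    comparisons = variations - 1
    z_alpha = NormalDist().inv_cdf(1 - alpha / (2 * comparisons))
    z_power = NormalDist().inv_cdf(power)
    p1, p2 = base_rate, base_rate * (1 + lift)
    p_bar = (p1 + p2) / 2
    n = ((z_alpha + z_power) ** 2 * 2 * p_bar * (1 - p_bar)) / (p2 - p1) ** 2
    return math.ceil(n)

# Detecting a 20% relative lift on a hypothetical 5% baseline submit rate:
two_way = views_per_variation(0.05, 0.20, variations=2)
four_way = views_per_variation(0.05, 0.20, variations=4)  # needs more views
```

The takeaway is directional, not exact: doubling the number of variants meaningfully increases the views (and therefore the time) each one needs.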

After you create an A/B test, this menu will allow you to rename the two variations of the form, or create more variations.

Configuring the variations

After you’ve created your test, you’ll be directed back to the sign-up form editor. You will be able to toggle between the different variations of your form by selecting the name of your form. From this menu, you can also choose to change the name of the variation, create an additional variation, or delete the variation.

In the sign-up form editor, the dropdown arrow next to the form's name shows the variations and allows you to edit them or add more.

Once you’re done configuring your form, select Continue to Test Settings.

Configuring A/B test settings

You can reach the test settings page in 3 different ways. You can:

  1. Click the Continue to Test Settings button from the sign-up form editor.
  2. Select a form in draft mode with a draft test from the sign-up form landing page.
  3. Select Edit Test from the A/B test results page.

The customizable settings of your test appear on this page. You can continue to customize the title, notes, and variations of the test from the top of the settings menu.

Additionally, if you select Preview, you’ll be able to see the settings of each variation of the test. You will not be able to edit the content of a form when an A/B test is in progress. If you want to edit the form, you will need to stop the test.

The A/B test settings page where you can see the settings for each variation of the test.

Traffic selection

From this section, you can determine how much traffic goes to the variations of your form. By default, each variation is shown to an equal share of viewers. As your test runs, if one of your variations is outperforming the rest, Klaviyo will direct more traffic to the better-performing form while the test is still running.

If you toggle to select Weight - Manual, you can specify how much traffic you want to go to each variation of the form. The amounts need to add up to 100%.

The weight-manual toggle on the A/B test settings page that lets you specify how much you want each variation of the form to get.
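The 100% constraint on manual weights is easy to express in code. Here is a tiny sketch of that check; the variation names are hypothetical, carried over from the CTA-color example above.

```python
def validate_weights(weights):
    """Check that manual traffic weights for the variations sum to 100%."""
    total = sum(weights.values())
    if total != 100:
        raise ValueError(f"Weights must sum to 100%, got {total}%")
    return weights

# Hypothetical variation names from the CTA-color example:
validate_weights({"Newsletter - Blue": 60, "Newsletter - Gray": 40})
```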

Winner selection

Similar to the Traffic Selection section, you have 2 options for determining the winner of the test: you can end the test automatically or manually.

Automatically

If you end your test automatically, it ends either when 1 variation outperforms the other(s) at a statistically significant level or when it reaches a particular date. You can also select both boxes, in which case the test ends as soon as either condition is met. For example, if both conditions are selected and the test is scheduled to run for a month, but the results become statistically significant after a week, the test will end at that point.

If you choose to end your test based on statistical significance alone, Klaviyo’s data science model will do the math for you and pick a winner.
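Klaviyo does not document the exact model it uses, but the underlying idea is a significance test on the difference in submit rates. A simplified illustration (a plain two-proportion z-test, with made-up counts, not Klaviyo's method):

```python
import math

def two_proportion_z(conv_a, views_a, conv_b, views_b):
    """Two-proportion z-test: is variation B's submit rate significantly
    different from variation A's? Returns the two-sided p-value."""
    p_a, p_b = conv_a / views_a, conv_b / views_b
    p_pool = (conv_a + conv_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

p = two_proportion_z(conv_a=100, views_a=2000, conv_b=140, views_b=2000)
# p < 0.05 suggests the difference in submit rates is statistically significant
```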

If you choose to end the test on a particular date, the test will end at midnight coordinated universal time (UTC).

On the A/B test settings page you can set a winner to be automatically selected by checking on two options under winner selection and setting an end date.

Regardless of which automatic selection you pick, after the test has finished, the form that wins will be live on your site.

Manually

If you choose to end your test manually, you choose both the winner and the date the test ends. To stop your test, select Choose Winner from the A/B test settings page of a live test. After you stop your test, the variation you select as the winner will be live on your site.

After you’ve adjusted the test settings, select Publish A/B Test. Before your test is underway, Klaviyo will check if there is an issue publishing forms to your site and if your form has all of the content it needs (e.g., consent language, CTA, submit fields, etc.).

Review test results

While the test is running or after you’ve completed your A/B test, you can see the data in your account by clicking on the form. On the overview page, you can see how your form is performing overall. Click the A/B Test Results tab to dive into the analytics of your A/B test specifically.

Edit A/B test

You can edit your A/B test from the A/B Test Results page. Note that while you can change the test settings, you won’t be able to change the test content. To edit the content, you’ll have to end the test and start it again.

Choose winner

From this button, you’ll be able to manually stop the test and choose a winner at any time. If you select Choose Winner, you’ll have the option to end the test and set your form to live or draft mode.

Edit details

This button will only appear on tests that are completed. Here, you will be able to edit the test name and notes.

On the A/B test results page, click edit A/B test to edit the test name and notes.


Test overview

In the test overview panel, you’ll get a snapshot of how your test is performing.

Win probability

The win probability denotes the likelihood that the currently leading variation will be the statistically significant winner. As your test gains more viewers and is live for a longer period of time, you should see this number increase. You’ll know when the test results become statistically significant from the label below the win probability. If you’re viewing a completed test, there will also be a reason stating why the test ended (e.g., manually ended by choosing a winner).

The win probability section highlighted on the A/B test results page for a specific variation.

Test name, status, start date

The next 3 sections denote the name of the test, the status of the test (e.g., live or completed), and the date the test began. If the test is completed, you’ll also see the date the winner was selected. 

The test name, test status, and the date the test began highlighted from the overview section on the A/B test results page for a specific variation.

Current weights

Below, you’ll see the current weight of traffic directed to each form. If you have manually selected how much traffic goes to each form, these numbers will be consistent with the settings you selected when you started the test. If you’re using Klaviyo’s data science model, you will see the weight of traffic changing over time, giving more traffic to the form that is winning and letting you gain more subscribers as you run your test.

The current weights for each variation highlighted from the overview section of the A/B test results page.
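Klaviyo doesn't publish how its model shifts traffic over time, but a common technique for this kind of adaptive reallocation is Thompson sampling over Beta posteriors. The sketch below is purely illustrative of that general idea, with invented counts; it is not Klaviyo's implementation.

```python
import random

def thompson_weights(stats, draws=10_000):
    """Estimate traffic weights by Thompson sampling: draw each variation's
    submit rate from a Beta posterior and count how often each one wins.
    `stats` maps variation name -> (submits, views)."""
    wins = {name: 0 for name in stats}
    for _ in range(draws):
        samples = {
            name: random.betavariate(submits + 1, views - submits + 1)
            for name, (submits, views) in stats.items()
        }
        wins[max(samples, key=samples.get)] += 1
    return {name: round(100 * count / draws) for name, count in wins.items()}

# Hypothetical mid-test counts: Blue is converting better, so it earns
# a larger share of traffic on the next reallocation.
weights = thompson_weights({"Blue": (120, 2000), "Gray": (80, 2000)})
```

The practical effect is what the article describes: the leading form gradually receives more of the traffic, so you keep gaining subscribers while the test runs.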


Form submit rate graph

Next to the test overview is a plot of your variations. It shows the form submit rate by variation over the duration of the A/B test. If you’re interested in a particular date, hover over data points for more information.

Below the graph, you’ll see the raw data from your test. Next to each variation, you can see a preview of each form, the number of views, the number of people who have submitted your form, and the form submit rate for each variation.

The form submit rate graph on the A/B test results page showing submit rate by variation plotted, and the raw data for each test listed below.

A/B testing ideas

While you can test any idea that you may have for your forms, some common ideas for testing are:

  • Colors
  • Copy
  • Audience
  • Page
  • Type of form

New to A/B testing? Head to our article for best practices on A/B testing.

FAQs

I have a notification that my form with a time delay A/B test calculates form submit rate differently. Why is that?

If you’re testing a time delay on your form, your form submit rate will be calculated by looking at the total conversions as a percentage of the total people eligible to see your form, rather than the number of views the form had.

Say, for instance, your A/B test is evenly weighted and has the message, “Need help browsing?” In the test, one form shows immediately upon someone visiting your site, and the other loads after a minute. While the same number of people are queued up to see both forms, more people will see the form that shows immediately. Fewer people will see the delayed form, but because it offers help to browsers who may be confused, it may still have a higher form submit rate. To test fairly whether the delay leads to a higher or lower submission rate, the form submit rate is calculated not as the number of submits as a percentage of views, but as the number of submits as a percentage of the number of people who could have seen the form.
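The arithmetic behind that adjustment, with made-up numbers:

```python
def submit_rate(submits, denominator):
    """Submit rate as a percentage of the chosen denominator."""
    return round(100 * submits / denominator, 2)

# Hypothetical evenly weighted test: 1,000 visitors are eligible to see each
# variation, but only 400 stay long enough to see the delayed form.
instant_submits, instant_views, instant_eligible = 50, 1000, 1000
delayed_submits, delayed_views, delayed_eligible = 32, 400, 1000

# Per-view rates would flatter the delayed form...
per_view = (submit_rate(instant_submits, instant_views),
            submit_rate(delayed_submits, delayed_views))      # (5.0, 8.0)
# ...so time-delay tests divide by eligible visitors instead.
per_eligible = (submit_rate(instant_submits, instant_eligible),
                submit_rate(delayed_submits, delayed_eligible))  # (5.0, 3.2)
```

On a per-view basis the delayed form looks like the winner (8.0% vs. 5.0%), but per eligible visitor it actually converts fewer people (3.2% vs. 5.0%).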


I selected both boxes for the Winner Selection part of my A/B test. Will the test end when both or either one of the conditions are met?

These checkboxes function as an OR condition, meaning that either one of the conditions can be met and the test will stop. Both conditions do not need to be met for the test to stop.

Can I edit the test settings for my form A/B test after it's already running?

Yes, click into your A/B test from your form and select Edit A/B Test to change the settings. You will not be able to change the content of the forms without ending your current test and starting a new one, however.

If someone doesn’t engage with a form that is a part of an A/B test, when the form shows to them a second time, will they see a different version of the form?

No, they are cookied upon viewing the form the first time and will only ever see that variation, unless they clear their cookies or are browsing in a private (incognito) window.
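Conceptually, this is "sticky" assignment: pick a variation once, remember it in a cookie, and reuse it on every return visit. A minimal sketch of the idea (a plain dict stands in for the browser's cookies; not Klaviyo's actual mechanism):

```python
import random

def assign_variation(cookies, variations, weights):
    """Sticky assignment: reuse the cookied variation if one exists,
    otherwise pick by weight and remember the choice in the cookie."""
    if "ab_variation" not in cookies:
        cookies["ab_variation"] = random.choices(variations, weights=weights)[0]
    return cookies["ab_variation"]

cookies = {}  # stands in for the visitor's browser cookies
first = assign_variation(cookies, ["Blue", "Gray"], weights=[50, 50])
second = assign_variation(cookies, ["Blue", "Gray"], weights=[50, 50])
# first == second: repeat visits always show the same variation
cookies.clear()  # clearing cookies makes the visitor eligible for re-assignment
```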

When I create an A/B test for my form, will my form go to draft mode?

The form will stay at whatever state your form was in before you created the test. If it was live mode and you create an A/B test, the current version will be live. If it was in draft, the form will be in draft mode.

I had a typo in my form A/B test, so I published a new one. How can I permanently delete the test with a typo?

Go to the results of the test you want to delete by selecting the form, go to A/B Test Settings, then choose Delete Test.

I set my form A/B test live, then afterward I realized I had a typo. I don’t want to stop the test, re-build it, and set it back live. What should I do?

Head to the A/B Test Settings page while the test is running by clicking Edit A/B Test from the form analytics and click Edit Content. This will create a new draft A/B test, and you can make edits to it as needed and then set it live by clicking Publish A/B Test.

One of my form variations is performing very poorly and I want to update the amount of traffic going to the form. Can I do that?

Yes. Click on your form and go to the A/B test settings for your particular form. Select Edit A/B Test and either change the weighting or toggle to automatic weightings where the model will take performance into account.

I want to test to see if different offers are more likely to convert. How do I make sure to give a different welcome series correlated to each form?

You can adjust the list that a person is submitting to for each variation. To make each variation go to different lists, click on your CTA button in the form and change the List to Submit under the Button Click Action section. From there, trigger your welcome series flow based on someone joining each variation’s list and follow up with the associated offer.

I want to track who sees each form variation of the A/B test. How can I do that?

There are 2 ways that you can do that. You can adjust the list that a person is submitting to or you can add in a hidden field that is submitted when someone clicks through on your form. If you want the different variations to go to different lists, select the button and change the List to Submit setting under the Button Click Action section. If you want to submit through a hidden field, you would add in a property under the Submit Hidden Fields section. Then, create a segment based around each property.
