Email A/B Testing: The Basics

The Email Marketers
January 7, 2022

When it comes to email marketing, it's easy to get overwhelmed. That's why I like creating posts on the basics from time to time. These blogs will help you out whether you're just starting your marketing journey as a solopreneur, or you're a seasoned pro who's committed to constant growth.

In this blog post, I'm going to give you the rundown on one of the most valuable marketing tools for your email marketing campaigns: A/B testing, sometimes called B testing or split testing.

A/B testing ensures that your email campaigns are the best versions of themselves, using data to craft emails with the highest possible click-through and conversion rates.

Although this testing can be slightly time-consuming, it's well worth the effort, and can actually be pretty fun if you're a data nerd like me. After all, who doesn't like a healthy dose of good old-fashioned competition?

Below we unroll the ins and outs of A/B testing. Get ready to absorb! Let the games begin, and may the best version win. (You'll fully appreciate that little joke by the end of the post.)

Why A/B Testing?

Any marketer worth their salt is a big fan of data. Using a B test is a method of collecting the data that marketers need to fashion winning email campaigns.

A/B testing can help determine the best online marketing and promotional strategies for your email campaigns. It takes the guesswork out of email campaigns and, like a constantly updating GPS system, shows you the best route to get to your destination. What's the destination? Higher click-through rates, higher conversion rates, higher ROI, and higher profits.

Exactly What IS A/B Testing? The ABCs Of A/B Testing

You're like, "Ok, ok, I get it! A/B testing is great. But what is it?"

Admittedly, I put the cart before the horse slightly here in order to convince you that B testing is essential for your email campaigns' growth and success. So now that you're properly convinced, we can dive into the ABCs of A/B testing.

A/B testing is a way to compare two different versions of the same email to see which version performs better or gets a better response. In a way, it's a competition between two emails to see which one gets your message across best. It's a great tool to increase your open rates, click-through rates, and ultimately, your conversion rate.

How Does A/B Testing Work?

With A/B testing, email marketers take an email, change one variable of the email (for example, the subject line), then randomly send the different versions to users.

(Pssst...You can actually change more than one variable of the email--this is called multivariate testing, and we'll revisit it soon.)

Ways To Use Your Different Versions

As mentioned, with A/B testing, there are two versions of an email: Version A (the "A test") and Version B (the "B test"). There are a few different ways to approach your split testing and use these tests.

Control Version and/or Version A

The control is the version that is already being used in your email campaign. The control email demonstrates your baseline results from the campaign you are already running.

Before you begin testing, make sure you know exactly which results you're aiming for. When it comes to testing, you should already be familiar with your current results.

You can use your Version A as your control (keeping whatever you're currently using) and create something new for Version B.

For example, say you have a flow already in place for your abandoned cart recipients, and you want to A/B test the Call To Action in the first email of that flow. In that case, you would keep the version you already have for a portion of your recipients, and this version would be called the control or the A version.

Alternatively, you can keep a control version of the email already in place, and then create a Version A AND a Version B. These two emails will have two different options for the CTA, and these will of course be different than the original CTA in the control group. In this scenario, both Version A and Version B are test versions, and the control stands apart.

Version B or B Test

Unlike Version A, Version B is nearly always your test version email, and so is typically called the B test. In your B test, there will be a single variable or perhaps several variables (in multivariate testing) that have been altered.

Remember, your  B test will run at the same time as your control and/or your A test, in order to ascertain the most accurate results.

Multivariate Testing

While it's definitely easier to test just one variable at a time, it's not the only way to test: you can test multiple variables at once. For example, you can test your subject line and CTA in the same test. This is called multivariate testing.

In order to multivariate test, you'll need to make sure you have the right resources and tools in place, and the staff requirements to sift through the data and analyze results.

Multivariate testing can be complex, but it can offer useful feedback. Ultimately it's your choice: a multivariate test approach can be effective with the right procedures in place, but one A/B test at a time is sufficient if you want to keep things simple.

At The Email Marketers, we believe in keeping things simple whenever possible. Even as experts, we're very careful when and why we decide to run a multivariate test.

If you're just starting out with A/B testing, I'd strongly recommend starting with the simple, one variable at a time approach. Once you get the hang of testing, then you might branch out to multivariate testing if you have a good ESP like Klaviyo to help get meaningful, accurate information.

Worth The Time

Now, A/B tests do require extra time, and some marketers balk at this. They'd rather hit the ground running and get their hands dirty with an email campaign that activates their gut feelings, rather than use up precious marketing time doing B testing.

This is a mistake. Successful marketing strategies do not run on gut feelings. They run on data and results.

Running accurate email A/B testing can make a huge positive impact on your ROI.  You can determine which marketing strategies are most effective for your company and product by conducting controlled experiments and collecting empirical data.

If you run a promotion without first testing to see if one variation is performing better than the other, you risk losing large sums of money. The more you know about what's working and what isn't, the easier it is to craft more effective marketing strategies in the long run.

B testing lets you find the best-performing version of your campaign prior to using up all of your marketing funds on materials that aren't effective. From there, the most effective elements of promotion can be selected to create an airtight marketing strategy with a lower risk of failure and a higher return on investment. Watch and see your conversion rates skyrocket.

A/B testing can have an extraordinary effect on your bottom line,  and it merits the time and effort it takes. So don't be lazy with your B test! Ensure each test runs for a meaningful time before moving on to the next.

What Test Elements Can Be Tested With Split Testing?

The beauty of email A/B testing is that it can be used to test many different elements of your email campaigns. From subject lines to email design, you can optimize your strategy by further and further refining your emails, resulting in a rock-solid successful campaign.

The choice of which elements to test is wide-ranging, and what you choose to test will depend on what you perceive as the possible weak links in your email marketing strategy. You'll want to put your focus on those variables that affect conversions and website traffic, and that will be a unique choice for each company.

That being said, there are common variables that are rigorously tested because they are the essential elements in any email campaign.

From Name

It may sound basic, but who the email is from makes a significant difference in whether the recipient opens the email. And remember, when you view emails on an iPhone, the from line is actually the BIGGEST element.

If you decide to test this out (and you should be testing this out), be sure to always make it clear that your email originates from your company--you'll want to flag the company name.

For example, you might try using staff names in email send-outs, e.g., Simon at Company Name. You could simply send with Company Name as your from line. Or go with a department name, such as Customer Service at Company Name. You could even try something like The Company Name Team. See which one lands better with your email subscribers.

Email Subject Line

Subject lines. These may seem like simple words, phrases, or sentences, but they are the key to unlocking higher open rates. There's a reason that common copywriting wisdom dictates you always create your subject lines before the actual body of your email.

Experienced marketers know just how important the subject line is for email marketing campaigns, but honestly, sometimes the raw numbers about subject lines can surprise even us. A different subject line to the same email can vary the open rate so greatly that jaws drop.

This is why it's vital to nail the email subject line and test variations to make sure you use the winning words and phrases to get your contacts to actually open your emails.

Note: iOS 15's Mail Privacy Protection has made testing anything that has to do with opens a bit more unreliable, so just keep that in mind.

Preview Text

Like subject lines, the preview text is one of the first glimpses your subscribers have of your email. It can affect your open rates and click-throughs greatly.

In fact, the from name, the subject line, and the preview text are the 3 elements that the recipient sees first. So you need these to be in top order for your campaign to reach its maximum potential and get the open rates and the click-through rate to a healthy level.


Personalization

Using personalization is one of the handiest tools for marketers. Now, most people are quick to assume that personalization simply means using the first name, and then it's wrapped up in a neat bow. But not so fast--there are plenty of ways to vary personalization.

For instance, you might want to test WHERE you put the first name. Is it in the subject line? Or is it in the email body? Or is it in both the subject line AND the body? Which version will perform best and become the winning version?

You can even go beyond the first name in the subject line and copy for personalization, and get into personalization based on purchase history, landing page behavior, or customer status.

Using email A/B testing, you can find which kind of personalization resonates best with your target audience and results in more click-throughs and conversions.


Email Copy

Email A/B testing can demonstrate what kind of copy works best for your company, in terms of tone and positioning. You might even experiment with word order here.

The term "copy" in the context of email A/B testing has multiple meanings, including body copy, headlines, and button copy. After all, copy is queen in email marketing! And you want a strong queen ;)

Header and Subheader

If you use a header/subheader (and unless you're sending plain text emails, you probably do), spend some time testing this section. Why?

This is the VERY FIRST thing a reader sees if they open your email. This is like the second act of a movie. If the opening is great but the middle stinks, you're going to stop watching and find something better. Even if the end of your "email movie" is amazing, your customer will never know.

So test the heck out of your header and subheader. Test the copy, the design, whether you have a CTA in there... see what gets your prospect from "open" to "click."


Email Length

Length is also a variable in emails. Do your email clients respond to long-form copy or shorter bursts of text? Do different segments prefer differing lengths? You won't be able to say for sure until you put the length to the test with a B test.

You might also want to consider different devices when you test your length. Do your mobile users prefer a different length than desktop users? Test to find out.

Finally, length figures into email design in a significant way--which length and layout of email perform best?

Plain Text Vs. HTML

Do you send mostly plain text or HTML versions of your emails?  How did you make that decision? If it wasn't using A/B testing, you're going to want to check to see if your approach is actually the best one for your customers.

Determine whether a plain text or an HTML email performs better using a B test. Pay attention to which email gets into your reader's brain more and gets your conversion rate higher. You might be surprised at the winning version.

Call To Action

Like the subject line, the call to action or CTA is one of those all-important elements in your email. After all, getting your subscribers to click your CTA is the goal of the email.

So you need to have zero doubts that your email has the best possible CTA. You can try out all sorts of versions of your CTA--from different color buttons to different phrasing (example: "Buy Today" or "Shop Now").

A/B Test CTA: Which one would you choose?

You should also be testing the placement of your CTA and the number of CTAs in your campaigns. While you should always, always, always have your CTA above the fold in an email, you can experiment with how much intro text you have, whether your CTA is in the Header/Subheader as well as the body, etc.

If you've never really thought about your CTA before, I'd strongly recommend running a whole series of tests to find the perfect wording and placement for your CTA.

Promotional Offers

A great way to snag potential customers is with a promotional offer. In fact, many a successful email proclaims a promotional offer in the subject line. But it's critical to compare various options. Which ones supercharge your conversion process?

Here's an example: offer a free gift to group A and a discount to group B. Then see which offer has the better email performance. Which one garners more interest and gets more conversions?

You can also A/B test which products or services you promote, offering a percent versus a dollar discount, and even how long you run the promotion. Promotions are a fantastic way to drive engagement, so don't overlook this critical piece of your marketing puzzle.

Now that you have an idea as to what you can test with Email A/B testing, it's time to put the rubber to the road and run a test. Before you start, you'll need to make a few decisions.

The Basic A/B Test Process


Create Your Hypothesis

Whether you have a gut feeling that a small change could improve engagement, or are just curious to see which version your audience prefers, creating a hypothesis before you test helps you refine your objective and uncover your own pre-existing beliefs. An example of a hypothesis could be: a plain text version of transactional emails will engage subscribers better than an HTML version. Will it, really? A/B test to find out.

You could also simply state what you want to learn, for example: will a different subject line with an emoji result in higher open rates? It's okay to go into an A/B test having no idea which version will win.

Decide On Your Objective

You'll need to set your objective for your B test. First of all, what is your goal with the test? Once you have your goal, you need to decide the metrics you'll use to determine which version of your A/B test is closer to achieving it.

You may want to get your open rate higher, improve your click rate, or make as much revenue as possible from an email. All those goals would require different metrics.

Decide On Your Variable

Now that you have your objective, you can choose the variable. Expanding on the examples above, if you want to improve your open rate, you might choose to do a B test with a different subject line OR you might choose to test your from name.

And if you want to increase click-through rates, A/B testing your header OR your CTA is a great place to start.

Decide On Your Split

Finally, you'll need to decide how you'll split your subscriber list, or what your sample size will be.

Will you test your entire list and go 50/50, sending Version A  (the control) to 50% of your subscribers, and your B test to the other 50%?

Or will you go a bit more complex and send the control email to 25% of your subscribers, make the test group another 25%, and from there send the winning version to the remaining half of your recipients?

There are multiple ways to get your sample size, and you might even have a control, a Version A, and a Version B in play all at once. No one way is necessarily preferable, as long as your company is equipped to tackle the results.
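To make the splits above concrete, here's a minimal Python sketch of a random 25/25/50 split. The subscriber list, percentages, and function name are all hypothetical illustrations, not any particular ESP's feature:

```python
import random

def split_for_test(subscribers, control_pct=0.25, test_pct=0.25, seed=42):
    """Randomly split a subscriber list into control, test, and holdout groups.

    The holdout group receives the winning version once the test concludes.
    """
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # random assignment, never hand-picked
    n_control = int(len(pool) * control_pct)
    n_test = int(len(pool) * test_pct)
    control = pool[:n_control]
    test = pool[n_control:n_control + n_test]
    holdout = pool[n_control + n_test:]
    return control, test, holdout

subscribers = [f"user{i}@example.com" for i in range(1000)]
control, test, holdout = split_for_test(subscribers)
print(len(control), len(test), len(holdout))  # 250 250 500
```

The shuffle is the important part: assigning recipients at random (rather than, say, alphabetically) is what keeps the comparison fair.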

Run the Test

Here comes the fun part! And by the fun part, I mean the "love it or hate it" part. Setting up an A/B test with an ESP like MailChimp or Klaviyo is pretty user-friendly, but that does not mean everyone will find it equally simple.

If you really feel out of your depth on this, consider outsourcing this part of your email marketing so you're confident that you're testing the right variables to achieve your goals.

Regardless of who's running the test, make sure to run it long enough to get good data. A minimum of 2 weeks is industry standard.

Analyze the Results

When it comes to analyzing results, statistical significance is the name of the game. If you're like me and barely scraped by in statistics, this is where you want to turn to another expert. My email strategists and I use Neil Patel's AB Testing Significance Calculator.
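If you'd rather see what a significance calculator is doing under the hood, the standard approach for comparing two rates is a two-proportion z-test. Here's a stdlib-only Python sketch with made-up numbers (this is the general statistical method, not the internals of any particular calculator):

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)       # pooled rate under the null
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))     # two-sided p-value
    return z, p_value

# Hypothetical: Version A got 120 clicks from 2,000 sends; Version B got 160.
z, p = two_proportion_z_test(120, 2000, 160, 2000)
print(f"z = {z:.2f}, p = {p:.4f}, significant at 95%: {p < 0.05}")
```

A p-value below 0.05 is the conventional threshold for declaring a winner; above that, the difference could easily be noise, and the test should keep running.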

Rinse and Repeat

Once you have a winner from your first A/B test, it's time to go back to the drawing board. You can either choose another variable to help get you closer to your original objective or create another objective and develop a new test from there. You should be running A/B tests consistently as part of your general email hygiene.

A Few More Considerations

Size Matters (in A/B Testing) And So Does Timing

The general rule of thumb here is the more people you include in your sample size, the more reliable your results will be.

Also, the split must be random; in other words, don't hand-pick your recipients. Random assignment is what allows the empirical data you collect to yield statistically significant results.
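To put a rough number on "more people," a common rule-of-thumb formula estimates how many recipients each variant needs to reliably detect a given lift. This sketch assumes the conventional 95% confidence and 80% power (the z-values 1.96 and 0.84); the baseline rate and lift are hypothetical:

```python
import math

def sample_size_per_variant(baseline, lift, alpha_z=1.96, power_z=0.84):
    """Rough per-variant sample size to detect an absolute lift in a rate,
    at 95% confidence (alpha_z=1.96) and 80% power (power_z=0.84)."""
    p1 = baseline
    p2 = baseline + lift
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((alpha_z + power_z) ** 2 * variance / lift ** 2)

# How many recipients per version to detect a 2-point lift on a 20% open rate?
print(sample_size_per_variant(baseline=0.20, lift=0.02))
```

Notice how sensitive the answer is to the lift you want to detect: halving the detectable lift roughly quadruples the required list size, which is why small lists struggle to produce significant results.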

Confounding Variables

We would be remiss not to mention that it's important to take confounding variables into account as you prepare to run your test.

Confounding variables include factors that might affect your send, such as time of year or, in this all too real example, if there's a pandemic affecting customer morale. These variables are out of your control, but they still need to be acknowledged.

Tools For Testing

You can choose to use automation for your email A/B testing, or go the old-fashioned, manual way. This will depend on whether you have access to software or prefer a more hands-on approach.


Software

Many email marketing platforms such as MailChimp or Klaviyo have built-in tools for A/B testing.

There are several inexpensive or even free tools you can use to conduct your testing. Google Optimize is a free tool that is great for A/B testing and Google Analytics offers a How To A/B Test tutorial.

The advantage of using software is that you can automate your sends and often the program will help analyze your results. It speeds up the process.

Old School

If you don't have access to email marketing software, or the email marketing software you use doesn't have a built-in tool, don't fret. There's always the good old-fashioned method. You can split up your email list however you choose and send the control, A test, and B test manually.

In order to analyze and compare results, you'll probably want to export the data to a spreadsheet and use the tools available there to make an accurate campaign report.
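If you do go the manual route, the math behind that campaign report is simple enough to sketch. Here's an illustrative Python version, assuming a hypothetical export with one row per recipient and flags for opens and clicks:

```python
import csv
import io

# Hypothetical export: one row per recipient, with test version and event flags.
raw = """version,opened,clicked
A,1,0
A,1,1
A,0,0
B,1,1
B,1,1
B,1,0
"""

totals = {}  # version -> [sends, opens, clicks]
for row in csv.DictReader(io.StringIO(raw)):
    t = totals.setdefault(row["version"], [0, 0, 0])
    t[0] += 1                      # count the send
    t[1] += int(row["opened"])     # count the open
    t[2] += int(row["clicked"])    # count the click

for version, (sends, opens, clicks) in sorted(totals.items()):
    print(f"Version {version}: open rate {opens/sends:.0%}, "
          f"click rate {clicks/sends:.0%}")
```

The same tallies are exactly what a spreadsheet pivot table gives you; the point is simply sends, opens, and clicks per version, turned into rates you can compare.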

Best Practices For Email A/B Testing

As with all components of email marketing, there are best practices you'll want to keep in mind as you run your A/B testing.

  • Test early and don't stop at one test: make testing a regular habit.

  • Test at the same time. When the B test goes out, send the A test (and the control, if it's separate from your A test) out at the same time. Your results will be time-skewed if you don't test simultaneously.

  • Use as extensive a test group as possible, since a larger test group gives more accurate results.

  • Trust your data, NOT your gut. You're testing for a reason; don't throw out the winning metric simply because you prefer another.

Benefits Of A/B Testing

After all that work, what can you expect? The answer is: A LOT. The benefits of a B test for your email campaigns are enormous:

Better Conversion Rates

At the end of the day, this is what matters: the conversion rates are the bottom line. All email campaigns have the goal of raising your conversion rate. With a solid B test and implemented results, your conversion rate is bound to increase.

Know Your Target Audience

Email testing helps you understand your target audience better. You'll get a feel--better yet, you'll get the DATA--on your subscribers' preferences. You work hard to create great emails for your contacts, so why not tailor your messaging accordingly based on data, then turn that feedback into an increase in conversion rates and ROI?

Stay On-Trend

As with clothing and music, marketing trends go in and out of fashion. (Of course, there are some "classics" in marketing, such as a fantastic subject line. A good subject line is a little like a Chanel Little Black Dress).

Use your B test to see if that new trend you saw or read about resonates better with your target audience. Then pivot if you need to. Because changing up your style can be fun, and more importantly, profitable!

Analyze YOUR Success

How you measure the success of your A/B testing is up to you and will depend on the specific goals of your test. You'll want to compare the results of your B test and your A test; and, after you dig into the details there, it will likely be clear which is the winning email.

You'll be able to make that comparison using the data collected during the test campaign. Useful metrics include the open rate (which email had higher open rates); the click-through rate (which got more click-throughs); which email resulted in more traffic to your website or landing pages via the CTA; which email resulted in more sales or higher conversions; and which email engaged the most subscribers.

The deal is, there really is no downside to conducting a B test: you'll certainly be able to gather data, even if the data just tells you, "hey, this B test DIDN'T work--what you've already got going on is clearly better."

Road-testing your variables is valuable for the continued success of your marketing efforts. So, go forth and test, test, test! A/B testing is a test you'll actually enjoy.