What is A/B Testing in Email Marketing? Guide, Tools & Best Practices
Successful email marketing is a perfect mixture of art and science. Before the dawn of the Internet, writing marketing mail was seen as an art in itself.
Have you binge-watched Mad Men yet? If you haven't, do it now, because it is one of the best shows for marketers and general audiences alike. Don Draper meditates on a beach, a fantastic idea comes to him, and it goes on to entice millions of people to buy Coke.
But that is a lucky shot that worked for the sake of the show. In real life, that is hardly ever the case.
You have probably gotten lucky like that many times yourself, making a decision with no factual backing beyond "I just feel like it is the right choice."
That is totally fine, but in business, your gut is not always the best tool for decision-making, especially when email marketing software with built-in data analysis can give you hard numbers instead.
You might love to have a wink emoji in the subject line, but Oliver from New York will just shake his head and open an emoji-less subject line instead. Different folks require different tactics, and you know it.
The best way to find out how to make your email campaigns appeal to your subscribers is to put them to the test. And that is what this article covers: what A/B testing in email marketing is, and how you can apply it to maximize your email marketing performance.
Table of contents:
- What is A/B testing in email marketing?
- Why do you need A/B testing in email marketing?
- How to start A/B testing in your email marketing?
- Best practices for A/B testing
- Email tool you should use for A/B testing
- Start A/B testing your email marketing today
What is A/B testing in email marketing?
A/B testing, in email marketing, is the process of sending one variation of your campaign to one subgroup of your subscriber list and a different variation to another subgroup, with the ultimate goal of finding out which variation garners the best results.
A/B testing in email marketing can vary in complexity. Simple A/B testing can include sending different versions of a subject line to test which one can generate more opens, while more advanced A/B testing can include testing email templates with completely different designs or approaches against each other to see which one can generate more click-throughs.
If you have an email tool like AVADA Email Marketing, A/B testing your campaigns is easy: use the email builder to create two different variations of your campaign, then set automatic timing to send them to two distinct subgroups of your list and see which variation performs best.
Once the test is finished and you find the winning version, the app can automatically send the winning email to the rest of your list.
A/B testing your email campaigns is a fantastic way to increase the click-through and open rates of your emails. For example, you can test everything from the subject lines to the copy on your call-to-action buttons. You can even test different email templates against each other to find out which one works best, and you will be able to get a much better result with email marketing.
Still not convinced? Let’s go to the next section and see why you need A/B testing for email marketing.
Why do you need A/B testing in email marketing?
Many users skip A/B testing in email marketing because they don't know what or how to test. In reality, A/B testing is easier than you think, and it can uncover huge opportunities to improve your email campaigns.
A/B testing is just a way for you to evaluate and compare two things, then find out the winner.
Smart marketers do A/B testing with email marketing because they want to know about:
- Which subject line gets the best open rate
- Which preheader text generates the most opens
- Which button text, color, or design makes people most eager to click
- Whether the target audience is drawn to emojis or put off by them
- Which imagery in your emails drives better conversions

And much more - there is a lot in email marketing that you can discover with A/B testing!
With A/B testing in email marketing, you can sharpen how you read your metrics, find the email designs that increase conversions, get to know your audience better, and discover which email tactics generate sales and which don't.
Also, small changes can make a big difference in your email marketing results. A little tweaking of color, wording, imagery, or even the shape of your buttons can turn an email into a success.
For example, a company decided to test the effect of a personalized sender name in their emails as opposed to a generic company name, and their campaigns got some pretty interesting results. The version with a real person's name as the sender saw a 0.23% higher click-through rate and a 0.53% higher open rate. While these may look like insignificant figures, that tiny improvement resulted in 131 more leads.
While no two emails or email campaigns are the same, it is safe to assume that a better version of a campaign will bring significant improvements to your engagement rates and revenue. And the only way to find that better version is split testing.
How to start A/B testing in your email marketing?
Setting up A/B testing for your email campaigns is easy with AVADA Email Marketing. You can pick exactly which element you want to test, create two or multiple versions, choose the best sample size for each email variation, and start finding the winning email.
While the setup process for A/B testing is straightforward, I will show you a few details in each step that are essential to getting an accurate picture of the winner. Let's take a look at how to A/B test your email marketing.
Step 1: Decide what you’ll test
When you A/B test with 2 subject lines, the open rate will prove which one of the subject lines appeals most to your subscribers. In another case, if you A/B test 2 different product images in the email layout, you should take a look at both the click-through rate and the conversion rate.
Two email campaigns can show different results depending on the metric you are looking at. Sometimes the plain-text version of an email has a better open rate, but when you want to attract more clicks, a beautifully designed template can be more successful.
Why? Because the designed version can, for example, embed a video as a GIF, which entices more recipients to click.
While you will often want to test more than one thing in email marketing, it's important to test only one thing at a time to get accurate results.
Things you should consider A/B testing in email marketing include:
- Subject line ("Product A on Sale" vs. "Discounts on Product A")
- Call to action ("Buy Now!" vs. "See Pricing Plans")
- Testimonials (whether to include them at all)
- Personalization ("Mr. Smith" vs. "Smith")
- Layout (single column vs. two columns, or different placement of email elements)
- Copywriting (tone, length, word order)
- Links and buttons
- The specific offer ("Save 20%" vs. "Free Shipping")
Each of the things above is likely to have a (negative or positive) effect on different parts of a customer’s buying process. For example, a call-to-action button will affect how many recipients buy your product or click through to the landing page. On the other hand, your subject line will directly affect how many people on the list open your email in the first place.
Think about the result you want to improve when deciding what to test. If not many subscribers are opening your emails, you will most likely want to start by testing the subject line.
A good approach is to test the more critical parts first before moving on to minor elements. Your headlines and call-to-action buttons will have a greater impact on conversions than your body text or the images you use. Test the big things with more impact first, then work through the smaller parts in order of importance.
**But the most important thing to remember before sending out your email versions is to decide what you will consider a successful test.**
First, take a look at your previous campaigns' results. If you've been using the same style for your email campaigns for months or years, you should have a good pool of data to analyze. If your past conversion rate is 10%, then for a start, you may want to increase that to 15%.
Of course, maybe you have another goal for the initial A/B test, such as getting more people to open the email. If that is the case, look at the historical open rate and then decide what improvement you want to see. If improvement doesn’t come with the first set of A/B tests, you can try running another test with two more variants.
Step 2: Pick the correct sample size
In the vast majority of cases, you want your A/B test to reflect your entire email list. It's important to get an accurate picture of how your subscribers respond to your campaigns, and the best way to do that is to draw the test groups randomly from everyone on the list rather than from a single segment.
If you have a big email list with over 1,000 subscribers, I recommend sticking to the Pareto principle, the 80/20 rule, to get the best result.
It means that testing on 20% of the list can bring you 80% of the results. When doing A/B tests in email marketing, send one version to 10% of your subscribers and the other version to a different 10%. Then, send whichever version performed best to the remaining 80% of the list.
I recommend this principle for an extensive email list because you expect statistically significant and accurate results with A/B testing. A 10% sample size for each version would contain enough subscribers to show you which email version had more impact.
When you have a smaller list, the share of subscribers included in the test needs to be larger for the result to be statistically significant. If your list has fewer than 1,000 subscribers, you should probably test on 80-95% of them and send the winning version to the small remaining percentage.
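The 10%/10%/80% split described above is easy to sketch in code. Here is a minimal illustration in Python (the addresses and group sizes are made up, and in practice your email tool would do this for you):

```python
import random

def split_for_ab_test(subscribers, test_fraction=0.10, seed=None):
    """Randomly split a list into group A, group B (test_fraction each),
    and a holdout that later receives the winning version."""
    pool = list(subscribers)
    # Random assignment keeps the groups comparable; hand-picking skews results.
    random.Random(seed).shuffle(pool)
    n = max(1, int(len(pool) * test_fraction))
    return pool[:n], pool[n:2 * n], pool[2 * n:]

# Example with an illustrative 1,000-address list:
emails = [f"user{i}@example.com" for i in range(1000)]
a, b, rest = split_for_ab_test(emails, seed=42)
print(len(a), len(b), len(rest))  # 100 100 800
```

Whatever tool performs the split, the point is the same: both test groups must be drawn at random from the same pool.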
After all, if 14 people click on button A in one email and 16 people click on button B in the other, you still can't really tell which button performs better. Your sample size must be large enough to yield a statistically significant result.
There are a few situations, though, where you should not A/B test your entire email list:
If you want to try something really extreme, you may want to limit the number of people who could potentially see your email, just in case it performs terribly. Maybe you are testing a new language or a controversial topic. In this case, still make sure that at least a few hundred people see each version of your A/B test; a few thousand is even better.
If you have a very large email list, and the email service you’re using charges by the number of email addresses. In this case, you can test the biggest sample you can afford and make sure that the recipients you select are chosen randomly to have an accurate result.
If you run a limited-time offer and want as many conversions as possible, it's a good idea to run a small test batch first with a few hundred people, then send the winning version to your entire list. This resembles the 80/20 principle, except that the test recipients should already be those most likely to convert.
The larger your test sample is, the more accurate your results can be. Make sure that the recipients picked for each group are random, too. Using two lists from different sources or hand-picking recipients is how you skew the testing results. The goal here is to generate empirical data and figure out which version of the A/B testing material really performs best.
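To put the 14-versus-16 example in numbers, a quick two-proportion z-test shows whether a difference between two click (or open) rates is statistically meaningful. This is a standard statistical sketch, not a feature of any particular email tool:

```python
from math import sqrt, erf

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """Return the z-score and two-sided p-value for the difference
    between two rates (clicks, opens, etc.), using a normal approximation."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 14 vs. 16 clicks out of 100 recipients each:
z, p = two_proportion_z(14, 100, 16, 100)
print(p > 0.05)  # True: the difference could easily be chance
```

With a p-value far above the conventional 0.05 threshold, the 14-versus-16 result tells you nothing; a larger sample is needed before declaring a winner.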
Step 3: Select the timing window
When do subscribers normally open an email? The answer is probably clear: it depends.
A user might be online, see the email in the inbox, and open it within 5 minutes. Or they might open the newsletter 4 hours after it was delivered. And if the subject line doesn't grab them, the email might never be opened at all.
These are all real-life scenarios, which is why you need an adequate time window when A/B testing your email marketing.
While for variables like opens and subject lines you can decide the winning version as early as 1-2 hours after sending, you may have to wait a bit longer if you want to measure click-throughs. If you test your newsletter on active subscribers only, you can shorten the waiting time.
When you wait 2 hours, the A/B test's accuracy will be around 80%. The longer you wait before drawing a conclusion, the more accurate your results will be. For an accuracy of 99%, my advice is to wait a full day after sending.
However, be aware that waiting longer is not always better. Some emails are time-sensitive and should be sent as soon as possible. In other situations, if you wait too long, the winning email may end up being sent at an inconvenient time; a weekday versus a weekend can make a big difference in your email performance.
The main rule for defining the right sending time and optimizing the A/B testing process is to monitor your metrics based on your business and industry, and then keep testing.
Step 4: Choose the sending time
Keep in mind that the winning email can be sent automatically once the testing period is completed. As this email will go to most of your subscribers, you can schedule it to reach your audience without spending much extra time on email marketing.
For example, if you are testing 2 subject lines on 20% of the email list (each group contains 10%), you want the winning version to arrive in subscribers' inboxes at 11 AM, and you test the open rate for 2 hours, then you have to start the A/B test at 9 AM so it can run for 2 hours before the winning version is sent out at 11 AM.
With AVADA Email Marketing, you can set your test campaigns to run automatically at, say, 7:45 AM, leaving you about 15 minutes to see which version won and send the winner to the rest of the email list. The app is fast to work with, so you won't need more than a couple of minutes to send the winning email.
Step 5: Test only one variable at a time
Imagine sending two email campaigns at the same time with identical content and sender name; the only difference is the subject line. After about two hours, you see that version A has a better open rate, and now you know which kinds of words in the subject line make people open your emails.
When you test one thing at a time, you can see a clear difference in the data you’re analyzing and draw an accurate conclusion. However, if you also changed the sender’s name or the content for the campaigns above, it would be impossible to conclude which subject line made the difference.
As mentioned in the first step, always start A/B testing your email campaigns with the most important aspects first.
Step 6: Analyze the result
Once you've run the A/B test with the two email versions, it's time to look at the results. There are a few categories of results you should examine:
- The click-through rate
- The open rate
- The conversion rate once recipients reach your website
The reasons for tracking the open rate and the click-through rate are pretty obvious. But you might wonder why you'd want to track the conversion rate after people leave the email, since that may seem out of the email's control.
It is true that the email you send doesn't have much direct influence on conversions once the recipient is on your website. However, it is also important that the message in your email is consistent with the message on the website.
If your email promises visitors a special deal, and the deal is unclear or misleading on the website, you are going to lose many customers. The same can happen if the email doesn't resemble the look and feel of the website: visitors arriving from the email may get confused and wonder whether they are on the right page.
Make sure you can track the conversion rate from each email version so you don't lose sales. The end goal is conversions, not click-throughs. One version may get more click-throughs than the other, but if it doesn't result in more conversions, keep testing until you find an email that delivers both higher click-throughs and higher conversions.
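To make "conversions, not click-throughs" concrete, here is a small sketch that compares two variants on all three rates (the counts are invented purely for illustration):

```python
def variant_rates(sent, opens, clicks, conversions):
    """Return open, click-through, and conversion rates for one variant."""
    return {
        "open_rate": opens / sent,
        "click_rate": clicks / sent,
        "conversion_rate": conversions / sent,
    }

# Hypothetical results: B wins on clicks, but A wins where it matters.
a = variant_rates(sent=1000, opens=250, clicks=60, conversions=18)
b = variant_rates(sent=1000, opens=240, clicks=80, conversions=12)

winner = "A" if a["conversion_rate"] > b["conversion_rate"] else "B"
print(winner)  # A
```

Judging only by click rate would crown B; tracking through to conversions reverses the verdict.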
Best practices for A/B testing
Here are a few best practices that you should keep in mind when running an A/B test with email marketing:
- Test a large enough sample so you get accurate results.
- Always test both versions simultaneously to reduce the chance of time-based factors skewing the results.
- Trust the empirical data you collect, never your gut instinct.
- Test early and at a regular frequency for the best results.
- Use the email marketing tools available to make A/B testing quicker and easier.
To have the best chance of a positive result in conversions from an A/B test, you should have a strategic hypothesis about why one variation should perform better than the other.
The way to do this is to write down a basic hypothesis you believe in before the test begins. A couple of examples of what such a hypothesis might look like:
- You believe using a button instead of a text link will make the call to action stand out more in the email, attracting the reader's attention and earning more click-throughs.
- You believe personalizing the subject line with the subscriber's first name will make the campaign stand out in the inbox and increase the chance it gets opened.
Even if you just think of them in your mind, these simple statements can help you define what you want to test and what you hope to achieve from the A/B testing. Then you can keep the tests focused on things that will help you get the results you want.
However, not every A/B test you run in email marketing will result in a positive increase in the conversion rate. Some variations will actually decrease conversions, and many won't have any effect at all.
It is alright; the key is to make sure you can learn from each A/B test you run in email marketing and use that knowledge to create better email campaigns next time.
Email tool you should use for A/B testing
Now that you know how to A/B test your email campaigns, let's briefly look at a tool that can help you test and find the perfect email.
AVADA Email Marketing can help you get better results and ensure that your campaigns get opened, read, and clicked. The email builder lets you create multiple versions of an email, while the automation feature sends campaigns on autopilot, so you only need to check the results later to see the winner.
Also, the analytics suite will show you which variation of the email performs better and let you set automatic timing for the winning email to reach the rest of your list.
Not only that, but you can also A/B test the design of your opt-in forms to see which style gains you the most subscribers, allowing for a fully customized email experience.
Start A/B testing your email marketing today
As you can see, there is nothing too complicated or mysterious about A/B testing. In fact, you would have a much harder time with email marketing without A/B testing.
Brainstorm how you could improve the email campaign, set up an A/B test, click send, and watch the results. You may see an increase in click-throughs or opens; if you don't, you will still learn something about your audience that helps you create better campaigns in the future.
With AVADA Email Marketing, it’s super easy to set up A/B testing. Start small and try testing the most important elements, such as the subject line, to see if you can improve the open rates. Once you get the hang of A/B testing, you can go on and maximize the performance of your email marketing efforts.