If you’re setting up emails, you’ve probably heard of using A/B testing to optimize your messages. It’s a straightforward process to get started with and it can improve your email deliverability, increase the number of people opening your emails, and help you learn about your audience. This article will cover what A/B testing is and some techniques that are important to get right in order to use it effectively.
What is A/B testing for emails?
A/B testing involves sending two variations of the same email to a small portion of your email list, monitoring how the email performs, and sending the better-performing option to the rest of the list. It’s easy to set up, and most email marketing platforms have built-in tools for it.
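The split described above can be sketched in a few lines of Python. This is a minimal illustration, not how any particular platform implements it; the 20% test fraction, group names, and addresses are all illustrative.

```python
import random

def split_for_ab_test(email_list, test_fraction=0.2, seed=42):
    """Split an email list into two equal test groups (A and B)
    plus a remainder that later receives the winning variant.

    test_fraction is the share of the whole list used for testing;
    the 20% default here is illustrative, not a standard.
    """
    shuffled = email_list[:]
    random.Random(seed).shuffle(shuffled)  # randomize to avoid ordering bias
    test_size = int(len(shuffled) * test_fraction)
    half = test_size // 2
    group_a = shuffled[:half]
    group_b = shuffled[half:half * 2]
    remainder = shuffled[half * 2:]
    return group_a, group_b, remainder

# Example: 1,000 subscribers, 20% used for the test (100 per variant)
subscribers = [f"user{i}@example.com" for i in range(1000)]
a, b, rest = split_for_ab_test(subscribers)
```

The shuffle matters: if your list is sorted by signup date or activity, taking the first N addresses would give one variant a biased audience.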
Tailoring your emails to your audience is a surefire way to communicate your message, but every audience is a little different. Using a small section of your email list to see what works for your audience will get better results than merely guessing what will perform well. If your recipients like your emails, they’re less likely to mark them as spam, more likely to open them, and more likely to connect with the message inside.
Over the longer term, running A/B testing will teach you valuable insights about the people you’re sending emails to, such as the best day and time to send your emails to get higher engagement, which subject lines they are drawn to, and which calls to action inspire them to act. This means you can craft better emails overall. This general improvement in performance will also help increase your email deliverability. Fewer people declaring your emails spam means it’s easier to get your emails to reach people’s inboxes in the first place.
The 3 most common types of A/B testing
There are several things you can vary with A/B testing to improve your emails. Each variable can tell you new things about your audience. Choosing the right one to experiment with will depend on the goal for your email.
The 3 most common variables are:
- Subject line variations
- Message content
- Sender’s information
Subject line variations
A/B testing for subject lines is usually used to improve your open rate, or the percentage of recipients who open your email. The subject line is one of the most critical parts of writing an email. If it doesn’t catch people’s interest, the rest of your message won’t be read.
You can try varying the tone of your subject line, adding emojis, personalization like using the recipient’s first name, or just different phrasing. Sometimes small changes can have a big impact. For instance, say you’ve set up A/B testing with two subject lines, one with an emoji and one without. When you send out the email, the subject line without emojis has a higher open rate. If you’ve set up the test properly, the email with the emoji-free subject line will be sent to most of your email list. You’ll also have learned that maybe your audience considers emojis unprofessional, and can steer clear of them in the future.
Message content
When using A/B testing for the email’s content, it’s common to measure the success of the email by how many clicks the email’s message gets. Clicks are usually measured when someone clicks on a link in your email, not when they open the email (opening the email is measured separately).
Varying the message content can be as simple or complex as you like, although it’s easier to draw conclusions if you only make small changes. You can try sending two completely different messages, but there are a lot of things that might make one email more popular than another, and you won’t be able to tell which one made the difference.
One thing you can try is testing two different calls to action (CTAs) in the email and measuring which one gets clicked more often. You could vary the phrasing, the design, or the placement of the CTA.
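Comparing two CTAs comes down to comparing their click rates. As a rough sketch (the variant labels, CTA wording, and numbers below are illustrative):

```python
def pick_cta_winner(clicks_a, sent_a, clicks_b, sent_b):
    """Return which CTA variant ('A' or 'B') has the higher click rate.

    Clicks count link clicks inside the email, which are tracked
    separately from opens.
    """
    rate_a = clicks_a / sent_a
    rate_b = clicks_b / sent_b
    return "A" if rate_a >= rate_b else "B"

# Hypothetical results:
# Variant A: "Get started" button, clicked 42 times out of 500 sends
# Variant B: "Claim your free trial" button, clicked 65 times out of 500 sends
winner = pick_cta_winner(42, 500, 65, 500)  # "B" wins with a 13% click rate
```

Note that the comparison uses rates, not raw click counts, so it still works if the two groups aren’t exactly the same size.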
Sender’s information
Every time you send out an email to your list, it has to come from someone. Many people look at who sent the email before they open it or click on anything inside, so this variable can have an impact on both metrics.
Varying the sender’s name and email address has a lot to do with what resonates most with your audience. It’s often worth testing to see if the people you’re emailing have a preference. If so, it’s easy to set the sender’s name and email address as the preferred one for future emails.
You might consider whether to say an email is coming directly from an organization, from a department, or from an individual. Each will have a different reputation. An organization might seem more professional, while an individual might be easier for people to connect with.
Strategies to set up good A/B tests
Improperly set up A/B tests won’t give you reliable insight into your audience, and they can damage your sender reputation, hurt your email deliverability, and make it less likely your messages reach the right people.
To help you avoid that, the following sections go over some best practices for creating good A/B tests:
- Use a large enough sample size
- Have a well-defined goal for testing
- Only test one variable at a time
- Experiment with different variables
- Use the data to learn about your audience
- Test both versions of your email before hitting send
Use a large enough sample size
Sending your A/B test to only 4 people on your list won’t give you a good idea of which email performed better. You need to send each version to a large enough group so that you can measure an average performance, not an individual performance. The larger your sample size, the more confident you can be that your results are statistically significant and not just random chance.
Have a well-defined goal for testing
Any well-designed test will have a specific goal outlined that you’re testing for. A good framework for this is the SMART goal: make your goal specific, measurable, achievable, relevant, and time-bound. For an A/B test, this might look like testing which sender name gets more emails opened within 2 hours of sending the test out.
Only test one variable at a time
In order to get good data from your A/B tests, you should only change one variable with each test. Varying multiple things in the same A/B test means you won’t know which change caused which result, whereas a series of single-variable tests over time will let you say with confidence that one strategy works better than another.
Experiment with different variables
It’s important to try different strategies across multiple tests. If you always send out a test varying the sender’s name, you’re probably going to get the same test results over and over again, when you could be trying something new and making improvements in other areas of your email.
Use the data to learn about your audience
Each A/B test sent will tell you another piece of information about the people you sent it out to. Track this information across email campaigns, and as you build up more data, you’ll get a clearer picture of what strategies work and who you’re sending emails to.
Test both versions of your email before hitting send
Most importantly, make sure both versions of the email you are sending out are showing up in people’s inboxes error free. Errors in your emails will skew your view of what your audience thinks is important. If one of your emails has a bug, like the recipient’s name not loading properly or formatting errors in your content, it will perform worse than the other. This could make you lose out on a strategy that would have helped you send better emails.
Testing your emails before they're sent
Setting up a solid email testing strategy is important for more than just A/B testing. It’s a key part of any email strategy, since the way emails show up in your recipients’ inboxes can depend on a lot of variables, including the device they’re using, the browser, the email client, whether they’re using dark mode, and more. Your emails can also have bugs, especially when you’re setting up multiple versions of them.
Testing emails can be done thoroughly and efficiently by using software designed to help. Mailosaur offers software that can be integrated with many other email testing tools to make testing easier for you. Feel free to reach out to us with any questions you have about email testing.