The Power of A/B Testing in Email Designs

You are always looking for ways to improve your campaign performance. You read blog posts and roundups of tips about which subject lines to try or which other changes to implement. The truth is that virtually every email component contributes to your campaign’s performance.

But you don’t know the impact of each small (or not-so-small) change: should you address your subscribers by their first name or their last name? Should you offer free shipping or a 20% discount?

Moosend’s built-in A/B testing feature is the answer to these questions.

Split testing is a feature that lets you test one variation of your email (or an entirely different version altogether) against another, on two randomly generated segments of your list. Moosend handles the split automatically.

Essentially, all you need to do is:

Step 1: Go to New Campaign -> Select A/B Split Test Campaign

Step 2: Choose which element you want to test two versions of: the subject line, the campaign content, or the campaign sender

Step 3: Set the percentage of your selected mailing list that will make up the test groups

Step 4: Choose whether the winner will be the version with the higher open rate or the higher click-through rate

Step 5: Set how long the A/B test will run

THE END: Sit back and indulge in a gluten-free or guilt-free bagel

Click here for a more detailed description of the steps.
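
If it helps to see the five choices side by side, here is a minimal sketch of them as a plain Python dictionary. The field names are illustrative assumptions invented for this example, not Moosend’s actual API or settings schema:

    # Illustrative only: the five steps above, collected in one place.
    # Field names are hypothetical, not Moosend's real API or settings schema.
    ab_test_settings = {
        "test_element": "subject_line",    # or "campaign_content" / "campaign_sender"
        "test_group_pct": 15,              # each of groups A and B gets 15% of the list
        "winner_metric": "open_rate",      # or "click_through_rate"
        "test_duration_hours": 8,          # how long the test runs before the winner goes out
    }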

DID YOU KNOW?

A/B testing runs both versions simultaneously (and automatically!), which is quite different from manually testing one email design this month against another the next.

Let’s take a closer look at A/B testing by Moosend:

Setting the bar: Decide on the (equal) size of your A and B test groups (the blue and red blocks) and slide the bar to set the percentage.

The percentage of the list that will receive the winning version (the golden block) is then calculated automatically.

Select how the winner will be determined: by the higher open rate or the higher click-through rate.

After that, you only need to set the number of hours the test will run before the winning version is automatically sent to the remaining subscribers.
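
To make the mechanics concrete, here is a minimal sketch in plain Python of what the slider and the winner rule describe. This is not Moosend’s implementation; the list size, the 15% setting, and the open counts are made-up example values:

    import random

    def split_for_ab_test(subscribers, test_pct):
        """Randomly carve out two equal test groups; the rest wait for the winner."""
        shuffled = subscribers[:]
        random.shuffle(shuffled)
        n = int(len(shuffled) * test_pct / 100)
        return shuffled[:n], shuffled[n:2 * n], shuffled[2 * n:]  # A, B, remainder

    subscribers = [f"user{i}@example.com" for i in range(10_000)]
    group_a, group_b, remainder = split_for_ab_test(subscribers, test_pct=15)
    print(len(group_a), len(group_b), len(remainder))  # 1500 1500 7000

    # When the test window closes, the chosen metric picks the winner
    # (open rate here; click-through rate works the same way).
    results = {"A": {"sent": 1500, "opens": 330},   # 22% open rate
               "B": {"sent": 1500, "opens": 405}}   # 27% open rate
    winner = max(results, key=lambda v: results[v]["opens"] / results[v]["sent"])
    print(winner)  # "B": version B goes to the remaining 7,000 subscribers

Note the arithmetic: with a 15% setting, 30% of the list is used for testing and the remaining 70% automatically receives the winner.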


Before you start:

It is important to decide what you’ll be testing.

Whether you test different subject lines, campaign content, or campaign senders depends on whether you are trying to increase your open rate or your click-through rate. For the former, focus on A/B testing subject lines or sender details. For a higher click-through rate, test campaign content.

More specifically:

You Can Try Different Subject Lines

It is truly amazing what a small change can accomplish.

Consider the impact of the following subject lines: “Hurry! 20% off all kitchen appliances” vs. “20% off for the next three days”. Which email would you open?

A subject line holds the key to getting your email opened. Testing subject line variations is therefore of primary importance.

Test Changes In Campaign Content

When it comes to the email body, there are more components and variations to consider:

  • For one thing, it could be the newsletter design itself. You can test two radically different approaches to see which one appeals more to your audience’s style.
  • Are you on a first-name basis with your audience? Should you be? You shouldn’t have to wonder. Test a “Hi, Joe” greeting against “Good evening, Mr Green” and find out for yourself. The numbers will do the talking!
  • The body copy is also testing material. Try two different writing styles to find out how your audience wants to be addressed. Friendly or professional, casual or formal: all this information adds to your user persona and gives you insight for future moves.
  • CTAs offer testing grounds, too. Does your audience respond better to a “Buy now” button or a “Find out more” one? Maybe “See pricing and freemium plans” elicits higher engagement. Don’t wonder; measure!
  • Sitting on the fence about which image to use? In the age of the image, the right photo can communicate more about your brand than the finest copy. You’ll be surprised at how measurable an effect changing an image can have on your click-through rate.
  • Which offer would the majority of your users prefer: free shipping or 20% off their order? Find out which one converts them more easily.

Consider Sender Details

One of the more recent trends is personable sender details. More and more companies are sending emails from addresses saved as “Josh from Moosend” rather than “Moosend”. The former looks friendlier and strikes a tone of familiarity, while the latter looks more professional and, possibly, distant. You don’t have to guess which one your audience prefers. May the highest open rate win!

Test Only One Variable At A Time

Remember to test only one factor in each A/B split test; otherwise you won’t be able to tell which change caused the difference, let alone make a well-informed decision on which changes to implement and which to drop.


Useful Tips for A/B Testing in Email Designs

1. Run A Test On A Small Part Of Your Subscribers

Let’s face it: running a test requires, as in any statistical experiment, that your sample is big enough. In the case of email A/B testing, you should have at least 200 subscribers before you start testing your list’s behavior. Subject lines are something you can test even on a list that small.

But if you want accurate results on elements inside your newsletter, such as images, colors, and layouts, we recommend running your A/B tests on bigger lists: lists with thousands, even tens of thousands, of subscribers.

If you have that many people on your mailing list, you can even choose to first segment your list and then run a split test so you can get even more meaningful results per segment.
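
If you want to sanity-check that a winner is a genuine winner rather than random noise, a standard two-proportion z-test is one common way to do it. This is a general statistics sketch, not a Moosend feature; the subscriber counts and open numbers below are invented for illustration:

    import math

    def two_proportion_z_test(opens_a, size_a, opens_b, size_b):
        """Is version B's rate genuinely different from A's, or just noise?"""
        p_pooled = (opens_a + opens_b) / (size_a + size_b)
        se = math.sqrt(p_pooled * (1 - p_pooled) * (1 / size_a + 1 / size_b))
        z = (opens_b / size_b - opens_a / size_a) / se
        # Two-sided p-value from the standard normal CDF.
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    # A 6-point open-rate gap on 100-subscriber test groups is inconclusive...
    z, p = two_proportion_z_test(opens_a=22, size_a=100, opens_b=28, size_b=100)
    print(f"z = {z:.2f}, p = {p:.3f}")  # p ≈ 0.33: could easily be chance

    # ...but the same gap on 1,000-subscriber groups is convincing.
    z, p = two_proportion_z_test(opens_a=220, size_a=1000, opens_b=280, size_b=1000)
    print(f"z = {z:.2f}, p = {p:.3f}")  # p ≈ 0.002: version B really is better

The bigger the test groups, the smaller the difference you can reliably detect; that is why in-body elements with subtler effects call for larger lists.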

2. Protect Your Brand Image

If you have already reached your goals and are now experimenting with new elements, you don’t need to risk all your hard-earned work. Because only a small portion of your list sees the experiment, split testing lets you gauge how your audience reacts without compromising your brand image or the reliability of the test itself.

3. Think Before You Click

Remember that once you’ve hit Send, you cannot “freeze” the A/B testing process to change a variable. This is why you need to decide in advance on the specific change you’ll be working on. You can always run more A/B tests later and progressively apply what you learn to your future campaigns.

Conclusion

In a fast-paced world, it is a very real risk to introduce random changes to your email design that are not part of a coherent, structured plan. If you don’t test and measure the impact of one variation against another, following your instinct alone won’t get you very far.

It is essential to serve the right email to your audience and be spend-savvy. All you have to do is split test, one variable at a time, the elements that can improve your open and click-through rates. Whenever one version tests significantly better than the other, by all means implement the change and re-examine after a while. This way, within a few months you will have a deep understanding of your audience’s preferences.