For four months, I couldn’t understand why my email campaign showed no consistency in its statistics. Time after time I tried new tips and tricks and new styles, and followed expert advice, but none of my emails returned the outcome I was expecting.
This was about the time I was reminded of the power of comparison. Think of how easy it is to convince someone of your opinion if your facts clearly show how the pros and cons weigh up against each other.
What is an A/B split testing campaign?
Basically, you create two versions of one email. Then you send one version to 50% of your database, and the other version to the remaining 50%. (There are online tools that handle the whole split testing and reporting process.)
When the click-to-open, unsubscribe, interaction and bounce-rate statistics for both versions come back, they are compared side by side. In essence, you are running an experiment in which you can compare the engagement levels of the two different emails.
The key with split testing is to change only one element of the email, so you can see how your audience reacts to that specific variation. Don’t get hasty and make multiple changes to your emails, as you won’t be able to pinpoint which change caused the outcome of your split testing experiment.
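To make the mechanics concrete, here is a minimal sketch of the 50/50 split and the side-by-side comparison. The subscriber list, field names and engagement counts are all hypothetical, and your split testing tool will do this for you behind the scenes; this is just to show what is happening under the hood.

```python
import random

def split_audience(subscribers, seed=42):
    """Randomly split a subscriber list into two equal halves (version A and version B)."""
    shuffled = subscribers[:]                 # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

def click_to_open_rate(clicks, opens):
    """Click-to-open rate: unique clicks divided by unique opens."""
    return clicks / opens if opens else 0.0

# Hypothetical subscriber list, purely for illustration.
subscribers = [f"reader{i}@example.com" for i in range(10_000)]
group_a, group_b = split_audience(subscribers)

# Made-up engagement counts for the two versions, to show the comparison step.
stats_a = {"opens": 1_200, "clicks": 300}
stats_b = {"opens": 1_150, "clicks": 210}

rate_a = click_to_open_rate(stats_a["clicks"], stats_a["opens"])
rate_b = click_to_open_rate(stats_b["clicks"], stats_b["opens"])
print(f"Version A: {rate_a:.0%}  |  Version B: {rate_b:.0%}")
```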
What has split testing revealed?
I have yet to explore the full potential of A/B split testing, but so far I have discovered quite a few handy tips. Unfortunately, I must remain discreet about the companies these campaigns were for, and the content therein, but I can tell you what I learnt:
- Background colour
The first time I tried A/B split testing, I trod lightly. The only difference between my two email versions was the background colour! When my background was plain white (easy and simple on the eye), my click-to-open rate was 29%; but when my background was a light grey (perhaps too bold for some), my click-to-open rate dropped to 18%.
- Call-to-action
Don’t overdo it! Unless your email campaign is promoting articles where readers must click through to read more, don’t include too many calls to action. In one email version, I added a ‘buy now’ button below every product. In the second version, I added a single ‘buy now’ button below a row of three products. Version 2 got a whopping 17% higher click-through rate, simply because the call-to-action felt less forced.
- Text length
Some people crave education and information, while others are better entertained by quick facts and light reads. The right balance depends on the target market you are speaking to, but I did find that the more text-heavy the email became, the less interaction there was in its later sections; in other words, skipping the ‘read more’ button and cramming in the full text could lose you some readers.
- Image size
This one is quite obvious, but often overlooked. Basically, if your images are too big, your email becomes less compatible across different devices. If your header, or any other image for that matter, takes too long to load, your readers will get irritated and leave the email before seeing all your hard work.
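If you are assembling the email yourself, one quick way to keep images lean is to downscale and compress them before they go into the template. A minimal sketch using the Pillow imaging library; the 600px width and 80% JPEG quality are assumptions, not hard rules:

```python
from PIL import Image  # Pillow imaging library

MAX_WIDTH = 600        # assumed safe width for most email layouts

def shrink_for_email(source_path, output_path, quality=80):
    """Downscale an image to MAX_WIDTH and save it as a compressed JPEG."""
    with Image.open(source_path) as img:
        if img.width > MAX_WIDTH:
            new_height = round(img.height * MAX_WIDTH / img.width)
            img = img.resize((MAX_WIDTH, new_height), Image.LANCZOS)
        img.convert("RGB").save(output_path, "JPEG", quality=quality, optimize=True)

# Example: shrink a header banner before dropping it into the email template.
shrink_for_email("header_original.png", "header_email.jpg")
```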
If you still feel you have a lot to learn about running a decent email experiment, contact professionals who can help; at times, running a successful campaign takes a lot more than we think.