Are you Split over Email Testing?
Did you pass your email test?
Lately, do your email marketing campaigns bring in lower conversion rates than expected, or are the results better than average? Do you really know what the average is? At times, do you notice a sharp increase in clickthroughs followed by lower conversion rates, or some other unexpected outcome? Should you expect consistent results with every email campaign?
How would you know what to expect without running some sort of test before you send?
Did you know that, statistically speaking, only a little more than half of US-based email marketers routinely test their email? If you are in the group that doesn't test, how can you optimize email conversion rates? Chances are you can't.
If you're not getting optimal conversion rates, look at the marketers who routinely test their emails. The numbers indicate they have seen noted improvement in all areas of email marketing, from an increase in lead generation to elevated ROI percentages. The conclusion: email testing, also known as A/B split testing or simply split testing, is a means to an end: increased ROI.
What is split testing?
There are different types of A/B split testing tools, from simple CGI scripts to advanced software applications. A split test is a type of email test that lets you send different versions of a message to random subsets of your mailing list, then compare the results to see which version is most effective. This type of testing eliminates any demographic or action-based bias that could alter your results. Use split testing as:
1. A tool to solve specific problems by pinpointing issues in your test groups.
2. A means to a greater understanding of web visitor behavior such as buying habits and other priorities that you may not have considered.
3. A discovery tool that will help you reorganize your website design and content.
4. A way to resolve particular problems on your landing pages or other website pages: essentially an analytical tool to discover if anything is going wrong and how to correct it.
How does it work?
To start, select a number of email recipients to whom you want to send your tests and split them into groups. As an example, using Lyris ListManager™ hosted by Dundee Internet, the marketer simply chooses either a percentage or an absolute number of recipients to receive the email test versions, and ListManager™ automatically selects a random subset of the list based on this figure.
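The random selection step can be pictured in a few lines of Python. This is only an illustrative sketch of the idea, not ListManager's actual implementation; the function name, list, and percentage below are made up for the example:

```python
import random

def split_test_groups(recipients, test_percentage, versions=2):
    """Randomly select a subset of the list and deal it evenly
    into one group per test version (illustrative sketch only)."""
    test_size = int(len(recipients) * test_percentage / 100)
    sample = random.sample(recipients, test_size)  # unbiased random subset
    # Round-robin the sampled recipients into one group per version
    return [sample[i::versions] for i in range(versions)]

# Example: send two subject-line variants to 10% of a 1,000-name list
mailing_list = [f"subscriber{i}@example.com" for i in range(1000)]
group_a, group_b = split_test_groups(mailing_list, test_percentage=10)
print(len(group_a), len(group_b))  # 50 recipients in each test group
```

Because every recipient has an equal chance of landing in either group, neither version's results are skewed by who happened to receive it.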
Next, using the content from your latest email marketing campaign, develop alternatives of the same email by changing the subject line, the content, the offer, the images, and so on. For example, you might retain the same template, image, and links but use a different subject line. Or you might keep the same subject line but change the promotion placement in your newsletter, or change the promotion itself.
Each run of the email generates valuable feedback as you compare the responses of the group that received, for example, a new subject line against the responses of the group that received the original one.
“The key to a meaningful split test is to conduct the test by changing only one thing in each version. If you run a test where you change both the subject line and the content, and one version in that test performs better, how do you know if it was the different subject line or the different content that made the difference?”
What should you test?
Start with something simple, as mentioned: the subject line. Consider reworking your subject line for optimal length, different stated benefits, first person, a call to action, and so on. Again, make sure you change only one variable at a time (per test), keeping all other aspects the same, from the time of day you mail to the colors, fonts, and image placement in your message.
Run your tests and compare your results. Consider the responses: Did an older list respond better to the test than a newer list? Which version produced greater open rates, the blue background or the yellow background? Which mailing yielded the most referrals? And so on.
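Comparing results ultimately means comparing response rates between the groups. A minimal sketch of that comparison follows; the response counts and group sizes are invented purely for illustration:

```python
def response_rate(responses, group_size):
    """Percentage of a test group that took the desired action."""
    return 100.0 * responses / group_size

# Hypothetical results from a subject-line split test
rate_a = response_rate(responses=42, group_size=500)  # version A: 8.4%
rate_b = response_rate(responses=61, group_size=500)  # version B: 12.2%

winner = "A" if rate_a > rate_b else "B"
print(f"Version A: {rate_a:.1f}%  Version B: {rate_b:.1f}%  Winner: {winner}")
```

Whatever metric you track, open rate, clickthroughs, or referrals, the same rate comparison tells you which version to roll out to the rest of the list.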
When should you test?
Email is seasonal. Be aware that a change in your email layout may receive a better response rate one month than another. Consider testing all your email versions simultaneously, so outside influences such as a holiday will have minimal effect on your results; unless, of course, you're testing for holiday responses.
Did you pass?
You can run tests all day; however, how you read your test results should determine the next steps for your email campaigns. What is the goal of your campaign? Are you seeking higher open rates, more referrals, or greater clickthroughs? Were your list members asked to fill out a form, send in an email address, or visit a particular website? Consider all these factors when you test, and test for success in your future campaigns.
Need help with split testing? Contact help@dundee.net for more information.