Having recently published an article about why email isn’t dead, I thought it would be useful to round up some case studies to help marketers inject some life into their own campaigns.
Hopefully they will provide some inspiration for marketers who are in the process of testing their own email messages.
Buttons vs. text links
AWeber Communications ran a test on its email newsletter to find out whether buttons were more effective than text links in encouraging clicks.
Previous testing on the website had found in favour of buttons, so the assumption was that the same would be true of email.
The test used two versions of the same email, identical except that one included a CTA button while the other used text links:
AWeber used clicks-to-opens as its measure of success, so that if one version of the message got an unusually high number of opens, the results wouldn’t be skewed to make that version’s CTA look more effective than it really was.
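To illustrate why this metric matters, here is a minimal sketch of the clicks-to-opens calculation. The figures below are invented for illustration only, not AWeber’s actual data:

```python
def clicks_to_opens(unique_clicks, unique_opens):
    """Clicks-to-opens rate: the share of people who opened the email
    and went on to click. Unlike raw click counts, it is not inflated
    when one version simply happens to be opened more often."""
    return unique_clicks / unique_opens

# Hypothetical example: version A was opened by more subscribers,
# but the same proportion of openers clicked in both versions.
rate_a = clicks_to_opens(unique_clicks=150, unique_opens=1000)  # 0.15
rate_b = clicks_to_opens(unique_clicks=90, unique_opens=600)    # 0.15

# Raw clicks favour A (150 vs. 90), yet clicks-to-opens shows the
# two CTAs performed identically once the open-rate gap is removed.
```

Comparing the two rates rather than the raw click totals is what keeps an unusually strong subject line (which drives opens) from masquerading as a stronger CTA.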
The initial results showed that the attention-grabbing CTA was far more effective than text links.
In the first five split tests, the button drew a clicks-to-opens rate that was on average 33.29% higher than the text link clicks-to-opens rate.
However, it appears that the initial success was down to the novelty factor, as the gap narrowed over time.
After 20 tests the larger CTA was winning by an average of only 17.29%, and after 40 tests the text links were consistently outperforming the button, winning nearly two-thirds of the time, often by double-digit margins approaching 35%.
The key lesson here is that tests should not be run as one-offs, as results can change over time.
Had AWeber run only a handful of tests and then opted for the larger CTA, it might ultimately have caused a decline in clicks over time.
How many CTAs?
Prior to conducting any A/B testing, Whirlpool’s marketing team assumed that having multiple CTAs within an email had a positive impact as it increased the chance that someone would click one of the buttons.
However, it was pointed out that the four CTAs related to separate actions. This distracted from the main focus of the email, which was to drive users to a rebate landing page and encourage them to visit a showroom.
It was therefore decided to run an A/B test to see whether a single CTA would have more success in achieving the team’s objective.
Whirlpool’s original email
The treatment with the single CTA achieved a 42% increase in clicks for Whirlpool.
Whirlpool’s new email design
An anonymous restaurant chain used A/B testing to analyse whether CTA placement had any impact on engagement with its non-offer based promotions.
The business builds its promotions around menu items and pricing rather than discounts, and the aim of the test was to get more people to respond to email campaigns and come into its outlets.
Two versions of the email were deployed, with the subject line and copy the same in both versions. They also both contained two CTAs with different copy.
One of the buttons stayed in the same place in both templates; however, in version A the second button sat near the top of the email copy, while in version B it was just below the main text.
The results showed that the position of the buttons had little impact on the clickthrough rate, suggesting that subscribers aren’t put off by having to scroll down an email.
However, the wording did have a major impact: the better-performing CTA saw a 66% increase in clicks during the campaign.
This case study shows that marketers need to test several different elements to find the optimum email design.