Why Testing Isn’t Optional
A/B and multivariate testing are proven email optimization strategies that can drastically improve open rates, click rates, and the user experience, and ultimately the sales and service conversions that drive your marketing ROI.
When done correctly, testing gives you a definitive answer and eliminates guesswork, allowing you to easily pinpoint which changes need to be made and which elements to leave alone. Testing can also uncover long-standing false assumptions about your audience and their preferences. And because only a portion of your audience is exposed to the variable elements, testing is relatively low risk.
This sounds great. So everybody’s doing it, right?
No. Only 15% of marketers routinely brainstorm new email testing ideas. [1]
If you’re not testing and optimizing, then you’re leaving money on the table. Testing isn’t optional for companies that wish to compete in today’s marketplace.
How does testing lead to more opens, clicks, conversions, and higher customer satisfaction?
• More OPENS: Simple subject line tests can boost open rates tremendously. This exposes your message to more people and allows them to interact with your brand.
• More CLICKS: Testing different “calls to action” or button placements can generate hundreds or thousands more clicks that direct customers to your most valuable landing pages.
• More CONVERSIONS: You can pinpoint what types of messages motivate your audience to buy, but don’t stop there. Testing the placement and design of your calls to action is another huge conversion driver. Maybe your customers gloss over your “Order Online” button because it’s buried in a navigation section. They saw it, but it wasn’t intriguing because YOU didn’t make it seem special. A test will reveal the best design.
• Higher CUSTOMER SATISFACTION: Through testing, customers are telling you what they like and what they don’t like without the need for surveys and polls. They are actually participating in a type of focus group, and you are instantly collecting massive amounts of customer preference data. Listen to what they “say” and use the test data to anticipate what they want in the future. For example, you could run a split test to see if more people buy when you offer “Free Shipping” vs. a “10% discount”. Customers will appreciate seeing the types of offers that appeal to them, which creates higher overall satisfaction with your brand. Testing and optimization allows you to maximize your efforts and create the best customer experience possible.
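For readers curious what a low-risk split like the "Free Shipping" vs. "10% discount" test looks like mechanically, here is a minimal sketch in Python. Your email platform would normally handle this for you; the function name, the 20% test fraction, and the variant labels are illustrative assumptions, not any particular tool's API.

```python
import random

def assign_split(subscriber_ids, test_fraction=0.2, seed=42):
    """Randomly assign a small portion of the audience to two test variants.

    Subscribers outside the test fraction receive the control (default)
    email, so most of the list is never exposed to the experiment.
    Illustrative sketch only -- names and the 20% fraction are assumptions.
    """
    rng = random.Random(seed)          # fixed seed makes the split repeatable
    ids = list(subscriber_ids)
    rng.shuffle(ids)
    n_test = int(len(ids) * test_fraction)
    test_pool = ids[:n_test]
    half = len(test_pool) // 2
    return {
        "variant_a": test_pool[:half],   # e.g. the "Free Shipping" offer
        "variant_b": test_pool[half:],   # e.g. the "10% discount" offer
        "control": ids[n_test:],         # the untouched majority
    }

groups = assign_split(range(10_000))
```

With a 10,000-subscriber list and a 20% test fraction, 1,000 people see each variant and the remaining 8,000 get the control, which is what keeps the downside of a losing variant small.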
Key Testing Takeaways
1) Small changes can have a big impact: how one word created a 22.4% lift in unique clicks
We ran a subject line test for an eNewsletter containing product news, incentives, and lifestyle content. In the subject line, we wanted to see if the word “Specials” prompted more opens than “News”.
Subject A: “Specials from [dealer name]”
Subject B: “News from [dealer name]”
We were surprised to see virtually identical open rates for the two subject lines. Interestingly, though, “Specials from [dealer name]” generated a 22.4% lift in click rate! We originally expected to see a difference in open rate, but the click rate was the real story. When we looked closer, we discovered that nearly all of the additional clicks came from two links within the newsletter: a “special offer” link in the sidebar and a “special offer” link in the lead newsletter article.
After seeing the word “specials” in the subject line, customers actively sought out “special offer” links within the email and clicked through. This test demonstrated that customers prefer a congruent message all the way from the subject line through the body of the email, and it shows how small changes can have a big impact on consumer engagement with your email.
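If you want to check whether a lift like this is real rather than noise, the standard approach is a two-proportion z-test on the click counts. The sketch below uses hypothetical send and click counts chosen to reproduce a 22.4% relative lift; they are not the actual numbers from our test.

```python
import math

def click_lift_and_significance(clicks_a, sends_a, clicks_b, sends_b):
    """Compare two variants' click rates: relative lift of A over B,
    plus a two-sided p-value from a two-proportion z-test
    (normal approximation, pooled variance).
    """
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    lift = (p_a - p_b) / p_b                     # relative lift of A over B
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return lift, p_value

# Hypothetical counts: 612 clicks on "Specials" vs. 500 on "News",
# 10,000 sends each -- a 22.4% relative lift.
lift, p = click_lift_and_significance(612, 10_000, 500, 10_000)
print(f"lift: {lift:.1%}, p-value: {p:.4f}")
```

At these (assumed) volumes the p-value comes out well below 0.05, so a lift of this size would not plausibly be chance. At much smaller send volumes, the same percentage lift can easily fail significance, which is why declaring a winner too early is a common testing mistake.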
2) “New” doesn’t always mean “better”
Many marketers make the mistake of redesigning an email template without thoroughly testing against their current design. Even if your new design is more aesthetically pleasing and follows all the latest best-practice research, you should still run a test against your old design. For example, your customers may like your old design precisely because it’s easy to find what they want, without any flashy distractions. We learned this lesson during an annual creative overhaul. To begin the redesign phase, our teams came up with a new navigation section for our email newsletter template. Compared to our old navigation design, it looked fresh, had eye-catching sub-headers, and used text links rather than image-based links (a best practice for inbox preview optimization). We almost didn’t run a test because everyone agreed that it would surely beat our old design. Luckily, we did test, because the results were surprising. Our old “boring” version had a 13% higher click rate than our new “optimized” version! Clearly, the audience didn’t agree with our new design choices.
Don’t make the mistake of using a gut feeling, or relying too heavily on someone else’s research to guide your new designs. Running a test with your creative on your audience is the best way to make a good decision.
3) Disagreeing on Design or Content? Just test it!
Oftentimes, clients or internal team members will disagree on an approach to email design or content. A valid split test is a great way to settle arguments fairly without creating lingering animosity between groups. The proof will be right there for everyone to see, so no one is left wondering if their idea would have performed better.
Here’s the best part: everybody wins! At the end of the day, both sides benefit because you learned something about your customers and what appeals to them. The point of a test is to learn, so a “failed” outcome is just as valuable as a “successful” one. Even when your hypothesis turns out to be incorrect, both sides still learn a valuable lesson about your audience.
[1] MarketingSherpa, 2012 Email Marketing Benchmark Report
Originally published on The 1to1 Media Blog