When I ask clients if they’ve considered doing some A/B testing with their email campaigns, they often tell me they just don’t have the time.
Back in the day, A/B testing wasn’t very easy to do. Now, marketing automation tools like Eloqua and Marketo make it simple to set up and conduct A/B tests. In fact, you can do a subject line A/B test in Marketo with just a few clicks.
A/B testing is a quick, low-risk way to find out which kinds of creative and messaging resonate best with your clients and prospects. Many people still believe it’s a huge undertaking, though, and they’re missing out on a relatively simple way to improve their campaign results.
For those of you who want to finally give it a go, I’ve identified three steps for successful A/B testing:
1. Identify what you’re going to test
You can test just about anything, from a simple color change to a complete overhaul of your creative. Sometimes the simplest of changes — like tweaking a CTA or swapping out an image — can have the biggest impact. Check out 24 of the Most Surprising A/B Tests of All Time for a few examples.
If you’re looking for more opens, for example, you’ll want to test variations of your subject lines:
- A question vs. a statement
- Longer vs. shorter subject line
- Inclusion of a number (e.g., 3 Steps for Successful A/B Testing)
- Personalization
- Some sort of CTA, like “Act Today”
If you want to improve the number of clicks in your emails, you might test one or more of the following:
- Content format: Do your prospects prefer to read (PDF), listen (podcast), or watch (video)?
- Headlines: Long vs. short, question vs. statement
- Layout: Full HTML design vs. simple letter-type email
- Color: Do people respond better to a blue button vs. a yellow button?
- CTAs: Do prospects respond better with a deadline or when you include potential ROI?
Just remember: Only A/B test one thing at a time. I really can’t emphasize this enough. By focusing on one variable at a time, you can pinpoint which change had the greatest effect. If you change too many things at once, it’s difficult to identify which one actually drove the improvement.
2. Review your historical data and form a hypothesis
Some people jump into A/B testing without a hypothesis or anything to compare it to (e.g., “Let’s do a subject line test!”).
There’s nothing wrong with testing subject lines. In fact, according to Litmus’ 2017 State of Email Creative, they’re the most tested element in an email.
First, though, you should go back and look at some of your previous campaigns and review the different types of subject lines you’ve used. Maybe some subject lines posed a question, and others made a statement.
Next, compare how a handful of campaigns from one group performed against the other. This will help you form your hypothesis. Then, when you run your A/B test using a question versus a statement in the subject line, you’ll have something to compare those results to. If you don’t, you might think your results are great — when they really aren’t.
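That comparison doesn’t need fancy tooling. As a minimal sketch (using hypothetical send and open counts, not real campaign data), here’s a two-proportion z-test you could run on your historical numbers to see whether, say, question subject lines really did out-open statements:

```python
from statistics import NormalDist

def open_rate_z_test(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test: did variant B's open rate differ from A's?

    Returns the z statistic and a two-sided p-value.
    """
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # Pooled open rate under the null hypothesis of no difference
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = (pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical history: statement subject lines (A) vs. questions (B)
z, p = open_rate_z_test(opens_a=220, sent_a=1000, opens_b=275, sent_b=1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally under 0.05) suggests the difference between the two groups is unlikely to be chance, which gives your hypothesis a real baseline to beat.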
3. Trust the data — and act on the results
After you’ve performed your A/B test, there’s one last thing you need to do, and that’s trust the data.
It’s easy to shrug off the results because your gut is telling you they can’t be right. Maybe you’re 100% certain your audience prefers the color blue when the data says they prefer green. Or, maybe your prospects are well educated, but a 5th-6th grade reading level resonates better with your recipients than a 9th-10th grade reading level.
In order to trust the data, you have to make sure you have a large enough sample. You can’t run an A/B test on 10 people and draw any kind of reasonable conclusion. Test sample sizes big enough that you can be confident the results will apply to the rest of your clients and prospects.
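How big is big enough? The standard power-analysis formula gives a rough answer. Here’s a sketch (the baseline rate and lift are hypothetical placeholders you’d replace with your own numbers) that estimates how many recipients each variant needs to reliably detect a given lift in open rate:

```python
import math
from statistics import NormalDist

def sample_size_per_variant(baseline, lift, alpha=0.05, power=0.8):
    """Approximate recipients needed per variant to detect an absolute
    lift in a rate (e.g., open rate) with a two-sided test at the given
    significance level (alpha) and statistical power."""
    p1 = baseline
    p2 = baseline + lift
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for alpha
    z_beta = NormalDist().inv_cdf(power)           # critical value for power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * variance / lift ** 2
    return math.ceil(n)

# Hypothetical: 20% baseline open rate, hoping to detect a 3-point lift
print(sample_size_per_variant(baseline=0.20, lift=0.03))
```

The takeaway is directional: the smaller the lift you want to detect, the larger the list you need — which is why tiny test sends rarely prove anything.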
Still not convinced? You can always retest down the road. Even if you tested subject line length once and found that six words beat 10, test again in six months and see if the data still holds true.
Finally, implement the results! Don’t go through the exercise of reviewing your data, forming a hypothesis, identifying what you’re going to test, and performing the test — and then do absolutely nothing with it.
Start small and just do it
Over the years, I’ve heard a lot of big ideas around A/B testing that didn’t go anywhere because marketers got stuck in analysis paralysis. The trick is to start small, get comfortable with it, and then build on that experience.
At DemandGen, we help our clients manage as much or as little of campaign development and execution as they need, including A/B testing. Let us know if we can help!
Todd Roll is a DemandGen Campaign Manager. With years of experience in digital marketing and Marketo in particular, he provides a strategic campaign framework that helps ensure flawless client campaign execution and aims to turn our clients into marketing heroes.