Questions to Ask Yourself Before A/B Testing
A/B testing has the potential to boost anyone’s bottom line, but only when the time is right. Is it right for you? Some questions to ask yourself:
Will A/B testing bring the biggest improvements to my current campaign?
While A/B testing is a valuable aid in optimization, it’s not the only method of optimization. Are there other quick fixes you could make to your campaign before you dedicate time and resources to the A/B testing process?
Have you removed navigation from all your landing pages? Have you figured out your best sources of traffic?
Consider this example from Derek Halpern:
“If I get 100 people to my site, and I have a 20% conversion rate, that means I get 20 people to convert... I can try to get that conversion rate to 35% and get 35 people to convert, or, I could just figure out how to get 1,000 new visitors, maintain that 20% conversion, and you’ll see that 20% of 1,000 (200), is much higher than 35% of 100 (35).”
Sometimes A/B testing isn’t the fastest route to a bottom-line boost.
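Halpern's arithmetic is easy to verify for yourself. A minimal sketch (the numbers are his; the tiny helper function is ours, for illustration):

```python
def conversions(visitors, rate):
    """Expected number of conversions from a given number of visitors."""
    return int(visitors * rate)

# Option 1: lift the conversion rate from 20% to 35% on the same 100 visitors.
print(conversions(100, 0.35))   # 35
# Option 2: keep the 20% rate but grow traffic to 1,000 visitors.
print(conversions(1000, 0.20))  # 200
```

Ten times the traffic at the old rate beats the optimized rate at the old traffic by a wide margin, which is Halpern's point.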
What are my expectations?
Too many people start A/B testing for the wrong reasons. They see that their competitor used the method to boost signups by 1,000%, or read a case study on a business that generated an extra $100,000 in monthly sales with A/B testing. So, they expect to get a big win from a similar experiment.
But big wins aren’t common. If they seem like they are, it’s because nobody publishes blog posts about the dozens of failed tests it took to get those big wins. Martin Goodson, research lead at Qubit, writes:
“Marketers have begun to question the value of A/B testing, asking: ‘Where is my 20% uplift? Why doesn’t it ever seem to appear in the bottom line?’ Their A/B test reports an uplift of 20% and yet this increase never seems to translate into increased profits. So what’s going on?”
The reason, he says, is that 80% of A/B test results are illusory: most of those big 20, 30, or 50% lifts don’t exist.
That stat underlines the importance of proper testing methodology. A single error in setup or analysis can produce a false positive or a false negative.
Don’t A/B test if you’re expecting a conversion-rate lift of 50% or even 20%. Sustainable lifts are usually much smaller.
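One common way illusory lifts are born is “peeking”: checking the test repeatedly and stopping the moment the difference looks significant. The simulation below is purely illustrative (it is not from Goodson’s analysis): both variants are identical, with a true 5% conversion rate, so any declared “winner” is a false positive.

```python
import random

random.seed(0)

def aa_test_with_peeking(n_visitors=10_000, peek_every=500, z_crit=1.96):
    """Simulate an A/A test (two identical variants, 5% conversion rate),
    peeking at the z-score every few hundred visitors and stopping at the
    first 'significant' result. Returns True if a winner was (wrongly)
    declared."""
    a_conv = b_conv = 0
    for i in range(1, n_visitors + 1):
        a_conv += random.random() < 0.05
        b_conv += random.random() < 0.05
        if i % peek_every == 0:
            pa, pb = a_conv / i, b_conv / i
            pooled = (a_conv + b_conv) / (2 * i)
            se = (pooled * (1 - pooled) * (2 / i)) ** 0.5
            if se > 0 and abs(pa - pb) / se > z_crit:
                return True  # declared a "winner" where none exists
    return False

trials = 200
false_positives = sum(aa_test_with_peeking() for _ in range(trials))
print(f"{false_positives / trials:.0%} of A/A tests found an illusory lift")
```

With twenty peeks per test, the false-positive rate climbs well above the 5% you’d expect from a single properly timed significance check, which is one reason so many reported lifts evaporate.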
Have I done my homework?
Landing page best practices already exist. Could you test a landing page built from dense blocks of text against one with skimmable copy? You could, but plenty of studies have shown that people prefer to skim rather than read in full, and block text has the potential to scare your visitors away.
Could you test one version of your landing page with navigation and one without? You could, but time and again navigation has been shown to decrease conversion rates. HubSpot tested this a while ago, pitting a page with navigation against an identical page without it, and the page without navigation converted better.
There’s no use in testing things that have already been shown to work. Read up on how best to build a high-converting landing page before you start running any experiments. If you haven’t already built an anatomically correct landing page, you’re not ready to start A/B testing.
Why am I A/B testing?
Obviously you’re testing to boost your business’s bottom line, but as you’ll discover later, each individual test needs its own rationale. And that rationale should be rooted in data.
Instead of picking a random element to test, find the weak links in your marketing funnel and choose the type of test with the potential to fix them. For Jacob Baadsgaard’s clients, the weakest link is usually traffic:
“I discovered that — on average — all of the conversions in an AdWords account come from just 9% of the account’s keywords.
Yes, you read that right — all of the conversions.
To put it simply, for every 10 keywords you bid on, 9 of them produce nothing! Absolutely nothing! And here’s the kicker — that useless 91% of your keywords eats up 61% of your ad spend.”
More on figuring out what to test in chapter 3.
Am I willing to dedicate the time and resources it takes to A/B test properly?
An A/B test, conducted correctly, should run for several weeks at the very least. Even with a substantial, steady flow of traffic, a single test can take months to reach a conclusive result.
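How long “months” means depends on your traffic and on the size of the lift you hope to detect. A rough sketch using the standard two-proportion sample-size formula (a textbook calculation, not a tool from this guide) makes the point:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(baseline, relative_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a relative lift
    in conversion rate with a two-sided z-test at the given significance
    level and statistical power."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
          + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
         / (p2 - p1) ** 2)
    return ceil(n)

# On a 5% baseline, a huge 50% relative lift needs on the order of
# a couple thousand visitors per variant...
print(sample_size_per_variant(0.05, 0.50))
# ...while a realistic 10% relative lift needs tens of thousands.
print(sample_size_per_variant(0.05, 0.10))
```

The realistic, modest lift requires roughly twenty times the traffic of the dramatic one, which is why properly run tests take so long.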
Are you ready to dedicate that kind of time? Are you ready to dig into your data, find out why people aren’t converting, and brainstorm ways to fix it?
Are you willing to watch your test carefully as weeks go by, making sure that outside threats to validity don’t poison your results?
If the answer is “yes,” continue on to learn how exactly you should begin preparing for your test.