Don’t Believe the Hype, A/B Testing Is Not Overrated

Last updated on June 20, 2016 by Ted Vrountas in A/B Split Testing

In a perfect world, you’d conjure up a powerful headline, craft some compelling copy, embed some eye-catching images, and together they’d convert ALL your visitors.

You’d make a ton of money, and when your campaign concluded, your landing page would be retired in the digital marketing hall of fame.

Keep dreaming!

In reality, no landing page converts 100% of its visitors — and only 22% of businesses are satisfied with their current conversion rates.

That means there’s always room for improvement. Enter A/B testing software.

What is A/B testing?

A/B testing is a method of optimizing your landing page by isolating one page element at a time and testing it against another variation.

Could your headline be better? Create a variation of your landing page with a different headline.

Are your images boring? Develop another version of your page with some more attractive ones.

Whichever page has a higher conversion rate is the winner. Here’s a diagram to help you visualize what an A/B test might look like:

This diagram explains A/B testing software using two landing page variations.
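The split illustrated above can be sketched in a few lines of code. This is a toy simulation with made-up conversion rates, not how any particular testing tool works: each visitor is randomly shown variation A or B, conversions are tallied per variation, and the page with the higher conversion rate wins.

```python
import random

# Toy sketch of an A/B split: visitors are randomly assigned to a
# variation, conversions are tallied, and rates are compared.
# The "true" conversion rates below are hypothetical.
random.seed(42)

counts = {"A": {"visits": 0, "conversions": 0},
          "B": {"visits": 0, "conversions": 0}}

true_rate = {"A": 0.10, "B": 0.12}  # assumed underlying rates

for _ in range(10_000):
    variation = random.choice(["A", "B"])       # 50/50 traffic split
    counts[variation]["visits"] += 1
    if random.random() < true_rate[variation]:  # did this visitor convert?
        counts[variation]["conversions"] += 1

for name, c in counts.items():
    rate = c["conversions"] / c["visits"]
    print(f"Variation {name}: {c['visits']} visits, "
          f"{c['conversions']} conversions ({rate:.1%})")
```

With enough simulated visitors, the printed rates settle near the assumed 10% and 12%, which is exactly the comparison an A/B test makes for real traffic.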

A/B testing software (like the one built into Instapage) helps people like you and me, who don’t have a ton of technical experience, test landing pages against each other.

Is A/B testing overrated?

A/B testing can have an astounding effect on your conversions, and ultimately, your bottom line.

Just ask Barack Obama’s digital strategists, who boosted campaign donations by $60 million with a single A/B test; or Google, which ran more than 7,000 A/B tests in a single year.

And it’s not just for presidents and Silicon Valley giants either.

Research from Optimizely shows that 93% of businesses use A/B testing or multivariate testing (testing more than one element at once) to optimize their online channels.

You can use A/B testing software to create the highest converting version of your landing page possible. Test things like headlines, images, and button copy to see what has an effect on conversions — without needing to know your way around the back end of a landing page or website.

You’ll be surprised what you find.

Take California Closets, for example: through A/B testing, the company found that the landing page shown below outperformed the original by 439%!

This picture shows how California Closets used A/B testing software to increase their conversion rate.

How to use A/B testing software to get a high ROI

Do: Test important page elements

If you’re having trouble thinking of things to test, here’s a good place to start. Headlines, form fields, copy, CTA buttons, layout, images — they’re all fair game.

Sometimes you’ll find a shorter form performs better than a longer one, as in this case study of Flying Scot Parking’s landing page:

This picture shows how Flying Scot increased its conversions by using a shorter landing page form.

The second form resulted in a 35% boost in conversions!

Sometimes you’ll find that a testimonial headline is better than a question headline, like in the case of this landing page from Laura Roeder, who saw a 24% conversion rate increase from her control:

This picture shows how Laura Roeder uses a question headline to increase landing page conversions.

Versus her variation:

This picture shows how Laura Roeder uses a statement headline on her landing page as variation B.

Don’t waste your time testing things that have already been tested: “Dear (First name)” has been proven to work better than “Dear customer” in almost every case.

Testing for things that are already commonly known is a waste of your time and money (finding out if something has already been tested is as easy as doing a Google search).

Instead, stick to A/B testing the most important elements of a high-converting landing page.

Don’t: Get blinded by the data

It’s February 12th, and you’re a chocolatier who’s just starting to test a landing page for the “Ultimate Chocolate Sampler.” Yum.

It doesn’t take long before you realize you must be a natural when it comes to landing page creation because your conversion rates are through the roof! For real, you’re crushing every industry benchmark!

But… wait a minute. Three days later you finally get around to creating variation “B” of your landing page, and there’s some bad news: the original is outperforming it by 300%! Hm… must be a fluke.

You create version “C” to compare to your original, but this one isn’t much better. The control still outperforms it by 244%. What gives?

If we had to guess, that sky-high conversion rate was probably a result of the heightened demand for chocolate before Valentine’s Day.

Now, if you were only focusing on the numbers, you wouldn’t be inclined to take external variables like holidays into account. Remember: the data you’re generating is based on actual people who are affected by real-world conditions like days of the week, and even weather (that landing page for your snow blower isn’t going to perform as well in the summer heat).

To make sense of your data, you’ll need to think beyond the numbers to interpret it.

Do: Be patient

Ending your test too early could result in a false positive when it comes to choosing the winner of an A/B test. Conversion expert Peep Laja describes a test he ran that many people would have ended too early:

“Two days after starting a test, these were the results:

This picture shows how Peep Laja A/B tested his landing page and the results two days after starting the test.

“The variation I built was losing bad — by more than 89% (and no overlap in the margin of error). Some tools would already call it and say statistical significance was 100%. The software I used said Variation 1 has 0% chance to beat Control. My client was ready to call it quits.

“However, since the sample size here was too small (only a little over 100 visits per variation), I persisted, and this is what it looked like 10 days later:

This picture shows how Peep Laja's A/B test determined a winner after ten days of experimenting.

“That’s right, the variation that had 0% chance of beating control was now winning with 95% confidence.”

The more statistically significant your data is, the more confidently you can draw conclusions from it. So what makes a statistically significant data sample?

Peep says you should test until your page gets around 350-400 conversions.

Other marketers think that you should let the test run for a few months.

Some people say to wait until you’ve generated enough visits to produce a 95% confidence level. The example above shows that even that method can mislead.

Here’s the thing: there’s no magic number or length of time when it comes to testing landing pages. Every business, landing page, and marketing campaign is different, and all of those factors will affect how you run your test.

To make an educated guess on the type of traffic you’ll need to reach statistical significance, try using this calculator (but even this isn’t a sure thing).

The important thing to keep in mind here is that the bigger your sample size gets, the more accurate your data will be, and the more confident you can be about making decisions based on that data.
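To make the sample-size point concrete, here is a minimal sketch of a two-proportion z-test, the standard statistic behind the confidence numbers testing tools report. The visit and conversion counts are invented for illustration; real tools layer more machinery (and safeguards) on top of this.

```python
import math

def ab_confidence(conv_a, visits_a, conv_b, visits_b):
    """Two-sided confidence (0..1) that two variations truly differ,
    via a standard two-proportion z-test. Illustrative sketch only."""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    # Pooled rate under the null hypothesis "A and B convert equally".
    p = (conv_a + conv_b) / (visits_a + visits_b)
    se = math.sqrt(p * (1 - p) * (1 / visits_a + 1 / visits_b))
    if se == 0:
        return 0.0
    z = abs(p_a - p_b) / se
    # Normal CDF via erf; confidence = 1 - two-sided p-value.
    return math.erf(z / math.sqrt(2))

# A tiny early sample: the gap looks huge, but confidence is below 95%...
print(f"{ab_confidence(5, 60, 12, 60):.3f}")
# ...the exact same conversion rates with 10x the traffic clear the bar.
print(f"{ab_confidence(50, 600, 120, 600):.3f}")
```

Notice that both calls compare identical conversion rates (8.3% vs. 20%); only the sample size changes the verdict. That’s the mechanism behind Peep Laja’s flipped result above.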

Don’t: Run multiple tests at once

If variation A’s button converts 20% higher than variation B’s button, and variation B’s headline converts 10% fewer visitors than your control’s headline, which version of your landing page should you run? (Brings you back to high school algebra, doesn’t it?)

Conducting too many tests at once can create all kinds of confusion. If you want to test your headline and your form fields, first test one, and then the other.

With Instapage, you can run multiple tests at once, but we recommend you keep things as simple as possible.

Do: Fail your way to success

This picture shows how Thomas Edison's "fail" quote inspires marketers to A/B test constantly.

Nobody likes to fail, but when it comes to A/B testing, failing just means you’re currently using the best version of your page.

Now, that doesn’t mean you should stop testing altogether when your variation page converts less than the original. In fact, it means just the opposite.

You never know what’s going to make the difference in your conversions. We were surprised to find out that the original color we used on our landing page button converted 34% more than its two variations, shown below.

Variation 1:

This picture shows how Instapage uses a green CTA button to increase their conversion rate by 34%.

Variation 2:

This picture shows how Instapage tested an orange CTA button on its landing page, which lowered the conversion rate.

Does that mean we have the best-converting button color for this landing page?

Nope.

It just means we haven’t found a better one yet.

Don’t: Expect a huge lift in conversions

Think you’re going to be the one to create the next $300 million button?

Think again.

The chances of one small change boosting your conversions by 50%, 100%, or 300% are slim to none. So don’t get caught up chasing an imaginary big fix; stick to the fundamentals of landing page optimization. A bunch of small conversion bumps add up to one big lift.

Here’s an example of how it took six tests of the same page to create a substantial conversion rate boost.

Don’t: Spend ALL your time A/B testing

As Social Triggers founder Derek Halpern explains, spending too much time testing every little thing in search of teeny-tiny lifts can distract you from other ways to boost your conversions (remember when Google tested all 41 shades of blue?). Here’s the example he uses:

“If I get 100 people to my site, and I have a 20% conversion rate, that means I get 20 people to convert.

I can try to get that conversion rate to 35% and get 35 people to convert, or I could just figure out how to get 1,000 new visitors and maintain that 20% conversion rate. You’ll see that 20% of 1,000 (200) is much higher than 35% of 100 (35).”
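Halpern’s arithmetic is worth making explicit, since it’s the whole argument: at some point, more traffic at your current rate beats a hard-won lift in conversion rate. All figures come straight from the quote above.

```python
# Traffic vs. conversion rate, using the numbers from Halpern's example.
visitors_small, visitors_large = 100, 1_000
baseline_rate, improved_rate = 0.20, 0.35

print(round(visitors_small * baseline_rate))  # conversions today
print(round(visitors_small * improved_rate))  # after a big optimization win
print(round(visitors_large * baseline_rate))  # by growing traffic instead
```

Growing traffic tenfold at the same 20% rate yields 200 conversions, dwarfing the 35 you’d get from even a dramatic rate improvement on the original 100 visitors.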

If you’re a member of a small marketing team at your business, you should spend time optimizing other parts of your landing page campaign.

After all, landing pages are only 1/3 of your conversion equation.

Check to see if your emails could use rewriting, or your ads need new images. Analyze other aspects of your campaign to see how they could be affecting your conversion rate instead of focusing all your time on optimizing every little thing on your landing page.

Do: Use A/B testing software to create the absolute best version of your landing page

Some marketers claim your time is better spent creating the perfect landing page than A/B testing variations — and we partially agree.

You should spend time optimizing until you have the best, most beautiful, highest converting landing page possible.

But there are a couple of problems with creating the “perfect” landing page.

First, there’s no such thing (perfect would mean a 100% conversion rate).

Second, you can follow the rules of optimization and every best practice out there, but you still won’t create even a near-perfect landing page on the first try.

How do I create my recipe for success without A/B testing?

Well, you could use multivariate testing. But that’s infinitely more complicated.
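A quick back-of-the-envelope calculation shows why multivariate testing gets complicated so fast: every combination of element variants becomes its own page version competing for traffic. The element counts and traffic target below are hypothetical.

```python
# Why full multivariate testing is heavier than A/B testing: the number
# of page versions is the product of the variants per element.
headlines, images, buttons = 3, 3, 2  # assumed variants per element
combinations = headlines * images * buttons
print(combinations)  # 18 page versions competing for traffic

# If each version should see roughly 1,000 visitors before you judge it:
print(combinations * 1_000)  # 18,000 visitors, vs. 2,000 for one A/B test
```

An A/B test of one element needs traffic for just two versions; a modest three-element multivariate test already needs it for eighteen.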

Other than that, the answer is: you can’t! With the right software, no other way of testing is as accurate or as straightforward as the A/B method.

To begin your testing quest for the highest converting landing page, start building your page today!
