What is A/B Testing? – Chapter 4

A/B Testing Best Practices

Last updated on September 28, 2016

1. Test everything

What’s trending? Will it work on your page? Does it convert, or is it a total flop like image sliders?

If you haven’t personally tested it on your landing page, you don’t know whether it’s going to be a success or a failure. What works for other marketers may not work for you, and vice versa. So test away!

Short-form versus long-form pages. Big button vs. little button. Headline “A” versus headline “B.” Your landing pages should always be competing to see which variation is victorious. Remember… There can only be one.

2. No, really everything

Do you need that extra information you’re requesting? Do you need more than a user’s email? Test it and see.

What if you tried using a different image? What if you cut the image entirely? What’s going to perform better?

Make no assumptions. Ask questions. Fail faster so you can find success faster. There is no substitute for the data you’ll get out of this process.

3. Start over, and test it again

One A/B test is not sufficient to achieve maximum results. We’ll say it again: One test does not optimize a landing page. Two tests won’t either. If you want to reach peak optimization, you’ll need to experiment. And once it stops working?

You’ll need to do it all over again. And again. And again.

This is why having software that makes it fast and easy to create landing pages is imperative to the success of any data-driven marketer.

The best part about A/B testing is that your failures, your bad tests, are just as valuable as the successful ones. Keep track of the information your users are providing you.

If you want to master conversion rate optimization, you must have a complete understanding of what can go wrong with a test so you know whether or not your data is reliable.

The 6 Major Mistakes Marketers Make When A/B Testing

1. Letting success (or others’ success) go to your head

A/B testing winners are all over the place. We’ve used A/B testing case studies to prove our points throughout the guide. What you’re not seeing is the vast majority of experiments that fail.

You hear about the big, exciting discovery. You don’t really read about the thousands of failures on the way there. Start with your MVP, and then proceed from there. You’ll find more success in the long run.

2. Assuming huge wins are the norm

When you’ve seen hyper-successful campaigns, it’s easy to assume 20% or 50% increases in conversions are just what happens. Those percentages are not the norm.

Your test is just as likely to come back as a failed experiment, or with results that are merely unremarkable. This is a good thing: it provides just as much insight as a successful experiment (we’ll cover more on that later).

3. Reading neutral results incorrectly

Getting a neutral result is… normal. When it comes to statistical significance, failing to reject the null hypothesis (the assumption that any observed difference is due to chance alone rather than your change) is just part of A/B testing.
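One way to check whether a result is neutral or real is a two-proportion z-test on the conversion counts. This is a generic sketch using only Python’s standard library (the visitor and conversion numbers are made up for illustration):

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    Returns (z, p_value). A p_value above your significance threshold
    (commonly 0.05) means you cannot reject the null hypothesis: the
    observed difference may be due to chance alone.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference).
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal approximation.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical test: control converts 100/1000, variation converts 120/1000.
z, p = two_proportion_z_test(100, 1000, 120, 1000)
# z ≈ 1.43, p ≈ 0.15 — not significant at the 0.05 level,
# even though the variation looks 20% better on the surface.
```

Note how a 10% vs. 12% conversion rate, which sounds like a solid lift, still comes back neutral at this sample size. That’s exactly the kind of result this section is describing.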

4. Only testing obvious elements

We can’t emphasize this enough: there is more to a page than the CTA button and the headline. They are important, but small, subtle elements on the page can be just as effective, or even more so, at persuading your audience to convert.

Have you tested different testimonials? Do you have a full understanding of what convinces a reader to become a user?

5. Not being patient

Set up your testing window in advance so you don’t pull the trigger too early based on incomplete results. MailChimp’s A/B testing software for headlines allows you to set up a test with two headlines on a given percentage of your list. Then, it will automatically send the winning headline to the rest of your list.

6. Forgetting about mobile A/B testing

Mobile responsive landing pages are the newest trend and are here to stay. If you want to stay ahead of your competition, you need to think about mobile A/B testing. Create variations for your mobile A/B tests keeping in mind your visitors’ mobile experience.

Make a realistic estimate of the traffic you expect to receive, determine your end point, and set up your test.
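A testing window can be estimated up front with the standard sample-size formula for comparing two proportions. The sketch below hardcodes a two-sided 95% confidence level and 80% power, and the baseline rate, detectable lift, and daily traffic figures are illustrative assumptions, not recommendations:

```python
import math

def required_sample_size(baseline_rate, minimum_detectable_effect):
    """Approximate visitors needed per variation (normal approximation).

    z-values are hardcoded: 1.96 for a two-sided 95% confidence level,
    0.8416 for 80% power.
    """
    z_alpha, z_beta = 1.96, 0.8416
    # Average of the two rates being compared.
    p_bar = baseline_rate + minimum_detectable_effect / 2
    n = (2 * (z_alpha + z_beta) ** 2 * p_bar * (1 - p_bar)
         / minimum_detectable_effect ** 2)
    return math.ceil(n)

# Hypothetical plan: 10% baseline conversion rate, and we want to detect
# a lift of 2 percentage points (to 12%).
per_variation = required_sample_size(0.10, 0.02)   # ≈ 3,843 visitors each

# With ~500 visitors/day split across two variations, the test window is:
days = math.ceil(2 * per_variation / 500)
```

Running the numbers before launch, and committing to that window, is what keeps you from pulling the trigger early on incomplete results.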

What elements should you A/B test?

A/B testing variants of each landing page element gives you the best chance to increase conversions.
Now that you have a strong foundation to start testing, let’s go through some case studies to give you ideas for testing different elements.

1. Test your headline

Signal v. Noise tested five headlines and subheadings on the Highrise signup page. The winning variation put an emphasis on the 30-day free trial and mentioned that signup takes less than 60 seconds, and increased conversions by 30%.

Highrise Signup Page Headline Test

Laura Roeder tested a long versus short headline on her page.

This was the control:

Highrise Control Page Test with Shorter Headline

This was the challenger:

Highrise Challenger Page Test with Longer Headline

The challenger produced 24.31% more conversions — using a customer’s words to get more newsletter signups.

2. Test your copy and headline

WikiJob A/B tested to see if three lines of customer testimonials had any impact. The page with testimonials resulted in a 34% increase in sales. Here is the original page (without testimonials):

WikiJob Control Page without Testimonials

Here is the 34% higher-converting page with testimonials:

WikiJob Variation Page with Testimonials

Crazy Egg also tested a short page against a long page. The longer version won by 30%:

Crazy Egg Long vs Short Content Comparison Example

3. Test the CTA button

Button size can make a difference. In the example here, the bigger button received 201% more conversions:

This image shows an A/B test performed on CTA button size, with the bigger button increasing conversions

Here, personalizing the CTA button copy increased conversions by 33.1%:

This image shows a personalized button copy test

GoCardless, one of the UK’s leading online direct debit suppliers, decided to change one word in its CTA button copy. The A/B test changed the word “request” to “watch” to see which would have a greater impact on conversions.

Variation B with the CTA button copy “Watch” increased product demo conversions by 139%:

GoCardless A/B Tests One Word to Increase Conversions

4. Test the form

Vendio removed their signup form, which led to an increase in conversions. This was the control page:

Vendio Control Page Test

The second version utilized a two-step opt-in process, so visitors didn’t see the form as soon as they arrived on the landing page:

Vendio Lead Capture Form Removed from Page

The two-step opt-in form became visible only when visitors clicked the call-to-action button. This reduced friction, since visitors weren’t asked to hand over their information the moment they landed on the page. Removing the lead capture form from Vendio’s landing page resulted in a 60% increase in signups.

Removing a trust seal next to the CTA button increased conversions by 12.6%:

A/B Testing Trust Seals on Landing Pages

Interestingly enough, the above case study defies the best practice that trust seals and symbols always increase conversions.

Blue Fountain Media, on the other hand, increased conversions by 42% using a trust seal:

Blue Fountain Media Page with Trust Seal

Removing the “company name” form field from Expedia’s landing page form gave the company a $12 million increase in revenue!

Expedia Landing Page A/B Test

5. Test a “free trial” button

This particular test is often very successful for SaaS company landing pages.

When GetResponse added a “Free trial” CTA button alongside the “Buy Now” button, they increased signups by 158.60%:

Get Response Free Trial CTA Button Test

6. Test adding a human image

Highrise experienced a 102.5% increase in conversions when they added an image of a real person instead of a graphic explaining the service:

Highrise Adding Person to Landing Page for Higher Conversions

7. Test adding a video

Grow Your Own Groceries is a service that teaches its customers how to grow organic food. Adding a video to their landing page resulted in a 12.62% increase in clicks on the Add to Cart button.

Grow Your Own Groceries Video A/B Test

8. Test customer review widgets

Adding social proof and authenticity badges to e-commerce pages helps with conversions. ExpressWatches, an online dealer of Seiko watches, added an authenticity badge to its page declaring it an “Authorized Dealer Site,” which increased sales by 107%:

Seiko Adds Authenticity Badge to Landing Page

A/B testing can dramatically improve your conversion rates. But before you begin testing, it’s important to understand some best practices, major mistakes, and what elements to test. Once you have that foundation, it’s all about repeating the process and adjusting your page variations accordingly.