6 A/B Testing Myths: How These Myths Mess with Your Results

By Alex Birkett in A/B Split Testing

A/B testing is fun. It’s popular. It’s getting easier to do.

However, if you’re doing A/B testing wrong, you still may be wasting a ton of time and resources.

Even with the increasing ubiquity of A/B testing, there are still many myths around the subject, some of which are quite common. To really derive value from any technique, it's important to understand it for what it is, including both where it's powerful and where its limits lie.

This article will outline the top myths I’ve seen spouted time and time again in blogs and by consultants.
1. A/B Testing and optimization are the same thing

This may seem pedantic, but A/B testing itself doesn't increase conversions. Many articles say something to the effect of "do A/B testing to increase conversions," but this is semantically inaccurate.

A/B testing, otherwise known as an “online controlled experiment,” is a summative research method that tells you, with hard data, how the changes you make to an interface are affecting key metrics.

What does that mean in non-academic terms? A/B testing is a part of optimization, but optimization encompasses a broader swath of techniques than just the experimentation aspect.
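To make "hard data" concrete: the summative part of an A/B test usually comes down to a significance test on the two conversion rates. Here's a minimal sketch of a two-proportion z-test; the visitor and conversion counts are hypothetical.

```python
from math import sqrt

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B's conversion rate differ
    from variant A's by more than chance alone would explain?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se                           # z-score

# Hypothetical test: 200/10,000 conversions vs. 250/10,000
z = ab_z_test(200, 10_000, 250, 10_000)
# |z| > 1.96 corresponds to p < 0.05 (two-tailed)
print(round(z, 2), "significant" if abs(z) > 1.96 else "not significant")
```

The z-score (about 2.38 here) is what your testing tool translates into the "95% confidence" readout, so the tool is reporting evidence about a change you designed, not generating lift by itself.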

As Justin Rondeau, Director of Optimization at Digital Marketer, put it, “Conversion rate optimization is a process that uses data analysis and research to improve the customer experience and squeeze the most conversions out of your website.”

Optimization is really about validated learning. You’re balancing an exploration/exploitation problem (exploring to find what works, and exploiting it for profit when you do) as you seek the optimal path to profit growth.
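The exploration/exploitation trade-off above can be sketched as a simple epsilon-greedy bandit: a small share of traffic explores at random, and the rest exploits whichever variant currently looks best. The variant names and conversion rates below are made up purely to simulate visitors.

```python
import random

def epsilon_greedy(variants, trials=10_000, epsilon=0.1, seed=42):
    """Epsilon-greedy bandit: with probability epsilon, explore a random
    variant; otherwise exploit the current best observed converter.
    `variants` maps a name to its true conversion rate, which is unknown
    in practice and used here only to simulate visitor behavior."""
    rng = random.Random(seed)
    shows = {v: 0 for v in variants}
    wins = {v: 0 for v in variants}

    def observed_rate(v):
        return wins[v] / shows[v] if shows[v] else 0.0

    for _ in range(trials):
        if rng.random() < epsilon:                 # explore
            choice = rng.choice(list(variants))
        else:                                      # exploit
            choice = max(shows, key=observed_rate)
        shows[choice] += 1
        if rng.random() < variants[choice]:        # simulated conversion
            wins[choice] += 1
    return shows

# Hypothetical true rates; the stronger variant ends up with most traffic
traffic = epsilon_greedy({"control": 0.02, "challenger": 0.04})
print(traffic)
```

The point of the sketch is the balance itself: spend a little traffic learning which variant works, then spend the bulk of it profiting from that knowledge.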
2. You should test everything

I was reading a forum on CRO where someone asked about a particular word choice in a headline (I think it was “awesome” or something), and they were wondering whether or not it was overused.

An “expert” chimed in with advice (paraphrasing here) that you can never know for sure until you test every other similar word (“fascinating,” “incredible,” “marvelous,” etc.).

This is silly advice for 99.95% of people.

Everyone has heard the story about how Google tested 41 shades of blue. Sites like Facebook or Amazon likewise have the traffic to run tests like that, at least in theory.
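A quick sample-size calculation shows why testing tiny word swaps is out of reach for most sites. This is the standard two-proportion normal approximation (95% confidence, ~80% power); the baseline rate and lift are hypothetical.

```python
from math import ceil, sqrt

def sample_size_per_variant(base_rate, lift, alpha_z=1.96, power_z=0.84):
    """Approximate visitors needed per variant to detect a relative
    `lift` over `base_rate` at ~95% confidence and ~80% power, using
    the standard two-proportion normal approximation."""
    p1 = base_rate
    p2 = base_rate * (1 + lift)
    p_bar = (p1 + p2) / 2
    n = ((alpha_z * sqrt(2 * p_bar * (1 - p_bar))
          + power_z * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p2 - p1) ** 2
    return ceil(n)

# A 2% baseline and a tiny 3% relative lift -- roughly the kind of
# effect a single word swap might produce -- per variant:
print(sample_size_per_variant(0.02, 0.03))
```

That works out to roughly 866,000 visitors per variant, and multiplying variants (41 shades, a dozen synonyms) multiplies the traffic bill, which is why this kind of testing is silly advice for almost everyone.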
