The most common A/B testing mistakes
Even experienced A/B testers make these costly mistakes. Be warned — the following have the potential to waste your time, resources, and data.
Blindly following best practices, or not following them at all
So you saw that your competitor boosted their conversion rate with an orange button, and they declared in a case study that orange is the best button color. Well, their business, no matter how similar to yours, is not yours.
“Best practices” tested by other businesses aren’t necessarily best for your business. At the same time, you’re not trying to reinvent the wheel. Professional optimizers have run countless tests for you already, and from the results of those tests we know things like:
- Testimonials boost trust
- Benefit-oriented headlines work
- Skimmable copy makes reading easy
- Navigation links kill conversion rate
But things like “the best button color” and “the perfect headline”? They’re all subjective. They don’t necessarily apply to you.
Testing things unlikely to bring a lift
Google once tested 41 shades of blue to determine which was most likely to impact conversion rate. Should you?
No. If you’re like most small to medium-sized businesses, tests like that would be a waste of your resources.
You should not approach A/B testing with a “test everything” mindset. Instead, focus on adjusting things most likely to bring the biggest lifts in conversion rate. Most of the time that’s not subtle changes to colors or typeface.
Sure, you’ll read case studies on the internet about how a business generated millions in profit by changing one word on their landing page, or by removing a single form field — but results like that are incredibly rare (and they may not even be accurate).
Unless you have an abundance of time and resources at your disposal, steer clear of tests like those.
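One way to see why subtle-change tests are so expensive is the standard sample-size formula for a two-proportion z-test: the smaller the lift you expect, the more visitors you need before a test can reliably detect it. Here's a minimal sketch in Python (the function name and the baseline/lift figures are hypothetical, chosen for illustration):

```python
from math import ceil
from statistics import NormalDist

def required_sample_size(p1, p2, alpha=0.05, power=0.80):
    """Visitors needed per variant to detect a lift from conversion
    rate p1 to p2 with a two-sided two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha=0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# A 10% relative lift on a 5% baseline: ~31,000 visitors per variant.
print(required_sample_size(0.05, 0.055))

# A subtle 1% relative lift: roughly 100x more, ~3 million per variant.
print(required_sample_size(0.05, 0.0505))
```

The takeaway: halving the expected lift roughly quadruples the traffic you need, which is why a test on "the perfect shade of blue" can run for months on a small site without ever reaching significance.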