How 10 Minutes Per Week Can Increase Conversions

Last updated by Fahad Muhammad in A/B Split Testing

Imagine designing a CTA button that instantly quadruples your leads, or putting together a lead capture form that increases your conversions overnight. No research on your part, no trial and error — just great results.

Sounds pretty enticing, right?

This is exactly how a huge chunk of marketers believe A/B testing works. Simply read a great case study online, see what the expert changed on their pages, replicate, and — voilà — a substantial increase in your conversions.  

Sadly, a higher conversion rate doesn’t grow on imaginary A/B testing trees.

Yes, A/B testing is the most efficient way for you to increase your landing page conversions. But you can’t wave a magic wand and expect everything to fall into place. You need to spend some time on your tests if you want to achieve beneficial results.

How much time exactly?

When spent the right way, as little as 10 minutes per week can be enough to increase conversions.

Prioritize your time effectively

The first skill you should have before you begin A/B testing is time management. Before you start watching an A/B testing webinar or listening to a podcast on what to do with your headlines, you need to learn how to spend your time efficiently.

This is what we’re going to do today. We’re going to discuss how you can A/B test your pages by spending just 10 minutes per week on your A/B testing journey.

Begin your A/B testing journey the right way

Spend the first 10 minutes of your A/B testing journey fulfilling the prerequisites of testing. Don’t jump in at the deep end, try on a floaty first.

1. Forget everything

To stop yourself from making any assumptions about your landing pages, pretend you have amnesia and start off with a blank slate. You know better than to make assumptions about your audience, so why would you make assumptions about what elements your audience will respond to?

To increase conversions, you must be data-driven. Test everything — not just the elements you think your audience will focus on.

2. Set an end point before you begin

Don’t rush your A/B tests. To run a successful A/B test, it’s important to establish an end date before you start. Determine the duration of your tests beforehand to make sure you’re getting accurate results. Whatever timeframe you decide on, keep it consistent between tests so you can compare the results evenly.
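One way to pick that end date up front is to estimate how many visitors your test needs before you launch it. Here’s a minimal sketch using the standard two-proportion sample-size formula at 95% confidence and 80% power. The baseline rate, expected lift, and daily traffic figures are hypothetical, not from this article — plug in your own numbers.

```python
import math

# Standard normal quantiles for 95% confidence (two-sided) and 80% power.
Z_ALPHA = 1.96
Z_BETA = 0.8416

def visitors_per_variation(baseline_rate, expected_rate):
    """Visitors needed in EACH variation to detect the expected lift."""
    variance = (baseline_rate * (1 - baseline_rate)
                + expected_rate * (1 - expected_rate))
    delta = expected_rate - baseline_rate
    return math.ceil((Z_ALPHA + Z_BETA) ** 2 * variance / delta ** 2)

# Hypothetical page: 5% baseline conversion, hoping to detect a lift to 6%,
# with 500 visitors per day split evenly across two variations.
n = visitors_per_variation(0.05, 0.06)
days = math.ceil(n / 250)  # 250 visitors per day reach each variation
print(n, "visitors per variation, about", days, "days")
```

A calculation like this keeps you from calling a test early: if the math says 33 days, the end date is 33 days out, no matter how tempting the interim numbers look.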

3. Best practices are not the law

Get inspiration from current trends and best practices, but don’t just do what others have done before.

Experiment with your pages. Create a variation that makes sense for you. Just because Todd increased conversions for his SaaS landing page by changing his button size, doesn’t mean a smaller button is going to work for your real estate landing page. Test your button size for yourself.

The landing page industry is still pretty new. Who knows? Maybe your next page will set the trend for the next big wave!

4. Test one element at a time

That way you don’t split your focus, and your results are clear and easy to interpret. If you changed your CTA button size and your headline in one variation, how do you know which one increased (or decreased) conversions?

You don’t. Take the time you need to create different variations to guarantee you’re making the right changes.

5. Get a handle on basic statistics

An A/B test can tell you which variation “wins”, but that’s only half of the puzzle. Statistical significance tells you whether or not you can rely on the A/B test you just performed.

To confirm statistical significance, you need to make sure your sample size wasn’t too small. For example, if you saw a lift in conversions by testing two variations of your CTA button — and your sample size was only ten people — you can’t trust those results, because you can’t generalize from them. Even if variation B’s button “won,” that might mean only six of the ten people preferred it, a split that could easily happen by chance. That’s not enough data to draw a conclusion. Test a bigger sample size to get accurate results.
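You can check this for yourself with a two-proportion z-test — a standard significance test, sketched here with only Python’s standard library. The sample numbers mirror the ten-person example above:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    Returns the two-sided p-value; a value above 0.05 means the observed
    lift is NOT statistically significant at the usual 95% level.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Convert the z-score to a two-sided p-value via the normal CDF.
    return math.erfc(abs(z) / math.sqrt(2))

# Ten visitors per variation, 4 vs 6 conversions: p is roughly 0.37.
print(two_proportion_z_test(4, 10, 6, 10))

# The same 40% vs 60% split with 1,000 visitors per variation: highly significant.
print(two_proportion_z_test(400, 1000, 600, 1000))
```

Same conversion rates, wildly different conclusions — which is exactly why sample size matters more than the headline “win.”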

Now that you know what to do before you begin A/B testing, let’s talk about what you should test on your pages.

A/B test the most “disruptive” elements first

Disruptive elements are the most obvious landing page elements — the elements that catch your visitors’ eyes instantly. Dedicate your 10 minutes per week to creating variations of these elements.

Wondering what to test? Here are a few ideas.

Test your headline

Your landing page headline is the first thing your visitors see, which is why your headline needs to be attention-grabbing and clear. Ten minutes per week writing a new headline can produce some fascinating results.

Shown below is Awayfind’s headline copy for one of their landing pages:

[Image: Awayfind’s two headline variations]

Guess which variation won?

Variation B. Its sub-headline supported the main headline, and it drove 38% more trial sign-ups than variation A.

Variation B also includes the brand name, highlighted in bold, in the copy.

However, sometimes a little mystery and vagueness also work for your headlines. Speed Vegas conducted an A/B test using a vague headline, as opposed to a specific headline:

[Image: Speed Vegas headline variations A and B]

The headline on variation A reads, “Life is Short. Just Drive.”, while the headline on variation B is “Drive Five Supercars: The US Supercar Tour.”

Variation A converted 34% better than variation B at a 99% statistical confidence level. The moral of the story? Don’t assume anything.

Test your CTA button

Your CTA button is where the action happens, so your button needs to be large, contrast with the page’s colors, and carry personalized copy. Spending about 10 minutes per week creating a different CTA button (changing the color, size, or copy) can teach you a lot about what CTA works best for your business.

Infusionsoft conducted a test on their button copy that ran against all page traffic. Here are the two primary variations created:

[Image: Infusionsoft’s two button copy variations]

Infusionsoft’s team thought the word “demo” in the button copy was a little misleading because it implied more than what was being delivered — causing friction. Turns out their intuition was correct: variation A brought in 48.2% more leads at a 98.5% confidence level.

Test social proof

Having numerical social proof helps your landing page gain credibility with your visitors, as it allows them to see that other people have used (or are currently using) your service. In fact, Instapage’s PPC landing page form also has numerical social proof on it:

[Image: Instapage’s PPC landing page form with numerical social proof]

While this social proof is working for us, it may not work for you. The only way to know is to test it on your landing pages.

DIYthemes tested different variations of their form copy:

[Image: DIYthemes signup form variations, with and without social proof]

Guess which variation won.

Believe it or not, variation A — the one without the social proof — increased conversions by 102.2%!

So, why did social proof fail here? It could be because the social tally above the signup form distracts visitors from entering their email. And adding that one line of text to the CTA most likely took less than 10 minutes.

Best practices are not the law of the land.

The above three elements are not the be-all and end-all for A/B testing; other disruptive page elements are worth testing, too.

So, what did we learn?

Every landing page is different, which is why every A/B test is unique. Learn from your own tests and increase conversions on your terms.

P.S.: Did you know Instapage’s Pro and Premium plans allow you to create unlimited A/B testing variations, so you never need to worry about limitations? Sign up for an Instapage account or upgrade to Pro or Premium here to learn more. It takes less than 10 minutes, too!
