Personalized Experiences vs A/B Testing: Different, but Better Together

Last updated by Nicolai Doreng-Stearns in A/B Split Testing, Instapage Updates, Marketing Personalization

At Instapage, we get a lot of questions regarding the difference between A/B testing and personalization. And it’s not surprising when you consider they share a common purpose: to optimize a page’s performance.

But, while they overlap in many advanced marketing strategies, the two are very different. There’s a time and place to use each.

That doesn’t mean, however, the two should remain separate in your optimization strategy. As you’ll soon find, they’re more powerful together.

What is A/B testing, and how does it work in Instapage?

In simplest terms, A/B testing is a method of testing the effectiveness of two different versions of a design: the original, known as the “A” version or “control,” vs. the “B” version, known as the “variation.” After driving equal traffic to each, you can determine which is more effective at accomplishing the goal you aim to achieve.
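The mechanics of that split can be sketched in a few lines. This is a minimal illustration of random assignment, not Instapage's implementation; the `assign_variant` helper and visitor IDs are hypothetical.

```python
import random

def assign_variant(visitor_id: str, split: float = 0.5) -> str:
    """Randomly assign a visitor to the control ("A") or the
    variation ("B"). `split` is the fraction of traffic sent to "A"."""
    return "A" if random.random() < split else "B"

# Tally assignments for 10,000 simulated visitors; with a 50/50 split,
# each version should receive roughly half the traffic.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"visitor-{i}")] += 1
```

Because the assignment is random rather than based on who the visitor is, both versions see a comparable mix of traffic, which is what makes the comparison fair.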

Of course, there’s much more to it: There’s data to be sorted, a hypothesis to be made, and confounding variables to control on the way to statistical significance (for a comprehensive look at A/B testing, read our complete guide).
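To make "statistical significance" concrete: a common way to compare two conversion rates is a two-proportion z-test. The sketch below uses only the standard formula and Python's standard library; the sample numbers are hypothetical.

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion
    rates: conv_a conversions out of n_a visitors vs. conv_b out
    of n_b. Returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: control converts 200/4000 (5.0%),
# variation converts 260/4000 (6.5%).
z, p = two_proportion_z_test(200, 4000, 260, 4000)
```

With these numbers the p-value comes out well under 0.05, so the lift would usually be treated as significant; with smaller samples the same percentage lift often would not be.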

While it's often worth the investment, A/B testing can be a long and difficult process to manage. Luckily, with Instapage, it's easy to set up.

How does A/B testing work in Instapage?

The setup process is difficult with many A/B testing tools. With Instapage, though, you’ll find an intuitive setup process that makes it easy to get started immediately. Here’s what it looks like:

Navigate to the page you want to test. It will be accessible via the left sidebar:

Select it and click "Edit Design" in the top right to open the page in the builder.

In the upper left corner of the builder, click “Create An A/B Test.”

This reveals the option to create a variation of your page. Click “New Variation” to create the “B” version of your page, and a drop-down will appear, from which you can select:

At this point, you should have already consulted your data and formed a hypothesis for testing. Now, in Instapage, you’re going to adjust your “B” page to reflect that.

On the "B" variation, click the element you want to adjust and make your desired changes. When you're done, be sure to save your work and preview both versions of your page. If everything looks good, it's time to return to the dashboard.

Once you’re there, click on the “Analytics” button next to the page you’re A/B testing. Here, you’ll be able to set the traffic split of your A/B test (what percentage of your traffic will see the experiment you’re running):

When you're ready to run the experiment, return to the dashboard, select your landing page, and hit "Publish."

What is Personalization in Instapage?

The new solution is all about experiences. To provide unique experiences, you identify visitors by their UTM parameters (the tracking codes in your ad URLs) within Instapage. In doing so, advertisers and marketers can maximize conversions with 1:1 personalized post-click landing pages.

Personalization aims to serve the most relevant page to each traffic segment, as opposed to finding the average. It's not built on the randomness that A/B testing is, but instead relies on segmenting traffic based on factors like demographics, behavior, referrer, etc., then serving a page tailored to those factors.
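The UTM-based routing described above can be sketched as a simple lookup. This is an illustration only, assuming a hypothetical `utm_campaign` naming scheme and page names; the real audience rules are configured in the Instapage UI rather than in code.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical mapping from a utm_campaign value to the experience
# that segment should see.
EXPERIENCES = {
    "loyalty-members": "loyalty-offer-page",
    "cold-traffic": "intro-offer-page",
}

def choose_experience(url: str, default: str = "default-page") -> str:
    """Serve the page mapped to the visitor's utm_campaign value,
    falling back to the default experience when there is no match."""
    params = parse_qs(urlparse(url).query)
    campaign = params.get("utm_campaign", [None])[0]
    return EXPERIENCES.get(campaign, default)
```

A visitor arriving via `?utm_campaign=cold-traffic` would see the intro offer, while untagged traffic falls through to the default page. The routing is deterministic, which is exactly the contrast with A/B testing's random assignment.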

With the new Personalization solution, marketers can create and optimize unique experiences for each target audience of a post-click landing page, in minutes and at scale, thereby enabling 1:1 personalization.

How personalized experiences are different from A/B testing

After you click "Publish," your A/B test will begin. Here's how it works: if you've set your traffic split to 50/50 in Instapage, 50% of your visitors will see your control, and 50% will see your variation.

However, who arrives where is completely random. This is both a strength and a weakness of A/B testing as an optimization tool. Randomness keeps the experiment from skewing one way or the other, but it only allows you to find the best-performing page on average.

What it doesn't help you do is find the highest-performing page for each visitor. Some visitors, for example, will love design A. Others won't, and they may never convert.

But if design “A” has the highest conversion rate at the end of the experiment, you’ll run design “A.” You’ve found the middle ground — where the majority will convert — but you’ve neglected the minority, which is still a large part of your traffic.

A/B testing example

Consider this example of a hypothetical A/B test run for a sweater company, which aimed to find the highest converting sweater color for a hero image:

Conversion rates soar for the users who see the blue version. Green sweaters don't do too badly either, while red performs worst. In traditional A/B testing, these results might lead you to display the winning blue sweater image to all users, since it has the best chance of driving a conversion.

Let's say 60% of your users preferred blue sweaters, 35% green, and just 5% red. So even though you've optimized for the majority, there are still 40% of visitors who are not instantly attracted by your hero image and are in danger of bouncing straight off your site.

Neglecting that 40% “minority” is what personalization aims to remedy by serving the most relevant page to each traffic segment (red to those who prefer red sweaters, green to green, etc.), as opposed to finding the average.
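The sweater example can be put into rough numbers. The per-color conversion rates below are hypothetical (the original example gives only the 60/35/5 preference split), but they show why serving everyone the "winning" page leaves conversions on the table.

```python
# Hypothetical numbers: share of visitors preferring each color, and the
# conversion rate when a visitor sees their preferred color vs. not.
segments = {"blue": 0.60, "green": 0.35, "red": 0.05}
rate_matched, rate_mismatched = 0.10, 0.02

# One-size-fits-all: everyone sees blue, the A/B test winner.
blended = (segments["blue"] * rate_matched
           + (1 - segments["blue"]) * rate_mismatched)

# Personalized: each segment sees its preferred color.
personalized = sum(share * rate_matched for share in segments.values())
```

Under these assumptions the blended page converts at 6.8% while the fully personalized setup converts at 10%: the A/B winner is the best single page, yet personalization still beats it because no single page matches everyone.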

A/B testing vs personalization: a hypothetical example

Perhaps the best illustration of the difference between the two comes from Harsha Kalapala in a blog post for Bound Engagement:

Two guys walk into a bar. Let’s call them Alex and Ben. In a bar that is using A/B testing, Alex is given a wine list and Ben is given a beer list. The bar tracks whether Alex or Ben buy a drink or walk out empty handed, and attributes the result to the effectiveness of the wine or beer list.

In a bar using personalization, the bartender knows that Alex founded a brewery, so he hands him the beer list. The bartender may not know Ben, but he has purple teeth, so the bartender gives Ben a wine list. Both Alex and Ben buy drinks because the bartender offered what they were each looking for. Here we have two audience segments: brewery owners and people with purple teeth. In the A/B testing bar, one or both could be mis-served. In the personalization bar, each visitor is served based on their identified need.

In this example, the A/B testing bar will only learn whether the beer list or the wine list is more effective at keeping people there. And, while the beer list may be more effective at keeping people in the bar, it doesn’t mean the wine list is ineffective.

With the help of A/B testing, you can optimize all you want, but you’ll never reach true optimal performance by creating an increasingly better average page.

At the same time, you can personalize based on key identifying information, but you don’t have information for everything.

So, what’s the solution? Using the two together.

Why you should A/B test and create personalized experiences

Personalization and A/B testing may be different, but that doesn’t mean they shouldn’t be used together to accomplish a common goal: providing an optimal user experience. To determine how, let’s look at a basic hypothetical.

You run a nationwide chain of gyms that serves a mix of men and women. Currently, you’re offering a New Year’s deal for a year-long membership, so you create an ad and landing page to promote it.

Now, you could start personalizing right away with factors like age and sex:

But this would be getting ahead of yourself. You don’t even know if your general landing page is effective. To start driving segmented traffic immediately would be assuming you had already created the best general design.

So first, you run an A/B or A/B/C test on all your potential prospects (men and women of all ages) by showing them drastically different designs. Then, at the end of the test, you'll know which is the best average base page among them.

After that, you’re free to start personalizing by age, sex, location, etc. Separate your traffic into specific segments and, now, A/B test within those segments. For example:

By segmenting after you first A/B test, you’re starting with a page you know is a strong design based on the response from all your prospects. Then, you make it stronger by boosting its relevance with increasing levels of personalization.

Best practices when creating personalized experiences

When we turn the optimization process above into a graphic format, it looks like this:

Here’s how to complete the process, with best practices, using Instapage:

Create your base page

The first step of this process is to create your base page in the builder. Remember that, no matter your audience, landing pages are created with very specific persuasive elements like social proof, useful media, and a 1:1 conversion ratio. Keep this in mind as you design.

Next, create one or two very different variations of your page. Don’t just change the headline. Don’t simply adjust button color. These should be very different designs that aim to find the global maximum: the best general version of your page.

Run your A/B test

When you’re done designing, it’s time to run your A/B test. In addition to following the steps above, it’s important you adhere to sound experimental design and practice. For more on what you should know before you run an A/B test, have a look at this post.

At the conclusion of your A/B test, you’ll have the best general design for your page. This will be the launchpad for personalized experiences.

Personalize your experiences

In Instapage, it’s easy to create personalized experiences with the new Personalization solution. When you sign in, click any page to see its default experience. This, for example, is the default experience for the Journey Page:

Here, your default experience features a photo of the Golden Gate Bridge. With a menu in the margin, you can edit the URL, Integrations, Conversion Goals, SEO, Social Info, Scripts & GDPR.

But what if you have events in both San Francisco and London? Click the blue "New Experience" button, which opens a dialog where you can name the new experience:

This doesn’t replace your old experience but duplicates it with all its corresponding settings.

Now, to create your London experience, you can click “Edit Design.” This will take you to the builder where you’re free to customize:

Once you’ve finished editing your experience, you have to define its audience. Under the London Journey experience, click the “Audiences” tab, and you’ll see this:

Simply input your parameters in any order you wish, and even create new ones if you need to (you don't have to use them all). Then, click "Save," and only the traffic that runs through URLs tagged with those parameters will see the experience you've created (for more on how URL parameters work, read this post). If a prospect does not match all the parameters, they will see the default experience.
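The match-all rule described above amounts to a subset check: every audience parameter must appear in the visitor's URL with the expected value. This is a sketch of that rule, not Instapage's actual matching code; the parameter values are hypothetical.

```python
def matches_audience(visitor_params: dict, audience_params: dict) -> bool:
    """A visitor sees the personalized experience only when every
    audience parameter is present in their URL with the expected
    value; otherwise they fall through to the default experience."""
    return all(visitor_params.get(k) == v
               for k, v in audience_params.items())

# Hypothetical audience definition for the London experience.
audience = {"utm_campaign": "london-journey", "utm_medium": "cpc"}
```

A visitor with extra parameters beyond the audience definition still matches; a visitor missing even one required parameter sees the default experience instead.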

Test your experiences

To test changes to the London experience and improve its relevance to that audience, you could now create an A/B variation of the London experience. You might test a different image of London, a new headline, longer copy, or even a new layout altogether.

By continually testing these experiences against each other, you get closer and closer to true personalization. And with increasing relevance, you reap better ROI.

A/B testing and personalization are better together

To sum up, it's important to remember that, while they share a common goal, A/B testing and personalization are different: A/B testing randomly splits traffic to find the best page on average, while personalization serves each traffic segment the page most relevant to it.

Each of these methods can be carried out independently of one another. But, should they? Not if you want to combine the best of design and relevance to build the most profitable campaign possible. If you’re ready to start, get an Instapage Personalization Demo today.

Turn More Ad Clicks into Conversions

Try the world's first Post-Click Automation™ solution today. Start a trial or schedule a demo to learn more about the Enterprise plan.