So, you may have heard of this crazy term called “A/B testing” or “split testing”. In the world of marketing it’s a pretty common concept, but if you are new to marketing or perhaps are just looking for a refresher – you’ve come to the right place.
I have been a marketer for more years than I would like to admit, and what I have learned over time is that the success of a campaign boils down to a couple of key things. First, there are fundamental, core techniques used in marketing campaigns, such as proper terminology, hooks, persuasive calls to action, and delivering value to the lead. What makes one version more successful than another is sometimes hard to pinpoint. This is where the second aspect of effective marketing comes in – testing!
Before continuing, I do want to point out that not every marketing tool has the ability to do A/B testing. Most email platforms do, but be sure to keep this in mind if you are shopping around. Tools such as Pardot offer the ability to run your A/B test on a smaller portion of your list; once a “winner” is determined, the remaining contacts receive the winning version of your email. Smart, eh?
You’re probably wondering: what kinds of things can I A/B test? Well, to start with, you can test different versions of emails, website pages, landing pages, and paid advertising copy or images. In the non-digital world, you can test different versions of mail-outs, cold calling scripts, and so on.
Let’s take a look at a landing page example to get a better idea of what one can test for.
In this example we have three different versions. They all have different colored call-to-action buttons, the layouts are different, and even the amount of content is different.
Something very important to keep in mind when A/B testing is to test for small differences one at a time. In the above example, the first test could have involved changing only the button color. If the green button outperformed the others, great – we have a winner. The second test keeps the green call-to-action button but changes the length of the form. Suppose we then find that a short, one-field form captures more lead info. Lastly, we test the overall layout of the page. By moving through these systematic tests, each new landing page version should be more effective than the last.
As you can see, by methodically testing each aspect separately, we are able to accurately determine what is influencing people.
Here is a quick recap of the A/B testing process.
- Define Starting Point: It helps to begin with high traffic areas of your site or emails going to a large list, as that will allow you to gather data faster. Look for pages with low conversion rates or high drop-off rates that can be improved.
- Identify Goals: Your goals are what you use to determine whether or not the variation is more successful. Goals can be anything from button clicks to product purchases and email sign-ups.
- Generate Hypothesis: Once you’ve picked a goal you can begin generating A/B testing ideas and hypotheses for why you think they will be better than the current version. Once you have a list of ideas, prioritize them in terms of expected impact and difficulty of implementation.
- Create Variations: Make the desired changes to an element of your website or email. Make sure to run quality control on your experiment to make sure it works as expected.
- Run Test: Visitors to your site (or recipients of your email) will be randomly assigned to a variation. Their interactions will be measured, counted, and compared to determine how each version performs.
- Analyze Results: Once your experiment is complete, it’s time to look at the results. Your software will present the data from the test, show you the difference between how the two versions of your page performed, and tell you whether the difference is statistically significant. As a marketer it’s always nice to see one version significantly outperform the other; however, this isn’t always the case.
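If you’re curious what “statistically significant” actually means under the hood, here is a minimal sketch of the kind of check most A/B testing software performs: a two-proportion z-test comparing conversion rates. The function name and the example numbers are made up for illustration; in practice your email or landing-page platform does this math for you.

```python
import math

def ab_test_significance(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """Two-proportion z-test: is variation B's conversion rate
    significantly different from variation A's?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    # Standard error of the difference between the two proportions
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return p_value, p_value < alpha

# Hypothetical example: version A converted 200 of 5,000 visitors (4.0%),
# version B converted 260 of 5,000 visitors (5.2%)
p_value, significant = ab_test_significance(200, 5000, 260, 5000)
print(f"p-value: {p_value:.4f}, significant: {significant}")
```

The key takeaway: a bigger raw difference isn’t automatically a “winner” – the sample size matters too, which is why high-traffic pages and large lists (see “Define Starting Point” above) give you trustworthy answers faster.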