How To Use A/B Testing in Your Digital Marketing Activity

A/B testing – sometimes also called ‘split testing’ – is the process digital marketers use to compare two versions of a marketing asset and identify which one performs better against a given objective. Performing regular A/B tests on an asset allows you to gradually optimise its performance and get better results from your marketing activity over time. The types of assets that can be improved through A/B testing include:

  • Landing Pages: testing different layouts, messaging, or design assets to improve the page’s conversion rate
  • Emails: testing the subject line, design, headline, call to action, or body copy to optimise engagement metrics such as open rates or click-through rates
  • Adverts: testing the messaging or imagery to optimise conversions
  • Any other digital asset tied to a measurable objective

A/B testing allows you to get the most from an audience by methodically determining which elements of your activity work best. In this guide, we will use a landing page as our example of a digital asset to A/B test, with the objective of increasing its conversion rate.

Getting Started – What Should You Test?

Say you are using a landing page to get your audience to download a free nutrition guide as part of your overall marketing strategy. You are seeing conversions coming through, but like any good digital marketer, you want to optimise your results as much as possible, so you decide to run an A/B test. To do so, you need to introduce an otherwise identical version of your landing page with one key difference in one of the page’s elements – for example the headline, use of colour, button, or page layout. This is the element being tested.

In our example, you take a closer look at your page and come up with the hypothesis that changing the colour of your landing page’s button to yellow, making it contrast more with the rest of the page, would do a better job of capturing your audience’s attention and should therefore increase conversions. While you could test almost anything on your marketing assets, it is recommended to keep things simple and change one variable at a time. This way you can more easily identify exactly which change resulted in the uplift.

Your instinct may tell you that changing only one variable on your page won’t lead to a big change. To challenge that view, have a look at the following graph showing the conversion rates of two variants of the same landing page we are currently managing. Variant A (in orange) has a conversion rate of 6.8%, while variant B (in blue) is flying much higher with a conversion rate closer to 15%. The only difference between the two is that variant A’s form has a dropdown list from which users need to choose one of three options before submitting their details, while in variant B we have removed this choice and pre-selected an option for users. In real terms, this means a simple change can more than halve (or double) your Cost Per Registration, and be the difference between a campaign being profitable or not.

[Graph: conversion rate over time for variant A (orange, 6.8%) and variant B (blue, ~15%)]
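To put some numbers on that claim, here is a quick sketch in Python. The conversion rates come from the graph above; the spend and traffic figures are hypothetical, purely for illustration:

```python
# Cost Per Registration (CPR) = total spend / number of registrations.
# Hypothetical campaign: £1,000 of spend buying 10,000 visitors.
spend = 1000.00
visitors = 10_000

for variant, conversion_rate in [("A", 0.068), ("B", 0.15)]:
    registrations = visitors * conversion_rate
    cpr = spend / registrations
    print(f"Variant {variant}: {registrations:.0f} registrations, CPR = £{cpr:.2f}")

# Variant A: 680 registrations, CPR = £1.47
# Variant B: 1500 registrations, CPR = £0.67
```

Same spend, same traffic – but variant B delivers registrations at less than half the cost.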

Split Your Audience

Now that you have picked the element you want to test and have prepared a second version of your landing page, it is time to send some traffic to both variants. The traffic to both pages needs to be randomly selected from the same user pool to make sure you are comparing ‘apples with apples’. Many landing page builders and other digital marketing tools have an A/B test function that lets you set the percentage of traffic assigned to each variant. Once your two variants are running side by side, it is time to wait and collect the data. Note that the more traffic you get, the sooner you will have meaningful results.
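If your tool of choice does not have a built-in split function, the underlying idea is simple to sketch. Here is one way to do it in Python – the 50/50 split and the variant names are assumptions for illustration, not a prescription:

```python
import hashlib

def assign_variant(user_id: str, split: float = 0.5) -> str:
    """Deterministically assign a user to variant A or B.

    Hashing the user ID (rather than picking at random on each visit)
    keeps the assignment stable, so a returning visitor always sees
    the same version of the page.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    # Map the hash to a number in [0, 1] and compare it to the split.
    bucket = int(digest[:8], 16) / 0xFFFFFFFF
    return "A" if bucket < split else "B"

print(assign_variant("user-123"))  # the same user always gets the same variant
```

Adjusting the `split` parameter would let you send, say, 80% of traffic to your proven page and only 20% to the challenger while you build confidence in it.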

Review Your Data & Keep A Winner

After letting your activity run for some time, it is time to review your data and identify which version of your page is the winner. How long, you ask? The answer is simple: wait until you have enough data to reach sufficient statistical confidence in your results. We recommend waiting until you have a 95% confidence rating before concluding which variant works best (if any!). This matters because although one of your landing pages may show a much higher conversion rate than the other, if your confidence rating is only 10%, you are effectively only 10% sure that you can trust your results.
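Most testing tools calculate this confidence rating for you, but it helps to know roughly what is happening behind the scenes. The sketch below uses a standard two-proportion z-test in Python – the visitor and conversion counts are made up for illustration, and real tools may use slightly different methods:

```python
from math import erf, sqrt

def ab_confidence(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test: how confident can we be that the two
    variants' conversion rates genuinely differ?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the assumption that there is no real difference.
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = abs(p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))  # two-sided
    return 1 - p_value  # 0.95 means a 95% confidence rating

# Hypothetical data: variant A converts 14 of 200 visitors (7%),
# variant B converts 24 of 200 visitors (12%).
print(f"Confidence: {ab_confidence(14, 200, 24, 200):.1%}")
```

With these numbers the confidence comes out at roughly 91% – promising, but below the 95% threshold, so you would keep the test running and collect more data.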

Is your confidence rating now high enough? Congratulations, you have found your winning variant! You can now discard the lower-performing variant. Now what?

Repeat

Knowing which of your two variants does a better job of converting your audience is great, but the real value of A/B testing comes from repeating it over time, always testing new variants. The cycle you should go through looks like this:

  1. Test two variants against each other
  2. Keep the winner and discard the loser
  3. Introduce a new variant
  4. Repeat

Being disciplined in going through this optimisation cycle time after time can ultimately have a big impact on your bottom line, because you are determining with confidence exactly which strategies and assets are most effective at reaching your business goals. Each iteration brings you closer to the best possible version of your assets.

For any questions or comments, please feel free to contact Hyphen.