In discussions about A/B testing, everyone usually agrees that it is a good thing. To get started, however, somebody must first have the time and the resources to build an A/B test rig. That takes weeks, probably involves some interesting big data technology, and rarely gets enough priority to actually get done.
However, there are many ways to do A/B tests quickly.
I was recently part of an Iterate team that helped a local cashback service (startshop.no) improve conversion rates. We managed to get some feedback from users using rapidengage, and proceeded to make a few different landing page variants using Unbounce. Setting up the tests in Unbounce took us a few hours, and it is a very quick way to A/B test. It can be even quicker, though.
After working for a while on our conversion project, we found a little piece of gold: Customers seemed to convert better if we used the word “bonus” instead of “discount” to explain the cashback concept. Some of the cashback rates are 3%, and it seemed like customers would deem that a low discount, but a decent enough bonus. With this change in wording, conversion improved two to three times.
This made me curious – a discount and a bonus of the same size are equally valuable. The landing pages suggested that customers thought otherwise, but there were sources of error: the “discount” and “bonus” landing pages were designed quite differently. Perhaps it all came down to the design of the page – not the wording?
I wanted to test this, but as cheaply as possible. A simple survey would not do: once you have rated a discount or a bonus, you are influenced, and your rating of the other word will be anchored by your rating of the first.
Instead, if I could distribute two almost identical Google Forms, and randomly send half the respondents to one form and the other half to the other form, I would be good. I could ask people to rate a “bonus” in one instance and a “discount” in the other. A quick search gave me the answer: na.gg is a link shortener that can be set up to do A/B testing.
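Under the hood, an A/B link shortener just redirects each visitor to a randomly chosen destination. As a rough sketch of the idea – my own illustration with made-up URLs, not na.gg's actual implementation – it can be done in a few lines of Python:

```python
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical destinations -- stand-ins for the two Google Forms.
VARIANTS = [
    "https://example.com/form-discount",
    "https://example.com/form-bonus",
]

def pick_variant(rng=random):
    """50/50 random split: each visitor gets one of the two destinations."""
    return rng.choice(VARIANTS)

class ABRedirect(BaseHTTPRequestHandler):
    def do_GET(self):
        # Send the visitor on to a randomly chosen variant.
        self.send_response(302)
        self.send_header("Location", pick_variant())
        self.end_headers()

# To run it: HTTPServer(("", 8000), ABRedirect).serve_forever()
```

Every visitor hits the same short link, half land on each form – which is exactly the property the test needs, with no rig to build.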
With one na.gg link pointing to two different Google Forms, I asked people to rate the attractiveness of a service that gave you a 3% discount or bonus, depending on which Google Form na.gg sent you to. This took me less than 5 minutes to set up – no rig required. I distributed it in various channels, and after a while I had about 75 responses.
A/B tests won’t come much cheaper than that.
Besides Google Forms, you can use any link destination that is easy to set up in two or more versions.
The results? On a scale from 1 to 10, the discount was rated at an average of 3.89, while the bonus was rated at 4.39. Not a huge difference, and I would need more respondents for it to be statistically significant (a t-test in R gives a p-value of 0.2 for my results).
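For readers who want to run the same check on their own numbers: the comparison boils down to Welch's two-sample t statistic, which R's t.test computes by default. A minimal Python sketch, using made-up ratings purely for illustration (not my actual survey data):

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic (R's t.test default)."""
    va, vb = variance(a), variance(b)  # sample variances (n-1 denominator)
    return (mean(a) - mean(b)) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical ratings on a 1-10 scale -- illustration only.
discount = [3, 4, 5, 3, 4, 4, 5, 3]
bonus = [4, 5, 5, 4, 6, 4, 5, 3]
t = welch_t(bonus, discount)
```

Turning t into a p-value requires the t distribution's CDF, which R handles for you; in Python, scipy.stats.ttest_ind(a, b, equal_var=False) gives both the statistic and the p-value in one call.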