Here’s Why Smart Marketers Use A/B Testing


How often are websites designed using “best practices” or by trusting the experience of a seasoned expert? The answer is, “all too frequently.”

In every speech I give, I offer practical advice on how to get better marketing results by using brain and behavior science. But I also remind audiences of the importance of testing rather than blindly trusting my advice, not to mention the advice they get from books, articles, experts, and other sources. A/B testing is simple, and it's much cheaper than missed conversions.

WhichTestWon (a site you should subscribe to!) just described the results of an experiment that shows why A/B testing is so important.

Their post describes an email-collection popup test for an airline site, run in two markets: Italy and the UK.

The variation was simple: version “A” asked for just an email address, while version “B” added a drop-down selector for the visitor’s preferred airport of origin.

Conventional wisdom says that every form field you add reduces the conversion rate. So, unless you are using additional form fields to filter out unmotivated visitors, one field is usually better than two.

[Image: the two Alitalia popup variations]

Here’s what the tests revealed, according to WhichTestWon:

In Italy, version B won by a landslide, increasing conversions by 80.9%. These results are based on over 650,000 unique visitors, with a confidence level of 99.9%.

In the UK, however, version B was the clear loser. It created only a 0.63% lift, 81% less than version A, the non-personalized, email-only version.
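As an aside, a confidence figure like 99.9% typically comes from a significance test on the two conversion rates. Here’s a minimal Python sketch of a two-proportion z-test; the conversion counts are hypothetical, chosen only to reproduce the reported 80.9% lift, since the post gives visitor totals but not raw conversion numbers.

```python
import math

def ab_confidence(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an A/B experiment.

    Returns the relative lift of B over A and the confidence level
    (1 minus the p-value) that the difference is not due to chance.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via erf gives the two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return (p_b - p_a) / p_a, 1 - p_value

# Hypothetical counts (not from the post) that yield an 80.9% lift.
# At this sample size, the result clears 99.9% confidence easily.
lift, confidence = ab_confidence(10_000, 325_000, 18_090, 325_000)
print(f"Lift: {lift:+.1%}, confidence: {confidence:.2%}")
```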

The first surprise is that in Italy, the additional form field improved conversions dramatically. So much for minimizing form fields as a universal best practice.

The second surprise is that the UK market behaved nothing like the Italian market. If you saw the results from just one market, you would probably assume they’d extend to other markets as well.

The WhichTestWon folks took a stab at explaining the results:

It’s likely that cultural differences between audiences affected behavior.

The team from Webtrends hypothesized that customer care is more highly valued in Italy. Italian users may have, therefore, viewed the additional form field as a positive attempt to create a more personalized web experience.

In contrast, UK visitors may have seen filling in the extra field as double the amount of work.

I am always very leery of after-the-fact explanations of human behavior, but that reasoning is as plausible as any other conclusion.

To me, the big takeaways from these tests have nothing to do with form fields. Here’s what I’d emphasize:

  • Never rely on best practices as a substitute for testing.
  • Never assume that even very strong test results will translate directly to similar, but not identical, situations.
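To make the second point concrete, here’s a small Python sketch using invented counts that roughly mirror the reported lifts. Looking only at a pooled or single-market result would hide how differently the two audiences responded to the same change:

```python
# Invented counts, chosen only to roughly mirror the lifts reported in
# the post (+80.9% in Italy, +0.63% in the UK); the raw data isn't public.
results = {
    "Italy": {"A": (10_000, 325_000), "B": (18_090, 325_000)},
    "UK":    {"A": (10_000, 200_000), "B": (10_063, 200_000)},
}

for market, arms in results.items():
    conv_a, n_a = arms["A"]
    conv_b, n_b = arms["B"]
    # Relative lift of version B over version A within this market.
    lift = (conv_b / n_b) / (conv_a / n_a) - 1
    print(f"{market}: version B lift {lift:+.2%}")

# Italy: version B lift +80.90%
# UK: version B lift +0.63%
# A winner in one segment is only a hypothesis in the next.
```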

Happy testing!

6 Comments
  1. Paul Hassels Monning says

Personally, I believe human behavior should ideally be interpreted both after the fact (or rather, fact-based) and up front. Combining digital analysis in marketing and sales with the merits of neuromarketing (EEG/fMRI) presents the strongest blend to date for arriving at better customer (and prospect, for that matter) insight. In your final comments there’s a double negative. Am I right in assuming you meant “never assume that even very strong test results will translate directly to similar AND identical situations”?

  2. Andrew Munro says

    Hi Roger, great post as usual, and I agree with your argument. On the specific example though, I wonder if the explanation is more obvious. Alitalia is the flag-carrier airline for Italy. Presumably, users in Italy see greater value in providing additional, local information to receive more targeted results. I guess a user in the UK or US would have a very limited choice of local airports from which Alitalia flies? Just a thought.

    1. Roger Dooley says

      Good point, Andrew. That’s a logical explanation, too. If you have a preferred airport for Alitalia, as Italian customers might, you might find it desirable to specify it.

  3. Kenan Nashat says

As Andrew mentioned, the fact that Alitalia is the national carrier could be one logical reason, but it may go beyond that. As a carrier, Alitalia doesn’t have a great reputation outside of Italy, so I wonder whether people are more willing to give information when brand perception is better. For example, if you ran the same A/B test with Lufthansa, a carrier with a strong reputation as a reliable brand, would people be more likely to provide extra information willingly?

    1. Roger Dooley says

      Yet another possibility, Kenan. I think this discussion shows why testing is so important. With no testing, one opinion wins based on what seems logical (or on who has the most power). In an organization that tests, that opinion simply becomes a hypothesis to test. And, it’s entirely possible that an alternative approach might increase or decrease the conversion rate even if the reasoning behind the hypothesis isn’t correct.

  4. Soumya Roy says

A/B testing is so important these days, and I completely agree that designs and strategies should be fact-based. We are a small startup, and we A/B test the design of our landing pages, our AdWords campaigns, and our email campaigns. This has not only helped us pinpoint what is working and what is not, it has also helped us see how marketing perceptions change from niche to niche and market to market. As a startup we always have a tight budget, and monitoring and testing help ensure the highest ROI over time.
