Don’t Guess What’s Best: Why You Should Be A/B Testing

March 31, 2015


Internet marketers love their letters. SEO, PPC, CTA, and so on and so forth. Here’s one that you probably won’t see tossed around quite as much: A/B testing.


It’s the blunt-instrument method of refining a web page or marketing campaign: take two variations on a theme, set them loose, and see which one customers prefer.


And, if we’re being honest, we can understand why A/B testing doesn’t snag too many headlines. It’s a relatively simple method, technologically and conceptually. In a word, it’s unsexy. But here we are talking about it. Why?


It works.


A/B testing still has an enormous amount of potential value for anyone looking to promote their online presence. And while this utility applies to just about any online content, we’ll be examining the role that A/B testing plays in a recently maligned corner of internet marketing: banner ads.


Why Test?


Planning, placing, and launching a banner ad can make for an enormous amount of work. By the time it’s actually running, there’s a very understandable temptation to leave it alone and turn to other work. Unfortunately, that’s almost never enough.


You’ve probably heard the numbers on banner ads before. Recent articles have come down hard on the venerable medium. The oft-repeated 0.1% average click-through rate (CTR) is grim, and it encourages advertisers to think of banners as cannon fodder: maybe good for establishing presence or snagging a few clicks, but not worthy of close attention.


Grim, but misleading. For one thing, that 0.1% is an average, and one that can be shifted, as we’ll see later, by even the tiniest of changes. For another, mobile devices are breathing new life into the banner. A study released last August showed that on tablets, banner ad CTRs skyrocketed to nearly 0.6%.


There’s still plenty of kick left in banner advertising, and A/B testing remains one of the best ways to find it.


A/B Testing Yields Actionable Data


A/B testing brings market research right down to its empirical roots. Begin with a hypothesis, then attack it from a couple of different angles. If you believe that a small change would make a variant more successful, there aren’t many ways to prove it outside of A/B testing. Project and analyze all you want, but until you actually compare real-world performance, there’s no guarantee that you’re right.


But run an A/B test, and you’ll almost always come away with actionable information. Send two or more variants of an ad into the wild and one will outperform the other(s). If you’ve tested intelligently, you’ll know why.
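

For anyone who wants to be rigorous about declaring that winner, the comparison boils down to a simple proportion test on clicks and impressions. Here is a minimal sketch in Python using a two-proportion z-test; the impression and click counts are hypothetical, included purely for illustration.


    # Minimal sketch: does variant A genuinely beat variant B, or is it noise?
    # The impression and click counts below are hypothetical, not real campaign data.
    from math import sqrt
    from statistics import NormalDist

    impressions_a, clicks_a = 50_000, 65   # variant A (hypothetical)
    impressions_b, clicks_b = 50_000, 48   # variant B (hypothetical)

    ctr_a = clicks_a / impressions_a
    ctr_b = clicks_b / impressions_b

    # Two-proportion z-test: is the gap in CTR bigger than chance alone would explain?
    pooled = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = sqrt(pooled * (1 - pooled) * (1 / impressions_a + 1 / impressions_b))
    z = (ctr_a - ctr_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided

    print(f"CTR A: {ctr_a:.3%}  CTR B: {ctr_b:.3%}  p-value: {p_value:.3f}")


If the p-value comes out small (say, below 0.05), the difference is probably real; if not, keep the test running or call it a wash rather than crowning a winner prematurely.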


A change in background color can be all it takes for option A to leave option B in its dust. Whenever one of these obvious winners appears, it leaves you with a better ad and a valuable piece of market research that you can apply to later campaigns. If a change that you thought was a sure winner comes up empty, then you’ve at least identified a dead end, and can avoid wasting any more time on it.


A/B Testing Increases ROI


Small businesses can never afford to throw money away. Ad budgets, especially for startups, are generally skimpy, and marketers are forced to juggle cost and effectiveness. Banner ads are already a relatively cheap way to promote a brand, but that only covers the I in ROI.


Bumping up the R portion requires some time and effort. Banner ads start with pretty measly conversion rates, but they don’t have to end with them.
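

To see how much leverage even a measly rate gives you, it helps to run the arithmetic on a hypothetical campaign. The spend, traffic, conversion rate, and revenue figures below are invented purely for illustration; swap in your own numbers.


    # Hypothetical figures only: how a modest CTR lift changes the return on a fixed spend.
    spend = 1_000.0            # cost of the placement (the "I" in ROI)
    impressions = 1_000_000
    conversion_rate = 0.02     # share of clicks that go on to buy
    revenue_per_sale = 60.0

    for ctr in (0.001, 0.0012):  # 0.1% baseline vs. a 20% relative lift
        clicks = impressions * ctr
        revenue = clicks * conversion_rate * revenue_per_sale
        roi = (revenue - spend) / spend
        print(f"CTR {ctr:.2%}: revenue ${revenue:,.0f}, ROI {roi:+.0%}")


In this made-up example, nudging the CTR from 0.10% to 0.12% more than doubles the return on the same spend, which is exactly why the small tweaks discussed next are worth chasing.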


Minuscule changes can have shockingly large effects on a campaign. It’s something we’ve seen time and time again; tweaking a word or two can make the difference between thunderous success and squelching failure. Consider these cases of A/B testing success:



  • Simplifying the copy of just one bullet point in a description of an eBook increased conversions by 18.59%.
  • Adding a rating system, free delivery incentive, and payment options to a call-to-action helped raise a mail order company’s basket adds by 24%.
  • Changing the language in a banner ad to be more personal helped Sony bump sales of a laptop, as the altered ad drew 6% more customers than the original.

In each of these cases, an advertiser had a hunch, acted on it, and saw the benefits. The changes were small, and not necessarily intuitive, but they had enormous impacts; in some cases, each altered word likely brought in thousands of extra sales.


Why Not to Test


No system’s perfect, and some marketers have pointed out flaws in A/B testing. One of the most common objections: a comprehensive A/B test gives an ineffective ad too much play, something that could be avoided by fully optimizing a single ad before release.


It’s a fair complaint, but not one, I think, that is ultimately enough to scupper A/B testing. The potential value of an A/B test is simply too high, and that value is too difficult to capture by other means. Even if we were to treat the objection above as a serious drawback, modern analytics suites would likely render it moot. Google Analytics and similar tools give feedback rapid and accurate enough to let advertisers dynamically alter campaigns to reflect the insights gained from A/B testing.


Bottom Line


Sometimes the simplest tools are the best.


An A/B test isn’t going to wow anyone, but its results can. Until the tools used to streamline a banner before launch get much, much better, A/B testing will remain the most reliable way of separating the wheat from the chaff.


If you have plans to run banner ads, then don’t settle for that 0.1%. A well-run A/B test gives you a better shot at a successful campaign and lays the groundwork for future progress. After all, every company is different; when running a banner ad, patience, perseverance, and the willingness to make changes on the fly will take you much further than you’d think.

