You would think that marketers would be adept at running successful A/B tests. But the stark reality is that the overwhelming majority of A/B tests—around 90 percent of them—fail. Marketers spend weeks, even months, running these tests, only to watch them fall flat nine out of every 10 times.
For those in the dark, A/B testing, in its simplest form, means comparing two versions of a web page to see which one performs better. Such split testing can be applied not only to web pages but also to calls-to-action, landing pages, blog headlines, tweets, and more.
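Deciding which version "performs better" usually comes down to a statistical comparison of conversion rates. As a minimal sketch (not anything Sharp presented, and using entirely made-up numbers), a two-proportion z-test is one common way to check whether variant B's lift over variant A is more than noise:

```python
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert better than A?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))          # one-sided, normal CDF
    return z, p_value

# Hypothetical traffic: A converted 200 of 10,000 visitors, B converted 250
z, p = z_test(200, 10_000, 250, 10_000)
print(f"z = {z:.2f}, one-sided p = {p:.3f}")
```

With these illustrative numbers the p-value falls below the conventional 0.05 threshold, so B's lift would be declared significant; with smaller samples the same 0.5-point lift would not be.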
In a webinar hosted by Kissmetrics, Phil Sharp, marketing VP at UserTesting, identified problems and solutions in A/B testing. Part of the problem, according to Sharp, is that marketers only succeed with A/B testing about 10 percent of the time. But Sharp asserts that the real problem is how much time is wasted on losing A/B tests.
Some marketers might only come up with one winner per year. This is terribly inefficient, to say the least.
Why are A/B tests only successful 10 percent of the time? How can we improve this abysmal performance? What techniques can marketers use to get more winning A/B test results, and get them faster?
Oftentimes, marketers simply have too many ideas. That may seem counterintuitive, because marketers by their very nature are constantly seeking new ideas to A/B test. But try this: Google “A/B testing ideas.” A quick glance at the results and you’ll soon be overwhelmed. There is simply too much information out there, even though some of those ideas are undoubtedly winners in waiting.
In order to conquer information overload, it really helps to have a process, Sharp notes. But before you zone out or doze off (process time is snoozy time, we know), consider this: process equals more revenue. After all, the core goal of a process is to determine what’s stopping customers and prospects from converting, and then fix what’s wrong to increase conversion rates.
Here’s a three-step process Sharp says marketers can use to better understand and profit from A/B testing results.
First, take that list of ideas you’ve got and throw it in the trash bin. Really. Some of those ideas are likely quite good, but are they important for you and your goals? Most of them won’t be. You also need to avoid what Sharp calls the “curse of knowledge,” a cognitive bias that can blind people with an abundance of knowledge or expertise to thinking about problems from different perspectives.
Next, listen to your customers and prospects, and ask the right questions. This sounds obvious, but are you really asking people why they are—or aren’t—converting, and listening to the answers? Survey tools like Qualaroo, which let you run on-site surveys to uncover customer insights, can be extremely helpful here, allowing you to ask simple, straightforward questions like, “If you’re not going to sign up today, can you tell us why not?” And if your business offers customers and prospects live chat, carefully review those conversations; there’s no point maintaining a customer-service channel without checking that it remains viable for your long-term business model.
Finally, use a framework to decide what to test and when. You are not going to quadruple your conversion rate overnight, but simply having a framework will help. State each test’s objective in one or two sentences, then formulate and write down your hypotheses.
Be mindful of opportunity size: if your test wins, what total impact can your company expect to see over the course of a year? Also, always know how long your A/B test will take to run, and write down the three most likely outcome scenarios in advance. If the results of a test surprise you, Sharp adds, you didn’t prepare enough.
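Sharp's opportunity-size question reduces to simple arithmetic. A hedged sketch, using hypothetical traffic, conversion, and revenue figures (substitute your own):

```python
# Hypothetical inputs -- replace with your company's real figures.
monthly_visitors = 50_000          # traffic to the tested page
baseline_rate = 0.020              # current conversion rate (2%)
expected_lift = 0.10               # relative lift if the test wins (+10%)
revenue_per_conversion = 80.0      # average revenue per conversion, in dollars

# Extra conversions the winning variant would add over a full year
extra_conversions_per_year = (
    monthly_visitors * 12 * baseline_rate * expected_lift
)
opportunity_size = extra_conversions_per_year * revenue_per_conversion

print(f"Extra conversions/year: {extra_conversions_per_year:.0f}")
print(f"Annual opportunity size: ${opportunity_size:,.0f}")
```

Running the numbers like this before the test starts makes it easy to rank candidate ideas by payoff and to skip tests whose best-case annual impact is too small to matter.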
In the wake of any given A/B test, ask yourself what worked, what didn’t and what you can learn from the experience. Always remain positive; your problem isn’t a problem so much as it is an opportunity.
By following the three steps mentioned above, you have a strong chance of increasing your A/B testing success rate to 20 or 30 percent, perhaps even more, Sharp concluded.