How to Find Quick Wins in A/B Testing
Successful A/B testing takes more than changing a button from red to green. It’s a longer process that pays amazing dividends over time.
However, when you’re starting out, there are low-hanging fruit—easy tests that could make a meaningful impact on your conversions for little effort. Every SaaS and ecommerce site has them. In this post I’ll explain how you can find those quick wins, followed by an actual case study.
Why Opportunities for Quick Wins Exist
You have a complete view of your site and conversion funnel. Every page and step is familiar to you. You can probably list all your pages from memory…
And you forget that others can’t.
As a result, your site and conversion funnel may have small issues that are preventing some visitors from converting. Ambiguously labeled buttons and easy-to-miss links are examples of small issues that could be A/B tested with little effort, but familiarity makes it easy to overlook them. This is true even for organizations with great internal marketing, design, and development teams.
How to Find the Quick Wins
If familiarity is the enemy, then the solution is to pretend to be unfamiliar. Navigate your site as if you’re seeing it for the first time, and note down everything that raises questions or causes confusion.
Easy peasy, right? Well…
As it turns out, overriding a deep familiarity is quite difficult. Just as with proofreading, the most effective method is to have someone else do it for you. If you do want to try it yourself, here’s how:
- Open any text editor in one window, and your site in another.
- Pretend for a moment to be someone from your target audience, who has never been on the site before. Act extra confused and impatient.
- While staying in character, navigate through your site and try to perform the actions you want your visitors to take, such as signing up for a trial or adding an item to the cart.
- Throughout all this, use the text editor to write down every question, issue, or complaint that comes to mind... Be more critical than you think you should be.
Be embarrassingly clueless, and you’ll find your weak spots.
After the exercise, you’ll have a list of issues throughout your site that caused you—and probably your visitors—to be confused or frustrated.
Here’s a selection of notes from a real evaluation I did:
- Free shipping is nice. Wonder why not highlight that?
- "Subscribe" and "items" buttons take up a lot of space. Why do I care if I'm looking at a particular item?
- What's "subscribe" anyway? What is it and why would I click it?
- Featured items scroller is annoying, can't focus.
For each issue or question on your list, think of a small change that might solve it. Now you can run an A/B test to see whether that change makes any difference compared with the original. In many cases it won't, but when it does, you'll have increased your revenue in return for very little effort.
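If you implement such a test yourself rather than through a testing platform, the split itself is straightforward. Here is a minimal sketch (in Python, with illustrative names, not any particular vendor's API) of deterministic, hash-based variant assignment, which ensures a returning visitor always sees the same version:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str,
                   variants=("control", "treatment")) -> str:
    """Deterministically bucket a visitor into a variant.

    Hashing the visitor ID together with the experiment name means
    the same visitor always gets the same variant, and different
    experiments split traffic independently of each other.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The assignment is stable across page loads for the same visitor:
print(assign_variant("visitor-123", "subscribe-button-label"))
```

Because the bucketing is a pure function of the visitor ID and experiment name, it needs no server-side state, which keeps such quick-win tests cheap to set up.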
Example of an A/B Testing Quick Win
The sample list above is from an actual evaluation for Ruby Lane, the largest online marketplace for vintage and antique items. Before starting any major tests to increase their checkout rates, I did this exercise to find some low-hanging fruit.
One of the comments above refers to a "subscribe" button at the top of every product page (screenshot below). I had no idea what it meant or what it did. As it turned out, this button let visitors opt into receiving updates about new item listings from a specific seller, which eventually translates into more purchases. An important button, after all!
This seemed like an opportunity for a quick win. Would more visitors opt in if it’s clear to them what this button does?
After discussing and testing several alternatives, I tested the main contender against the original version. Here’s how each button looked:
We suspected the phrase "Favorite Shop" (with a heart icon) would be more compelling and relevant to visitors. The test took just a few minutes to set up and several days to collect a substantial sample size. In return, it increased the number of visitors who opted in by 53%! The test variation won.
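A lift alone doesn't declare a winner; the result also has to be statistically reliable, which is why the test ran for several days. Below is a sketch of the standard two-proportion z-test one could use to check a lift like this, using hypothetical opt-in counts (not Ruby Lane's actual numbers):

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided two-proportion z-test for an A/B conversion lift."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool the two groups to estimate the standard error under the
    # null hypothesis that both variants convert at the same rate.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical example: 200/10,000 control opt-ins vs. 306/10,000
# in the variant -- roughly the 53% relative lift described above.
z, p = two_proportion_z_test(200, 10_000, 306, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With sample sizes this large, a 53% relative lift is comfortably significant; with only a few hundred visitors per arm, the same lift could easily be noise, which is why even quick wins need a real sample before you call them.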
To be clear, this was one of several minor tests we ran while working on larger tests more closely tied to the checkout process. Some did well and others did not. The point is that we found an easy opportunity and leveraged it for a quick win, with very little effort.
For another example, see my case study of a sign-up button test.
Lessons Learned
- Although A/B testing is a long (and fruitful) process, there are opportunities to increase conversions (or micro-conversions) for relatively little effort.
- The second-best way to find easy and effective A/B test ideas is to explore your site from the perspective of a confused, impatient visitor. The best way is to have someone unfamiliar with your site do it.
- Be mindful of the opportunity cost. Looking for quick wins should not supersede larger, more important tests. Nor should your entire A/B testing plan consist of picking low-hanging fruit.