A/B Testing for Local Business Websites Is A) Impressive, B) with Caveats

A/B testing diagram from Visual Website Optimizer

With A/B testing software, you can show one group of website visitors one version of a page, show another group a different version, and see which version gets you more clicks or phone calls. Easy, quick, and cheap or free. Impressive!

Except that for a small local business with relatively few site visitors, it may be tough to get meaningful results. Hmmm.

Here’s the deal on A/B testing (AKA split testing, website optimization or conversion rate optimization) for small business websites:

What you can test

Typically, you’ll take your home page or a landing page, alter one element such as the headline or main image or the button language on a form, and test that against the original. Or you might change the placement of an element on the page. Or try out different prices on a special offer. You can also do multivariate testing, changing multiple elements in different combinations.

Small-business friendly A/B test software

Check out Unbounce, Optimizely and Visual Website Optimizer. Free trials are available, and starter fees run $17 to $49 monthly. The software lets you create test pages by rearranging or overwriting existing pages on the fly, without involving IT help. Google Content Experiments is free and promises faster test results (more below), but you create test pages by adding new pages to your site, so IT help may be needed.

OK, so how many site visitors do I need for a test?

This is actually easier to understand by playing with an online sample size calculator like this one from Optimizely.

A/B testing seeks to predict what all site visitors will do based on a sample of site visitors. The sample needs to be large enough so that the result isn’t skewed by random chance. You choose how precise you want the test to be.

The sample size depends on how you set the dials on:

  • Conversion rate and minimum detectable change in the rate. Say you’re testing a new button design. The old button gets a 20% click rate (conversion rate). You set the test to detect a relative change of plus or minus 5% in the new design: that’s 5% of 20%, or plus or minus 1 percentage point. If the new button’s conversion rate falls below 19% or above 21%, the change will be detected; between those numbers, it won’t.
  • Statistical significance is your protection against a FALSE difference between the original and the test, in other words, a chance result. The standard is 95%, meaning a 5% or smaller likelihood that an observed difference is just chance.
  • Statistical power is the likelihood that your test will pick up a TRUE difference. The standard is 80%; in other words, the test will detect a real difference 80% of the time and miss it 20% of the time.

Use the Optimizely calculator to see how changing these elements changes the sample size needed to view the test page. For instance, with 95% statistical significance, 80% statistical power:

  • 20% conversion, 5% change = 19,907 sample size
  • 20% conversion, 10% change = 5,006 sample size
  • 10% conversion, 5% change = 44,846 sample size
  • 10% conversion, 10% change = 11,293 sample size
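The arithmetic behind numbers like these can be sketched in a few lines of Python using the classical two-sided, two-proportion power formula. This is a textbook calculation, not Optimizely's exact method, so its figures will differ somewhat from the calculator's, but the pattern (smaller conversion rates and smaller detectable changes demand much bigger samples) is the same:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variation(baseline, relative_mde, alpha=0.05, power=0.80):
    """Classical two-proportion sample size, per variation.

    baseline     -- current conversion rate, e.g. 0.20 for 20%
    relative_mde -- minimum detectable change as a fraction of baseline,
                    e.g. 0.05 for a 5% relative change
    """
    p1 = baseline
    p2 = baseline * (1 + relative_mde)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided 95% -> 1.96
    z_beta = NormalDist().inv_cdf(power)           # 80% power -> 0.84
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# 20% baseline, 5% relative change, per the example above
print(sample_size_per_variation(0.20, 0.05))
# 10% baseline needs a much larger sample for the same relative change
print(sample_size_per_variation(0.10, 0.05))
```

Run it with your own site's numbers to see why a low-traffic site struggles: cut the baseline conversion rate in half and the required sample more than doubles.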

The other element is time: How long will it take your site to reach the required number of visitors, and is running an experiment in that time frame timely enough to help your business?

(Google Content Experiments takes a different, multi-armed-bandit approach: it quickly starts directing more site visitors to the variation that appears to be winning, so tests finish faster and with smaller samples. There’s some controversy in the testing field over whether that produces unreliable results with small samples.)
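For the curious, the idea behind that approach can be illustrated with Thompson sampling, one common bandit technique for steering traffic toward the apparent winner while still exploring. This is a generic sketch of the technique, not Google's actual algorithm:

```python
import random

def pick_variation(stats):
    """Thompson sampling: draw a plausible conversion rate for each
    variation from its Beta posterior, then send the next visitor to
    the variation with the highest draw."""
    draws = {
        arm: random.betavariate(conversions + 1, misses + 1)
        for arm, (conversions, misses) in stats.items()
    }
    return max(draws, key=draws.get)

# stats: variation -> (conversions, non-converting visits)
stats = {"original": (40, 160), "new_button": (60, 140)}
print(pick_variation(stats))  # usually "new_button", but still explores
```

Because the draws are random, the apparent loser still gets some traffic, which is how the method keeps learning; the debate is over how trustworthy its conclusions are when total traffic is small.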

How to get started

Before testing, put serious effort into choosing the right site change to test:

  • Understand site visitor behavior by studying your Google Analytics or other site metrics data.
  • Focus on something with bottom-line impact.
  • Go for bold changes that visitors won’t miss.
  • Test out your ideas first by showing designs to customers.

The bottom line

Never having dropped $252,000 on testing, I’ll defer to the sadder-but-wiser advice in “What Spending $252,000 on Conversion Rate Optimization Taught Me” by marketing software entrepreneur Neil Patel. He’s talking about using consultants, not DIY software, but the lessons apply:

  • “You tend to get only a few wins each year that drastically affect your revenue.” Give it a year before your test program is cash-flow positive.
  • Your goal is not to get more buttons clicked on your site but to increase your total number of sales. Focus on revenue.
  • “Just because a test says it increases your revenue by 30%, it doesn’t mean it will maintain that increase in the long run.” So you need to keep testing. “It’s a never ending game.”

Image from Visual Website Optimizer
