A/B testing is a way to compare your original page (your “A” page) against one or more alternative versions of the page (your “B,” “C,” etc. pages). A/B testing randomly assigns visitors to your test pages, which means you can run real statistical analyses on the way they interact with each page (and most A/B testing platforms will do this analysis for you automatically!)
One of the beauties of A/B testing is that it gives you definitive answers to questions like “If I move this button to the other side of the page, do more of my visitors click it?” In this example, we could compare how many visitors to the A page click the button versus visitors to the B page, and use the data from that test to build a version of the page optimized to get the most clicks on the button.
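To make that concrete, here’s a rough sketch of the kind of comparison an A/B testing platform runs for you behind the scenes; it isn’t any particular platform’s math. The visitor and click counts are made-up numbers, and the two-proportion z-test is just one common way to check whether a difference is real or noise:

```ts
// Sketch of the comparison an A/B platform automates for you.
// Visitor and click counts below are made-up numbers for illustration.

// Standard normal CDF via a common polynomial approximation.
function normalCdf(z: number): number {
  const t = 1 / (1 + 0.2316419 * Math.abs(z));
  const d = 0.3989423 * Math.exp((-z * z) / 2);
  const tail =
    d * t * (0.3193815 + t * (-0.3565638 + t * (1.781478 + t * (-1.821256 + t * 1.330274))));
  return z > 0 ? 1 - tail : tail;
}

// Two-proportion z-test: did version B really get more button clicks than version A?
function compareVariants(clicksA: number, visitorsA: number, clicksB: number, visitorsB: number) {
  const rateA = clicksA / visitorsA;
  const rateB = clicksB / visitorsB;
  const pooled = (clicksA + clicksB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  const z = (rateB - rateA) / se;
  const pValue = 2 * (1 - normalCdf(Math.abs(z))); // two-sided
  return { rateA, rateB, z, pValue };
}

// Example: 1,000 visitors saw each version; 48 clicked the button on A, 71 on B.
// A p-value under 0.05 is the usual (if arbitrary) bar for calling B the winner.
console.log(compareVariants(48, 1000, 71, 1000));
```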
Note: Most A/B testing platforms will automatically cookie your visitors so that they’re sent back to the version of the page they saw before if they return. That means you don’t have to worry about people getting confused by being bounced back and forth between different versions of a page.
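If you’re curious what that looks like under the hood, here’s a simplified sketch of cookie-based “sticky” assignment. Your testing platform handles all of this for you; the `ab_variant` cookie name and the 90-day lifetime below are just illustrative assumptions:

```ts
// Sketch of how a testing tool keeps a returning visitor on the same variant.
// Runs in the browser; "ab_variant" is a hypothetical cookie name for illustration.

function getCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp("(?:^|; )" + name + "=([^;]*)"));
  return match ? decodeURIComponent(match[1]) : null;
}

function assignVariant(variants: string[] = ["A", "B"]): string {
  const existing = getCookie("ab_variant");
  if (existing && variants.includes(existing)) {
    return existing; // returning visitor: show the version they saw before
  }
  // New visitor: pick a variant at random and remember the choice for 90 days.
  const chosen = variants[Math.floor(Math.random() * variants.length)];
  document.cookie =
    "ab_variant=" + encodeURIComponent(chosen) + "; max-age=" + 60 * 60 * 24 * 90 + "; path=/";
  return chosen;
}

const variant = assignVariant(); // e.g. "A" or "B" - decide which page to show
```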
When:
Rule 1: Only A/B test when you want to figure out a better way to get your visitors to do something on your site.
One of my clients went to a marketing convention recently and was told that making things super obvious and easy is always a good idea on a website, so he suggested we put buttons around the site encouraging people to click for more information. While this initially sounded like it could be a good idea, it was important to test that new button page against the original. It turns out visitors were much more likely to go where we wanted them to on the original page than on the button version. If I hadn’t run that test, I might have made a big mistake that actually harmed my chances of driving conversions.
Rule 2: A/B testing measures how people use your site, not how search engines see it.
This may be surprising coming from an SEO analyst, but in the end the reason we do SEO is to get more visitors to come to our site and do what we want them to do, i.e., convert. A/B testing isn’t a good tool for pure ranking checks; it’s ludicrous to try to A/B test different title tags, for example, because the test audience is your human visitors, not search engine crawlers. If we’re already getting a lot of visits we may want to start A/B testing, but if our visits are still pretty low then we may want to hold off. This leads us to our third rule:
Rule 3: A/B test on higher volume pages.
To get any kind of statistically significant outcome from your test, you need a lot of visitors interacting with each version, and that number grows every time you add another variation or variable. You can break this rule if you’re willing to wait a long time to collect significant data (the sketch after the tip below gives a rough sense of the numbers involved).
Tip: Start by running tests on your homepage – that’s typically where most of your site traffic goes.
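How many visitors is “a lot”? Here’s a back-of-the-envelope sketch using a standard sample-size formula for comparing two conversion rates. The baseline rate and the lift you hope to detect are assumptions you plug in yourself; the point is simply that modest lifts on low-converting pages need a surprising number of visitors per version:

```ts
// Rough sample-size sketch: visitors needed per version to detect a given lift,
// using the standard two-proportion formula at 95% confidence and 80% power.
// The baseline conversion rate and minimum detectable lift are your assumptions.

function visitorsPerVariant(baselineRate: number, minRelativeLift: number): number {
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + minRelativeLift);
  const zAlpha = 1.96; // 95% confidence (two-sided)
  const zBeta = 0.84; // 80% power
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil(((zAlpha + zBeta) ** 2 * variance) / (p1 - p2) ** 2);
}

// Example: a 5% baseline conversion rate and a hoped-for 20% relative lift (5% -> 6%)
// needs roughly 8,000+ visitors to EACH version before the result means much.
console.log(visitorsPerVariant(0.05, 0.2));
```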
Rule 4: Testing small changes gives you clearer answers.
If you’re looking for a specific answer, you’ll want to compare one variable at a time. Things like button colors or sizes, on-page content choices, or menu placement are all good places to start. When we compare full-page differences, like two completely different homepages, the results get much harder to interpret. The reason these smaller tests are so valuable is that we can take their outcomes and apply them to other parts of the site, whereas with a larger change we can’t tell which component actually drove the result; if we then apply a similar change to another page, we might end up hurting ourselves by missing (or undoing) the piece that mattered.
How:
There are a lot of A/B testing platforms out there, and because most of them cost money, A/B testing can be an investment. At Perfect Search Media, we tend to use Optimizely, and it’s a great platform, so if you’re feeling overwhelmed with options it’s a good place to start. However, there is also a great, free option: Google Analytics has this capability in the Experiments section, nestled under the “Behavior” drop-down. I think a lot of people don’t use it because it can seem intimidating, but it’s actually very simple to set up.
The biggest difference is that Google Analytics requires you to have already created the variant pages yourself (so you copy your original page and change one thing on the new version), while most other platforms let you click and drag elements around to create different page variations. Whichever platform you choose, remember the most important thing about A/B testing:
You must be tracking conversions
It doesn’t necessarily matter what the conversion is, but you need to be tracking some kind of conversion so that you can compare how many people convert on the “A” page versus the “B” page. Without conversion tracking, A/B testing is useless. So ask yourself what you want your visitors to do. Is it to click on a video? Go to your contact page? Stay on the page for more than 3 minutes?
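Your platform will give you its own way to record these; the sketch below just shows the idea of wiring page interactions to a conversion call. `trackConversion()` is a hypothetical stand-in for whatever event or goal call your analytics setup actually provides, and the selectors are made up for the example:

```ts
// Sketch of wiring common on-page conversions to a tracking call.
// trackConversion() is a hypothetical stand-in for your platform's event/goal call.
declare function trackConversion(name: string): void;

// Conversion: visitor clicks the video (made-up element id).
document.querySelector("#promo-video")?.addEventListener("click", () => {
  trackConversion("video_click");
});

// Conversion: visitor heads to the contact page (made-up link path).
document.querySelectorAll("a[href='/contact']").forEach((link) => {
  link.addEventListener("click", () => trackConversion("contact_click"));
});

// Conversion: visitor stays on the page for more than 3 minutes.
window.setTimeout(() => trackConversion("engaged_3_minutes"), 3 * 60 * 1000);
```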
Once you’ve settled on a conversion, you’ll need to go into the page’s code and insert a small snippet. For either Optimizely or Google Analytics the code is very simple; all you need is access to the <head>. Most likely you’ve already put your Google Analytics tracking code on the site (and if you haven’t, you should!), so just find where you placed the Analytics code and put your A/B testing snippet near it.
Summary:
- A/B testing is a way to test multiple versions of a single page in order to see which page drives your chosen conversion most successfully.
- You should run A/B tests on high-traffic pages to get faster results, and test small changes to get more specific answers.
- You can get these tests set up through paid solutions like Optimizely or through free solutions like Google Analytics.
- Make sure to track your conversions.
There you have it! You now (hopefully) know a whole lot more about A/B testing than you did when you started reading. There’s a lot more you can do with these kinds of tests, but this will give you a good place to start. Good luck and happy testing!