I’ve been doing a lot of A/B testing on the landing page of a side project of mine called VIN-History.com. The goal of the site is to capture organic searches for VIN numbers (the unique, 17-character serial number every production automobile in the world has). People put VIN numbers into Google to try to find more information about a car. I try to give it to them (to the extent my database has any) and then help them find the market value of the car, its maintenance schedule, and how to buy a full vehicle history report from Experian’s AutoCheck.com.
How to test (plus some actual code):
There are a lot of ways to do this, but I’ve found that what makes the most sense is to assign each unique session either version A or B for that session only. If they close the browser and come back at a later date, they have a 50/50 chance of getting A or B. If they hit the page you’re testing 10 times in the same session, they’ll see the same version every time, whichever they were assigned when they arrived. This keeps things consistent and doesn’t distract or confuse the user.
Steps To Implement:
- In the main page that will call either version A or B of the content, assign this visitor either “A” or “B” for the rest of the session. First check whether a session cookie is already set. If so, read its value. If not, set one.
- $ab is the variable that holds whether the user is going to see version A or B. Now, wherever you want to include the code you’re testing, add a one-line include of the appropriate file. It helps if you name the include file with an “A” or “B” in it to keep things simple.
- Be sure to include Google Analytics tracking codes on the different actions you want to measure. On my landing page, a user can do 1 of 3 things: get an AutoCheck report, find out the market value of the car, or view the service schedule for the car. All 3 are links or form POSTs that take the user away from the site. Each action is worth something different to me, and I’d like to measure which page layout yields better results for each action, as well as less overall dropoff from the page (bounce rate).
- The text ‘/ds-psr-a’ is completely arbitrary. It doesn’t even have to be a valid URL. It is just a unique string that will show up in the Google Analytics report later as an action that was taken by a user. For this particular action, I used /ds-psr-a and /ds-psr-b to track the same action on two different versions of the page.
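Putting the steps above together, here’s a minimal sketch of the wiring. The cookie name, include filenames, and link markup are placeholders I’m using for illustration (use whatever fits your site); the old-style ga.js `pageTracker._trackPageview()` call is what the /ds-psr-* strings feed into.

```php
<?php
// Step 1: assign "A" or "B" for this session.
// Reuse the session cookie if it exists; otherwise set one.
if (isset($_COOKIE['ab_version'])) {
    $ab = $_COOKIE['ab_version'];
} else {
    $ab = (mt_rand(0, 1) === 0) ? 'A' : 'B';
    // Omitting the expiry makes this a session cookie: it dies when
    // the browser closes, so a returning visitor gets a fresh 50/50 draw.
    setcookie('ab_version', $ab);
}

// Step 2: pull in the matching version of the content.
// Naming the files with an "A" or "B" keeps the mapping obvious.
// include('landing_content_' . $ab . '.php');
?>
<!-- Step 3: tag each measurable action with a version-specific string.
     The string is arbitrary; it just shows up in Google Analytics. -->
<a href="http://www.autocheck.com/"
   onclick="pageTracker._trackPageview('/ds-psr-<?php echo strtolower($ab); ?>');">
  Get a full AutoCheck vehicle history report</a>
```

With this in place, /ds-psr-a and /ds-psr-b show up as separate virtual pageviews in Analytics, one per version, which is exactly what the goal reports below need.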
The First Test
The goal is to test goal conversion on the VIN number landing page. I’ll use this VIN as an example: 1N6AA07B55N529895
Here are the two versions of the landing page I decided to test first. A is the original, B has some significant changes. Both have the same 3 actions a user can choose from.
After testing both versions for about 2 weeks, the results from Google Analytics were pretty conclusive:
- Goal 1: AutoCheck Report:
Version A sends almost 75% more traffic to AutoCheck than Version B does. Interestingly, sales through AutoCheck remained constant. This means that the leads being sent by Version B were more qualified and were converting much more efficiently. Net-net: Version A wastes traffic by sending too many users to Experian when that may not be what they’re really looking for. Pure gold!
- Goal 2: Market Value Lookup:
Versions A and B basically tied this race for the trial period. This is not surprising, considering that on both versions of the page, the Value Goal is pretty much the “second” thing on the menu. No action to be taken here.
- Goal 3: Servicing Link:
Version B clearly converted on this goal better than Version A – 253% better. A lot of these clicks were probably ones that would otherwise have been wasted on Goal 1.
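To make the Goal 1 arithmetic concrete, plug in some made-up round numbers (purely illustrative, not my actual traffic): if A sends 75% more clicks but both versions produce the same sales, B’s referrals must be converting 1.75x as often.

```php
<?php
// Hypothetical round numbers, purely for illustration.
$clicks_a = 175;   // Version A sends ~75% more clicks to AutoCheck...
$clicks_b = 100;
$sales_a  = 10;    // ...yet both versions produce the same sales.
$sales_b  = 10;

$rate_a = $sales_a / $clicks_a;   // about 5.7% of A's referrals buy
$rate_b = $sales_b / $clicks_b;   // 10% of B's referrals buy

// B's referrals convert 1.75x as often; the extra clicks A sends
// over are wasted traffic that never buys.
printf("A: %.1f%%, B: %.1f%%\n", 100 * $rate_a, 100 * $rate_b);
```

The absolute numbers don’t matter here; only the ratio does, and the ratio is what the Analytics goal reports give you.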
Round 1 Conclusion: So, it’s clear that Version B wins here, converting more efficiently on 2 of the 3 goals. The next step will be to come up with a new variation that tests a few more theories about converting even better. That will be in the next post … stay tuned.