Our client in this case study sells clothing to a worldwide audience. Before we started, he was seeing 30,000 unique daily visitors to his site — and a good number of those were converting. However, he believed that higher sales were possible. This article examines what we did, including tests, post-test analysis, and use of tools.
When the client asked me to help, his site was drawing 30,000 unique visitors a day and a healthy number of sales, but he felt it could do better. He wanted to increase his conversion rate, and in particular he believed that improving his product page would lift sales.
The page lacked some elements crucial for conversion, and though it looked fairly typical for an ecommerce site, there was no logic to the positioning of its elements and visual accents. I saw an opportunity to produce the required results, so I agreed to help.
The first thing I did before redesigning the page was analyze its structure. I broke the page down into components (and groups of components: product info, service info, trust signals, and so on) and documented them to better understand what the page already had and what it was missing. Among the insights this exercise produced: the page didn't carry enough trust signals for the product or the service, and the ones it did have sat outside the visitor's line of sight.
With that information, I drafted a concept for restructuring the page around a decision-making funnel (presenting information in the order that supports purchase decision making), agreed on it with the client, and produced a wireframe of the new page.
Creating an A/B test
The next step was building a variation based on the wireframe. For A/B tests I usually use either Optimizely or Visual Website Optimizer (VWO); this time I used the latter, as the client already had a paid account there.
A simple test like changing a button color or a title is easy to implement in VWO even without developer skills, but in this case the page required a major remake.
For more complicated tests, VWO supports jQuery, which is what I used to build the variation.
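The actual experiment code isn't reproduced here, but a VWO variation that restructures a product page with jQuery typically looks something like the sketch below. All selectors and element names are hypothetical, invented for illustration; the logic is wrapped in a function only to keep it explicit, whereas in VWO the body would run directly against the live page.

```javascript
// Hypothetical sketch of a jQuery-based variation (all selectors invented).
// In VWO, the body of this function would go into the variation's code editor.
function restructureProductPage($) {
  // Move trust signals (badges, guarantees) next to the add-to-cart button,
  // where the purchase decision is actually made.
  $('.trust-badges').insertAfter('#add-to-cart');

  // Surface service info (shipping, returns) that sat below the fold.
  $('#shipping-info').insertBefore('.product-description');

  // De-emphasize secondary links competing with the primary call to action.
  $('.related-links').appendTo('.page-footer');
}
```

Because the variation is plain jQuery, it can reorder, hide, or restyle any element the page exposes, which is what makes a "big remake" feasible without touching the site's own codebase.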
A/B testing and results
The experiment started on 17 March 2015, was limited to 2,000 U.S. visitors a day, and ran for about three weeks.
The results of the experiment were quite inspiring, but not unexpected.
The client's main objective was increasing add-to-cart actions, and the experiment showed the new design produced 38% more of these conversions.
More importantly, the ecommerce conversion rate was also 9.5% higher on the new design.
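For readers who want to reproduce the arithmetic, relative lift is simply the variation's rate divided by the control's rate, minus one. The visitor and conversion counts below are invented for illustration; only the resulting lift mirrors the figure above:

```javascript
// Relative lift of a variation's conversion rate over the control's.
function relativeLift(controlConv, controlVisitors, variationConv, variationVisitors) {
  const controlRate = controlConv / controlVisitors;
  const variationRate = variationConv / variationVisitors;
  return variationRate / controlRate - 1; // e.g. 0.38 means "38% more"
}

// Hypothetical add-to-cart counts: 500/10,000 vs 690/10,000.
const lift = relativeLift(500, 10000, 690, 10000);
console.log((lift * 100).toFixed(1) + '%'); // → "38.0%"
```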
Post test analysis
The real change in conversion rate after an A/B test ends and the winning variation is implemented on the site usually differs from the experiment results, which is why I always do a post-test analysis.
In this analysis I compare trends across two periods. The first starts some time before the experiment and ends some time after the winning variation was implemented; the second covers the same dates one year earlier, adjusted so that both periods contain full weeks.
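Picking the comparison dates comes down to shifting by 52 full weeks (364 days), so that each date lands on the same weekday a year earlier. A small sketch of that arithmetic:

```javascript
// Shift a date back by a whole number of weeks. Shifting by 52 weeks
// (364 days) keeps weekdays aligned, which is what makes the two
// periods comparable week by week.
function shiftByWeeks(date, weeks) {
  const d = new Date(date.getTime());
  d.setUTCDate(d.getUTCDate() - weeks * 7);
  return d;
}

const start2015 = new Date(Date.UTC(2015, 2, 1));  // 1 March 2015
const end2015 = new Date(Date.UTC(2015, 4, 16));   // 16 May 2015
console.log(shiftByWeeks(start2015, 52).toISOString().slice(0, 10)); // → "2014-03-02"
console.log(shiftByWeeks(end2015, 52).toISOString().slice(0, 10));   // → "2014-05-17"
```

This is why the comparison period starts on 2 March 2014 rather than 1 March: a straight 365-day shift would misalign the weekdays.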
In this case the experiment started on 17 March and the winning variation was implemented in early May, so I compared the period 1 March to 16 May 2015 with 2 March to 17 May 2014 (U.S. visitors only).
Here’s the ecommerce conversion rate trend for 2 March to 17 May 2014.
The conversion rate rose from 0.34% in the first week to 0.40% in the last week of the period.
And here’s the trend for 1 March to 16 May 2015.
You can see that the conversion rate increased dramatically here, from 0.44% in the first week to 0.73% in the last week. The marked point on the graph is when the winning variation was implemented on the site.
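To put the two trends on a comparable footing, you can look at relative growth within each period rather than the absolute rates. Using the weekly figures quoted above:

```javascript
// Relative growth of a conversion rate across a period
// (last week's rate versus first week's rate).
function periodGrowth(firstWeekRate, lastWeekRate) {
  return lastWeekRate / firstWeekRate - 1;
}

// Weekly figures from the case study, in percent.
const growth2014 = periodGrowth(0.34, 0.40); // baseline year
const growth2015 = periodGrowth(0.44, 0.73); // year with the redesign
console.log((growth2014 * 100).toFixed(1) + '%'); // → "17.6%"
console.log((growth2015 * 100).toFixed(1) + '%'); // → "65.9%"
```

The 2014 period grew about 17.6% on its own seasonality, while the 2015 period grew about 65.9%, which is the kind of gap a post-test analysis is meant to surface.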
Tools, technologies and cost
I spent around 35 hours on the full cycle of the experiment, from analyzing and wireframing the product page to developing the experiment variation and reporting. The tools I used:
- Google Docs for making and sharing docs
- Axure for wireframing
- Visual Website Optimizer + jQuery for conducting the A/B testing
- Google Analytics for post-experiment analysis
A/B testing can be costly in absolute terms in cases like this, but for high-traffic sites it delivers a considerable amount of new sales.