At Myna we’re all about considered, methodical approaches to website optimisation. But for newbies, A/B testing can seem like a lot of hard work: fussy micro-management of page elements and aesthetics in pursuit of minor gains.
How do you know where to begin? How do you find the tests that make a big difference?
To get the most out of A/B testing, you need a plan. Don’t just sit down in front of your website and start testing the colour of a button – start by establishing what you want to improve. For that you need statistics, so sit yourself down, fire up your analytics program of choice, and let’s get started.
This is the first in a two-part series of posts outlining how to build a testing plan. In this post we’ll talk about testing the process you’re guiding your users through. In the next post we’ll talk about testing the positioning of your product in the market.
When testing process, you are assuming that your product or offer is attractive to your users, but something is getting in the way of making a sale.
Look at the paths you want users to take through your web site, find out where you’re losing them, and test around this.
Where are you losing users? Do they hit your landing page and then bounce straight off? Do people fail to complete the sign up process? Or do people get distracted and wander off during checkout? Possible culprits are:
- your landing page lacks a strong call to action
- your sign-up process is too long
- your checkout process is too complicated
- your product pages don’t offer a clear path to purchase
Your analytics will tell you what you need to know – the key concept here is the funnel (or sales process): a sequence of steps that you define that leads users to your conversion goal. You can set up funnels using conversion goals in Google Analytics – most other analytics packages offer something similar – to track the flow of users through the desired steps.
Here is an example sales funnel based on our sign-up process at Myna. Note that the conversion rate figures are fabricated for illustrative purposes.
When people sign up to Myna, we push them through four steps: they hit our landing page, choose a plan, enter account details, and enter billing information, before arriving at their shiny new Myna dashboard. At each step some percentage of users proceed onwards, while other users leave and go elsewhere. Our goal is to maximise the number of people who get all the way through to the end.
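The arithmetic behind a funnel report is straightforward: each step's conversion rate is the fraction of users who proceed to the next step, and the end-to-end rate is the fraction who make it all the way through. Here's a minimal sketch – the step names mirror our sign-up funnel, but the visitor counts are fabricated for illustration, just like the figures above:

```python
def funnel_report(steps):
    """steps: list of (name, visitor_count) pairs in funnel order.

    Returns per-step conversion rates (fraction proceeding to the
    next step) and the end-to-end conversion rate.
    """
    per_step = []
    for (name, count), (_, next_count) in zip(steps, steps[1:]):
        per_step.append((name, next_count / count))
    overall = steps[-1][1] / steps[0][1]
    return per_step, overall

# Fabricated counts for illustration only.
steps = [
    ("Landing page", 1000),
    ("Choose a plan", 400),
    ("Account details", 320),
    ("Billing information", 288),
    ("Dashboard", 115),
]

per_step, overall = funnel_report(steps)
for name, rate in per_step:
    print("{}: {:.0%} proceed".format(name, rate))
print("End-to-end conversion: {:.1%}".format(overall))
```

With these made-up numbers, the two weakest steps jump out immediately: the landing page (40% proceed) and the billing page (around 40% proceed), which is exactly the kind of signal you want before deciding what to test.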
The key information provided by the funnel is where we are losing users. In the example there are two weak steps: the landing page (step 1) and the billing page (step 4). We immediately know where to start optimising:
- The landing page is likely to have a low conversion rate because not all users arrive at our site wanting to sign up. Some users have already signed up and want to log in; other users may want to read our blog or contact us. Nevertheless, we can run a few tests here to try to increase this conversion rate. It’s not hard to come up with several ideas around messaging, design and layout.
- The final step of the pipeline, billing information, is a great candidate for testing. By the time users get to this step we know they definitely want to sign up to a plan. Why aren’t 60% of them doing so? This is a great candidate for testing ways to minimise friction on the billing form: collect fewer fields, optimise the layout, and provide the user with immediate feedback if they get something wrong.
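One concrete example of immediate feedback on a billing form is catching mistyped card numbers before the user submits. Most card numbers carry a Luhn checksum, which detects the majority of single-digit typos. This is a generic sketch of that check, not our actual validation code:

```python
def luhn_valid(card_number):
    """Return True if the digit string passes the Luhn checksum.

    Catches most single-digit typos, so a billing form can flag a
    mistyped card number immediately rather than after submission.
    """
    digits = [int(ch) for ch in card_number if ch.isdigit()]
    if len(digits) < 2:
        return False
    total = 0
    # Double every second digit from the right; subtract 9 if it exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("4111 1111 1111 1111"))  # well-known test number: True
print(luhn_valid("4111 1111 1111 1112"))  # one digit off: False
```

Checks like this cost the user nothing and remove one more reason to abandon the form – which is precisely the friction this step of the funnel is measuring.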
The conversion rates on a funnel provide useful information even beyond testing. For example, at Myna we sign people up to our free plan in step 3, before they enter billing information. This way, users can use Myna even if they choose not to give us their credit card details. We also allow users to opt in to our mailing list in step 3, giving us a way to contact them in the future.
In conclusion, analysing funnels and traffic around your site provides the key to testing process. Once you understand where you can make the biggest gains, it’s much easier to get into the specifics of tweaking layouts, images, and buttons. At every level you should make hypotheses about why people are leaving your funnel and address them by testing new ideas.
It’s important to note that sales funnels are often more complicated than the one we’ve described above. Our sales process at Myna involves many more steps on and off our web site, encouraging users to come and try our product, providing support setting up tests, and helping them get their first set of results. Your web sign-up or checkout process is a great place to start, however. As you get used to testing, you can expand outwards to look at the big picture.
Finally, testing process is only useful if you’re managing to attract users to your site. In order to do that, you’re going to need a marketing strategy. A/B testing can help greatly with marketing as well, as we’ll see in the second post in this series on positioning.
The eagle-eyed among you may have noticed that we’ve got a new addition to the team. Please welcome our new mascot, Monique:
As the old site design clashed with her beautiful colouring, we threw it all away and gave the site a bit of a polish. Please excuse any empty paint cans around the place.
Monique is a Balinese myna bird, a critically endangered species. Though exact population numbers are unknown, efforts are being made to protect and strengthen the wild population, which has been severely diminished by poachers.
What does a small bird, native to South-East Asia, have to do with A/B testing?
The answer? It’s all in the algorithms.
Some of the sub-species of the myna (or mynah) are considered talking birds, i.e., birds that can mimic human speech.
As a myna (the bird) mimics human words, and is rewarded by positive reinforcement from its owner (or delicious, delicious seeds), it learns which are the most desirable words or phrases and knows to repeat them more in future.
In much the same way, as Myna (the revolutionary new A/B testing solution) repeats your variants, and is rewarded by successful conversions, it learns which is the most successful variant and knows to repeat that more in future.
As the page views roll in on a new test, we like to picture an excitable bird, squawking away and gobbling down treats, learning not to swear in front of polite company.
Even in our short lifetime, we’ve seen multiple spellings of the word: from minah to miyna, mynor to mhyna (what can I say, sometimes I just type too fast). While both myna and mynah are considered valid spellings for the bird, the A/B testing product is spelled only one way: Myna.
We’re pleased to announce that we’re finally coming out of beta, which means the introduction of our least-requested new feature: pricing plans. These will come into effect on 15th December.
We’ve got plans for all sizes of business and website, including a permanently free plan for low-volume users or sample tests. Sign up now and start A/B testing for free – you can upgrade whenever you’re ready.
All our existing beta users (and any new sign-ups) will have the opportunity to choose a plan before December 15th. On that date, everyone else will be transferred onto the free plan. Don’t worry! All existing tests and data will remain intact.
We’ll make further announcements on billing closer to the time, but for now, all the details are available on the Myna pricing page.
Just a quick post to let you know that there’s now a client library available for Python.
It uses the built-in httplib module to make suggest and reward calls to Myna, has no external dependencies, and has been tested on Python 2.7.
Here’s a link to the GitHub repository, which includes some basic documentation to get you started. We’ll add more detailed information to the main site pretty soon.
Earlier in the week, we sent out a short survey to our awesome beta testers, asking them what they would most like to see added to or improved upon in Myna.
One of the common responses was a request to improve our documentation. So, we’ve done it, and you can access the new and improved documentation from the ‘help’ tab above, or from this handy link.
We’ll continue working on the documentation to make sure it’s as complete and helpful as possible. We hope that this initial change will make it easier and faster to start A/B testing with Myna.
If you have any comments or suggestions regarding the documentation (or anything else), please let us know using our contact form.
Greetings from Myna HQ!
It’s been a veritable whirlwind of excitement since our last post.
Firstly, we’re pleased to announce that we’ve been accepted onto the Oxygen Accelerator program at BSP-A. This is a 13-week bootcamp for startups, providing access to mentors, introductions to investors, and (importantly) office space.
The information and advice we’re receiving is golden, and we’re having an absolute blast. We’ll admit to being a bit nervous about being locked in a room with a gang of hungry startups, but they’re a great bunch, with some fantastic ideas, and we really do wish them well on their way.
And if nothing else, we have a captive audience.
While we’re absorbing information like a pair of parched sponges, we’re also hard at work on Myna itself. We’re working on some refinements to the UI and reporting features, and rolling out changes to the support system – our help pages are getting a bit of a polish, and we’ve just given our internal metrics tracking a significant boost, integrating with Mixpanel.
We also seem to have adopted a small robot.
Isn’t he cute?
Follow us on Twitter for the most up-to-date haps, or subscribe to our RSS feed from the sidebar on the right.