Avinash answers my Hour a Day questions: Part 3 of 4
After reading Web Analytics: An Hour a Day, I had a lot of questions, and the author was kind enough to answer them all. In this third installment, we talk about testing and just begin to talk about conversion rate. My questions are in bold and Avinash’s answers are indented.
When you wrote about usability (p. 53), you commented, “Usability tests are best for optimizing UI designs and work flows, understanding the voice of the customer, and understanding what customers really do.” However, I do usability testing all the time. During testing, I learn about the offer and the price, I learn how much the customers trust the site, and I learn whether the customer understands the site. A whole lot more than usability. So what is user testing *not* good for (besides statistical significance, and some would disagree with even that)?
The comment you quote stresses what usability testing is really optimal for. It can, as you aptly point out, be used for a number of wonderful things and can be a rich source of learning.
With the advent of various technologies (including live recruiting and remote testing, experimentation and testing), you have such a wonderful set of tools that you can deploy. For example, I prefer to do offer experimentation using a multivariate or A/B testing tool rather than usability testing. An offer is cleanly tied to an outcome (say, conversion), so why should I ask eight people who might not really be representative of my customers what they think? I can just as easily throw an experiment on the site and ask a million people on my site what they think.
Lab usability testing is valuable. It is perhaps the only way to see a customer and observe them intimately, looking for non-verbal cues and reactions. Applied for the right purposes, it can be a rich source of learning.
It can also be extremely deceptive to ask 50 people what they think of your site / experience / offers and assume that you have it nailed. If that were true, site redesigns based on extensive usability tests would not bomb with the frequency that they do.
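The contrast Avinash draws between asking eight lab participants and running an on-site experiment can be made concrete with a quick significance check on two offer variants. This is a minimal sketch, not anything from the book; the function name and all visitor counts are hypothetical:

```python
from math import sqrt

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical offer experiment: 500k visitors per arm,
# 2.00% conversion on the control vs. 2.09% on the new offer.
z = z_test_two_proportions(10_000, 500_000, 10_450, 500_000)
print(round(z, 2))  # |z| > 1.96 means significant at the 95% level
```

With sample sizes like these, even a 0.09-point lift is clearly detectable, which is exactly why a cleanly outcome-tied question is better answered by an experiment than by a handful of lab participants.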
Why does experience testing get you any closer to a global maximum (p. 248)? At the end of the day, you still need to know what to test.
Let me say this first: in any scenario, you need to have a very intimate understanding of your customer experience.
Customers overall are very good at telling you their problems; they are terrible at telling you the solutions (and that is quite OK; never ask a customer for a solution).
To solve complex problems of a larger magnitude, where your solutions will “slash and burn” what exists today, you have a great friend in experience testing. Rather than just optimizing a page, you can optimize huge chunks of the customer experience, if not the whole site, by trying radical solutions and seeing which works. The nice thing is that you set participation rates, which means that you can easily control for risk.
Experience testing helps you jump the curve (to get onto the global maximum curve, potentially) because your canvas is so much bigger; you can take bigger, more radical risks and win big.
With most testing you optimize a page. When was the last time that you or I had a website experience where one page was so golden that it had a disproportionate impact on the outcome? Probably not recently.
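The participation-rate idea above is usually implemented as deterministic traffic splitting: hash a visitor ID so the same visitor always lands in the same arm, and start the radical experience at a small fraction of traffic to cap the risk. A minimal sketch, with all names and rates being illustrative assumptions:

```python
import hashlib

def assign_arm(visitor_id: str, participation_rate: float) -> str:
    """Deterministically assign a visitor to the radical 'experience'
    arm at the given participation rate; everyone else sees control.
    Hashing the ID keeps the assignment stable across visits."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % 10_000 / 10_000  # uniform-ish in [0, 1)
    return "experience" if bucket < participation_rate else "control"

# Expose only 5% of traffic to the radical redesign at first,
# then ramp the participation rate up as results come in.
print(assign_arm("visitor-42", 0.05))
```

Because assignment is a pure function of the visitor ID, ramping from 5% to 20% only adds visitors to the experience arm; nobody who already saw the new experience gets bounced back to control.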
How do people set conversion rate (or other) goals? It’s great if the CEO says, “We have to increase our sales from our web channel by 50%” — then you can just run the numbers. But absent direction from someone else, do people just say, “Hmm, wouldn’t it be great if we could increase our conversions by 12.45%?” Do they pull out their HP 12C calculators and do an internal rate of return based on the cost of testing and the cost of money? (p. 256)
Here is my recommendation…
1) A: Sign up for the shop.org annual study and look at what your competitors are doing. Use that as an initial discussion starter for what your conversion rate should be.
1) B: Type “fireclick index” into Google and look at the last year’s worth of conversion rate data for the web, or for one of the six vertical industries they provide. It is free. Use that as a starting point for discussion of what your goal should be.
2) Plot out your conversion rates (segmented by your core acquisition strategies – DM, Email, PPC, Display, whatever) for the last year and see where things are trending. Bring this to your fireclick/shop.org discussion.
3) Finally, see where in your acquisition strategy or site optimization you are making increased investments. If you just hired an SEM Goddess, pump up the goal by 50% for that stream of traffic (the Goddess will deliver). If you are implementing MVT, see what that will do.
1 + 2 + 3 = An intelligent discussion.
You’ll come up with a goal for the next three months. It might be wrong, but persist and repeat the process three months later; you’ll do better that time. In six months, when you do it again, you’ll nail it.
Give yourself permission to be wrong; trust me, you’ll get better so fast.
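For readers who want to literally run the numbers, steps 1 through 3 above can be combined into a rough back-of-the-envelope calculation. This is only a sketch; the blending weights, function names, and every rate below are illustrative assumptions, not figures from the book or the benchmarks mentioned:

```python
def conversion_goal(benchmark_rate, own_trend_rates, investment_bump=0.0):
    """Blend an industry benchmark (step 1) with your own trailing
    trend (step 2), then apply a bump for new investment (step 3).
    Weights here are arbitrary starting points for discussion."""
    # Simple linear trend: carry the average quarterly improvement forward.
    trend = own_trend_rates[-1] + (
        (own_trend_rates[-1] - own_trend_rates[0]) / (len(own_trend_rates) - 1)
    )
    # Anchor halfway between your projected trend and the benchmark.
    base_goal = (trend + benchmark_rate) / 2
    return base_goal * (1 + investment_bump)

# Hypothetical inputs: benchmark converts at 2.2%; you trended
# 1.6% -> 1.9% over four quarters; the new SEM hire is expected to
# lift paid traffic 50%, worth ~10% blended across all traffic.
goal = conversion_goal(0.022, [0.016, 0.017, 0.018, 0.019], investment_bump=0.10)
print(f"{goal:.2%}")  # -> 2.31%
```

The point is not the particular formula; it is that a benchmark, a trend line, and an investment adjustment give you the "1 + 2 + 3" inputs for an intelligent discussion, and you revise the weights each quarter as Avinash suggests.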
Coming next: Part 4, where Avinash continues to talk about my favorite topic, conversion rate.
About Robbin Steif
Our owner and CEO, Robbin Steif, started LunaMetrics ten years ago. She is a graduate of Harvard College and the Harvard Business School, and has served on the Board of Directors for the Digital Analytics Association. Robbin is a recent winner of a BusinessWomen First award, as well as a Diamond Award for business leadership.