Conversion rates and A/B testing
So what is A/B testing, anyway?
A/B testing springs from old-fashioned direct marketing. Mailers would have a mailpiece that worked well, but they wanted to see whether they could improve on it a little. So they changed just one variable, maybe the snipe ("Your free gift inside!") or the offer, and tested the new piece against the control. Because they changed just one variable, they could scientifically measure not only whether the change made the mailpiece perform better than the control, but also what about the new piece made the difference.
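The "did the new piece beat the control?" question is, at bottom, a statistical one. Here is a minimal sketch of one common way to answer it, a two-proportion z-test on conversion rates; the mail counts below are invented purely for illustration.

```python
# Sketch: did variant B convert better than control A?
# A two-proportion z-test; all numbers below are made up.
from math import sqrt, erf

def z_test(conv_a, n_a, conv_b, n_b):
    """Return the z-score and one-sided p-value for B beating A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))      # one-sided normal tail
    return z, p_value

# Control mailed to 5,000 people with 110 conversions;
# test piece mailed to 5,000 people with 145 conversions.
z, p = z_test(110, 5000, 145, 5000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally under 0.05) suggests the lift is real rather than noise; with small samples the test will rightly refuse to call a winner.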
Let me take one last minute to distinguish among A/B, multivariate, and split-path testing, since I've mentioned them all.

A/B is when you test one piece of creative against the current creative; maybe you just change the headline on a test page. When doing A/B testing, never forget what your 8th grade science teacher taught you: if you change more than one variable, you don't know which one was responsible for the result.

Multivariate is when you change several elements on the test page at once, and then use statistical methods to figure out which elements mattered. (I will do another post on this if anyone is clamoring for more.)

Finally, split path is just the technology that sends some visitors to one page when they click on a link and everyone else to a different test page when they click on the same link. You can run 3-way, 4-way, or n-way tests, but you generally need more traffic so that each sample size is large enough to mean something.
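Under the hood, split-path tools usually do something like the following: bucket each visitor deterministically so the same person always sees the same variant. This is only a sketch; the visitor IDs and variant names are invented for illustration.

```python
# Sketch of split-path routing: hash each visitor's ID into a bucket
# so the same visitor always lands on the same variant.
import hashlib

def assign_variant(visitor_id, variants):
    """Deterministically map a visitor ID to one of the variants."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

variants = ["control", "test-a", "test-b"]   # a 3-way test
print(assign_variant("visitor-1234", variants))
```

Because the assignment is a pure function of the ID, a returning visitor never flips between variants mid-test, which would contaminate the results. Note that a 3-way split like this one sends only a third of your traffic to each path, which is exactly why n-way tests need more total traffic.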
(MarketingSherpa did a very interesting piece on testing multiple variables with a small audience size. I will dig it out when I get back from the UIE Usability conference.)
About Robbin Steif
Our owner and CEO, Robbin Steif, started LunaMetrics ten years ago. She is a graduate of Harvard College and the Harvard Business School, and has served on the Board of Directors for the Digital Analytics Association. Robbin is a recent winner of a BusinessWomen First award, as well as a Diamond Award for business leadership.