Conversion rate vs. traffic: which should you work on first?


I’ve had this conversation with a lot of customers and potential customers. The site owner has at least two problems: not very much traffic, and lousy conversion. “So which do you want to work on first?” I ask them.

On rare occasions, the very savvy customer will say, “Let’s work on the conversion first. After all, it doesn’t make sense to send lots of traffic to our site and just see that traffic bounce.” This is common “wisdom,” and lots of analysts pay lip service to this idea. Maybe more than that, maybe they really embrace it.

However, I love to be the contrarian, and would submit that it is smarter for the vast majority of sites to work on traffic first. Why? Because when it comes to conversion, split and multivariate testing are your best friends. You can do all the “conversion” work in the world, but at the end of the day, you need to prove which version works best through testing. And if you don’t have enough traffic, you will sit and sit and sit with your test, waiting for statistical significance. Or you will dumb down your test enough so that it runs in six weeks instead of six months, and then not learn enough.

(To those who haven’t done the six-month thing: when your test has to run for six months to get statistical significance, you will fight battles all the time. Everyone wants to make changes to the page you are testing, from the content manager to the SEO firm to the PPC firm. And let’s not forget the calendar: you may need to show a different page in the fall than in the spring, based on what you sell. As soon as you change the page, the test is basically dead.)

But what about larger sites? They don’t need to work on traffic, do they? They already have traffic.

Well, maybe not so fast. Statistical significance of testing is based on having enough data in both a numerator and a denominator: the number on top of the ratio (conversions) and the one on the bottom (views of the test page). So you might have a fairly large site with a minuscule conversion rate, and still not have enough traffic to get a significant test quickly. Ergo, you still need traffic.
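To make that concrete, here is a back-of-the-envelope sketch (the 2% conversion rate, 20% hoped-for lift, and 200 visitors/day are hypothetical numbers) using the standard normal-approximation sample size for a two-proportion test at 95% confidence and 80% power:

```python
import math

def views_per_variant(p, rel_lift, z_alpha=1.96, z_beta=0.84):
    """Views each variant needs so a split test can detect a relative
    lift in conversion rate p (two-sided z-test, normal approximation;
    the default z values correspond to 95% confidence / 80% power)."""
    p2 = p * (1 + rel_lift)
    variance = p * (1 - p) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p) ** 2)

# Hypothetical small site: 2% conversion rate, hoping to detect a 20% lift
n = views_per_variant(0.02, 0.20)   # ~21,000 views per variant
days = math.ceil(2 * n / 200)       # at 200 visitors/day: about 7 months
```

The (p2 − p)² denominator is what punishes low conversion rates: for the same relative lift, halving p roughly doubles the views required, which is why even a sizeable site with a minuscule conversion rate can be stuck waiting months.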


Our founder, Robbin Steif, started LunaMetrics in 2004. She is a graduate of Harvard College and the Harvard Business School, and has served on the Board of Directors for the Digital Analytics Association. Robbin is a winner of a BusinessWomen First award, as well as a Diamond Award for business leadership. In 2017, Robbin sold her company to HS2 Solutions and has since retired from LunaMetrics.

  • Another thought in support of building traffic first is that traffic builds on itself. The earlier that you get started building traffic, the faster it will build. Build the traffic first and it will grow on its own without any more work. That doesn’t happen with conversion rates, does it?

  • Great arguments against stupid focus on conversion rate! :-))
    You’ve got a link in the next spot on my blog. 🙂

  • Absolutely, traffic is the #1 priority. Customers can’t convert if they aren’t there. And testing is very important, to find the numbers that prove things, and you must have many numbers. So let sales be the sales person’s job. Revenue is very overrated compared to traffic.

  • Oh Miso, I am not sure. Revenue is never less important than traffic. IMO. It’s just that you can’t GET the revenue without the traffic, for a variety of reasons, including the one I wrote about.

  • Might I try to “out-contrarian” the contrarian. : )

    Before you make the decision (based on what the “Gurus” say), might I suggest that you get a “suckometer”.

    Now, before you decide, look at the conversion customer experience (pages, links, buttons, steps) and do a quick back-of-the-napkin check to see whether the reading on the “suckometer” is High, Medium or Low.

    If the reading is High, improve the experience using a “Heuristic Evaluation Process” (see my book) / common sense.

    If the reading is Medium, ask your mom. If she thinks it is ok, go for traffic. If not, fix the experience.

    If the reading is Low, go for traffic. Fix the experience as resources permit.

    What do you think?


  • I think you are nuts to try to out-contrarian me.

    Well, if you are saying, fix the stuff you know sucks without doing testing, ok, sure. But I am just so tired of looking at sites that suck, but don’t get fixed without proof. (Lions and tigers and hippos.) And proof is MVT or AB. And MVT needs traffic.

    Now, there is another issue that you didn’t address, but I was thinking of you and this issue when I wrote the post (so I’m surprised you didn’t mention it). Does it have to be 95% certain? Maybe you roll out your new pages with the data you have and not the data you wish you had (I think I got that quote a little wrong, but he is out of the White House now, anyway.) You, AK, are always saying things like, we don’t get perfect data, how about 70%? But the lions and tigers and hippos all want to see the bars turn green, and it is even harder now that GWO made all those changes…

  • There are scenarios where I would not even go to testing.

    Yes we should test. But when the reading on the “suckometer” is High I think you should just jump to Heuristic Evaluations and fix.

    Here is a great example of how to do it, from my friend Dr. Pete:

    Click on the image in that post, see the pdf.

    Fix those things first, they don’t need testing. 🙂

    Once you start moving into Medium and Low you need to test because then it is not obvious what might work.


    PS: You are right about my position on imperfect data. We all need to be flexible in our belief systems. Not everything requires statistical significance, for some things directionally correct data is ok. But if it is huge in biz impact I’ll wait. Flexibility. : )

  • AK, thanks for sending the link.

    I think in this case you need to step back and look at the big picture. (And I think that’s what Avinash means by “suckometer” – there’s got to be some bigger insight that drives what strategy a business is going to use. Just basing a decision on two dynamic variables – conversion or traffic – isn’t enough.)

    If there’s not enough traffic because there are fundamental IA issues with the page in question (high “puke” rate, etc.), then get those in order so that you’ll have a statistically significant sample group to run a viable test; otherwise – like you said – the test is worthless.

    But if the conversion rate isn’t statistically significant because you can’t get enough visitors to the site (no pipeline), then figure out how much this information is worth and get spending on paid search (or other ways of getting visitors) so you can get that pipeline big enough to provide value.

    Efficiency and profitability are the real keys here.

    Having said both of those options, though, I’m curious to know what any business’s (or analyst’s) threshold is for decision making, because in my experience this threshold will be different for different businesses in different industries; they all have different needs.

    IMHO, businesses need to be less concerned with “having their cake and eating it too” and instead focus on knowing “how to slice the biggest piece with the best knife”.

    (Didn’t realize I had so much to say about this…:)

  • Oh, now I understand what Jeremy meant about the link. I kept wondering, what link?

  • Ben

    At times, I’m baffled by the willingness of SMBs to dump more money into advertising without first taking a critical look at their website’s ability to convert. At the same time, rarely is there enough data to be statistically significant (I guess I’m just living in a small world). For this reason, I tend to act along the lines of what Avinash has suggested – use the suckometer to fix the obvious problems and then get more traffic.


  • “Statistical significance of testing is based on having enough data in both a numerator and a denominator: the number on top of the ratio (conversions) and the one on the bottom (views of the test page). So you might have a fairly large site with a minuscule conversion rate, and still not have enough traffic to get a significant test quickly.”

    This is not true. The conversion rate with the highest standard error is 50%. The closer the conversion rate is to 100% or 0%, the smaller your sample size needs to be to achieve a given confidence level.

    Why? Well, there’s an argument from symmetry. If converting at 100% let you get results more quickly than converting at 0%, you could just measure the “non-conversion rate” instead. A 1% conversion rate is a 99% non-conversion rate. But of course, one is statistically significant if and only if the other is.

  • Also, the question isn’t so much “should traffic come first?” but rather “where should my traffic come from?”

    The only acquisition strategy compatible with your advice, IMO, is SEO. If your traffic is paid, a low conversion rate is just throwing money down the drain. If your traffic is viral, you need to be optimizing your viral loop, which puts you back at square one.

    With SEO, sure, you can work on getting placed higher for some set of keywords and get organic traffic that way. From there you can iterate more quickly.

  • I don’t think the answer to this question is so black and white. I agree with Avinash, Jeremy and Ben.
    Where you focus first depends on where you suck the most. If your conversions suck the most and you are getting decent traffic, then conversion becomes your priority. However, if you have low traffic, which usually means you are just starting out (the site/business is new), then getting traffic is your priority. That said, running a simple A/B test is not that cost-prohibitive or time-intensive, so you can do testing while driving traffic at the same time.

  • Abhi

    Can anybody help me with what percentage of traffic we should send to the A version and the B version in an A/B test, and how long we need to test initially? Is there a statistical significance test?

  • Abhi

    Thank you Robbin.
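Two quantitative claims from the comment thread above are easy to check in a few lines (a sketch; the 2% conversion rate and 20% lift are made-up numbers): the standard error of a proportion is symmetric about 50% and peaks there, and relaxing the confidence bar from 95% to the 70% Avinash is willing to accept cuts the required sample size by more than half.

```python
import math
from statistics import NormalDist

def se(p, n):
    """Standard error of a sample proportion."""
    return (p * (1 - p) / n) ** 0.5

# Symmetry: a 1% conversion rate and a 99% "non-conversion" rate have
# identical standard errors, and the error peaks at p = 0.5.
assert abs(se(0.01, 1000) - se(0.99, 1000)) < 1e-12
assert se(0.5, 1000) > se(0.1, 1000) > se(0.01, 1000)

def sample_size(p, rel_lift, confidence=0.95, power=0.80):
    """Per-variant views for a two-sided, two-proportion z-test
    (normal approximation)."""
    nd = NormalDist()
    z = nd.inv_cdf(1 - (1 - confidence) / 2) + nd.inv_cdf(power)
    p2 = p * (1 + rel_lift)
    variance = p * (1 - p) + p2 * (1 - p2)
    return math.ceil(z ** 2 * variance / (p2 - p) ** 2)

strict = sample_size(0.02, 0.20, confidence=0.95)   # the "green bars" bar
relaxed = sample_size(0.02, 0.20, confidence=0.70)  # "directionally correct"
# relaxed needs roughly 45% of the traffic that strict does
```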
