
Avoiding Cognitive Bias in Conversion Optimization and Content Experiments


Even Vulcans can be afflicted with dreaded cognitive biases.

The goal of testing different variations of web pages during a conversion optimization experiment is to get as accurate a result as possible, so that you can say with confidence which variation is the best one.
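When you do gauge the reaction, let the numbers make the call rather than your gut. A minimal sketch in Python of a two-proportion z-test for comparing two variations (the conversion counts here are hypothetical):

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: is the difference between two conversion rates real?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variation A converts 200/1000, variation B 250/1000
z, p = two_proportion_z_test(200, 1000, 250, 1000)
```

If p is below your significance threshold (commonly 0.05), the difference is unlikely to be chance; if not, neither variation has "won," no matter which one you personally prefer.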

Just because we’re testing, however, doesn’t mean we’re immune to the cognitive biases that can filter into our experiments. Putting up two pages, releasing them to the public, and gauging the reaction doesn’t mean those tests are free of your own or others’ perceptual distortions. Watch out for these cognitive biases when forming content experiments.

You’re the wrong person to ask for ideas…

“The Curse of Knowledge”

This is when an informed person has a hard time thinking about a problem the way an uninformed person does. It can surface in a few ways during content experiments. For instance, let’s say you’re changing a banner at the top of the page. When you open the new page, your brain KNOWS the banner changed, and you’re going to have a hard time not seeing that change. But someone who DOESN’T KNOW the banner was changed might look at the page and not even notice the difference.

Once you know something, it’s impossible for you to un-know it. Did you add new social buttons to the page? Great, ask someone else to look at them; you’re biased by the curse of knowledge. The same holds true for coming up with test ideas. You know there is a new banner on the page, as well as new social buttons, so you might not look to change those again. But if nobody else notices them, maybe those buttons need to be changed again. Just because you know something doesn’t mean everyone else knows it too.

“Bias Blind Spot”

This is related to the Curse of Knowledge in a way. It’s when you see yourself as less biased than other people. Oh sure, you see everyone else’s biases, but you? You’re not biased. You’re like a Vulcan: pure logic and rationality.

Well, unless you ARE a Vulcan, you’re biased in some way. Don’t fool yourself; it’s okay. It’s human, and in most cases it helped you and your ancestors survive and not get eaten by saber-toothed cats. Except now the saber-toothed cats are all dead, and we need to test websites. So don’t pretend you’re not biased, and don’t think you’re not biased. KNOW you are, and you can hopefully use some of that logic to overcome it.

Don’t just test the things you think don’t work, or change the things you don’t see. Test the things you think work and that you do see. Start from the position where you know everything you’re thinking, and how you’re reacting, is biased, and maybe you can overcome it.

…and so are your friends… 

“The Bandwagon Effect”

Everyone hop on the bandwagon. Plenty of room.

This is groupthink. Ideas (memes, fads, trends, beliefs) spread between people like viruses. If someone believes something, it’s far more likely someone else is going to believe it as long as they’re nearby. People like to agree with other people.

Needless to say, if you get people together in a room to go over a wireframe or design and start talking about it aloud, you’re going to get a fair amount of agreement on certain issues, even when it might seem like people are arguing; they’re arguing about the same thing. Others won’t mention ideas they may have had, either because those ideas go against the groupthink or because they’re about a different aspect of what’s being discussed entirely.

“False-Consensus Bias”

People also have a tendency to overestimate how much other people agree with them. They assume others share the same opinions and preferences they do. This can be subtle or extreme, but either way it puts all small-group feedback into question, so take individual feedback delicately. Everyone looks at things through their own biases without realizing that others see things differently. You might look at a page and think, “Well, I don’t see how we could get more X than this,” but that judgment could be fed by your own biases.

An example: I’ve worked with a client who was convinced that black-and-white photos were more realistic than color photos. He didn’t think color photography looked real. This was a bias on his part, one we had to convince him was not a common one. It’s not that he was wrong; he simply had different opinions and preferences than the majority of the population. Be aware of YOUR biases, and that you can see things differently than other people. Your friends and close acquaintances can be the same way.

Get feedback from individuals, not groups, but don’t get caught up in specific comments. Group reviews are going to be full of bias; individual reviews, full of personal preference. If you look at something on a website and think it’s fulfilling its role just fine, you may be affected by bias. Test even the things you think are working.

…even though you’ve been right before…


Insert Chuck Norris / Texas Sharpshooter joke… No, really, make up your own; it’s just too easy.

“The Texas Sharpshooter Fallacy”

There’s a joke about a Texan who fires some shots at the side of a barn, then paints a target centered on the biggest cluster of hits. Just because there’s a similarity doesn’t mean there’s a pattern. Just because you’ve made some good guesses in the past doesn’t mean you were actually accurate; you may have just gotten lucky. This is the Clustering Illusion. Maybe you got some good results off your intuition, but unless you’ve tested that intuition systematically, you can’t know whether you’ve simply clustered around a few lucky wins. Past success does not necessarily predict future results.
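You can watch the clustering illusion happen with purely random data. A small sketch in Python: fire 100 random “shots” at a wall divided into a 5×5 grid, and the densest cell will hold noticeably more hits than the even-spread average, even though no aiming was involved.

```python
import random

random.seed(42)  # fixed seed so the random "shots" are reproducible

# Fire 100 random shots at a barn wall divided into a 5x5 grid of cells.
shots = [(random.random(), random.random()) for _ in range(100)]
counts = {}
for x, y in shots:
    cell = (int(x * 5), int(y * 5))
    counts[cell] = counts.get(cell, 0) + 1

average = 100 / 25              # 4 shots per cell if the hits spread out evenly
densest = max(counts.values())  # the "cluster" a Texan would paint a target on
```

Paint a target on the densest cell after the fact and pure chance looks like skill; reading patterns into your past guesses works exactly the same way.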

“Hot Hand Fallacy”

This is when people who have been successful assume they’ll keep being successful. It’s related to the Monte Carlo (or Gambler’s) Fallacy: the belief that independent future events are affected by the past. If you flipped a quarter 100 times and it came up heads every single time, what are the odds it comes up heads on the 101st flip? Still 50/50. People tend to believe there’s a pattern that will continue, but the odds remain the same for every coin flip. Just because you’ve been right before doesn’t mean you’ll be right again.
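The independence of each flip is easy to demonstrate. A quick sketch in Python: simulate 100,000 fair coin flips and look only at the flips that immediately follow a streak of five heads; the fraction that come up heads again still hovers around 50%.

```python
import random

random.seed(1)
flips = [random.random() < 0.5 for _ in range(100_000)]  # True means heads

streak = 0  # current run of consecutive heads
heads_after, total_after = 0, 0
for flip in flips:
    if streak >= 5:  # this flip follows at least five heads in a row
        total_after += 1
        heads_after += flip
    streak = streak + 1 if flip else 0

fraction = heads_after / total_after  # ~0.5: the streak grants no "hot hand"
```

No matter how long the streak, the coin has no memory, and neither does your audience’s response to the next test.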

Don’t be fooled when your own biases are reinforced, because eventually your luck will run out.

…focus on proving yourself wrong…

“Confirmation Bias”

People have a tendency to test things that confirm their beliefs. They have a hypothesis, “This form is just too small,” so they test making it bigger, and if conversions improve even slightly, their belief is confirmed. But maybe making the form even smaller would increase conversions more. Maybe it wasn’t the form size at all, and the bigger factor was the image above the form or the color of the text. Confirmation bias pushes people to reinforce their ideas, which isn’t always the best result.

“Congruence Bias”

Similarly, this bias involves directly testing an idea rather than indirectly testing it. People tend to test only what they think is the problem. Say there’s a form on a page, with an image above it and text in a certain color. Someone might think that changing the image will increase conversions, so they test removing it. Maybe that does increase conversions, but they don’t ALSO test changing the font color while leaving the image in place. Doing that might disprove their theory if conversions went even higher.

If you have a hypothesis for what to test, try and come up with a different hypothesis, as different as possible, and then try and find a way to test BOTH.
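One practical way to test both hypotheses at once is a simple 2×2 (factorial) experiment, assigning each visitor to one combination of the two factors. A sketch in Python; the variant names and the idea of hashing a visitor ID are illustrative assumptions, not a specific testing tool’s API:

```python
import hashlib

IMAGES = ["with_image", "without_image"]   # hypothesis 1: the image matters
FONT_COLORS = ["black_text", "blue_text"]  # hypothesis 2: the color matters

def assign_variant(visitor_id: str):
    """Deterministically map a visitor to one cell of the 2x2 test grid."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).digest()
    return IMAGES[digest[0] % 2], FONT_COLORS[digest[1] % 2]

# The same visitor always lands in the same cell, so their experience is
# stable, and across many visitors all four combinations receive traffic.
variant = assign_variant("visitor-123")
```

Because each factor varies independently, the results can tell you whether the image, the color, both, or neither actually moves conversions.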

…think outside the box…

“Functional Fixedness”

Wrong tool for Pizza


When a person looks at a hammer he might only see it as being useful for putting nails in wood. He might not think about using it to straighten the metal blade on his push lawnmower. People have a bias towards using an object only in the way it HAS been used. You’ve seen it before: someone takes an object meant for one purpose, uses it a completely different way, and it blows your mind. There are entire websites based on these sorts of adaptations. The same is true for websites and content experiments. Don’t limit yourself to JUST how things have been established.

Come up with unique and different ways to do things, and test them. Just because you’ve always done things a certain way doesn’t mean you can’t change it up radically.

…and don’t get stuck with the same old things.


No matter how much rope you have, you’re still restricted to the length of the rope away from the anchor.

“Anchoring”

Humans tend to rely on the first piece of information they’re offered and base everything else on it, so all their later thinking is skewed by that initial anchor. This is like hopping between mountains in an experiment: sometimes you can optimize locally, but to increase conversions further you need a radical change. You need to hop to the next mountain with a higher peak, and that requires a big change. If you’re anchored, you’re limited by that initial piece of data.
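The local-maximum idea can be made concrete. A toy sketch in Python (the “conversion landscape” is entirely invented): greedy, step-by-step tweaking from the anchor climbs the nearest hill and stops at its peak; only a radical jump to a new starting point finds the higher one.

```python
def conversion_rate(x):
    """Invented landscape: a small peak at x=2 (height 3), a higher one at x=8 (height 5)."""
    return max(0.0, 3 - abs(x - 2)) + max(0.0, 5 - abs(x - 8))

def hill_climb(start, step=1):
    """Keep taking the best neighboring step until no neighbor improves."""
    x = start
    while True:
        best = max([x - step, x, x + step], key=conversion_rate)
        if best == x:
            return x
        x = best

local_peak = hill_climb(0)    # anchored near the small hill: stops at x=2
higher_peak = hill_climb(10)  # a radical jump in starting point: finds x=8
```

Incremental tweaks can only ever find the top of the hill you started on; the big wins often require abandoning the anchor entirely.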

“Status Quo Bias”

This is the tendency of people to neglect data and pay attention to what’s already in front of them. It’s a bias toward the current state of things, the status quo: a fear of change, an irrational hold on what currently exists. It usually comes into play when you want to make a big change, but it doesn’t have to be big. It can simply be that someone likes how the pictures currently look on the product page and sees no need to change them. It might not be a Bias Blind Spot or a False-Consensus Bias; it could simply be that, even when the data proves otherwise, keeping things the same feels more important.

Mixed metaphors aside: don’t get anchored to a specific mountain and believe you’ve reached maximum optimization. Always try to cut that anchor and jump to the next mountain. Don’t be afraid to change the status quo, as long as you have data to support it.

TL;DR

You’re the wrong person to ask for ideas, and so are your friends, even though you’ve been right before; focus on proving yourself wrong, think outside the box, and don’t get stuck with the same old things.


About Sayf Sharif

Sayf Sharif is a Web Analyst and expert in Usability and UX who has worked with businesses large and small to maximize their online presence since the beginning of the Web, winning numerous awards along the way. Sayf has studied human tool use from the Stone Age (he went to graduate school for Archaeology) to the Information Age (he started programming on his father’s TRS-80), and is always interested in what goals people wish to accomplish using their tools, and how successful that experience was.

http://www.lunametrics.com/blog/2013/01/17/avoiding-cognitive-bias-conversion-optimization/

3 Responses to “Avoiding Cognitive Bias in Conversion Optimization and Content Experiments”

Danny says:

Great list!

Using the scientific method is the way to go as far as testing is concerned, though I see a lot of A/B tests that are questionable, to say the least. You need a fairly large group (N), and not everyone comes to the site with the same intention, especially if you have a webshop, which makes it pretty difficult to control for certain variables.

I think this also means you have to run the test on your own website; results from one website aren’t valid enough to say something about the next.

Sayf Sharif says:

I don’t have a problem with questionable A/B tests per se. Really you should test a whole bunch of stuff, things you think are good ideas, as well as things you think are bad ideas, things that would support your biases, things that counter your biases…. You never know what will work better until you throw it on your website and see from actual usage.

I always come back to the Obama email marketing campaign in the 2012 election, where they started by trying to come up with good email subject lines to increase open rates. The more they tested, the more they realized that the horrible ideas worked better, and by the end they were deliberately trying to come up with what seemed, to them, like terrible subject lines. The top-performing Obama email subject line was “hey”. You never know whether bad ideas are really bad until you test them.
