Designing for cognitive bias: An interview with David Dylan Thomas

By Oliver Lindberg





Content strategist David Dylan Thomas offers a unique and in-depth look at cognitive bias in design.

10 min read

A photo of David Dylan Thomas in black and white, and his book called “Design for Cognitive Bias” on a gradient background


Shaping Design is created on Editor X, the advanced web design platform for professionals. Create your next project on Editor X. 


The vast majority of the time, our brain runs on autopilot. Around 95 percent of cognition happens below the threshold of conscious thought, which is generally a good thing. We make billions of decisions every day: how fast to read this article, how to navigate the device you’re reading it on, whether to sit or to stand. If we thought carefully about each of these decisions, we’d never get anything done. The shortcuts our mind takes save us time so we can focus on what’s really important, but they also keep us from making fully rational decisions. Sometimes this leads to errors – cognitive biases – that can cause harm.

David Dylan Thomas, a content strategy advocate at Philadelphia-based experience design firm Think Company, has studied these biases for the last three years. He covered every single one in his Cognitive Bias Podcast and recently put down his thoughts and learnings in a book called Design for Cognitive Bias, published by A Book Apart. Its aim is to empower us to build better products and collaborate more effectively.

“A large part of a designer’s job is to help people make decisions,” David explains. “And the more we understand how people actually make decisions, the better we can be at our jobs. As most biases happen really fast, however, they’re very hard to fight. We’re not aware of them, and even when we are, we’re often still unable to avoid them. Our mind is really good at fooling us into thinking we’re always in control when in fact we’re only in control about five percent of the time. But what we can do is put guardrails in place, so we can make decisions more carefully.”

"A large part of a designer’s job is to help people make decisions, and the more we understand how people actually make decisions, the better we can be at our jobs."

User biases and how they impact the UX

In his book, David differentiates between user bias, stakeholder bias, and our own bias. Because people make all kinds of decisions that seemingly make no sense, we can make design and content strategy choices that will influence our users – for good or bad. You can use rhymes to design for believability, for instance.

“When you say ‘an apple a day keeps the doctor away’, it feels more believable than saying ‘hey, you should eat more apples’,” David points out. “The human mind prefers things that are easy to remember and process, so our design choices can impact it in ways users aren’t even aware of. I can use a rhyming scheme to try to convince you that face masks don’t work, or I can use it to convince you that a mask is your public duty.”

People also tend to attach more importance to the visual element that’s placed highest on a page. This is due to the serial position effect – we’re more likely to remember the items at the beginning and end of a list than the ones in the middle. Originally, Amazon, for example, would show their most recent user review first, and customers would assume it was the most authoritative review. To mitigate the issue, Amazon created a “helpfulness” metric.

“People could rate how helpful they found a review,” David explains. “Amazon then aggregated this information and displayed the most helpful review at the top. That was a little better but now people assumed if it was a positive review, the product was good, and if it was a negative review, the product was bad. So what Amazon settled on, at least for a while, was placing the most helpful positive review next to the most helpful critical review. By giving them both equal visual weight, the user had to give both sides equal cognitive weight.”
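The selection rule David describes (surface the most helpful positive review alongside the most helpful critical one) can be sketched in a few lines. This is an illustrative reconstruction, not Amazon’s actual code; the field names and the four-star cutoff between “positive” and “critical” are assumptions.

```python
def pick_featured_reviews(reviews):
    """Return (most helpful positive, most helpful critical) review,
    so both sides can be shown with equal visual weight.

    Each review is a dict with (assumed) keys:
    "stars" (1-5 rating) and "helpful_votes" (helpfulness ratings).
    """
    # Assumed cutoff: 4+ stars counts as positive, 3 or fewer as critical.
    positive = [r for r in reviews if r["stars"] >= 4]
    critical = [r for r in reviews if r["stars"] <= 3]

    by_helpfulness = lambda r: r["helpful_votes"]
    top_positive = max(positive, key=by_helpfulness) if positive else None
    top_critical = max(critical, key=by_helpfulness) if critical else None
    return top_positive, top_critical
```

The key design choice is that helpfulness, not recency or star rating alone, decides what gets featured, and that both sides are always represented when available.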

We even make unconscious assumptions about time when we see two images next to each other. In a culture that reads left to right, we place “before” pictures on the left and “after” pictures on the right, for example to show the impact of a product, because we think of the past as on the left and the future as on the right. If we flip it, the product we’re selling tends to perform worse. In a right-to-left culture these effects are reversed.

In addition to accounting for user biases during the design phase of a product, it’s important to be aware of biases during the user research and testing stages, as they can occur even before users interact with our designs and prototypes. David believes that user research is fraught because people are very bad at self-reporting.

“Our memory is terrible, for a start,” he laughs. “We think it’s perfect in every detail, but it really isn’t. We’re very bad at remembering why we made a decision or how we felt about it because we think about it from our current perspective. And we’re also very bad at predicting how we will feel in the future. In user research we rely heavily on people to report honestly, and they do it as best as they can, but it’s always a good idea to trust but verify.”

Netflix did exactly that when they were trying to determine what content on their home screen to present to users who had not yet subscribed. In a survey, 46 percent of users said they’d like to know all of the content that was available on Netflix before they took out a subscription. However, when Netflix carried out an A/B test, the version that allowed users to see all of the content performed worse than a second one that highlighted specific features.

Two lamp ads, where the lamp is placed either to the left or to the right of the ad.
People tend to view the lamp-on-the-left ad as more classic looking, and the lamp-on-the-right ad as more modern looking, even though it’s the same lamp.

Combatting the framing effect and using it for good

The most dangerous bias, according to David, is the framing effect, which can massively influence decisions in a way that supersedes the facts.

“If I say this brand of condoms over here is 98 percent effective, and that brand over there has a one percent failure rate, the latter is actually the better condom but my framing makes you think you should go with one that has a 98 percent success rate. Framing can lead to all sorts of horrible decisions. Donald Trump is brilliant at framing. When we now talk about immigration, we always talk about a wall. It didn’t use to be like this but he’s so closely associated the subject with the wall, that that’s become the frame for immigration.”

There are various techniques you can use to fight the framing effect. If you’re multilingual, for example, you can think through a decision in a language other than your native one; the extra time spent processing and translating forces you to slow down, and by the time you’re done, you’ll hopefully have seen through the illusion. The broader lesson is that the more carefully you think about a decision, the less likely you are to fall for a bias. That’s why outdoor clothing company Patagonia deliberately slows down its users – an approach that goes against the idea of frictionless experiences.

David explains why friction can sometimes be a good thing: “Patagonia is very environmentally conscious, and they don’t want you to ever return anything you’ve bought. Mailing back the clothes would double the carbon footprint of the purchase. So they slow down the content and buying experience to make sure you really want an item before you buy it.”

The company achieves this with long-form content and massive visuals on its product pages, as well as detailed instructional copy in the shopping cart. David says that in eCommerce this is anathema, because usually you hurry customers through the experience as fast as possible, but Patagonia wants its users to focus and make informed decisions instead, so that they make fewer errors.