By Oliver Lindberg


Designing for cognitive bias: An interview with David Dylan Thomas

Content strategist David Dylan Thomas offers a unique and in-depth look at cognitive bias in design.

A photo of David Dylan Thomas in black and white, and his book called “Design for Cognitive Bias” on a gradient background




The vast majority of the time, our brain runs on autopilot. Around 95 percent of cognition happens below the threshold of conscious thought, which is generally a good thing. We make billions of decisions every day: how fast to read this article, how to navigate the device you’re reading it on, whether to sit or to stand. If we thought carefully about each one of these decisions, we’d never get anything done. The shortcuts our mind takes save us time, so we can focus on what’s really important, but they also prevent us from making rational decisions. Sometimes this leads to errors – cognitive biases – that can cause harm.


David Dylan Thomas, a content strategy advocate at Philadelphia-based experience design firm Think Company, has studied these biases for the last three years. He covered every single one in his Cognitive Bias Podcast and recently put down his thoughts and learnings in a book called Design for Cognitive Bias, published by A Book Apart. Its aim is to empower us to build better products and collaborate more effectively.

“A large part of a designer’s job is to help people make decisions,” David explains. “And the more we understand how people actually make decisions, the better we can be at our jobs. As most biases happen really fast, however, they’re very hard to fight. We’re not aware of them, and even when we are, we’re often still unable to avoid them. Our mind is really good at fooling us into thinking we’re always in control when in fact we’re only in control about five percent of the time. But what we can do is put guardrails in place, so we can make decisions more carefully.”



"A large part of a designer’s job is to help people make decisions, and the more we understand how people actually make decisions, the better we can be at our jobs."


User biases and how they impact the UX


In his book, David differentiates between user bias, stakeholder bias, and our own bias. Because people make all kinds of decisions that seemingly make no sense, our design and content strategy choices can influence users – for good or bad. You can use rhymes to design for believability, for instance.


“When you say ‘an apple a day keeps the doctor away’, it feels more believable than saying ‘hey, you should eat more apples’,” David points out. “The human mind prefers things that are easy to remember and process, so our design choices can impact it in ways users aren’t even aware of. I can use a rhyming scheme to try to convince you that face masks don’t work, or I can use it to convince you that a mask is your public duty.”

People also tend to attach more importance to the visual element that’s placed highest on a page. This is due to the serial position effect – we’re more likely to remember the items at the beginning and end of a list than the ones in the middle. Amazon, for example, originally showed the most recent user review first, and customers would assume it was the most authoritative review. To mitigate the issue, Amazon created a “helpfulness” metric.


“People could rate how helpful they found a review,” David explains. “Amazon then aggregated this information and displayed the most helpful review at the top. That was a little better but now people assumed if it was a positive review, the product was good, and if it was a negative review, the product was bad. So what Amazon settled on, at least for a while, was placing the most helpful positive review next to the most helpful critical review. By giving them both equal visual weight, the user had to give both sides equal cognitive weight.”

We even make unconscious assumptions about time when we see two images next to each other. In a culture that reads left to right, we place “before” pictures on the left and “after” pictures on the right – to show the impact of a product, for example – because we think of the past as being on the left and the future on the right. If we flip it, the product we’re selling tends to perform worse. In a right-to-left culture these effects are reversed.


In addition to taking into account user biases during the design phase of a product, it’s important to be aware of biases during the user research and testing stages of the design process, as they can occur even before users interact with our designs and prototypes. David believes that user research is fraught because people are very bad at self-reporting.


“Our memory is terrible for a start,” he laughs. “We think it’s perfect in every detail, but it really isn’t. We’re very bad at remembering why we made a decision or how we felt about it because we think about it from our current perspective. And we’re also very bad at predicting how we will feel in the future. In user research we rely heavily on people to report honestly, and they do it as best as they can, but it’s always a good idea to trust but verify.”


Netflix did exactly that when they were trying to determine what content on their home screen to present to users who had not yet subscribed. In a survey, 46 percent of users said they’d like to know all of the content that was available on Netflix before they took out a subscription. However, when Netflix carried out an A/B test, the version that allowed users to see all of the content performed worse than a second one that highlighted specific features.



Two lamp ads, where the lamp is placed either to the left or to the right of the ad.
People tend to view the lamp-on-the-left ad as more classic looking, and the lamp-on-the-right ad as more modern looking, even though it’s the same lamp.


Combatting the framing effect and using it for good


The most dangerous bias, according to David, is the framing effect, which can massively influence decisions in a way that supersedes the facts.


“If I say this brand of condoms over here is 98 percent effective, and that brand over there has a one percent failure rate, the latter is actually the better condom, but my framing makes you think you should go with the one that has a 98 percent success rate. Framing can lead to all sorts of horrible decisions. Donald Trump is brilliant at framing. When we now talk about immigration, we always talk about a wall. It didn’t use to be like this, but he has so closely associated the subject with the wall that the wall has become the frame for immigration.”


There are various techniques that you can use to fight the framing effect. If you’re multilingual, for example, you can think about a decision in a language other than your native one, which forces you to spend more time processing and translating your thoughts. By the time you’re done, you’ll hopefully have seen right through the illusion. This shows us that the more carefully you think about a decision, the less likely you are to fall for a bias. That’s why outdoor clothing company Patagonia deliberately slows down its users – an approach that goes against the idea of frictionless experiences.


David explains why friction can sometimes be a good thing: “Patagonia is very environmentally conscious, and they don’t want you to ever return anything you’ve bought. Mailing back the clothes would double the carbon footprint of the purchase. So they slow down the content and buying experience to make sure you really want an item before you buy it.”


The company achieves this with long-form content and massive visuals on its product pages, as well as detailed instructional copy in the shopping cart. David says that in eCommerce this is anathema, because usually you hurry customers through the experience as fast as possible, but Patagonia wants its users to focus and make informed decisions instead, so that they make fewer errors.





Navigating stakeholder bias


As well as user bias, David also explores stakeholder bias in his book. Stakeholder biases affect clients, bosses, and anyone else who makes decisions about the direction of the design and how designers spend their time and budget.

One very common bias is called loss aversion. It hurts twice as much to lose something as it feels good to gain something. This is where you can use the framing effect for good.


“If you’re dealing with a conservative company that’s used to doing things a certain way, don’t focus on all the good things that will happen if they change,” David advises. “They’re likely to not want to risk it. But if you focus on the bad outcomes – for example, the employees and the money they’re going to lose if they don’t take the risk – and paint an apocalyptic picture, it’ll be much more persuasive.”


Another useful technique is stakeholder inception: giving your stakeholder the tools and the framing to arrive at the desired conclusion on their own.


“We had a client who put some really valuable content behind a paywall,” David recalls. “It really didn’t make any sense, but no matter how many times we brought it up, it remained a closed subject. Finally, we ran an exercise and presented them with the evidence, from research, that their users were looking for that content. We then left it to the client to come up with a way that fulfilled both the users’ needs and their own needs. Magically, they came up with the idea of moving that content from behind the paywall. It was so much more effective than me banging my head against the wall!”

Even time plays a big role in how much of a risk stakeholders are willing to take. David points to another bias, called decision fatigue: as you have a limited amount of energy to make decisions, the quality of those decisions is going to decline until you can recharge.


“Judges’ decisions become less favourable as you move further away from breakfast and closer to the lunch break,” David reveals. “After lunch they go back up to being more favourable, and then they start to decline again. So when you have a meeting can make a difference, depending on how close it’s scheduled to a meal.”

Even the day of the week can impact the level of risk we’re happy to take. Jet Sanders at the London School of Economics has been researching how the weekly cycle influences decision-making in everything from voting to whether or not to rob a bank.


“It turns out that we’re willing to take more risks earlier and later in the week and fewer in the middle,” David points out. “So if you’re going to ask a stakeholder to take a risk, you’d better have that meeting on a Monday or Friday and ensure they’ve eaten. If it’s a long one or a workshop, make sure you bring food for everyone!”



David Dylan Thomas working at a laptop computer


Challenging our own assumptions


The most difficult biases to fight are our own. One of the most common ones we fall for is confirmation bias: we get an idea in our head, become convinced that it’s great and even go looking for evidence to confirm that it’s true. However, David says we have a duty of care as designers to make a conscious effort to challenge our assumptions. He recommends a range of techniques that can help keep us in check.


Through participatory design, for example, you can invite other, underheard perspectives to the table and combat your own and your stakeholders’ biases, so that you can look at your designs with fresh eyes.


“Before you even do any research, you should make a map of who is involved – users, stakeholders, and anyone else who would be impacted by your product – and how much power they have,” David suggests. “You then identify the group who cares most about the outcome and give them more power. Go to them for research, but also follow up and get them to collaborate on the design. Give them the power that you usually reserve for a CEO or a key stakeholder. There’s something very democratic about handing the final say to the people who are going to be most impacted.”

One method of inviting other perspectives is called red team/blue team: you bring in another design team to critique your work. The blue team does the initial research and can get as far as wireframes or prototypes, but before they go any further, the red team comes in for one day to look for all the hidden assumptions and potential causes of harm that the blue team didn’t think of because of their own blind spots.


Similarly, an assumption audit – a method David came across after the book was published – sees a team identify biases based on their own intersectional identities and then talk about who isn’t represented and how that absence might influence the design. The people who aren’t represented then become potential participants or red team members in the participatory design process described above.


Another powerful tool David recommends is speculative design, which involves telling a story about a potential future use of your product.


“In design, we spend a lot of time trying to create the most delightful path, but not nearly as much time thinking about all the things that could go wrong,” David explains. “An example is Black Mirror, the show that takes near-future technology and tells a story about how real human beings would use it – which is usually terrible.”


David has created an Inclusive Design workshop around his book, which is intended to help organizations mitigate bias in their design processes. One exercise is based on Black Mirror.


“We take an idea for an app and write down as many stories as possible that could be Black Mirror episodes, ways that the technology can go wrong,” David reveals. “It’s important to do that work before you build anything because then you can implement the safeguards you need or even rethink the whole concept. It’s not as hard as people think to predict some of the outcomes of our technology.”



Air samples created by studio Superflux as part of a project exploring the future of energy for the United Arab Emirates government.


An inclusive, bias-informed practice for any project