
Cognitive biases – How to deal with Unknown Unknowns


This article is part of a three-part series written by Marko Kovic.


The art of decision-making, part 1: Don’t trust your gut

Do you trust your judgement? Probably. After all, no matter what it is that you do – maybe you’re an entrepreneur, maybe you’re part of a small team, maybe you work in a big organization – you have to make tons of decisions, day in, day out. And for the most part, your judgements and your decisions serve you well.

The nice thing is that you generally make these decisions “on the fly”. You assess the situation, you take into account all the relevant information, and you decide how to proceed. It all comes quite naturally. When you know what to do, you just know that you know. And even when you don’t know, you *know* that you don’t know.

But there’s a problem. Sometimes when you are making decisions, there are things you don’t know, and you *don’t know* that you don’t know. We can call these cases Unknown Unknowns. These Unknown Unknowns are blind spots in your decision-making that cause a whole lot of trouble – because you can’t even imagine that these problems might exist. The consequences can be catastrophic.

Cognitive biases and your subtly but heavily irrational brain

You probably feel like an adequately rational person. Sure, there are crazy people out there who believe and do crazy stuff, but not you. And you’re right: You generally have good reasons for believing what you believe, and you generally make decisions in such a manner that you achieve your goals as much as possible.

Unfortunately, tons of research from cognitive psychology and behavioral economics shows us that there is another side to human rationality. Yes, people generally behave rationally – but at the same time, we are all systematically irrational in subtle, almost unnoticeable ways.

One major source of this invisible irrationality is cognitive biases. When we make decisions, we often rely on heuristics: automated rules of thumb that our brain uses to make judgements and decisions quickly, without too much deliberation. Heuristics are useful, but quite often, they bias our judgements and our decisions. Here’s a small quiz to illustrate this point:

  • John wants to find out whether he has a peanut allergy, so he takes a test. On average, about 1 in 1000 people have a peanut allergy.
  • The test is quite reliable. It never fails to detect when someone really has a peanut allergy.
  • Furthermore, the test has a false-positive rate of only 5%: Of all the people who do *not* have a peanut allergy, 5% are wrongly identified as having one.
  • John’s result is positive: The test says he has a peanut allergy.
  • What is the probability that John really has a peanut allergy?


What is your answer? What does your gut feeling say? Think about it for a couple of moments.


Was your answer something like 95%? That feels like the correct answer, right? The actual correct answer, however, is about 2%. Seriously.

What happened here? The above quiz is an example of the famous base rate fallacy. Our brain has trouble dealing with generic base rate information. We automatically latch onto what feels like the most important piece of information and disregard the equally important but less interesting piece of information (the 5% probability seems much more relevant, from a story-telling point of view, than the generic 1-in-1000 statistic).
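If you want to check the arithmetic yourself, here is a minimal sketch of the Bayes’ theorem calculation behind the 2% figure (the variable names are mine, chosen for illustration):

```python
# A minimal sketch of the Bayes' theorem arithmetic behind the ~2% answer.
base_rate = 1 / 1000        # P(allergy): about 1 in 1000 people
sensitivity = 1.0           # P(positive | allergy): the test never misses a real allergy
false_positive_rate = 0.05  # P(positive | no allergy): 5% of non-allergic people test positive

# Total probability of a positive result, across allergic and non-allergic people:
p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)

# Bayes' theorem: P(allergy | positive) = P(positive | allergy) * P(allergy) / P(positive)
posterior = sensitivity * base_rate / p_positive
print(f"P(allergy | positive) = {posterior:.1%}")  # prints: P(allergy | positive) = 2.0%
```

The intuition: because only 1 in 1000 people actually has the allergy, the roughly 50 false positives among every 1000 people tested vastly outnumber the single true positive – so a positive result is still almost certainly a false alarm.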

The base rate fallacy is only one of dozens of cognitive biases. Some of them are fairly famous, such as the confirmation bias (we seek out information that confirms our prior beliefs), the status quo bias (we don’t like change; defaults are sticky), the sunk cost fallacy (we have trouble letting go of projects, beliefs or people because we have “invested so much”), overconfidence (we overestimate the quality of our judgements), or groupthink (groups can have a conformist dynamic that leads to collective delusional wishful thinking).

Cognitive biases are Unknown Unknowns. When we are affected by them, we don’t notice it. That’s what makes them so troublesome!

Do Unknown Unknowns matter in the real world?

So people are systematically irrational. That’s interesting, but is it also important? Yes, it is massively important, because blind spots in our thinking and decision-making can lead to catastrophic outcomes. A few examples:

  • In 2018, the blood-testing company Theranos shut down after it was discovered that the technology Theranos was promising and selling did not actually exist – it was all an elaborate scam. Over the course of 15 years, Theranos managed to attract over 700 million dollars’ worth of investment. Investors never bothered to critically assess whether the promised technology existed because they fell prey to the bandwagon effect (the more investment Theranos received, the more other investors wanted to jump in as well) and the halo effect (Theranos’ CEO and its board of directors had a lot of star power that lent Theranos credibility).
  • In 2003, the United States and its coalition forces invaded Iraq. The invasion was justified by Iraq’s alleged production of weapons of mass destruction and its alleged ties to Al Qaeda. It was believed that the war would last only a few months. All of these assessments and decisions were catastrophically wrong – even though all the necessary information was available beforehand. The 2003 Iraq invasion was a textbook case of groupthink, confirmation bias, and intergroup bias (different intelligence services and departments harbored negative attitudes towards each other, resulting in inefficient sharing of information).
  • In 1912, one of the deadliest maritime disasters in history happened: The RMS Titanic sank after hitting an iceberg. The disaster was shocking not only because of its human toll, but also because the Titanic was widely believed to be “unsinkable”. That belief was indicative of the overconfidence bias that affected both the creators of the Titanic (the ship carried far too few lifeboats for its size) and the crew and captain on board (the Titanic was steaming at full speed even though it had received iceberg warnings).

Theranos, the Iraq war, and the Titanic disaster might seem like extreme examples. But they are extreme only in their outcomes. Their causes – the cognitive blind spots we all have – are just as present in a startup or in a small team.

What can you do about Unknown Unknowns in your organization?

Unknown Unknowns can cause damage to any individual, team, or organization. Given the nature of Unknown Unknowns (we are unaware of them), we usually realize that there’s a problem only when it’s too late. Unfortunately, we cannot simply bootstrap ourselves out of our Unknown Unknowns – if we could, there wouldn’t be a problem in the first place!

So what can you do? In parts 2 and 3 of this series, I will present two approaches for dealing with Unknown Unknowns: Applied rationality training and Red Teaming. Stay tuned!