The Many-Faced God and the Free City of Bias

Like the Faceless Men in Braavos, cognitive biases have many faces. Our brains developed simple, efficient rules to allow us to make quick decisions. Heuristics.

Normally heuristics serve us well, but not always. When your brain takes shortcuts, sometimes there are debts owed. The good news is that those debts are consistent: we're Predictably Irrational. The bad news is that it's hard to know when you're being irrational.

The availability heuristic, for example, is our tendency to estimate how likely (or frequent) an event is based on the ease with which a particular idea can be brought to mind. 

When an infrequent event is easy to recall, we tend to overestimate its likelihood. Violent deaths like murders are heavily publicised, so they're highly available. Conversely, common but mundane causes of death are hard to bring to mind, so we judge them less likely than being murdered.

Keep in mind: you're more likely to die of cardiovascular disease than at the hands of a serial killer.

It gets worse. One of my favourites is the conjunction fallacy. If I ask you which of the following is more probable:

  • a world-class boxer losing the first round; or
  • a world-class boxer losing the first round but winning the match.

Chances are you'll pick the second situation, and you'd be wrong. Your brain treats a world-class boxer making a comeback as more typical than one merely losing the first round, so you overestimate its probability. You buy into the narrative even though the first situation is more likely than the second.

Think about it: for the second situation to be true, the first must also be true. The second is a conjunction of two events, while the first is a single event, so its probability can never be higher. You're ignoring the numbers; the second situation feels intuitively normal, so you pick it.
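
If numbers help, here's a minimal sketch of the conjunction rule. The 0.30 and 0.60 below are made-up probabilities purely for illustration; the point is that the "comeback" story is a product of two probabilities and can never beat the first event on its own.

```python
# Toy numbers only: both probabilities below are assumptions for illustration.
p_lose_first = 0.30            # assumed chance a world-class boxer loses round one
p_win_given_lose_first = 0.60  # assumed chance they still win, given they lost round one

# The "comeback" is a conjunction of two events, so its probability
# is the product of the two and can never exceed either event alone.
p_lose_first_and_win = p_lose_first * p_win_given_lose_first  # 0.18

assert p_lose_first_and_win <= p_lose_first
print(f"P(loses round one)          = {p_lose_first:.2f}")
print(f"P(loses round one AND wins) = {p_lose_first_and_win:.2f}")
```

Whatever numbers you plug in, the conjunction comes out smaller or equal, never larger.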

Another example:

Meet Peter. He's shy, loves reading books, and is frequently in the library. Is Peter more likely to be a librarian or a salesman?

That's right, a salesman: there are far more salesmen than librarians, and the base rate wins (see the quick calculation below). Don't fall for base rate neglect. And just because you've spent resources on something in the past doesn't mean you should keep pouring them in; that's the sunk cost fallacy.
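
To make the base rate concrete, here's a back-of-the-envelope Bayes calculation. Every number in it is invented for illustration, not taken from real occupation statistics.

```python
# Every number here is invented for illustration; the point is the base rate, not the data.
librarians = 2     # assumed: 2 librarians per 100 people in this toy population
salesmen = 98      # assumed: 98 salesmen per 100 people

p_fits_given_librarian = 0.90  # assumed: Peter's description fits most librarians
p_fits_given_salesman = 0.10   # assumed: it fits only some salesmen

# Expected number of people matching Peter's description in each group
matching_librarians = librarians * p_fits_given_librarian  # 1.8
matching_salesmen = salesmen * p_fits_given_salesman       # 9.8

p_librarian = matching_librarians / (matching_librarians + matching_salesmen)
print(f"P(librarian | fits description) = {p_librarian:.2f}")  # ~0.16
```

Even when the description fits librarians far better, the sheer number of salesmen means a matching person is still probably a salesman.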

Knowing about biases is insufficient. You need to internalise the ideas so that System 2 thinking becomes System 1. No one thinks they're affected by bias, yet everyone is. It's easy to spot biased thinking in others and almost impossible to spot it in yourself.

Don't assume that knowing about biases makes you less prone to them; if anything, you'll become overconfident and suffer from confirmation bias even more. Still, we can improve, but it isn't enough to go to Wikipedia and skim a list of cognitive biases. We need a deep, internalised understanding of them.