The Many-Faced God and the Free City of Bias

Like the Faceless Men in Braavos, cognitive biases have many faces. Our brains developed simple, efficient rules that let us make quick decisions. Heuristics.

Normally heuristics serve us well, but not always. When your brain takes shortcuts, sometimes there are debts owed. The good news is that those debts are consistent: we're Predictably Irrational, as Dan Ariely put it. The bad news is that it's hard to know when you're being irrational.

The availability heuristic, for example, is our tendency to estimate how likely (or frequent) an event is by how easily examples of it come to mind.

When an infrequent event is easy to recall, we tend to overestimate its likelihood. Violent deaths like murders are publicised, so they have high availability. Conversely, common but mundane causes of death get little coverage and are hard to bring to mind, so we judge them less likely than getting murdered.

Keep in mind: you're more likely to die from cardiovascular disease than from a serial killer.

It gets worse. One of my favourites is the conjunction fallacy. Say I ask you which of the following is more probable:

  • a world-class boxer losing the first round; or
  • a world-class boxer losing the first round but winning the match.

Chances are you picked the second situation, and you'd be wrong. Your brain treats a world-class boxer making a comeback as more typical than them merely losing the first round, so you overestimate its probability. You buy into the narrative, even though the first situation is more likely than the second.

Think about it: for the second scenario to be true, the first must also be true. The second requires two things to happen; the first requires only one. You're ignoring the numbers because the second situation feels intuitively normal, so you pick it.
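In probability terms, a conjunction can never be more likely than either of its parts. Writing A for "loses the first round" and B for "wins the match":

\[
P(A \wedge B) = P(A) \, P(B \mid A) \le P(A),
\]

since \( P(B \mid A) \le 1 \). The comeback story is a subset of "loses the first round", so it can only ever be equally likely or less.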

Another example:

Meet Peter. He's shy, loves reading books, and is frequently in the library. Is Peter more likely to be a librarian or a salesman?

That's right, a salesman: there are simply more salesmen than librarians. Don't fall for base rate neglect. And if you've spent resources on something in the past, that alone is no reason to keep caring: don't let sunk costs drive your decisions.
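To see why the base rate wins, here's a back-of-the-envelope Bayes calculation; the numbers are made up purely for illustration. Suppose salesmen outnumber librarians 20 to 1, and suppose a shy, bookish profile is five times more common among librarians than among salesmen. The odds form of Bayes' theorem then gives

\[
\frac{P(\text{librarian} \mid \text{profile})}{P(\text{salesman} \mid \text{profile})}
= \frac{P(\text{profile} \mid \text{librarian})}{P(\text{profile} \mid \text{salesman})}
\cdot \frac{P(\text{librarian})}{P(\text{salesman})}
= 5 \times \frac{1}{20}
= \frac{1}{4}.
\]

Even with a description that screams librarian, Peter is still four times more likely to be a salesman: the base rate swamps the evidence.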

Knowing about biases is insufficient. You need to internalise the ideas, so that slow, deliberate System 2 reasoning becomes fast, automatic System 1 thinking (Daniel Kahneman's terms; more on them below). No one thinks they're affected by bias, but everyone is. It's easy to see biased thoughts in others, and almost impossible in yourself.

Don't assume that knowing about biases makes you less prone to them; if anything, you'll become overconfident and suffer from confirmation bias even more. Still, we can improve, but it takes more than reading through Wikipedia's list of cognitive biases. We need a deep, internalised understanding of them.

What does it mean to be rational?

Usually, when we think of rational people, we picture someone hyper-intellectual but emotionally stunted. Think Sherlock Holmes, who can solve any mystery but can't seem to keep his relationships together. He relies on causal chains of logic, not intuition and impulse.

Rationality is about making the best decision you can with whatever information you have. It means that if I dropped you into a field you know nothing about, you could still make the best decisions possible given the limited information available. However bad the situation, a rational person makes the best call they know how to.

It's not about ignoring your emotions or hard-fought intuitions. If you think about it, being more in touch with your emotions makes life easier. And intuition can be life-saving.

We've all heard the story of the fireman battling a kitchen fire who felt the urge to pull out, without knowing why. The team rushed out, and seconds later the floor collapsed into the basement below. It wasn't a kitchen fire: it had started in the basement.

This story comes from Gary Klein, who spent two years analysing how firemen predicted danger.

"The first thing was that the fire didn't behave the way it was supposed to," Klein said. "Kitchen fires should respond to water - this one didn't. He told me that he always keeps his earflaps up because he wants to get a sense of how hot the fire is, and he was surprised at how hot this one was. A kitchen fire shouldn't have been that hot."

"Often a sign of expertise is noticing what doesn't happen, and the other thing that surprised him was that the fire wasn't noisy. It was quiet, and that didn't make sense given how much heat there was."

Being rational means knowing when not to overthink things and when to trust your intuition: in areas where you've built up ample experience through thousands of hours of deliberate practice. Sometimes conscious deliberation makes us perform worse, not better.

The key thing to understand is the distinction Daniel Kahneman draws between System 1 and System 2. System 1 is fast, automatic thinking; System 2 is slow, deliberate thinking. When you practise a skill enough, it moves from System 2 to System 1. Think about when you first learned that 2 + 2 = 4: it took time to figure out, but now it's instant. If you want to learn more, read Thinking, Fast and Slow.

A well-trained rationalist relies on System 1, their intuition, but only because they've spent thousands of hours cultivating the ability to overcome bias with System 2 thinking.

Unfortunately for you and me, System 1 is terrible at things until you've trained it. Untrained intuition is nothing more than guessing; that's why we shouldn't rely on hunches in fields we know nothing about.

Fortunately for you and me, we are Predictably Irrational: we make the same mistakes over and over.

We can't trust our gut to overcome confirmation bias without help, but we can try to train our brains so System 2's slow, deliberate decision-making becomes System 1.