Updating beliefs

As I've said before, confirmation bias is our tendency to cherry-pick information that confirms what we already believe. It means two people with opposing viewpoints can read the same book and come away feeling like it supported their point of view.

A mistake in an equation can ruin a chain of reasoning, and failing to update your beliefs in an unbiased way can lead you down the wrong road. If you can't overcome confirmation bias, more information won't help you. You'll interpret everything as a supporting case for your current ideas.

If you're reading something and you feel like it supports your side of the argument, try to understand the opposing point of view, and do as Charlie Munger does.

I never allow myself to have an opinion on anything that I don't know the other side's argument better than they do. – Charlie Munger

To understand confirmation bias, it helps to see it in action. One experiment rounded up a group of Stanford students who had opposing views on the death penalty. Half were in favour and half were against.

The participants read two studies. The first supported the death penalty and the second called it into question. Both studies were fake, each filled with statistics to support its case.

Bottom line: people in favour of the death penalty found the first study convincing and the second unconvincing. People who were against capital punishment felt the opposite.

At the end of the study, the two groups were again asked about their views on the death penalty. Both supported their original position even more strongly.

Understanding opposing views only helps if you can overcome confirmation bias; otherwise you're digging a deeper hole. For someone who likes to read and thinks they're pretty open-minded, that's a scary thought. I admire people who are well-read, but if they aren't overcoming confirmation bias, all they're doing is overcommitting to whatever they already believe.

So it's not enough to simply read what we disagree with. We should come up with the strongest version of the opposing argument, then update our beliefs based on that. And don't be afraid to say "I was wrong." If you can say you were wrong, then you can start figuring out what it takes to be right.

Most of us have an opinion on everything, but how many of us do the work to earn that opinion?

It's hard work reading up on a topic, reading the arguments for and against, and then weighing up the probability that your position is the right one. It's much easier to fool yourself into believing that you did the work and never bother to check whether you were right in the first place.

Take the time to understand the problem before you come up with a solution.

And maybe, just maybe, when someone asks you about a topic you've only heard of via 15-second news bites, don't have an opinion. Just say "I don't know" or "I'm still forming an opinion." Don't be a chauffeur who memorises opinions designed for applause. Know why you're getting to the answer, not only how to get there.

The Many-Faced God and the Free City of Bias

Like the Faceless Men in Braavos, cognitive biases have many faces. Our brains developed simple, efficient rules to allow us to make quick decisions. Heuristics.

Normally heuristics serve us well, but not always. When your brain takes shortcuts, sometimes there are debts owed. The good news is those debts are consistent: we're Predictably Irrational. The bad news is it's hard to know when you're being irrational.

The availability heuristic, for example, is our tendency to estimate how likely (or frequent) an event is based on the ease with which a particular idea can be brought to mind. 

When an infrequent event is easy to recall, we tend to overestimate its likelihood. Violent deaths like murders are widely publicised, so they have high availability. Conversely, common but mundane deaths are hard to bring to mind, so we think they're less likely to happen than getting murdered.

Keep in mind: you're far more likely to die from cardiovascular disease than at the hands of a serial killer.

It gets worse. One of my favourites is the conjunction fallacy. If I ask you which of the following is more probable:

  • a world-class boxer losing the first round; or
  • a world-class boxer losing the first round but winning the match.

Chances are you'd pick the second situation, and you'd be wrong. This is because your brain thinks a world-class boxer making a comeback is more typical than them only losing the first round, so you overestimate the probability. You buy into the narrative, despite the likelihood being greater in the first situation than the second.

Think about it. For the second example to be true, the first must be true. There are two separate probabilities in the second but only one in the first. You're ignoring the numbers: the second situation feels intuitively normal, so you pick it.
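If numbers make this clearer, here's a minimal sketch with made-up probabilities (the 10% and 60% figures are assumptions for illustration, not real boxing statistics):

```python
# Illustrative, made-up numbers: suppose a world-class boxer loses the
# first round in 10% of fights, and when that happens they still go on
# to win the match 60% of the time.
p_lose_first_round = 0.10
p_win_given_lose_first = 0.60

# "Loses the first round AND wins the match" needs both things to happen,
# so its probability can never exceed the first probability on its own.
p_lose_first_and_win = p_lose_first_round * p_win_given_lose_first

print(p_lose_first_round)    # 0.1
print(p_lose_first_and_win)  # 0.06, always <= 0.1
```

However generous you make the comeback odds, multiplying by a second probability can only shrink the number.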

Another example:

Meet Peter. He's shy, loves reading books, and is frequently in the library. Is Peter more likely to be a librarian or a salesman?

That's right, a salesman: there are far more salesmen than librarians. Don't fall for base rate neglect. And just because you've spent resources on something in the past doesn't mean you should keep caring about the sunk cost.
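To see how strongly the base rate dominates, here's a rough back-of-the-envelope sketch; the counts and the "fits the description" rates are invented purely for illustration:

```python
# Invented, illustrative numbers: imagine 200 librarians and 20,000 salesmen.
librarians = 200
salesmen = 20_000

# Assume Peter's description fits half of all librarians but only
# one in twenty salesmen.
p_fits_given_librarian = 0.50
p_fits_given_salesman = 0.05

# Expected number of people matching the description in each group.
matching_librarians = librarians * p_fits_given_librarian  # 100
matching_salesmen = salesmen * p_fits_given_salesman       # 1,000

# Most people who fit the description are still salesmen, simply because
# there are so many more of them. That's the base rate at work.
p_librarian = matching_librarians / (matching_librarians + matching_salesmen)
print(round(p_librarian, 2))  # ~0.09
```

Even with a description that screams "librarian", the sheer number of salesmen means Peter probably isn't one.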

Knowing about biases is insufficient. You need to internalise the ideas, so System 2 becomes System 1. No one thinks they're affected by bias – but everyone is. It's easy to see biased thoughts in others, and almost impossible in yourself.

Don't think that knowing about biases makes you less prone to them; you'll become overconfident and suffer from confirmation bias even more. Still, if we want to improve we can, but it's not enough to go to Wikipedia and read through a list of cognitive biases. We need a deep, internalised understanding of them.

What does it mean to be rational?

Usually, when we think about rational people, we think of a hyper-intellectual but emotionally stunted person. Think Sherlock Holmes, who can solve any mystery but can't seem to keep his relationships together. He relies on causal chains of logic, not intuition and impulse.

Rationality is about making the best decision you can with whatever information you have. It means that if I drop you into a field you know nothing about, you could still make the best decisions possible with the limited information you have. However bad the situation is, as a rational person you can make the best decision you know how to.

It's not about ignoring your emotions or hard-fought intuitions. If you think about it, being more in touch with your emotions makes life easier. And intuition can be life-saving.

We've all heard about the fireman who was battling a kitchen fire and felt the urge to pull his team out, without knowing why. The team rushed out, and seconds later the floor collapsed into the basement below. It wasn't a kitchen fire; it had started in the basement.

This story comes from Gary Klein, who spent two years analysing how firemen predicted danger.

"The first thing was that the fire didn't behave the way it was supposed to," Klein said. "Kitchen fires should respond to water - this one didn't. He told me that he always keeps his earflaps up because he wants to get a sense of how hot the fire is, and he was surprised at how hot this one was. A kitchen fire shouldn't have been that hot."

"Often a sign of expertise is noticing what doesn't happen, and the other thing that surprised him was that the fire wasn't noisy. It was quiet, and that didn't make sense given how much heat there was."

Being rational means knowing when not to overthink things and when to trust your intuition. That means areas where you've built up ample experience through thousands of hours of deliberate practice. Sometimes conscious deliberation makes us perform worse, not better.

The key thing to understand is what Daniel Kahneman calls System 1 and System 2. System 1 is fast, automatic thinking, and System 2 is slow, deliberate thinking. When you practise a skill enough, it moves from System 2 to System 1. Think about when you first learned 2 + 2 = 4; it took time to figure out. Now it's instant. If you want to learn more, read Thinking, Fast and Slow.

A well-trained rationalist will be relying on System 1, their intuition. But only because they've spent thousands of hours cultivating the ability to overcome bias using their System 2 thinking.

Unfortunately for you and me, System 1 is terrible at things until you've trained it. Untrained intuition is nothing more than guessing; that's why we shouldn't rely on hunches in fields we know nothing about.

Fortunately for you and me, we are Predictably Irrational: we make the same mistakes over and over.

We can't trust our gut to overcome confirmation bias without help, but we can try to train our brains so System 2's slow, deliberate decision-making becomes System 1.

You're biased

It's no secret, but for some reason it rarely comes up when we talk, and heaven forbid you tell someone they're biased. Even fewer people ask what we should do about it. Bias is ingrained in us, and Facebook's news feed serving up more of what we want to see (with inner workings unseen) sure isn't helping.

Imagine buying a lottery ticket. The chances of winning are one in a million (in reality, the odds are much worse). Perhaps you win, and you correctly conclude it was nothing more than luck. Or perhaps you happen to think the numbers you picked had something to do with it.

This error isn't the cost of incomplete knowledge. Your estimate (if you think your numbers made you win) will be incorrect on average, even if you won this time. You're suffering from hindsight bias.
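If you want to see why, here's a tiny simulation; the one-in-a-million odds and the "lucky" pick are assumptions for illustration:

```python
import random

# A toy lottery with assumed one-in-a-million odds.
ODDS = 1_000_000
lucky_pick = 42  # the number you're convinced "made" you win last time

# Replay the draw many times: the lucky pick wins at the base rate,
# no more and no less. The earlier win was luck, not the numbers.
trials = 2_000_000
wins = sum(random.randrange(ODDS) == lucky_pick for _ in range(trials))
print(wins / trials)  # hovers around 1 / 1,000,000
```

The win already happened, so it feels like it had to happen, and your brain backfills a reason. Replaying the draw says otherwise.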

Now imagine you knew what the numbers would be before the draw and bought your ticket accordingly. Your one-in-a-million chance of winning becomes a certainty. But that's a different game, a game that's biased in your favour. And when your method of learning about something is biased (whether in your direction or another), more information doesn't help. If you knew everyone else's numbers (but not the winning numbers), that wouldn't help you win. More information isn't always the answer. Most often we want the right information.

So when we spend time learning, we want to ensure new knowledge helps us, rather than helping us believe what we already believe more deeply. Whether we like it or not, we tend to interpret new information as confirmation of what we already believe. Confirmation bias is everywhere. That's a shame, because being wrong is a good thing.

Hindsight bias and confirmation bias are both cognitive biases: systematic errors in how we think. We as in you and me, and the rest of the population. Not dumb and dumber. Confirmation bias skews your beliefs so they represent reality less accurately, even when you're shown information that disproves what you believe. Decision-making is hard enough without having to fight your own brain.

The first step is to accept that you're biased, like everyone else.

No, not biased like we call each other when trying to win an argument. Cognitive biases are part of being human, a feature and a bug.

And knowing about biases doesn't automatically make them easier to overcome. If you can't trust your brain (confirmation bias), how can you trust anything?

Well, we can start by internalising what confirmation bias means, then introduce it into our vernacular, and finally try to overcome it and the multitude of other errors our minds have built in.