Updating beliefs

As I've said before, confirmation bias is our tendency to cherry-pick information that confirms what we already believe. It means two people with opposing viewpoints can read the same book and come away feeling like it supported their point of view.

A mistake in an equation can ruin a chain of reasoning, and failing to update beliefs in an unbiased way can lead you down the wrong road. If you can't overcome confirmation bias, more information won't help you. You'll interpret everything as a supporting case for your current ideas.

If you're reading something and you feel like it supports your side of the argument, try to understand the opposing point of view, and do as Charlie Munger does.

I never allow myself to have an opinion on anything that I don't know the other side's argument better than they do. – Charlie Munger

To understand confirmation bias, it helps to see it in action. One experiment rounded up a group of Stanford students who had opposing views on the death penalty. Half were in favour and half were against.

The participants read two studies. The first supported the death penalty and the second called it into question. Both studies were fake, filled with statistics to support their case.

Bottom line: people in favour of the death penalty found the first study convincing and the second unconvincing. People who were against capital punishment felt the opposite.

At the end of the study, the two groups were again asked about their views on the death penalty. Both groups supported their original views even more strongly than before.
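
To see how that can happen, here's a minimal sketch in Python of biased belief updating. The likelihood ratios and the discount factor are invented numbers, purely for illustration; nothing below comes from the actual study. The only assumption is that a biased reader discounts evidence that points against their current belief.

    def update_odds(prior_odds, likelihood_ratio):
        # One step of Bayesian updating, in odds form.
        return prior_odds * likelihood_ratio

    def biased_update(prior_odds, likelihood_ratio, discount=0.5):
        # Discount evidence that points against the current belief,
        # shrinking its likelihood ratio towards 1 ("uninformative").
        favours_belief = (likelihood_ratio > 1) == (prior_odds > 1)
        if not favours_belief:
            likelihood_ratio = likelihood_ratio ** discount
        return prior_odds * likelihood_ratio

    pro_study, anti_study = 2.0, 0.5  # equally strong, opposing evidence

    for label, prior in [("supporter", 3.0), ("opponent", 1 / 3.0)]:
        fair = update_odds(update_odds(prior, pro_study), anti_study)
        biased = biased_update(biased_update(prior, pro_study), anti_study)
        print(label, "fair:", round(fair, 2), "biased:", round(biased, 2))

Fair updating leaves both readers exactly where they started, because the two studies cancel out. Biased updating pushes each reader further towards their original view, which is the pattern the Stanford experiment found.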

Understanding opposing views only helps if you can overcome confirmation bias; otherwise you're digging a deeper hole. For someone who likes to read and thinks they're pretty open-minded, that's a scary thought. I admire people who are well-read, but if they aren't overcoming confirmation bias, all they're doing is overcommitting to whatever they already believe.

So it's not enough to read what we disagree with. We should come up with the strongest version of the opposing argument, and then update our beliefs based on that. And don't be afraid to say "I was wrong". If you can say you were wrong, then you can start figuring out what it takes to be right.

Most of us have an opinion on everything, but how many of us do the work to earn that opinion?

It's hard work reading up on a topic, working through the arguments for and against, and then weighing up the probability that your position is the right one. It's much easier to fool yourself into believing you did the work and never bother to check whether you were right in the first place.

Take the time to understand the problem before you come up with a solution.

And maybe, just maybe, when someone asks you about a topic you've only heard of via 15-second news bites, don't have an opinion. Just say "I don't know" or "I'm still forming an opinion". Don't be a chauffeur who memorises opinions designed for applause. Know why you're getting to the answer, not only how to get there.

The map is not the territory

It's easy to forget that the word for something represents the thing, rather than being the thing itself. If I say the word dog to you, your brain thinks of a four-legged animal covered in hair that barks – but the word D-O-G isn't a dog, it's your mental shortcut for a dog. And a useful shortcut at that: we don't want to spend 30 seconds describing a four-legged animal that barks before we show our friends the latest doge meme.

But D-O-G is not our four-legged friend, and the map is not the territory. If you think something will happen and it doesn't, you were wrong. Nothing weird happened; you predicted incorrectly.

Maps are not the territory because maps represent reality. If maps were reality, a map of Australia would be the size of Australia.

The same is true for your models of reality: if what you believe doesn't fit with what you see, you are wrong. Not reality. Our ideas are not concrete, and we shouldn't treat them as if they were. Mental models are abstractions away from reality, and it's wrong to treat them as anything else. Throw away any mental models that don't fit reality, and update the ones that distort your predictions.

Words are both ambiguous and the best tool we've got. Reification is when we treat an abstraction (like a word) as if it were real.

Remember, D-O-G != four-legged hairy barking animal.

For my fellow etymology nerds out there... Reification comes from the Latin word res meaning thing and -fication, a suffix related to facere meaning to make. So it loosely means thing-making.

Reification is a natural part of our vernacular. Humans love to personify. Look at how we personify nature in "water wants to flow downhill". Yes, it makes the idea easier to remember, but water doesn't want to flow downhill, at least not the way I want my morning coffee. Water flows downhill because of physics, not desire.

So next time you're arguing with someone about the definition of a word, gently remind them the map is not the territory. Words are nothing more than abstractions from reality. Abstractions that can be twisted and tweaked to fit your models. And no, please don't pull out a dictionary. Just because some people can use words to explain things better than others doesn't mean we aren't all using words to make life easier. 

D-O-G != four-legged hairy barking animal.

The map is not the territory and we should keep a diligent eye on our use of reification when we are reasoning from first principles. Words can be misleading.

The Many-Faced God and the Free City of Bias

Like the Faceless Men of Braavos, cognitive biases have many faces. Our brains developed simple, efficient rules that let us make quick decisions: heuristics.

Normally heuristics serve us well, but not always. When your brain takes shortcuts, sometimes there are debts owed. The good news is those debts are consistent: we're Predictably Irrational. The bad news is it's hard to know when you're being irrational.

The availability heuristic, for example, is our tendency to estimate how likely (or frequent) an event is based on the ease with which a particular idea can be brought to mind. 

When an infrequent event is easy to recall, we tend to overestimate its likelihood. Violent deaths like murders are publicised, so they have high availability. Conversely, common but mundane causes of death are hard to bring to mind, so we think they're less likely to happen than getting murdered.

Keep in mind, you're more likely to die from a cardiovascular disease than a serial killer.

It gets worse. One of my favourites is the conjunction fallacy. If I ask you which of the following is more probable:

  • a world class boxer losing the first round; or
  • a world class boxer losing the first round but winning the match.

Chances are, you pick the second situation, and you'd be wrong. Your brain thinks a world class boxer making a comeback is more typical than them only losing the first round, so you overestimate the probability. You buy into the narrative despite the likelihood being greater in the first situation than the second.

Think about it: for the second option to be true, the first must also be true. The second involves two probabilities multiplied together; the first involves only one. If you ignore the numbers, the second situation feels intuitively normal, so you pick it.
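
You can see the arithmetic with a couple of made-up numbers (purely illustrative, not real boxing statistics):

    # Hypothetical figures: a world class boxer rarely loses the first
    # round, but often recovers when they do.
    p_lose_round_one = 0.10
    p_win_match_given_lost_round_one = 0.60

    p_option_one = p_lose_round_one                                      # 0.10
    p_option_two = p_lose_round_one * p_win_match_given_lost_round_one   # 0.06

    # A conjunction can never be more probable than either of its parts:
    # P(A and B) = P(A) * P(B | A) <= P(A).
    assert p_option_two <= p_option_one

Whatever numbers you plug in, the second option can never beat the first, because multiplying by a probability of at most 1 can only shrink the result.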

Another example:

Meet Peter. He's shy, loves reading books, and is frequently in the library. Is Peter more likely to be a librarian or a salesman?

That's right, a salesman – there are far more salesmen than librarians. Don't fall for base rate neglect. And if you've spent resources on something in the past, it doesn't mean you should keep caring about the sunk cost.
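
A quick Bayes' rule sketch shows why the base rate dominates. The numbers below are invented purely for illustration: I'm assuming salesmen outnumber librarians 100 to 1, and that Peter's profile fits most librarians but few salesmen.

    # Invented base rates and likelihoods, purely for illustration.
    n_librarians, n_salesmen = 1, 100    # 100 salesmen per librarian
    p_profile_given_librarian = 0.90     # most librarians fit Peter's profile
    p_profile_given_salesman = 0.05      # few salesmen do

    # Bayes' rule: weight each hypothesis by base rate times likelihood.
    w_librarian = n_librarians * p_profile_given_librarian   # 0.9
    w_salesman = n_salesmen * p_profile_given_salesman       # 5.0

    p_librarian = w_librarian / (w_librarian + w_salesman)
    print(round(p_librarian, 2))  # ~0.15: Peter is still probably a salesman

Even a profile that's 18 times more typical of librarians can't overcome a 100-to-1 base rate.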

Knowing about biases is insufficient. You need to internalise the ideas, so System 2 becomes System 1. No one thinks they're affected by bias – but everyone is. It's easy to see biased thoughts in others, and almost impossible in yourself.

Don't think that knowing about biases makes you less prone to them; you'll just become overconfident and suffer from confirmation bias even more. We can still improve, but it isn't enough to go to Wikipedia and read through a list of cognitive biases. We need a deep, internalised understanding of each bias.

What does it mean to be rational?

Usually, when we think about rational people, we think of a hyper-intellectual but emotionally stunted person. Think Sherlock Holmes, who can solve any mystery but can't seem to keep his relationships together. He relies on causal chains of logic, not intuition and impulse.

Rationality is about making the best decision you can with whatever information you have. It means that if I dropped you into a field you know nothing about, you could still make the best decisions possible with the limited information available. However bad the situation, a rational person makes the best decision they know how to.

It's not about ignoring your emotions or hard-fought intuitions. If you think about it, being more in touch with your emotions makes life easier. And intuition can be life saving.

We've all heard the story of the firefighter who was battling a kitchen fire and felt the urge to pull his team out, without knowing why. The team rushed out, and seconds later the floor collapsed into the basement below. It wasn't a kitchen fire. It had started in the basement.

This story comes from Gary Klein, who spent two years analysing how firefighters predict danger.

"The first thing was that the fire didn't behave the way it was supposed to," Klein said. "Kitchen fires should respond to water - this one didn't. He told me that he always keeps his earflaps up because he wants to get a sense of how hot the fire is, and he was surprised at how hot this one was. A kitchen fire shouldn't have been that hot."

"Often a sign of expertise is noticing what doesn't happen, and the other thing that surprised him was that the fire wasn't noisy. It was quiet, and that didn't make sense given how much heat there was."

Rational means knowing when not to overthink things and when to trust your intuition: in the places where you've built up ample experience through thousands of hours of deliberate practice. Sometimes conscious deliberation makes us perform worse, not better.

The key thing to understand is what Daniel Kahneman calls System 1 and System 2. System 1 is fast, automatic thinking, and System 2 is slow, deliberate thinking. When you practice a skill enough, it moves from System 2 to System 1. Think about when you first learned 2 + 2 = 4; it took time to figure out. Now it's instant. If you want to learn more, read Thinking, Fast and Slow.
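
A loose programming analogy (mine, not Kahneman's) is memoisation: the first computation is slow and deliberate, but once the answer is cached, recall is instant.

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def fib(n):
        # On the first call, each value has to be worked out ("System 2").
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    fib(30)  # computed step by step the first time
    fib(30)  # now a single cached lookup ("System 1")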

A well-trained rationalist relies on System 1, their intuition. But only because they've spent thousands of hours cultivating the ability to overcome bias using their System 2 thinking.

Unfortunately for you and me, System 1 is terrible at things until you've trained it. Untrained intuition is nothing more than guessing; that's why we shouldn't rely on hunches in fields we know nothing about.

Fortunately for you and me, we are Predictably Irrational – we make the same mistakes over and over.

We can't trust our gut to overcome confirmation bias without help, but we can try to train our brains so System 2's slow, deliberate decision-making becomes System 1.

You're biased

It's no secret, but for some reason it rarely comes up when we talk (and heaven forbid you tell someone they're biased), and even fewer people ask what we should do about it. Bias is ingrained in us, and Facebook's news feed serving up more of what we want to see (with inner workings unseen) sure isn't helping.

Imagine buying a lottery ticket. The chances of winning are one in a million (in reality, the odds are much worse). Perhaps you win, and you correctly conclude it was nothing more than luck. Or perhaps you happen to think the numbers you picked had something to do with it.

This error isn't the cost of incomplete knowledge. Your estimate (if you think your numbers made you win) will be incorrect on average, even if you won this time. You're suffering from hindsight bias.

Now imagine you knew what the numbers would be before the draw and bought your ticket accordingly. Your one-in-a-million chance of winning becomes a guarantee. But that's a different game, one that's biased in your favour. And when your method of learning about something is biased (whether in your direction or another), more information doesn't help. If you knew everyone else's numbers (but not the winning numbers), that wouldn't help you win. More information isn't always the answer; most often we want the right information.
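
Here's a small sketch of that idea in Python. The setup is invented: a population whose true mean is 0, sampled through a biased channel that only ever shows us confirming (positive) values. The point is that piling on more samples never gets you to the truth.

    import random

    random.seed(0)

    # True population mean is 0, but the biased channel flips every
    # negative value so we only ever "see" positive evidence.
    def biased_sample():
        x = random.gauss(0, 1)
        return x if x > 0 else -x

    for n in (10, 1_000, 100_000):
        estimate = sum(biased_sample() for _ in range(n)) / n
        print(n, round(estimate, 3))
    # More data sharpens the estimate around ~0.8, never the true 0.

A biased way of gathering information stays biased no matter how much of it you gather; the estimate just gets more confidently wrong.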

So when we spend time learning, we want to ensure new knowledge helps us, rather than helping us believe what we already believe more deeply. Whether we like it or not, we tend to interpret new information as confirmation of what we already believe. Confirmation bias is everywhere. That's a shame, because being wrong is a good thing.

Hindsight bias and confirmation bias are both cognitive biases: systematic errors in how we think. We as in you and me, and the rest of the population; not dumb and dumber. Confirmation bias skews your beliefs so they represent reality less accurately, even when you're shown information that disproves what you believe. Decision-making is hard enough without having to fight your own brain.

The first thing is to accept that you're biased like everyone else. 

No, not biased in the way we accuse each other of being when trying to win an argument. Cognitive biases are part of being human, a feature and a bug.

And knowing about biases doesn't necessarily make them easier to overcome. If you can't trust your brain (confirmation bias), how can you trust anything?

Well, we can start by internalising what confirmation bias means, then we can introduce it into our vernacular, and finally we can try to overcome it and the multitude of other errors our minds have built in.


Blindspots

I love looking at how companies communicate what they're doing and why you should buy it. One problem I find is that a lot of companies focus on what they think is important, not on what their customers think is important.

While you might care about how technically difficult your product was to build, or how you're using the latest technology, your users probably don't. Most people don't even understand how to use keyboard shortcuts, let alone open up something in Visual Studio.

It's an all-too-common messaging mistake that even the most brilliant companies make: once you've developed a product that meets your needs – and many great products start out that way – how do you market it to a population that isn't like you at all? It's surprisingly hard to make a product appealing to the normals.

Amazing technology is amazing for us, and it often receives a lot of funding (and well-deserved props) in startup circles, but we quickly forget about the normals, the rest of the world. We live and breathe the Internet, but a lot of people don't. 

The Internet has changed everything, and it will continue to do so. We know that, but everyone else doesn't.

Ten years before the iPhone, you had to get out a map to get places. Five years before, you'd hop onto Google Maps and preplan your route. Now, you walk out of the house with the best map in the world, in your pocket (or on your wrist). No doubt, it's amazing technology – but where's the real value coming from? It's the map. 

In short, the people who use your product today are nothing like the people who built it. If you're successful, the normals will displace the fanatics. 

So while your product might be an essential component of your life, it's only a welcome addition for everyone else. We must make sure we communicate clearly, even to the normals. Don't get blindsided by big words and industry lingo. The best message is simple. Look at Donald Trump. Whether you like him or not, his ability to use simple language is world-class. It's something we should all strive towards - simple, clear language that anyone can understand.

What I'm reading online (bringing back the blog roll)

Beyond books, I spend a fair amount of time reading online, mostly blogs. Blogs fill the space between news and books: they're not as minute-by-minute as news, and they're not quite a book. The ideas on blogs are forever updating, and in some ways that beats books.

Reading books and blogs is my way of seeking out new ways to see reality. I'm particularly interested in how people think and how technology businesses work. Through my travels, I've put together a list of what I think are some of the better reads online. This list will likely shrink and grow with time.

What I've been reading

This page is dedicated to the books I've read. If you think there's something I should explore, please suggest it in the comments.

I used to read one book every few months or so until I found audiobooks and realised I could speed them up to 3x (like I do with podcasts) and still get good comprehension (probably better than when I read from a book). Since then, I've been "reading" while riding the train to work, at the gym, and pretty much any time I can stick my AirPods in my ears.

It's probably the single most valuable thing I can tell you. It sounds like an ad, but you should consider downloading Audible and listening to books at 3x while commuting, walking, relaxing, or any other time you can't sit down and read.

Below is a list of the books I've read thus far, in order. If you'd like me to dive deeper into any of these books, please let me know in the comments.

2018

  1. Michael Jordan, by Roland Lazenby
  2. Deep Work, by Cal Newport
  3. The Art of Thinking Clearly, by Rolf Dobelli
  4. The Art of Learning, by Josh Waitzkin
  5. The Inner Game of Tennis, by W. Timothy Gallwey
  6. Predictably Irrational, by Dan Ariely
  7. Superforecasting, by Philip Tetlock

2017

  1. The Everything Store, by Brad Stone
  2. Hatching Twitter, by Nick Bilton
  3. Steve Jobs, by Walter Isaacson
  4. The Black Swan, by Nassim Nicholas Taleb
  5. The Art of Learning, by Josh Waitzkin
  6. 48 Laws of Power, by Robert Greene
  7. The Most Important Thing, by Howard Marks
  8. The Rise of Superman, by Steven Kotler
  9. Benjamin Franklin, by Walter Isaacson
  10. The Brain That Changes Itself, by Norman Doidge MD
  11. The Art of the Deal, by Donald J. Trump
  12. Grit, by Angela Duckworth
  13. Outliers, by Malcolm Gladwell
  14. Charlie Munger, by Tren Griffin
  15. The Upstarts, by Brad Stone
  16. The Marshmallow Test, by Walter Mischel
  17. We Learn Nothing, by Tim Kreider
  18. The Code of the Extraordinary Mind, by Vishen Lakhiani
  19. Surely You're Joking, Mr. Feynman!, by Richard P. Feynman
  20. Essentialism, by Greg McKeown
  21. Homo Deus, by Yuval Noah Harari
  22. Business @ the Speed of Thought, by Bill Gates
  23. World Order, by Henry Kissinger
  24. Basic Economics, by Thomas Sowell
  25. Team of Teams, by General Stanley McChrystal
  26. Flash Boys, by Michael Lewis
  27. Liar's Poker, by Michael Lewis
  28. The Big Short, by Michael Lewis
  29. The Hard Thing About Hard Things, by Ben Horowitz
  30. Andrew Carnegie, by David Nasaw
  31. On Writing, by Stephen King
  32. Writing Tools, by Roy Peter Clark
  33. The Fountainhead, by Ayn Rand
  34. Atlas Shrugged, by Ayn Rand
  35. Thinking, Fast and Slow, by Daniel Kahneman
  36. Siddhartha, by Hermann Hesse
  37. Nudge, by Richard Thaler
  38. Misbehaving, by Richard Thaler
  39. Influence, by Robert Cialdini Ph.D.
  40. Fooling Some of the People All of the Time, by David Einhorn
  41. The Remains of the Day, by Kazuo Ishiguro
  42. Fooled by Randomness, by Nassim Nicholas Taleb
  43. Our Oriental Heritage, by Will Durant
  44. The Life of Greece, by Will Durant
  45. Caesar and Christ, by Will Durant
  46. How Will You Measure Your Life?, by Clayton M. Christensen
  47. The Innovator's Solution, by Clayton M. Christensen
  48. Swann's Way, by Marcel Proust
  49. Peak, by Anders Ericsson
  50. Republic, by Plato
  51. Principles, by Ray Dalio
  52. Nicomachean Ethics, by Aristotle
  53. 1984, by George Orwell
  54. Fahrenheit 451, by Ray Bradbury
  55. Heart of Darkness, by Joseph Conrad
  56. Zero to One, by Peter Thiel
  57. Hillbilly Elegy, by J. D. Vance
  58. The 22 Immutable Laws of Marketing, by Al Ries
  59. Perennial Seller, by Ryan Holiday
  60. Learn or Die, Edward D. Hess
  61. Originals, by Adam Grant
  62. An Everyone Culture, by Robert Kegan
  63. The Buried Giant, by Kazuo Ishiguro
  64. Capital in the Twenty-First Century, by Thomas Piketty
  65. Leonardo Da Vinci, by Walter Isaacson
  66. Propaganda, by Edward Bernays
  67. Crystallizing Public Opinion, by Edward Bernays
  68. Bored and Brilliant, by Manoush Zomorodi
  69. Gut, by Giulia Enders
  70. Deep Work, by Cal Newport
  71. Win Bigly, by Scott Adams
  72. Impossible to Ignore, by Carmen Simon Ph.D.
  73. Yes! 50 Secrets from the Science of Persuasion, by Robert Cialdini Ph.D.
  74. The Goal, by Eliyahu M. Goldratt
  75. It's Not Luck, by Eliyahu M. Goldratt
  76. Critical Chain, by Eliyahu M. Goldratt
  77. The Innovators, by Walter Isaacson
  78. American Prometheus, by Kai Bird
  79. Fermat's Last Theorem, by Simon Singh
  80. To Sell Is Human, by Daniel H. Pink

2016

  1. Scientific Advertising, by Claude C. Hopkins
  2. Business Adventures, by John Brooks
  3. The Intelligent Investor, by Benjamin Graham
  4. The Undoing Project, by Michael Lewis
  5. The Grid, by Gretchen Bakke Ph.D.
  6. Chaos Monkeys, by Antonio Garcia Martinez
  7. Sapiens, by Yuval Noah Harari
  8. Deep Work, by Cal Newport
  9. Blue Ocean Strategy, by W. Chan Kim
  10. Zero to One, by Peter Thiel

Writing to learn

Richard Feynman (read Surely You're Joking, Mr. Feynman!) had a brilliant way of learning something new: he'd try to explain it in simple language, language that a toddler could understand. If he got stuck somewhere and couldn't explain an idea simply, he'd go back and learn more until he could.

Given that he was awarded the Nobel Prize in Physics in 1965, I assume it works. Given that he had time to learn how to pick locks and play the bongo drums, I'd assume it works well.

In honour of Mr Feynman, and for my own selfish benefit, I want to use Feynman's technique to learn more about my own cognitive biases by writing about them, and hopefully overcome them. 

My intention is to improve my own thinking and decision-making by taking what I learn when I read and communicating it to you in simple language, ideally with actionable insights you can apply to your daily life. I think once you get your daily life under control, you can start solving the big, difficult, important problems most people never get to (because they haven't built systems to deal with the minutiae).

It's easy to find a list of cognitive biases, just look to Wikipedia. It's a little harder to understand the theory, but again, still pretty easy. What's hard is learning how to practice seeing your biases. I haven't figured that part out yet either...

I think writing about it will help.

What I currently believe is that you need to internalise biases, not just know them by name. As Josh Waitzkin says, you need to study numbers to leave numbers. Ideally, you'd have a community of people willing to learn with you; I hope this blog becomes that community.

I also believe that taking the time to improve your decision-making will pay dividends. It's a valuable pastime, and one that's not taught in schools, at least not in a systematic way. A lucky few pick it up along the way, pulling together information from here and there. Or they learn it indirectly through another discipline, like science or engineering, where you're taught to say "I was wrong" and then work out why your initial hypothesis didn't hold.

But you don't need to work in a lab to benefit from the scientific method. Being able to say you were wrong is valuable, yet no one wants to do it. If you can't say "I was wrong" and then figure out where to go from there, what's the point of reading more books? Confirmation bias is killing the new information anyway. Learning about our biases can help us stop doing that.

Decision-making is hard to learn, but it's the bedrock of our economy, and it's fascinating. It just takes looking at the same thing from multiple angles until it clicks. That's what I'm trying to do.

In the end, one thing matters: that it improves our lives. Who knows, I could be wrong; maybe I'll never overcome my biases, and maybe the best you can hope for is knowing each bias by name. If I am wrong, I hope I'll be able to say "I was wrong" and figure out what to write about from there.