This reading is a webcomic that can be found at the following link:
https://theoatmeal.com/comics/believe
If swearing bothers you, you can read the "clean version" instead.
If you need a screenreader-friendly version of this comic, please email me and I will send it to you. I've created a text-only version for accessibility purposes, but because the comic is under copyright, I cannot post it publicly on our class website.
The following material is from www.yourbias.is (licensed CC BY-NC-ND).
You favor things that confirm your existing beliefs.
We are primed to see and agree with ideas that fit our preconceptions, and to ignore and dismiss information that conflicts with them. You could say that this is the mother of all biases, as it affects so much of our thinking through motivated reasoning. To help counteract its influence, we ought to presume ourselves wrong until proven right.
Think of your ideas and beliefs as software you're actively trying to find problems with rather than things to be defended. "The first principle is that you must not fool yourself – and you are the easiest person to fool." - Richard Feynman
Further reading:
Wikipedia | You Are Not So Smart | Thinking, Fast and Slow
If a conclusion supports your existing beliefs, you'll rationalize anything that supports it.
It's difficult for us to set aside our existing beliefs to consider the true merits of an argument. In practice this means that our ideas become impervious to criticism, and are perpetually reinforced. Instead of thinking about our beliefs in terms of 'true or false', it's probably better to think of them in terms of probability. For example, we might assign a 95%+ chance that thinking in terms of probability will help us think better, and a less than 1% chance that our existing beliefs have no room for any doubt. Thinking probabilistically forces us to evaluate more rationally.
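To make the idea of assigning probabilities to beliefs concrete, here is a minimal sketch of Bayesian updating in Python (my own illustration, not part of the comic or the yourbias.is material; the numbers are hypothetical):

def update_belief(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return the probability of a belief after weighing one piece of evidence (Bayes' rule)."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Hypothetical numbers: a belief held with 95% confidence meets evidence that is
# three times more likely if the belief is false. The honest response is to revise
# the belief down to roughly 86%, not to defend it or hold it more strongly.
posterior = update_belief(prior=0.95, p_evidence_if_true=0.2, p_evidence_if_false=0.6)
print(f"Belief after disconfirming evidence: {posterior:.0%}")

The point is not the particular numbers, but that treating a belief as a probability gives disconfirming evidence somewhere to go other than being dismissed outright.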
A useful thing to ask is 'when and how did I get this belief?' We tend to automatically defend our ideas without ever really questioning them.
Further reading:
Wikipedia | http://amzn.to/2vHa0Tx
You see personal specifics in vague statements by filling in the gaps.
Because our minds are given to making connections, it's easy for us to take nebulous statements and find ways to interpret them so that they seem specific and personal. The combination of our egos wanting validation with our strong inclination to see patterns and connections means that when someone is telling us a story about ourselves, we look to find the signal and ignore all the noise.
Psychics, astrologers and others use this bias to make it seem like they're telling you something relevant. Consider how things might be interpreted to apply to anyone, not just you.
You unfairly favor those who belong to your group.
We presume that we're fair and impartial, but the truth is that we automatically favor those who are most like us, or belong to our groups. This blind tribalism has evolved to strengthen social cohesion; however, in a modern and multicultural world it can have the opposite effect.
Try to imagine yourself in the position of those in out-groups, whilst also attempting to be dispassionate when judging those who belong to your in-groups.
Further reading:
Wikipedia | You Are Not So Smart
When some aspect of your core beliefs is challenged, it can cause you to believe even more strongly.
We can experience being wrong about some ideas as an attack upon our very selves, or upon our tribal identity. This can lead to motivated reasoning, which causes beliefs to be reinforced despite disconfirming evidence. Recent research shows that the backfire effect certainly doesn't happen all the time. Most people will accept a correction relating to specific facts; however, the backfire effect may reinforce a related or 'parent' belief as people attempt to reconcile a new narrative in their understanding.
“It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.” - Mark Twain
Further reading: