Biases, Fallacies, and Errors in Reasoning II

Here's a list of some of the most common mistakes that we make when we reason:

Motivated Reasoning: People criticize preference-inconsistent information with excessive skepticism, while lowering those critical standards for information that would corroborate favored beliefs. That is, they are more critical and demand a higher level of evidence for conclusions that conflict with things they already believe, and they are less thoughtful or skeptical when the evidence supports their favored views. A previously held belief steers the search for and analysis of information, rather than an unbiased gathering and evaluation of evidence leading to the most reasonable conclusion. A motivated reasoner may or may not commit confirmation bias as well: motivated reasoning is reasoning that is directed at achieving a particular conclusion, no matter what the truth or the evidence indicates, while confirmation bias is a particular kind of filtering of the evidence that leads to a conclusion.

Confirmation Bias is the mistake of looking for evidence that confirms a favored conclusion while neglecting or ignoring evidence that would disprove it. Susan reads her horoscope and it tells her that Virgos are outgoing. She believes in astrology, and she searches her memory for times when she’s been outgoing. She thinks of a few cases where it seemed to be accurate and concludes that astrology works. Juan thinks that he has prescient dreams, or dreams that tell the future. Out of the thousands and thousands of dreams he’s had over the years, he easily recalls the one or two times that he dreamt about some event and then the next day it seemed to happen. He fails to notice that the vast majority of his dreams did not work out like this. Those thousands of other dreams are easily forgotten and neglected.
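
Juan's case is partly a matter of arithmetic: even if dream "hits" happen purely by chance, thousands of dreams will produce a few of them. The sketch below is a toy illustration using made-up numbers (5,000 dreams and a 1-in-2,000 chance that any given dream loosely matches the next day's events), not a model of any actual data.

```python
import random

random.seed(1)

dreams = 5_000             # hypothetical number of dreams over many years
p_chance_match = 1 / 2000  # hypothetical chance that any one dream "matches" the next day

hits = sum(random.random() < p_chance_match for _ in range(dreams))

print(f"Chance 'prophetic' hits: {hits}")           # typically a handful
print(f"Forgotten misses:        {dreams - hits}")  # the thousands Juan never counts
```

Even with no prophetic power at all, a couple of memorable hits are exactly what we should expect; the error is counting the hits and neglecting the misses.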

You're Not So Smart: Confirmation Bias

The Texas Sharpshooter Fallacy is the mistake of observing some event that is otherwise unremarkable, neither unusual nor improbable, and then retroactively constructing an argument for its improbability, unusualness, or uniqueness. The Texas sharpshooter fires a gun at a barn, and then draws a target around the bullet hole. Heather is in love with Kyle. She marvels over the fact that if she had not been in that Starbucks, standing in line at just that moment, and if he had not had to stop at the stoplight just when he did, arriving at Starbucks at just the right moment, then they never would have met.

You're Not So Smart: The Texas Sharpshooter Fallacy

Going Nuclear is the mistake of artificially elevating the standards of proof against an opposing position and arguing that it is unjustified unless it is known with absolute, deductive, or excessive levels of certainty, particularly when one cannot meet those standards oneself, or when those standards are not met by many other claims that we take to be justified. A Congressman is skeptical about the evidence for global warming. He hears many highly qualified scientists review the evidence. They explain that there is a widespread consensus among the best experts in the scientific field that global warming is happening. But the Congressman resists, saying, "Well, perhaps, but in the end is it really possible to prove it? I mean, can you prove with absolute certainty that it is happening? Especially when you might possibly be wrong?" Meanwhile, the Congressman's doctor tells him that the test results indicate a good chance that he has a bacterial infection, so the Congressman agrees to take antibiotics, even though the evidence for the diagnosis is preliminary. Specifically, the Going Nuclear mistake is elevating the standards of proof to a level that cannot be satisfied by any amount of evidence. It invokes global or extreme skepticism in order to refuse a conclusion. Motivated Reasoning, as explained above, is applying different standards of proof to evidence that is for or against a view you already hold.

Philosopher Stephen Law explains Going Nuclear.

The Planning Fallacy: People systematically underestimate how much time, money, energy, and other resources a project will take to complete when they are starting out. And the low projections persist, even when we have lots of contrary evidence from past projects that required more of our time, money, and energy. City councils estimate that building a bridge will take less money and time than it actually takes. Students believe they will be able to take more classes, get better grades, and work more hours at their jobs than they actually can. And people continue to make the mistake even when they have evidence from past failures that they should plan more conservatively.

The Optimism Bias is a cognitive illusion that leads us to overestimate the likelihood of experiencing good events in our lives, and underestimate the likelihood of experiencing bad ones. We think we are smarter, better drivers, more attractive, and healthier than we actually are. And we think we are less likely to get cancer, die young, or be in an accident. In empirical studies, on almost any criterion that is judged as positive, people estimate themselves as rating higher than others, and for negative criteria or events, they judge themselves to be lower than average.

Your Child Is Fat, Mine is Fine: Optimism and Obesity

Tali Sharot's TED talk on Optimism Bias:

The Sunk Cost Fallacy: When a cost in time, money, or resources has already been incurred and cannot be recovered, it is sunk. The sunk cost fallacy is the mistake of treating a sunk cost in the past as relevant information in a present or future decision.

Smith might order a big steak dinner at a restaurant and find that he's full after eating half of it. But he presses on and eats the rest, even though it makes him feel overly full and he doesn't need the calories, because he thinks that since he's paid for it, he should finish it. Yet the bill for the dinner will be the same whether he eats the whole steak or half of it.

Jones has been working at her job in an office for two years. She doesn't like the job, and she could get another job in another office that would have more opportunities. But she sticks with her job because she feels she has invested so much time and energy into the company already. Her mistake is failing to see that the time and energy she has put in are already gone, whether she continues to work there or not.

Mitchell buys a ticket to a major league baseball game, but starts feeling very sick in the third inning. He refuses to leave early and go home to rest or go to the doctor because he wants to get his money’s worth out of the expensive ticket. Mitchell’s mistake is failing to recognize that the ticket cost him the same amount whether he watches all or part of the baseball game. There is no additional value from the ticket except perhaps the pleasure he would have from watching more innings. But that pleasure has been lost by his being sick. In fact, he might be worse off staying at the game.

When the notion of "getting our money's (or time's, or energy's) worth" comes up, we should ask ourselves, "Is the cost I've incurred here recoverable?" If not, then there may be no more value to derive from the situation.
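
One way to make the principle concrete is that a rational comparison of options should count only the costs and benefits that are still changeable; the sunk cost is the same on every option, so it cancels out. The sketch below is a toy illustration of Mitchell's case with made-up "utility" numbers, not a formal decision model.

```python
# Toy comparison for Mitchell's decision, with made-up utility numbers.
ticket_price = -60        # sunk: paid either way, so it affects every option equally
stay_while_sick = -20     # discomfort from sitting through the game while sick
go_home_and_rest = 10     # relief from resting (or seeing a doctor)

# Correct comparison: only the future, still-changeable values matter.
best_choice = "leave" if go_home_and_rest > stay_while_sick else "stay"
print(best_choice)  # -> leave

# The fallacy adds the sunk ticket price to only one side ("if I leave, I
# waste the $60"), but the $60 is gone no matter which option he picks.
```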

Freakonomics Podcast: The Upside of Quitting

Train Yourself to See Sunk Costs

Gambler's Fallacy: With random processes like a slot machine, rolling dice, spinning a roulette wheel, or drawing a card from a shuffled deck, the different trials (spins, rolls, draws) are independent. That is, each pull of the slot machine arm or roll of the dice is causally separate from the others. There is no memory, record, or causal influence of one roll on the next. The dice don't remember what they rolled previously. The slot machine or the lottery card you scratch off today doesn't keep a record of what happened in previous cases. So the odds of winning or of getting a particular outcome don't increase over time. Your odds of winning the lottery or of rolling double sixes don't improve because those outcomes haven't come up for a while. The dice, the slot machine, or other independent events in your life don't become more likely simply because they haven't been happening. The dice aren't "due to win" when they've been losing for a while.
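
A quick simulation makes the independence point concrete. The sketch below (plain Python, with an arbitrary 20-roll "losing streak" threshold chosen purely for illustration) estimates the chance of rolling double sixes in general and the chance of rolling them immediately after a long streak without them; both estimates come out near 1/36.

```python
import random

def double_six():
    """Roll two fair dice; True if both show six (probability 1/36)."""
    return random.randint(1, 6) == 6 and random.randint(1, 6) == 6

trials = 1_000_000
drought = 0                 # consecutive rolls without double sixes so far
hits = after_drought_hits = after_drought_rolls = 0

for _ in range(trials):
    long_drought = drought >= 20
    result = double_six()
    hits += result
    if long_drought:
        after_drought_rolls += 1
        after_drought_hits += result
    drought = 0 if result else drought + 1

print(f"P(double six), all rolls:       {hits / trials:.4f}")
print(f"P(double six) after 20 misses:  {after_drought_hits / after_drought_rolls:.4f}")
print(f"Theoretical value (1/36):       {1 / 36:.4f}")
```

A long losing streak doesn't make the next roll any more likely to win; the two estimated probabilities agree.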

Begging the Question is the mistake of assuming what is to be proven as part of the evidence or argument in its favor. A man’s bumper sticker says, “God said it in the Bible, I believe it, that settles it.” Someone asks him, “Why do you believe the Bible?” He says, “because God wrote it.” Then someone asks him, “Why do you think that God wrote it?” and he says, “because it says so in the Bible.” The point that needs to be supported or proven is being used to prove itself.

You're Using "Begs the Question" Wrong!

Anchoring Bias: A number, such as a manufacturer's suggested retail price or a reserve price on eBay, can be set arbitrarily or randomly, and it will still affect, or anchor, the way that people think about the situation. Consumers can view an item labeled "$99.99 regular price, 25% off sale price" more favorably than the same item labeled with a regular price of $70.00. The introduction of a number, value, or set point, even when it is random or arbitrary, fixes that value in our minds, and we then frame our decisions around it.
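
The arithmetic in that example is worth spelling out: 25% off the $99.99 anchor comes to about $74.99, which is more than the $70.00 "regular" price, yet the anchored framing tends to look like the better deal. A quick check:

```python
anchor_price = 99.99
sale_price = round(anchor_price * (1 - 0.25), 2)  # 25% off the anchored "regular" price
plain_price = 70.00

print(sale_price)                # 74.99
print(sale_price > plain_price)  # True: the "sale" item actually costs more
```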

Daniel Kahneman on Anchoring Bias:

Availability Bias: We tend to conflate how readily ideas, experiences, memories, or information come to mind with their objective likelihood or frequency in the world. We think that shark attacks are more probable when we have been watching Shark Week on the Discovery Channel. We are more worried about car wrecks when someone we know has recently been in one. We mistakenly think that there are more words that start with the letter "r" in English than have "r" as the third letter because we can more easily recall words starting with "r."

Cognitive System 1 and System 2 - Daniel Kahneman

Humans evolved a fast but sloppy (higher-error) system for making decisions, and a slower, more deliberate one. System 1 is very good at making quick judgments that get us out of harm's way immediately. System 2 is the slower set of cognitive processes associated with the frontal lobe. When we reason carefully through a problem, our error rate goes down, but it takes more time.

Daniel Kahneman explains System 1 and System 2 thinking.

Loss Aversion Bias or the Endowment Effect: Research has shown that when people own something, they put a higher value on it than they would if they didn’t own it and were asked how much they would be willing to pay for it. When we want to sell a house or a car or anything else, we optimistically endow it with a higher value. Somehow, magically, our owning something makes it more valuable. The problem is that you are prone to put too high a value on your own stuff, making it harder for you to sell it, or giving you a distorted picture of the world.

More generally, humans are disproportionately concerned about loss compared to how much they favor gain. We have more aversion to loss than we desire an equivalent gain.

Behavioral Economist Daniel Ariely explains Loss Aversion and the Endowment Effect:

Hindsight Bias: The tendency to think that things must have turned out the way they actually did. After some event, we selectively review the past and find information, clues, or reasons that seem to have led to that outcome, increasing the feeling that the outcome was inevitable or foreseeable.

You're Not So Smart: Hindsight Bias

Placebo Effect: People with medical problems will often experience recovery, relief, or improvement from medical treatment or attention over and above any pharmacological or causal effect the treatment has. That is, the belief or expectation that they are going to get better can, by itself, make the person feel better and, in some cases, actually get better.

http://www.humphrey.org.uk/papers/2004Placebo.pdf

http://www.radiolab.org/story/91539-placebo/


The Umpire Effect is a particular form of motivated reasoning. Enthusiastic fans of sports teams tend to find fault in referee calls that go against their team and accuse the umpire of bias, but they will be more satisfied with referee calls that favor their team. Conservatives often argue that the news media is too liberal, while liberals insist that the news media is too conservative.

The Sliding Scale Fallacy: Motivated reasoners apply a more skeptical, critical set of standards to the evidence for views that they oppose, while letting mistakes, sloppiness, or a failure to meet those same standards pass when evidence or arguments are being presented in favor of conclusions they favor. See Motivated Reasoning above.

Actively Open-Minded Thinking: In actively open-minded thinking, we gather the full body of relevant evidence for a question as objectively and with as much indifference to the outcome as possible, we apply the canons of inductive or deductive logic to that evidence as impartially as possible, and then we draw whatever conclusion is implied. AOT is a method for finding the best-justified conclusion that is content- and outcome-neutral.

Upside-down thinking is the opposite of actively open-minded thinking. Here a person begins with a conclusion or a belief, and the search for and inquiry into evidence is then steered by that belief so as to be favorable to it. The belief guides the gathering and analysis of the evidence rather than the evidence motivating the conclusion.

Promiscuous teleologists: People are prone to find more purpose, intention, and design in events than is actually there. Even when there is no deliberate planning or intelligence behind some event or phenomenon, research shows that people are inclined to explain it in terms of some mind acting deliberately, with a goal in mind, to bring it about for a purpose.

HADD: Hyperactive agency detection device. People are prone to attribute agency, consciousness, mental states, or minds to more things in the world than actually have them. "It's better to mistake a boulder for a bear than a bear for a boulder." We have a high false positive rate concerning minds or agents, rather than a high false negative rate.

Entrenchment: Research shows that when faced with counter evidence to a belief, particularly in publicly known or social contexts, rather than lose confidence, people will sometimes express even higher confidence that the belief is true. They will “double down” on the belief rather than revise it and face public embarrassment.

Pareidolia occurs when people attach significance or special meaning to some event or phenomenon that is vague, random, or ordinary, such as the man in the moon, faces in the clouds, or faces in food. The mistake frequently occurs with faces, in part because our brains are highly sensitive to the presence of faces in our environment and prone to producing many false positives.

Supernaturalism: A significant body of research shows that education and belief in magic, religiousness, spiritualism, and other phenomena beyond the natural world (supernaturalism) are inversely correlated. That is, as education increases, supernaturalism decreases. People are prone to explain natural events and phenomena that have physical or chemical causes in terms of entities, beings, powers, or causes that are outside the natural world.

Confusing possible with probable: If some state of affairs contains an explicit or implicit logical contradiction, then it is logically impossible. A married bachelor is logically impossible. As long as some state of affairs does not include a logical contradiction, it is logically possible. The class of naturally possible events comprises events that can happen, given the laws of nature, even ones that are exceedingly rare or improbable. Being struck by lightning on your way home from work is naturally possible, but it is unlikely. That is, many events, even though they are naturally possible, are unlikely or rare; the odds of their happening are less than 50%. Tossing 10 rocks onto a pile and having them land on each other and stack up to form a tower is exceedingly unlikely or improbable, although it is not logically or naturally impossible.

Typically, we need reasons or evidence that make a conclusion probable in order for it to be justified. If the evidence is neutral, or runs against a conclusion, we should be agnostic about it or disbelieve it. In many cases, a person who has a belief and is confronted with skeptical challenges will retreat to arguing that it is possible that her belief is true instead of presenting grounds for thinking that it is probably true. The possibility that there was a 9-11 conspiracy does not make believing in one reasonable. While it is possible that there could be an afterlife, that possibility alone doesn't make the conclusion justified. While it is possible that a given set of lottery numbers could win, it wouldn't be reasonable to believe that they are the winning numbers (unless you have more evidence). Justifications that slip from showing that a belief is probable to merely offering evidence that it is possible confuse possible with probable.
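
Put roughly in probabilistic terms (the lottery size below is a made-up figure, purely for illustration): possibility only requires that an event's probability be greater than zero, while justification requires the evidence to make it more likely than not.

$$P(E) > 0 \ \ (\text{possible}) \qquad \text{vs.} \qquad P(E \mid \text{evidence}) > 0.5 \ \ (\text{probable})$$

For a lottery with $N = 10{,}000{,}000$ tickets, $P(\text{your numbers win}) = 1/N > 0$, so a win is possible; but $1/N$ is nowhere near $0.5$, so believing that you hold the winning ticket is not justified without further evidence.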

Non-disconfirmable hypotheses: A thinker is advocating a non-disconfirmable hypothesis when there are no circumstances, no evidence, no arguments, or no scenarios, even hypothetically, under which she would acknowledge that it has been disconfirmed or that it is false.

Defeasibility: A defeasible belief is one that you are prepared to revise in the light of new evidence. It is responsive to evidence for its justification. Your confidence that it is correct might increase if the new information supports it, but if new information runs contrary to it, then it would diminish your confidence that it is true. That revision could lead to your suspending judgment about it, or even coming to disbelieve it.

A dogmatic belief is one that is not defeasible. Dogmatism is the mistake of having a belief that is not responsive to evidence or argument; one that you do not revise in the light of new information. The mistake is not confined to religious contexts, but it often happens there.