Why do otherwise rational people turn to unproven or implausible treatments? Why does nonsense sell?
Let’s follow the money and turn our attention to economics.
Until recently, economists often thought of economic decision makers as ideal, rational beings.
“Homo economicus” is a rational, self-interested agent who makes logical decisions based on the real probabilities of achieving his or her economic goals. Members of this mythical species are often referred to as “Econs”. Econs maximize utility as consumers and maximize profit as producers. They minimize losses and maximize gains without distorting probabilities. Econs do not exist; humans do.
An improvement to this ideal model was Expected Utility Theory. In the 18th century, Daniel Bernoulli proposed that people make decisions about monetary gambles with respect to their current wealth. He correctly observed that people are basically risk averse. Under the theory, people make decisions to avoid lowering their current wealth, and there is a limit to the risks that they are willing to take.
However, Expected Utility Theory, which assumes that people will be emotionally neutral about their current wealth, was flawed. In 1979, psychologists Daniel Kahneman and Amos Tversky pointed out that people would not be neutral about their current wealth if they used to be wealthier. The emotional feeling of loss distorts their view of their current status. Likewise, those who have achieved recent gains will take a fonder, more prideful view of their current wealth.
Otherwise reasonable people do not anchor their risky decisions specifically to their current wealth, but to their perceptions of gains and losses.
People are loss averse. If they have perceived a significant loss, they will be willing to take risks to recoup the loss. If they perceive a gain in wealth, they will avoid risks to protect the gain.
They showed that our choices are highly influenced by ‘framing’. We can feel good about a glass that is half full, and bad about a glass that is half empty. We act one way when a situation is presented as a loss, and another way when the same situation is presented as a gain. We can and do make contradictory choices about identical situations according to the framing effect.
Kahneman and Tversky also pointed out that the weights we place on decisions are not linear in the actual probabilities. We weigh very low probabilities more heavily than we should, and higher probabilities less heavily than we should.
They called their theory of decision-making Prospect Theory.
From this, skeptical doctors should accept some fundamental points about human decision making.
We are not rational deciders.
- We think in terms of gains and losses with respect to our perceived and expected status quo.
- We fear losses more than we like gains.
- We distort probability according to how a prospect is framed (in terms of gains and losses).
- Our choices are highly dependent on how scenarios are framed.
- We overvalue probability (risk) when thinking in terms of loss (risk seeking).
- We undervalue probability when thinking in terms of gains (risk averse).
- These heuristics may lead to irrational health choices by patients and doctors alike.
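The loss-aversion asymmetry summarized above can be sketched numerically. The value function and parameters below (alpha ≈ 0.88, lambda ≈ 2.25) are the estimates Tversky and Kahneman published in 1992; the code itself is only an illustration, not part of the original argument.

```python
# A sketch of the prospect-theory value function: outcomes are valued as
# gains and losses relative to a reference point (the status quo), not as
# total wealth. alpha ~ 0.88 and lam ~ 2.25 are the Tversky-Kahneman
# (1992) estimates, assumed here for illustration.

def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss x relative to the status quo."""
    if x >= 0:
        return x ** alpha           # gains: diminishing sensitivity
    return -lam * (-x) ** alpha     # losses: steeper by the factor lam

# Loss aversion: a $100 loss hurts more than a $100 gain pleases.
print(value(100))    # about 57.5
print(value(-100))   # about -129.5
```

The loss looms more than twice as large as the same-sized gain, which is exactly the asymmetry the bullet points describe.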
We can use examples of monetary gambles to illustrate these quirks in human decision making. Their expected values are easy to calculate. Using scenarios such as these, Kahneman and Tversky confirmed experimentally that people actually do make choices according to prospect theory.
In the following scenarios, choose A or B. In other words, which choice is more desirable?
1. A. 80% chance to win $1000.
B. $700 for sure
This choice is framed in terms of a high chance of significant gain. Most people will favor risk aversion and choose B. However, the mathematical expected value of A is $800 (80% x $1000). An Econ would choose A.
2. A. 80% chance to lose $1000 (and 20% chance of losing nothing).
B. Lose $700 for sure.
This is framed in terms of a high risk of significant loss. Most will favor risk seeking and take A. However, the expected value of A is a loss of $800 (80% x -$1,000). B is the better economic choice.
3. A. Bet $10 on a .1% chance to win $9,000.
B. Do nothing.
This scenario is framed in terms of a small chance of significant gain. Many will favor risk seeking and choose A. The economic choice is B, because the expected value of A is a loss of $1 ([.1% x $9,000] - $10).
4. A. 1% chance to lose $100,000.
B. Pay $1,100 for insurance against a 1% chance to lose $100,000.
This choice is framed in terms of a small chance of significant loss. Most will favor risk aversion and choose B. By now we can see that the economic choice is A, because its expected loss is $1,000 (1% x $100,000), less than the $1,100 premium. However, most of us would sleep better with choice B.
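Since the expected values of these gambles are easy to calculate, a few lines of code can check all four scenarios at once. This is a minimal sketch; the `expected_value` helper is invented here for illustration.

```python
# Expected values of the four gambles above. An "Econ" simply picks the
# option with the better expected value.

def expected_value(outcomes):
    """outcomes: a list of (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

print(expected_value([(0.80, 1000)]))        # 1A: 800.0, beats $700 for sure
print(expected_value([(0.80, -1000)]))       # 2A: -800.0, worse than -$700 for sure
print(expected_value([(0.001, 9000)]) - 10)  # 3A: -1.0, worse than doing nothing
print(expected_value([(0.01, -100000)]))     # 4A: -1000.0, beats the $1,100 premium
```

Prospect theory predicts that most people nonetheless choose B, A, A, and B, respectively.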
We overvalue loss. We tend to overweight risk when faced with possible loss of a perceived gain. That is, when we are afraid of losing what we have gained, we are prone to type 1 errors (believing a risk is greater than it really is).
We also undervalue risk when faced with certain loss. We are willing to risk losing more in small hopes of recouping losses. That is, when faced with certain loss, we are prone to type 2 errors (believing a risk to be less than it really is).
The distorted perception of risk probability was quantified experimentally by Kahneman and Tversky.
The table below represents the relationship between real and “weighted” probabilities. The “weighted” probability (wp) is the probability that we tend to actually assign in our thought process, as opposed to the actual probability, p. Weighted probabilities for gains and losses follow similar curves.
For real probabilities of about 35% or less, we tend to weigh risks higher than they actually are. For real probabilities over 35%, we tend to weigh risks lower than they actually are. As we approach the extremes of 0% and 100%, most people weigh these risks fairly accurately.
So, most ‘normal’ people overestimate small risks and underestimate larger risks except for the extremes.
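This over- and under-weighting can be sketched with the weighting function Tversky and Kahneman fit to their 1992 data, w(p) = p^g / (p^g + (1 - p)^g)^(1/g). The parameter g ≈ 0.61 is their published estimate for gains and is assumed here purely to illustrate the shape of the distortion.

```python
# Sketch of the Tversky-Kahneman (1992) probability-weighting function.
# g ~ 0.61 is their estimate for gains, assumed here for illustration.

def weight(p, g=0.61):
    """Decision weight we tend to apply to a stated probability p."""
    return p ** g / (p ** g + (1 - p) ** g) ** (1 / g)

for p in (0.01, 0.10, 0.35, 0.50, 0.90, 0.99):
    print(f"p = {p:.2f} -> w(p) = {weight(p):.2f}")
# Small probabilities come out larger than p; large ones come out smaller.
```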
For our purposes, the ‘normal’ people that we are concerned with are average patients and average doctors.
These concepts can be applied to decision making in medicine, for both doctors and patients alike. The situations described below are quite common. Medical decision making often does not follow utilitarian, logical and economic rules (in which everyone behaves rationally and is not subject to human heuristics, biases and faulty logic).
Many undesirable medical decisions can be predicted, however, when thought of with respect to Prospect Theory.
In his excellent book, “Thinking, Fast and Slow”, Kahneman described a “fourfold pattern” of predictions from Prospect Theory. We can adapt it for decisions relevant to medicine.
The left upper box represents practices that present high probabilities of significant benefits. Favorable practices will tend to be undervalued in favor of the status quo if patients feel relatively comfortable. In this box, people fear disappointment and become risk averse. Investors will tend to sell good stocks to lock in gains. Kahneman used the example of a plaintiff in a legal battle who has a very strong case and takes a reasonable settlement, rather than a likely larger pay-off in court.
In health terms, the “risk” in this box may refer to the high probability of significant health benefits. When people are happy with their status quo, they will tend to discount this probability or even ignore it altogether.
Patients who feel reasonably comfortable with their health may refuse to quit smoking and refuse proven screening procedures such as mammograms and colonoscopies. Doctors may not even stress the need for such procedures if the patient seems the least bit averse to them. Doctors and patients alike may deny proven scientific preventative measures such as vaccines (see Vaccine Denialism).
Situations that carry high risks for significant losses fall within the right upper box. Risky practices will likely be sought in an unlikely attempt to rid oneself of certain loss. Kahneman refers to the defendant who is likely to lose a large case. The defendant is likely to reject a somewhat lesser settlement in favor of the slim chance in court of being exonerated. Likewise, gamblers who have experienced significant losses will often continue to gamble in attempts to recoup losses, rather than cutting their losses and going home. Investors will hold on to failing stocks for the same reasons.
It is also here that we may find people seeking dangerous pseudoscience in the presence of unwanted diagnoses. People who feel significant loss of health and well-being will be desperate to correct the loss (even if the chances are slim to nil). Parents of autistic children seek dangerous therapies such as chelation, chemical castration, and bleach therapy. Desperate cancer patients may seek various pseudoscience cures such as homeopathy or faith healing instead of chemotherapy in misguided hopes for a cure, rather than life-prolonging treatment. Doctors may also over-hype the benefits of chemotherapy to terminally ill cancer patients even when there is virtually no chance of benefit instead of settling for palliative care. We are willing to grasp glimmers of hope in the face of certain loss, even if the glimmers are illusions.
In the lower left square, we see practices that carry a very low (or nil) probability for significant gains. These will be given more probabilistic weight than is reasonable. Plaintiffs who file frivolous lawsuits often will reject settlements in hopes of getting the big bucks. People begin gambling at casinos for a slim chance of a big payout.
It is in this box that people may be heard saying things like, “What the heck!” or “What’s the harm?”
Patients may choose to take various supplements, homeopathic potions or go for reiki therapy on the chance that these practices will give dramatic benefits, even if they recognize the low likelihood of their success. As long as they perceive their probable benefit as non-zero, they will tend to overvalue such practices. Doctors likewise may encourage such practices (or fail to discourage them) for similar reasons (“What the heck! What’s the harm?”).
The right lower square describes our risk aversion to perceived threats that are low in probability but associated with significant hazard. These will be given more probabilistic weight than they deserve. It is here that the defendant in a frivolous case may offer a settlement rather than going to trial, because of the small but real chance of a loss. When we face a low likelihood of significant loss, we buy insurance against that loss and sleep better at night.
It is in this square that unwarranted (or useless) diagnostic testing may be sold to patients. Examples include nuclear stress testing for patients with little risk of heart disease, MRI scans of the back for people with common back pain, total-body MRI scans for healthy rich people “just to be sure”, and vascular ultrasounds sold to random people at “health fairs”. These are unwarranted (but profitable) “screenings”. Such practices violate Bayesian thinking by overestimating the pretest probability of relatively rare conditions within certain populations (“base rate neglect”).
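Base rate neglect can be made concrete with Bayes' rule. The 1% prevalence and 90% sensitivity and specificity below are hypothetical numbers, chosen only to show how a low pretest probability swamps even a reasonably accurate screening test.

```python
# With a low pretest probability, most positive results from a fairly
# accurate test are false positives. All figures are hypothetical.

def positive_predictive_value(prevalence, sensitivity, specificity):
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

ppv = positive_predictive_value(0.01, 0.90, 0.90)
print(f"Chance a positive result is real: {ppv:.0%}")  # about 8%
```

In this sketch, more than 11 out of 12 positive screens are false alarms, which is why screening low-risk populations so often misleads.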
Average, everyday humans do not make decisions according to strict, economic probability. When issues are framed in terms of emotional gains and losses, our image of probability becomes distorted through the fisheye lens of loss aversion. The distortion is, however, predictable.
Prospect theory understands the distortion and may allow us to predict the unfavorable decision making that is often observed among average patients and doctors.
Most of us are prone to these decision making heuristics because humans are naturally loss averse. We fear loss more than we desire gains. Prospect theory predicts when average patients and average doctors will engage in risk averse and risk seeking behavior.
Skeptical doctors can learn to predict erroneous decisions and attempt to guard against them. However, we can only rescue people who get their foot caught in the Matrix-like rabbit hole of pseudoscience. Once they fall in, reason becomes irrelevant.
John Byrne, MD
Abellan-Perpiñan, Jose Maria, Han Bleichrodt, and Jose Luis Pinto-Prades. "The predictive validity of prospect theory versus expected utility in health utility measurement." Journal of Health Economics 28.6 (2009): 1039-1047.
Bomlitz, Larisa J, and Mayer Brezis. "Misrepresentation of health risks by mass media." Journal of Public Health 30.2 (2008): 202-204.
Wittenberg, Eve, Eric P Winer, and Jane C Weeks. "Patient utilities for advanced cancer: effect of current health on values." Medical Care 43.2 (2005): 173-181.
"Prospect theory." Wikipedia, the Free Encyclopedia. 2003.
Kahneman, Daniel, and Amos Tversky. "Prospect theory: An analysis of decision under risk." Econometrica: Journal of the Econometric Society (1979): 263-291.
"Prospect Theory." 2004.