How can a person hold two mutually contradictory beliefs at the same time? Psychologist Leon Festinger speculated in 1956 that holding two contradictory beliefs would produce an unpleasant feeling. He called this situation "cognitive dissonance": "Cognitive dissonance is a theory of human motivation that asserts that it is psychologically uncomfortable to hold contradictory cognitions. The theory is that dissonance, being unpleasant, motivates a person to change his cognition, attitude, or behavior." The theory was first explored in detail by Festinger, who described it this way:
"Dissonance and consonance are relations among cognitions, that is, among opinions, beliefs, knowledge of the environment, and knowledge of one's own actions and feelings. Two opinions, or beliefs, or items of knowledge are dissonant with each other if they do not fit together; that is, if they are inconsistent, or if, considering only the particular two items, one does not follow from the other" (Festinger 1957: 25).
He argued that there are three ways to deal with cognitive dissonance, and he did not consider them mutually exclusive: one can change a dissonant belief or behavior, acquire new information that outweighs the dissonant belief, or reduce the importance of the dissonant cognitions.
Bertrand Russell echoed this:
"Conventional people are roused to fury by departure from convention, largely because they regard such departure as a criticism of themselves."
Festinger supported his theory with an experiment that put student subjects in a position that required them to dishonestly persuade others that a boring activity was really interesting. Because the feeling of dishonesty is naturally unpleasant, he predicted that the students would actually change their opinion of the activity; they would convince themselves that the boring activity was actually interesting, thereby reducing the unpleasantness of being dishonest. The experiment confirmed the hypothesis: the students actually changed their memory of the boring experience to that of an interesting one (Festinger and Carlsmith 1959).
Elliot Aronson proposed that cognitive dissonance mainly arises when the conflicting ideas are between one's positive concept of 'self' (i.e. "I am a good person") and the realization that one's actions or beliefs may be considered to be negative (i.e. "I am a bad person"). For instance, most of us consider ourselves to be kind, fair and just with respect to others. If one were to harbor feelings of bigotry, or treat someone unfairly due to this bigotry, then one would reconcile this with a rationalization ("He deserved it because of ....").
The mere act of choosing something (for whatever reason) over a previously favored thing causes us to actually change our preference in favor of the new thing. For instance, a happy owner of a Ford car may be given some incentive to buy a Toyota, which he reluctantly does. Cognitive dissonance theory would predict that simply buying the Toyota would shift the buyer's preference a bit from Ford to Toyota.
Izuma et al. conducted a study involving fMRI brain scanning that supports this "choice-induced preference change":
"Here, using a proper control condition and two measures of preferences (self-report and brain activity), we found that the mere act of making a choice can change self-report preference as well as its neural representation (i.e., striatum activity), thus providing strong evidence for choice-induced preference change. Furthermore, our data indicate that the anterior cingulate cortex and dorsolateral prefrontal cortex tracked the degree of cognitive dissonance on a trial-by-trial basis. Our findings provide important insights into the neural basis of how actions can alter an individual's preferences."
Harris et al. also used fMRI to examine the brain's response to propositions that differed from the subjects' beliefs, as well as to statements about topics on which the subjects held no firm prior beliefs. Statements that contradicted previously held beliefs produced a negative emotional response akin to 'disgust'. Statements that confirmed beliefs were greeted with positive responses. Statements about topics on which the subjects were uncertain produced no such emotional response either way.
"Belief and disbelief differ from uncertainty in that both provide information that can subsequently inform behavior and emotion. The mechanism underlying this difference appears to involve the anterior cingulate cortex and the caudate. Although many areas of higher cognition are likely involved in assessing the truth-value of linguistic propositions, the final acceptance of a statement as "true" or its rejection as "false" appears to rely on more primitive, hedonic processing in the medial prefrontal cortex and the anterior insula. Truth may be beauty, and beauty truth, in more than a metaphorical sense, and false propositions may actually disgust us."
So, a brain faced with information that contradicts a previously held belief reacts with a primitive, emotionally negative response. This emotionally negative feeling of 'disgust' informs the higher brain that the information must be false, no matter how objective the information may actually be. Remember Dr. Shermer's thesis (see Philosophy and Science - Induction and Deduction): we form our beliefs first, then we attempt to rationalize them.
The opposite of 'dissonance' is 'consonance'. If dissonance produces a deeply negative emotional response, consonance produces an immediate positive emotional response. We have all experienced this as well. We clap, we cheer, we nod in approval when a speaker makes statements that are consonant with our beliefs.
It follows from the research in neuroscience and psychology that people will embrace - with little or no scrutiny - arguments that agree with their beliefs. That is, claims that produce cognitive 'consonance' are accepted on face value alone. Sometimes a research paper is proudly touted on the strength of its title and the abstract's conclusion alone. Rarely is a consonant study's materials-and-methods section scrutinized for biases, errors, and uncertainty. On the other hand, we pick apart the details of studies that contradict our beliefs. We just know that there must be an error, because our brains have already decided that such studies must be wrong.
It would seem that science advocates and skeptics may fundamentally differ from pseudoscience advocates in the way in which cognitive dissonance is reduced. Ideally, when faced with two conflicting ideas, the skeptic weighs the likelihood of each idea against accepted knowledge and new information, then adopts the most parsimonious position. This may require the skeptic to change a previously held belief. Skeptics reduce dissonance by accepting that they were wrong. Skeptics change their minds.
Pseudoscience advocates and pseudoskeptics do not seem to do this. When faced with unambiguous, conflicting, scientific evidence against a favored idea, advocates reject or rationalize the disconfirming evidence as meaningless. They do this through the process of "motivated reasoning" (see below). They hold onto their previously held, favored belief. In fact, Festinger et al. showed that, in some cases, a belief may actually become reinforced after its holders rationalize away the disconfirming evidence.
As we learned on the Pseudoscience page, one cannot tout science when it supports a belief, and then reject science when it does not. You can't have it both ways.
Most people consider themselves to be rational. We all like to think that our positions on certain issues reflect the most reasonable stances to take based on the available evidence. But when presented with evidence that conflicts with our positions (especially those with which we self-identify in some way), it is natural to be extra critical of the new data. We are prone to accept confirming evidence (confirmation bias) with little skepticism. However, we become expert skeptics when faced with data that may falsify our firmly held ideas.
There are lots of ideas and theories out there, and many of them conflict with each other. Unless we try to falsify our theories, we will likely get stuck making type 1 errors (believing falsehoods). But we naturally want to defend our ideas, just as we want to defend our selves. We naturally don't like the feeling we get when we realize that we are wrong -- especially if others realize that we are wrong as well -- perhaps because this is an attack on our sense of "self". It violates a very basic belief about our self, that is: "I am a smart person". Somehow, we seem to naturally equate being wrong with being dumb on some level. This attack on the self produces dissonance.
To avoid this dissonance, we are motivated to preserve this basic belief about our selves: "I can't be wrong, therefore this new data must be wrong". This is the crux of motivated reasoning. We end up uncritically accepting statements that agree with us, because such statements do not violate our sense of self and therefore create no dissonance. We become defensive when faced with the possibility that we may be wrong (which threatens our image of a smart self). We look for cracks in the opposing claims. We become critical of the methodology of studies whose conclusions conflict with our views. We often resort to logical fallacies to attack our opposition, such as ad hominems and strawmen.
But such reasoning will often lead us astray. It is the nature of scientific thinking to pursue the falsification of our theories and ideas. We should be trying to falsify the theories that we hold dear. But we don't. This is not science's fault; it is a human fault. We should be as critical of our established ideas as we are of opposing ones.
Psychologist Steve G. Hoffman was quoted as saying: "Rather than search rationally for information that either confirms or disconfirms a particular belief," he says, "people actually seek out information that confirms what they already believe." "For the most part, people completely ignore contrary information" and are able to "develop elaborate rationalizations based on faulty information."
It is the evidence, not our egos, that should lead to our conclusions.
The Skeptic's Dictionary states: "The 'backfire effect' is a term coined by Brendan Nyhan and Jason Reifler to describe how some individuals, when confronted with evidence that conflicts with their beliefs, come to hold their original position even more strongly". People tend to dig in their heels and insist even more fervently that their position is true if they sense that they are being attacked. They use motivated reasoning to preserve their belief and reject the attack.
This explains why it seems impossible to change somebody's mind about a deeply held belief simply by pointing to conflicting evidence. One cannot simply tell an anti-vaxxer, "You're wrong. Vaccines are safe and effective and here's the data to prove it!" That causes too much cognitive dissonance and "backfires".
It takes a different approach to avoid cognitive dissonance and the backfire effect.
For an adorable demonstration of these concepts in action, see this video. Aaliyah's mother's solution is great. Enjoy.
In their book, Mistakes Were Made (But Not by Me), Carol Tavris and Elliot Aronson explain that people do not often simply adopt an unscientific or irrational concept whole-cloth. They get there in small steps. Each step may produce some dissonance, which leads to rationalization and reinforcement. Through each successive small, dissonance-reducing step, they get deeper and deeper into the concept.
For instance, one may have a loved one who uses homeopathy. Even with some knowledge that homeopathy is unscientific, one may not question it out of fondness for that person. If one day the loved one suggests trying a homeopathic remedy for some self-limited ailment, and one does, there may be some dissonance between one's understanding of science and the positive experience of using the remedy at the loved one's suggestion. One may rationalize this and conclude that "science may not know everything". Eventually, one may begin using homeopathy for other conditions, convinced that it is effective and that one knows better than science. Such an individual may even begin promoting the concept to others, using the usual litany of logical fallacies to defend the position when faced with conflicting evidence.
A wonderful description of this comes from Dr. Mark Crislip's account of J.B. Handley's immovable anti-vaccine position, held despite his being continuously shown that the scientific evidence contradicts it. The following is Dr. Crislip's response to an article by Dr. David Gorski on the Science Based Medicine blog:
"Imagine a hypothetical person who has triumphed in the business world and amassed a small fortune.
They see something wrong, a truth unrecognized by others:
So they spend time and emotion and money, maybe lots of money, to spread the word: vaccines and mercury cause autism.
If he is right, then they are a prescient saviour of thousands.
Problem is, science and reality increasingly disagree with his position.
He has a choice: admit that he was wrong, wasted a small fortune and many years of his life, that, after success in one field of endeavour, he is a total failure in another. A perfect example of the Peter Principle.
Or cling to an increasingly untenable position and descend farther into crankdom.
Terrible choice for someone to have to make and it will be a choice they will be unable to make; their response will not be pleasant for those who know them. Anger, irrational behaviour, lashing out. Anything to avoid the realization that they have pissed their life away. You see it all the time in addicts.
It is sad, a tragedy, when these things occur. To know, deep in your soul, that all your time and money and heart and soul was wasted and is increasingly irrelevant."
Many doctors tout themselves as "anti-aging" experts. They usually promote the use of hormone therapies, such as testosterone (the male sex hormone), for the purposes of 'vitality' and reversing the effects of aging. In fact, their entire livelihood has become dependent on the use of hormone therapy.
But on November 6, 2013, a large study was published in JAMA that showed a correlation between testosterone therapy and cardiovascular events (heart attacks and strokes). The study's authors concluded:
"Among a cohort of men in the VA health care system who underwent coronary angiography and had a low serum testosterone level, the use of testosterone therapy was associated with increased risk of adverse outcomes. These findings may inform the discussion about the potential risks of testosterone therapy."
This throws up a major red flag against the use of testosterone therapy. Immediately, there was a defensive outcry from "anti-aging" doctors.
For example, "I'm concerned that men will stop taking testosterone because of this new study," Erika Schwartz, M.D., told Newsmax Health. "It contradicts all the previous research that shows the benefits of this form of therapy. When taken properly, the results of testosterone therapy can be amazing." This doctor should be less concerned about losing business and more concerned about the possibility that her "treatment" may be harmful in ways that she did not previously know about. She should be skeptical of her old data in light of the new, not the other way around.
Another example can be seen in this TV clip from an Australian show featuring a skeptical look at "Power Balance" wrist bands. Tom, the advocate (and company owner), is personally faced with unambiguous, disconfirming evidence against his claim. At the end of the video, instead of questioning his claim, he reinforces it by arguing that anecdotal evidence should trump the randomized controlled trial.
Psychologist Ray Hyman recounted an experience that demonstrates rationalization in the face of disconfirming evidence. (Note, this was also reproduced in the Skeptic's Dictionary article on Cognitive Dissonance.)
Some years ago I participated in a test of applied kinesiology at Dr. Wallace Sampson's medical office in Mountain View, California. A team of chiropractors came to demonstrate the procedure. Several physician observers and the chiropractors had agreed that chiropractors would first be free to illustrate applied kinesiology in whatever manner they chose. Afterward, we would try some double-blind tests of their claims.
The chiropractors presented as their major example a demonstration they believed showed that the human body could respond to the difference between glucose (a "bad" sugar) and fructose (a "good" sugar). The differential sensitivity was a truism among "alternative healers," though there was no scientific warrant for it. The chiropractors had volunteers lie on their backs and raise one arm vertically. They then would put a drop of glucose (in a solution of water) on the volunteer's tongue. The chiropractor then tried to push the volunteer's upraised arm down to a horizontal position while the volunteer tried to resist. In almost every case, the volunteer could not resist. The chiropractors stated the volunteer's body recognized glucose as a "bad" sugar. After the volunteer's mouth was rinsed out and a drop of fructose was placed on the tongue, the volunteer, in just about every test, resisted movement to the horizontal position. The body had recognized fructose as a "good" sugar.
After lunch a nurse brought us a large number of test tubes, each one coded with a secret number so that we could not tell from the tubes which contained fructose and which contained glucose. The nurse then left the room so that no one in the room during the subsequent testing would consciously know which tubes contained glucose and which fructose. The arm tests were repeated, but this time they were double-blind -- neither the volunteer, the chiropractors, nor the onlookers was aware of whether the solution being applied to the volunteer's tongue was glucose or fructose. As in the morning session, sometimes the volunteers were able to resist and other times they were not. We recorded the code number of the solution on each trial. Then the nurse returned with the key to the code. When we determined which trials involved glucose and which involved fructose, there was no connection between ability to resist and whether the volunteer was given the "good" or the "bad" sugar.
When these results were announced, the head chiropractor turned to me and said, "You see, that is why we never do double-blind testing anymore. It never works!" At first I thought he was joking. It turned out he was quite serious. Since he "knew" that applied kinesiology works, and the best scientific method shows that it does not work, then -- in his mind -- there must be something wrong with the scientific method. This is both a form of loopholism as well as an illustration of what I call the plea for special dispensation. Many pseudo- and fringe-scientists often react to the failure of science to confirm their prized beliefs, not by gracefully accepting the possibility that they were wrong, but by arguing that science is defective.
Cognitive dissonance is an integral part of our psychology. It is a barrier that prevents individuals from recognizing when they are wrong. In the world of science and medicine, recognizing one's errors is vital. Patients' lives may depend on it. We are all prone to cognitive dissonance and to using motivated reasoning to overcome it. Recognizing it at work in ourselves and remaining open to being wrong are hallmarks of good, skeptical doctors.
John Byrne, MD
"Cognitive dissonance." The Skeptic's Dictionary, Skepdic.com, 2006.
"Lies of Mass Destruction." The Daily Beast, 2011.
Festinger, Leon. A Theory of Cognitive Dissonance. Stanford University Press, 1957.
"Talk:Bertrand Russell." Wikiquote, 2004.
"Leon Festinger." Wikipedia, the free encyclopedia, 2004.
Festinger, Leon, and James M. Carlsmith. "Cognitive consequences of forced compliance." The Journal of Abnormal and Social Psychology 58.2 (1959): 203.
"Elliot Aronson." Wikipedia, the free encyclopedia, 2005.
Izuma, K. "Neural correlates of cognitive dissonance and choice-induced ..." 2010.
Harris, Sam, Sameer A. Sheth, and Mark S. Cohen. "Functional neuroimaging of belief, disbelief, and uncertainty." Annals of Neurology 63.2 (2008): 141-147.
Festinger, Leon, Henry W. Riecken, and Stanley Schachter. When Prophecy Fails: A Social and Psychological Study of a Modern Group that Predicted the Destruction of the World. University of Minnesota Press, 1956.
"www.caroltavris.com." 2007.
"Motivated reasoning." Wikipedia, the free encyclopedia, 2011.
Hyman, R. "How People Are Fooled by Ideomotor Action." Quackwatch, 2006.