LIMITATIONS TO HYPOTHESIS TESTING

"Confirmation Bias" is a pervasive danger to hypothesis creation and testing.


The information that we sense and perceive is filtered through our beliefs and experiences (Elstein, 1999). For example, how people perceive and act on information is often based on heuristics (simple mental rules) rather than careful, systematic reasoning (Tversky and Kahneman, 1974). Heuristics are one example of "cognitive biases," or tendencies for people to think and act in consistently distorted ways (Hicks and Kluemper, 2011). Therefore, cognitive biases can affect scientific, clinical, and political decision making.


One cognitive bias that is particularly relevant to science is "confirmation bias" (Nickerson, 1998).


DEFINITION: Confirmation bias is the tendency to seek or interpret evidence in ways that confirm existing beliefs, expectations, or hypotheses.


Confirmation bias reflects the tendency for people to resist changes to their preconceptions by selectively focusing on information consistent with prior beliefs and expectations while ignoring information that conflicts with them (Stanovich et al., 2013). For example, the preconception that the world is flat can make it difficult for children to conceptualize a spherical world. Children reinterpret new information (e.g. that the world is round) to fit their preconception of a flat earth, concluding that the world is shaped like a pancake rather than a sphere (Vosniadou and Brewer, 1989). Therefore, confirmation bias can prevent or hinder learning.

Confirmation bias is sometimes confused with skepticism (being critical of evidence). Skepticism is an important part of scientific reasoning. For example, the author Carl Sagan famously wrote: "Extraordinary claims require extraordinary evidence." Whereas skepticism applies critical scrutiny to ALL evidence, confirmation bias applies it selectively, targeting only evidence that does not match preconceptions or expectations (Taber and Lodge, 2006). Therefore, confirmation bias actually undermines skepticism by restricting critical thinking to pre-determined areas.

Confirmation bias affects many aspects of science. For example, when making basic measurements, researchers may double-check measurements that conflict with their expectations but not measurements that agree with them. As a result, errors that push measurements toward expectations are less likely to be caught than errors that push measurements away from them, as the simulation below illustrates. Confirmation bias therefore distorts the premises used for both deductive and inductive reasoning.
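Below is a minimal simulation, offered as an illustration only (the scenario and all numbers are made up, not taken from any study), of how re-checking only the measurements that conflict with an expectation can shift recorded results toward that expectation even though the raw errors are unbiased:

```python
"""Illustrative sketch with synthetic numbers: asymmetric double-checking
of measurements keeps only the errors that agree with the researcher's
expectation, shifting the recorded mean toward it."""
import random
from statistics import mean

random.seed(1)

TRUE_VALUE = 10.0      # hypothetical quantity being measured
EXPECTED_VALUE = 12.0  # what the researcher expects to find
TOLERANCE = 1.0        # results within this range of the expectation "look right"
N = 100_000

def measure():
    """Symmetric noise plus an occasional gross error in a random direction."""
    value = TRUE_VALUE + random.gauss(0.0, 0.5)
    if random.random() < 0.2:                # 20% of measurements contain a gross error
        value += random.choice([-2.0, 2.0])  # error direction is unbiased
    return value

def careful_remeasure():
    """Assume a double-checked measurement is repeated carefully and error-free."""
    return TRUE_VALUE + random.gauss(0.0, 0.5)

raw, recorded = [], []
for _ in range(N):
    value = measure()
    raw.append(value)
    # Biased quality control: re-measure only results that conflict with the
    # expectation; results that "look right" are accepted without checking.
    if abs(value - EXPECTED_VALUE) > TOLERANCE:
        value = careful_remeasure()
    recorded.append(value)

print(f"true value:                   {TRUE_VALUE:.2f}")
print(f"mean with no checking at all: {mean(raw):.2f}  (symmetric errors cancel)")
print(f"mean with one-sided checking: {mean(recorded):.2f}  (shifted toward {EXPECTED_VALUE:.1f})")
```

Because the gross errors are equally likely in either direction, skipping quality control entirely would still give an unbiased average; it is the one-sided checking, not the errors themselves, that moves the recorded mean toward the expectation.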

Confirmation bias can also affect deductive and inductive arguments in other ways. It can influence the questions and hypotheses that individuals (or even entire communities) develop: questions that challenge existing beliefs or expectations may simply not be asked in favor of questions structured to support existing ideas. For example, science reflects prevailing social and cultural assumptions. When racial prejudices were common and more widely accepted, many researchers sought scientific evidence to confirm prevailing biases (Gould, 1996). Objective data, internally consistent biological models, and quantitative research have ultimately led to the rejection of most race-based hypotheses. However, some social "scientists" and others continue to use confirmation bias (among other fallacies) to promote prejudiced viewpoints (e.g. Herrnstein and Murray, 1994).

Confirmation bias is a particular concern for inductive reasoning. Inductive reasoning often draws from large bodies of information, which creates opportunities for "cherry picking" information to support pre-determined conclusions. For example, opponents of efforts to reduce climate change cherry-pick data to make misleading arguments: representatives of the extractive industries have used a single abnormally warm year (1998) to argue that global temperatures are not rising, despite more than a century of data that clearly show increases in global temperature (Temple, 2013). Therefore, confirmation bias can lead to unreasonable judgments, particularly for individual inductive arguments. The sketch below illustrates how this kind of single-point comparison can mislead.
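A short sketch using entirely synthetic numbers (not real temperature records or any published data set) makes the failure mode concrete: a trend fitted to the whole record is clearly positive, while a comparison anchored to one unusually high year typically suggests little or no change:

```python
"""Illustrative sketch with synthetic data (not real temperature records):
cherry-picking a single anomalous year as the baseline can hide a trend
that the full record clearly shows."""
import random

random.seed(7)

# Synthetic series: a slow upward trend, year-to-year noise, and one
# unusually high year early in the record (the cherry-picker's favorite).
years = list(range(1900, 2021))
values = [0.01 * (y - 1900) + random.gauss(0.0, 0.15) for y in years]
values[years.index(1998)] += 0.5  # a single anomalously high year

# Whole-record view: ordinary least-squares slope over all years.
n = len(years)
mean_y = sum(years) / n
mean_v = sum(values) / n
slope = (sum((y - mean_y) * (v - mean_v) for y, v in zip(years, values))
         / sum((y - mean_y) ** 2 for y in years))
print(f"trend fitted to the whole record: {slope * 100:+.2f} units per century")

# Cherry-picked view: compare a recent year to the single anomalous year.
# With the anomaly as the baseline, the difference is typically near zero or negative.
change = values[years.index(2010)] - values[years.index(1998)]
print(f"change from 1998 to 2010 alone:   {change:+.2f} units")
```

The same data support opposite conclusions depending on whether all of the evidence is used or only a convenient slice of it, which is exactly the opening that confirmation bias exploits.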

Encouraging and increasing diversity in science can mitigate some of the problems associated with cognitive biases. 


Science is an evolutionary process that requires extensive communication among individuals and communities of scientists. Scientific discoveries typically "emerge" from interactions among many scientists (Hill, 1933). Although scientists often pride themselves on individuality, scientific progress is more than the sum of individual contributions; it also depends on the composition of the entire scientific community.

A diverse scientific community helps to mitigate confirmation bias and facilitates scientific progress. A diverse community is more likely to generate many viable alternative hypotheses for any particular problem. Individual scientists can afford to be attached to particular hypotheses so long as the community is diverse and other groups of scientists are attached to competing hypotheses. Even if each group explicitly champions its favored hypothesis, over time the hypothesis most consistent with the data will prevail. Therefore, all forms of diversity (of scientific perspective, gender, race/ethnicity, background, etc.) strengthen scientific inquiry.

If the overall scientific community is diverse, then individual scientists may not need to be completely objective when they interpret data (objective data collection remains critical for science, however). Given a diversity of alternative hypotheses, inductive reasoning can help scientists evaluate which hypotheses are most consistent with objectively collected knowledge.

Cognitive biases like confirmation bias can affect scientific judgment. Individual scientists can reduce their impact by understanding what cognitive biases are and how they affect reasoning. Diversity within scientific communities further reduces the impact of cognitive biases by increasing the number and range of alternative hypotheses.