Introduction

To think critically, we must first understand how our minds actually work. Human cognition is not as neutral or logical as we might assume. Much of our everyday thinking is guided by mental shortcuts, or heuristics, that help us make quick decisions; the systematic errors these shortcuts produce are known as cognitive biases (Tversky & Kahneman, 1974). These shortcuts are useful in fast-paced situations, but they can lead to faulty reasoning when we’re trying to evaluate arguments or make important decisions.

One of the most common cognitive biases is confirmation bias. This occurs when we give more attention to evidence that supports what we already believe, while ignoring or discrediting information that challenges our views. For example, a person who is convinced that social media is inherently harmful might only seek out articles that highlight its dangers, while dismissing studies that suggest positive effects. This bias makes it difficult for us to change our minds, even when new and reliable information becomes available.

Another mental shortcut, the availability heuristic, affects how we assess likelihood and risk. If something comes easily to mind, such as a vivid news story or personal memory, we tend to believe it happens more often than it actually does. Someone who recently saw a report about a plane crash might start to feel that flying is dangerous, even though air travel is statistically safer than driving. Our brains confuse familiarity or vividness with frequency, and that confusion can distort our judgment.

Cognitive science has also identified two distinct modes of thinking, often referred to as System 1 and System 2 (Kahneman, 2011). System 1 is fast, automatic, and intuitive. It helps us make snap judgments, recognize patterns, and respond quickly to emergencies. However, it is also where most of our biases operate. System 2, in contrast, is slow, effortful, and analytical. It activates when we solve a math problem, weigh pros and cons, or critique a news article. Most of the time, we rely on System 1 because it saves energy—but critical thinking happens when we slow down and switch into System 2.