Start by asking yourself: When was the last time I changed my mind about something important? What made me reconsider? If you can’t think of an example, that might be a sign that you’re surrounding yourself with people and information that always agree with you. Growth often begins at the edge of our comfort zones, and being open to change is a sign of intellectual humility—not weakness.
Another useful practice is to keep a critical thinking journal (APA, 2020). After a conversation, a lecture, a news story, or a personal conflict, jot down some reflections: What claims were made? What evidence was used? Did I feel myself reacting emotionally, and if so, why? What questions could I ask to understand more deeply? Even five minutes of writing can clarify your thinking and reveal deeper patterns in how you respond to information.
You can also try rating yourself on key skills using a 1–5 scale (1 = rarely, 5 = consistently). For example:
I ask clarifying questions before accepting a claim.
I can identify when I’m being influenced by emotion more than logic.
I seek out alternative viewpoints when researching a topic.
I recognize when I’m using stereotypes or assumptions.
I stay open to being wrong, even in areas I care about deeply.
These prompts don’t have right or wrong answers; they are simply mirrors. They help you track your growth and recognize areas where more practice is needed. Over time, what begins as a checklist becomes a habit of mind.
Another powerful tool is the practice of intellectual accountability. This means building a community—peers, classmates, friends—who can challenge your ideas without attacking you. It means choosing to see disagreement not as a threat but as an opportunity to sharpen your thinking. Ask yourself: Do I surround myself with people who agree with me, or do I make space for healthy debate? Do I react defensively when someone pushes back, or do I pause and reconsider?
Finally, it’s helpful to return to your “why.” Why are you studying critical thinking? Why does it matter to you personally? Some students say they want to be better communicators. Others want to navigate conflict more effectively, understand political issues, or be less vulnerable to misinformation. Whatever your reasons, writing them down and revisiting them can keep you motivated when the process feels frustrating or slow.
Critical thinking is not about always having the right answer. It’s about being willing to ask better questions. It’s about being honest with yourself, curious about others, and humble enough to keep learning. The journey begins with awareness—and awareness begins with reflection.
Key Terms
Availability heuristic: A mental shortcut where we judge the likelihood of something based on how easily we can recall examples—often influenced by recent or emotionally vivid experiences.
Bandwagon effect: The tendency to adopt an idea, behavior, or belief because it is popular or widely accepted by others, rather than because of strong reasoning or evidence.
Cognitive bias: A consistent thinking error or shortcut in judgment that can lead to distorted conclusions; examples include confirmation bias and the availability heuristic.
Confirmation bias: The tendency to focus on information that supports our existing beliefs and ignore or dismiss information that contradicts them.
Credibility: The quality of being trusted, believed, or taken seriously. Credibility depends on factors like expertise, honesty, clarity, and context. In critical thinking, evaluating credibility means asking whether a person, source, or institution is reliable, knowledgeable, and free from obvious bias or conflict of interest.
Critical thinking: The ability to think clearly, fairly, and logically by analyzing evidence, questioning assumptions, and remaining open to multiple perspectives.
Digital activism: The use of online platforms—such as social media, petitions, or hashtags—to raise awareness, organize movements, and advocate for social or political change.
Disinformation: Deliberately false content created to deceive or manipulate.
Echo chamber: A closed environment, often online, where people are only exposed to ideas or beliefs that match their own, reinforcing their views without challenge.
Ego defense: The way we instinctively protect our sense of identity and self-worth.
Engagement algorithms: Software tools used by social media platforms to predict and promote content users are most likely to interact with—often reinforcing emotional, attention-grabbing material.
Filter bubble: A personalized digital environment created by algorithms that limits exposure to different viewpoints and reinforces what we already believe.
Groupthink: When a group prioritizes harmony and consensus over critical evaluation, leading to poor decisions and suppression of dissent.
Identity: Our background, values, experiences, culture, and social position, which together shape how we see the world and how we engage with ideas.
Implicit bias: Unconscious attitudes or associations we hold about others based on race, gender, age, class, etc., which can influence our behavior and judgments without us realizing it.
In-group bias: The tendency to favor people who are part of our own group (racial, social, political, etc.) and to view outsiders more negatively or suspiciously.
Intellectual courage: The willingness to challenge misinformation in conversations with friends or family, to ask difficult questions at a community meeting, or to admit when we don't know enough. It involves recognizing when our beliefs need updating in light of new evidence, and being open to voices that complicate our worldview.
Intellectual humility: The awareness that our beliefs might be wrong or incomplete, and the willingness to revise our views based on new evidence or better reasoning.
Lateral reading: The practice of evaluating an online source by checking what others say about it: opening new tabs, checking who is behind a source, comparing headlines across outlets, and asking, “Where is this information coming from, and can I trust it?”
Logical fallacies: Errors in reasoning that weaken or invalidate an argument. They often seem persuasive but fail under closer scrutiny. Common examples include slippery slope, false cause, and ad hominem attacks.
Media literacy: The ability to critically analyze, interpret, and evaluate media content—including who created it, for what purpose, and how it may influence our thinking.
Mental laziness (cognitive ease): The natural tendency to avoid effortful thinking, preferring simple, familiar answers even when deeper reasoning is needed.
Overconfidence bias: The tendency to overestimate the accuracy of our knowledge, memories, or judgments—even when our confidence is not supported by facts.
Policy discussions: Public conversations about issues ranging from healthcare and education to immigration and policing. These conversations often involve value judgments, trade-offs, and long-term consequences.
Power: The ability to influence or control people, decisions, or resources. Power can be personal, institutional, or systemic. It plays a role in shaping laws, norms, access, and credibility, often in invisible or uneven ways.
Privilege: Unearned advantages or access that come from one's social identity—such as race, gender, class, ability, or citizenship. Privilege often operates in the background, benefiting some people while leaving others overlooked or excluded. Recognizing privilege is a key part of developing critical self-awareness and evaluating fairness.
Public discourse: The open exchange of ideas, arguments, and opinions in shared spaces such as media, classrooms, town halls, or online platforms. Public discourse shapes collective understanding and plays a vital role in democracy, culture, and social change.
Rhetoric: The strategic use of language and symbols to persuade or influence others. Rhetoric includes both what is said and how it is presented, often using appeals to emotion, logic, or credibility.
Self-assessment: The practice of evaluating your own thinking habits, reasoning strengths, and areas for growth to improve critical thinking over time.
Social identity: A person’s sense of self based on group memberships (e.g., ethnicity, religion, gender, nationality) that shapes how we see ourselves and others.
Sponsored content (native advertising): A form of advertising designed to look like regular content—such as an article, video, or social media post—while being paid for by a brand, organization, or political group. It often blends in with news or entertainment, making it harder to recognize as promotional. Disclosures like “sponsored,” “ad,” or “paid partnership” are required but often subtle.
System 1 thinking: Fast, automatic, emotional thinking that helps us react quickly but is prone to bias and shortcuts.
System 2 thinking: Slow, deliberate, analytical thinking used for solving problems, evaluating arguments, and making reasoned decisions.
Value judgment: A statement or decision that reflects a belief about what is good, bad, right, wrong, beautiful, fair, or important. Value judgments are based on personal or cultural values rather than objective facts, and they play a central role in moral, legal, and aesthetic reasoning.
Visual rhetoric: The use of images, colors, layout, and design in media to persuade or influence viewers, often through emotional or symbolic means.
References
American Psychological Association. (2020). Publication manual of the American Psychological Association (7th ed.). https://doi.org/10.1037/0000165-000
Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
McGrew, S., Breakstone, J., Ortega, T., Smith, M., & Wineburg, S. (2018). Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory & Research in Social Education, 46(2), 165–193. https://doi.org/10.1080/00933104.2017.1416320
Moore, B. N., & Parker, R. (2017). Critical thinking (12th ed.). McGraw-Hill Education.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131. https://doi.org/10.1126/science.185.4157.1124
Wineburg, S., & McGrew, S. (2017). Lateral reading: Reading less and learning more when evaluating digital information. Stanford History Education Group. https://purl.stanford.edu/fv751yt5934
Zimdars, M. (2016). False, misleading, clickbait-y, and satirical “news” sources. Harvard University Library Resource. https://guides.library.harvard.edu/c.php?g=310271&p=2071512