A subgroup of steering committee members analyzed seven pre-existing critical thinking and logical reasoning assessments. This process fits within the larger context of our project: our primary goal is to create effective interventions to help students consciously develop skills in critical thinking and logical reasoning. We recognize that the lack of these skills is detrimental to students, as it stifles their capacity for building the type of interdisciplinary and critically aware analysis that is required for problem solving and innovation. The lack of critical thinking and logical reasoning skills is particularly challenging in the STEM classroom. STEM disciplines function with critical thinking and logical reasoning as an oft-unexpressed core, presupposing that students both understand the need for critical thinking and logical reasoning and come to class possessing the skills and the knowledge of when to apply them. Careful assessment of critical thinking and logical reasoning skills is therefore an essential precondition for helping students progress toward greater mastery of these skills.
The analysis aimed to determine what each assessment actually measures, with specific attention to the larger assumptions each makes about what constitutes critical thinking and logical reasoning (e.g., how they are defined). We also considered which student populations were, and were not, prioritized by each assessment, and how much knowledge of critical thinking and logical reasoning a course instructor would need in order to use the assessment or interpret its results.
Our review suggested that these assessments present themselves as external measures of critical thinking as a disposition. The mechanics of the assessments vary. Some are surveys disconnected from courses or programs, taking an almost clinical approach to determining the extent to which a participant operates as a critical thinker or logical reasoner. These summative/dispositional approaches pose several challenges. They situate critical thinking and logical reasoning within a set of contexts with which the participant may not be familiar, yet our larger research suggests that critical thinking and logical reasoning are each context-dependent skills. Using these assessments within the STEM classroom is further complicated because the context for critical thinking is not always science-specific. Relatedly, the output generated from these instruments often constitutes an absolute, summative judgment of a student's ability to think critically and reason logically in the abstract; that is, they comment, or can be taken to comment, on a student's capacity ever to develop critical thinking and logical reasoning skills. Finally, existing critical thinking and logical reasoning assessments do not frequently recognize student initiative or problem solving from the student's own context, nor do they invite individual differences or differences of approach into the conversation. In this way, they look less like an open invitation to think more and better, and more like gatekeeping tools or instruments of colonial behavior control, aimed at conditioning users toward a particular method and result of thinking.
We reviewed the following instruments individually:
AAC&U VALUE Critical Thinking Rubric (we used this resource to norm our discussion of how to evaluate the CT instruments)
Watson-Glaser Critical Thinking Appraisal
Cornell Critical Thinking Test
Professional Judgment Rating Form (PJRF)
The Holistic Critical Thinking Scoring Rubric
The Health Science Reasoning Test (HSRT)
California Critical Thinking Skills Test (CCTST)
After initial review of each instrument using a common template, the subgroup met synchronously to discuss the affordances and challenges of the assessments.
What we valued in the assessments:
Instruments that merged the measurement of skills/capabilities with critical thinking, situating their questions in situations or environments where logical/critical thinking was necessary.
Ease of use (uptake, resources to interpret, and review)
Affordability factor
Tools that did not oversimplify the dimensions of CT
Potential drawbacks of these assessments for classroom-level instruction:
While ease of use was valued, a few of the instruments lacked sufficient context for any one instructor to apply to their classroom.
Any cost attached to the instrument: “any mention of pricing” closes down interest.
Numerical scores applied to developmental competencies.
The goal of assessing student competencies in these areas is not to quantify the amount or extent of the skills students possess; rather, the assessment is itself positioned as part of the process of development.
To this end, we propose formative rather than summative assessments. Formative assessment provides vital information to the learner and teacher about their performance on a specific task at a particular point in time. The formative assessment is therefore a vital tool for the continued learning of the student, providing them and their teacher with many jumping off points from which to continue their development.
Further, we propose a system of assessment that is itself active, and which facilitates the intentional interaction of student with student, student with faculty, and faculty with student. Because our starting point is with the student and their learning, assessment functions ever and always as a step towards learning. In this way, part of the goal of the assessment is to invite the student to engage in different and deeper conversations about their learning.
Recommendations
Thoughtful criticism of the instruments yielded recommendations for what an assessment or instrument could be, which in turn led to recommendations for the Logistics team and future STEM instructors.
These recommendations could help us design or suggest better possibilities for STEM practitioners:
Small Assessment: On the analogy of the pedagogical strategies that James Lang advances in Small Teaching, we suggest a bite-sized assessment plan. Rather than reconfiguring the assessment of student learning in STEM classrooms wholesale, we suggest a more mediated revision of assessment strategies, primarily advocating short assessments that improve the acquisition of critical thinking and logical reasoning skills in individual class sessions, or components of class sessions.
Formative Assessment: Rather than viewing the assessment of critical thinking and logical reasoning skills as an abstract, summative assessment, we propose using assessments to help facilitate and guide student learning. Formative assessment is a long-used strategy in learning focused classrooms, which provides vital information to students and instructors alike regarding the level of comprehension and progress towards learning goals. Given the foundational nature of critical thinking and logical reasoning for long-term success in STEM disciplines, we suggest that assessment of these skills be formative in nature.
Local Assessment: Assessment is not done in the abstract. Specific learners are assessed, for the purpose of continuing and informing their learning. Our recommendation is to assess the actual students in the classroom, to measure their understanding of the critical thinking and logical reasoning skills as covered in the class, as a foundation for later work in STEM, and in relation to their particular experiences and education.
Situated Assessment: Because critical thinking and logical reasoning skills are a vital foundation for success in STEM disciplines, assessment of these skills within STEM classes should focus on them within STEM, not apart from it. That is, the focus should be on critical thinking and logical reasoning as they exist in particular STEM fields, not as abstract philosophical concepts.
Progressive Assessment: Learners are by nature making progress toward a set of learning goals (both their own and those set for them by instructors and programs). Assessment of critical thinking and logical reasoning should itself be considered progressive, in that it is part of a student's progress toward acquiring these skills. Further, learning creates progress, so assessment of critical thinking and logical reasoning should start from the presupposition that all learners, teacher and student alike, are in flux. The assessment of critical thinking and logical reasoning in STEM should, therefore, contribute to a better understanding of the ways that critical thinking and logical reasoning contribute to a learner's success in STEM fields.
In addition to these theoretical foundations, a few practically oriented suggestions are worth presenting:
Embedding project definitions in our assessment strategies: More than one instrument defined critical thinking narrowly. Any assessments we propose should include clear and specific definitions of key terms and jargon. Particularly given the variety of ways the term “critical thinking” is used, it should be carefully defined and delimited for students in any assessment created.
Situated contexts: Specific attention should be given to the contexts and scenarios presented in assessments. Given that assessment rests on an understanding of context, and that context is never neutral or fully general, it is imperative to take care when crafting the context in which students will apply their critical thinking and logical reasoning skills in the setting of an assessment. We advocate using the context of the classroom as a relatively neutral starting point, though we recognize that students experience events in class in a variety of ways.
Growth mindsets and holistic rubrics: Connecting holistic instruments to another assignment or project situates the purpose of assessing complex skills and communicates some measure of progress back to students. However, oversimplifying the complexities of CT/LR works against what, in discussions related to this project, we have said we value about our disciplines.
Further areas to explore to assess critical thinking and logical reasoning:
Development of formative assessments as applicable to development of logical reasoning and critical thinking
“As a formative tool - it would be useful to think about ways of first measuring presence of individual elements of critical thinking (i.e., is there evidence? Is it integrated into the argument?)”
Adapting or framing existing Classroom Assessment Techniques or established active learning strategies within the project's emerging definitions, since these techniques might not be as readily adopted as we might imagine.
We recognize that this type of assessment is labor intensive and requires a different set of knowledge from faculty. However, a shift toward an active formative assessment strategy does not need to happen all at once. Any shift toward this kind of learning has benefits for the learner (Freeman et al., article). Shifts can be labor intensive, or they can be less time consuming (Lang, Small Teaching). What we propose, then, is a series of interrelated or overlapping small shifts in the pedagogy of assessment, each of which is meaningful on its own, but which taken together provide increased benefit to the student and their learning.
Qualitative assessments vs. numerical assessments: The value of a professor's autonomy and ability to situate the assessment within a course outweighed that of numerical categorization.