InTASC Standard #6: The teacher understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher’s and learner’s decision making
Assessments must be used purposefully, with accurate and useful data, in a way that drives both teacher and student decision-making to improve learning outcomes. If students monitor their own growth and view their progress toward mastery but do not make informed decisions based upon the data collected, then the assessment tool may prove useless or, worse still, counterproductive. Instead, it is critical that assessment data be thoughtfully considered and used effectively and productively; assessment data must improve learner outcomes and help both teachers and learners alter future practices. In my classroom, I take several approaches to data-driven decision-making, each based upon multiple methods of assessment designed to engage learners in their own growth and development. These methods allow students to self-monitor, support ongoing learner-centered decision-making, and prompt changes in my own practice. They include the use of diagnostic data, test corrections, peer feedback, and teacher-altered instructional strategies.
At the beginning of the year, my students take a science-based diagnostic assessment (shown below) that is grounded in the process of scientific inquiry. Students are asked to apply basic terms that are critical to scientific understanding, such as the "control" for an experiment and the "independent" and "dependent" variables in scientific experiments. This data helps me understand the content knowledge my students bring from prior background knowledge and experiences, and it provides crucial information about how students might approach complex scientific problems. Further, it guides how I alter my instructional practices so that I build upon what students already know, both about scientific approaches to learning and about particular scientific concepts. Because the questions are asked in an applied rather than a general way, as the examples below demonstrate, the data helps me identify and address patterns of misconception and confusion. I monitor student understanding across the year: students take an equivalent assessment at the midpoint of the year as well as at the end of the year. In this way, I can simultaneously monitor student growth, continuously adjust my instructional practice, and work with data that is specific, actionable, and immediately useful in altering my plans for teaching.
For example, data from the beginning of the year showed that the averages in each of my 7th grade science classes fell in the "Needs Intervention" category; the averages in all three classes were 69% or below. That data guided my decision-making for the starting unit of the year. Instead of reviewing scientific inquiry quickly before beginning Unit 1, "How can I make new stuff from old stuff?", I had to significantly review the main concepts in the steps of scientific inquiry and teach them in ways that would hold meaning for students. I did this through a review of all key scientific vocabulary, several in-class collaborative practice opportunities, and two experiments. Students completed a peanut butter and jelly experiment to learn the importance of a detailed procedure when performing scientific inquiry, and a paper plane experiment to understand how to record, graph, and analyze data; diagnostic data showed that both concepts were missing. These instructional strategies can be viewed in the applied student learning section of my instructional strategies. Thus, diagnostic data informed my decision to re-teach much of the process of scientific inquiry, and it provided insight for future assessments and in-class activities; namely, that I must actively plan for student-led scientific inquiry in my science class to solidify this important topic in students' memories.
Scientific Inquiry Diagnostic
This scientific inquiry diagnostic, given to students at the beginning, middle, and end of the year, provides me with data to meet students where they are, challenge current misconceptions, and build on students' prior knowledge and understandings. From this data, I gain a better sense of how deep their understanding of core scientific concepts is, along with their ability to apply those ideas in a scientific example. I can also make more informed decisions about how to focus our instructional time together.
Student growth on this assessment is monitored at three different points of the year; this data helps me understand how to better support students in comprehending science content and in using student-led inquiry to tackle real-world problems. In this particular example, students are asked to generate their own hypothesis, which tests their ability to engage as learners in the scientific inquiry process. This data, along with the process of answering these questions, gives students a means of understanding what the scientific inquiry process entails and an opportunity to experience that process first-hand.
Following written assessments (quizzes or written summative assessments), students have the opportunity to perform test corrections if they have not yet met mastery (80%). Students are not required to perform test corrections; however, I share with students that they will gain back points on their assessment based on their engagement and effort on the corrections. This allows for further reflection and self-monitoring, as well as engagement in their own learning process and progress. Unlike tests and quizzes taken during class without access to notes and assignments, students may use their study materials when making test corrections. Students complete the test corrections form, demonstrating where and why they made mistakes on the assessment; further, they must explain how they arrived at the correct answer. To support students in gaining the intended learning, I often devote some class time to test corrections so that I can circulate and provide intentional, targeted support based on students' areas of confusion. Specifically, I support students in reaching learning objectives by pointing to labs, handouts, homework, and activities that may help them gain the intended learning and correct their errors. In this way, students are fully expected to engage in their own learning.
For example, several of my students struggled on the assessment "Chemistry Test 1." Although I created scaffolds and supports to help many of my students improve and succeed (explained in more detail below in my teacher-altered instructional strategies section), there were still a few students who did not fully demonstrate their abilities. As such, I gave every student who did not achieve mastery (anyone who scored below 80%) the chance to perform test corrections. Several students who did perform test corrections did so thoroughly and carefully, as observed in the example below. These students effectively used their original assessment performance data and made the decision to do corrections and improve. To help guide learners' decisions and answers on these test corrections, I made myself available as support during several lunch periods over the week, and I encouraged students to use my detailed comments on their original tests to help them amend their answers. Through the process of analyzing their own data, deciding to improve their score, and performing the corrections, students grew in their science content knowledge and in their testing confidence. Students using test corrections to continuously grow and improve demonstrates their willingness to use data to make positive academic decisions.
Test corrections allow students to use their own assessment data in order to move forward and improve. Test corrections further allow me to understand where students are in their learning and exactly how they still need to improve, both as individuals and, to the extent there are patterns of understanding or misunderstanding, as a class. In this way, students are engaged in their own progress and growth, and my instructional decisions are altered based on this student data.
Original Test - Needs Intervention Example
Above is one student's original "Chemistry Test 1" assessment. This student was in the "Needs Intervention" category based on his score (52%). I marked up a significant portion of this student's test so that when he performed test corrections he could understand exactly why he lost points and could use this data to improve. Test corrections allow students, such as this one, to internalize and analyze their own assessment data in order to move forward.
Revision - Test Corrections Example
Above are the completed test corrections for the "Chemistry Test 1" assessment example (to the left). This student used his class notes, labs, activities, and feedback from me to correct his answers and explain his reasoning. This data demonstrates to me that this student has grown in his understanding of key science concepts, has monitored his own growth, and has taken steps to improve his learning based on assessment data. As a result, this student gained back points and moved from "Needs Intervention" to "Approaching Mastery" on this test.
Peer feedback is critical to helping students engage in their own learning while helping other learners. By sharing their understanding of particular scientific concepts in various formats, students come to recognize areas in which they need to improve. Further, by reflecting on peers' work, such as a presentation, students come to reflect on their own ability to communicate understanding to others. As a result of this process, students are better prepared to make effective decisions about how to focus their attention and time when facing complex scientific material. They might learn, through the process of giving feedback on a presentation, just how much they did not actually understand about a certain scientific topic, such as a chemical reaction, and thereby identify areas for their own improvement. Further, they might discover new insights about how the presentation of scientific ideas can influence another person's understanding.
I constantly offer support to students during the learning cycle and give them specific tips for improvement following assessments, whether those assessments are based upon written tasks, oral tasks, project-based tasks, or other formats. Across these modes of assessment, peer feedback, in the form of peer scoring rationale or peer editing, can deepen students' understanding of where they reached mastery on a given topic as well as how they could have improved. Students generally respect their peers and internalize peer feedback in a way that positively informs their long-term progress. Therefore, when done well and often, with thoughtful guideposts and parameters, I find that peer feedback can be extremely effective as a data-driven approach to improving student-centered learning.
Peer feedback also helps to guide learners' future decisions regarding their own growth trajectory. This feedback provides all of my students with specific glows and grows from their peers that they can use to understand how to improve the next time they share their work with one another. Specifically, students are encouraged to note one portion of the feedback that they will try to improve upon for their next presentation. In this way, students can make a commitment to their own learning that is based upon data and feedback that they value and are motivated to incorporate into their decisions regarding their own learning and development moving forward.
Peer Feedback Examples
Students are encouraged to grade their peers in a way that is positive and productive. Following presentations, I return peer feedback, omitting all graders' names. In this way, students can monitor their presentation growth and use the data to guide their decisions about future ways to improve.
Example 1 (top image)
For example, in the peer feedback form from the Elements Project summative assessment (the top image), one grader wrote that the presenter included detail and a helpful model (seen in the summative assessments section of my multiple methods of assessment). The grader also noted that this presenter could have improved their knowledge of their individual element and their ability to answer audience questions. This feedback, from this grader as well as from other students, helped the presenter improve in those areas for subsequent presentations.
Example 2 (center image)
The second example (the center image) demonstrates a slightly altered peer feedback form. This form was used for students' Earth Day Environmental Issue projects (explained in the student-led research section of my instructional strategies). I wanted students to receive qualitative data on this assessment so that they could understand how well they were able to captivate, empower, and move their audience, as well as share rigorous content; from this data, students could decide which persuasive strategies to use in their next presentation.
Example 3 (bottom image)
The third example (the bottom image) shows a different peer feedback form. This form was used in the Wind Turbine Design project, a collaborative group activity (explained in more detail in the collaborative opportunities section of my instructional strategies). Here, students were able to give their peers data regarding how well they work in groups. This data helped students work more effectively in future groupings.
I use data from assessments to inform my future instructional strategies and to make data-informed alterations based upon patterns of comprehension and confusion. For example, 60% of students with IEPs performed poorly on a scaffolded assessment about density; this data demonstrated that I need to differentiate further in future assessments to support all learners in my classes. Patterns that emerge across multiple assessments can play a critical role in the decisions I make about how and what to teach as a science instructor. I must work both inductively and deductively, analyzing the data in a bottom-up fashion across assessments to search for patterns. Then I must check my own inferences by testing alternative ways of teaching as the year progresses. The data I gather can be especially valuable in helping me determine learning needs among my different classes, among groups of students, and for individual learners who need varied supports. It forces me to challenge my own assumptions about what does and does not translate across groups of learners, as well as ways I might need to provide additional differentiation on assessments. I can use this data moving forward as I create and modify future assessments.
As an example, I concluded from previous assessment data that my students with IEPs as well as my ELL students were not performing well on quizzes. Specifically, students averaged 71% ("Approaching Mastery") on their "Introduction to Chemistry Quiz," with several of them receiving 40-50%. In-class qualitative and quantitative data showed that my IEP and ELL students understood the concepts and gained mastery on formative assessments such as exit tickets; however, they were not demonstrating this mastery to the best of their abilities on tests. I collaborated with the special education teacher and spoke with my students about what helps them in a stressful testing environment. These two points of data (from students and the SPED teacher) helped me realize that I need to purposefully include: more visuals, more space for writing answers, word banks for scientific vocabulary, bolded important words, and options for demonstrating answers (writing, drawing, or creating a diagram). Additionally, this data showed me that I should read prompts (especially long-answer questions) aloud to students, because in-class data showed that they answer questions more accurately when they hear them read aloud rather than reading them on their own. Reading aloud supports my lower readers as well. Thus, assessment data helped guide my decisions and assessment structure so that all learners could succeed. These alterations significantly helped my students. My students with IEPs averaged 77% on their next assessment, "Chemistry Test 1," shown below. This average is significantly higher than the previous assessment average of 71%, with multiple students improving their scores by 20 percentage points or more. One student in particular, who struggles to finish work in class, went from a 40% on the previous assessment to an 82% on "Chemistry Test 1." A second student with an IEP went from a 40% on his last quiz to a 91%. I still have additional work to do to help all of my IEP and ELL students meet or exceed mastery; however, including purposeful and targeted scaffolding based on student performance and assessment data clearly improves student outcomes. Data-informed assessment decisions and alterations are necessary so that all students are able to access the learning and demonstrate their full understanding on assessments.
Altered Assessment by Data - High Example
Based on in-class data as well as prior assessment data, it was clear to me that the students in my "high" class were "Exceeding Mastery"; specifically, students averaged 92% on their previous assessment. As such, I created this altered assessment to be more rigorous and to require students to extend their thinking beyond in-class examples. Compared to other classes' assessments, this class's had more application questions, more blanks to fill in with scientific vocabulary, and more True/False questions requiring explanation. Even with these challenging alterations to push my students, they still succeeded: on average, students "Met Mastery" with an assessment score of 88%. This data shows me that I effectively challenged students and increased rigor; it also demonstrates that I must continue to create opportunities for them to extend their knowledge and continue their individual growth.
Altered Assessment by Data - Mid Example
After collecting both in-class activity data and data from their previous assessment, I found that many students in my "mid" class were "Meeting Mastery"; students averaged 83% on their prior assessment. However, some students, especially my ELL students and the lower readers in the class, still struggled with questions that require them to read a longer text. Though students comprehend concepts in class, they still need to be able to access the intended learning on their assessments. From this initial data, I decided to scaffold questions on this assessment so that my "mid" students still gain the intended learnings, but the questions are less wordy and more accessible. Students on this assessment scored an 85% average and thus "Met Mastery." In the example shown, this student, a lower reader, received a 63% on the first assessment and a 91% on this one, which shows significant growth and improvement based on these changes.
Altered Assessment by Data - IEP/ELL Example
In order to support my students with IEPs and my ELL students, in this assessment I included more visuals, more space for writing answers, word banks for scientific vocabulary, bolded important words, and options for demonstrating answers (writing, drawing, or creating a diagram). I also read prompts aloud to help my students with IEPs and my ELL students decode text; this alteration supports my lower readers as well. These changes led to success for many of my ELL students and students with IEPs. My students with IEPs averaged 77% on this assessment ("Approaching Mastery"). One student in particular, who performs well in class but struggles to demonstrate her knowledge on assessments, went from a 40% to a 91% on this assessment.
Thus, the data demonstrate that I was able to alter my own decision-making as an instructor regarding how to test for understanding of critical scientific content. The standards did not change and my high expectations did not change; however, my approach for achieving those targets for understanding did. I believe this enabled me to more effectively and equitably facilitate the learning of all of my students. Moving forward, I will be able to include this new assessment in the battery of approaches previously mentioned, including peer feedback and test corrections, to continuously assess how these changes in my own instructional practice do or do not provide useful data that can engage students and accelerate student mastery.
Interventions Backed by Assessment Data
Data shows that this student, similarly to other students in the class, struggled to identify the reactants and products of chemical reactions on this quiz. I discovered this based on his assessment grade (47%) and, more specifically, his response to question 2 on the assessment. Many students were able to understand the differences between chemical reactions and non-chemical reactions; however, most students, including this student, could not identify which substances enter a reaction (the reactants) and which are produced by it (the products). This data showed me that I needed to give learners more guidance on this concept to help them achieve mastery.
This student improved significantly from her last assessment (78%); on this assessment she "Exceeded Mastery" with a 91%. She also performed significantly above her peers, who scored, on average, 76% on this assessment. Thus, during collaborative work time, I empowered this student to support her peers in reaching the intended objectives and learnings about reactants vs. products, given that the majority of students did not do well. Peer tutoring and support is a crucial intervention because peers can often explain concepts differently from me, in ways that other students will more effectively understand.
Interventions Provided Based on Assessment Data
Performance on the assessment above shows that 43% of students in my "low" class fell in the "Needs Intervention" category on their "Chemical Reactions Quiz." This data, combined with classroom observation data showing that students were specifically confusing the terms reactants and products, guided my decision-making. As a result, I created this scaffolded activity for students, "Chemical Reactions: Reactants vs. Products."
Students worked collaboratively on this assignment to determine the reactants and products for each sample chemical reaction. To support all of my learners, I used examples that we had already discussed in class and ones that I thought they would be able to relate to. Students then worked independently to answer a challenging application question about why pennies turn green if left outside in the rain for a long time. Students thus had a chance to guide their own learning and decisions, apply their knowledge, and demonstrate that they now know the difference between reactants and products in a chemical reaction. I will continue to use a variety of assessment data to inform the decisions that my learners and I make.