InTASC Standard #6: The teacher understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher’s and learner’s decision making
Multiple methods of assessment are critical for gauging where students are in their learning and determining what supports I need to provide so that all students can monitor their own growth and meet mastery on our unit's science objectives.
Formative assessments in my middle school science classroom take a variety of formats and structures. These periodic checks are an important component of my teaching practice because they reveal where students are in their current level of understanding and in their learning journey; this data guides how I adjust my subsequent lesson plans. The data collected from formative assessments help me tailor my instruction so that it is differentiated to particular student needs and scaffolded to meet students where they are. Formative assessments demonstrate individual students' current knowledge and, collectively, reveal class-wide patterns of comprehension and confusion. I use a variety of formative assessments to collect this valuable data, including daily exit tickets, lab stations and activities, quizzes, and written formative assessments. These different forms of assessment capitalize on students' varied strengths. The data gathered from them guides my instructional decisions and provides students with feedback on their own progression through the unit and, in turn, informs their decisions about how to focus their learning efforts moving forward.
Following every lesson, students complete a daily exit ticket on their progress toward the day's objective. The exit ticket question is displayed on my PowerPoint at the end of the lesson, and students work silently and independently to complete it. They place their exit tickets in a folder by the classroom door as their "ticket" to leave the class. After each class session, I review the exit tickets and sort them into three piles: "Met or Exceeded Mastery," "Approaching Mastery," and "Needs Intervention." Based on the daily objective, I can determine which students need additional support and which students understand and have met that day's learning objective. The example below highlights student responses to an exit ticket, organized by level of mastery. This data guides my instructional decisions for the next day.
In addition to tracking daily lesson goals, exit tickets allow me to monitor students' progress toward unit objectives. Exit ticket information helps me understand whether, and to what extent, particular students and the class as a whole comprehended the day's learning and met the objective within the broader progression of the unit. I always word exit tickets so that students must take the day's learning and apply it in a new context; that way, I assess students' learning rather than simply their recall. These questions are also worded so that there is a clearly correct response. Organizing student responses by level of mastery gives me important information about patterns of understanding as well as areas of confusion or misunderstanding. Targeted interventions during the learning cycle help me support students toward the goal of mastery and tailor my instructional practice for the class as a whole. With this data, I can alleviate confusion, dispel misunderstandings, and introduce new strategies for improved comprehension. Adapting my practice and identifying alternate modes and methods on an ongoing basis is critical to ensuring that all students achieve the intended learning and succeed.
In my classroom, I set high expectations for my students daily, particularly regarding scientific content knowledge. That said, if data collected from exit tickets show that students did not meet the intended objective or struggled with the added rigor, I do not give up on my students' ability to meet the intended goals. I continue to demonstrate to students that they are capable of learning and improving, and I further adapt my own practice to support them. Using ongoing feedback from this kind of formative assessment gives me key insights that help me support students individually as well as class-wide. As a result, I can plan future lessons and content that are personalized and differentiated to meet student needs and help them access rigorous content.
Exit Ticket Examples
The exit ticket data below is based on the formative assessment prompt: Baby powder is white and soluble. Which of these observations is a chemical property? Which of these observations is a physical property?
Students in the "Met or Exceeded Mastery" category were able to answer that soluble is a chemical property and white is a physical property. Students who were "Approaching Mastery" understood the meaning of chemical and physical properties but misidentified one or both observations. Students who "Need Intervention" demonstrated that they do not yet understand the concepts of physical and chemical properties and were not yet able to apply this information in a new context.
This image shows Whole Class Groupings (met, approaching, intervention)
This example shows an exit ticket from a student who "Met Mastery"
This example shows an exit ticket from a student who is "Approaching Mastery"
This example shows an exit ticket from a student who "Needs Targeted Intervention"
The exit ticket data below demonstrates students' understanding of density as it relates to mass and volume. Students answered the density exit ticket question, What is happening here? Why?, based on an image of a large wooden ball floating on water and a small metal ball sinking. The data showed me that, surprisingly, students in the "high" class had a less complex understanding of density than my other two "lower" classes. Students from the "lower" classes understood that an object that is less dense will float, while the majority of my "high" class did not grasp that concept based on exit ticket data. Monitoring progress with this type of formative assessment helped guide my instructional decision-making. After analyzing the data from these exit tickets, I reviewed density with my "high" class to correct misunderstandings and support my learners in improving.
This example shows a student who "Met Mastery" in my "low" class
This example shows a student who "Met Mastery" in my "mid" class
This example shows students who "Need Intervention" in my "high" class
Targeted intervention was used the next day to clarify misconceptions about density; the underlying relationship is summarized below.
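For reference, the relationship students were expected to apply can be summarized as follows (this restatement is mine, not part of the original assessment wording):

\[
\rho = \frac{m}{V}, \qquad \text{an object floats when } \rho_{\text{object}} < \rho_{\text{fluid}}.
\]

The large wooden ball floats because wood is less dense than water, despite the ball's greater size; the small metal ball sinks because metal is denser than water, despite its smaller size.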
Applied learning activities, used as assessments, provide me with critical information about students' understanding of particular scientific content and their ability to apply that content as scientists performing important work. Students always pair their content knowledge with a scientific laboratory skill, such as graphing, measuring, or weighing items, so that I can see where they are in both their content understanding and their laboratory skills. This assessment information, in turn, helps me understand how I must support students in their understanding of key scientific concepts as well as their ability to design and implement experiments to test core ideas.
In the example below, students were provided with everyday items, including salt, honey, baby powder, baking soda, and soap, at three different stations. Students had to apply their content knowledge of physical and chemical properties by listing multiple physical and chemical properties for each object at each of the three laboratory stations. Afterward, students answered check-for-understanding questions that required them to apply their knowledge to a real-world example. In this particular example, students recorded their observations, compared the items in the lab stations, and answered applicable questions, which connected this assessment to our Unit Driving Question, How can I make new stuff from old stuff?
This mode of assessment requires students to apply their learning in a hands-on context. Not only do labs and station activities ground students' learning in real phenomena and experiences, this type of assessment also helps me differentiate my own practice for my tactile learners and my students with individualized education plans (IEPs). Following this assessment, I am able to see whether students grasp the scientific concepts well enough to apply their learning in new ways or whether they can only recall and recite information previously provided. My district and school emphasize the Next Generation Science Standards (NGSS), which require students to practice hands-on, student-led inquiry. Labs and station activities are methods well aligned with these standards because students are able to guide their own learning process, arrive at the learning themselves, and test their knowledge in challenging and novel contexts.
This type of assessment is critical to students' growth as scientists: they must apply scientific concepts by pulling content knowledge from a variety of sources while simultaneously using scientific skills, such as laboratory skills, in the process. By basing students' grades on both their lab performance and their responses to scaffolded laboratory worksheets, I gain a better understanding of their progress toward mastery as well as their improvement in practicing important laboratory skills.
Station Activity Examples
The example above demonstrates work from a student who "Met Mastery" for a station activity that tested students' mastery of properties.
Students demonstrated understanding of chemical and physical properties through challenging questions that followed the lab, as shown above.
This example shows work from a student who was "Approaching Mastery." This student understood "toxicity" as a chemical property of matter but did not know the other properties.
The worksheet above showed me that this student was confused, and I used that data to guide my instruction the next day: I supported this student by checking in during a turn-and-talk on chemical properties.
Lab Activity Example
Above is an example assessment from a different lab activity based on density. Here, students analyzed the density of different objects in vegetable oil, engaging in hands-on, student-led inquiry to deduce which items had high or low densities. Following the lab, students answered check-for-understanding questions about density.
Lab items included: cotton swabs, coins, pieces of soap, pieces of tissue, honey, water, post-its, and small plastic animals. All items were placed in vegetable oil one at a time and students recorded observations in their scaffolded data sheets.
To assess student understanding of scientific concepts, I utilize quizzes as formative assessments approximately every other week. These formative assessments demonstrate student comprehension of content and provide me with data on concepts I need to review at the whole-class level. Because quizzes are always completed silently and individually, they help both students and me understand each student's individual comprehension. In turn, the data gleaned from quizzes helps both me and my students make more informed decisions about how to focus our teaching and learning time. Quiz questions require students to recall information, showcase their understanding of important constructs, and analyze and apply data in new contexts. As such, students demonstrate their understanding at multiple levels of Bloom's Taxonomy.
Assessing students in this way gauges their ability to apply their thinking and learning in new ways and to demonstrate this understanding to me in a written format. The quizzes I design require students to use their scientific literacy skills and to compile evidence to support their answers. From this data, my students and I are better equipped to understand the extent of their progress toward mastery on multiple objectives and, in addition, to identify opportunities for improving study and test-taking skills.
Quizzes are a direct and meaningful mechanism for assessing students' ability to comprehend scientific concepts, to progress toward learning objectives, and to draw on evidentiary support from a variety of class experiences, including labs and real-world applications of their scientific knowledge. Data gathered from quizzes helps me understand which patterns and concepts students have mastered as well as which information students still do not understand. For example, on the quiz shared below, all but five students mislabeled one or both of the scientific tools on page 2; 94% of students did not correctly label the scale and the triple beam balance. As a result, I decided both to review these tool names in class and to include these tools in the next lab comparing mass and volume. In a subsequent assessment, I could then confirm that this scientific information was, ultimately, learned.
Quiz Example - Exceeded Mastery
This student "Exceeded Mastery" on the concepts of matter, mass and weight through this quiz.
Unlike the majority of students, this student did accurately label the two scientific tools above.
Quiz Example - Needs Intervention
This test shows an example of a student who "Needs Intervention." To engage this student in their own growth, I invited them to complete test corrections based on their performance.
There were clear gaps in comprehension for this student. Like most of my other students, this student did not properly label the scientific tools or understand the tools' measurements.
Summative assessments in my science classroom allow me to understand students' cumulative progress toward meeting multiple objectives and reaching mastery in any given scientific unit. This end-of-unit information helps me understand the extent to which students gained the intended skills, knowledge, and understandings associated with that particular unit; this data then guides my planning and decisions for the units that follow. For summative assessments in my classroom, students complete a variety of tasks, including building models with written explanations, engaging in project-based learning tasks, and completing scientific literacy assignments.
Utilizing a variety of summative assessments guides my decision-making as a teacher because it allows me to accurately determine the extent to which students ultimately understood the material while capitalizing on learners' different strengths and needs. The data gleaned at this point in the unit is critical both to the pedagogical decisions I make to tailor my practice moving forward and to students' decisions about where to focus their learning. If students demonstrate patterns of confusion around a particular critical concept, I make sure to review that concept before starting the next unit of instruction. For example, if the majority of students were unable to understand that atoms combine to make molecules, I would use this data and incorporate additional instructional strategies, such as an element computer simulation, to clarify the concept for my learners and give them ample opportunity to meet the intended learning objective.
Students are able to demonstrate their overall knowledge of any given unit through the creation of models. Modeling is a scientific skill that shows that learners not only understand a concept in a way that is applicable to real life, but also are using NGSS science and engineering practices as they apply to course content.
This method of assessment is especially important for my current chemistry unit, How can I make new stuff from old stuff?, because we cannot see atoms but can use models to represent various elements and their properties. In the examples shown, students picked their own elements and created models demonstrating the number of protons, neutrons, and electrons for their element. Students not only designed and produced their own models, they also presented their work: they shared it orally with their peers and explained in writing what materials they used for their models and why those materials were chosen to represent their ideas in a particular way. In this case, as with all model summative assessments, I provided feedback through a rubric that grades students on their models as well as their written and verbal explanations. The summative assessment data gathered here showed me that students understood the differences between chemical elements, comprehended what atoms are made of, and were able to accurately model this arrangement.
Model Summative Assessment Examples
Below are student examples of Elements Project models. Each student had their own element, so the number of protons, neutrons, and electrons displayed varied by student. Students were allowed to use any materials to create their atom models, and many were very creative, representing their atoms with innovative materials including candy, balloons, clay, stickers, cotton balls, beads, and more. (The counts students modeled follow from their element's atomic and mass numbers, as summarized below.)
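As a point of reference (an illustration I am adding here, not one of the student projects shown), the particle counts for any element follow from its atomic number Z and mass number A:

\[
\text{protons} = Z, \qquad \text{neutrons} = A - Z, \qquad \text{electrons (neutral atom)} = Z.
\]

For example, a student modeling carbon-12 (Z = 6, A = 12) would display 6 protons and 6 neutrons in the nucleus and 6 electrons around it.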
Elements Project Model
Above is an example of a student's Elements Project model. This student received a 4 (on a 1-to-4 scale) on the physical model section of the rubric because the model was "creative, neat, and accurately symbolized the element" the student picked.
Elements Project Rubric
The rubric above was shared with students to help them understand the expectations for mastering the different parts of the project, including the model, written report, presentation, participation, and peer feedback.
Elements Project Model Examples
Students were assessed on their model's creativity, neatness, and content accuracy based on the rubric. In addition to their model, students were scored based on their ability to present their ideas to others, write a clear and compelling report, engage during the process, and listen to their peers.
Models were displayed for the other classes to view following student presentations. In order to succeed in this assessment, students needed to meet or exceed mastery on all aspects of this project, including but not limited to the design and production of the model itself.
Another type of summative assessment that I utilize in my science classroom is the project-based learning (PBL) assessment. These assessments are grounded in student creativity and students' personal interpretation of the learning. They incorporate and build on students' personal strengths and require students to decide how they can accurately express the depth of their own learning. Project-based learning assessments are extremely involved: not only do students have to compile a large amount of scientific vocabulary and understanding, they must also demonstrate creativity and use their imagination to showcase their complex understanding of the unit's objectives.
Students demonstrated their understanding at the end of the life sciences unit, What is going on inside me?, through project-based learning. Students were required to demonstrate the complex processes that occur inside our cells through the creation of a cell cartoon. Each object in the cartoon represented a different part of the cell performing a different job. In one example below, the student used the analogy of a beehive as the cell.
Based on student work for this summative assessment, including the examples shared below, I came to understand that the majority of my students had an advanced conceptual understanding of the function of cells in our bodies. Thus, the majority of my students "Met or Exceeded Mastery" in their ability to answer the unit driving question, What is going on inside me? I was able to make these inferences regarding mastery based on my qualitative observations: I saw that students were able to draw analogies between cell parts and the real world and to use data appropriately and correctly. I could also conclude that students had excelled based on quantitative data from the summative assessment rubric. Together, the qualitative and quantitative assessment data helped me understand the extent of student comprehension and make decisions about how to bridge to the next instructional unit.
PBL Summative Assessment Examples
Above is a cell cartoon from a student who "Exceeded Mastery." This student used the analogy of a beehive to demonstrate each of the different parts of an animal cell. In addition to creating these cartoons, students shared their analogies with classmates.
The example above, from a student who "Met Mastery," uses the analogy of the cell as a zoo: the lion as the nucleus, the fence as the cell membrane, a watering hole as the vacuole, and the garbage can as a lysosome. Other exemplars of the cell cartoon project imagined the cell as a school, playground, aquarium, factory, soccer field, and more.
PBL Summative Assessment Rubric Example
This is the marked-up rubric for the work of the student above who completed the Cell Cartoon Project using the analogy of a cell as a beehive. The rubric was based on students' participation, presentation, content accuracy, and creativity/engagement.
Written scientific literacy assessments are another method of summative assessment that I utilize to gauge students' comprehension of scientific vocabulary and to test their ability to apply in-class understandings to real-world contexts at the end of a unit. NGSS emphasizes real-world, phenomena-based learning so that students remember the content, understand its underlying importance, and use it to brainstorm scientific strategies and innovations to improve our world.
In the following written scientific literacy assessment, students were required to analyze the ways in which plastic negatively impacts our world. Students worked to answer the question, What are plastic water bottles made out of and how can we re-use them to make new materials? Students read a scientific article for background knowledge, answered content-rich check-for-understanding questions, and then clarified and defended an argument about how we can solve this issue. In constructing their argument, they needed to demonstrate their understanding of the makeup of the molecules inside plastic.
Scientific Literacy Summative Assessment Example
In the scientific literacy assessment below, What are plastic water bottles made out of and how can we re-use them to make new materials?, student data revealed that, notably, a greater proportion of students with IEPs "Met or Exceeded Mastery" compared with their peers. The data demonstrated that the material was scaffolded sufficiently, and supported with significant instruction, such that all students could succeed.
This scientific literacy assessment asked students to apply their learning regarding complex and simple molecules to the real-world problem of plastic pollution in the environment (page 1).
Students answered check-for-understanding questions based on content knowledge, a video, and a reading. The goal of this summative assessment was to test for comprehension (page 2).
Once students answered the initial check-for-understanding questions, they were then challenged to defend an argument, using scientific evidence, in response to the following question: Should we recycle? Why or why not? (page 3).
Following this assessment, I provided students with detailed feedback on their ability to construct an accurate scientific argument from multiple pieces and sources of evidence (page 4).
To truly test student comprehension through a summative assessment, I first need to prepare my students with the necessary background content knowledge. Instructional materials helped prepare all of my students for this assessment, especially my students with IEPs.
Prior to this summative assessment, instructional materials were differentiated and included videos, images, and multiple opportunities for practice.