Action Research - Assessment PHASE 1

When I found out we would be able to shape our own Professional Development using the Action Research Framework I was excited but then instantly overwhelmed... what on Earth would I focus on? There is so much I want to learn, how can I narrow things down? The answer came through some professional reading I started doing while on maternity leave... Assessment! Always a passion of mine but rarely had I been given the time and opportunity to really try to make some positive changes.

My official focus is on empowering students to take a more active role in assessing their own learning and on moving toward feedback rather than grades. I will do my best to document the journey here.

Resources, Reading, Reflections & Professional Development Opportunities

Information is power so I've been trying to gather as much information as I can! I started with reading some highly recommended books.

As I make my way through this pile, I am continuously inspired to look at assessment through a different lens than I have been used to.

Classroom Implementation

This was the part I really needed to wrap my head around and take the biggest leap of faith on. All the reading in the world would be useless if I didn't just jump in and start trying. So I started by having some discussions with my classes. We discussed what assessment is, who uses it, what its purpose is, and the difference between formative and summative assessment (in their eyes).

It was very eye opening.

I made sure that I recorded what students said without judgement and with very little input of my own, just trying to get their honest thoughts on how they've experienced assessment in the past. I saved the evidence of these discussions using Notebook and used it to plan the next steps.

One of the biggest, most thought-provoking questions I posed to students was:

"If I told you that you had the opportunity to assign your own grade in science this term, but that you had to provide evidence to support it, what would that evidence look like?"

Students were a bit stunned. Some wondered "would you really do that?", others automatically said they'd assign an "A" (of course) and, very interestingly, some students said "that sounds like a lot of work".

First, they recognized that they would need to know what they were expected to learn - enter student-friendly outcomes (see left). Then we discussed, in general terms, what this might look like at different levels of understanding. I'm not going to lie, this was a challenge. In the end, I pieced together some definitions that students agreed upon. They are all still works in progress (as is this entire process), but it's a good starting point.

What do they need to know?

One thing I have wanted to do for a long time is translate the Program of Studies into student-friendly language. This is a work in progress, but I tried to break things down into overarching outcomes with the steps that are involved.

The hope is that if students can perform all of the steps independently, they (and I) can be confident that they know and understand the outcome to AT LEAST a proficient level.

Two outcomes from the Light & Optical Systems Unit in Science 8.

What does learning look like at Exemplary, Proficient and Adequate levels?

Using outcomes and descriptors to design assessments

Once we had settled on descriptors, it was time to start thinking of how both the students and I could assess learning. Yes, I want students to take a more active role, but let's face it: I'm the professional, and it's ultimately my responsibility to provide evidence supporting how a student is assessed. I decided to try something a little bit different and "level" my assessments. That is to say, for each outcome, I would design questions that would assess at an adequate, proficient and exemplary level, and students could "move up" or even challenge the level they felt they wanted to try.

Science 8 Reflection - sample questions testing the same outcome at Adequate, Proficient and Exemplary levels of understanding.

Adequate:

Proficient:

Exemplary:

Assessment Options

Students often say they 'don't do well on tests'. Ok. So my question then was: how else could you demonstrate what you know and can do? We brainstormed a list and then discussed the strengths and pitfalls of each for demonstrating understanding at the different levels. Students quickly realized that some of their "go-tos" (like making a presentation) wouldn't make it easy to show exemplary learning.

Many said they like to talk or show what they know. For this I started experimenting with Flipgrid. It's an online program that allows the teacher to create topics that students can respond to with short video clips on the Chromebooks. First I just let them play around with it, which was a lot of fun, but when it came down to figuring out how to use it for an actual assessment, it became obvious that some more guidelines needed to be discussed and that I needed to sort out the logistics of having students perform several different types of assessments of the same outcomes.


First Assessment Day - Science 8 Light and Optics

My first attempt at this was with my grade 8s. They were given the outcomes, steps and level descriptors ahead of time, and they had some time in class to decide how they would be able to show their best learning. They told me what they wanted to do so I could print tests, plan where students doing Flipgrid responses would be, etc.

I met with the students before we all began to give instructions and confirm who would be doing a written assessment, video etc. and then we were off!

Overall class instructions for first leveled assessment in science 8.

I used tri-fold boards to make "isolation booths" for students using Flipgrid.

It was great to see who had planned well for doing their assessment using Flipgrid - they had scripts and asked for materials to demonstrate with. Some learned quite quickly how important those planning stages were! It was also interesting that during the assessment one student looked up from test writing and said "I really like doing it this way. It really makes me see what I know and what I don't". Music to my ears!

Once everyone was done, they needed to fill out their Student Assessment of Understanding sheet that they had received to guide their studying a week earlier. They needed to determine what level they believed they should be assessed at now that they had performed their tasks and why. This was by far the BEST part of the whole task.

My Reflection on First Assessment Day

I thought two outcomes would be no problem for one class. I figured I usually give quizzes covering much more material in that time frame and it wasn't an issue... well, what I didn't consider was how thoroughly I was assessing each outcome this way. It took every student the entire class period, and many had to come back for extra time. SO for the future I either need one outcome only, or simpler questions, or something... still mulling that one over.

BUT on the positive side... I LOVE how much information I got from doing the assessment this way! It is very clear to what level students understand the information and it makes reporting their progress so much easier. It's all right there. I also really like that the students seem quite aware of where they are now by doing this. If they couldn't tackle the exemplary questions they know they need more help to get to the next level. It was a lot of work to set this up but I think it was worthwhile, especially if I am doing fewer but more meaningful assessments.

Reflection after Reviewing Student Assessments

WOW! First of all, what a great way for me to identify, specifically and right away, the concepts that needed reteaching, reviewing and relearning! It became so clear because everything was in order and leveled. I could see right away, for example, that students needed reminders on how to measure angles of incidence and reflection in a ray diagram. It was also fantastic when marking to be able to see exactly where students started to struggle as they moved "up" in the levels of understanding. Because they'd had clear expectations of what each "level" looked like they were very aware of what they'd achieved!

On a constructive level, I noticed that giving students the choice to attempt one of the higher levels was problematic when they did not demonstrate understanding at that level. For instance, a student challenged Exemplary only for reflection and was not able to complete any of the assessment. At that point, what choice do I have but to indicate the student has demonstrated Limited understanding? So based on this I have decided all students must complete the Adequate level assessment before challenging any higher levels.

Student Feedback

I did a debrief discussion with each of my classes after they were done. I wanted to know what they liked, what they didn't and if they would change anything the next time around. I shared with them that I had already determined that two outcomes assessed this way was definitely too much and I could tell that they appreciated that this was noticed.

It was a very interesting discussion! Some students loved getting to work their way up through the leveled assessments, others just wanted to be able to challenge whichever level they wanted to no matter what. Still others said they liked that they didn't feel pressure to move on and challenge higher levels when they knew they couldn't do them.

Many students said they liked knowing right away where they "stood" by seeing where they started having difficulty. Surprisingly, though, there was a group of students who said they DIDN'T like that they could see where they were struggling, and even said they'd rather I just mixed up all the questions and then told them in the end where they'd been assessed. This one made me chuckle because what it told me was that my assessments were, in fact, doing what I'd hoped. They were making students more aware of what they know and can do, but some students didn't like that newfound awareness. What a testimony to how they've viewed assessment in the past.

Reassessment = less marking than "Retakes"

Anyone who knows me knows I love my job. But they also know the bane of my existence is marking... ugh. I am so bad at keeping up with it. Not because I am lazy but because I have always been more interested in creating classroom experiences for learning and those take a lot more time. So if the choice is mark or develop a new lab or activity, the marking does not win.

BUT here is what I am finding out with this system I am developing... there is less actual paper marking to do - read more about that in my Interactive Notebooking journey - and when I do mark it's so much more meaningful. However, the biggest bonus by far is how the reassessments are reducing my marking load. Instead of kids rewriting a bunch of stuff they've already done and likely getting the same mark, they ONLY challenge what they need to. This means I ONLY need to mark what they need reassessing on. Can you see me doing my happy dance?

Getting out of the "does this count/can I redo the test" mentality

Probably one of the least enjoyable questions a teacher gets is "does this count?" when assigning a task or preparing students for a test. I will admit, it grates on me every time and I have to take a breath before I respond. Unfortunately, I think the way we sometimes communicate what formative and summative assessment are can perpetuate these kinds of questions from our students. Telling kids that formative assessment "doesn't count" leads them to think they don't need to worry about trying when, in fact, formative assessment is often the most important tool we have! Formative feedback is what kids use to improve. I LOVE it when I have assigned a lab analysis paragraph and students are lining up to have me read their work and offer advice; it means they want to do better. But I think what they are missing is that this IS formative assessment. Finding out what is missing, re-working, re-writing, re-wording... this is the most important part of learning.

Because assessment is based on outcomes and progress is reported with formative descriptors until a summative grade is assigned for the unit, students are recognizing that everything they do "counts" towards their learning. Seeing that their learning is fluid, they seem more willing to come in and review and relearn material.

In this vein I am also trying to change our vernacular around "retakes". It is not called a retake. It is called a "reassessment of learning". Students only have to challenge the outcome and level that they feel they need to improve. Why do I need to make them rewrite an entire "test" when they've already shown me that they understand one outcome really well and that they just needed a boost to get to "exemplary" instead of proficient for the other? The answer is: I don't.

As we have progressed through the year, I have found students ask these questions less and less. If one student forgets and asks "does this count?" usually the rest of the class will respond with "everything counts", which is incredibly helpful.

Parent Teacher Conference Discussions

Prior to conferences this year I crafted a message to post on SchoolZone for parents to read and digest before interim reports were posted, because I knew that things would look slightly different. Instead of "Reflection Lab" and "Topics 1-3 Quiz" with summative marks, parents would see curricular outcomes and formative descriptors. I wanted to head off any concern by explaining what was happening and why there would not be summative grades posted at this point.

What I found was that the vast majority of parents were very happy with the system. They saw WHAT their kids were learning and how they were performing against the outcomes so far. The fact that all students had the chance to review and relearn material and then reassess meant that learning was the goal, not just "let's get through the curriculum". A few parents said they'd been a bit confused by the SchoolZone post but that once we discussed it in person it made more sense.

Still a few more expressed slight concern that students wouldn't be 'prepared for high school'... this is always something a junior high teacher has to endure, just as, I am sure, high school teachers have to deal with people being concerned about their kids being 'prepared for university'. All I can say is, as a teacher, my job is to teach kids, help them learn the curriculum and do my best to make sure they feel confident in their abilities. If I do that, students should be prepared for anything.

Progress Reporting

This was my first year back from maternity leave, and there were LOTS of changes to how we reported student learning this year. It was a relatively steep learning curve for me, but it helped that I was already assessing and reporting student learning based on outcomes; it made evaluating strengths and areas of growth more straightforward. I could report to the specific learner outcome where students needed assistance, and I could program for their needs more easily.

We also release Interim Reports 3 times per year at our school, a few weeks before each progress report comes out. I found it was helpful to be able to include a comment with the overall outcome and how the information was collected. If a specific student did something very well or showed an area for growth I could put it right in as I entered their data, again improving the communication between home and school. I found I had very few inquiries after interims or progress reports because parents were receiving sufficient information.

Co-constructing Criteria

This is easily one of the most essential tools I have found for getting students to take ownership, but it can be hard to decide the best times to co-construct and how to implement it. For science, I have found it is best done when we are looking at skill-based outcomes such as problem solving and lab analysis.

This year I have very large classes (30+ students), so a more traditional co-construction, where students write criteria on large strips of paper and lay them out on the floor, wasn't practical. Instead, I handed out copies of student answers to questions I had given my classes the day prior. These are answers I have collected over years of teaching to demonstrate various levels of understanding. I asked students to identify which answers looked like they represented exemplary understanding and why. I gave them post-it notes to record their ideas on - one idea per post-it.

Once we had gone through the answers and students had their post-it ideas on their paper, I posted large sheets of paper around the room and asked students to put their ideas on each sheet that matched their criteria.

It was great to see students picking out common themes. If we found criteria that didn't seem to match one of our posters, we created a new one!

I used the student language from each poster to develop our co-constructed criteria for exemplary lab analysis. This was printed onto a piece of bright yellow paper and given to each student to put into their interactive notebook. I also colour-coded each of the criteria so students could use this to better evaluate their own work before submission.

Using Co-Constructed Criteria

The best part of doing this process was that students were able to see the results and use the criteria immediately. They had already completed some analysis questions for assessment and I asked them to go back and highlight the criteria in the colours I had given them. If they were missing something they were encouraged to add it in, either at the bottom or using a post-it note.

Students referred to these criteria throughout the remainder of the year, and the improvement in quality was definitely noticeable. There is still some work to be done helping students distinguish between actual criteria and connecting words and phrases; however, this is absolutely something I will continue with next year.

When submitting any similar work for reassessment, students were permitted to add information in post-it note fashion as well. As long as the information was there, it didn't always have to be 'pretty'.

Keeping the connections - using Interactive Notebooks

There is so much to this topic that I cannot include here BUT one thing I tried this year was for students to keep all experimental design and analysis as part of their notebook. No more separate handouts or worksheets. It's all in their notebooks. While this has posed a bit of a logistical challenge for marking, it has been incredible for students to use their own work as a tool!

Flipgrid to supplement written assessments

Social media has really been a blessing when it comes to learning from amazing educators. I have gotten so many ideas from seeds that were planted through online professional learning networks. I love the idea of assessing students in low-stakes situations because what they know may be more authentically expressed. I also was not loving how most of my assessment had been in written form, so I decided to mix things up with an assessment in grade 8 Mechanical Systems.

I did the assessment in 3 phases:

1. Pre-test flipgrid: I posted two simple topics in Flipgrid - work and mechanical advantage - and I asked students to tell me what they thought they knew about the topics. They could talk, write and post, act it out, whatever.

2. Written assessment: similar to my other leveled assessments.

3. Follow Up Flipgrid: the next day I gave students the chance to tell me anything they had remembered or learned since they had written their assessment. If they wanted to tell me how they should have solved a problem differently or if they mixed up a formula, this was the place to do it. I also gave them the option just to write it on paper if they wanted to. Students were thrilled to get the opportunity to do this!

My reflection on this, though... it sure added a lot of time to the marking process. I am not yet sure how to fix this, so I will need to reflect on it more over the summer.


Looking forward... what do I want to add or change?

After a year of beta-testing there are definitely things I hope to change moving forward. While the bones of how I assess will continue - outcome based, leveled assessments - there are some improvements in how I want to report student understanding to the students themselves.

1. Don't report a "level" right away.

While I managed to move away from a numerical total for most assessments, I still ended up reporting "ADEQ, PROF, EXEM" which, in essence, is still telling students whether they achieved an A, B or C. I would like, instead, to have a more detailed checklist that I can fill out for each outcome. I could still record the level in my own gradebook, but I found students sometimes just looked at that instead of reading through the feedback given. Hopefully students will feel more compelled to demonstrate learning at higher levels if they just see a checklist for each step.

2. "Everything I know that I wasn't asked"

I always hated when I studied so hard and then wasn't asked about everything I knew. I want to give students a chance to show what they know by providing a blank sheet after they write the assessment, where they can write everything they know that they weren't asked.

3. Pre-progress reporting conferences

My goal coming up is to have a summative grade report ready for students 1-2 weeks prior to progress reports and give them the option to conference with me about it. If they agree with their grade, they can sign off; if they do not, they will have to come in and see me with evidence supporting their argument that their grade should be different.

4. More blueprinting

While I definitely improved my assessments, I know there is work to be done. Further alignment with the Program of Studies is essential, and I would especially like to focus on the verbs for each outcome, ensuring that each question and its Depth of Knowledge (DOK) match the verbs in the curriculum.

5. Using Google forms to document information

Documenting observation and conversation evidence is essential but time-consuming. I have been given the idea of using a Google Form to document these, and I would like to implement it in addition to an "acronym tracker" such as the one described by Sandra Herbst.
