Assessment from a distance presents some unique challenges. Below are some general ideas that may be helpful as you think about assessment. Though there is a lot written here, there is far more that could be said.
These points apply to assessment in general, and as such are relevant to both formative and summative assessment.
Teaching from a distance is difficult enough to figure out, let alone assessment. With final reporting coming in the next few weeks, no doubt teachers are feeling anxious. But the children of today will not be telling their grandchildren how, way back in the 2020 pandemic, their math teacher gave them a particular grade. Far more important is giving our children meaningful connections (both personally and with mathematics) that they will carry with them on their journey as learners.
We will do the best we can, and we should take comfort in that, even if it is not all we would hope for under regular classroom conditions.
Standards-based assessment is a good idea all of the time, and it is ideally suited for assessment during these times.
Standards-based assessment can be summarized as looking for quality over quantity. A student's progress is not measured by marks accumulated (i.e., gathering points out of a total), but by the learner's proficiency, i.e., the level of learning they have shown evidence of with respect to the learning standards. Rather than being about how many questions a student answers correctly, it is about which questions they respond to, and how well they respond to them. Instead of turning in more work to get extra marks, students show improvement by providing evidence of new or deeper learning.
In the BC curriculum, learning standards include content and curricular competencies. Often teachers find it helpful to combine these when assessing. Alternatively, you can assess them separately even when looking at a single piece of evidence. Either way, it is important to identify what you are looking for in the student's evidence of learning, and ideally to do so at different proficiency levels. See the next section for more about proficiency.
Another advantage of standards-based assessment is that it allows for re-assessment upon evidence of new learning by targeting just a specific standard rather than a whole unit.
As mentioned, standards-based assessment is ideal for these times. With a marks-based approach, a teacher needs to collect and mark answers and steps shown, but it is difficult to know how authentic the evidence provided is, not to mention what that evidence means as an indicator of understanding and mathematical thinking. A standards-based assessment looks for that understanding and thinking, and with effective questions can elicit more authentic evidence. See the section below on asking effective questions.
Across much of the province, the proficiency scale that is most common currently is: Emerging → Developing → Proficient → Extending
But proficiency is about so much more than the labels used to describe different performance levels.
I like to use the graphic on the right as it shows how the performance levels are progressive and additive. In other words, each level positively describes what a student CAN do (unlike other rubrics, which use language about what a student cannot do). A student at the Proficient level can also do what a student at the Emerging or Developing level can.
The dashed lines indicate that there is overlap/transition between levels. The journey across this progression doesn't jump from one level to another, but progresses over time, with some ebb & flow.
As an example, consider the competency of using strategies to solve a problem, which has two facets: the use of strategies, and the level of problem being solved. One way we can describe a proficiency progression for this is shown below:
Similar descriptive progressions can be written for other competencies as well.
For Content proficiencies, there are at least two alternatives to consider, or perhaps a combination of them:
Assess the level of competencies demonstrated for that content – even when given the same problem, students may perform at different levels when they communicate their solution to the problem.
Assess the depth of understanding demonstrated. See "Questions which target specific proficiency levels" in the next section.
Ultimately a teacher must decide for themselves what is the essential learning they are intending for their students. It helps to think of what you are looking for at a Proficient level first, then try to describe what it can look like at the other levels.
To assess a performance level, we need to give students opportunities to demonstrate their level. Different proficiency levels can be observed in at least two ways:
Questions which are aimed at all students, but admit of a progression of responses
Questions which target specific proficiency levels
Questions which have a low floor and high ceiling are ideal for assessing proficiency levels. One needs to anticipate possible responses at each level, then look for that evidence in the student work. Sources of low floor and high ceiling tasks include:
Open questions: Marian Small is a great source for these, as are many of the examples on OpenMiddle. For example, what would responses at different levels look like for these open questions?
The answer is 12. What could the question be?
A pattern has the numbers 5 and 12 in it. What could the pattern be?
Write a word problem that can be solved using subtraction.
Numeracy tasks: Numeracy tasks involve a context in which one needs to identify a problem that can be modeled and then solved using mathematics. The types of problems solved, how they are solved, and how the solution is communicated can all reveal different levels of proficiency. Peter Liljedahl has good examples, as does the Grade 10 Numeracy Assessment.
Rich problems: Problems do not have to be complicated to be rich. Rich problems can be solved using different approaches, and can often be extended and/or generalized. Our links page has many recommendations.
Another way to elicit a variety of proficiency levels is to use curricular competency prompts. (Note: competencies would also be observable in the tasks mentioned above). Examples include:
Agree or disagree…
Explain your strategy. Can you do it a different way?
Choose and justify…
Create a problem such that…
What is a mistake that someone would make when…? How would you help this person's thinking?
How would your solution change if...
Most of the tasks on our grade-band/course pages integrate competency prompts, so they work very well for assessment.
This means asking questions at different conceptual levels. It is important that the levels be conceptually progressive:
Having a context does not necessarily indicate a higher conceptual level. It is the mathematics required to solve the contextual problem that does. One could ask a contextual question that is at an Emerging or Developing level. One could also ask a non-contextual question at a Proficient or Extending level.
Harder numbers do not necessarily indicate a higher conceptual level. For example, comparing 8/17 to 11/21 is not conceptually deeper than comparing 3/7 to 5/9; in both pairs, the comparison can be made by noticing that one fraction is less than one half and the other is greater.
Other aspects that may influence the depth of a task are the level of challenge (how problematic it is), novelty (as opposed to routine), and connections to other concepts.
The table below shows examples of content-focused conceptual progressions for various grades. For each there are other question types that could also be asked.
Authenticity of evidence is especially challenging when assessing from a distance. The authenticity can be improved by:
Asking more effective questions, like those described in the section above. If competencies are embedded in our questions, authentic evidence becomes more likely. Traditional right/wrong or specific-step questions can lead to scripted responses that may not come from the student, or may not reveal mathematical understanding and thinking.
Gathering evidence in multiple ways, which is described below.
Evidence of student learning can be gathered through artifacts/products, observations, and conversations. This triad is often referred to as triangulation. To get a more accurate picture of what a student knows and is able to do, evidence from one source can be verified (or not) by evidence from another source. Using all three sources is not necessary if two of them already agree. It also means that the sources of evidence do not need to be the same for all learners.
The graphic on the right was adapted by SurreySchools for assessment during this time. It indicates various tools that can be used for assessing from a distance. As "Responsive Instruction" these are formative assessment tools, but they can also serve for summative purposes. See the Tools section below for specific examples (coming soon).
Depending on the grade, a final summative mark may be a proficiency rating (K-7), a letter grade (8-9), or a percent (10-12). This final grade alone does not tell the story adequately. It is far better to provide a grid like the one on the right, with the specific standards described. One can see at a glance which specific areas are strong and which need further learning.
But as we still need to report a final mark, how do we take this information and translate it? One way is to look at it holistically. On the whole, I would report Proficient for the student on the right. For a letter grade, I would give a B. Others may give an A, depending on their view of what Proficient means and what their expectations were to demonstrate it.
Percents (10-12) could also be based on a holistic view using benchmarks (e.g., 50%, 60%, 67%, 75%, 80%, 86%, 90%, 95%, 99%). Alternatively, one could translate the proficiencies into scores. If one were to convert these on a 5-point scale (i.e., Emerging = 2; Extending = 5), a percentage could be calculated. This is a frustrating solution and does not properly reflect what standards-based assessment is about, but the real frustration lies in the need to report percentages.
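As a rough illustration of what such a conversion could look like (a hypothetical example, reading Developing as 3 and Proficient as 4 on that 5-point scale): a student assessed on five standards at Developing, Proficient, Proficient, Proficient, and Extending would have 3 + 4 + 4 + 4 + 5 = 20 points out of a possible 25, or 80%. The same evidence viewed holistically might simply be reported as Proficient.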
In this section we highlight tools that are available and how to use them, with examples. More will be added.
Because the Surrey School District uses Office 365, the first tool shown here will be the use of Forms to create assessment questions.
The video on the right highlights some of the features available through Forms. You may wish to expand the video to full screen (especially if you get a message that says it can't display), or access it here. Question types include:
Multiple choice (either single or multiple response)
Text (students write their response)
Rating
Ranking (move statements into the correct order)
File Upload (students can upload an Office document, image, or video)
Likert Scale (which can be adapted for other uses)
For additional support you may wish to check out a series of videos posted as part of SurreySchools #365in30seconds.
Examples
Below are some examples that show what the forms look like to students. Because of the File Upload feature, only SurreySchools teachers can access these forms. For others, a PDF version has been provided so you can see the sample questions.
These are samples and are not intended to provide a comprehensive assessment of each topic. Rather, in addition to showing some of the different features of Forms, they were designed with the ideas shared above about asking effective questions in mind.
Grade 4: Adding & Subtracting Decimals (pdf)
Grade 7: Adding & Subtracting Integers (pdf)
Foundations of Mathematics & Pre-Calculus 10: Linear Functions (pdf)
Desmos Activities engage students in a digital environment. You may wish to check out the Desmos section on our Teaching page first.
The Activity Builder in teacher.desmos.com is also a useful tool for assessment, i.e., gathering evidence of what students know and what they can do.
There are several different tools within Desmos activities for gathering evidence: multiple choice, sketch, input, ordered list, graph, and many more!
Try this student activity, Introduction to Assessment Using Desmos, which is a Guide with Examples. You can also access it with the instructions below. It is written for teachers, placing them in the role of students.
Here is the link if you wish to copy and edit the Guide. You will need a free Desmos account. You may wish to use some of the examples as templates to design your own assessment questions.
An important feature that is not highlighted in the video is the ability to give feedback. It is recommended that students log in so that they can return to the activity and respond to the feedback.