Peer Marking in Maths Tutorials

Post date: Mar 17, 2017 4:55:40 PM

Last week I tested peer marking in 6 of my tutorials. I had read about the benefits of peer marking in various books, as well as hearing lecturers from other departments talk about trialling it. So I set out to see whether it could be implemented in my tutorials.

One week before running this, I warned the students of my intentions, telling them that in the next tutorial they would be marking each other's work. Upon receiving their homework that week, I photocopied each script. This allowed me to mark as normal while still having an unmarked copy for the exercise. I also had to decide who was going to mark which script. In the end, I settled on having each student mark a script similar to their own. My intention was that students would be critical of the script they marked while realising it resembled their own work, and in this way develop the skill of being critical of their own work.

While I made sure to choose a week with a fair few proofs (first-year mathematics students often find proving things the hardest), the homework was harder than in previous weeks. As a result, many scripts had unanswered questions, defeating part of the purpose of peer marking. But everyone answered at least one question, which was still enough to run the peer marking exercise.

In each tutorial, I gave the students 10 minutes to mark someone else's script without access to the solutions. A few students complained about how they were supposed to mark without the solutions, which allowed me to point out that with proof questions the solution is not necessarily unique. They should instead read through the answer and check whether it makes sense. Do they understand the general direction of the argument? Is every step there? Are any assumptions made that should not have been? Has every assumption given in the question been used? I saw some students checking calculations on a spare piece of paper, to which I pointed out that if they had to do that, then the script was missing details.

At the end of the exercise, I asked students how they found it. The two common responses were "This was hard" and "This answer looks right but it's different to what I wrote". To the latter, I repeated the comment that some questions could be approached in more than one way. For the former, I got them to elaborate on what they found hard. We discussed how they could improve their solutions so that they were easier to read (mathematically), more rigorous, and hence easier to mark. We also discussed how the hardest scripts to mark are those which seem right but leave us not quite sure: there might be a detail missing, or we cannot tell whether the next step really follows from the previous one. So I reiterated the point that they should include enough detail and check that every step is logical.

Overall, I think this exercise made them realise the importance of making their work understandable to a third party: the questions they were asking when marking a script should be the questions they ask when re-reading their own work (before they hand it in).

While this exercise was somewhat useful, I think it was the message behind it that ended up being most useful. I noticed that most students did not actually mark the scripts in front of them (i.e., write some feedback and a possible grade). They all still took part in the discussion, which carried the main message I wanted to give, but I think I will re-run this exercise in a future tutorial. This time, I will use questions they are more comfortable marking, and get them to actually mark the scripts.