Current Projects

A selection of what I am currently working on: further down the page are projects approaching some degree of 'done'.

1. Expert-guided, crowd-sourced learning content

For some years, I have been incorporating assessments into my courses that require students to act as creators of learning and assessment content, by having them write Multiple Choice Questions (MCQs) using the PeerWise tool.

This year in my Physics 101 course, we are trying to take things a step further by letting students develop a wider range of learning content for their course. Each week, a fraction of the cohort will be tasked with producing an original learning object (e.g. an MCQ, a worked example problem, a mediacast (screencast, podcast, pencast), slides plus narrative, etc.) based on the pre-reading content set for that week of the course. These will be submitted ahead of the start of the next week, graded, and the best ones used in class, online as practice resources and problems, in homework, and on the exams. Students whose work meets or exceeds a quality threshold will be invited to attach a Creative Commons BY-NC-SA license to their work, and the bundle of such resources will be made freely available as open educational content at the end of the course.

I got distracted one evening and made a short video about the idea using PowToon.

The project is funded as part of the first wave of funding through UBC's Flexible Learning Initiative: brief details on the project are available on the FL project site.

Towards the end of the project, I was interviewed about it - this 10-minute clip gives a good overview of what happened and what we learned from doing this.

2. Adaptive comparative judgement tool

So PeerWise provides a way for you to let students create one type of assessment content (i.e. MCQs), but when your course is done and you have more than 2000 of them, how do you figure out which are the best ones? One way would be to use machine learning techniques to look at the sentiment of the comments students attach to particular questions (still thinking about that...).
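As a rough illustration of that sentiment idea (nothing we have actually built), the sketch below assumes the comments on each question could be exported as plain text and scored with an off-the-shelf sentiment analyser; the data format and function name are hypothetical.

```python
# A minimal sketch, assuming question comments are available as plain text.
# Uses NLTK's VADER sentiment analyser; the data format is hypothetical.
from nltk.sentiment.vader import SentimentIntensityAnalyzer
# import nltk; nltk.download('vader_lexicon')  # one-off download of the VADER lexicon

def rank_questions_by_comment_sentiment(questions):
    """Rank questions by the mean sentiment of their attached comments.

    `questions` is assumed to be a dict mapping question_id -> list of
    comment strings exported from PeerWise.
    """
    analyser = SentimentIntensityAnalyzer()
    scores = {}
    for qid, comments in questions.items():
        if not comments:
            continue
        # VADER's 'compound' score runs from -1 (negative) to +1 (positive)
        compound = [analyser.polarity_scores(c)["compound"] for c in comments]
        scores[qid] = sum(compound) / len(compound)
    # Highest mean comment sentiment first
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```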

An alternative approach is to look at peer evaluation of a different type of question and answer. What if students submitted short-text answers to questions posed either by instructors or by their peers? From a whole pile of text-based answers, how do you determine which ones are the 'best'?

We have built a prototype online tool to help with this, based on the idea of adaptive comparative judgement (ACJ). Here's a one-page summary:

ACJ One pager
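Under the hood, comparative judgement turns many pairwise "which answer is better?" decisions into a ranking, typically by fitting a Bradley-Terry or Rasch-style model. The sketch below is a hypothetical illustration of that ranking step, not the prototype's actual code; it assumes each judging decision is recorded as a (winner, loser) pair of answer ids.

```python
# A minimal sketch of the ranking step behind comparative judgement, assuming
# each judgement is stored as a (winner_id, loser_id) pair. This is a plain
# Bradley-Terry fit via iterative (MM-style) updates, not the prototype's code.
from collections import defaultdict

def bradley_terry_rank(comparisons, iterations=100):
    """Turn pairwise 'which answer is better?' decisions into a ranking.

    comparisons: list of (winner, loser) answer ids.
    Returns answer ids ordered from strongest to weakest.
    """
    wins = defaultdict(float)        # wins per answer
    pair_counts = defaultdict(int)   # times each unordered pair was judged
    items = set()
    for winner, loser in comparisons:
        wins[winner] += 1
        pair_counts[frozenset((winner, loser))] += 1
        items.update((winner, loser))

    strength = {i: 1.0 for i in items}
    for _ in range(iterations):
        updated = {}
        for i in items:
            denom = sum(
                pair_counts[frozenset((i, j))] / (strength[i] + strength[j])
                for j in items
                if j != i and frozenset((i, j)) in pair_counts
            )
            # The +0.1 smoothing keeps strengths positive for answers with no wins yet
            updated[i] = (wins[i] + 0.1) / denom if denom > 0 else strength[i]
        # Rescale so strengths stay on a comparable scale between iterations
        total = sum(updated.values())
        strength = {i: s * len(items) / total for i, s in updated.items()}

    return sorted(items, key=strength.get, reverse=True)
```

For example, with judgements [("A", "B"), ("A", "C"), ("B", "C")] the function returns ["A", "B", "C"]; the adaptive part of ACJ then lies in choosing which pair to show a judge next, which this sketch does not cover.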

Projects approaching some degree of 'done-ness'

3. PeerWise-Community.org

As we built up some practical experience using PeerWise, and disseminated what we were doing through various talks, posters, etc., we would often get asked the same questions by others who wanted to get started with the system. These questions were things like "How should I grade students?", "Is anyone using this in <insert discipline>?" and so on.

We decided to set up an online community of practice site to serve a number of purposes:

  • To provide a forum and online space for those using the system to connect with others, ask questions and share wisdom.
  • To act as a central repository of articles, publications and resources associated with using the system in courses (or at least, to collect all of these into one space, even if we didn't actually host them).
  • To publish occasional blog posts to the community about topics related to PeerWise.

At the time of writing (Sept 2013), the site, http://www.PeerWise-Community.org, has been live for nearly a year, with just over 300 registered members across the world. Members can post to the site, but the content is open to all, and the site has registered nearly 5000 unique visitors since launch.