Case Study 1

Begin Programming Massive Open Online Course

Introduction

The University of Reading’s first Massive Open Online Course (MOOC), “Begin Programming: Build your first mobile game” (#FLMobiGame), was offered via the FutureLearn platform. The seven-week course expected a commitment of 3-4 hours a week and covered roughly the concepts of 8 first-year credits. I was a member of the team that planned and created this course, and I was also its lead facilitator.

Situation

This was a beginner-level course that used a simple Android game framework to present programming concepts to beginners (see a short demo). It served as an outreach and brand-extension project, and it was also in line with the changes happening in higher education. Because we introduced this as a beginner-level programming course, many people who had not had the chance to experience programming could get a taste of what programming is through our course, whatever their circumstances [V1, V2, V4].

MOOCs are open for registration, so large numbers of people from varied backgrounds register on them. While some are experienced in higher-education settings, others are less so. For example, the youngest participant to complete our MOOC was a 10-year-old who took part in the course with his parent (to register on FutureLearn one must be 13 or over). When this participant first posted on the FutureLearn platform I contacted the team leader, Prof. Williams, as we were supposed to report under-age participants (once reported, the platform blocks their access to content). However, we wanted to retain this young participant, so we decided to take the alternative action of asking him to use the platform with a responsible adult. This allowed the young participant to engage in our course while ensuring his safety and security in the online environment [V1, V2].

In this case study I focus on the challenge of “facilitating a meaningful learning experience for a massive, wide-ranging learner group”.

Evaluation of potential options/ solutions

Seven weeks (3-4 hours/week) is a short period in which to teach programming to complete beginners. We therefore decided as a team to create the MOOC as a taster course, with a follow-up course (a Java SPOC, or Small Private Online Course) for those who wanted to study programming in full. This would allow a large group of people to engage with programming while giving learners who wished to further their knowledge the opportunity to join the follow-on course [V1, V2]. I designed the learning activities, discussions and formative quizzes for each week. The videos were created by Dr. Lundqvist, the lead educator of the course, while I created all other content (assessments, quizzes and articles) [K1, K2, A1, A2].

I created multiple-choice tests of increasing difficulty, given the practicalities of assessing a massive number of course participants with the limited functionality offered by the platform [K4, A3]. Participants who successfully completed the course were offered a ‘Certificate of Participation’ by FutureLearn. This was an important and conscious decision: at the time the course was offered, FutureLearn lacked the functionality to verify learners (similar to Coursera’s Signature Track scheme). Thus, unlike some other FutureLearn partners (in particular the University of Warwick), I was not convinced that a University of Reading certificate was an appropriate endorsement for an ‘unverified’ learner. When the University was considering offering certificates to MOOC participants, I wrote an email to the project leader (Prof. Williams) and the Head of School detailing my experience with other MOOC platforms and the issues I saw with such an offering. I was surprised to see my email forwarded to senior management, and the issues I raised formed the basis for discussions on offering certificates to MOOC participants [K6, V4]. (The email is attached as Evidence 1.)

We anticipated that participants would need a lot of support in the first week, when installation and setting up of software were required. As the lead facilitator of the MOOC, to support these large numbers I managed a small team of seven upper-year undergraduate student mentors [K3, A4]. We gave them brief training on supporting students on the platform, and every Wednesday we met to review the week’s activity and learn lessons from our experiences [K5]. I also actively supported peer-learning by facilitating the formation of a learning community around the MOOC. This learning community acted as a community of practice (Wenger, 1998).

Scaffolding Learning

I used my blog as a supporting tool to help learners who had difficulty understanding the learning materials, creating posts that complemented the content in the course [A2, A4, K1-K4]. Evidence 2 shows an appreciation I received from a MOOC participant for one of these posts.

However, it is impossible to support learners individually in a massive course where the participant-to-facilitator ratio is in the thousands. Learners in MOOCs can also suffer from the “Network Effect”, especially information overload, even though the network effect postulates that “the value of a product or service increases with the number of people using it” (Ferguson & Sharples, 2014). As there were learners with a wide range of abilities, the best way to support the MOOC was to encourage peer-learning. It takes time to create an environment conducive to peer-learning, especially online, where the building of trust is important. The course team actively supported peer-learning. For example, simple measures such as mentors and educators acknowledging answers given by other participants (the “More Expert Other”, as suggested by Vygotsky (Smidt, 2009)) seemed to provide positive reinforcement. It also reassured learners of the “validity” of the answer. This reassurance is likely to have played an important role in the learner community’s acceptance of the “More Expert Other”: because programming is a technical subject, many participants from other disciplines lacked the knowledge to critically evaluate a given answer. In such situations, it could be speculated, learners need reassurance from their educators; once an educator provides it, the “More Expert Other” is “endorsed” as trustworthy.

Experienced programmers in the course personalised ‘their’ games using their expert knowledge and created resources explaining how others could add the same features. For example, the blog post at http://www.robsbots.org.uk/yet-more-rubish-on-the-net/flmobigame/ [this link was no longer working in July 2018] showed how to add sounds to the game, a topic that was not discussed in the course. Many learners were able to extend their learning to reach that ‘extra’ with the support of their peers. This is an excellent example of Vygotsky’s Zone of Proximal Development, where the more expert other helps learners achieve more than they are individually capable of.

The first run of the course attracted learners from over 107 territories, making it a very diverse classroom representing a genuinely global cohort. As the lead facilitator, I had to be supportive of participants with very little English, whose comments could sometimes be unintentionally offensive. I also had to support participants with various disabilities (for example, those who were hard of hearing or colour blind) and those with connectivity issues [V1, V2].

Evaluation of results/impact

The School is in a strong position with respect to student recruitment. It has seen a 74% increase in firm acceptances of offers, and the School’s management believes this is largely due to the positive promotion associated with its online education offerings. For example, the following extract is from an email sent by the Head of School to staff on 23 June 2014.

“The headline figures are as follows.

  • For CS/IT we have 149 firm acceptances (as against 83 at this time last year) and
  • For Cyb/EE/Rob/AI we have 53 (as against 33).

This is a 74% increase in firm acceptances, which is truly remarkable, particularly given that application numbers have only risen about 5% (963 this year, 919 last year, including both Home and Overseas)….

All in all though, this achievement is a testament to the hard work that has gone into improving our visit day experience and the impact of our online education offerings.”

Our learning from this project was shared at the OER14 conference (Liyanagunawardena et al., 2014), and several papers are currently under review for journal publication [A5, V3].

Some of the participants in this open course developed their mobile games to a level where they could publish them as apps on Google Play (Figure 1), a remarkable achievement both for an open online course and for the participants themselves.

A screenshot of Brix n Blox Google Play games

Figure 1. Google Play games

The final-week test scores (as of data received on 6 January 2014) show that, of those who attempted the test, 68% scored over 90% and 96% scored over 50%; the average score was 88.2%. Learner comments in the last week of the course were very positive, and together with the test scores they suggest that the learners acquired real knowledge and that the MOOC delivered meaningful learning to its massive audience [K5, V3].

I have analysed the learner data (provided by the platform) to identify the topics that learners find hard to grasp. For example, analysing the number of participants who answer a question incorrectly can indicate the difficulty of the concept the question tests, or a problem with the question itself (for example, its wording). This gives me the opportunity to reflect on the materials and improve the course for subsequent runs, for example by rewording a question or adding more worked examples (Figure 2) [K3, K5, K6].

A spreadsheet showing how participants have answered each quiz question

Figure 2. Using learner analytics to improve course
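The per-question analysis described above can be sketched in a few lines of code. This is a minimal illustration only: the data format, question identifiers and values below are invented, since the actual export provided by the platform is not shown in this case study.

```python
# Sketch of per-question difficulty analysis (hypothetical data format;
# the platform's real export is not reproduced in this case study).

def question_difficulty(responses):
    """Return the fraction of incorrect answers per question.

    `responses` maps a question id to a list of booleans,
    where True means the participant answered correctly.
    """
    rates = {}
    for qid, answers in responses.items():
        if answers:  # skip unattempted questions to avoid dividing by zero
            rates[qid] = sum(1 for a in answers if not a) / len(answers)
    # Hardest questions first: candidates for rewording or extra worked examples
    return dict(sorted(rates.items(), key=lambda kv: kv[1], reverse=True))

# Illustrative (invented) data: three quiz questions from a final-week test
responses = {
    "W7Q1": [True, True, False, True],
    "W7Q2": [False, False, True, False],
    "W7Q3": [True, True, True, True],
}
print(question_difficulty(responses))
```

A high wrong-answer rate (here the invented question "W7Q2") flags a concept, or a question wording, that may need attention in the next run, mirroring the reflection process behind Figure 2.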

Conclusion

Begin Programming was the University’s first MOOC, and as a co-creator and the lead facilitator of the course I learnt many important lessons. In a MOOC, educators cannot please everyone (Liyanagunawardena, Lundqvist & Williams, 2014). On reflection, I feel we could have said in the course blurb that the course is “challenging but fun”, and advised participants to have a “techie” friend on hand in case they were unable to download and install the software. In fact, we have included these points in the course descriptions of subsequent runs.

Some consider numbers (enrolled, participating and/or completing) to show how successful a MOOC is. However, the various stakeholders in MOOCs may define success differently (Liyanagunawardena, Lundqvist, Parslow & Williams, 2014; Liyanagunawardena, Parslow & Williams, 2014). As the lead facilitator of the course, I saw many non-programmers take to the waters of programming with our MOOC. It opened a different perspective for them: a glimpse into the world of programming. They cherish the games they developed and are proud of what they have achieved. For example,

  • “Thanks very much for an interesting course... Not sure that I will do much more but at least I have a game on my phone that I can show to my grandchild and say ‘I did some of that!’.”

It gives me great pleasure to know that I was part of the team that made this possible. At the same time, I am proud to have helped the University gain first-mover advantage in the world of MOOCs with FutureLearn. As an educator, I always want to instil in my students the desire to learn more. From the participants’ comments (see below) I know I achieved this in the MOOC.

  • “Excellent introduction, enjoyed it a lot. Love to attend another course taking what we've learnt to the next level.”
  • “Thank you for an amazing course. Although I did not contribute often to the discussions, I enjoyed this course and learned more than I expected. I intend to continue with Android application development now that you have given me the start I needed.”
  • “Great beginner's course to java programming. I would very much like to learn more about developing complex algorithms that represent situations in everyday life, using these to develop much more complex apps.. Let's all learn together to help each other.”

In my view, we offered a very successful course. It gave me the opportunity to reach massive numbers of learners across the world, to use my expertise to help them learn programming, and to create in them a desire to learn more.

References

  • Ferguson, R. and Sharples, M. (2014). Innovative Pedagogy at Massive Scale: Teaching and Learning in MOOCs, EC-TEL 2014, Graz, Austria, September 19, 2014.
  • Liyanagunawardena, T. R., Lundqvist, K. O., Micallef, L. and Williams, S. A. (2014). Teaching programming to beginners in a massive open online course. In: OER14: Building Communities of Open Practice, 28-29 April 2014, Newcastle.
  • Liyanagunawardena, T. R., Lundqvist, K. O., and Williams, S. A. (2014). Who are with us: MOOC learners on a FutureLearn course, under review for British Journal of Educational Technology.
  • Liyanagunawardena, T. R., Lundqvist, K. O., Parslow, P. and Williams, S. A. (2014). MOOCs and Retention: Does it really matter?, EC-TEL 2014 workshop, Graz, Austria, September 19, 2014.
  • Liyanagunawardena, T. R., Parslow, P. and Williams, S. A. (2014). Exploring ‘Success’ in MOOCs: Participants’ Perspective, under review for Research in Learning Technology.
  • Smidt, S. (2009). Introducing Vygotsky: A guide for practitioners and students in early years education, Routledge, Oxon.
  • Wenger, E. (1998). Communities of Practice: Learning, Meaning, and Identity, Cambridge University Press, Cambridge.