The University of Reading’s first Massive Open Online Course (MOOC), “Begin Programming: Build your first mobile game” (#FLMobiGame), was offered via the FutureLearn platform. It was a seven-week course requiring a commitment of 3-4 hours a week and roughly covered the concepts taught in 8 first-year credits. I was a member of the team that planned and created this course and was also its lead facilitator.
This was a beginner-level course that used a simple Android game framework to present programming concepts to beginners (see a short demo). It served as an outreach and brand-extension project while also keeping the University in step with the changes happening in higher education. Because we introduced this as a beginner-level programming course, many people who had never had the chance to experience programming could get a taste of what programming is through our course, whatever their circumstances [V1, V2, V4].
MOOCs are open for registration, so large numbers of people from various backgrounds register on them. While some are experienced in higher education settings, others are less so. For example, the youngest participant to complete our MOOC was a 10-year-old who took part in the course with a parent (to register on FutureLearn one must be 13 or above). When this participant first posted on the FutureLearn platform, I contacted the team leader, Prof. Williams, as we were required to report under-age participants (once reported, the platform blocks them from accessing content). However, we wanted to retain this young participant, so we instead asked him to use the platform with a responsible adult. This allowed him to engage with our course while ensuring his safety and security in the online environment [V1, V2].
In this case study I focus on the challenge of “facilitating a meaningful learning experience for a massive, wide-ranging learner group”.
Seven weeks (3-4 hours/week) is a short period in which to teach programming to complete beginners. We therefore decided as a team to create the MOOC as a taster course, with a follow-up course (a Java SPOC, a Small Private Online Course) for those who wanted a full course on programming. This would allow a larger group of people to engage with programming while giving learners who wished to further their knowledge the opportunity to join the follow-on course [V1, V2]. I designed the learning activities, discussions and formative quizzes for each week. The videos were created by Dr. Lundqvist, the lead educator of the course, while I created all other content (assessments, quizzes and articles) [K1, K2, A1, A2].
I created multiple-choice tests of increasing difficulty because of the practicalities of assessing a massive number of course participants with the limited functionality offered by the platform [K4, A3]. Participants successfully completing the course were offered a ‘Certificate of Participation’ by FutureLearn. This was an important and conscious decision because, at the time the course was offered, FutureLearn lacked functionality to verify learners (similar to Coursera’s Signature Track scheme). Thus, unlike other FutureLearn partners (in particular the University of Warwick), I was not convinced that a University of Reading certificate was an appropriate endorsement for an ‘unverified’ learner. When the University was considering offering certificates to MOOC participants, I wrote an email to the project leader (Prof. Williams) and the Head of School detailing my experience with other MOOC platforms and the issues I saw with such an offering. I was surprised to see my email forwarded to senior management, with the issues I raised forming the basis for discussions on offering certificates to MOOC participants [K6, V4]. (The email is attached as Evidence 1.)
We anticipated that participants would need a lot of support in the first week, when the software had to be downloaded, installed and set up. As the lead facilitator of the MOOC, to support these large numbers I managed a small team of seven upper-year undergraduate student mentors [K3, A4]. We gave them brief training on supporting students on the platform, and we met every Wednesday to review the week’s activity and learn lessons from our experiences [K5]. I also actively supported peer learning by facilitating the formation of a learning community around the MOOC, which acted as a community of practice (Wenger, 1998).
I used my blog as a supporting tool to help learners who had difficulty understanding the learning materials, creating posts that complemented the content of the course [A2, A4, K1-K4]. Evidence 2 shows an appreciation I received from a MOOC participant for one of these posts.
However, it is impossible to support learners individually in a massive course where the participant-to-facilitator ratio runs into the thousands. Learners in MOOCs can also suffer from the downsides of the “network effect”, especially information overload, even though the network effect postulates that “the value of a product or service increases with the number of people using it” (Ferguson & Sharples, 2014). As the learners had a wide range of abilities, the best way to support the MOOC was to encourage peer learning. It takes time to create an environment conducive to peer learning, especially online, where building trust is important. Peer learning was actively supported by the course team. For example, simple measures such as mentors and educators acknowledging answers given by other participants (the “More Expert Other”, as suggested by Vygotsky (Smidt, 2009)) seemed to provide positive reinforcement. It also reassured learners of the “validity” of an answer. This reassurance is likely to have played an important role in the learner community’s acceptance of the “More Expert Other”. Because programming is a technical subject, many participants from other disciplines did not have the knowledge to critically evaluate a given answer. In such situations, it could be speculated that learners need reassurance from their educators; once an educator provides it, it “endorses” the “More Expert Other” as trustworthy.
Experienced programmers in the course personalised ‘their’ games using their expert knowledge and created resources explaining how others could add the same features. For example, see this blog post: http://www.robsbots.org.uk/yet-more-rubish-on-the-net/flmobigame/ [this link was no longer working in July 2018]. It showed how to add sounds to the game, a topic that was not covered in the course (a sketch of such an extension is given below). Nevertheless, many learners were able to extend their learning to reach that ‘extra’ with the support of their peers. This is an excellent example of Vygotsky’s Zone of Proximal Development, where the more expert other helps learners achieve more than they are individually capable of.
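To give a flavour of the kind of extension participants shared, the sketch below shows one way a sound effect might be added to an Android game of that era using the platform’s SoundPool API. This is a minimal illustration only, not the course framework’s actual code: the class name GameActivity and the resource res/raw/bounce.wav are hypothetical placeholders.

```java
// Minimal sketch (illustrative, not the course framework's code) of adding a
// sound effect to an Android game using SoundPool. "GameActivity" and
// res/raw/bounce.wav are hypothetical placeholders.
import android.app.Activity;
import android.media.AudioManager;
import android.media.SoundPool;
import android.os.Bundle;

public class GameActivity extends Activity {

    private SoundPool soundPool;
    private int bounceSoundId;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Allow up to two sound effects to play at the same time.
        soundPool = new SoundPool(2, AudioManager.STREAM_MUSIC, 0);
        // Load a short clip stored at res/raw/bounce.wav (placeholder name).
        bounceSoundId = soundPool.load(this, R.raw.bounce, 1);
    }

    /** Called from the game loop, e.g. when the ball hits the paddle. */
    public void playBounceSound() {
        // play(soundId, leftVolume, rightVolume, priority, loop, rate)
        soundPool.play(bounceSoundId, 1.0f, 1.0f, 1, 0, 1.0f);
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        soundPool.release(); // free native audio resources
    }
}
```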
The first run of the course attracted learners from over 107 territories, making it a very diverse classroom representing a global cohort. As the lead facilitator, I had to be supportive of participants with very little English, whose comments could sometimes be unintentionally offensive. I also had to support participants with various disabilities (for example, hard of hearing or colour blind) and participants with connectivity problems [V1, V2].
The School is in a strong position with respect to student recruitment. It has seen a 74% increase in firm acceptances of offers, and the School’s management believes this is largely due to the positive promotion associated with its online education offerings. For example, the following extract is from an email sent by the Head of School to staff on 23 June 2014:
“The headline figures are as follows.
This is a 74% increase in firm acceptances, which is truly remarkable, particularly given that application numbers have only risen about 5% (963 this year, 919 last year, including both Home and Overseas)….
All in all though, this achievement is a testament to the hard work that has gone into improving our visit day experience and the impact of our online education offerings.”
Our learning from this project was shared at the OER14 conference (Liyanagunawardena et al., 2014), and several papers are currently under review for journal publication [A5, V3].
Some participants in this open course developed their mobile game to a level where they could publish it as an app on Google Play (Figure 1), which is a remarkable achievement both for an open online course and for the participants themselves.
Figure 1. Participants’ games published on Google Play
The final-week test scores (as of data received on 6 January 2014) show that, of those who attempted the test, 68% scored over 90% and 96% scored over 50%; the average score was 88.2%. Learner comments in the last week of the course were very positive, and these, together with the test scores, suggest that learners acquired some knowledge and that the MOOC delivered meaningful learning to its massive audience [K5, V3].
I analysed the learner data (provided by the platform) to identify the topics that learners found hard to grasp. For example, the number of participants getting a question wrong can indicate the difficulty of the concept the question presents, or a difficulty with the question itself, such as its wording. This gives me the opportunity to reflect on the materials and improve the course for subsequent runs, for example by rewording a question or adding more worked examples (Figure 2; see also the sketch below the figure) [K3, K5, K6].
Figure 2. Using learner analytics to improve course
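To illustrate the kind of analysis behind Figure 2, the sketch below counts, for each question, the proportion of participants who answered it incorrectly. The file name quiz_responses.csv and its two columns (question_id, correct) are assumptions made for illustration; the actual data export provided by FutureLearn was in a different format, and the analysis described above was not necessarily carried out in code.

```java
// Minimal sketch (illustrative only) of estimating question difficulty from
// quiz response data. Assumes a CSV named quiz_responses.csv with a header row
// and columns question_id,correct ("true"/"false"); the real FutureLearn
// export differs in format.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class QuestionDifficulty {

    public static void main(String[] args) throws IOException {
        Map<String, int[]> counts = new LinkedHashMap<>(); // question -> {wrong, attempts}

        List<String> lines = Files.readAllLines(Paths.get("quiz_responses.csv"));
        for (String line : lines.subList(1, lines.size())) { // skip header row
            String[] fields = line.split(",");
            if (fields.length < 2) continue;
            int[] c = counts.computeIfAbsent(fields[0].trim(), k -> new int[2]);
            if (!Boolean.parseBoolean(fields[1].trim())) {
                c[0]++; // wrong answer
            }
            c[1]++;     // total attempts
        }

        // Questions with a high wrong-answer rate are candidates for rewording
        // or for additional worked examples in the next run of the course.
        for (Map.Entry<String, int[]> e : counts.entrySet()) {
            int wrong = e.getValue()[0];
            int total = e.getValue()[1];
            System.out.printf("%s: %.1f%% wrong (%d attempts)%n",
                    e.getKey(), 100.0 * wrong / total, total);
        }
    }
}
```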
Begin Programming was the University’s first MOOC, and as a co-creator and the lead facilitator of the course I learnt many important lessons. In a MOOC, educators cannot please everyone (Liyanagunawardena, Lundqvist & Williams, 2014). On reflection, I feel the course blurb could have described the course as “challenging but fun” and advised participants to have a “techie” friend on hand if they thought they might be unable to download and install the software. In fact, we have included these points in the course descriptions for subsequent runs.
Some consider numbers (enrolled, participating and/or completing) to show how successful a MOOC is. However, the various stakeholders in MOOCs may define success differently (Liyanagunawardena, Lundqvist, Parslow & Williams, 2014; Liyanagunawardena, Parslow & Williams, 2014). As the lead facilitator of the course, I have seen many non-programmers take to the waters of programming through our MOOC. It opens a different perspective for them: a glimpse of the world of programming. They cherish the game they developed and are proud of what they have achieved. For example:
It gives me great pleasure to know that I was part of the team that made this possible. At the same time, I am proud to have helped the University gain first-mover advantage in the world of MOOCs with FutureLearn. As an educator, I always want to instil the desire to learn more in my students. From the participants’ comments (see below), I know I achieved this in the MOOC.
In my view, we offered a very successful course. It gave me the opportunity to reach massive numbers of learners across the world and to use my expertise to help them learn programming and develop a desire to learn more.