Audience: Learners taking the course. (10 responses)
Purpose: Results will act as a baseline later in evaluation process.
Position: Right at the beginning of the learning journey.
In sequential order, the first evaluation method would have been the pre-learning-journey survey. The results from this survey will act as a reference point later on in our evaluation process. On their own, the results are valuable for evaluating where your learners are at, but what is really interesting is comparing them with the results of our "rate your confidence" survey later on in the process.
So after analysing this data, we are able to put together a relatively accurate picture of who the learners are and how confident they currently feel about their knowledge of the registration process and their ability to register. Establishing this will allow us to compare it with the results of the next survey in our evaluation plan. The process essentially mimics a pre- and post-test, but in an informal manner that looks at overall perceived knowledge and confidence levels.
About 30% of learners said they did not feel confident in going through the registration process.
90% of this cohort is in the LTXD program. (Mmmmmm?)
Nearly half of the cohort says that they do not know anything about the registration process at NYU in general and at the ECT department specifically.
Audience: Learners taking the course
Purpose: Comparing these results with the results of the first survey will allow us to evaluate the effectiveness of instilling a sense of confidence in achieving the learning goals and objectives. These results will also tell us which learning objectives our design is failing to hit and in so doing guide us in redesigning content.
Position: Right at the end of the learning path, before learners start the registration process.
So after learners have completed this form, we will be able to reach insights like: "After completing the learning journey, approximately 90% of learners are either confident or very confident in their ability to go through the registration process. This (hopefully) shows an improvement of x% when compared to confidence levels at the start of the learning journey."
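As a minimal sketch of how that pre/post comparison could be computed, the snippet below tallies the share of "confident" or "very confident" responses in each survey and reports the change in percentage points. The response counts here are invented for illustration; real numbers would come from the two survey exports.

```python
# Hypothetical sketch: comparing pre- and post-survey confidence levels.
# The response counts below are made up; real data would come from the
# two survey exports.

pre_responses = {"very confident": 1, "confident": 3, "neutral": 3, "not confident": 3}
post_responses = {"very confident": 5, "confident": 4, "neutral": 1, "not confident": 0}

def confident_share(responses):
    """Share of learners who answered 'confident' or 'very confident'."""
    total = sum(responses.values())
    confident = responses.get("confident", 0) + responses.get("very confident", 0)
    return confident / total

pre = confident_share(pre_responses)
post = confident_share(post_responses)
improvement = (post - pre) * 100  # change in percentage points

print(f"Confidence rose from {pre:.0%} to {post:.0%} (+{improvement:.0f} points)")
```

The same calculation could be repeated per learning objective to see where confidence did or did not move.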
We will also be able to see which learning objectives learners feel they have mastered and which ones they are less sure about.
Being able to do this will not only give us an indication of which section of our learning journey needs work content-wise, it will also allow us to ask learners what they still struggle with, what else they need, and what could have helped them better master the objective.
(Though we didn't send the survey, these are possible insights/information we might have.)
"80% of students agreed that they could recall critical information and/or steps regarding the registration process."
or
"60% of learners were on the fence about being able to proficiently navigate and effectively utilise the course registration tools on Albert."
The latter type of insight will allow us to identify what we are doing wrong in the section of our learning journey that covers this outcome, i.e. our video tutorials. This will allow us to do user testing on this section specifically, and to potentially tweak or overhaul the content.
Audience: Learners taking the course
Purpose: Evaluating the learning experience's efficacy in achieving its goals.
Position: After the learner has completed the registration and enrolment process.
This survey will allow us to establish whether the learning experience was actually able to achieve its goal. It will also allow the learners to give anonymous and honest feedback on things like what worked, what did not work, what they struggled with, what the learning experience did well, what it did poorly, and, ultimately, whether or not they found it useful.
(Though we didn't send the survey, these are possible insights/information we might have.)
"75% of learners said they found the experience useful."
"A large number of learners complained about the Edpuzzle questions in the video tutorial being irritating, distracting and/or pointless."
"A significant share (63%) of learners didn't feel the Slack channel was helpful to their learning."
There will also be a bunch of really valuable qualitative insights generated by questions 2, 4, 7 and 8. The responses here can be plotted on an affinity map to find common themes, issues or patterns.
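Once the open-ended responses have been hand-coded with themes (the usual first step before affinity mapping), tallying those themes is straightforward. The sketch below is purely illustrative; the responses and theme tags are invented.

```python
# Hypothetical sketch: tallying themes after hand-coding open-ended
# survey responses. Responses and theme tags here are invented.

from collections import Counter

coded_responses = [
    {"text": "The Edpuzzle questions broke my focus", "themes": ["edpuzzle", "pacing"]},
    {"text": "Loved how clean the modules looked", "themes": ["visual design"]},
    {"text": "Slack felt like extra noise", "themes": ["slack"]},
    {"text": "Edpuzzle popups felt pointless", "themes": ["edpuzzle"]},
]

# Count how often each theme appears across all responses.
theme_counts = Counter(theme for r in coded_responses for theme in r["themes"])

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```

The most frequent themes would then become the clusters on the affinity map.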
Audience: Prospective, new or current ECT students
Our participants: 2 (Madhu and Yuning). Ideally we would have liked to have 5.
Purpose: To evaluate what is working and what is not working within our lessons, quizzes and videos.
Position: To be performed before the learning experience is rolled out, when the e-learning resource is in its beta stage, and even earlier.
Method: Think aloud protocol (TAP) with an accompanying interview.
Analyses: Affinity map for both TAP and interview.
Think-aloud protocol and interview
Using the rich qualitative data gathered during our TAP and interview with current ECT students Madhu and Yuning, we plotted an affinity map (as seen above) and reached the following insights, which will allow us to improve the content and flow of the e-learning resource:
The headers, fonts and font sizes need to be consistent across the course.
Text is hard to read in the quiz feedback sections.
Users enjoyed the simplicity and felt it was very clean in some sections.
The video is too long. We need to cut the recap section and bring it down to 3 mins.
The information in the first part of the video is already in the previous modules so it may be redundant.
All the different search methods need to be covered: class number, course code, filters, and course description.
Some quiz questions need to be changed. For example, instead of testing recall of the course number itself, we should ask students to identify where to find it.
Quizzes can be a little bit longer.
Edpuzzle is unnecessary in the videos.
We would rather have a post-video quiz or a post-module quiz.