Core area 1: Operational issues

A) An understanding of the constraints and benefits of different technologies

In my CMALT portfolio, Core Area One looked at analysing two platforms for offering a Small Private Online Course (SPOC). Here I will discuss investigating platforms for video assessments.

Investigating platforms for video assessments

Problem

Many of our students become surveyors who need to make presentations to clients. At present, we do not give our students the opportunity to improve their presentation skills through practice, and we do not assess them in this important skill other than by assessing their presentation slides. Students following the Apprenticeship route have to work towards professional membership of the Royal Institution of Chartered Surveyors (RICS) at the end of their studies. For this, they have to make a submission for the Assessment of Professional Competence (APC) and deliver a presentation. Therefore, to support our students in their APC, UCEM wanted to investigate the possibility of assessing presentations to help our students improve their presentation skills.

Current Limitations

UCEM currently uses a Moodle platform integrated with Turnitin for student assignments, and assignments are marked in Turnitin. However, there are limitations to what Turnitin submissions can support for the above. We are unable to accept video or other file types such as Excel (which is required for valuation and quantity surveying calculations) in Turnitin submissions, as Turnitin is only suitable for text-based submissions.

For example, the module Workbased Research Project (PRJ6WRS) has a 30% weighting on its first assessment, which is a presentation. For this, we ask students to submit a 10-slide presentation with notes (2,500 words). Ideally, this would be a delivered presentation, but due to the limitations of the current software we make do with a PDF of the slides and the presentation notes.

Finding a Solution

In order to find a solution to the problem at hand, UCEM hired a new Learning Technology Project Manager (Marieke Guy), who was put in charge of the project to identify a suitable technology solution. The project was initiated by the Assessment Working Group, an institution-wide working group created to rethink the UCEM assessment process. Marieke developed the testing plan, and I was assigned (part-time) to her team to help with the research and testing when it transpired that the project would not otherwise progress as fast as the institution required.

Below I present how two technologies were tested at UCEM and how they were implemented. Over the course of 3-5 weeks (part-time) in the Spring/Summer of 2021, Marieke and I tested two video assessment platforms: Bongo and GoReact.

The link above provides access to a confidential evaluation report. The document is password protected. Password is provided to the ALT Assessors.

Ease of use for end-users (student/instructor)

In GoReact, the student has to go into the assignment and then click the "View Instructions" tab to view the instructions, despite the bolded tab giving the impression that they are already on the "INSTRUCTIONS" tab (please see the images below). Even then, the instructions are not clearly visible: only a small space is given for them, and you have to use both horizontal and vertical scroll bars to read them.

GoReact Instructions tab

GoReact View Instructions tab

GoReact took over 30 minutes (start 08:17, finish 08:49) to process a 457MB video file, which is not a great experience for a student submitting an assignment. On the other hand, Bongo was quicker to upload and process the submission.

The Bongo recording button only gets activated if the student makes a noise. This was not clear to us when testing, and it is not how students would expect the software to work, so we would have to provide guidance explaining how it is designed.

Both platforms made submission straightforward. Bongo overwrites the previous submission if a student submits again to the same assignment while it is still accepting submissions, so there is always only one submission per student. On the GoReact platform, by contrast, a student can upload multiple video files and they are all stored; either the student or the instructor can delete submissions. However, if a student submits multiple video files, whether by accident or on purpose, they are all presented to the instructor for marking. When questioned, GoReact's reply was that "the instructor can determine by themselves or engaging with the student which assignment the instructor will grade/rate..". Though this could be a useful feature for seeing how a student develops their assignment, we found it too cumbersome for both students and tutors to have to judge which version was the final one. It would have been possible to say that the last submission (based on the timestamp) would be marked, but this would put more effort on the instructor to identify it, and a student could claim to have submitted the correct version yet be graded on a different one, since the correct submission sits alongside the incorrect ones in the system.

There is another complication with GoReact when there is more than one submission by the same student, because this interferes with how marks are pulled into the grade book.

In our testing, when a student had marks of 32/45 and 10/45 for two submissions, the lower mark was pulled into the grade book, whereas when another student had marks of 8/45, 6/45 and 18/45, the highest mark (18/45) was pulled through. If implemented, this inconsistency could complicate assessment.

GoReact assignment submission with multiple files

With GoReact it was easy to add supporting files such as PDFs; with Bongo this was not an option. Where students were required to submit slides or preparatory documents alongside a video presentation, using Bongo would have meant keeping a separate submission link open on the VLE, making it two submissions. However, as a relatively new product, Bongo was more receptive to our requests for new features.

GoReact provides better video editing capability within the platform, while Bongo provides only basic facilities.

Storage and Reuse of video

This is easily achieved in GoReact. In Bongo, all videos have to be uploaded to the platform; for example, you could not ask students to provide a health and safety commentary for drone footage available on YouTube. Though this can be seen as a drawback of the Bongo platform, the compliance team saw it as an advantage that helps guarantee the institution will not violate copyright law. GoReact does not allow you to connect its internal Library to another video/media repository, whereas Bongo was in the process of enabling use of an institutional media library.

Feedback

In Bongo, instructors can specify key terms to measure how often content-related terms such as brand names, industry terminology and vocabulary items are said in a recording. The GoReact platform does not provide any automatic analysis.

Bongo allows anonymous marking and feedback while in GoReact this is not possible.

Group Projects

In Bongo, you are able to set up milestones and meetings. This allows the instructor to view student participation and contribution, which is always a problematic area in group projects when students complain that their colleagues did not contribute sufficiently. However, at present, you cannot bring VLE groups into Bongo. In GoReact you can set up meetings and add presenters; however, timings need to be agreed outside GoReact. This means students would have to share their email addresses, phone numbers and/or messaging IDs to make the group project work. GoReact grade passback will not work for multiple presenters.

Compatibility

Bongo allows students to record videos on their phones, whereas GoReact does not work on mobile devices. With GoReact, if a student does not have a webcam on their laptop or computer, there is no way of producing the video.

Analysis and Reflection

Both Bongo and GoReact are well-known video assessment platforms, each with its own features, benefits and constraints. While one platform was cheaper to adopt in financial terms, cost was not the main consideration in the testing process. The focus was the end-user experience: how our students and tutors would find the new software. It was also important for us as an institution to make sure our students were at ease using it. Having not used a video assessment system before, students were likely to hesitate and resist the introduction of such a system. Therefore we wanted the best possible experience for our students, and for their assessment to work for them, for the institution and for the industry.

From the testing done in a sandbox area, I have highlighted the main benefits and constraints that were identified. The full testing document is only made available (above) to the ALT Assessors due to its sensitive nature.

Benefits of Bongo:

  • The interface is user-friendly and intuitive. Assignment submission instructions are clearly visible at the point where the student is making the submission.

  • Video submission is not too time-consuming. The video upload and processing is faster than the other platform tested.

  • The platform works well with limited bandwidth (tested by a colleague in an area with patchy and low-speed connection)

  • Works in most countries, including China. With over a quarter of our students being international, this is an important criterion for an assessment submission system to meet. (e.g. some YouTube videos do not work in China, and Zoom does not work in certain Middle Eastern countries)

  • Assignment submission - Bongo has a clearly marked submission button that allows the student to make the submission

  • One submission: Bongo allows only one submission to go through at the closing date of an assignment. A student can submit multiple times while the assignment is accepting submissions, but each new submission overwrites the existing one.

  • Allows automatic analysis: the instructor can define key terms, and automatic analysis of the video picks up these terms, facilitating the marking process

  • Bongo can be seamlessly integrated with the existing VLE using LTI

  • Bongo marks are pulled through to the VLE

  • Bongo allows anonymous marking

  • Bongo provides excellent support for group projects allowing the instructor to view student participation in the process as well as in the final product (submission)

  • Bongo works across a range of devices and platforms

  • Short video clips can be combined within the platform with basic edit functions

Drawbacks of Bongo:

  • Bongo does not allow using YouTube videos or videos hosted elsewhere. This is a drawback because, for example, drone footage of a construction site on YouTube could be used to ask our students to provide a health and safety commentary, a scenario we envisaged as an authentic assessment of the skills and knowledge needed in the industry. However, the copyright officer saw this restriction as a benefit, being concerned about possible use of copyright materials without appropriate permission. It is also a benefit in another sense: if a video hosted elsewhere were used for an assignment and then taken down by its owners during the submission period, this could cause unnecessary operational issues for the institution.

  • Bongo does not allow the submission of additional files. This is a drawback compared to the other platform, but it can be overcome by giving students access to a file upload in the VLE. As the platform is still developing, we were also able to request features ("wish list").

  • The recording button only gets activated when the user makes a noise. This is not necessarily what a student will expect from the software. We decided to provide clear guidance so that students are aware of what to expect and to pre-empt complaints.

  • Video editing functionality is not as sophisticated as it could be. This could also be construed as a benefit: otherwise, a student might spend a needlessly long time editing a submission, when the video assessment is meant to show their knowledge and skills in the subject matter, not in video editing.

Overall, in my view, Bongo provided the better user experience. It was important to remember that the student, the main user of this system, will be in a pressurised environment when taking their exams/assignments. In such a scenario, being able to view instructions clearly is of paramount importance. On the other platform (GoReact) there was no clear indication of what the student had submitted, or indeed whether it had been submitted at all. In an exam situation, this is not a position a student would want to be in.

Furthermore, having multiple submissions caused confusion for instructors and ultimately for the VLE grade book, as marks were not correctly pulled through from the video assessment platform. The main task of a video assessment platform should be to allow assignments to be submitted, marked and the grades displayed. When features compromise any of these main functions so that they do not work as the institution needs, it is difficult to justify the use of such a system.

Therefore, even with some drawbacks, we decided to go with the Bongo platform. The system is now being implemented with a view to making it available to students in the Autumn 2022 semester.

Looking back at the experience, Marieke and I created a detailed document of all the testing that was carried out (made available to ALT Assessors above). This was really useful to go back to after several months of working on other projects, as buying software involves institutional negotiations and does not necessarily follow straight after testing. Having detailed documentation also helped when people who were not involved in the initial testing were brought into the team (new members of staff, for example). This saved much time and unnecessary stress, because we sometimes rely on memory and, as humans, we are not always able to recall what is required instantly. Furthermore, the latest Moodle update on the sandbox area accidentally wiped out the testing area we used for Bongo and GoReact, so I have written this section using the screenshots available in the document, as I was not able to get new screen grabs from the testing area. This experience showed the importance of maintaining good documentation whatever the task, and that is my main take-home message. Another lesson is the need to ask for a specific backup of test courses on the sandbox area when a project is completed.

The project is now moving forward and we hope to launch Bongo as an assessment platform for selected Autumn semester modules.

1b) Technical knowledge and ability in the use of Learning Technology

As a Learning Technologist, it is important to be flexible enough to unlearn and relearn skills as new and useful technologies emerge. Being familiar with a variety of technologies helps in suggesting suitable solutions to the problems at hand.

In my CMALT 2016 portfolio, I broke down my experience with various learning technology software into the following sections:

  • Programming Languages and Mobile Technology

  • Instructional Design and sharing Software

  • Virtual Learning Environments

  • eBooks

  • Assessment

  • Learning Technology

I do not believe it is necessary to repeat the competencies I demonstrated in the previous portfolio. Instead, I will demonstrate my knowledge of software I have started using more recently: accessibility testing tools, and data analysis and scripting.

Accessibility Testing Tools

I am championing accessibility at the University College of Estate Management and use many tools to test and advise on accessibility. It is also important to identify tools that non-experts find easy to use, so that they can check their own documents for accessibility.

Why accessibility testing?

First and foremost, accessibility testing helps us identify issues with our content and make it more accessible to people with disabilities, whether visible or invisible. This helps not only our disabled students but all students, because creating accessible content helps everyone. Secondly, it helps us comply with the new accessibility regulations.

I can give you many examples but I think these quotes from my colleagues who provided feedback after accessibility training will show why accessibility testing tools are important:

"I feel empowered to adopt the various tools ... to ensure my presentations are accessible to all. I found the foreground/background colour checker extremely useful whilst being simple to use, a powerful tool. It is very easy when preparing presentations that if they are accessible to you then they will be accessible to everyone..."

"... colour contrast can have such a dramatic effect on users. I am often looking to create new content and resources for students so it will be important to use the WebAIM Contrast Checker for this. Just because something is very colourful, does not necessarily mean it is better! .. It was interesting to discover there are many tools that already exist on almost all laptops/devices to aid accessibility. I previously thought many of these would have to be specified add-ons and it was helpful to see them in action. This will help me to create content that is more compatible with these tools..."

".. having learnt the new functions available I will be looking to use accessible functions in the future and knowing how to do this should help students need any additional support. In my role as an AOO [Apprentice Outcomes Officer] it will help me to support students more in their journey with us.."

Accessibility Insights

I use Accessibility Insights as an app (Accessibility Insights for Windows) as well as an extension for the Chrome browser. This allows easy testing of colour contrast.

The Accessibility Insights Chrome extension allows you to test websites with automatic checks followed by manual checks, providing a thorough investigation of accessibility. The tool lets you export your results as a report that can be viewed in a web browser. The software is completely free and is the best I have found that provides all these facilities at no cost.

Accessibility insights for Windows

Accessibility insights for Windows

Accessibility insights for Web Assessment Report

Accessibility insights for Web Assessment Report

axe Developer Tools

This is the tool that the Government Digital Service uses to test accessibility compliance. However, the free Chrome extension has very limited features: you cannot save or export a report without the paid-for service. A two-week trial period allows you to explore the full feature set. Even the paid-for version only allows exporting results as either a JSON file or a CSV (issues only), which is not ideal.

axe DevTools

axe DevTools

axe Report of a website's accessibility

axe Report of a website's accessibility

WAVE Tool

The WAVE accessibility tool is available as a Chrome extension or as a web tool. Again, this software allows websites to be tested for accessibility. It provides a quick view of the issues, tags the places where they occur, and provides helpful information on how to correct them.

WAVE Web tool

WAVE Web tool

WAVE Chrome Extension

WAVE assessment report

Headings Map

Headings Map is another Chrome extension that allows you to check the heading levels of a web page to quickly view whether the heading structure is correctly in place.

Headings Map

Headings Map

WebAIM Contrast Checker

The WebAIM contrast checker is really useful for testing colour contrast anywhere, anytime. Until recently, you needed either the RGB code or the hexadecimal value of a colour to use the checker, but they have now introduced a colour picker which is really easy to use. I have updated my YouTube video to reflect this change.

I advocate using this tool because it is web-based (available anytime anywhere to anyone with internet access) and also shows how to interpret the results.

Accessibility: Check the colour contrast

A video I have created to demonstrate the use of WebAIM Colour Contrast Checker tool.

Microsoft Accessibility Checker

Microsoft products contain an inbuilt accessibility checker which many people do not know about. I have been raising awareness at UCEM so that colleagues will hopefully start using it more to check their work. However, no automatic accessibility checker is perfect, and you need to bear that in mind when using these tools.

For example, there are shades of green and red whose colour contrast ratio is 4.5:1, but for people with colour vision deficiency this colour combination will still not work. Similarly, if you produce a document without any headings, Check Accessibility will not complain, but heading structure is a basic need for vision-impaired users who rely on screen reader software.
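For context, the 4.5:1 figure comes from the WCAG definition of contrast ratio, which compares the relative luminance of the lighter and darker colours (relative luminance runs from 0 for black to 1 for white):

\[ \text{contrast ratio} = \frac{L_{\text{lighter}} + 0.05}{L_{\text{darker}} + 0.05} \]

WCAG 2.1 level AA requires a ratio of at least 4.5:1 for normal-sized text (and 3:1 for large text), which is why checkers such as the WebAIM tool report results against these thresholds.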

NVDA Screen Reader

NVDA is free screen reader software you can use to test whether your creations are accessible to screen reader users. However, not being a competent screen reader user myself, it is very difficult to use the software in the way someone reliant on it would. It is ideal if you can find screen reader users to test your work, but failing that, this is something you can do yourself. I am sharing a video I have created to demonstrate how a screen reader works.

Adobe Acrobat Pro DC

Making PDF documents accessible is a very difficult task. You can test PDF accessibility using Adobe Acrobat Pro DC's accessibility testing tool.

I have created a series of blog posts to demonstrate the PDF accessibility checking process:

Automating repetitive tasks

Why learner data analytics?

At UCEM we do a lot of learning data analysis to find out how our learners use resources and what they think of them. Understanding our learners this way is really important to us because, as a distance education institution, we do not "see" students face-to-face or get the feedback and cues that face-to-face teaching gives you.

By analysing how many students have accessed assessment briefs, we can identify the number of students who have not engaged with the assessment even enough to look at it. If we find that a certain resource receives more clicks than we would expect, we can investigate whether this is because the content is difficult to understand. We can also see the types of resources that are popular with students. All of this allows us to understand student behaviour, and together with student panel discussions, these insights help us improve VLE content and our learning materials.

Each week there is an anonymous survey where learners can provide their feedback about that week's resources. Because it is anonymous, students can give frank feedback without fearing that they will be "found out" for what they have said. The survey responses are analysed during the run of a course as well as after it finishes, first to get a feel for how the course is doing (in delivery) and then for how it has performed (post-delivery). We also analyse data from Moodle (statistics and reports) to understand student engagement and patterns of engagement.

When a staff member left the institution, and while a new staff member was being recruited, I was asked to help with the task of analysing the weekly anonymous survey responses. Previously, we had to go into each survey each week to extract the data, because there was no way of extracting a data dump for all surveys in one go. This had therefore been a manual task of extracting data from the VLE (various places), drawing graphs, summarising findings and creating a report for each module.

I wanted to see whether there was a way of automating this repetitive and time-consuming task of extracting data, creating weekly sentiment graphs and presenting the information.

UCEM now has access to IntelliBoard, which allows institutions to extract VLE data in a variety of formats. However, I do not have access myself due to the limited user licences available to the institution, so I asked a colleague who does have access to extract the data dump for the weekly surveys, and I received a data set with thousands of rows. This was the first stage: accessing the data. The data provided to me was anonymous.

In the next stage, I had to negotiate access to software I could script with.

Some years ago, I successfully completed five courses in the Data Science specialization offered by Johns Hopkins University on Coursera. The courses I completed are:

  • R Programming

  • Data Scientists Toolbox

  • Getting and Cleaning Data

  • Exploratory Data Analysis

  • Reproducible Research

So, I requested access to R and R Studio software.

However, as these courses were completed many years ago and I had not used R since, I needed to refresh my memory, which was possible via web tutorials. I also followed the course Essentials of Data Literacy by Davidson on edX.

Coursera Course Completion screen

Coursera Completed Courses

Unfortunately, acquiring the software to write the script was a very long-winded process due to procurement challenges.

So I started to create the structure of the program to summarise the data (from the data dump) with pen and paper, designing the full program logic on paper while waiting for the software to be approved.

When software procurement took far longer than expected, I started scripting the program in R on R Studio using my home computer (as we were working from home, I could program there). Although the program was ready, I was not able to test the script until I got access to R on my work computer.

After I got access to R and R Studio, I used a few rows of data from the data dump to test the program logic. Then, once the program was working for a selected module, I created the loops so that it could be extended to work over all the available data.

In the end, I was able to write a script that takes the data from the IntelliBoard data dump and creates a Microsoft Excel file for each module containing the summary data for each week.

R Studio Screen shot with script in view

Program script opened in R Studio integrated development environment
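The screenshot above shows the script itself. As a minimal, illustrative sketch only (the column and file names here are hypothetical rather than the actual IntelliBoard field names), the core approach was along these lines:

    # Sketch: summarise the weekly survey responses per module and write one
    # Excel workbook per module (column names are hypothetical)
    library(dplyr)
    library(readr)
    library(openxlsx)

    survey <- read_csv("intelliboard_dump.csv")         # anonymised data dump

    weekly_summary <- survey %>%
      group_by(module, week, response) %>%               # response: happy / meh / sad
      summarise(count = n(), .groups = "drop")

    for (mod in unique(weekly_summary$module)) {
      mod_data <- filter(weekly_summary, module == mod)
      write.xlsx(mod_data, file = paste0(mod, "_summary.xlsx"))
    }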

VBA Integrated development environment with macro script

VBA Integrated development environment with macro script

Next, I used the data from the generated Excel files to create graphs. For this, I used Excel macros: I drew a graph manually, recorded a macro, and then edited the required lines in the recorded macro so that it would work for any file in the required format. For example, when creating a graph you need to select the data range; for the recorded macro to work for any file, the data range selection has to change depending on how many rows of data are available. These were the kinds of changes required in the macro.
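A minimal sketch of the kind of edit involved (the sheet layout and names here are placeholders, not the production macro): instead of the fixed range recorded by the macro recorder, the chart source is set from the last populated row.

    Sub DrawSentimentGraph()
        Dim ws As Worksheet
        Dim lastRow As Long
        Dim cht As ChartObject

        Set ws = ActiveWorkbook.Worksheets(1)

        ' Find the last populated row so the chart covers however many weeks of data exist
        lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row

        ' Create the chart and point it at the dynamic range (week plus happy/meh/sad columns)
        Set cht = ws.ChartObjects.Add(Left:=300, Top:=10, Width:=450, Height:=250)
        cht.Chart.ChartType = xlColumnClustered
        cht.Chart.SetSourceData Source:=ws.Range("A1:D" & lastRow)
    End Sub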

As the final part of the program, I wrote a macro that loops through all the files in a given folder and calls another macro to draw the graphs. With this, I was able to fully automate the creation of the weekly sentiment graphs, which had previously been done manually.
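A hedged sketch of that outer loop (the folder path and macro name are placeholders):

    Sub ProcessAllModuleFiles()
        Dim folderPath As String
        Dim fileName As String
        Dim wb As Workbook

        folderPath = "C:\SurveyData\"               ' placeholder folder holding the per-module files
        fileName = Dir(folderPath & "*.xlsx")

        Do While fileName <> ""
            Set wb = Workbooks.Open(folderPath & fileName)
            Application.Run "DrawSentimentGraph"    ' call the graph-drawing macro for this workbook
            wb.Close SaveChanges:=True
            fileName = Dir                          ' move on to the next file in the folder
        Loop
    End Sub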

Also, note that I have used red and green in the graphs but they are used with different lines and fill patterns to support accessibility.

These graphs have been used in the institution for a long time, with red, green and grey used for each response. I therefore did not want to change the colours and make users feel uncomfortable. Instead, I used other design elements so that colour is not the only means of differentiating information: in the bar graph, green is a solid fill, red has a vertical patterned fill and grey has a diagonal fill pattern.

Final output

Final product of the scripts

Microsoft PowerBI

Recently I was given access to Microsoft Power BI and wanted to see whether this software could report the same data better. I followed several resources to gain an understanding of Power BI:

I found learning and creating reports with Power BI much easier than working with VBA. So I created the same set of graphs for reporting in Power BI and am now exploring how best this tool can be used for learner data analysis in an accessible way (at the moment, categories are identified only by colour, and this seems to be the only differentiation Power BI offers).

Power BI Report showing graphs

Using Power BI, I was able to further improve the report to provide a way of comparing how students on one module feel with all other modules in the semester, or with modules at the same level (say Level 4, 5 or 6). Because Power BI makes it easy to create visuals with very little programming know-how, it is much easier and more user-friendly than using R for analysing data. However, in some instances, allowing the report to dynamically show different data when categories are selected does require some programming. For example, I wanted the report to show the average percentage of students who responded "happy" for a given level when a module is selected.

To achieve this, I had to create "calculated measures" in Power BI for each level and then write a logical statement to select which value is shown in the Power BI "Card" visual; I have shown this in the image. For this, I had to learn the DAX (Data Analysis Expressions) language.
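As a rough illustration only (the table and column names here are hypothetical, not those in the actual report), a per-level calculated measure and the logical statement feeding the Card visual might look like this in DAX:

    Happy % Level 4 =
    CALCULATE (
        AVERAGE ( Survey[HappyPercent] ),
        Modules[Level] = "Level 4"
    )

    // Measures for Levels 5 and 6 are defined in the same way
    Happy % Card =
    SWITCH (
        SELECTEDVALUE ( Modules[Level] ),
        "Level 4", [Happy % Level 4],
        "Level 5", [Happy % Level 5],
        "Level 6", [Happy % Level 6],
        BLANK ()
    )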

PowerBI report comparing modules

Power BI is a very powerful tool, especially given that we can provide the facility to slice and dice the data. Here I have provided some feedback I have received about the reports I have created.

However, Power BI, though a Microsoft product, lacks accessibility features: the graph colours cannot be changed to include patterns, so it is not great for people with colour vision deficiency.

evidence of feedback on reports

Analysis and reflection

I have presented here only a few of the software tools I have used recently. I felt it was necessary to show the wide range of tools I use, and to demonstrate this I took the example of accessibility testing. To show my technical ability at Senior CMALT level, I felt it was better to show an example of recent programming, hence the second example of automating a repetitive task.

In my reflection I must include a paragraph about the Jisc Accessibility Community, as I have used evidence of accessibility-related work here. I learn so much from the monthly webinars organised by this community and I feel I have grown considerably with their support. There is also an email group (the Digital Accessibility Regulations for Education list) and a Jisc Accessibility Community MS Teams area with over 1,100 members. In these member areas, people ask questions and support each other; you can see here an example of how I have contributed to the community. I think this is so important, as all institutions are trying to work out what we need to do to comply with the new accessibility regulations. When you are stuck and need some support, this community is a lifeline. I feel strongly about contributing to the community as well as taking from it, because unless we both give and take it is not a sustainable community.

Contribution to Accessibility Community

I find that being aware of, and able to use, a multitude of tools helps me support tutors who ask for advice on finding a technology tool that suits their needs. However, it is also important to be able to identify the drawbacks and limitations of such tools. For example, I promote the use of the Microsoft Accessibility Checker, but at the same time I tell everyone how it can miss certain issues, and that they need an awareness of accessibility to make the correct judgement.

When I was tasked with the analysis and creation of graphs for modules, it would have been easier to create the graphs manually, as that is how it had always been done in the institution. However, I strongly felt that such repetitive tasks should not take a person's time and effort if they could be automated. Despite the barriers to getting access to the R and R Studio software, I persevered. As mentioned above, at one point I was using my home computer to code the script because the procurement process was taking too long to get the software onto my work laptop. As we were working from home due to the Covid restrictions, I could use my home computer to work on the script. However, despite completing the script, I was not able to run it to test the program, because I did not want the data files transferred to my home computer, even though they did not contain any data that would come under GDPR. Because of this, though it was frustrating, I had to wait patiently until the IT department was ready to approve the use of the software following the procurement process. I had to jump through quite a few hoops to get to the point where I could use the software, but it now automates, in under five minutes, a task that would otherwise have taken someone at least a day (considering data extraction too). In the longer run, this is a much more effective and efficient way of working for UCEM. A take-home message from this experience is that if you strongly believe something is the right thing to do, persevere; it will pay off.

The task of programming a macro was somewhat new to me. Although I had used recorded macros, I had never gone into a recorded macro to change it. At first it seemed daunting, and there were points where I felt I was drowning. Persevering and looking for online tutorials and code fragments worked well here, because there are so many people out there programming macros and sharing their code snippets. Once you get used to the code, it is just another programming task, albeit in a new programming language: Visual Basic. There was one point when I did not understand why my program was not working as it should: when there were only one or two rows of data for the categories happy, meh and sad, the graph would swap its X and Y axes. In all other cases, that is, when there were three or more rows of data, the program worked as expected. This remained a mystery; even after extensive searches I could not identify the reason. When I reached out to a colleague who is a Microsoft product expert, he told me that when the number of data points is fewer than or equal to the number of categories, Excel automatically swaps the X and Y axes. This was the missing piece of the puzzle. An IF statement was all that was required to test for this condition and, if it was met, swap the X and Y axes.

Code snippet to Swap X and Y axes

Code snippet to Swap X and Y axes if the condition was met
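For reference, the fix amounted to a check of this kind (a simplified sketch; the variable names are placeholders for those in the actual macro):

    ' If the number of data rows is less than or equal to the number of categories,
    ' Excel flips how it assigns the series, so force the orientation back.
    If dataRowCount <= categoryCount Then
        cht.Chart.PlotBy = xlRows    ' or xlColumns, whichever restores the intended layout
    End If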

This experience showed the importance of testing a program thoroughly, because the default behaviour of the software can have unintended consequences. It also showed the importance of one's network, and of being open and confident enough to reach out when you are at a dead end. It is alright not to know something, but you should be able to find the answer, or at least give it your best try when an answer is required.

The whole experience showed me that I can take opportunities to develop and stretch myself by trying out new things and new ways of working. The programming task was the most challenging undertaking I had done recently, but I enjoyed it immensely; perhaps I was reconnecting with my old self as a graduate software engineer at the start of my career, what seems a lifetime ago.

This also taught me more about where macros are stored by Microsoft products. If a macro is saved to be used across multiple workbooks, it is stored in the PERSONAL.XLSB file in C:\Users\<username>\AppData\Roaming\Microsoft\Excel\XLSTART, as opposed to within the workbook itself. This means that if you are sharing a file with a macro, you have to either save the macro in the workbook or share the PERSONAL.XLSB file as well.

When a new member of staff joined our team, I was asked to hand this work over to her. I took the time to write a ReadMe file containing all the instructions needed to run the program and generate the graphs. While writing it, I realised there were many dependencies (for example, the libraries that needed to be installed and loaded in R, and their order of loading) and that all of these needed to be properly documented for anyone on a different computer to be able to run the program. It was an eye-opener: as the program's author I was so close to it that I knew everything inside out, but detaching myself and thinking like a new person working on a separate computer, which may or may not contain all the required libraries, was a new experience. This too showed the importance of proper documentation for technology projects.

Now that I have access to Power BI, I can create the required reports much more easily, so in a way the time and energy I put into programming in R and writing the macros are no longer required. However, this is always the case with technology: you have to work with the tools available to you at a given point in time. As more tools become available, the work you have done may come to seem irrelevant or unnecessary, but that is part of life in technology fields. It also shows the need to be flexible and adaptable in this line of work, because when newer tools become available one should be able to unlearn what was learned before and relearn what is required for the new tool.

I am really proud of this work because I know over the years it will save so much time that someone would otherwise have had to spend on doing a boring copy-paste and graph drawing exercise.

1c) Supporting the deployment of Learning Technologies

In this section, I am going to show two projects in which I have recently supported the deployment of learning technologies in my organisation: the rollout of the Zoom video conferencing tool and of the Inspera assessment software (still in the pilot stage). I also run accessibility workshops for staff, which I will discuss under Wider Context, Specialist and Advanced Areas.

Zoom

UCEM is a distance education provider and has been providing online distance education programmes ever since I joined in 2015. We also have fully remote staff members, some even working from abroad.

UCEM was using Blackboard Collaborate as its webinar software when, in 2019, there was a need to investigate a more suitable option. Most of our students had issues installing Blackboard Collaborate on their office computers, so the fact that Zoom can run in the browser was very attractive. Furthermore, at the time Zoom provided a fair automatic transcription accuracy rate, compared to no transcription at all in the other software. Once Zoom was selected as the software for UCEM webinars, we were tasked with rolling it out throughout the organisation.

VLE Resources

I created a whole topic area for webinars on the "Module Tutoring" course for tutors. When creating this resource I used Zoom's own videos from their website as much as possible, because otherwise we would have to keep updating the videos whenever the interface changes. However, where it was necessary to show how Zoom was to be used with our VLE, the screencasts were created using Camtasia. I wrote the script for each video and created the screencast, and my colleague Graham did the voice-over, which I then synchronised with the screencast video in Camtasia to produce the final video. I decided to use a tabbed page to keep all the videos in one place that people could refer back to quickly. You can see a screengrab of the main resource I created for tutors: I created five of the eight videos used in the resource, and three were embedded from the Zoom support pages.

Along with these technical Zoom how-to instructions, we included ideas for activities that could be used to support interactivity in webinars such as icebreaker activities, polling, breakout rooms, collaboration activities, timed activities with an online timer shared on screen, and so on. This was to help tutors get an idea about what sort of interactivity they could try out in their webinars.

At the start of the rollout, as we got questions from tutors, I created a Zoom FAQ which was regularly updated to reflect the questions posed to us.

Zoom resources on VLE

Zoom resources on the VLE

Blog posts

You can view the blog posts I have created on Zoom-related issues that were then shared with the UCEM staff.

The post Zoom: Teething Troubles or Features? is the most visited page on the UCEM blog site; it received a lot of visitors, especially during the Covid lockdowns, showing that it may have helped a global audience.

Google Analytics page for UCEM Online education blog

Google Analytics page for UCEM Online education blog

Google Analytics page for Zoom Teething Troubles or Features post

Google Analytics page for Zoom Teething Troubles or Features post

Online Training

My colleague Graham led the online training sessions, which I also joined to provide support. In these sessions, each tutor was given meeting co-host status for a time so that they could share their screen and try out the new software in a risk-free and supportive environment. These were run either as group sessions with a few tutors or, if requested, as on-demand sessions for tutors to practise screen sharing before conducting their webinar for students.

Many of our tutors work part-time from various parts of the UK, so it was not logistically possible to conduct face-to-face sessions. Furthermore, given that this is communication software, we felt that letting tutors join a session and supporting them to explore it together with us would provide the best experience.

Evaluation

After Zoom was implemented at UCEM, there was an evaluation to understand the process from the end users' perspective. I designed the survey, which was conducted using the Online surveys tool.

  • Student experience of Zoom webinars survey was open from 10th June – 31st July 2019 and attracted 283 responses

  • Staff experience of Zoom webinars survey was open from 25th June – 31st August 2019 and attracted 27 responses

The report I have written on the evaluation is shared below.

Zoom: Webinar Software (Staff and student feedback) (Password Protected)

The link above provides access to a confidential evaluation report. The document is password protected. Password is provided to the ALT Assessors.

This work led to a journal publication, Transcripts and Accessibility: Student Views from Using Webinars in Built Environment Education, in the European Journal of Open, Distance and E-Learning.

Being Available to Help

I am really glad that my colleagues feel able to reach out to me when they have difficulties; I always provide a prompt response and offer further help if required. This email conversation is between me and a new member of staff who reached out for support. At UCEM, business-as-usual learning technology queries are sent to a separate CoreServices team, which I am not part of. However, this recent conversation shows that my colleagues at UCEM consider me approachable and a point of contact when they feel they need support.

My response email providing Zoom support

This email conversation was with a colleague who joined UCEM recently and who had issues with a webinar. Note the time of the email and the quick response I sent.

Inspera

UCEM is in the process of trialling the Inspera assessment platform. I was involved in the project from the evaluation phase, where we evaluated two products, Inspera and Better Exams, to see which one would best suit UCEM's requirements.

Evaluation of Assessment software (Password Protected)

The link above provides access to a confidential evaluation report. The document is password protected. Password is provided to the ALT Assessors.

Online Awareness Raising/Training Webinars

Inspera was selected as the assessment software for UCEM. I have so far conducted at least three webinars to give an understanding of the new software to teaching and assessment management staff. I am also in the process of creating a training webinar for the markers.

The presentations provided here are confidential and are password protected. Password is provided to the ALT Assessors.


Students' view of Inspera for tutors

Email sent to staff who have taken part in the initial Inspera session to try the system

Email sent to staff who have taken part in the initial Inspera session to try the system

At this initial stage, to raise staff awareness of the new system that will be trialled at UCEM, I have created an "exam" (in Inspera terms) that staff can take to understand how students will see the system. I am also making this an opportunity for academic staff to get an understanding of what sort of questions can be presented in the system. The opportunity to take the "exam" as a student was offered to academic staff who will be delivering the modules where Inspera will be piloted, academic support staff involved in these modules, and operations management and quality assurance staff involved in exams and assessments.

You can view the mock exam questions from Inspera Mock Exam for Staff document.

Technical Challenges

The Inspera system provided as a sandbox for UCEM testing contained many features that are not available in the actual installation we have received. For example, one feature that UCEM wanted in the assessment platform was the facility for students to choose which questions they answer, say picking four questions out of five in the assessment.

Inspera Sandbox Question Selection for students

Inspera Sandbox Question Selection for students available

Inspera Sandbox Question Selection Settings

Inspera Sandbox Question Selection Settings

Notice that the "Allow candidates to select X of Y questions from this section" option is not available in the UCEM installation. At the time we were given access to the sandbox to test the system before purchase, it was not made clear that we had access to a beta system, so we were under the impression that this feature would be available to us.

I have raised these issues that are likely to affect our pilot and have escalated my concerns.

Due to several such issues, UCEM has decided to postpone the pilot to Autumn 2022, which is the right decision given the circumstances.

Analysis and Reflection

People are generally resistant to change. As Kurt Lewin's model of change suggests, there are three steps to reaching a new equilibrium: unfreezing, changing and refreezing (Lewin, 1975). In my view, resistance to change from a user's perspective, especially in technology projects, arises from fear of the unknown, fear that there will be more to do, and fear of not being able to cope. All these feelings are natural and human.

I also believe in engaging stakeholders from the beginning. This gives them the sense that this is "our project". For example, in the Inspera pilot project, I proposed giving our staff access to a mock exam to have a feel for the system. This way they can interact with the system and get to know it and see how our students will 'see' it when they are taking their assessments.

As technology project change agents we need to understand and empathise with our users. By holding their hand if necessary to take the first steps until they feel comfortable to carry on, on their own we can help alleviate the resistance to change. This is what we did in the Zoom roll-out by giving tutors a safe space to practice screen sharing.

Once our tutors were comfortably conducting their webinars on the Zoom platform, we started introducing ideas for interaction such as polls, icebreaker activities and so on. There were lunchtime sessions where tutors shared their experiences with others; these were facilitated to provide a platform for tutors to discuss what works and to learn from each other. This has similarities to what Vygotsky (1978) identified as the zone of proximal development and scaffolding, where more knowledgeable others provide support for development.

With the Covid pandemic, a lot of educational institutions scrambled to get online, while UCEM, being an online institution, was at an advantage as we had already designed and developed our programmes for online delivery. In this sense, it was easier for us, as the learning technology team in an established online institution, to support our students and staff during these unprecedented times.

This does not mean that our colleagues do not face issues. As I have shown in the email conversation above, there are times when colleagues run into problems and reach out for help. I believe in providing help as soon as possible, because otherwise the user feels frustrated and more apprehensive about using technology. I also try to use images and screenshots to give clear instructions; this is even more important now that we are not in the office to "show" someone how to do something. I feel it is also important not to come across as cold and indifferent when someone is asking for help. A standard message of "Please refer to Module Tutoring > Topic 4: Webinars > Zoom essentials for webinars - Share Content tab" would have given the same information, but I feel it is important to show empathy, appreciate the difficulty they faced and then offer support, to create a long-standing connection and colleagueship.

I also appreciate that things in technology fields move fast and that we, as technology change agents, have to keep up. I attend many Jisc events, especially those organised by the Accessibility Community, and other online events to keep myself up to date. I am now also part of the "Inspera UK Forum"; the latest meeting I attended was on 2nd March 2022. These online events allow me not only to get up-to-date information but also to engage with people who have similar interests, developing ideas, discussing issues and learning from each other's experiences.

In terms of the Inspera project, the rollout is currently paused because UCEM expected the feature allowing candidates to select questions to be available to us; unfortunately, the Inspera team is unable to deliver it, despite having allowed us to test a system that included the feature. There are also concerns about limitations in the LTI connection's support for the VLE roles we use at UCEM. It is not necessarily a bad thing to halt a project until things are sorted out. In fact, I believe it is the right decision in this instance, as assessment is the most important thing for our students. Had we hurriedly implemented a system that failed to deliver on such an important aspect of the student learning journey, it would have caused reputational damage to the institution. Putting the project on hold until the path is clear and the technical challenges are overcome is, I believe, the best way forward for us. However, I wish I had been involved in the project earlier, at least during the development of the testing plan, because some of the issues that came to light later would have been caught upfront if a wider team with experience across various departments had been involved. We are in contact with the Inspera team, and when they are ready to bring the features we require from beta testing to the delivery platform, we will continue with the Inspera project.

There is one thing in the Zoom project I wish I could have changed: adopting video conferencing software that could be used anywhere in the world. When we tested Zoom, we looked at the countries listed in Zoom's Restricted countries or regions section; even today it only lists Cuba, Iran, North Korea, Syria and Ukraine (Crimea region). However, some other Middle Eastern countries also block Zoom use via Internet Service Providers (ISPs), so a small minority of our students are affected. Some students in these countries are able to use Virtual Private Network (VPN) services via their employers and can join Zoom sessions. We did consider providing a VPN for our students, but connecting to Zoom would have been illegal for residents of those countries, so we did not go down that path. Instead, we now identify students in these countries, download the recorded video after the webinar, and provide it for them on the course page via the Vimeo video-sharing platform. This was a very good lesson for all of us, as we are now very conscious of international access when investigating software. However, due to the benefits Zoom offered, it was deemed the most suitable technology for UCEM. Since its adoption in 2019, Zoom has been hailed a success, and even in 2022 we are not looking for a replacement.

References

Lewin, K. (1975). Field theory in social science: Selected theoretical papers. Westport, CT: Greenwood Press.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.