Core area 1: Operational issues:

a) An understanding of the constraints and benefits of different technologies

You should show how you have used technology appropriately, given the constraints and benefits it provides within your context. Evidence in support of such statements might include a brief commentary on the choices behind the development and use of learning technology that influence its fitness for purpose. (This might discuss issues such as viability, sustainability, scalability, interoperability and value for money.) You may already have something like this in the form of a design outline, proposal, conference presentation or similar. You should include such existing documentation wherever it seems relevant. Alternatively, you might want to take this opportunity to find out more about a technology you have deployed and produce a report on its viability.

Subject improvement and video technologies for assessment - NSG2EHP

Description:

I worked on a subject improvement design process with a Nursing educator, evaluating various video publishing tools for assessment purposes within a Nursing subject delivered to students across Australia and Singapore. I evaluated and communicated the constraints and benefits of the different technologies to the educator, helping her make an informed decision on which video tool would best support students in their assessment. I then assisted in designing and implementing the chosen tool, including creating assessment links and support resources/instructions for students in the subject. This involved an iterative design, test, implement and evaluate cycle across cohorts to finalise a working solution.

The video assessment the staff member was trying to achieve was a particularly complex use case, involving a two-stage/two-part scaffolded process:

  1. Students record a short video of themselves presenting a microteaching activity on a specific topic to camera, then upload this to an appropriate storage area for assessment (Assessment 2), along with a Word document detailing their teaching plan (linked to their teaching activity)
  2. Students share their video with their peers post assessment in a central area, to share and view, in preparation for the upcoming final assessment
  3. Students choose a peer's video on the same topic and review it as part of a critical self-reflection for Assessment 3.

This assessment also ideally had to meet a number of requirements/considerations:

  • Solutions needed to be tested for multi-campus and multi-cohort delivery - e.g. whether the tools were accessible to Singapore students and suitable for Singapore bandwidth/connection speeds
  • The workflow process should be simple for students and not cause unnecessary challenges
  • The approach should allow students to both submit for assessment and share with their peers, ideally without needing two submission points/processes
  • Video files 5-10 minutes in length should be supported
  • The process should not cause additional instructor workload if possible
  • The academic had some concerns about using PebblePad due to previous experiences, so was unlikely to go down this route.

Through investigating the instructor's needs and different tools, I identified benefits and constraints of different systems. I evaluated video tools both institutionally supported and external to the University, and provided a comparison/evaluation rubric to the instructor as part of a further discussion, in which the instructor then chose a solution based on this information and my advice. The rubric was provided as a Word document outlining my evaluation findings as a matrix table documenting key tools such as the LMS, YouTube, Cloudstor and PebblePad; it can be found in the evidence section.

The assessment went through two iterations:

1. Singapore cohort: Cloudstor was used as the storage/submission mechanism, chosen because it aligned strongly with the evaluation criteria and because a fellow educational designer colleague recommended it based on positive experiences in other subjects. Cloudstor's benefits were that it was:

    • Able to handle large file sizes
    • University supported and based on the AARNET high speed network (accessible for Eduroam partners)
    • Tied to University accounts so no additional account or password required
    • Flexible in its privacy settings, so students could make the file viewable by others, allowing one submission for multiple purposes.

However, results in practice were not positive: students uploaded incorrect file types and/or did not change sharing settings, and the system was excruciatingly slow on Singapore bandwidth connections. This led to negative experiences for students, and additional workload for the instructor/coordinator in chasing up incorrect submissions.

2. Local cohort: For the next iteration I recommended YouTube for the local cohort. This resolved the technical challenges found with Cloudstor, offering the following benefits:

    • Faster and more consistent file upload speeds (though with a potential risk of the service being blocked overseas)
    • File types validated by the system on upload, meaning all successful submissions were video files that could be viewed by an assessor.

However, this approach raised additional, broader challenges in practice, particularly student and faculty/department privacy concerns. Students were confused by the sharing settings in YouTube and concerned about their files being public. The use of an external tool also carries risks around privacy and retention of student assessment data, and around ensuring equity for students.

Despite instructional guides showing students how to create an 'unlisted' share link for private/semi-private viewing, students' pre-existing perception of YouTube as 'public' meant they held major concerns and responded negatively, and the guides did not address this. The instructor/coordinator was also new to the process and unaware of the nuances of these settings, so could not communicate them to students. This meant I needed to provide extended support and work with the staff member to rethink the approach and clarify next steps.

Reflection:

What have I learnt from doing what I describe above?

Through this process I was exposed to a very specific use case for video assessment that I had not previously come across. While narrow, it is not confined to this subject: while working at La Trobe I have found colleagues who have seen the same use case employed in other subjects. This has given me a deeper understanding of what academic/teaching staff are trying to achieve with educational technologies and for what purposes. The fact that students needed to not only submit their video to their instructor for assessment but also share it with their peers complicated the steps and the assessment process; however, this does not mean that the strategy is pedagogically unsound. It did raise issues around finding a solution that would meet local and Singapore students' needs and literacy/technology levels, in a streamlined way that did not add significantly to student or instructor confusion or workload.

Through the process I also identified real gaps, at La Trobe University but also more generally, in tools that support video assessment effectively. These limitations hinder good teaching and academics' ability to innovate and implement the technologies and assessment types they would like in practice. There seems to be no real end-to-end solution that is easy, clear, and supports not just video but video for assessment (including complex assessment) purposes:

  • Most systems were not an end-to-end solution, so students would need to use multiple tools and learn multiple processes, including editing, compressing and exporting appropriate file types - a potential issue for different student cohorts (e.g. EAL learners, overseas students, low-literacy learners).
  • Submitting to an assessment area would limit visibility to the instructor only, so videos were not available by default to other students for later sharing - this limited the use of the LMS assessment dropbox, or would require additional steps by students or the instructor to make videos available by other means. Other systems had differing privacy settings that students would need to set up themselves, requiring training/support/instructions.
  • Some systems had limitations in file size (e.g. 200 MB), so were not suitable for the assessment task unless the task was adapted (see the rough calculation after this list).
  • Several systems were limited in purpose (i.e. file storage rather than a dedicated video solution), which meant they could not validate that the file submitted by a student was an appropriate video file, and required extra steps for the instructor/other students to download the file and play it back locally. Many that incorporated sharing used a link to a file and were not designed for assessment, making it difficult to validate when a student submitted the file, as they could update/overwrite the version post-submission (thereby gaining an unfair advantage by improving their work past the due date without penalty).
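
As a rough illustration of why file size caps mattered (a back-of-envelope calculation, assuming a typical smartphone recording bitrate of around 5 Mbps for 1080p video; actual bitrates vary by device and settings): file size ≈ bitrate × duration, so a 10-minute video is roughly 5 Mbps × 600 s = 3,000 Mb ≈ 375 MB, and even a 5-minute video (≈ 188 MB) approaches a 200 MB cap. Limits of that order therefore ruled some systems out, or would have forced students into extra compression steps.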

Recommending YouTube brought up additional challenges, particularly student concerns around privacy, which led to institutional/departmental fears, as well as my own concerns about how the solution aligned with our data retention requirements under institutional and national policy.

Upon further reflection in light of the privacy concerns, I feel that using YouTube went against institutional and TEQSA policies: generally, assessments should be stored on local/institutional servers. There are also TEQSA regulations about ensuring students are not disadvantaged in assessment, and the fact that YouTube requires students to set up a new non-institutional account is perhaps beyond the scope of a reasonable request. To mitigate this in the next iteration, I suggested the instructor provide a disclaimer and allow alternative submission options for students on negotiation/request.

Privacy, data storage and data validity are broader considerations that can affect the confidence of academic staff, students and institutions in implementing particular approaches in practice; this has implications for encouraging technology take-up and for ensuring that staff and students have positive technology-enhanced experiences. In this case, students' concerns not only affected the teaching academic but filtered up to the department level, leading to pressure on the academic to change the approach and abandon the technologies.

What could have gone better? What would I do differently another time?

Unfortunately, it is very difficult to advise on better solutions when so many of the available solutions are problematic. I think the approach I took - evaluating tools and iteratively working with the teaching academic to implement and evaluate the delivery over a couple of teaching periods - was sound. It allowed me to develop a relationship with the academic and to gain detailed, constructive feedback and data on how the solution worked in practice, which in turn allowed me to intervene and provide further support where needed; I would aim to take a similar approach in future (time and workload permitting). However, I did not fully consider or sufficiently prioritise the privacy and institutional policy concerns. Reflecting on both this experience and the ePortfolio mobile-first project (see below) has given me a deeper understanding that privacy and data concerns are real issues in educational contexts and cannot be taken lightly: they can literally make or break a system implementation and affect users' willingness to adopt educational technologies, as well as their experiences when using them. They are also complex considerations that can be confusing for most users, and so require real care not only when implementing a solution but when communicating with and training staff and students, in order to achieve successful outcomes and gain stakeholder buy-in.

Evidence:

  • Emails from coordinator:

Kate thank you so much. I really appreciate your efforts and your suggestions. I will look over these prior to semester commencing.

Much appreciate all your assistance over the last few months.

  • Emails from project team members:

Thanks for the confirmation, Kate – you’ve done an awesome job and we know from Sinead’s feedback that she was hugely appreciative of all your input and expertise.

  • Documentation: evaluation matrix of video tools, video assessment submission guide (below)
Tool comparison for NSG2EHP[1].docx
Video assessment – submitting Assessment 2.docx

ePortfolio research and implementation project: A mobile-first clinical assessment model

Description:

I co-wrote and co-presented a paper on the implementation of a new mobile-based ePortfolio approach with La Trobe Nursing students and clinical educators. The paper was accepted through a peer-review process and jointly presented at the 2017 ePortforum in Melbourne.

The model was introduced by a work colleague and was originally intended to improve the student experience over previous ePortfolios that were not mobile-first, by:

  • encouraging an improved student experience - increased access to technologies in flexible formats
  • encouraging student autonomy and accountability in the clinical assessment process - designed to make students responsible for leading conversations with clinical assessors, and to encourage a shared conversation / 360-degree feedback approach where students ask for feedback and receive verbal feedback at the point of being assessed in clinic, rather than receiving delayed written feedback with no context.
  • reducing the administrative load for coordinators when managing clinical placements and multiple clinical educators' accounts.

While I was not involved in the design or implementation of the ePortfolio project itself, my role in the ePortforum paper and presentation involved locating research that supported the model, discussed considerations in practice that could support or hinder its implementation, and shed light on some of the early findings. This work deepened my and others' understanding of mobile use of ePortfolios and the challenges in practice.

The paper detailed the redesigned ePortfolio approach, as well as the early findings and lessons learned. These demonstrated that there are a number of constraints and considerations when implementing a tool in a clinical setting: technical and scalability challenges when introducing ePortfolios and/or mobile-first approaches, but also ethical and other broader challenges that can affect staff and student willingness to engage and their actual use of the technologies, particularly in clinical settings - e.g. privacy and confidentiality, data validity, and administration workflows for clinicians, students and educators.

Some of these findings from the literature have been supported by early anecdotal feedback from the project, but more research is required and is in progress.

Reflection:

What have I learnt from doing what I describe above?

I already had experience in using ePortfolios and understood their constraints and benefits in education more generally, such as benefits for students (including ease of storage and long-term access post-course/institution - see Wuetherick & Dickinson, 2015) and benefits for learning and teaching such as improved assessment and feedback (Green et al., 2014) and student agency over learning (Shulman, 1998; Bryant & Chittum, 2013). From my own research (Master's thesis) I uncovered additional benefits, such as improved student support and timely intervention for student issues. I was also aware of broader challenges to effective ePortfolio implementation in education, including lack of understanding of pedagogical intent, lack of consistency in use over a course, a steep technological learning curve, and student ownership of the space affecting logistics and workflows when handling assessment.

However, through investigating ePortfolios for the nursing and clinical context, I found a number of benefits, but also constraints/barriers, to implementing mobile technologies specifically in nursing or clinical settings that I was not previously aware of and that could affect or influence the model. These key benefits and challenges are outlined in the research paper submitted to ePortforum 2017 (see the Supporting Evidence section below), but key points, and some additional benefits/constraints found through this research, are highlighted below.

Benefits:

The research literature I found suggested that there were pedagogical, and potentially some logistical, benefits to moving to a mobile-first ePortfolio approach:

  • For the nursing discipline/curriculum: Benefits of using ePortfolios in nursing disciplines are varied (Curtis, 2012; Garrett, MacPhee & Jackson, 2013; Green et al., 2014) but include storage, reflection (Feather & Ricci, 2014), demonstrating digital literacy required by nursing organisations (Wassef, Riza, Maciag, Worden & Delaney, 2012), demonstrating CPD (professional development records for nursing accreditation) as well as during interviews for employment (Feather & Ricci, 2014).
  • During clinical placement: There is also some evidence of potential benefits in taking a mobile-first approach for clinical settings: Bogossian and Kellett (2010) have identified issues related to lack of computer and network access in hospital settings, and mobile technologies could potentially address/mitigate this issue. Other authors have noted the importance of using freely available and portable tools (Bolliger & Shepherd, 2010; Wuetherick & Dickinson, 2015) which could be applied in this context.

However, the literature and early findings in the project also identified a number of constraints in clinical settings that could possibly affect implementing the model in practice:

Constraints/barriers:

  • Restrictions on using mobile technologies in a clinical setting - students potentially forced to put phones away. Some studies have also noted lack of access to computers and networks (Bogossian, Kellett & Mason, 2008).
  • Privacy concerns for clinical educators when assessing student work/progress on shared computers (if/when using alternatives to mobile)
  • Privacy concerns of patient data, particularly for patients and clinical educators seeing students using mobile technologies in clinic when working with patients
  • Clinical educators' time and administrative load when assessing and reviewing student reports - both workload and time allocation preferences
  • Digital literacy levels of students (Wuetherick & Dickinson, 2015)
  • Instructor resistance to ePortfolios (Wuetherick & Dickinson, 2015)
  • Inter-rater reliability issues (Feather & Ricci, 2014), which could apply in clinical contexts where clinical educators use differing assessment scales, both compared to lecturers and to each other. This is not an issue with the technology per se, but it points to larger issues that also need to be addressed as part of a successful approach.

Findings in practice

As noted in our conference paper (see Supporting Evidence below), early feedback from students on the new mobile-first model and ePortfolio template suggested that they perceived improvements over the previous model, both from an access and process perspective and due to improvements to the user interface, which made the clinical assessment process simpler and clearer. Feedback from the coordinators indicated that they perceived the new template and process as reducing their administrative workload. However, these benefits seemed tied more to the interface than to the availability of mobile.

The early feedback from all parties also suggested that while the intention of the model had been to improve student accountability and transparency between student and educator, it did not always work this way in practice. Technical challenges, along with privacy concerns, resistance and lack of consistency in clinical educators' practice, caused additional issues for La Trobe staff and students and undermined the intended approach. In particular, while the new approach streamlined accounts, clinical educators held concerns around data validity, especially around validating that the feedback recorded as part of formal assessment was their own and had not been modified by others. Clinical educators also often preferred not to use mobile devices (either their own or the students') and so would assess students outside of clinic hours, which removed or limited the 360-degree feedback opportunity and the student-initiated approach. There were also concerns and limitations around students using their own mobile devices in clinical settings.

These challenges suggest that when implementing ePortfolios in clinical settings in future, a range of factors need to be considered and potentially mitigated. In particular, considerable work is needed to educate and gain buy-in from clinical educators, so that they feel invested in and positive about the intended approaches and confident that those approaches will meet their needs. Some of the privacy and data validity concerns are valid, but others potentially stem from clinical educators' lack of understanding of the process, which could be mitigated through training and communications. Additionally, technical implementation and technical limitations in clinical settings need to be considered and worked through more deeply - a range of flexible approaches may be required to allow for different clinical experiences, access considerations and clinical staff/institutional styles. Unfortunately, these considerations may mean that fully achieving a student-led, autonomous approach is not always possible, at least not through the technology alone, though it may in part be achieved through clearly defined processes and clinical educator training.

I am also continuing some aspects of this research within a separate research project across university sites, looking at unintended consequences for students and staff using ePortfolios in clinical/health or education settings when working with third parties who are vulnerable groups (e.g. children, patients). Working with other institutional sites in the early stages of this project has given me an appreciation of the fact that these ethical considerations, particularly privacy and confidentiality of data, are not minor concerns; in some cases they have led to disciplines and institutions halting the use of ePortfolios altogether. This is a concern, as ePortfolios have strong pedagogical benefits, particularly in supporting student critical reflection and self-evaluation and in promoting student agency and responsibility over their learning journey. These considerations demonstrate that not only are the limitations of the technology a consideration, but the broader policies, approach and implementation/support provided for educational technology use are just as important.

Supporting evidence:

I have included a copy of the program and our research paper excerpted from the conference book of peer reviewed papers (see below).

Evidence of the paper and presentation details are available at the ePortforum / ePortfolios Australia Forum 2017 website page at: https://eportfoliosaustralia.wordpress.com/forums/2017-eportfolio-forum-home-page/2017-eportfolio-forum-program/

The conference website includes the program and the book of short peer-reviewed papers in which our paper 'A mobile-first clinical assessment model' was featured.

eportfolioforum_2017_TYKMRC-peer-reviewed-paper.pdf
eportfolioforum_2017_program_a4print_v7_20170915.pdf