LAK22 Assess

Workshop on Learning Analytics and Assessment

MARCH 21, 2022
12noon - 3pm (PDT)
VIRTUAL SPACE

OBJECTIVES

Motivated by the success of the 1st Workshop on Learning Analytics and Assessment at LAK21, we are organizing the second iteration of this workshop (LAK22 Assess), whose main objective is to further promote research and practice at the intersection of learning analytics and assessment. The workshop will examine approaches that build upon established principles in assessment to improve the reliability, validity, and usefulness of data collection and analysis in learning analytics. It will also explore how learning analytics can contribute to future developments in assessment for both summative and formative purposes, and will examine practices for using learning analytics to assure assessment trustworthiness, with particular attention to the socio-technical nature of potential challenges. Finally, the workshop will be an opportunity to further frame and shape special issues as important products of the connections between learning analytics and assessment.

PARTICIPATION SUMMARY

  • The organizers will open the workshop by outlining the links between learning analytics and assessment. Attendees will then have the opportunity to give plenary and roundtable presentations on theory and/or work in progress.

  • An open call for participation aims to solicit brief descriptions of current research and practice projects.

  • This work will be presented at the workshop and discussed with participants in plenary and roundtable sessions.

  • The LAK22 Assess workshop will provide a space for both capacity building and connection, and it is hoped that the event will spark the formation of a community of practice. The outcomes of the event will be housed on the Google Site. A possible follow-up publication will be organized in the form of a journal special issue.

Background

The field of learning analytics aims to harness the potential of digital traces of user interaction with technology. Through the analysis of these traces, learning analytics seeks to advance the understanding of, and support for, learning processes and to improve the environments in which learning occurs. Many promising results in learning analytics have fueled vibrant research and development activity and attracted considerable attention from policy and decision makers. To date, learning analytics has demonstrated very promising results in several areas, such as the prediction and description of learning outcomes and processes (e.g., Baker, Lindrum, Lindrum, & Perkowski, 2015; Gardner & Brooks, 2018; Greene et al., 2019), the analysis of learning strategies and 21st century skills (e.g., Jovanović, Gašević, Dawson, & Pardo, 2017; Matcha, Gašević, Uzir, Jovanović, & Pardo, 2019), adaptive learner support and personalized feedback at scale (e.g., McNamara et al., 2012; Molenaar, Roda, van Boxtel, & Sleegers, 2012), and frameworks for ethics, privacy protection, and adoption (e.g., Tsai et al., 2018).

Despite these promising results, the field still needs to address some critical challenges, including those at the intersection of learning analytics and assessment. For example, how can learning analytics be used to monitor learning progress? How can learning analytics inform formative and summative assessment as learning unfolds? In which ways can the validity and reliability of data collection and analysis in learning analytics be improved? These challenges are highly significant in a contemporary society that increasingly requires the development and use of complex skill sets (Greiff et al., 2017); learning and assessment experiences are therefore closely associated. A growing body of research in educational data mining has developed techniques that can equip intelligent tutoring systems with mechanisms for skill development (Corbett & Anderson, 1994; Desmarais & Baker, 2012). Yet, there is limited research on how the data collected and methods applied in learning analytics can be used as, and possibly constitute, a formative or summative assessment. Moreover, can such data and methods satisfy the requirements for assessments articulated in psychometric properties, methodological models, and different types of validity and reliability?

The role of learning analytics in the analysis of assessment trustworthiness is another open research challenge. This has been particularly emphasized during the COVID-19 pandemic, when the emergency transition to distance and online education required approaches to assessment that go beyond proctored exams. Several studies have proposed the use of data-analytic methods for the detection of potential academic dishonesty and cheating behaviors. Although some interesting insights have been reported and a strong potential to detect suspicious behaviors has been demonstrated, many open challenges remain regarding the technical, ethical, privacy, practical, and policy issues involved in the development, implementation, and use of such methods.