LAK24 Assess

Workshop on Learning Analytics and Assessment

Date: Tuesday, March 19, 2024
Time: 9:00–17:00
Kyoto, Japan
Format: In-person workshop

OBJECTIVES

Motivated by the success of the Workshop on Learning Analytics and Assessment at LAK 21, 22, and 23, we organise the fourth iteration of this workshop (LAK24 Assess), whose main objective is to further promote research and practice at the intersection of learning analytics and assessment. The workshop will examine approaches that build upon established principles in assessment to improve the reliability, validity, and usefulness of data collection and analysis in learning analytics. It will also explore how learning analytics can contribute to future developments in assessment for both summative and formative purposes. The workshop will examine practices for the use of learning analytics to assure assessment trustworthiness, with particular attention to the socio-technical nature of potential challenges. Finally, the workshop will be an opportunity to further frame and shape special issues as important products of the connections between learning analytics and assessment.

PARTICIPATION SUMMARY

Background

The field of learning analytics aims to harness the potential of digital traces of user interaction with technology. Through the analysis of digital traces, learning analytics seeks to advance the understanding of learning, support learning processes, and improve the environments in which learning occurs. Many promising results in learning analytics have promoted vibrant research and development activities and attracted considerable attention from policy and decision makers. To date, learning analytics has demonstrated very promising results in several areas, such as prediction and description of learning outcomes and processes (e.g., Baker et al., 2015; Gardner & Brooks, 2018; Greene et al., 2019), analysis of learning strategies and 21st century skills (e.g., Jovanović et al., 2017; Matcha et al., 2019), adaptive learner support and personalized feedback at scale (e.g., McNamara et al., 2012; Molenaar et al., 2012), and frameworks for ethics, privacy protection, and adoption (e.g., Tsai et al., 2018).

Despite these promising results, the field still needs to address some critical challenges, including those at the intersection of learning analytics and assessment (Gašević et al., 2022; Raković et al., 2023). For example, how can learning analytics be used to monitor learning progress? How can learning analytics inform formative and summative assessment as learning unfolds? In which ways can the validity and reliability of data collection and analysis in learning analytics be improved? These challenges are highly significant in a contemporary society that increasingly requires the development and use of complex skill sets (Greiff et al., 2017); learning and assessment experiences are therefore closely associated. A growing body of research in educational data mining has developed techniques that equip intelligent tutoring systems with mechanisms for modeling skill development (Corbett & Anderson, 1994; Desmarais & Baker, 2012). Yet, there is limited research on how data collected and methods applied in learning analytics can be used for, and possibly constitute, formative or summative assessment. Moreover, can such data and methods satisfy requirements for assessments articulated in terms of psychometric properties, methodological models, and different types of validity and reliability?
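
To make the kind of skill modeling referenced above concrete, the following minimal Python sketch implements the standard Bayesian Knowledge Tracing update from Corbett and Anderson (1994). The slip, guess, learning, and prior values are assumed for illustration only, not estimates from any cited study.

    # Minimal Bayesian Knowledge Tracing (BKT) update (Corbett & Anderson, 1994).
    # All parameter values are illustrative assumptions.

    def bkt_update(p_known, correct, p_slip=0.1, p_guess=0.2, p_learn=0.15):
        """Update P(skill known) after one observed response, then apply the
        learning transition."""
        if correct:
            evidence = p_known * (1 - p_slip) + (1 - p_known) * p_guess
            posterior = p_known * (1 - p_slip) / evidence
        else:
            evidence = p_known * p_slip + (1 - p_known) * (1 - p_guess)
            posterior = p_known * p_slip / evidence
        # The learner may acquire the skill after each practice opportunity.
        return posterior + (1 - posterior) * p_learn

    p_mastery = 0.3  # assumed prior probability of mastery
    for outcome in [True, False, True, True]:
        p_mastery = bkt_update(p_mastery, outcome)
        print(f"P(known) = {p_mastery:.3f}")

In an intelligent tutoring system, such a running mastery estimate drives decisions like when a skill needs further practice; whether estimates of this kind can also satisfy psychometric requirements for assessment is precisely the open question raised above.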

The role of learning analytics in the analysis of assessment trustworthiness is another open research challenge. This was particularly emphasized during the COVID-19 pandemic, when the emergency transition to distance and online education required approaches to assessment that go beyond proctored exams. Several studies have proposed the use of data analytic methods for the detection of potential academic dishonesty and cheating behaviors. Although some interesting insights have been reported and a strong potential to detect suspicious behaviors has been demonstrated, many open challenges remain concerning the technical, ethical, privacy, practical, and policy issues involved in the development, implementation, and use of such data analytic methods.
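
As a simple illustration of the class of data analytic methods mentioned above, the following Python sketch flags unusually fast exam completions using z-scores. The log format, values, and threshold are assumptions for illustration only, and a statistical flag can only prompt human review; it can never establish dishonesty on its own.

    # Illustrative sketch: flag implausibly fast exam completions via z-scores.
    # The records and cut-off below are assumptions for illustration.
    # A flag marks a case for human review; it is not evidence of dishonesty.

    import statistics

    # Hypothetical (student_id, completion_time_minutes) records.
    records = [("s1", 52), ("s2", 47), ("s3", 8), ("s4", 55),
               ("s5", 49), ("s6", 51), ("s7", 46), ("s8", 53)]

    times = [t for _, t in records]
    mean_t = statistics.mean(times)
    sd_t = statistics.stdev(times)

    for student, t in records:
        z = (t - mean_t) / sd_t
        if z < -2.0:  # assumed cut-off for "unusually fast"
            print(f"{student}: {t} min (z = {z:.2f}) flagged for review")

In practice, robust statistics (e.g., median and MAD) and richer behavioral features would be preferable, and the socio-technical issues noted above (ethics, privacy, policy) apply to even the simplest such detector.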

References

Baker, R. S., Lindrum, D., Lindrum, M. J., & Perkowski, D. (2015). Analyzing early at-risk factors in higher education e-learning courses. International Educational Data Mining Society (pp. 150–155).

Corbett, A. T., & Anderson, J. R. (1994). Knowledge tracing: Modeling the acquisition of procedural knowledge. User Modeling and User-Adapted Interaction, 4(4), 253–278.

Desmarais, M. C., & Baker, R. S. (2012). A review of recent advances in learner and skill modeling in intelligent learning environments. User Modeling and User-Adapted Interaction, 22(1–2), 9–38.

Gardner, J., & Brooks, C. (2018). Student success prediction in MOOCs. User Modeling and User-Adapted Interaction, 28(2), 127–203.

Gašević, D., Greiff, S., & Shaffer, D. W. (2022). Towards strengthening links between learning analytics and assessment: Challenges and potentials of a promising new bond. Computers in Human Behavior, 134, 107304.

Greene, J. A., Plumley, R. D., Urban, C. J., Bernacki, M. L., Gates, K. M., Hogan, K. A., ... & Panter, A. T. (2019). Modeling temporal self-regulatory processing in a higher education biology course. Learning and Instruction, 101201.

Greiff, S., Gašević, D., & von Davier, A. A. (2017). Using process data for assessment in intelligent tutoring systems: A psychometrician's, cognitive psychologist's, and computer scientist's perspective. In Design Recommendations for Intelligent Tutoring Systems (Vol. 5, pp. 171–179).

Jovanović, J., Gašević, D., Dawson, S., Pardo, A., & Mirriahi, N. (2017). Learning analytics to unveil learning strategies in a flipped classroom. The Internet and Higher Education, 33, 74–85.

Matcha, W., Gašević, D., Uzir, N. A. A., Jovanović, J., & Pardo, A. (2019). Analytics of learning strategies: Associations with academic performance and feedback. In Proceedings of the 9th International Conference on Learning Analytics & Knowledge (pp. 461–470).

McNamara, D. S., Raine, R., Roscoe, R., Crossley, S. A., Jackson, G. T., Dai, J., ... & Dempsey, K. (2012). The Writing-Pal: Natural language algorithms to support intelligent tutoring on writing strategies. In Applied natural language processing: Identification, investigation and resolution (pp. 298–311). IGI Global.

Molenaar, I., Roda, C., van Boxtel, C., & Sleegers, P. (2012). Dynamic scaffolding of socially regulated learning in a computer-based learning environment. Computers & Education, 59(2), 515–523.

Raković, M., Gašević, D., Hassan, S. U., Ruipérez-Valiente, J. A., Aljohani, N., & Milligan, S. (2023). Learning analytics and assessment: Emerging research trends, promises and future opportunities. British Journal of Educational Technology. https://doi.org/10.1111/bjet.13301

Tsai, Y. S., Moreno-Marcos, P. M., Jivet, I., Scheffel, M., Tammets, K., Kollom, K., & Gašević, D. (2018). The SHEILA framework: Informing institutional strategies and policy processes of learning analytics. Journal of Learning Analytics, 5(3), 5–20.