Shared outcomes of the workshop are published below.
Collaborations are also possible; feel free to contact us at L.W.v.Meeuwen@TUe.nl.
Outcomes of the first workshop (LAK24): https://sites.google.com/view/lak24adoptionworkshop/follow-up
Esther Ventura Medina
Ioana Jivet
Rogers Kaliisa
Nithila Ramesh
Student-facing learning analytics dashboards have recently become a focal topic. However, the field would benefit from rigorous tests of the usefulness and usability of these dashboards (Kaliisa et al., 2024). Multiple dashboard elements are often applied and tested together in one dashboard, making it difficult to assess the utility of any single element. For example, it is unclear whether students who view a successful dashboard outperform students in the control condition because they focused on how much time they spent on the LMS each week, or because they were prompted to reflect on the assignment grades they received. In this presentation, I will give an overview of the different elements included in dashboards in the literature and the aspects of self-regulated learning they pertain to, thereby contributing to discussions on how best to experimentally evaluate dashboards. Further, I will present how we can tease apart for whom and in which context each of these elements would optimally support self-regulation. This adds to the discussion on the personalisability of dashboards to support different learning paths and best scaffold students on different trajectories. I will present how students with certain individual differences (e.g., course engagement, self-regulated learning skills) may find some dashboard elements more useful than others. I will also explain how we will assess this in an observational study (a sketch of such an analysis follows the references below), and how it could be implemented in future dashboards to support students’ self-regulated learning development in a personalised manner. With this presentation, we hope to gain insight into other potential inter- and intraindividual differences that can serve as focal points for personalisation aimed at improving the adoption of student-facing dashboards across diverse educational contexts. We would also appreciate insights into which other aspects of dashboards need personalisation to allow for adoption across different courses and universities.
References
Kaliisa, R., Misiejuk, K., López-Pernas, S., Khalil, M., & Saqr, M. (2024). Have learning analytics dashboards lived up to the hype? A systematic review of impact on students’ achievement, motivation, participation and attitude. Proceedings of the 14th Learning Analytics and Knowledge Conference, 295–304. https://doi.org/10.1145/3636555.3636884
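To make the planned moderation analysis concrete, here is a minimal, hypothetical sketch in Python (pandas and statsmodels). The variables and data are invented for illustration; they are not the study’s actual measures.

```python
# Hypothetical sketch of the kind of moderation analysis the abstract describes:
# does a student's self-regulated learning (SRL) skill change how useful a given
# dashboard element is perceived to be? All names and values are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per student x dashboard-element rating.
data = pd.DataFrame({
    "student":    [1, 1, 2, 2, 3, 3],
    "element":    ["time_on_lms", "grade_reflection"] * 3,
    "srl_skill":  [3.2, 3.2, 4.5, 4.5, 2.1, 2.1],  # self-reported SRL score
    "usefulness": [4, 2, 3, 5, 2, 2],              # perceived usefulness rating
})

# The element x SRL interaction captures "for whom" each element works:
# a reliable interaction means an element's usefulness depends on SRL skill.
model = smf.ols("usefulness ~ C(element) * srl_skill", data=data).fit()
print(model.summary())
```

In a real observational study, the same logic would extend to other individual differences (e.g., course engagement) and to mixed-effects models that account for repeated ratings per student.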
Miriam Fernandez
Universities are increasingly adopting data-driven strategies to enhance student success, with AI applications such as Learning Analytics (LA) and Predictive Learning Analytics (PLA) playing a key role in identifying at-risk students, personalising learning, supporting teachers, and guiding educational decision-making. However, concerns are rising about the potential harms these systems may pose, such as algorithmic biases leading to unequal support for minority students. While many have explored the need for Responsible AI in LA, existing works often lack practical guidance on how institutions can operationalise these principles. In this talk, we will present a novel Responsible AI framework tailored specifically to LA in Higher Education (HE). We started by mapping 11 established Responsible AI frameworks, including those by leading tech companies, to the context of LA in HE. This led to the identification of seven key principles, such as transparency, fairness, and accountability. We then conducted a systematic review of the literature to understand how these principles have been applied in practice. Drawing on these findings, we present a novel framework that offers practical guidance to HE institutions and is designed to evolve with community input, ensuring its relevance as LA systems continue to develop.
Jihee Hwang, Yeonji Jung, and Junghwan Kim
This presentation explores case studies on the perceptions, opportunities, and challenges of adopting Learning Analytics (LA) in two large public universities in the U.S. Each case focuses on the experiences of introducing an LA course within an academic program centered on adult education, lifelong learning, and human resource development. Across the cases, a notable challenge was the narrow view of LA as solely a branch of statistics or research methods, limiting its recognition as an interdisciplinary field that integrates learning theories, data, and practical applications. The discussion also highlights the foundational skills necessary for effective LA adoption and dashboard use in higher education, including data literacy, numeracy, and the critical interpretation of measurements and constructs. Drawing on the presenters’ combined expertise as practitioners, educators, and scholars in higher education administration, this session offers insights into the readiness and perceived value of LA adoption from diverse stakeholder perspectives. Strategies for successfully integrating LA into academic programs and institutional practices will also be shared.
Zexuan Chen (Serlina)
Learning Analytics Dashboards (LADs) have seen widespread adoption in higher education [1], but their application in peer assessment remains relatively underexplored. This study seeks to address this gap by presenting a design-based research project focused on the design and development of PeerGrader, a peer assessment system featuring an integrated LAD. This dashboard visualizes and tracks both assessors’ and assessees’ learning performance. Furthermore, it illustrates interactions with peer work across varying proficiency levels throughout the feedback process. Grounded in theories of peer assessment and self-regulated learning (SRL), and in principles of learning analytics, this project was implemented with five student cohorts (N=346) in a higher education EFL course in China. Two peer assessment tasks were carried out over a two-month period. In the first task, participants completed the peer assessment without access to the LAD; in the second task, the LAD was introduced. Data were collected through PeerGrader logs, screen recordings, questionnaires, retrospective interviews, and qualitative feedback. Preliminary findings suggest that the LAD significantly enhanced assessors’ ability to monitor their behavioral interactions, reflect on their cognitive processes, and co-construct feedback with GenAI, thereby facilitating more informed decisions about their learning. In addition, challenges and opportunities related to the integration of LADs in peer assessment were identified, shedding light on critical factors for scaling and improving the system’s functionality. These insights are valuable not only for refining PeerGrader, but also for advancing the adoption of LADs in peer assessment contexts.
[1] Kaliisa, R., Misiejuk, K., López-Pernas, S., Khalil, M., & Saqr, M. (2024). Have learning analytics dashboards lived up to the hype? A systematic review of impact on students’ achievement, motivation, participation and attitude. Proceedings of the 14th Learning Analytics and Knowledge Conference, 295–304. https://doi.org/10.1145/3636555.3636884
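As an illustration of the interaction tracking described above, below is a minimal, hypothetical Python sketch of a peer-assessment event log and one dashboard roll-up. The field names are assumptions for illustration, not PeerGrader’s actual schema.

```python
# Hypothetical sketch of the kind of interaction log a peer-assessment LAD
# might record; all field names and values are illustrative.
from dataclasses import dataclass
from collections import Counter

@dataclass
class PeerEvent:
    assessor_id: str     # student giving feedback
    assessee_id: str     # student whose work is being reviewed
    assessee_level: str  # proficiency band of the reviewed work, e.g. "high"/"low"
    action: str          # e.g. "view", "comment", "rate"

log = [
    PeerEvent("s01", "s12", "high", "view"),
    PeerEvent("s01", "s12", "high", "comment"),
    PeerEvent("s01", "s07", "low", "view"),
]

# One possible dashboard panel: how an assessor's attention is distributed
# across peer work of different proficiency levels during the feedback process.
by_level = Counter((e.assessor_id, e.assessee_level) for e in log)
for (assessor, level), n in sorted(by_level.items()):
    print(f"{assessor}: {n} interaction(s) with {level}-proficiency work")
```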
Amir Winer and Nitza Geri
Learning Analytics Dashboards (LADs) hold immense potential to enhance learning processes in higher education by providing actionable insights to faculty. However, achieving widespread adoption among faculty has proven challenging. A survey aimed at identifying usage barriers of traditional business-intelligence LADs revealed three primary concerns: lack of time, not knowing what to change, and the feeling that students are responsible for their own learning.
To address these concerns, efforts focused on making dashboards "invisible" through two strategies. First, dashboards were seamlessly integrated into the main interface of the Virtual Learning Environment (VLE). This approach ensured that LADs were no longer separate tools requiring additional effort but part of the natural workflow for instructors and students. Second, basic intervention plans were automated: alerts triggered for students and faculty included specific calls to action, with prescribed pedagogical suggestions tied to a predefined schedule (a sketch of this idea follows below). These measures simplified the process and aligned data-informed interventions with the semester timeline.
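To make the automated intervention plans concrete, here is a minimal, hypothetical Python sketch of schedule-bound alert rules. The weeks, conditions, and messages are invented and do not reflect the actual VLE implementation.

```python
# Hypothetical sketch of "invisible" automated interventions: each rule pairs a
# trigger condition with a prescribed call to action at a point in the semester.
from dataclasses import dataclass
from typing import Callable

@dataclass
class AlertRule:
    week: int                          # position on the semester timeline
    condition: Callable[[dict], bool]  # trigger evaluated on a student's VLE data
    action: str                        # the concrete call to action sent out

rules = [
    AlertRule(2, lambda s: s["logins"] == 0,
              "No activity yet: send the 'getting started' checklist."),
    AlertRule(5, lambda s: not s["assignment_submitted"],
              "Assignment 1 missing: suggest the revision workshop."),
]

student = {"logins": 0, "assignment_submitted": False}
current_week = 2

# Fire only alerts whose scheduled week has arrived and whose condition holds,
# so faculty receive a ready-made intervention instead of raw dashboard numbers.
for rule in rules:
    if rule.week <= current_week and rule.condition(student):
        print(f"Week {rule.week}: {rule.action}")
```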
While these interventions addressed the first two concerns, the third challenge remains deeply rooted in faculty perceptions of their role in students' learning. To tackle this, we suggest moving forward with a generative-AI-based Learning Analytics Conversational Agent that directly supports students. This will allow students to independently plan their learning, providing tailored suggestions and bypassing the need for faculty-led interventions. By empowering students and reducing the pressure on faculty to monitor and act, the system balances personalized and open support, autonomy, and guidance.
This layered approach demonstrates a progressive vision, shifting away from traditional LADs focused on faculty and student efficiency. The ongoing challenge remains to harmonize faculty participation with technological solutions, ensuring both groups benefit equitably from the insights LADs encapsulate.
Claudia Ruhland
Learning analytics (LA) offers universities significant potential to personalize learning, promote academic success, and provide students with targeted support. However, a key challenge remains: obtaining students' consent to the use of their data. Not only data protection concerns but also a lack of trust, and especially a perceived lack of personal benefit, often lead to low consent rates. To address this issue, this study applies the user-gratification approach to the design of learning analytics dashboards in order to motivate students to consent.
Prior to dashboard development, a survey was conducted among students to determine their preferences for the information displayed. The results showed that students wanted personalized messages from teachers, individually set deadlines, notifications on course obligations, assessments obtained, and reminders for pre-class activities. These findings will be integrated into the design of the dashboard to ensure maximum perceived benefit. The effectiveness of the dashboard will be tested in a user experience study. Based on the insights gained, an exemplary dashboard will be developed and integrated into the consent process in order to clearly illustrate the purpose of data processing. The aim is to address students' information needs, strengthen their trust, and increase acceptance of data processing. Qualitative surveys will be used to investigate the extent to which the perceived relevance and individual benefits of the dashboard were decisive factors for consent.
The study will show the extent to which learning analytics dashboards can function not only as analytical tools but also as instruments for promoting trust and acceptance. The user-gratification approach provides a valuable basis for communicating clear added value to students and encouraging participation. This article discusses the implications for the design of future LA systems and the importance of transparency and user-centricity in addressing data protection challenges.
Bart Rienties