Conceptual Framing

Background / Framing 

The intent to offer insights into learning that are “actionable” has been a core tenet of learning analytics from the field’s inception (Siemens, 2013). That this goal of making an impact on practice remains more aspirational than realized has been noted in multiple papers, both those examining prior impact (Ferguson et al., 2016) and those considering how the situation might be rectified (Dimitriadis et al., 2021). Increased attention to developing learning analytics that are not only technically rigorous but also usable by teachers, students, and other educational stakeholders to improve learning can be seen as part of a larger move toward human-centered learning analytics, which takes people's situations, needs, and goals as the starting point (Buckingham Shum, Ferguson, & Martinez-Maldonado, 2019). For example, recent work has started to bring actionability to the forefront, framing it as a fundamental question for impactful learning analytics research (Dimitriadis et al., 2021; Jung et al., 2023).

While the importance of the actionability of learning analytics may be widely acknowledged, the knowledge base for improving it remains relatively sparse and disconnected. This may be because many researchers do not have the opportunity to concretely address actionability in authentic learning situations, and because those who do have approached the challenge in quite different ways. For example, harkening back to the “actionable insights” language of business analytics, some have focused on the effective presentation of useful information that can be acted upon to improve learning (Susnjak, Ramaswami, & Mathrani, 2022). From this perspective, there is growing interest in tool design that ensures metrics are not only informative but also motivational, prompting particular actions (Dimitriadis et al., 2021). Work taking this view often emphasizes human-centered design, involving stakeholders in the design process to identify the actual needs and preferences that inform design decisions (Wiley, Dimitriadis, & Linn, 2023). A related approach to actionability involves embedding actions, or links to action, in analytics tools; for example, pre-written instructor messages can be programmed to be sent automatically to students whose activity in an online system, as tracked by the analytics, reaches certain levels (Pardo, Jovanovic, Dawson, Gašević, & Mirriahi, 2019).

In contrast to the technical perspectives described above, other work has emphasized the social aspects of actionability, considering end users' routines and how analytics are integrated into their practices. From these perspectives, actionability depends not only on the types of information provided and visualization cues but also on broader social systems of activity, taking into consideration factors such as teacher orchestration and student self-regulated learning (Amarasinghe et al., 2022; Klein et al., 2019). In this view, actionability is not simply a property of the analytics but also of the larger learning system into which they are embedded. This allows for a broader perspective on the impact of analytics in teaching and learning activities, encompassing both the direct behaviors and decisions based on analytics and the more holistic or implicit ways that analytic information can feed into how these systems operate (Wise & Jung, 2019).

Based on this literature synthesis, we frame the concept of actionability in learning analytics from different stakeholders' perspectives: (1) technical aspects (for technology designers and developers), (2) information presentation (for designers), (3) self-regulated learning (for learners), and (4) classroom orchestration (for teachers). This set of perspectives will be used as a starting point for discussing the concept of actionability throughout the workshop.

References

Amarasinghe, I., Michos, K., Crespi, F., & Hernández‐Leo, D. (2022). Learning analytics support to teachers' design and orchestrating tasks. Journal of Computer Assisted Learning. https://doi.org/10.1111/jcal.12711

Buckingham Shum, S., Ferguson, R., & Martinez-Maldonado, R. (2019). Human-centred learning analytics. Journal of Learning Analytics, 6(2), 1-9.

Dimitriadis, Y., Martínez-Maldonado, R., & Wiley, K. (2021). Human-centered design principles for actionable learning analytics. In Research on e-learning and ICT in education: Technological, pedagogical and instructional perspectives (pp. 277-296).

Ferguson, R., Brasher, A., Clow, D., Cooper, A., Hillaire, G., Mittelmeier, J., Rienties, B., Ullmann, T., & Vuorikari, R. (2016). Research evidence on the use of learning analytics: Implications for education policy.

Jung, Y., Sarmiento, J. P., & Wise, A. F. (2023, March). Designing for analytic actionability: Temporality and plurality as strategies for human-centered learning analytics. In Companion Proceedings of LAK’23 (pp. 168-170). ACM. 

Klein, C., Lester, J., Nguyen, T., Justen, A., Rangwala, H., & Johri, A. (2019). Student sensemaking of learning analytics dashboard interventions in higher education. Journal of Educational Technology Systems, 48(1), 130-154.

Pardo, A., Jovanovic, J., Dawson, S., Gašević, D., & Mirriahi, N. (2019). Using learning analytics to scale the provision of personalised feedback. British Journal of Educational Technology, 50(1), 128-138.

Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380-1400.

Susnjak, T., Ramaswami, G. S., & Mathrani, A. (2022). Learning analytics dashboard: A tool for providing actionable insights to learners. International Journal of Educational Technology in Higher Education, 19(1), 12.

Wiley, K., Dimitriadis, Y., & Linn, M. (2023). A human‐centred learning analytics approach for developing contextually scalable K‐12 teacher dashboards. British Journal of Educational Technology. https://doi.org/10.1111/bjet.13383

Wise, A. F., & Jung, Y. (2019). Teaching with analytics: Towards a situated model of instructional decision-making. Journal of Learning Analytics, 6(2), 53-69.