Integrating Multi-channel Learning Data to Model Complex Learning Processes

This workshop focuses on integrating multi-channel learning data to model complex learning processes across tasks, domains, and contexts.

Presentations will address the workshop topics, with ample time for discussion. Participants will collaboratively articulate models of self-regulated learning (SRL) for existing learning systems (e.g., MetaTutor, BioWorld, Betty’s Brain, nSTUDY, Crystal Island, MOOCs), examine issues in collecting and analyzing multimodal, multichannel data, and consider the implications for learning analytics.

Particular focus will be on the technology environment required to integrate multimodal data. Key challenges include time-stamping data from different sources (e.g., log files and psychophysiological sensors) that relate to psychological processes measurable only at different rates (for example, boredom has a longer psychological window of measurement than surprise does), and matching those data sources to specific psychological processes. Follow-up discussion will address transfer with respect to learning analytics, self-regulated learning processes, and the design of data visualizations to model, foster, and support learners’ self-regulated learning.
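To make the time-stamping challenge concrete, the sketch below aligns events from two channels on a shared clock and slices them by process-specific measurement windows. All channel names, sampling rates, and window sizes are illustrative assumptions, not values from any particular system; the point is only that a process with a long window (boredom) and one with a short window (surprise) see very different slices of the same integrated stream.

```python
# Hypothetical sketch: aligning multi-channel learning data on a shared
# clock, where each psychological process has a different measurement
# window (e.g., boredom is judged over a longer span than surprise).
# All names, rates, and window sizes below are illustrative assumptions.
from dataclasses import dataclass

# Per-process measurement windows in seconds (assumed values).
WINDOWS = {"surprise": 2.0, "boredom": 30.0}

@dataclass
class Event:
    t: float       # timestamp in seconds on the shared clock
    channel: str   # e.g., "log" or "physio"
    value: float

def samples_in_window(events, process, t_end):
    """Return events from all channels that fall inside the
    measurement window for `process` ending at time `t_end`."""
    start = t_end - WINDOWS[process]
    return [e for e in events if start <= e.t <= t_end]

# Simulated streams: sparse log-file events plus a denser 2 Hz
# psychophysiological signal covering 0-29.5 s.
events = (
    [Event(t, "log", 1.0) for t in (5.0, 12.0, 28.0)]
    + [Event(t * 0.5, "physio", 0.1) for t in range(60)]
)

# At t = 29 s, surprise integrates only the last 2 s of data, while
# boredom integrates over the last 30 s.
surprise_slice = samples_in_window(events, "surprise", 29.0)
boredom_slice = samples_in_window(events, "boredom", 29.0)
```

Once both channels share one clock, the same slicing function serves every process; only the window length changes, which keeps the integration logic uniform across constructs.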

In small-group sessions, participants will brainstorm novel methods and analytical techniques for evaluating cognitive, affective, metacognitive, motivational, and social processes during human-machine interactions, along with their implications for learning analytics and data visualizations. Interactive demos of existing systems and other prototypes will also be provided, focusing on the methods used to collect multimodal, multichannel SRL data and on data visualizations.

Finally, dedicated time will be set aside to discuss opportunities and pathways for cross-institutional and industry collaborations.