 Workshop on Utilizing EEG Input in Intelligent Tutoring Systems
 June 5 @ Honolulu, Hawaii
 W5 (HL 3G -FD)
 12th International Conference on Intelligent Tutoring Systems Workshops

Overview

The ultimate intelligent tutoring system could peer directly into students' minds to identify their mental states (e.g., engagement, cognitive load, competencies, intentions) and decide accordingly what and how to teach at each moment. Recent advances in brain imaging technologies have led to several portable EEG headsets that are commercially available and show promise for use in intelligent tutoring systems.


The EEG signal is a voltage signal measured on the surface of the scalp, arising from large areas of coordinated neural activity manifested as synchronization (groups of neurons firing at the same rate). This neural activity varies as a function of development, mental state, and cognitive activity, and such variation is measurable in the EEG signal. Using signals recorded from low-cost, portable EEG devices, Chang et al. trained machine learning classifiers to detect reading difficulty in an intelligent tutoring system (Chang, Nelson, Pant, & Mostow, 2013), student confusion while watching course material (Wang, 2013), and user frustration while using a spoken dialog interface (Sridharan, Chen, Chang, & Rudnicky, 2012). Frasson et al. also used EEG to model learners' reactions in ITS (Blanchard, Chalfoun, & Frasson, 2007), detect learners' emotions (Heraz & Frasson, 2007), assess learners' attention (Derbali, Chalfoun, & Frasson, 2011), and, more recently, show that subliminal cues were cognitively processed and had a positive influence on learners' performance and intuition (Chalfoun & Frasson, 2012; Jraidi, Chalfoun, & Frasson, 2012). Szafir and Mutlu demonstrated that ARTFul, an adaptive review technology for flipped learning that monitors students' attention during educational presentations and adaptively reviews lesson content, can improve student recall by 29% in less time (Szafir & Mutlu, 2013). Azcarraga and Suarez used a combination of EEG brainwaves and mouse behavior to predict the level of academic emotions, such as confidence, excitement, frustration, and interest (Azcarraga & Suarez, 2012). These early results show promise for augmenting intelligent tutoring systems with EEG signals.
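
To give a rough sense of this style of decoding, the sketch below trains a simple classifier on per-sentence EEG band-power features. It is a minimal illustration, not the pipeline from any of the cited papers; the feature matrix and labels are placeholders that a real study would replace with recorded data.

# Minimal sketch: classify a mental state (e.g., "difficult" vs. "easy" reading)
# from EEG band-power features. Placeholder data; not the cited authors' pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical features: one row per sentence, one column per EEG band
# (e.g., delta, theta, alpha, beta, gamma power averaged over the sentence).
X = rng.normal(size=(500, 5))        # replace with real band-power features
y = rng.integers(0, 2, size=500)     # replace with real labels (0 = easy, 1 = difficult)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))

Any linear or non-linear classifier could stand in for the logistic regression here; the point is simply that mental-state detection reduces to supervised learning once EEG features are aligned with labeled learning events.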


Advances in EEG-ITS require close collaboration among education researchers, machine learning scientists, and computational neuroscientists. To this end, an interdisciplinary workshop can play a key role in advancing existing research and initiating new research. Our workshop is the first of its kind to be held at the ITS conference. We hope that it will attract an interdisciplinary audience of researchers in education, machine learning, and neuroscience.


Topics of Interest
As an unobtrusive measure, EEG has the potential to provide continuous, data-dense, longitudinal measures of mental states without disrupting the flow of instruction, practice, and learning. To the extent that brain operation during these processes is domain-independent, a wide variety of learning contexts can exploit mental state information. 
  • Education Focus
    • Identify cognitive states that are relevant to learning: knowledge state, workload, …
    • Identify affective states that affect learning outcome: emotion, attention, engagement, …
  • Machine Learning Focus
    • Decoding of cognitive states from EEG brain activity
    • Feature selection and data mining techniques for decoding cognitive states
  • Neuroscience Focus
    • Experimenting with different brain imaging techniques: from portable, consumer-friendly EEG devices to laboratory EEG, fMRI, MEG, NIRS devices


Important Dates

  • March 1, 2014: Call for Papers
  • March 20, 2014: Deadline for submission of workshop papers
  • March 31, 2014: *Extended* deadline for submission of workshop papers
  • April 20, 2014: Notification of acceptance
  • May 4, 2014: Send camera-ready papers to Kai-min Chang <kaimin.chang@gmail.com>
  • May 5, 2014: Camera-ready papers due
  • June 5, 2014: Workshop @ ROOM 3 (HL 3G -FD) in the Hamilton Library of the University of Hawaii.

Shared Dataset and Toolkit

Submissions based on any datasets or tasks are welcome, and originality of approach is encouraged. However, to assist researchers who are new to this topic, we are providing some EEG data, as well as a toolkit to help process it. We welcome exploratory or innovative submissions that reveal patterns of correspondence between learning models and brain activity.


This dataset consists of roughly three years of tutor usage data collected in vivo at a primary school. The tutor is Project LISTEN's Reading Tutor, and EEG was recorded with NeuroSky BrainBands. The Reading Tutor helps students learn how to read by listening (using automated speech recognition) to them read stories aloud. We annotated the time course of each reading session with the sentence that the student was reading. The EEG data were recorded using NeuroSky's API, which generates 12 channels: Rawwave, PoorSignal (signal quality), Delta, Theta, Alpha1, Alpha2, Beta1, Beta2, Gamma1, Gamma2, Attention, and Meditation. The dataset consists of roughly 169 hours of EEG recording and 200,000 sentences. Our toolkit is a re-implementation of the pipeline used in Chang et al. with some minor differences.
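
For orientation, here is a minimal sketch of how one might align this kind of channel data with the sentence annotations. The file names, column layout, and CSV format below are assumptions for illustration only, not a specification of the released dataset or toolkit.

# Hypothetical loader: align NeuroSky channel readings with sentence annotations.
# File names, columns, and CSV layout are assumptions for illustration only.
import pandas as pd

CHANNELS = ["Rawwave", "PoorSignal", "Delta", "Theta", "Alpha1", "Alpha2",
            "Beta1", "Beta2", "Gamma1", "Gamma2", "Attention", "Meditation"]

# One row per EEG sample: a timestamp plus the 12 NeuroSky channels.
eeg = pd.read_csv("eeg_samples.csv")            # columns: time, <CHANNELS>

# One row per sentence event: reading start/end times and the sentence text.
sentences = pd.read_csv("sentence_events.csv")  # columns: start, end, sentence

def features_for_sentence(row):
    """Average each channel over the interval in which the sentence was read."""
    window = eeg[(eeg["time"] >= row["start"]) & (eeg["time"] < row["end"])]
    return window[CHANNELS].mean()

# One feature vector per sentence, suitable as input to the kind of
# classifiers described in the Overview.
features = sentences.apply(features_for_sentence, axis=1)
print(features.head())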


Paper Submission
We invite position and research papers that advance the theory, methods, practice, and design of intelligent tutoring systems with portable EEG devices. We seek submissions of original (previously unpublished) research papers. Submitted papers should not exceed 6 pages in Springer LNCS format. Submission of previously published work is possible as well, but the authors are required to mention this explicitly. Previously published work can be presented at the workshop, but will not be included in the workshop proceedings (which are considered peer-reviewed publications of novel contributions). Authors are also welcome to present novel work but opt out of the workshop proceedings if they have alternative publication plans.

(Closed)
* To encourage submissions and improvements to the papers, we will keep the submission system open for 3-4 days after the deadline. However, in order to coordinate reviewing efforts, please submit a version by the March 31, 2014 deadline. Thanks! *

Proceedings

  • To appear


Schedule

There are four sessions (two in the morning and two in the afternoon). Each session consists of three presentations (20 min oral + 5 min discussion), plus 15 min for group discussion. The workshop will take place in ROOM 3 (HL 3G -FD) in the Hamilton Library of the University of Hawaii.


 09:00 - 10:30 Session 1
 Emotion
 Opening remarks

 Modelling EEG Signals for the Prediction of Academic Emotions
 Judith Azcarraga, Nelson Marcos, and Merlin Teodosia Suarez

 Exploring the Behavior of Novice Programmers’ EEG Signals for Affect-based Student Modeling
 Tita R. Herradura, Joel P. Ilao, and Merlin Teodosia C. Suarez

 Emotional Transitions in Driving
 Pierre Olivier Brosseau, Thi Hong Dung Tran, and Claude Frasson 
 10:30 - 11:00 Break 
 11:00 - 12:30 Session 2
 Learner profile, memory
 A Study of Learner’s Mental Profile in Different Categories of Tasks
 Ramla Ghali and Claude Frasson

 Classification of video game players using EEG and logistic regression with ridge estimator
 Gustavo A. Lujan-Moreno, Robert Atkinson, George Runger, Javier Gonzalez-Sanchez, and Maria Elena Chavez-Echeagaray

 Predicting subsequent memory from single-trial EEG
 Eunho Noh, Grit Herzmann, Tim Curran, and Virginia R. de Sa

 Group discussion
 12:30 - 13:30 Lunch 
 13:30 - 15:00 Session 3
 Open Toolkit and Dataset
 A Public Toolkit and ITS Dataset for EEG
 Yueran Yuan, Kai-min Chang, Yanbo Xu, and Jack Mostow

 EEG Helps Knowledge Tracing!
 Yanbo Xu, Kai-min Chang, Yueran Yuan, and Jack Mostow

 Extracting temporal EEG features with BCIpy
 Justis Peters, Sagar Jauhari, and Tiffany Barnes

 Group discussion
 15:00 - 15:30 Break 
 15:30 - 17:00 Session 4
 Other modalities
 An Exploration of Two Methods for using fMRI to Identify Student Problem Solving Strategies
 Caitlin Tenison and John R. Anderson

 Intelligent tutors exploiting novel sensing modalities for decoding students' attention
 Alvaro Soto, Felipe Orihuela-Espina, Diego Cosmelli, Cristian Alcholado, Patrick Heyer, and L. Enrique Sucar

 Smart headbands for monitoring functional brain activity
 James Dieffenderfer, Mychael Chance Bair, Justis Peters, Andrew Krystal, and Alper Bozkurt

 Closing discussion

Organizers

  • Kai-min Kevin Chang, Carnegie Mellon University, USA
  • Claude Frasson, University of Montreal, Canada


Program Committee

  • Judith Azcarraga, De La Salle University, Manila, Philippines
  • Tiffany Barnes, North Carolina State University, USA
  • Carole Beal, University of Arizona, USA
  • Pierre Chalfoun, University of Montreal, Canada
  • Maher Chaouachi, University of Montreal, Canada
  • Lotfi Derbali, University of Montreal, Canada
  • Karola Dillenburger, Queen's University Belfast, UK
  • Imène Jraidi, University of Montreal, Canada
  • Jack Mostow, Carnegie Mellon University, USA
  • Brian Murphy, Queen's University Belfast, UK
  • Bilge Mutlu, University of Wisconsin–Madison, USA
  • Daniel Szafir, University of Wisconsin–Madison, USA
  • Martin Talbot, Warner Bros, Canada
  • Merlin Teodosia Suarez, De La Salle University, Manila, Philippines
  • Yanbo Xu, Carnegie Mellon University, USA
