Teachers are the Learners: Providing Automated Feedback on Classroom Interpersonal Dynamics (NSF Cyberlearning #1822768)
This is an NSF-funded project (Cyberlearning #1822768, PI: Jacob Whitehill) on how machine learning and computer vision can be harnessed to automatically characterize the interpersonal dynamics between students and teachers from videos of school classrooms, and how machine perception can deliver new training experiences for both pre-service and in-service teachers. The project is an interdisciplinary collaboration spanning multi-modal machine learning, data visualization, classroom observation, and teacher training.
Worcester Polytechnic Institute (WPI)
University of Virginia (UVA)
The quality of teacher-student interactions in school classrooms both predicts and impacts students’ learning outcomes. Training teachers to perceive subtle interactions and interpersonal classroom dynamics more accurately can help them implement more effective interactions in their own classrooms. Contemporary methods of training teachers to understand classroom interactions are based mostly on watching classroom observation videos of other teachers, annotated along the dimensions (“positive climate”, “teacher sensitivity”, etc.) of an observation protocol such as the widely used Classroom Assessment Scoring System (CLASS; Pianta et al. 2008). Only rarely do teachers receive personalized feedback on their own classroom interactions, and when they do, the feedback is sparse (typically a single code per 15-minute video segment) and does not explain how each segment was scored.

To provide more temporally specific, more densely annotated, and more efficient feedback on teachers’ own classroom observation sessions, we propose to develop an Automatic Classroom Observation Recognition neural Network (ACORN) that extracts and integrates multimodal features of facial expression, eye gaze, auditory emotion, speech, and language to assess classroom dynamics automatically. ACORN will be trained on two CLASS-coded classroom observation datasets, collected by the University of Virginia (UVA), of hundreds of pre-school and elementary school teachers across the USA.

Building on the ACORN prototype, we will also develop a Classroom Observation Interactive Learning System (COILS) that trains teachers to perceive classroom dynamics more precisely. COILS will be evaluated in a study with 50 pre-service teachers at UVA’s Curry School of Education.
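To make the idea of multimodal fusion concrete, the sketch below shows one simple way such a model could be structured in PyTorch: per-segment feature vectors from each modality (face, gaze, audio emotion, language) are encoded separately, concatenated, and regressed onto per-dimension CLASS scores. The class name `MultimodalClassroomScorer`, the feature dimensions, and the layer sizes are illustrative assumptions for this sketch, not the actual ACORN architecture.

```python
# Minimal sketch (not the project's actual architecture) of a multimodal
# fusion network: fixed-length per-segment feature vectors for face, gaze,
# audio emotion, and language are encoded separately, concatenated, and
# regressed onto one score per CLASS dimension. All sizes are illustrative.
import torch
import torch.nn as nn


class MultimodalClassroomScorer(nn.Module):
    def __init__(self, face_dim=128, gaze_dim=8, audio_dim=64, text_dim=768,
                 hidden_dim=256, num_class_dims=10):
        super().__init__()
        # One small encoder per modality.
        self.face_enc = nn.Sequential(nn.Linear(face_dim, hidden_dim), nn.ReLU())
        self.gaze_enc = nn.Sequential(nn.Linear(gaze_dim, hidden_dim), nn.ReLU())
        self.audio_enc = nn.Sequential(nn.Linear(audio_dim, hidden_dim), nn.ReLU())
        self.text_enc = nn.Sequential(nn.Linear(text_dim, hidden_dim), nn.ReLU())
        # Fusion head: concatenate encoded modalities and predict one
        # continuous score per CLASS dimension (e.g., positive climate,
        # teacher sensitivity).
        self.head = nn.Sequential(
            nn.Linear(4 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_class_dims),
        )

    def forward(self, face, gaze, audio, text):
        fused = torch.cat([
            self.face_enc(face),
            self.gaze_enc(gaze),
            self.audio_enc(audio),
            self.text_enc(text),
        ], dim=-1)
        return self.head(fused)


if __name__ == "__main__":
    model = MultimodalClassroomScorer()
    batch = 4  # e.g., four 15-minute segments, each summarized as fixed-length features
    scores = model(torch.randn(batch, 128), torch.randn(batch, 8),
                   torch.randn(batch, 64), torch.randn(batch, 768))
    print(scores.shape)  # torch.Size([4, 10])
```

Late fusion by concatenation is only the simplest baseline; a system like ACORN could instead model the temporal structure within a segment (e.g., with recurrent or attention-based layers) or weight the modalities adaptively.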