Multimodal analytics interfaces for simulation-based scenarios

Using different sensors and equipment (e.g. cameras, an indoor localisation system, and microphones), we aim to collect data from healthcare simulation scenarios in a Bachelor of Nursing program. These data would allow us to create multimodal learning analytics interfaces to support teachers and students during debrief sessions. Read more...

Towards communicating insights using exploratory visualisations

The aim of this project is to automatically extract insights from complex data and translate them into explanatory visualisations using data storytelling techniques. Read more...

DBCollab: automated feedback for face-to-face database design

In this project, we developed a system that provides automated feedback to groups working together on the conceptual design of a database. The system is composed of a multitouch interactive tabletop, a 360° camera, a Kinect depth camera and a microphone array, and captures different modalities of communication along with the click streams. Read more...

Automated feedback of oral presentation skills

In this project, we captured multimodal data to build a model that predicts communication skills during oral presentations. Data from a Kinect depth camera, microphones and PowerPoint presentations were analysed to create the predictive model. Read more...