This session will be delivered by Hashini Senaratne and Brandon Matthews.
This session will provide tutorial participants with knowledge of useful multimodal sensors, features, and pre-processing techniques for estimating different user states (e.g., cognitive load, stress, awareness, and intention). Such estimates can be used to measure experimental outcomes or serve as input to human-robot collaborative (HRC) systems to improve collaboration. Various data modalities will be discussed, including physiological data such as cardiac and electrodermal activity, and behavioural data such as vocal activity, body movements, eye movements, and response time. We will also explore state-of-the-art sensors suited to remote, in-situ, and XR-based HRI contexts. Alongside the theoretical components, there will be a practical feature extraction activity using publicly available datasets and a demo showcasing XR+HRI data collection using free and open-source tools.
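As a taste of the kind of feature extraction covered in the hands-on activity, the minimal sketch below computes two standard time-domain heart-rate-variability features (SDNN and RMSSD) from a series of inter-beat intervals. It is illustrative only and not part of the tutorial materials; the function name and the synthetic data are hypothetical.

```python
import numpy as np

def hrv_features(ibi_ms: np.ndarray) -> dict:
    """Compute SDNN and RMSSD, two standard time-domain HRV features,
    from inter-beat intervals given in milliseconds."""
    diffs = np.diff(ibi_ms)  # successive inter-beat differences
    return {
        "sdnn_ms": float(np.std(ibi_ms, ddof=1)),         # overall variability
        "rmssd_ms": float(np.sqrt(np.mean(diffs ** 2))),  # beat-to-beat variability
    }

# Example with synthetic inter-beat intervals around 800 ms (~75 bpm)
rng = np.random.default_rng(0)
ibis = 800 + rng.normal(0, 30, size=120)
print(hrv_features(ibis))
```

Features like these are commonly linked to states such as stress and cognitive load, which is why they appear among the candidate inputs to HRC systems discussed in the session.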
10.00 am - 10.30 am: Useful physiological and behavioural features
10.30 am - 11.00 am: Break
11.00 am - 11.30 am: Sensors suited for different contexts
11.30 am - 12.00 pm: Preprocessing and synchronisation techniques (see the sketch after this schedule)
12.00 pm - 12.30 pm: Hands-on user modelling activity and XR demo
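To illustrate one simple synchronisation approach relevant to the 11.30 am slot, the sketch below resamples two sensor streams with independent clocks and sampling rates onto a shared uniform timeline using linear interpolation. The sensor names, rates, and data are assumptions for illustration, not the tutorial's actual pipeline.

```python
import numpy as np

def resample_to_common_clock(t_a, x_a, t_b, x_b, rate_hz=32.0):
    """Linearly interpolate two timestamped streams onto a shared
    uniform timeline covering only their overlapping window."""
    t0 = max(t_a[0], t_b[0])    # start of the overlapping window
    t1 = min(t_a[-1], t_b[-1])  # end of the overlapping window
    t = np.arange(t0, t1, 1.0 / rate_hz)
    return t, np.interp(t, t_a, x_a), np.interp(t, t_b, x_b)

# e.g., a 64 Hz EDA stream and a 128 Hz heart-rate stream with offset clocks
t_eda = np.arange(0.0, 10.0, 1 / 64); eda = np.sin(t_eda)
t_hr = np.arange(0.5, 9.5, 1 / 128);  hr = 70 + np.cos(t_hr)
t, eda_rs, hr_rs = resample_to_common_clock(t_eda, eda, t_hr, hr)
```

Aligning streams to a common clock like this is a typical prerequisite before multimodal features can be combined into a single user-state estimate.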