Extended reality (xR) and telepresence applications are on the rise. Existing experiences, however, are not fully immersive, as only two senses (audio-visual) are typically stimulated. In this talk, I will describe our ongoing work on developing multisensory immersive experiences, which combine auditory, visual, olfactory, and haptic/somatosensory stimuli. I will show the impact that stimulating more senses can have on user quality of experience, sense of presence and immersion, and engagement levels. Moreover, with multisensory experiences, monitoring these so-called human influential factors is crucial, as the perception of sensory stimuli can be very subjective (e.g., while a smell can be pleasant for some, it can be unpleasant for others). To this end, I will also describe our work on instrumenting virtual reality headsets with biosensors to allow not only for automated (remote) monitoring of human behaviour and tracking of human influential factors, but also for the development of new markers of user experience, such as multimodal metrics of time perception and cybersickness. Lastly, I will describe some new applications of multisensory experiences being developed for healthcare, including the use of immersive multisensory nature walks for patients diagnosed with post-traumatic stress disorder and multisensory priming for motor-imagery-based neurorehabilitation for stroke survivors.
Tiago H. Falk is a Full Professor at the Institut national de la recherche scientifique (INRS), University of Québec, Canada. He obtained his BSc from the Federal University of Pernambuco (Brazil) and his MSc and PhD from Queen's University (Canada), all in Electrical and Computer Engineering. Prof. Falk is Founder and Director of the Multisensory Signal Analysis and Enhancement (MuSAE) Lab, which is focused on building next-generation human-machine interfaces for both real and virtual worlds. Since 2023, he has also been Co-Director of the INRS-UQO Research Centre on Cybersecurity and Digital Trust, where research is being conducted to make human-machine interfaces secure and reliable by tackling emerging vulnerabilities in artificial intelligence algorithms. He is currently Co-Chair of the IEEE Future Directions Initiative on Telepresence, Co-Chair of the Technical Committee on Brain-Machine Interface Systems of the IEEE Systems, Man, and Cybernetics Society (SMCS), Member-at-Large of the SMCS Board of Governors, and Editor-in-Chief of the SMCS eNewsletter. He is a Fellow of the IEEE and the AAIA.
Robotic teleoperation is widely employed at CERN for inspection and maintenance tasks within the accelerator tunnels, reducing the time humans spend in hazardous environments and, in certain cases, performing tasks humans are unable to complete. Visual, haptic, and audio feedback are used depending on the task, and the control framework can adapt to the complexity of the environment and the quality of the communication link. The robots are controlled via the CERN Robotic Framework, and the operator interface is provided by the CERN Robotic GUI, a multi-modal interface. This talk will present the building blocks of both frameworks that enable complex manipulation, highlight particular interventions and the many lessons learned over more than ten years of operations, and present current directions of research for future applications.
Eloise Matheson is a robotics engineer at CERN, focused on mechatronic systems design, integration, and testing, as well as the teleoperation of robots used regularly in the accelerator complex for inspection, maintenance, and repair tasks. Previously, she worked on teleoperation for space applications as an engineer at ESA, and later on medical (surgical) robotics at Imperial College London, where she completed her PhD. She is currently a co-chair of the EU Robotics Telerobotics and Teleoperation Working Group and a co-chair of the IEEE RAS Technical Committee on Robotics for Nuclear Environments.
To enable effective control in robotics and telemanipulation and to achieve a sense of telepresence, human–machine interfaces must provide both dexterous control and high-fidelity sensory feedback. In this talk, we present a wearable system recently developed to establish a high-bandwidth, bidirectional connection between a user and external devices. The system, NeuraLoop, integrates 32 channels for electromyography (EMG) recording and 32 channels for electrotactile stimulation. EMG signals are used to detect user motion intentions and translate them into device commands using machine-learning techniques, while electrotactile stimulation delivers spatially distributed, high-resolution tactile feedback. We demonstrate how multichannel electrotactile stimulation can facilitate embodiment in virtual reality and how EMG decoding enables the detection of micro-gestures. Finally, we showcase the use of NeuraLoop for closed-loop control and discuss perspectives for its further development and applications.
Strahinja Dosen received the Diploma of Engineering in Electrical Engineering and the M.Sc. degree in Biomedical Engineering in 2000 and 2004, respectively, from the Faculty of Technical Sciences, University of Novi Sad, Serbia, and the Ph.D. degree in Biomedical Engineering from the Center for Sensory-Motor Interaction, Aalborg University, Denmark, in 2009. From 2011 to 2017, he worked as a Research Scientist at the Institute for Neurorehabilitation Systems, University Medical Center Göttingen, Germany, and then as an Associate Professor at the Department of Health Science and Technology (HST), Aalborg University (AAU). Currently, he is a Full Professor in the same department, where he leads a research group on Neurorehabilitation Systems. Prof. Dosen was a principal investigator for AAU and HST in several EU-funded (Tactility, Wearplex, Sixthsense, SimBionics, and Growt5) and nationally funded (Robin, Remap, Climb, NeuroMate) projects. He has published more than 130 manuscripts in peer-reviewed journals, and his main research interest is the closed-loop control of assistive robotic systems.