Proof-of-concept experience bringing video game characters into the real world on a holographic light field display. Research work presented at the ACM International Conference on Intelligent Virtual Agents. Paris, France, July 02 - 05, 2019. DOI: 10.1145/3308532.3329423
Proof-of-concept experience demonstrating interactive virtual characters for immersive mixed reality (MR) applications. Research work presented at the IEEE International Symposium on Mixed and Augmented Reality (ISMAR). Nantes, France, October 09-13, 2017. DOI: 10.1109/ISMAR-Adjunct.2017.45
In this project we developed a series of experiments aimed at investigating the emotional and physiological effects of virtual embodiment in immersive virtual environments.
In the first experiment we investigated the physiological effects of manipulating physiological feedback. In a second experiment we investigated how manipulating physiological feedback affects the perception of evocative stimuli.
We carried out an experiment to explore aspects of brain activity (surface EEG signals) when a virtual arm moves. The objective was to understand the relationship between real and virtual arm movements. Given that previous research has shown that mirror-neuron mechanisms are more active from a first-person perspective, our hypothesis was that virtual body ownership might increase those effects.
This study presents an evaluation of a mobile game with physiologically aware virtual humans as an approach to modulate the participant’s affective and physiological state. We developed a mobile version of a virtual reality scenario where the participants were able to interact with virtual human characters through their psychophysiological activity. Music was played in the background of the scenario and, depending on the experimental condition, the virtual humans were initially either barely dancing or dancing very euphorically. The task of the participants was to encourage the apathetic virtual humans to dance, or to calm down the frenetically dancing characters, through the modulation of their own mood and physiological activity. Results from our study show that by using this mobile game with the physiologically aware and affective virtual humans, the participants were able to emotionally arouse themselves in the Activation condition and to relax themselves in the Relaxation condition, during the same session with only a brief break between conditions. The self-reported affective data was also corroborated by the physiological data (heart rate, respiration and skin conductance), which significantly differed between the Activation and Relaxation conditions.
Hobby DIY project where I built a robot smart car using a Raspberry Pi.
I developed the mobile user interface for Android using Unity. Video streaming was implemented over UDP sockets.
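A single encoded video frame usually exceeds a safe UDP datagram size, so the stream has to be chunked on the Raspberry Pi and reassembled on the phone. Here is a minimal sketch of that idea in Python; the header layout and chunk size are illustrative assumptions, not the project's actual code:

```python
import struct

CHUNK = 1400  # payload bytes per datagram, kept below a typical MTU (assumption)

def packetize(frame_id, frame_bytes):
    """Split one encoded frame (e.g. a JPEG) into UDP-sized packets.
    Each packet carries a 6-byte header: frame id, chunk index, chunk count."""
    chunks = [frame_bytes[i:i + CHUNK] for i in range(0, len(frame_bytes), CHUNK)]
    total = len(chunks)
    return [struct.pack("!HHH", frame_id, i, total) + c for i, c in enumerate(chunks)]

def reassemble(packets):
    """Rebuild a frame from its packets, which may arrive out of order.
    Returns None if any chunk is still missing (the frame is dropped)."""
    if not packets:
        return None
    parts = {}
    for p in packets:
        frame_id, idx, total = struct.unpack("!HHH", p[:6])
        parts[idx] = p[6:]
    return b"".join(parts[i] for i in sorted(parts)) if len(parts) == total else None
```

Each packet would then go out with `sock.sendto(packet, addr)` and be collected on the receiving side until the frame is complete; late or incomplete frames are simply dropped, which suits live video better than retransmission.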
OpenCV was used for face detection, and pre-trained TensorFlow Lite models for object classification and detection.
We used respiration feedback to enhance the embodiment illusion in a virtual avatar.
Implemented inverse kinematics for VR characters used in a demo (foot IK, hip displacement, blend-shape animations for walking, upper-body IK).
Morph targets were used to convey micro-expressions driven by gaze tracking.
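To illustrate the foot-IK part: the core of a two-bone limb solver is an analytic law-of-cosines computation. The sketch below is a 2D simplification in Python (the demo itself ran on Unity characters); the function name and conventions are mine, not the project's:

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Analytic two-bone IK in 2D (e.g. thigh and shin for foot IK).
    Returns (root, knee): the root joint angle and the interior knee
    angle, in radians, clamping unreachable targets to the limb's range."""
    d = math.hypot(tx, ty)
    d = max(abs(l1 - l2), min(l1 + l2, d))  # clamp to the reachable annulus
    # Law of cosines gives the interior angle at the middle joint.
    cos_knee = (l1 * l1 + l2 * l2 - d * d) / (2 * l1 * l2)
    knee = math.acos(max(-1.0, min(1.0, cos_knee)))
    # Root angle: direction to the target minus the inner triangle angle.
    cos_inner = (l1 * l1 + d * d - l2 * l2) / (2 * l1 * d)
    inner = math.acos(max(-1.0, min(1.0, cos_inner)))
    root = math.atan2(ty, tx) - inner
    return root, knee
```

The end effector lands on the target by rotating the first bone by `root` and the second by a further `pi - knee`; a production solver would add a pole vector to control which way the knee bends.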
We used immersive virtual reality to induce a full body ownership illusion that allows offenders to be in the body of a victim of domestic abuse.
It has been shown that it is possible to induce a strong illusion that a virtual body (VB) is one’s own body. However, the relative influence of a first-person-perspective (1PP) view of the VB and spatial coincidence of the real body and VB remains unclear. We demonstrate a method that permits separation of these two factors. It provides a 1PP view of a VB, supporting visuomotor synchrony between real body and VB movements, but where the entire scene including the body is rotated 15° upwards through the axis connecting the eyes, so that the VB and real body are coincident only through this axis. In a within-subjects study that compared this 15° rotation with a 0° rotation condition, participants reported only slightly diminished levels of perceived ownership of the VB in the rotated condition and did not detect the rotation of the scene. These results indicate that strong spatial coincidence of the virtual and real bodies is not necessary for a full-body ownership illusion.
We developed a Respiratory Computer Interface (RCI) that enables the user to interact with third-party applications through respiratory measures.
In this study, real-time processing of physiological signals is presented as an alternative approach for natural and intuitive communication with computer applications. In particular, respiration measures are used as an innovative technique for game interaction. A respiratory-computer interface (RCI) has been designed and evaluated with the aid of an ad-hoc mini-game where the players participated in a race to blow up 3D virtual balloons. Participants evaluated attributes of the interface such as usability, learnability, satisfaction and immersion. The results show that the RCI can be very useful as a natural and engaging game interface.
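To give a flavour of how a respiration measure can drive the balloon race, here is a toy mapping from an airflow-like respiration trace to balloon size. The signal convention, threshold and gain are illustrative assumptions, not the RCI's actual parameters:

```python
def balloon_inflation(resp, exhale_threshold=0.2, gain=0.01):
    """Toy respiration-to-game mapping: samples above the threshold
    (treated as exhalation airflow) inflate the balloon in proportion
    to their strength; inhalation and rest leave it unchanged.
    Threshold and gain are made-up values for illustration."""
    size = 1.0  # starting balloon scale
    for sample in resp:
        if sample > exhale_threshold:
            size += gain * (sample - exhale_threshold)
    return size
```

In a real pipeline the trace would come from a respiration belt or airflow sensor sampled continuously, with the per-sample update running inside the game loop rather than over a stored list.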
In this project I led a group of undergraduate students in developing a biofeedback mobile game to assist breath-control training and encourage relaxation. Other key elements of the game included music and graphics.
The design of the mobile relaxation game was based on the research findings of Adhikari (2008) on relaxation and breath control, and built to the technical specification of the Respiratory Computer Interface by Arroyo-Palacios and Romano (2009).
We investigated how to recognize, in real time, emotions experienced during different types of media exposure (video clips, pictures and virtual environments) using physiological signals.
Different machine learning algorithms for pattern recognition and classification of physiological signals were tested (Linear Discriminant Analysis, Multilayer Perceptron Networks, Probabilistic Neural Networks and Support Vector Machines), with the most satisfactory results obtained from Probabilistic Neural Networks.
As a result of this research we developed a Bio-Affective Computer Interface (BACI) that facilitates the adaptation of third-party computer applications according to the emotional state of the user.
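A Probabilistic Neural Network is essentially a Parzen-window classifier: one Gaussian kernel per training exemplar, summed per class, with the prediction going to the class of highest estimated density. A minimal sketch of that idea (the feature values and class names below are made up, not the study's data):

```python
import math

def pnn_classify(train, labels, x, sigma=0.5):
    """Minimal Probabilistic Neural Network: place a Gaussian kernel of
    width sigma on every training exemplar, average the kernel responses
    per class, and predict the class with the highest average density."""
    sums, counts = {}, {}
    for xi, yi in zip(train, labels):
        sq_dist = sum((a - b) ** 2 for a, b in zip(xi, x))
        k = math.exp(-sq_dist / (2 * sigma ** 2))
        sums[yi] = sums.get(yi, 0.0) + k
        counts[yi] = counts.get(yi, 0) + 1
    return max(sums, key=lambda c: sums[c] / counts[c])
```

One reason PNNs suit physiological data is that there is no iterative training: adding a new labelled sample is just adding one more kernel, though the smoothing width `sigma` still has to be tuned.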
External ventricular drains (EVD) are widely used to measure and manage intracranial pressure (ICP) for aneurysmal subarachnoid hemorrhage (aSAH) patients. After several days of use, a decision is made to remove the drain or replace it with an indwelling shunt. This involves a “clamping trial” whereby the EVD is clamped and transduced, and pre- and post-clamping CT imaging is performed to observe any change in ventricular size and clinical status. This practice may lead to prolonged hospital stay, extra radiation exposure, and neurological insult due to ICP elevation. The present study aims to apply a widely validated morphological clustering analysis of ICP pulse (MOCAIP) algorithm to detect signatures from the pulse waveform that differentiate an intact CSF circulatory system from an abnormal one during EVD clamping. We hypothesize that an intact CSF system should be stable, and that pulses with a similar mean ICP level should have similar shapes.
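One building block of this kind of pulse-morphology analysis is a shape-similarity measure that ignores amplitude and baseline offset, so pulses recorded at a similar mean ICP can be compared for shape alone. The sketch below uses zero-lag correlation between z-scored pulses; it illustrates the idea only and is not the MOCAIP implementation:

```python
import math

def zscore(p):
    """Remove mean and amplitude so only the pulse shape remains."""
    m = sum(p) / len(p)
    s = math.sqrt(sum((x - m) ** 2 for x in p) / len(p)) or 1.0
    return [(x - m) / s for x in p]

def pulse_similarity(p, q):
    """Zero-lag correlation between two z-scored pulse waveforms of
    equal length: values near 1.0 indicate the same pulse morphology
    regardless of mean level or amplitude."""
    zp, zq = zscore(p), zscore(q)
    return sum(a * b for a, b in zip(zp, zq)) / len(p)
```

In practice pulses would first be beat-segmented and resampled to a common length; a stable CSF system would then show high pairwise similarity among pulses sharing a mean ICP level.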
As part of this project, we are currently designing and implementing a Reinforcement Learning system to maximize the sense of presence in immersive virtual environments.
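As a sketch of the idea, a tabular Q-learning update could treat a presence estimate (from questionnaires or a physiological proxy) as the reward for the environment adaptation just applied. The state and action encodings below are placeholder assumptions, not the project's design:

```python
def q_update(Q, s, a, r, s2, actions, alpha=0.1, gamma=0.9):
    """One tabular Q-learning update. Q maps (state, action) pairs to
    values; r is the observed presence reward after taking action a in
    state s and landing in state s2. alpha is the learning rate and
    gamma the discount factor."""
    best_next = max(Q.get((s2, a2), 0.0) for a2 in actions)
    old = Q.get((s, a), 0.0)
    Q[(s, a)] = old + alpha * (r + gamma * best_next - old)
```

An agent built on this would pick environment adaptations epsilon-greedily from `Q` and call `q_update` after each presence measurement; a real system would likely need function approximation rather than a table, given the size of the state space.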