Contributions

The following contributions were accepted to the workshop after an anonymous, two-stage reviewing process.

Multisensory Locomotion for Seated VR (Invited Talk) [Video]

Francesco Soave, Ildar Farkhatdinov, and Nick Bryan-Kinns

Recent developments in XR applications, together with multisensory input and stimulation (i.e., visual + sound + haptics), have prompted further investigation of locomotion techniques and the possibilities they open up. Walking is the most common form of human locomotion and is accompanied by various kinds of sensory feedback, such as visual, somatosensory, vestibular, efference-copy, and auditory signals. In video games and simulators that provide on-screen visual feedback, we generally experience no serious problems using keyboards or joysticks to walk a character around the VR scene. However, it has long been shown that using such interfaces for walking causes more motion sickness, disorientation, and reduced immersion than interfaces that utilize body-based input from users. One-to-one mapping of the user's movements in the real environment to the virtual environment requires a large workspace and costly sensing instruments...

An Ankle-Actuated Seated VR Locomotion Interface (Short-Paper) [Video]

Ata Otaran and Ildar Farkhatdinov

We present an interface that allows users to perform foot-tapping movements in a seated posture to emulate a walking sensation in VR. The interface comprises an impedance-controlled robotic foot platform that moves while resisting the user's foot-tapping movements. The platform's angle data are used to recognize the user's walking intention and animate the VR avatar accordingly. The impedance characteristics of the platform are controlled to emulate different VR walking terrains.
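
The abstract describes an impedance (spring-damper) law whose parameters change with the virtual terrain. Below is a minimal sketch of how such a controller might look, assuming a single-axis platform with angle feedback and torque actuation; the terrain presets and the read_platform_angle/apply_torque callbacks are hypothetical stand-ins, not the authors' implementation:

```python
import time

# Hypothetical terrain presets: (stiffness N*m/rad, damping N*m*s/rad).
# Values are illustrative only, not taken from the paper.
TERRAINS = {
    "pavement": (8.0, 0.4),
    "sand":     (3.0, 1.5),
    "mud":      (1.5, 2.5),
}

def impedance_step(theta, theta_prev, dt, stiffness, damping, theta_rest=0.0):
    """One control step: resist the foot-tapping motion with a
    spring-damper torque about the platform's rest angle."""
    theta_dot = (theta - theta_prev) / dt
    return -stiffness * (theta - theta_rest) - damping * theta_dot

def control_loop(read_platform_angle, apply_torque, terrain="sand", dt=0.001):
    """Run the impedance controller; the two callbacks stand in for
    the real platform's sensing and actuation interfaces."""
    stiffness, damping = TERRAINS[terrain]
    theta_prev = read_platform_angle()
    while True:
        theta = read_platform_angle()
        apply_torque(impedance_step(theta, theta_prev, dt, stiffness, damping))
        theta_prev = theta
        time.sleep(dt)
```

Under this kind of law, softer and more heavily damped presets would make each tap feel like sinking into a compliant surface; the paper's actual parameter choices may differ.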

On Head Movements in Repeated 360 Video Quality Assessment for Standing and Seated Viewing on Head Mounted Displays (Short-Paper) [Video]

Majed Elwardy, Hans-Juergen Zepernick, and Yan Hu

Watching 360 videos on head-mounted displays (HMDs) allows viewers to explore scenes in all directions. In this paper, we focus on investigating the head movements of two participants during standing and seated viewing of a total of 720 such 360 videos on HMDs. The head movements were recorded in a 360 video quality assessment experiment which was repeated after a long and a short break between sessions to study changes in viewing behavior over time. The analysis of the head movement data is provided as histograms of head rotations, head speeds, head turns, and head trajectories. It is shown that the participants each have their own distinct exploration behavior for standing viewing, which becomes less pronounced for seated viewing.
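
As a rough illustration of this kind of analysis, the sketch below derives yaw histograms, a head-speed histogram, and a head-turn count from a logged orientation trace; the 90 Hz sample rate, the yaw/pitch data layout, and the turn threshold are assumptions for the sketch, not the authors' pipeline:

```python
import numpy as np

def head_movement_histograms(yaw_deg, pitch_deg, sample_rate_hz=90.0,
                             turn_threshold_deg_s=30.0, bins=36):
    """Summarize a head-rotation trace: histogram of yaw angles,
    histogram of angular speed, and a count of head turns.
    yaw_deg/pitch_deg: 1-D arrays of orientation samples in degrees."""
    dt = 1.0 / sample_rate_hz
    # Unwrap yaw so speeds are not distorted at the +/-180 deg boundary.
    yaw_unwrapped = np.degrees(np.unwrap(np.radians(yaw_deg)))
    yaw_speed = np.abs(np.diff(yaw_unwrapped)) / dt          # deg/s
    pitch_speed = np.abs(np.diff(pitch_deg)) / dt            # deg/s
    speed = np.hypot(yaw_speed, pitch_speed)                 # combined speed
    yaw_hist, yaw_edges = np.histogram(yaw_deg, bins=bins, range=(-180, 180))
    speed_hist, speed_edges = np.histogram(speed, bins=bins)
    # Count a "head turn" as each onset of a fast-rotation episode.
    fast = speed > turn_threshold_deg_s
    turns = int(np.sum(fast[1:] & ~fast[:-1]))
    return {"yaw_hist": (yaw_hist, yaw_edges),
            "speed_hist": (speed_hist, speed_edges),
            "turns": turns}
```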

How to Take a Brake from Embodied Locomotion Interfaces (Invited Talk) [Video]

Carlo Flemming and Daniel Zielasko

A challenge with embodied locomotion interfaces is that the user often cannot “not” use them. In this talk, we characterize this challenge and outline possible solutions that we are currently investigating.

The Bayesian Causal Inference of Body Ownership Model: Use in VR and Plausible Parameter Choices (Short-Paper) [Video]

Moritz Schubert and Dominik M. Endres

Experiencing virtual body ownership is an important component of user experience in virtual reality applications with embodied avatars. A functioning model of body ownership could allow designers of such applications to predict the occurrence of body ownership illusions in users. One attempt at such a model, the Bayesian Causal Inference of Body Ownership (BCIBO) model, explains body ownership as inference about the causes of sensory signals. When the sensory signals under consideration (e.g., tactile and visual signals) are attributed to a single object (e.g., a rubber hand), that object is incorporated into the body. We investigate an unrealistic choice of parameter values in the original specification of the BCIBO model and suggest improvements.
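
The inference at the heart of such a model can be illustrated with the standard Bayesian comparison between a "common cause" and "separate causes" hypothesis for two noisy signals (in the style of Körding et al., 2007, on which causal-inference accounts of body ownership build). A minimal sketch, with all noise and prior parameters as illustrative placeholders rather than the BCIBO values under discussion:

```python
import numpy as np

def p_common_cause(x_vis, x_tac, sigma_vis=1.0, sigma_tac=1.0,
                   sigma_prior=10.0, p_common=0.5):
    """Posterior probability that visual and tactile measurements
    x_vis, x_tac (e.g. perceived locations of a touch) share a single
    cause, i.e. that the seen hand is the user's own. Gaussian
    likelihoods and a zero-mean Gaussian prior over the cause's
    location; every parameter here is an illustrative placeholder."""
    sv2, st2, sp2 = sigma_vis**2, sigma_tac**2, sigma_prior**2
    # Likelihood under one shared cause (C = 1): both measurements are
    # generated from the same latent position, integrated out analytically.
    var1 = sv2 * st2 + sv2 * sp2 + st2 * sp2
    like_c1 = np.exp(-0.5 * ((x_vis - x_tac)**2 * sp2
                             + x_vis**2 * st2 + x_tac**2 * sv2) / var1) \
              / (2 * np.pi * np.sqrt(var1))
    # Likelihood under independent causes (C = 2): each measurement has
    # its own latent position drawn from the prior.
    like_c2 = np.exp(-0.5 * (x_vis**2 / (sv2 + sp2)
                             + x_tac**2 / (st2 + sp2))) \
              / (2 * np.pi * np.sqrt((sv2 + sp2) * (st2 + sp2)))
    return p_common * like_c1 / (p_common * like_c1 + (1 - p_common) * like_c2)
```

For nearby signals such as p_common_cause(0.2, 0.3) the posterior favors a common cause, and it falls toward zero as the visual-tactile discrepancy grows; which parameter values are behaviorally plausible is exactly the kind of question the paper examines.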