The following papers were accepted to the workshop after going through an anonymous, two-stage reviewing process:
The use of 360-degree video has been increasing in recent years. However, key challenges, such as guiding the viewer’s attention without limiting free exploration, have also emerged. In this paper we propose the notion of Action Units (AUs), composed of social cues, aimed at improving the user experience of a 360-degree video-based immersive storytelling platform that the viewer uses while sitting in a swivel chair, namely “Swivel-chair VR”. We then evaluated AUs against two other commonly used techniques by comparing their effects on memory, engagement, enjoyment, cybersickness, and viewer performance, including search and attention. The results indicate that AUs helped increase the levels of engagement and enjoyment, and reduced both the search time for targets and the level of cybersickness. AUs were also preferred by users for their diegetic qualities.
As the development of virtual reality continues, it is becoming more integrated into people’s lives. At present, virtual reality is mostly confined to a specific fixed location. With the introduction of cable-free and mobile head-mounted displays, there is no longer a need for a workstation to use VR. This makes it possible to bring VR experiences into public spaces. In this work we focus our evaluation on VR scenarios in public transport systems: plane, train, metro, tram, or as a co-driver in a car. We analyse different seating situations for each type of transportation and their implications for VR applications. We present an analysis of possible scenarios, including their challenges and constraints for interaction, space, and navigation in public mobile seated VR.
VR is often experienced in a seated position, similar to most video games. The reasons for this are diverse and can go beyond mere comfort. Interestingly, even when users are physically seated, experience designers often try to simulate standing and moving experiences, using head-bobbing and other means to make users believe they are actually walking. Does this actually work? Can we get seated users to believe they are standing, walking, or even running? And is trying to provide the illusion of standing or moving while seated even relevant and helpful? Does it actually improve user experience and/or performance? Or might it be more effective to simply be "honest" and ensure that the virtual locomotion metaphor matches the users' physical posture? In this extended abstract, we aim to explore these little-researched questions and analyze different situations and scenarios, hoping to motivate future research and discussion on this topic.
In this position paper, we want to point, with a little bit of provocation and maybe a pinch of fun, to some grievances of, and also chances for, today's pool of (consumer) VR applications concerning the chosen user posture. In our opinion, the user is assumed or required to be standing in too many cases, and transitions between postures are usually entirely unsupported.
In this paper, we compare the impact that standing and seated viewing of 360 videos on head-mounted displays has on subjective quality assessment. The statistical analysis of the data gathered in a pilot study is reported in terms of average rating times, mean opinion scores, and simulator sickness scores. The results indicate: (1) Average rating times consumed for 360 video quality assessment are similar for standing and seated viewing, (2) Higher resolving power among different quality levels is obtained for seated viewing, (3) Simulator sickness is kept significantly lower when seated.
The space near our body, the peripersonal space (PPS), has been investigated by focusing on changes in neural and behavioural responses to multisensory stimuli presented within or outside the PPS. We previously showed that tactile reaction time decreased when a rhythmic pattern of walking-sound vibrations was applied to the soles of the feet. However, it was unclear whether this change occurs when the same waveform pattern is applied to the soles at different vibration step frequencies. Here, we show that applying walking-sound vibrations to the soles expands the peripersonal space, but only if they are presented at a credible walking frequency. The findings suggest that extension of the PPS is sensitive to the fidelity of the walking dynamics.
With this work, we want to put up for discussion a draft classification of advantages and disadvantages between sitting and standing user interfaces in VR.
Virtual reality (VR) exergames provide a unique opportunity for developing safe and effective therapies for older adults at assisted living facilities. This group of people has an increased risk of falls, which can lead to severe injuries and increase the risk of falls even further. Fall prevention training is done regularly at these facilities, but the workload for physicians is high, and the exercises generally do not change very often.
In this paper we discuss the results of interviews with senior hospital staff and identify suitable fall prevention exercises that can be integrated into patients' daily schedules and administered through a VR exergame. The movements that control the exergame match the motions suggested by our partner physicians and improve balance by shifting the player's center of mass.
Towards Accessibility in VR - Development of an Affordable Wheelchair Motion Platform (PDF)
Kilian Brachtendorf, Benjamin Weyers and Daniel Zielasko
In virtual reality research, extensive effort has been made towards thoughtfully investigating various locomotion and interaction techniques in order to constitute an immersive experience for the user. Mandated by design, seated applications are obliged to utilize alternative means of locomotion due to the inability to naturally move through the environment. While complex devices and techniques are proposed in state-of-the-art applications, physically handicapped people are generally disregarded in these studies. In this work, the development of a wheelchair computer interface is proposed, focusing on providing an acceptable degree of realism while remaining cost-efficient and feasible for use in non-research environments.
When we walk through the physical world, our locomotion is naturally accompanied by a compelling and embodied sensation of mobility and self-motion - something that is missing from many gaming, telepresence, and VR applications unless users physically move. Physically moving has long been known to enhance spatial orientation performance when compared to imagined locomotion [8, 16, 18] and controller-based movements in VR [8]. With the increasing availability and affordability of mobile head-mounted displays that are untethered and provide inside-out tracking, physically walking in VR becomes increasingly feasible. There are, however, still shortcomings of physical walking in VR that might not be fixable by technical solutions, such as limited free-space walking areas, safety concerns (for standing/walking users), limited accessibility for users with mobility challenges, fatigue, and comfort. That is, even with improved technologies, many users may still want to sit down for all but very short experiences, unless there is a compelling reason to stand or be upright. Here, we will discuss if and how it might be feasible to provide a compelling sensation of mobility and self-motion for seated VR users, tackling those concerns by utilizing leaning-based locomotion interfaces. We will also discuss obstacles, concerns, and limitations of this approach, and how we might improve those interfaces to make them more intuitive and explorable, without requiring much (if any) instruction.
Using Augmented Reality to Assist Seated Office Workers’ Data Entry Tasks (PDF)
Heejin Jeong, Ankit Singh, Myunghee Kim and Andrew Johnson
Data entry is one of the common office tasks performed in a seated position that can be assisted by augmented reality technology. This paper introduces an ongoing project to develop an AR-based data presentation interface to assist office workers’ data entry tasks. The interface developed in this project was designed to facilitate the transition between the two steps of reading data and typing it into a computer system using a keyboard. A usability evaluation is conducted in a controlled experiment with human subjects to examine whether the AR-based data presentation interface improves data entry performance and mitigates office workers’ physical and mental workload.