Embodied Exploration (ASSETS'23)

May 2022 - February 2023

Overview

For many people with disabilities, knowing how accessible a space is before visiting it is essential for their safety and their ability to navigate it. Assessing accessibility accurately is difficult in person and even harder remotely. Remote assessment is currently limited to phone calls, photos or videos of the space, or virtual tours. None of these methods are tailored to an individual's disability, and they often omit crucial details such as the legroom under a countertop, the height of certain objects, and whether items are within reach.

To address this, we developed Embodied Exploration, a novel Virtual Reality (VR) technique that enables wheelchair users to embody Meta Avatars and interact with digital twins of physical spaces to assess accessibility for their personal circumstances. We implemented three categories of interaction: manipulation, visibility, and locomotion.

I personally implemented the manipulation framework in Unity and built example tasks such as opening a window and retrieving and filling a cup of water. While these tasks may seem simple from the outside, they can be difficult or even impossible for a wheelchair user depending on their physical characteristics, so they cannot be overlooked when assessing accessibility.
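The actual framework was written in Unity (C#); the sketch below is only a hypothetical, simplified Python illustration of the core idea behind manipulation-based assessment: an object can be grabbed, carried, and released only if it falls within the user's personal reach. All names and numbers here (ManipulationSketch, reach_radius_m, the 0.6 m reach, the object positions) are illustrative assumptions, not values from the project.

```python
import math
from dataclasses import dataclass
from typing import Optional


@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def dist(self, other: "Vec3") -> float:
        # Euclidean distance between two points in meters.
        return math.sqrt((self.x - other.x) ** 2 +
                         (self.y - other.y) ** 2 +
                         (self.z - other.z) ** 2)


@dataclass
class Grabbable:
    name: str
    position: Vec3
    held: bool = False


class ManipulationSketch:
    """Toy grab/carry/release logic: an object is only usable if it lies
    within the user's reach radius, which depends on their seated pose."""

    def __init__(self, reach_radius_m: float):
        self.reach_radius_m = reach_radius_m
        self.held_object: Optional[Grabbable] = None

    def try_grab(self, hand_pos: Vec3, obj: Grabbable) -> bool:
        # Grab succeeds only if nothing is held and the object is reachable.
        if self.held_object is None and hand_pos.dist(obj.position) <= self.reach_radius_m:
            obj.held = True
            self.held_object = obj
            return True
        return False

    def move_held(self, hand_pos: Vec3) -> None:
        # A held object tracks the hand, e.g. carrying a cup to a faucet.
        if self.held_object is not None:
            self.held_object.position = hand_pos

    def release(self) -> None:
        if self.held_object is not None:
            self.held_object.held = False
            self.held_object = None


# Example: a cup on a counter, tested from two different hand positions.
framework = ManipulationSketch(reach_radius_m=0.6)
cup = Grabbable("cup", Vec3(0.0, 1.1, 0.9))
print(framework.try_grab(Vec3(0.0, 0.7, 0.2), cup))  # False: cup is out of reach
print(framework.try_grab(Vec3(0.0, 1.0, 0.8), cup))  # True: cup is within reach
```

In the real system the reach and grab behavior come from the user's tracked avatar and Unity's physics rather than hard-coded values; the sketch only illustrates that whether a task is feasible hinges on whether its target falls inside the user's reach.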

We conducted a user study in which six wheelchair users compared our technique against a photo gallery and a virtual tour. The study showed that Embodied Exploration is an accurate and viable technique for facilitating remote accessibility assessment.

Demo

Embodied Video.mp4

Links

Video Preview

Presentation

Acknowledgements

This project led to my second publication. I am grateful to all of my colleagues, the user study participants, and my advisor, Professor Yang Zhang.