MIXED REALITY EXPLORATIONS

Simulating the physical, embodied experience of sailing and navigating a Micronesian outrigger canoe is an exciting new “grand challenge” for Spatial Interaction research that seeks to advance underlying technologies for multi-modal human-computer interaction, physical simulation, haptic feedback, and experiential data analytics.

Our preliminary collaborative work includes inventing novel, multi-sensory VR simulations controlled with custom, mixed reality spatial interfaces (Fig. 1). Our most advanced simulator includes accurate star maps and 3D terrain and bathymetry for Chuuk Lagoon in Micronesia, along with an outrigger canoe model with dynamic sail and rudder. These are controlled using a custom-fabricated tangible mixed reality interface in which a physical mainsheet (a rope connected to the sail) and a wooden tiller, held in the hands, are tracked with 6 degrees of freedom in 3D space to control their virtual counterparts, which interact with the simulated water. There are many approximations in the current simulator, and a robust, multi-physics simulation is one of our planned research topics. However, the technical topic we are most passionate about lies in the hybrid digital+physical, or mixed reality, spatial interface.
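To make the mapping from tracked props to virtual controls concrete, the following is a minimal sketch of how a 6-DOF tiller pose and a tracked mainsheet hand might drive a virtual rudder angle and sail trim. All names, gains, and the specific mappings here are illustrative assumptions, not the simulator's actual code; a real implementation would work from the tracker's native pose format and calibration.

```python
import math
from dataclasses import dataclass


@dataclass
class Pose:
    """Hypothetical 6-DOF pose of a tracked prop: position (m) and orientation (rad)."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float


def tiller_to_rudder_angle(tiller: Pose, pivot_yaw: float,
                           gain: float = 1.0,
                           max_angle: float = math.radians(45)) -> float:
    """Map the tracked tiller's yaw about its pivot to a clamped rudder angle.

    Assumes the rudder turns with the tiller's yaw offset from a calibrated
    neutral heading (pivot_yaw), scaled by a tunable gain.
    """
    angle = gain * (tiller.yaw - pivot_yaw)
    return max(-max_angle, min(max_angle, angle))


def mainsheet_to_sail_trim(hand: Pose, cleat: Pose, slack_len: float) -> float:
    """Estimate sail trim in [0, 1] from how far the mainsheet hand sits
    from the cleat, relative to the rope's full slack length.

    trim = 1.0 means fully hauled in; 0.0 means fully eased.
    """
    paid_out = math.dist((hand.x, hand.y, hand.z), (cleat.x, cleat.y, cleat.z))
    trim = 1.0 - paid_out / slack_len
    return max(0.0, min(1.0, trim))
```

In practice, values like these would be fed each frame into the sail and rudder dynamics; the clamping keeps tracker noise or extreme hand positions from producing physically implausible control inputs.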

Already, the mixed reality interface engages human, embodied cognition in ways that we have never before seen in a virtual reality environment, and this has powerful implications. By holistically engaging the full sensorium, in ways inspired and directed by master Micronesian navigators, we aim to create a new class of rich spatial interfaces. These interfaces should not only help navigators keep the embodied, multisensory skills of sailing and navigating these engineering marvels alive (and evolving) in their displaced communities, but also advance the underlying technologies to the point that navigators will be able to feel (through useful perceptual illusion and haptic feedback) physical sensations as complex as the reflection and refraction patterns that form as multiple swells interact with the islands. Swells, wind, scents, stars, temperatures – all are critical inputs to the master navigator, and together these create a unique challenge problem for mixed reality research.

This goes far beyond more typical multi-modal interfaces that use voice and 3D tracker inputs or gestures to accomplish simple tasks, like “put that there”. Imagine instead if our spatial interfaces could be as rich as the way an outrigger crew collaborates to perform the hap – a special tacking maneuver that requires facing the boat into the wind, luffing the sail, untying the two booms of the special V-shaped sail, simultaneously loosening the forestay and backstay, picking up the booms (with the sail attached and now flapping violently in the wind), quickly passing them down the line of five crew members, one person to the next (all while shifting body weight to balance the boat in the waves and maintain a heading into the wind), tying the booms to the other end of the boat, which now becomes the bow, tightening the mainsheet, and finally continuing on the new course. What a challenge problem for embodied computing, sensing, interaction, and new community-based mixed reality environments!
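The hap, as described above, is an ordered sequence of coordinated crew actions, which is one reason it makes such a crisp challenge problem: a simulator must at minimum recognize whether the steps are being performed in the right order. The sketch below models that ordering as a simple state tracker. The step names and the `HapTracker` class are hypothetical illustrations drawn from the description above, not part of our simulator.

```python
from enum import Enum, auto


class HapStep(Enum):
    """Ordered phases of the hap tacking maneuver, following the text."""
    HEAD_INTO_WIND = auto()
    LUFF_SAIL = auto()
    UNTIE_BOOMS = auto()
    LOOSEN_FORESTAY_AND_BACKSTAY = auto()
    PASS_BOOMS_DOWN_CREW_LINE = auto()
    TIE_BOOMS_AT_NEW_BOW = auto()
    TIGHTEN_MAINSHEET = auto()
    RESUME_NEW_COURSE = auto()


class HapTracker:
    """Minimal tracker: steps must be completed in the declared order."""

    def __init__(self) -> None:
        self._steps = list(HapStep)
        self._next = 0

    def complete(self, step: HapStep) -> bool:
        """Mark a step done; return False if attempted out of order."""
        if self._next < len(self._steps) and step is self._steps[self._next]:
            self._next += 1
            return True
        return False

    @property
    def done(self) -> bool:
        return self._next == len(self._steps)
```

Of course, the real research problem is sensing these embodied actions (body weight shifts, rope handling, boom hand-offs) from multi-modal input, not checking their order; this sketch only fixes the target vocabulary such a recognizer would have to emit.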