2. WeARHand

WeARHand: Head-Worn, RGB-D Camera-Based, Bare-Hand User Interface with Visually Enhanced Depth Perception

Abstract

  • We introduce WeARHand, which allows a user to manipulate virtual 3D objects with a bare hand in a wearable augmented reality (AR) environment. Our method uses no environmentally tethered tracking devices and localizes a pair of near-range and far-range RGB-D cameras mounted on a head-worn display, as well as a moving bare hand in 3D space, by exploiting depth input data. Depth perception is enhanced through egocentric visual feedback, including a semi-transparent proxy hand. We implement a virtual hand interaction technique and feedback approaches, and evaluate their performance and usability. The proposed method can be applied to many 3D interaction scenarios using hands in a wearable AR environment, such as AR information browsing, maintenance, design, and games.
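The semi-transparent proxy hand mentioned above can be illustrated with a simple alpha-blending pass: wherever the segmented hand mask is set, a flat-colored overlay is mixed into the scene so the user sees both the virtual content and the hand's silhouette. This is only a minimal sketch of the feedback idea, not the paper's actual rendering pipeline; the function name, color, and opacity are illustrative choices.

```python
import numpy as np

def blend_proxy_hand(scene_rgb, hand_mask, proxy_color=(0, 200, 255), alpha=0.5):
    """Alpha-blend a flat-colored proxy hand over the scene wherever the
    hand mask is set, leaving all other pixels untouched.

    scene_rgb  : (H, W, 3) uint8 image of the rendered AR scene
    hand_mask  : (H, W) boolean mask of hand pixels (e.g. from depth segmentation)
    alpha      : opacity of the proxy hand (0 = invisible, 1 = opaque)
    """
    out = scene_rgb.astype(np.float32).copy()
    color = np.asarray(proxy_color, dtype=np.float32)
    out[hand_mask] = (1.0 - alpha) * out[hand_mask] + alpha * color
    return out.astype(np.uint8)
```

Keeping the overlay semi-transparent (rather than drawing an opaque virtual hand) lets the occluded virtual object remain partially visible, which is the depth cue the abstract refers to.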

Paper

  • T. Ha, S. Feiner and W. Woo, “WeARHand: Head-Worn, RGB-D Camera-Based, Bare-Hand User Interface with Visually Enhanced Depth Perception,” ISMAR (S&T), Sep. 2014.

Video

Image

Hand Tracking with a Near-range Depth Camera for Virtual Object Manipulation in a Wearable Augmented Reality

Abstract

    • This paper proposes methods for tracking a bare hand with a near-range depth camera attached to a video see-through head-mounted display (HMD) for virtual object manipulation in an augmented reality (AR) environment. We focus in particular on hand gestures that are frequently used in daily life. First, we use the near-range depth camera attached to the HMD to segment the hand easily, considering both skin color and depth information within arm's reach. Then, fingertip and base positions are extracted through primitive models of the finger and palm. From these positions, the rotation parameters of the finger joints are estimated with an inverse-kinematics algorithm. Finally, the user's hands are localized in physical space with an electromagnetic tracker and then used for 3D virtual object manipulation. Our method is applicable to various AR interaction scenarios such as digital information access/control, creative CG modeling, virtual hand guiding, or game UIs.
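The first two steps of the pipeline above can be sketched compactly: intersect a depth band (roughly arm's reach from the head-worn camera) with a skin-color mask to isolate hand pixels, then recover a finger joint's bend angle from base and fingertip positions with a closed-form two-link inverse-kinematics solution (law of cosines). This is a hedged illustration under simplifying assumptions, not the paper's implementation; the thresholds and segment lengths are placeholder values, and the IK here is planar rather than the full joint model.

```python
import numpy as np

def segment_hand(depth_mm, skin_mask, near_mm=150, far_mm=700):
    """Return a boolean mask of candidate hand pixels: pixels that lie in
    the depth band [near_mm, far_mm] AND are classified as skin.

    depth_mm  : (H, W) depth map in millimetres from the near-range camera
    skin_mask : (H, W) boolean mask from a skin-color classifier
    """
    in_range = (depth_mm > near_mm) & (depth_mm < far_mm)
    return in_range & skin_mask

def finger_joint_angle(base, tip, l1=40.0, l2=30.0):
    """Analytic two-link planar IK: recover the bend angle (radians) of the
    middle joint from the finger base and fingertip positions.

    l1, l2 : lengths of the two finger segments in mm (illustrative values)
    Returns 0 for a fully extended finger, growing as the finger bends.
    """
    d = np.linalg.norm(np.asarray(tip, float) - np.asarray(base, float))
    d = np.clip(d, abs(l1 - l2), l1 + l2)  # keep the target reachable
    cos_inner = (l1**2 + l2**2 - d**2) / (2.0 * l1 * l2)
    return np.pi - np.arccos(np.clip(cos_inner, -1.0, 1.0))
```

When the fingertip-to-base distance equals l1 + l2 the finger is straight and the angle is 0; as the fingertip approaches the base, the bend angle grows toward pi.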

Paper

    • G. Park, T. Ha, and W. Woo, "Hand Tracking with a Near-range Depth Camera for Virtual Object Manipulation in a Wearable Augmented Reality," HCII 2014. [Link]

Video