We present a novel wearable RGB-D camera based navigation system for the visually impaired. The system comprises a smartphone user interface, a glasses-mounted RGB-D camera, a real-time navigation algorithm, and a haptic feedback system. The smartphone interface provides an effective way for the user to communicate with the system through audio and haptic feedback. To extract the user's orientation, the navigation algorithm performs real-time 6-DOF feature-based visual odometry using the glasses-mounted RGB-D camera as the input device. The navigation algorithm also builds a 3D voxel map of the environment and analyzes 3D traversability. A path planner integrates the egomotion estimates with the map to generate a safe and efficient path to a waypoint, which is delivered to the haptic feedback system. The haptic feedback system, consisting of four micro-vibration motors, is designed to guide the visually impaired user along the computed path while minimizing cognitive load. The proposed system achieves real-time performance on a standard laptop, extends the range of activities available to visually impaired users, and improves their mobility in cluttered environments. Experimental results show that indoor navigation with the proposed system successfully avoids collisions and improves the user's mobility performance compared to conventional and state-of-the-art mobility aids.
<Fig. 1> System overview: Image/3D information acquisition device, SLAM Process, and a feedback system.
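To illustrate the feedback stage, the sketch below shows one plausible way to map the angular error between the user's current heading (from visual odometry) and the bearing to the next planned waypoint onto four vibration motors. This is a hypothetical illustration, not the authors' implementation: the motor layout, function names, and angular thresholds are assumptions.

```python
import math

# Hypothetical sketch of the haptic guidance step: choose which of four
# micro-vibration motors to pulse based on the signed angular error
# between the user's heading and the bearing to the next waypoint.
# Motor layout and the 20/160 degree thresholds are illustrative assumptions.

MOTORS = ("front", "back", "left", "right")  # four micro-vibration motors

def motor_command(heading_rad, waypoint_bearing_rad):
    """Return the motor to pulse for the given heading and waypoint bearing."""
    # Wrap the signed error into (-pi, pi]; positive error = waypoint to the left.
    err = (waypoint_bearing_rad - heading_rad + math.pi) % (2 * math.pi) - math.pi
    if abs(err) < math.radians(20):
        return "front"   # roughly on course: keep walking forward
    if abs(err) > math.radians(160):
        return "back"    # waypoint is behind: turn around
    return "left" if err > 0 else "right"

# Example: waypoint 90 degrees to the user's left
print(motor_command(0.0, math.radians(90)))  # -> left
```

A discrete four-way cue like this keeps the signal simple to interpret, which is consistent with the stated goal of minimizing the user's cognitive load.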
 Young Hoon Lee, Gérard Medioni, RGB-D camera based wearable navigation system for the blind,
Computer Vision and Image Understanding 2015 (Under review).
 Young Hoon Lee, Gérard Medioni, Wearable RGBD indoor navigation system for the blind (Oral paper),
ECCV 2014 Assistive Computer Vision and Robotics Workshop, Zurich, Switzerland, Sep 2014.
 Young Hoon Lee, Gérard Medioni, A RGB-D camera Based Navigation for the Visually Impaired,
RSS 2011 RGB-D: Advanced Reasoning with Depth Camera Workshop, Los Angeles, Jun 2011.
 Wearable RGB-D navigation system: Grocery Scene
 Best Research Demo Award, 2nd Annual Ming Hsieh Department of Electrical Engineering Research Festival, USC
 Finalist, Cornell Cup USA, presented by Intel.
 USC Viterbi magazine
 The Economic Times: How Image Processing will move the world
 L.A. Business Journal