3D Finger CAPE: Clicking Action and Position Estimation

Abstract
In this paper we present a novel framework for the simultaneous detection of clicking actions and estimation of occluded fingertip positions from single depth image sequences captured from an egocentric viewpoint. The detection and estimation rely on a novel probabilistic inference based on knowledge priors of the clicking motion and the clicked position. Based on the detection and estimation results, we achieve fine-grained, bare-hand interaction with virtual objects in the egocentric viewpoint. Our contributions are: (i) rotation- and translation-invariant finger clicking action detection and position estimation, combining 2D image-based fingertip detection with 3D hand posture estimation in the egocentric viewpoint; (ii) a novel spatio-temporal random forest, which performs the detection and estimation efficiently within a single framework; and (iii) a selection process that uses the proposed clicking action detection and position estimation in arm-reachable AR/VR space without requiring any additional device. Experimental results show that the proposed method delivers promising performance under frequent self-occlusions when selecting objects in AR/VR space while wearing an HMD with an attached egocentric depth camera.
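For illustration, the sketch below shows one plausible way to realize joint clicking-action detection and clicked-position regression over a temporal window of per-frame hand features. This is a minimal sketch only, not the paper's implementation: the single spatio-temporal random forest described above is approximated here by a separate scikit-learn classifier and regressor, and the window length T, feature dimension D, and the synthetic training data are all illustrative assumptions.

    # Minimal conceptual sketch (not the authors' implementation).
    # A short temporal window of per-frame hand features is stacked into one
    # spatio-temporal feature vector; a classifier detects the clicking action
    # and a regressor estimates the 3D fingertip position at the click.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

    T, D = 10, 32          # assumed: frames per window, features per frame
    N = 500                # synthetic training windows

    rng = np.random.default_rng(0)
    X = rng.normal(size=(N, T * D))        # stacked spatio-temporal features
    y_click = rng.integers(0, 2, size=N)   # 1 = clicking action in this window
    y_pos = rng.normal(size=(N, 3))        # 3D fingertip position at the click

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y_click)
    # Regress the clicked position only on windows that contain a click.
    reg = RandomForestRegressor(n_estimators=100, random_state=0).fit(
        X[y_click == 1], y_pos[y_click == 1])

    window = rng.normal(size=(1, T * D))   # features from the latest T frames
    if clf.predict(window)[0] == 1:
        print("click detected at 3D position", reg.predict(window)[0])
    else:
        print("no click in this window")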


Publication
Youngkyoon Jang, Seung-Tak Noh, Hyung Jin Chang, Tae-Kyun Kim, Woontack Woo
3D Finger CAPE: Clicking Action and Position Estimation under Self-Occlusions in Egocentric Viewpoint
IEEE Transactions on Visualization and Computer Graphics (TVCG), vol. 21, no. 4, pp. 501-510, April 2015.
(also presented at IEEE VR 2015, Arles, Camargue, Provence, France, Mar. 23-27, 2015, as a long paper; acceptance rate: 13.8% (13/94)).

Download: 
[PDF] [Demo video on YouTube] [Project page]


Online demo video
 Preview video (30 seconds)
 Long video (3 minutes)

Acknowledgement

This work was supported by the Global Frontier R&D Program on <Human-centered Interaction for Coexistence> funded by the National Research Foundation of Korea grant funded by the Korean Government (MSIP) (2010-0029751).


