Direct 3D Pose Estimation of a Planar Target
 
Oral Presentation at WACV 2016
 
Hung-Yu Tseng 1*          Po-Chen Wu 1*          Ming-Hsuan Yang 2          Shao-Yi Chien 1
* Equal contribution
 

Abstract
 
Estimating the 3D pose of a known object from a given 2D image is an important problem for robotics and augmented reality applications and has been studied extensively. While state-of-the-art Perspective-n-Point (PnP) algorithms perform well in pose estimation, their success hinges on whether feature points can be extracted and matched correctly on targets with rich texture. In this work, we propose a robust direct method for 3D pose estimation with high accuracy that performs well on both textured and textureless planar targets. First, the pose of a planar target with respect to a calibrated camera is approximately estimated by posing it as a template matching problem. Next, the object pose is further refined and disambiguated with a gradient descent search scheme. Extensive experiments on both synthetic and real datasets demonstrate that the proposed direct pose estimation algorithm performs favorably against state-of-the-art feature-based approaches in terms of robustness and accuracy under several varying conditions.
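
The snippet below is a minimal sketch of the coarse-to-fine idea summarized above: a coarse search over candidate poses scored by an appearance distance between the warped template and the input image (the template matching stage), followed by gradient descent on that cost (the refinement stage). It is not the authors' released implementation (see the code link under Downloads); the Euler-angle pose parameterization, nearest-neighbour warping, grid ranges, and step sizes are illustrative assumptions only.

# Sketch of a two-stage direct pose estimation pipeline for a planar target.
# Assumptions: grayscale float images, calibrated intrinsics K, pose given as
# (rx, ry, rz, tx, ty, tz), and template pixel indices used as its coordinates
# on the z = 0 plane. Not the authors' code; parameters are illustrative.
import numpy as np

def rotation(rx, ry, rz):
    """Rotation matrix from Euler angles (radians), R = Rz @ Ry @ Rx."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def homography(pose, K):
    """The planar target lies on z = 0, so its image is H = K [r1 r2 t]."""
    rx, ry, rz, tx, ty, tz = pose
    R = rotation(rx, ry, rz)
    return K @ np.column_stack([R[:, 0], R[:, 1], [tx, ty, tz]])

def warp_template(template, pose, K, out_shape):
    """Render the template under the given pose (nearest-neighbour sampling)."""
    h, w = out_shape
    Hinv = np.linalg.inv(homography(pose, K))  # image pixel -> template coords
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    uvw = Hinv @ pts
    u = np.round(uvw[0] / uvw[2]).astype(int)
    v = np.round(uvw[1] / uvw[2]).astype(int)
    ok = (u >= 0) & (u < template.shape[1]) & (v >= 0) & (v < template.shape[0])
    out = np.zeros(h * w)
    out[ok] = template[v[ok], u[ok]]
    return out.reshape(h, w)

def appearance_cost(pose, template, image, K):
    """Sum of squared differences between the warped template and the image."""
    return np.sum((warp_template(template, pose, K, image.shape) - image) ** 2)

def coarse_search(template, image, K, grids):
    """Stage 1: score every pose on a coarse 6-D grid and keep the best one."""
    candidates = np.stack(np.meshgrid(*grids), axis=-1).reshape(-1, 6)
    return min(candidates, key=lambda p: appearance_cost(p, template, image, K))

def refine(pose, template, image, K, step=1e-3, lr=1e-8, iters=100):
    """Stage 2: gradient descent on the appearance cost (numerical gradient).
    A practical implementation would scale rotation and translation steps
    separately; a single learning rate is used here for brevity."""
    pose = np.array(pose, dtype=float)
    for _ in range(iters):
        grad = np.zeros(6)
        for i in range(6):
            d = np.zeros(6)
            d[i] = step
            grad[i] = (appearance_cost(pose + d, template, image, K) -
                       appearance_cost(pose - d, template, image, K)) / (2 * step)
        pose -= lr * grad
    return pose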

Downloads
 
Paper [pdf]
Supplementary [zip]
Presentation [link][slides]
Poster [pdf]
Code [GitHub]

Bibtex
 
@inproceedings{tsengdirect,
    title     = {Direct 3D Pose Estimation of a Planar Target},
    author    = {Tseng, Hung-Yu and Wu, Po-Chen and Yang, Ming-Hsuan and Chien, Shao-Yi},
    booktitle = {Proc. IEEE Winter Conference on Applications of Computer Vision},
    year      = {2016}
}