Practical Camera Calibration

Abstract

Camera calibration is a long-standing research problem in computer vision. Typical calibration methods proceed in two steps: control point localization and camera parameter computation. In practice, control point localization is time-consuming because it rests on the strong assumption that the calibration object is fully visible in every image. To satisfy this assumption, users tend to avoid moving the calibration object near the image boundary, and the resulting parameter estimates are consequently of poor quality.
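To make the second step concrete, the parameter computation from localized control points typically starts by estimating a planar homography between the model plane and each image, in the style of Zhang's method. The following is a minimal sketch of that homography estimation via the direct linear transform (DLT), using numpy only; the function name and the assumption of a planar calibration object are illustrative, not taken from the paper.

```python
import numpy as np

def estimate_homography(obj_pts, img_pts):
    """DLT estimate of the homography mapping model-plane points to image points.

    obj_pts, img_pts: (N, 2) arrays of corresponding points, N >= 4.
    Each correspondence (X, Y) -> (u, v) contributes two linear
    constraints on the 9 homography entries; the solution is the
    right singular vector of the stacked constraint matrix.
    """
    A = []
    for (X, Y), (u, v) in zip(obj_pts, img_pts):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the projective scale
```

With homographies from several views, the intrinsic parameters follow from the standard closed-form constraints, and extrinsics per view from the decomposed homography.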


In this work, we aim to solve this partial-occlusion problem of the calibration object. To do so, we integrate a planar marker tracking algorithm that can track its target marker even under partial occlusion. Specifically, we localize control points with the RANdom DOts Markers (RANDOM) tracking algorithm, which uses markers with randomly distributed circular dots [Uchiyama 2011]. Once the control points are localized, they are used to estimate the camera parameters.
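The key practical consequence is that each image contributes only the control points the tracker actually identified, so the calibration must accept a different subset of 2D-3D correspondences per view. A minimal sketch of this bookkeeping, assuming a hypothetical table of dot identifiers to model-plane coordinates (the ids and coordinates below are invented for illustration, not the marker from the paper):

```python
import numpy as np

# Hypothetical lookup: dot id -> (X, Y) on the marker's model plane.
MODEL = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (1.0, 1.0),
         3: (0.0, 1.0), 4: (0.5, 0.5), 5: (0.2, 0.8)}

def collect_correspondences(detections, min_pts=4):
    """Keep, per image, only the dots the tracker identified.

    detections: one dict {dot_id: (u, v)} per image.
    Returns parallel lists of model points and image points, skipping
    views too occluded to constrain a homography (fewer than min_pts dots).
    """
    obj_pts, img_pts = [], []
    for det in detections:
        ids = [i for i in det if i in MODEL]
        if len(ids) < min_pts:
            continue  # too few visible dots; drop this view
        obj_pts.append(np.array([MODEL[i] for i in ids], float))
        img_pts.append(np.array([det[i] for i in ids], float))
    return obj_pts, img_pts
```

Because views near the image boundary are no longer rejected wholesale, the control points can cover the full field of view, which is what improves the parameter estimates.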


A promising application of the proposed calibration method is multi-camera calibration for systems built on multiple cameras, such as camera arrays and multi-view stereo reconstruction. We are currently working on this multi-camera calibration extension.

Publication

  • Yuji Oyamada, Pascal Fallavollita, and Nassir Navab, "Single Camera Calibration Using Partially Visible Calibration Objects Based on Random Dots Marker Tracking Algorithm," IEEE ISMAR 2012 Workshop on Tracking Methods and Applications (TMA), 2012 [pdf, bib, slide]
  • Yuji Oyamada, Pascal Fallavollita, Hideo Saito, and Nassir Navab, "Occlusion handling multiple camera calibration using a planar marker tracking algorithm," The 16th Meeting on Image Recognition and Understanding (MIRU), 2013 [paper, bib, teaser slide]

Source code