Automatic robot self-calibration
Nowadays, humanoid and other robots come equipped with a rich set of powerful yet inexpensive sensors such as cameras, RGB-D cameras, and inertial, tactile, or force sensors. This opens up the possibility of calibration approaches that are more “self-contained”: they can be performed autonomously and repeatedly by the robot, and they simultaneously estimate the poses of the sensors with respect to the robot. The robot's kinematic model itself can be calibrated at the same time.
We have demonstrated the potential of this approach on a number of robots and have also developed a multisensorial robot calibration toolbox: https://github.com/ctu-vras/multirobot-calibration.
The key to self-calibration is redundancy. The kinematic chain can be closed by exploiting physical contact (so-called closed-loop calibration approaches) or by observing the robot's pose with visual sensors (open-loop calibration approaches). In addition to traditional methods that exploit contact with the environment (e.g., the robot touching a planar surface; panel B below) or external metrology systems (e.g., laser trackers; panel D below), we have added self-contact (panel A) and self-observation (panel C) as methods suited for automatic, self-contained calibration.
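The closed-loop idea can be illustrated with a minimal sketch (not the toolbox API): when two kinematic chains of the robot touch, their forward-kinematics end-points must coincide, so any mismatch under the current parameter estimate serves as a calibration residual. The planar 2-DoF arm model and the parameter layout below are illustrative assumptions.

```python
import numpy as np

def fk_planar(link_lengths, joint_angles):
    """Forward kinematics of a planar serial chain: returns the end-point (x, y)."""
    pos = np.zeros(2)
    theta = 0.0
    for length, q in zip(link_lengths, joint_angles):
        theta += q
        pos += length * np.array([np.cos(theta), np.sin(theta)])
    return pos

def self_touch_residual(params, q_left, q_right):
    """Residual of the self-touch constraint: with correct parameters,
    the end-points of the two touching chains coincide (residual = 0).
    params stacks the link lengths of the left and right chains."""
    n = len(q_left)
    left, right = params[:n], params[n:]
    return fk_planar(left, q_left) - fk_planar(right, q_right)
```

Collecting many such residuals over different self-touch configurations and minimizing them over the kinematic parameters is the essence of closed-loop self-calibration.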
We show how the different calibration approaches can be combined in a single cost function. Thorough experimental validation of the methods, in isolation and in combination, is presented on the iCub humanoid (left; self-touch and self-observation) and on an industrial dual-arm manipulator (right; self-contact, self-observation, planar constraints, and an external laser tracker).
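Combining approaches in a single cost function can be sketched as a weighted nonlinear least-squares problem: each constraint type (self-contact, self-observation, planar constraint, external tracker) contributes its own residual vector, and the weighted residuals are stacked and minimized jointly. The residual functions and weights below are toy placeholders, not the toolbox API.

```python
import numpy as np
from scipy.optimize import least_squares

def combined_residuals(params, residual_fns, weights):
    """Stack the weighted residual vectors of all constraint types
    into one vector for a joint least-squares fit."""
    return np.concatenate(
        [w * fn(params) for fn, w in zip(residual_fns, weights)]
    )

# Toy usage: two constraint types pull a single parameter toward
# 1.0 and 1.2 with different confidence weights; the optimum is a
# weighted compromise between them.
fns = [lambda p: p - 1.0, lambda p: p - 1.2]
result = least_squares(combined_residuals, x0=np.array([0.0]),
                       args=(fns, [2.0, 1.0]))
```

The weights let more trustworthy measurements (e.g., a laser tracker) dominate noisier ones while every constraint still contributes to the same estimate.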
Self-contact, or self-touch, can be employed in different ways. Contact can be exploited as a constraint (top right); alternatively, if the robot is covered with sensitive skin, the skin can be used to calibrate the robot's kinematics (bottom left), or, conversely, the robot's kinematics can be used to spatially calibrate the skin (bottom right).
Rustler, L.; Potocna, B.; Polic, M.; Stepanova, K. & Hoffmann, M. (2021), Spatial calibration of whole-body artificial skin on a humanoid robot: comparing self-contact, 3D reconstruction, and CAD-based calibration, in 'Humanoid Robots (Humanoids), IEEE-RAS International Conference on', pp. 445-452. [IEEE Xplore][preprint-pdf]
An overview of the multisensorial and multirobot calibration toolbox (https://github.com/ctu-vras/multirobot-calibration) is shown below.
Rozlivek, J.; Rustler, L.; Stepanova, K. & Hoffmann, M. (2021), Multisensorial robot calibration framework and toolbox, in 'Humanoid Robots (Humanoids), IEEE-RAS International Conference on', pp. 459-466. [IEEE Xplore][preprint-pdf]