Non-Rigid Recovery and Tracking

Real Time Non-Rigid 3D Surface Tracking using Particle Filter

ABSTRACT

Recovering a deformable 3D surface from a single image is an ill-posed problem because of depth ambiguities. Resolving this ambiguity normally requires prior knowledge about the most probable deformations that the surface can undergo. Many methods that address this problem have been proposed in the literature. Some of them rely on physical properties, while others learn the principal deformations of the object or are based on a reference textured image. However, they present limitations such as a high computational cost or the inability to recover the 3D shape. As an alternative to existing solutions, this work presents a novel approach that simultaneously recovers the non-rigid 3D shape and the camera pose in real time from a single image. Our proposal relies on an efficient particle filter that performs an intelligent search of a database of deformations. We present an exhaustive Design of Experiments to obtain the optimal parametrization of the particle filter, together with a set of results that demonstrate the visual quality and performance of our approach.
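
For illustration only, the sketch below shows the general shape of a particle filter that searches a database of candidate deformations while refining a camera pose, in the spirit of the approach described above. All names (deformation_db, reprojection_error, the noise and jump parameters) and the assumption of direct vertex-to-observation correspondences are simplifications made for this example, not the paper's actual formulation.

```python
import numpy as np

def reprojection_error(mesh_3d, pose, points_2d, K):
    """Mean pixel error of the deformed mesh projected with the camera pose."""
    R, t = pose[:3, :3], pose[:3, 3]
    cam = (R @ mesh_3d.T).T + t            # world -> camera coordinates
    proj = (K @ cam.T).T
    proj = proj[:, :2] / proj[:, 2:3]      # perspective division to pixels
    return np.mean(np.linalg.norm(proj - points_2d, axis=1))

def particle_filter_step(particles, weights, deformation_db, points_2d, K,
                         pose_noise=0.005, jump_prob=0.3):
    """One resample/propagate/weight cycle over deformation hypotheses."""
    n = len(particles)
    # 1) Resample particles proportionally to their previous weights.
    idx = np.random.choice(n, size=n, p=weights)
    particles = [{"deform_id": particles[i]["deform_id"],
                  "pose": particles[i]["pose"].copy()} for i in idx]
    # 2) Propagate: perturb the camera translation and occasionally jump to
    #    a neighbouring entry of the deformation database (the search step).
    for p in particles:
        p["pose"][:3, 3] += np.random.normal(0.0, pose_noise, 3)
        if np.random.rand() < jump_prob:
            p["deform_id"] = (p["deform_id"]
                              + np.random.randint(-2, 3)) % len(deformation_db)
    # 3) Weight each hypothesis by how well it explains the 2D observations.
    errors = np.array([reprojection_error(deformation_db[p["deform_id"]],
                                          p["pose"], points_2d, K)
                       for p in particles])
    weights = np.exp(-errors / (errors.mean() + 1e-9))
    weights /= weights.sum()
    return particles, weights
```

In this toy version each particle is simply a deformation index plus a 6-DoF pose, and the weight comes from reprojection error; the paper's "intelligent search" and optimal parametrization (obtained via Design of Experiments) are abstracted into the jump and noise parameters.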

PAPERS

Journals:

  • Leizea, I., Álvarez, H., and Borro, D., “Real time non-rigid 3D surface tracking using particle filter”, Computer Vision and Image Understanding, Vol. 133, pp. 51-65. April, 2015. (pdf, video).

VIDEOS

RELATED PROJECTS

Real-time Deformation, Registration and Tracking of Solids Based on Physical Simulation

ABSTRACT

This paper proposes a novel approach to registering the deformations of 3D non-rigid objects for Augmented Reality applications. Our prototype handles different types of objects in real time, regardless of their geometry and appearance (with or without texture), with the support of an RGB-D camera. During an automatic offline stage, the model is processed to extract the data that serves as input to a physics-based simulation. Using its output, the deformations of the model are estimated by considering the simulated behaviour as a constraint. Furthermore, our framework incorporates a template-based tracking method to detect the object in the scene and continuously update the camera pose without any user intervention. Therefore, it is a complete solution that extends from tracking to deformation estimation for either textured or untextured objects, regardless of their geometrical shape. Moreover, our proposal focuses on providing a visually correct result at a low computational cost. Experiments with real and synthetic data demonstrate the visual accuracy and performance of our approach.
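
As a rough illustration of the simulation-as-constraint idea (not the authors' actual pipeline), the sketch below alternates a data term that pulls mesh vertices towards the RGB-D point cloud with a simple mass-spring relaxation that keeps the estimated deformation plausible. The mesh representation, the mass-spring model and all parameter names are assumptions made for this example.

```python
import numpy as np

def mass_spring_step(vertices, edges, rest_lengths, stiffness=0.5):
    """One explicit relaxation step of a simple mass-spring mesh."""
    forces = np.zeros_like(vertices)
    for (i, j), rest in zip(edges, rest_lengths):
        d = vertices[j] - vertices[i]
        length = np.linalg.norm(d) + 1e-9
        f = stiffness * (length - rest) * d / length   # spring force on vertex i
        forces[i] += f
        forces[j] -= f
    return vertices + forces

def register_frame(vertices, edges, rest_lengths, depth_points,
                   data_weight=0.3, sim_iters=5):
    """Pull the mesh towards the RGB-D point cloud, then relax it physically."""
    vertices = vertices.copy()
    # 1) Data term: move each vertex towards its nearest measured depth point.
    for k in range(len(vertices)):
        nearest = depth_points[np.argmin(
            np.linalg.norm(depth_points - vertices[k], axis=1))]
        vertices[k] = (1.0 - data_weight) * vertices[k] + data_weight * nearest
    # 2) Physics term: a few mass-spring iterations act as the simulation-based
    #    constraint that regularizes the estimated deformation.
    for _ in range(sim_iters):
        vertices = mass_spring_step(vertices, edges, rest_lengths)
    return vertices
```

The separation into an offline stage (building the simulation data, here the edges and rest lengths) and an online loop (per-frame data fitting plus a few simulation steps) mirrors the structure described in the abstract, but the concrete formulation above is only a didactic stand-in.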

PAPERS

Journals:

  • Leizea, I., Mendizabal, A., Álvarez, H., Aguinaga, I., Borro, D., and Sanchez, E., “Real-Time Visual Tracking of Deformable Objects in Robot-Assisted Surgery”, IEEE Computer Graphics and Applications, Vol. 37, No. 1, pp. 56-68. January-February, 2017. (pdf).

Conferences:

  • Leizea, I., Álvarez, H., Aguinaga, I., and Borro, D., “Real-Time Deformation, Registration and Tracking of Solids Based on Physical Simulation”, Proceedings of the 13th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2014), pp. 165-170 (ISBN: 978-1-4799-6184-9). Munich, Bavaria, Germany. September 10-12, 2014. (pdf, video).

VIDEOS

RELATED PROJECTS