Texture synthesis for stable planar tracking

Abstract

We propose a texture synthesis method that enhances the trackability of a target planar object by naturally embedding features into its texture during the design process. To transform an input object into an easy-to-track one, we extend an inpainting method so that the embedded features blend naturally into the texture. First, a featureless region in the input object is extracted by segmentation based on the feature distribution. Then, the region is filled with an inpainting method using a feature-rich region retrieved from an object database. Because the retrieval is context based, the inpainted region remains consistent with the object's context while the feature distribution improves.
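The two-stage pipeline in the abstract (feature-distribution segmentation, then filling with a feature-rich region) can be sketched in simplified form. This is an illustrative sketch, not the paper's implementation: the feature detector is replaced by a list of point coordinates, and the object database and context-based search are reduced to a single hypothetical donor texture.

```python
import numpy as np

def feature_density(points, shape, cell=8):
    """Count feature points per grid cell to build a coarse density map."""
    h, w = shape
    grid = np.zeros((h // cell, w // cell))
    for y, x in points:
        grid[min(y // cell, grid.shape[0] - 1),
             min(x // cell, grid.shape[1] - 1)] += 1
    return grid

def featureless_mask(density, thresh=1):
    """Segment cells whose feature count falls below a threshold."""
    return density < thresh

def fill_from_donor(texture, mask, donor, cell=8):
    """Fill feature-poor cells with the corresponding donor patches
    (stand-in for inpainting with a region found in an object database)."""
    out = texture.copy()
    for gy, gx in zip(*np.nonzero(mask)):
        y, x = gy * cell, gx * cell
        out[y:y + cell, x:x + cell] = donor[y:y + cell, x:x + cell]
    return out
```

In a full system, the density map would come from an actual feature detector and the donor patches from a context-consistent region of a database object, so that the filled area both looks natural and raises the feature count.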

Publication

Clément Glédel, Hideaki Uchiyama, Yuji Oyamada, and Rin-ichiro Taniguchi, "Texture synthesis for stable planar tracking," ACM Symposium on Virtual Reality Software and Technology (VRST), 2018. [paper, bib] Awarded an Honorable Mention (Poster & Demo) at VRST 2018!