Light fields from 3D Scanning
2013
Acquiring the complete appearance of real objects is a complex process that requires controlling both the lighting and the observation conditions. While this is feasible in the tightly controlled environment of a laboratory, it becomes impractical in the concrete case of an on-site measurement campaign. This is why most commercial software packages provide only colour maps as the result of the appearance reconstruction stage. By nature, however, such a simple representation cannot encode reflectance variations over the object surface; it is therefore insufficient to adequately represent the materials the object is made of, leading to an unavoidable loss of realism.
Appearance measurement always requires acquiring pictures. Shot from different viewpoints, these pictures naturally capture the material response as a function of the observer's position. We therefore propose to exploit this readily available information to encode the object colour as a light field, that is, as a function of the viewing direction. To do so, we estimate the coefficients of spherical harmonics or polynomial basis functions from the acquired colour samples. Since fitting strategies are highly sensitive to their input, the uneven distribution of samples arising from real data acquisition may lead to poor results. We therefore introduce a regularization term to condition the fitting process, so as to avoid chromatic aberrations in poorly sampled regions.
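The fitting step can be sketched as a regularized least-squares problem. The snippet below is a minimal illustration, not the paper's actual implementation: it uses a hypothetical low-order polynomial basis in the viewing direction (the spherical-harmonics case is analogous) and a simple Tikhonov term standing in for the regularization, which damps the coefficients where view directions are poorly sampled.

```python
import numpy as np

def poly_basis(dirs):
    """Quadratic polynomial basis in the unit viewing direction.

    dirs: (N, 3) array of unit view vectors.
    Returns an (N, 10) design matrix: constant, linear, and quadratic
    monomials (a hypothetical basis choice for illustration).
    """
    x, y, z = dirs[:, 0], dirs[:, 1], dirs[:, 2]
    return np.stack([np.ones_like(x), x, y, z,
                     x * x, y * y, z * z, x * y, x * z, y * z], axis=1)

def fit_view_dependent_colour(dirs, colours, lam=0.1):
    """Fit basis coefficients per colour channel by regularized least squares.

    Solves (B^T B + lam * I) C = B^T colours, a Tikhonov regularization
    that keeps the fit well conditioned when view sampling is uneven.
    colours: (N, 3) RGB samples seen from the corresponding directions.
    Returns a (10, 3) coefficient matrix (one column per channel).
    """
    B = poly_basis(dirs)
    A = B.T @ B + lam * np.eye(B.shape[1])
    return np.linalg.solve(A, B.T @ colours)

def eval_colour(coeffs, view_dir):
    """Evaluate the fitted RGB colour for one viewing direction."""
    return (poly_basis(np.atleast_2d(view_dir)) @ coeffs)[0]
```

With `lam = 0`, the system can become near-singular when all samples come from a narrow cone of directions; the regularization trades a small bias in well-sampled regions for stable, artifact-free extrapolation elsewhere.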
The coefficients are finally stored in textures and evaluated on the fly during rendering, according to the current viewpoint. The resulting digital copy exhibits reflections as the observer moves around the object, providing a better understanding of the materials it is made of.
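The rendering side amounts to a per-fragment texture fetch followed by a dot product with the basis evaluated at the current view direction. The sketch below simulates this in NumPy under the same hypothetical 10-term quadratic basis; in practice the equivalent runs in a fragment shader.

```python
import numpy as np

def basis(view_dir):
    """Quadratic polynomial basis of a unit viewing direction (10 terms)."""
    x, y, z = view_dir
    return np.array([1.0, x, y, z, x * x, y * y, z * z, x * y, x * z, y * z])

def shade(coeff_texture, uv, view_dir):
    """Per-fragment evaluation of the stored light field.

    coeff_texture: (H, W, 10, 3) array, the coefficient "textures" (one
    10-vector per texel and per RGB channel).
    uv: (u, v) texture coordinates in [0, 1]^2 (nearest-neighbour fetch,
    for simplicity).
    Returns the RGB colour seen from view_dir; it changes as the observer
    moves around the object.
    """
    h, w = coeff_texture.shape[:2]
    i = min(int(uv[1] * h), h - 1)
    j = min(int(uv[0] * w), w - 1)
    return basis(view_dir) @ coeff_texture[i, j]  # (10,) @ (10, 3) -> (3,)
```

The per-texel cost is one texture fetch plus a 10-term dot product per channel, which is why the evaluation can run on the fly at render time.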