
A Framework for Automatically Recovering Object Shape, Reflectance and Light Sources from Calibrated Images

Authors :
Alain Fournier
Daniel Meneveaux
Bruno Mercier
SIGNAL-IMAGE-COMMUNICATION (SIC)
Université de Poitiers-Centre National de la Recherche Scientifique (CNRS)
Imager Lab
Columbia University [New York]
IG
Source :
International Journal of Computer Vision, Springer Verlag, 2007, 73 (1), pp. 77-93. ⟨10.1007/s11263-006-9273-y⟩
Publication Year :
2007
Publisher :
HAL CCSD, 2007.

Abstract

More details on http://www.sic.sp2mi.univ-poitiers.fr/ibr-integration/ijcv.html. The original publication is available at www.springerlink.com.

In this paper, we present a complete framework for recovering an object's shape and estimating its reflectance properties and light sources from a set of images. The whole process is performed automatically. We use the shape-from-silhouette approach proposed by R. Szeliski [40], combined with image pixel information, to reconstruct a triangular mesh with the marching cubes algorithm. A classification process then identifies regions of the object having the same appearance. For each region, a single point or directional light source is detected, using specular lobes, Lambertian regions of the surface, or specular highlights seen in the images. An identification method jointly (i) decides which light sources are actually significant and (ii) estimates diffuse and specular coefficients for a surface represented by the modified Phong model [25]. To validate the efficiency of our algorithm, we present a case study with various objects, light sources and surface properties. As shown in the results, our system proves accurate even for images of real objects obtained with an inexpensive acquisition system.
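A minimal sketch of the modified Phong model [25] referenced in the abstract, assuming the energy-normalized form f_r = kd/pi + ks*(s+2)/(2*pi)*cos(alpha)^s and a single point light; the variable names (kd, ks, shininess, light_pos) are illustrative choices, not the paper's notation or code. The reflected radiance computed here is the quantity that a joint diffuse/specular identification would compare against observed pixel values.

import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def modified_phong_radiance(p, n, view_dir, light_pos, light_intensity,
                            kd, ks, shininess):
    # Outgoing radiance at surface point p (normal n) toward view_dir,
    # under a point light, using the energy-normalized modified Phong BRDF:
    #   f_r = kd/pi + ks * (s + 2)/(2*pi) * cos(alpha)^s
    # where alpha is the angle between the mirror direction and view_dir.
    n = normalize(n)
    v = normalize(view_dir)
    to_light = light_pos - p
    d2 = np.dot(to_light, to_light)           # squared distance to the light
    l = to_light / np.sqrt(d2)
    cos_theta = max(np.dot(n, l), 0.0)        # incident cosine
    if cos_theta == 0.0:
        return np.zeros(3)
    r = 2.0 * np.dot(n, l) * n - l            # mirror reflection of the light direction
    cos_alpha = max(np.dot(r, v), 0.0)
    brdf = kd / np.pi + ks * (shininess + 2.0) / (2.0 * np.pi) * cos_alpha ** shininess
    irradiance = light_intensity / d2 * cos_theta   # inverse-square falloff
    return brdf * irradiance

# Illustrative call: a mostly diffuse surface with a mild specular lobe.
radiance = modified_phong_radiance(
    p=np.array([0.0, 0.0, 0.0]),
    n=np.array([0.0, 0.0, 1.0]),
    view_dir=np.array([0.0, 0.3, 1.0]),
    light_pos=np.array([0.5, 0.5, 2.0]),
    light_intensity=np.array([10.0, 10.0, 10.0]),
    kd=np.array([0.6, 0.5, 0.4]),
    ks=np.array([0.3, 0.3, 0.3]),
    shininess=40.0,
)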

Details

Language :
English
ISSN :
0920-5691 and 1573-1405
Database :
OpenAIRE
Journal :
International Journal of Computer Vision, Springer Verlag, 2007, 73 (1), pp. 77-93. ⟨10.1007/s11263-006-9273-y⟩
Accession number :
edsair.doi.dedup.....4038908ff2cb68dce89e02ba481cb090
Full Text :
https://doi.org/10.1007/s11263-006-9273-y