Merging photo textures (from calibrated cameras) projected onto geometry
Posted by freakTheMighty on Stack Overflow, 2010-03-12
computer-vision | computer-graphics
I am looking for papers/algorithms for merging projected textures onto geometry. To be more specific: given a set of fully calibrated cameras/photographs and geometry, how can we define a metric for choosing which photograph should be used to texture a given patch of the geometry?
I can think of a few attributes one may seek to minimize, including the angle between the surface normal and the viewing direction of the camera, the distance of the camera from the surface, as well as image blur (equivalently, maximizing some measure of sharpness).
The question is: how do these criteria get combined, and are there well-established existing solutions?
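
For concreteness, here is a minimal sketch (Python/NumPy) of the kind of naive per-patch scoring I can come up with myself. Everything here is hypothetical: view_score, best_view, and the weights w_angle/w_dist/w_sharp are placeholders I made up, and the sketch ignores visibility/occlusion entirely.

    import numpy as np

    def view_score(patch_center, patch_normal, cam_center, sharpness,
                   w_angle=1.0, w_dist=0.1, w_sharp=0.5):
        """Naive per-view score for texturing one patch; higher is better.
        The weights are arbitrary placeholders -- choosing and combining
        them in a principled way is exactly what I'm asking about."""
        to_cam = cam_center - patch_center
        dist = np.linalg.norm(to_cam)
        # Angle term: cosine between the surface normal and the direction
        # to the camera (1 = head-on view, <= 0 = camera behind the surface).
        cos_theta = float(np.dot(patch_normal, to_cam / dist))
        if cos_theta <= 0.0:
            return -np.inf  # back-facing; never texture from this camera
        return w_angle * cos_theta - w_dist * dist + w_sharp * sharpness

    def best_view(patch_center, patch_normal, cameras):
        """Brute force: pick the camera with the highest score for a patch.
        `cameras` is a list of (cam_center, sharpness) pairs, where the
        sharpness value would come from some image-space blur metric
        (e.g. local gradient energy)."""
        scores = [view_score(patch_center, patch_normal, c, s)
                  for c, s in cameras]
        return int(np.argmax(scores))

Obviously this ad-hoc weighted sum is exactly what I would like to replace with something principled from the literature, ideally something that also handles seams between patches textured from different photographs.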