Recreating lighting based on sample image

asked 2018-02-10 09:20:02 -0600 by charles1208

updated 2018-02-10 09:22:54 -0600


I am wondering if there is any current research/implementation addressing this problem:

Say we are given two images: one of an object under (assumed) uniform lighting, and another of the same object illuminated by a fixed, single light source. Given a third image of the same object from a different perspective, is there a relatively simple way to generate similar lighting in it?

In other words, is there a way to modify the third image so that it looks the way the object would look under lighting conditions similar to those in the second image, without going through the entire process of building a 3D physics simulation?

Of course, this assumes the objects are purely solid, ignoring things like transparency and differing albedos across different portions of the object, and that we also know the perspective relations between the individual images.
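One relatively simple approach along these lines is the "ratio image" (or quotient image) idea: divide the lit image by the uniformly-lit image to get a per-pixel lighting factor, then apply that factor to the new view once it has been registered into the same perspective. A minimal NumPy sketch, assuming all three images are already warped into a common perspective (e.g. via `cv2.warpPerspective` with the known homographies) and normalized to [0, 1]; the function name and the synthetic demo data below are illustrative, not from the question:

```python
import numpy as np

def quotient_relight(uniform_view, lit_view, new_view, eps=1e-6):
    """Transfer lighting from lit_view onto new_view via a ratio image.

    All inputs are float arrays in [0, 1], assumed registered into the
    same perspective. The per-pixel ratio lit_view / uniform_view
    approximates the lighting change, which is then applied to new_view.
    """
    ratio = lit_view / (uniform_view + eps)  # eps avoids division by zero
    return np.clip(new_view * ratio, 0.0, 1.0)

# Tiny synthetic demo: a flat grey "object" under uniform light,
# the same object with a left-to-right light falloff, and a brighter
# third view (here already in the same perspective) to be relit.
uniform = np.full((4, 4), 0.5)
falloff = np.linspace(1.0, 0.2, 4)[None, :] * np.ones((4, 4))
lit = uniform * falloff
new = np.full((4, 4), 0.8)

relit = quotient_relight(uniform, lit, new)
```

Note this only holds under roughly the stated assumptions (Lambertian/solid surfaces, same albedo per pixel after registration); specularities, shadows cast from new geometry, and registration error will all break the simple ratio.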



Something like this?

LBerger ( 2018-02-10 10:49:19 -0600 )

Not exactly. What I am looking for is more like simulating how an image of an object would look under different lighting conditions, as opposed to taking different lighting conditions as inputs to model an object.

charles1208 ( 2018-02-10 21:44:22 -0600 )

Something like this, but it is not image processing.

LBerger ( 2018-02-11 02:04:20 -0600 )