Is it possible to correct for changing exposure/contrast etc in images?

I am trying to do feature detection and matching between two cameras for my application. Feature matching works well when the cameras are close to each other, but when the positions change enough that the auto-exposure of one camera makes its images noticeably darker or lighter than the other's, matching fails. I believe this is caused by the large difference in intensities between the two images. Is it possible to somehow 'normalize' the images with respect to each other so that this difference is eliminated, and then check whether matching works? (A sketch of what I mean is at the end of this question.)

Here's a pair of images where matching works (blue circles are matched features):

[image: the two camera views with matched features marked by blue circles]

And a failure case (notice that the left image is darker than the right because the camera moved into a different area):

[image: failure case with the left image noticeably darker than the right]
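
To make concrete what I mean by 'normalize', here is a minimal sketch of what I'm considering, assuming Python and OpenCV (the file names and CLAHE parameters are placeholders, not values from my actual setup):

```python
import cv2

def normalize_for_matching(gray):
    # Contrast-limited adaptive histogram equalization (CLAHE)
    # brings the local contrast of both images to a similar level,
    # which should reduce the effect of differing auto-exposure.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(gray)

# Hypothetical file names, standing in for the two camera frames.
img_left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
img_right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

left_eq = normalize_for_matching(img_left)
right_eq = normalize_for_matching(img_right)

# Then run the usual detect/describe/match pipeline, e.g. with ORB.
orb = cv2.ORB_create()
kp1, des1 = orb.detectAndCompute(left_eq, None)
kp2, des2 = orb.detectAndCompute(right_eq, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des1, des2)
print(len(matches), "matches after normalization")
```

Would something like this work, or is there a better way to compensate for the exposure difference before matching?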