
Is it possible to correct for changing exposure/contrast etc. in images?

asked 2016-02-15 11:17:56 -0600 by saihv

I am trying to do feature detection and matching between two cameras for my application. Feature matching works well when the cameras are close to each other, but when the positions change such that the auto exposure of one camera makes its images darker or lighter than the other's, matching fails. I believe this is caused by the large difference in intensities. Is it possible to somehow 'normalize' the images with respect to each other so that this difference is eliminated, and then check whether matching works?

Here's a pair of images where matching works: (blue circles are matched features)

[image]

And a failure case: (notice the left image is darker than the right because the camera moved into a different area)

[image]


Comments

You can also try this before capturing the pictures.

LBerger ( 2016-02-16 06:49:16 -0600 )

1 answer


answered 2016-02-16 03:22:32 -0600 by kbarni

You can equalize both images with equalizeHist. This will also brighten the dark areas and bring out more detail. This is the simpler solution.
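For reference, a minimal sketch of that approach in Python (the file names and the ORB/BFMatcher pipeline are placeholders, not from the original post):

    import cv2

    # equalizeHist works on single-channel 8-bit images, so load as grayscale.
    img1 = cv2.imread("cam1.png", cv2.IMREAD_GRAYSCALE)  # placeholder file name
    img2 = cv2.imread("cam2.png", cv2.IMREAD_GRAYSCALE)  # placeholder file name

    # Equalize both frames before feature detection/matching.
    eq1 = cv2.equalizeHist(img1)
    eq2 = cv2.equalizeHist(img2)

    # Then run the usual detector/matcher on the equalized images,
    # e.g. ORB + brute-force Hamming matching (just as an example pipeline).
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(eq1, None)
    kp2, des2 = orb.detectAndCompute(eq2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)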

Another solution would be to compute the histograms (H1, H2) of the two images, calculate the energy of each histogram (E1, E2), and multiply image1 by E2/E1.
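A minimal sketch of that idea, assuming "energy" here means the intensity-weighted sum of the histogram (i.e. the image's total brightness); that interpretation and the file names are assumptions:

    import cv2
    import numpy as np

    def intensity_energy(img):
        # Interpreting "energy" as the intensity-weighted sum of the histogram,
        # which equals the total brightness of the image (an assumption).
        hist = cv2.calcHist([img], [0], None, [256], [0, 256]).ravel()
        return float(np.sum(hist * np.arange(256)))

    img1 = cv2.imread("cam1.png", cv2.IMREAD_GRAYSCALE)  # placeholder file name
    img2 = cv2.imread("cam2.png", cv2.IMREAD_GRAYSCALE)  # placeholder file name

    E1, E2 = intensity_energy(img1), intensity_energy(img2)

    # Scale image1 so its overall brightness matches image2;
    # convertScaleAbs clips the result to the valid [0, 255] range.
    img1_matched = cv2.convertScaleAbs(img1, alpha=E2 / E1, beta=0)

Note that a plain multiplicative gain can saturate bright regions, which is one reason the equalizeHist approach above may be the more robust first thing to try.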



Stats

Asked: 2016-02-15 11:17:56 -0600

Seen: 1,008 times

Last updated: Feb 16 '16