Ask Your Question

Does camera rotation help with a stereo depth map?

asked 2015-03-17 15:54:43 -0500

Cerin

I'm thinking of creating a simple dual-webcam setup to test OpenCV's stereo depth map feature. All the examples and descriptions I've seen set up the cameras in a fixed position, pointing straight ahead with their lines of sight perfectly parallel.

Are there any cases where the cameras are mounted on separate movable axes so that they can rotate slightly side to side, in order to "focus" them on a specific point closer or farther away? I'm thinking of a process similar to how the human eyes rotate toward each other to fixate on objects at different depths.

Can OpenCV make use of this ability, or does it assume fixed cameras?



Be aware that you can work with generally oriented cameras if you can easily identify your feature in the images: once you have the feature's position in each image, you can back-project it into 3D and find the intersection of the two optic rays (in this case you don't need the epipolar constraint). If you can't detect your feature in both images and the cameras aren't aligned, then to find correspondences you'll have to scan the entire image, and that's where epipolar lines help, as Martin said.

David_86 ( 2015-03-18 04:10:51 -0500 )
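David_86's ray-intersection idea can be sketched in a few lines. This is a minimal illustration, not OpenCV API: in practice the two back-projected rays are skew because of noise, so a common choice is the midpoint of the shortest segment between them. The function name and the example camera geometry are my own, assumed for illustration.

```python
# Sketch of triangulation by intersecting two back-projected optic rays.
# Noise makes real rays skew, so we return the midpoint of the shortest
# segment between them (the classic "midpoint method").

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def triangulate_midpoint(p1, d1, p2, d2):
    """Midpoint of the shortest segment between rays p1 + t*d1 and p2 + s*d2."""
    w0 = [a - b for a, b in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b          # ~0 when the rays are (nearly) parallel
    if abs(denom) < 1e-12:
        raise ValueError("rays are (nearly) parallel")
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = [p + t * v for p, v in zip(p1, d1)]   # closest point on ray 1
    q2 = [p + s * v for p, v in zip(p2, d2)]   # closest point on ray 2
    return [(u + v) / 2 for u, v in zip(q1, q2)]

# Two toed-in cameras 2 m apart, both looking at a point 4 m ahead:
point = triangulate_midpoint((-1, 0, 0), (1, 0, 4), (1, 0, 0), (-1, 0, 4))
# -> [0.0, 0.0, 4.0]
```

With calibrated cameras you would build each ray from the camera center and the normalized image coordinates of the detected feature; OpenCV's `cv2.triangulatePoints` does the equivalent job from the two projection matrices.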

1 answer


answered 2015-03-17 21:08:16 -0500

OpenCV assumes fixed cameras like this:

[image: standard stereo rig — two fixed cameras side by side with parallel optical axes]

When the cameras are set up in this fashion, the epipolar lines become horizontal and parallel, which greatly simplifies the stereo correspondence problem: matching reduces to a 1-D search along each image row.

For the setup you describe, you would need to update the camera calibration (at least the extrinsic rotation between the two cameras) every time you rotate them, and re-run stereo rectification with the new parameters.
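To make the extrinsics change concrete, here is a small geometric sketch (pure Python, my own illustrative function names, not OpenCV calls): it computes the toe-in ("vergence") angle each camera would need to fixate a point at a given depth, and the corresponding rotation about the camera's vertical axis. In OpenCV terms, this rotation changes the inter-camera rotation `R` you would pass to `cv2.stereoRectify`, which is why rectification must be recomputed after every rotation.

```python
import math

def toe_in_angle(baseline, depth):
    """Vergence angle (radians) for each camera of a rig with the given
    baseline to fixate a point at `depth` straight ahead of the rig's midpoint."""
    return math.atan2(baseline / 2, depth)

def rot_y(theta):
    """Rotation matrix about the camera's vertical (y) axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def apply(R, v):
    return [sum(R[i][k] * v[k] for k in range(3)) for i in range(3)]

baseline, depth = 0.1, 1.0            # 10 cm baseline, fixate 1 m away
theta = toe_in_angle(baseline, depth)

# The left camera sits at (-baseline/2, 0, 0). After toeing in by theta,
# its optical axis (0, 0, 1) points at the fixation point (0, 0, depth):
forward = apply(rot_y(theta), [0, 0, 1])
target = [baseline / 2, 0, depth]     # direction from left camera to fixation point
```

Each distinct `theta` gives a different extrinsic rotation, so a rig that verges dynamically needs either encoders that report the angle precisely or a fresh calibration per pose.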


Question Tools

1 follower


Asked: 2015-03-17 15:54:43 -0500

Seen: 1,346 times

Last updated: Mar 17 '15