Stereo vision scale deviation

I'm using OpenCV's stereo camera calibration to calibrate and rectify a stereo camera pair. I have been using different stereo matching methods (including OpenCV's SGBM) to compute a disparity map, which I then project into a 3D point cloud using the Q matrix that OpenCV computed during stereo rectification.
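For context, the matching and reprojection step looks roughly like this (a minimal sketch; the SGBM parameters here are placeholders, not my actual settings):

#include <opencv2/calib3d.hpp>

// Rough sketch of the matching + reprojection step.
// leftRect, rightRect: rectified grayscale images; Q: 4x4 reprojection
// matrix returned by stereoRectify().
cv::Mat disparityToPointCloud(const cv::Mat& leftRect, const cv::Mat& rightRect,
                              const cv::Mat& Q)
{
    cv::Ptr<cv::StereoSGBM> sgbm = cv::StereoSGBM::create(0, 128, 5);

    cv::Mat disp16, disp32;
    sgbm->compute(leftRect, rightRect, disp16);    // fixed-point disparity (scaled by 16)
    disp16.convertTo(disp32, CV_32F, 1.0 / 16.0);  // back to real pixel disparities

    cv::Mat points3d;
    cv::reprojectImageTo3D(disp32, points3d, Q, true);  // CV_32FC3 point cloud
    return points3d;
}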

I have been doing some statistical analysis with a highly textured flat surface that I placed at known distances from the cameras. Looking only at the noise of the depth measurements and the relative errors, the results seem very accurate. Looking at the absolute errors, however, I notice a significant scale difference: the point cloud appears to be scaled approx. 1.5% larger than expected.
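The scale factor itself I estimate by comparing the measured surface distances against the known ground-truth distances at each test position, essentially like this (a simplified sketch of that comparison; the numbers below are hypothetical placeholders, not my real measurements):

#include <cstdio>
#include <vector>

// Least-squares estimate of the scale s that maps the known distances
// onto the measured ones (measured ~ s * groundTruth).
double estimateScale(const std::vector<double>& measured,
                     const std::vector<double>& groundTruth)
{
    double num = 0.0, den = 0.0;
    for (size_t i = 0; i < measured.size(); ++i)
    {
        num += measured[i] * groundTruth[i];
        den += groundTruth[i] * groundTruth[i];
    }
    return num / den;
}

int main()
{
    // Hypothetical example values in meters.
    std::vector<double> truth    = { 1.0,   1.5,   2.0  };
    std::vector<double> measured = { 1.015, 1.522, 2.03 };

    std::printf("scale factor: %.4f\n", estimateScale(measured, truth));
    return 0;
}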

This would indicate that the calibration did not measure the baseline distance or the focal length accurately, since according to the projective equation z = b*f/d an error in either parameter produces a uniform scaling of the reconstructed depth. Looking at the calibration results, the baseline distance is only 0.14% off from what I expect, so it cannot be the cause. That leaves the focal length, which I cannot easily verify with the required accuracy.
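As a rough first-order error budget (just a sketch; the split between baseline and focal length error is an assumption, and the sign of the baseline deviation is not actually known):

#include <cstdio>

int main()
{
    // First-order error propagation for z = b*f/d:
    // dz/z ~= db/b + df/f (ignoring disparity errors).
    const double scaleError    = 0.015;   // point cloud ~1.5% too large
    const double baselineError = 0.0014;  // calibrated baseline 0.14% off

    // Whatever the baseline cannot explain would have to come from the
    // focal length (assuming both deviations have the same sign).
    const double impliedFocalError = scaleError - baselineError;

    std::printf("implied focal length error: ~%.2f%%\n", impliedFocalError * 100.0);
    return 0;
}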

Does anyone have thoughts on how I could improve the calibration accuracy (perhaps by using more or fewer distortion coefficients, or different calibration flags)? This is how I call stereoCalibrate:

stereoCalibrate(objectPoints, leftPoints, rightPoints,
                cameraMatrix[0], distCoeffs[0],
                cameraMatrix[1], distCoeffs[1],
                imageSize, rotM, T, E, F,
                0,  // no calibration flags
                TermCriteria(TermCriteria::COUNT + TermCriteria::EPS, 100, 1e-5));
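One variant I have been considering (sketch only, not something I have verified to help): pre-calibrating each camera individually with calibrateCamera and then letting stereoCalibrate refine those intrinsics, possibly with the rational distortion model enabled via CALIB_USE_INTRINSIC_GUESS and CALIB_RATIONAL_MODEL:

// Calibrate each camera on its own first (per-camera rvecs/tvecs not needed).
calibrateCamera(objectPoints, leftPoints, imageSize,
                cameraMatrix[0], distCoeffs[0], noArray(), noArray());
calibrateCamera(objectPoints, rightPoints, imageSize,
                cameraMatrix[1], distCoeffs[1], noArray(), noArray());

// Then refine the pre-calibrated intrinsics jointly with the extrinsics.
stereoCalibrate(objectPoints, leftPoints, rightPoints,
                cameraMatrix[0], distCoeffs[0],
                cameraMatrix[1], distCoeffs[1],
                imageSize, rotM, T, E, F,
                CALIB_USE_INTRINSIC_GUESS | CALIB_RATIONAL_MODEL,
                TermCriteria(TermCriteria::COUNT + TermCriteria::EPS, 100, 1e-5));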