
# 2.4.3.2: How can I fix bad stereo alignment?

I set up a stereo rig with two PS3 Eye cameras that were modified for infrared. With that setup I acquired a set of 25 frames in which the asymmetric circles pattern is visible.

1. With calls to calibrateCamera I calibrate each camera individually.
2. The resulting camera matrices and distortion coefficients are then fed into stereoCalibrate.

Step 1 gives me an RMS error of ~0.4; step 2 results in an RMS error of 0.96. You can see the outcome in the image below:

This is a side-by-side rendering of the rectified images (produced with initUndistortRectifyMap and remap). What I don't understand is why the two views have different scaling. The lines shown should be epipolar lines and therefore pass through the same dots in both images.

So how do I fix the bad stereo alignment?

Here are the relevant calls and the most important flags used:

```cpp
// This is done for both cameras.
vector<Mat> rvecs, tvecs;
vector<float> reprojErrs;
double rms = calibrateCamera(objectPoints, imagePoints, imageSize,
                             cameraMatrix, distCoeffs, rvecs, tvecs,
                             s.flag | CV_CALIB_FIX_K4 | CV_CALIB_FIX_K5);

// This is done with the resulting Mats from above.
Mat map1_1, map1_2, map2_1, map2_2;
Mat R, T, E, F;
double rms = stereoCalibrate(objectPoints, imagePoints1, imagePoints2,
                             cameraMatrix1, distCoeffs1,
                             cameraMatrix2, distCoeffs2,
                             imageSize, R, T, E, F,
                             TermCriteria(CV_TERMCRIT_ITER + CV_TERMCRIT_EPS, 100, 1e-5),
                             CV_CALIB_FIX_ASPECT_RATIO +
                             CV_CALIB_ZERO_TANGENT_DIST +
                             CV_CALIB_SAME_FOCAL_LENGTH +
                             CV_CALIB_RATIONAL_MODEL +
                             CV_CALIB_FIX_K3 + CV_CALIB_FIX_K4 + CV_CALIB_FIX_K5);

// Undistortion maps are calculated for both cameras.
initUndistortRectifyMap(cameraMatrix1, distCoeffs1, R1, P1, imageSize, CV_16SC2, map1_1, map1_2);
initUndistortRectifyMap(cameraMatrix2, distCoeffs2, R2, P2, imageSize, CV_16SC2, map2_1, map2_2);

// Finally, from the captured views v1 and v2 the rectified views r1 and r2 are calculated.
remap(v1, r1, map1_1, map1_2, CV_INTER_LINEAR);
remap(v2, r2, map2_1, map2_2, CV_INTER_LINEAR);
```


Things I noticed:

In this answer @Michael_Koval noted that the order of the pattern points plays a role. In my workflow the pattern points are generated in two different steps, but given the low resulting RMS I doubt this is a problem. Is it?

When I use initUndistortRectifyMap with R1 and P1, does this mean the image points will be warped into a common point space, or will the points be transformed into the view of the other camera? The example I used as a reference (samples/cpp/stereo_calib.cpp) makes the same calls, so I don't quite understand my results.

Update: stereoRectify outputs the translation vector with an accuracy of <1%, so I assume the internal calculations are correct, which leaves me even more puzzled. The ROIs for both views are also calculated correctly.


## 2 answers


After debugging for quite a while I figured out that there was an error in my workflow. To modularize the different steps for better command-line usage, I had split the calculation of the camera matrices from the stereo calibration (which is fine in itself). What I missed is that stereoCalibrate modifies the camera matrices and distortion coefficients in place.

When I later read the matrices back in for display, I was using the original, pre-stereoCalibrate matrices, which is plainly wrong.

Update: After changing my setup I was stuck with bad results a second time. In the original setup the camera views were aligned at about 1 m distance from the cameras (aligned meaning that both cameras capture roughly the same area). At this distance I was using the calibration target (an A4 sheet with the asymmetric grid pattern).

I then changed the setup to cover a larger capture area, so the cameras were aligned at about 2.5 m distance. Now the calibration target was quite small relative to the distance from the cameras. So I tried two things:

1. Acquiring more pictures
2. Using the target at a smaller distance.

The first approach led to very high errors even for a single camera: the target just didn't occupy enough of the image, so the optimization did not converge well. So I tried the second approach. This time the reported error was good, but when I displayed the rectification it was nonsense, as before.

I then remembered a picture from the documentation showing someone holding a ridiculously large calibration target. After putting two and two together I figured out my error.

For stereoRectify() it is crucial that the whole field of view of both cameras is covered by a known pattern. Because of the separation between the cameras there is only one plane where the two fields of view overlap maximally. The calibration pattern therefore has to cover a large enough area of the field of view at that plane of maximum overlap.

Update 2: When you specify the size of the grid, don't forget that for the asymmetric pattern the grid size is 0.5 times the distance between the dots. Use meters as the unit for all values; it avoids confusion about the numbers.

Hope this helps you - it took me too much time to figure it out.


## Comments

Thanks for the solution… :)

(2013-02-13 07:06:43 -0600)

again, thanks for the update

(2013-03-22 06:08:18 -0600)

Thank you so much for explaining so well!! It solved my biggest problem.

(2013-10-31 04:41:20 -0600)

I'm trying to do stereo calibration with the asymmetric circles pattern. The images are rectified properly, but the reprojection error is around 10 and the triangulation measurement is off by about 0.5 centimeters. Any suggestions? All I changed in the original stereo_calib code was the line that detects the pattern; is there something else I should change? Thanks in advance.



## Stats

Asked: 2013-02-11 04:22:34 -0600

Seen: 4,587 times

Last updated: May 12 '13