OpenCV Q&A Forum
http://answers.opencv.org/questions/
Copyright <a href="http://www.opencv.org">OpenCV foundation</a>, 2012-2018.

**OpenCV Stereo Calibration and triangulation in a user defined coordinate system**
http://answers.opencv.org/question/33450/opencv-stereo-calibration-and-triangulation-in-a-user-defined-coordinate-system/

How do you calibrate stereo cameras so that the output of the triangulation is in a real-world coordinate system defined by known points?
OpenCV stereo calibration returns results based on the pose of the left hand camera being the reference coordinate system.
I am currently doing the following:
Intrinsically calibrating both the left and right camera using a chess board. This gives the Camera Matrix A, and the distortion coefficients for the camera.
Running stereo calibrate, again using the chessboard, for both cameras. This returns the extrinsic parameters, but they are relative to the cameras and not the coordinate system I would like to use.
How do I calibrate the cameras so that known 3D point locations, together with their corresponding 2D pixel locations in both images, provide a way of calibrating the extrinsics such that the output of triangulation is in my coordinate system?

*cv_user, Thu, 15 May 2014 16:58:31 -0500*

**Incorrect stereo camera calibration**
http://answers.opencv.org/question/35448/incorrect-stereo-camera-calibration/

Hi,
I'm trying to calibrate a stereo camera setup using OpenCV's standard `calibrateCamera()` and `stereoCalibrate()` functions. The cameras are mounted in a smartphone, approximately 35 mm apart from each other and they face the same direction.
The problem is that stereo reconstruction using the results of the calibration gives huge distortions of the reconstructed scene. I think this may be caused by wrong extrinsic parameters (relative rotation and translation of the cameras). The translation in particular looks suspicious to me. As far as I understand, the length of the translation vector should be comparable to the distance between the cameras. However, the length of the `T` vector resulting from my calibration varies between 16.7 and 23.4 mm, depending on which of the following approaches I use:
1) Checkerboard calibration pattern, square size 19x19 mm, 12x10 corners:
a) Independent intrinsic calibration of the cameras with `calibrateCamera()`, then stereo calibration with `stereoCalibrate()` keeping the camera matrices and distortion coefficients found in the intrinsic calibration fixed (`CV_CALIB_FIX_INTRINSIC`)
b) Independent intrinsic calibration of the cameras with `calibrateCamera()`, then stereo calibration with `stereoCalibrate()` using the cameras' matrices and distortions from intrinsic calibration as the initial values for optimization (`CV_CALIB_USE_INTRINSIC_GUESS`)
2) Asymmetric circle grid, 9x3 circles, grid size 30.2 x 30.2 mm - the same two options a and b, as in point 1.
3) One-step calibration using `stereoCalibrate()` only.
Have you got any experience with this problem? The detailed results of the calibration are attached in the PDF [here](http://goo.gl/5ovtDB). I also attach [the original images I use](http://goo.gl/iBNQ1q), should someone want to check the calibration themselves.
Update 2014-06-30:
**Matthieu**, thank you for your response.
Yes, I am pretty sure that the corners on the checkerboard are correctly detected, because I do the detection semi-manually, using Bouguet's Camera Calibration Toolbox for Matlab (afterwards I import the extracted corner positions into my OpenCV calibration program). The corners are detected with sub-pixel accuracy.
I repeated the whole calibration with a new checkerboard pattern, using 85 images of it. I also did my best to get rid of the blur. The results are not much better. The relative camera translation vector, which is a sanity-check parameter for me, is now `[-23.41713024477083; 0.05333133785688135; -7.846262984094929]`. And the reconstruction is still distorted. As before, I enclose the [detailed results](https://www.dropbox.com/s/1fltatgzums4jhl/calibration%20results%20new.pdf) and the [images](https://www.dropbox.com/s/cgr4izcm7vl5zvw/calibration-images-new.zip).
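The sanity check described here — comparing the length of the recovered translation against the physically measured 35 mm baseline — is a one-liner on the vector quoted above:

```python
import numpy as np

# Relative camera translation from the stereo calibration quoted above (mm)
T = np.array([-23.41713024477083, 0.05333133785688135, -7.846262984094929])

baseline = np.linalg.norm(T)  # length of the recovered translation vector
# This comes to roughly 24.7 mm, versus the ~35 mm measured between the
# cameras, so the extrinsics really are inconsistent with the hardware.
```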
Is the distortion model of the calibration procedure incompatible with my camera setup? Or is it perhaps the short baseline that makes the optimization algorithm get stuck in some local minimum? What do you think?
PS. By the way, the results with the Camera Calibration Toolbox for Matlab are no better.

*aardvark, Sat, 21 Jun 2014 05:36:54 -0500*

**bad rectification and high rms returned by stereoCalibration**
http://answers.opencv.org/question/33942/bad-rectification-and-high-rms-returned-by-stereocalibration/

Hi, I have some problems computing the rectification of stereo pairs: stereoCalibrate returns a high RMS error and I obtain badly rectified pairs.
I tried both my own rectification program and the stereo_calib.cpp sample provided with OpenCV. They both return similar RMS errors. Moreover, when I ran my program on the sample stereo pairs in opencv/samples/cpp, I got correctly rectified images, so I believe the problem is in the way I take the stereo pictures. Is that possible?
I use the stereo camera of an HTC Evo 3D (a 3D smartphone), taking pictures of a chessboard pattern. I tried changing the number and the set of pictures used as input, but the smallest stereoCalibrate RMS I got was around 1.5, and the rectified images are totally wrong.
Is there any "suggested" way to take a set of pictures for calibration?
Thanks, Andrea

*beco, Fri, 23 May 2014 05:09:09 -0500*

**Stereo calibration: problem with projection matrices and SGBM**
http://answers.opencv.org/question/27508/stereo-calibration-problem-with-projection-matrices-and-sgbm/

I am trying to create a depth map from a disparity map computed with SGBM. The resulting disparity map has almost no definition, so I set up sliders for the various SGBM parameters, but I get only very limited results - they are basically blank.
After calibrating the cameras individually, followed by the stereo calibration, I can rectify images, and that comes out rather well:
![rectified image](/upfiles/13911306844204794.jpg)
Something I've noticed is that the projection matrices and the Q matrix have several NaN values. Also, the ROI for each camera is all zeros, which can't be right. Below are the saved calibration settings I am using, showing the NaN values.
Since the RMS values for each single-camera calibration are under 0.4, and the stereo RMS is less than 2.0 with a re-projection error of less than 4.0, I'm not sure what may be wrong at this point. Suggestions and help would be greatly appreciated.
    Camera_0_Matrix
    8.2360702625110662e+002 0. 3.1410424020211406e+002
    0. 8.1860608380736392e+002 2.4384904263053622e+002
    0. 0. 1.
    Camera_1_Matrix
    8.1134698384868875e+002 0. 3.1328331721626449e+002
    0. 8.0724997468627953e+002 2.3930244099756220e+002
    0. 0. 1.
    Camera_0_Distortion_Coefficients
    2.6712887348238189e-002 -1.8664642728347297e-001
    -4.8596016262543446e-004 -1.4728080493686517e-003
    5.2687479496305778e-001 0. 0. 0.
    Camera_1_Distortion_Coefficients
    3.9590803660099684e-002 -1.8981006469293660e-001
    -6.4688587632222173e-003 -3.7797348771147284e-003
    2.7996253215102412e-001 0. 0. 0.
    Mat_R
    9.9922618707330479e-001 -1.4827571495127988e-002 -3.6430347109269276e-002
    1.4533977386417006e-002 9.9985983789215160e-001 -8.3107202823446188e-003
    3.6548468754191266e-002 7.7748114984945516e-003 9.9930163701351293e-001
    Mat_T
    -7.1373089613768613e+001 -6.5822496464438818e-001 -6.3515467302110000e+000
    Mat_E
    6.8256121992123484e-002 6.3455389090094831e+000 -7.1055121292732304e-001
    -3.7380546856077390e+000 6.4909033105906033e-001 7.1554634341813454e+001
    -3.7961924878715458e-001 -7.1372845688807843e+001 5.6918241952876536e-001
    Mat_F
    7.2342254670288030e-008 6.7664989166222400e-006 -2.2929759854297700e-003
    -3.9819395869095442e-006 6.9566352175857590e-007 6.3859013286650543e-002
    6.0378280400693674e-004 -6.4036046720140641e-002 1.
    Camera_0_Rotation_Matrix
    9.9862396895832117e-001 -4.8951125057362125e-003 5.2213087396614198e-002
    5.1174999253298069e-003 9.9997839183290060e-001 -4.1263860459532032e-003
    -5.2191760043559916e-002 4.3879084815174659e-003 9.9862744126261294e-001
    Camera_1_Rotation_Matrix
    9.9602166255538394e-001 9.1856234214931377e-003 8.8636742058627288e-002
    -9.5633854847882640e-003 9.9994690362209304e-001 3.8381759545177280e-003
    -8.8596779729730887e-002 -4.6705737278212095e-003 9.9605662307048315e-001
    Camera_0_Projection_Matrix
    .Nan 0. .Nan 0.
    0. .Nan .Nan 0.
    0. 0. 1. 0.
    Camera_1_Projection_Matrix
    .Nan 0. .Nan .Nan
    0. .Nan .Nan 0.
    0. 0. 1. 0.
    Mat_Q
    1. 0. 0. .Nan
    0. 1. 0. .Nan
    0. 0. 0. .Nan
    0. 0. 1.3955142868906164e-002 .Nan
    ROI_0
    0 0 0 0
    ROI_1
    0 0 0 0
*jim_73_mk1, Thu, 30 Jan 2014 19:52:39 -0600*

**stereoCalibrate problem (solved)**
http://answers.opencv.org/question/30678/stereocalibrate-problem-solved/

SOLUTION: adjust the flags parameter.
I'm working with two cameras, and I ran calibrateCamera; it returned correct distCoeffs, cameraMatrix, tvecs and rvecs values for each camera. The results seemed to be correct:
![image description](/upfiles/1395866231814628.png)
Then I realised that for 3D triangulation of points I would need stereoCalibrate instead, so I tried it, but it doesn't work: it returns zero values for every output parameter. I verified beforehand that the imagePoints for each camera were correct, but see the results:
![image description](/upfiles/13958664195349502.png)
As you can see, stereoCalibrate returned 0 values for every output parameter.
How can I solve this? Should I use an alternative to stereoCalibrate?
PS: I'm using the default flags and criteria parameters.
I'm using version 2.4.6 with mingw with the following documentation
http://docs.opencv.org/2.4.6/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html#stereocalibrate
*dberga, Wed, 26 Mar 2014 15:41:58 -0500*

**Pose from Fundamental matrix and vice versa**
http://answers.opencv.org/question/25554/pose-from-fundamental-matrix-and-vice-versa/

I have computed the fundamental matrix between two cameras using OpenCV's [findFundamentalMat][1]. Then I plot the epipolar lines in the image and get something like:
![Epipolar lines ok][2]
Now, I tried to get the pose from that fundamental matrix, computing first the essential matrix and then using the Hartley & Zisserman approach.
    K2 = np.mat(self.calibration.getCameraMatrix(1))
    K1 = np.mat(self.calibration.getCameraMatrix(0))
    E = K2.T * np.mat(F) * K1
![H&Z][3]
    w, u, vt = cv2.SVDecomp(np.mat(E))
    if np.linalg.det(u) < 0:
        u *= -1.0
    if np.linalg.det(vt) < 0:
        vt *= -1.0
    # Find R and T from Hartley & Zisserman
    W = np.mat([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
    R = np.mat(u) * W * np.mat(vt)
    T = u[:, 2]  # u3, the last column of u
In order to check everything until here was correct, I recompute E and F and plot the epipolar lines again.
    S = np.mat([[0, -T[2], T[1]], [T[2], 0, -T[0]], [-T[1], T[0], 0]])
    E = S * np.mat(R)
    F = np.linalg.inv(K2).T * np.mat(E) * np.linalg.inv(K1)
But surprise, the lines have moved and they don't go through the points anymore. Have I done something wrong?
![epilines bad][5]
It looks similar to this question [http://answers.opencv.org/question/18565/pose-estimation-produces-wrong-translation-vector/][4]
The matrices I get are:
    Original F = [[ -1.62627683e-07  -1.38840952e-05   8.03246936e-03]
                  [  5.83844799e-06  -1.37528349e-06  -3.26617731e-03]
                  [ -1.15902181e-02   1.23440336e-02   1.00000000e+00]]
    E = [[-0.09648757 -8.23748182 -0.6192747 ]
         [ 3.46397143 -0.81596046  0.29628779]
         [-6.32856235 -0.03006961 -0.65380443]]
    R = [[  9.99558381e-01  -2.72074658e-02   1.19497464e-02]
         [  3.50795548e-04   4.12906861e-01   9.10773189e-01]
         [ -2.97139627e-02  -9.10366782e-01   4.12734058e-01]]
    T = [[-8.82445166e-02]
         [ 8.73204425e-01]
         [ 4.79298380e-01]]
    Recomputed E =
        [[-0.0261145  -0.99284189 -0.07613091]
         [ 0.47646462 -0.09337537  0.04214901]
         [-0.87284976 -0.01267909 -0.09080531]]
    Recomputed F =
        [[ -4.40154169e-08  -1.67341327e-06   9.85070691e-04]
         [  8.03070680e-07  -1.57382143e-07  -4.67389530e-04]
         [ -1.57927152e-03   1.47100268e-03   2.56606003e-01]]
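Two things are worth noting about the H&Z decomposition used in the question: an E built as K2ᵀ·F·K1 from a noisy F rarely has the two equal singular values a valid essential matrix must have (so it should be projected onto diag(1,1,0) first), and the decomposition yields four (R, t) candidates, only one of which is the true pose. A numpy sketch, independent of the poster's data, that enumerates all four on a synthetic pose:

```python
import numpy as np

def skew(v):
    """Cross-product matrix [v]_x."""
    return np.array([[0., -v[2], v[1]],
                     [v[2], 0., -v[0]],
                     [-v[1], v[0], 0.]])

def decompose_essential(E):
    """Return the four Hartley & Zisserman (R, t) candidates for E."""
    U, s, Vt = np.linalg.svd(E)  # implicitly projects onto diag(s1, s2, 0)
    # Keep U, V proper rotations so every R candidate has det = +1
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
    t = U[:, 2]
    return [(U @ W @ Vt, t), (U @ W @ Vt, -t),
            (U @ W.T @ Vt, t), (U @ W.T @ Vt, -t)]

# Synthetic check: build E from a known pose, decompose, and confirm that the
# true pose appears as exactly ONE of the four candidates -- re-plotting
# epipolar lines from an untested candidate can fail just as in the question.
a = 0.3
R_true = np.array([[np.cos(a), -np.sin(a), 0.],
                   [np.sin(a),  np.cos(a), 0.],
                   [0., 0., 1.]])
t_true = np.array([1.0, 0.2, 0.1])
E = skew(t_true) @ R_true

t_hat = t_true / np.linalg.norm(t_true)  # t is only recoverable up to scale
matches = [(R, t) for R, t in decompose_essential(E)
           if np.allclose(R, R_true, atol=1e-6) and np.allclose(t, t_hat, atol=1e-6)]
```

The physically correct candidate among the four is conventionally selected by the cheirality test: triangulate a point and keep the (R, t) for which it lands in front of both cameras.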
[1]: http://docs.opencv.org/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html#findfundamentalmat
[2]: http://i.stack.imgur.com/zd0mK.png
[3]: http://i.stack.imgur.com/FT9yb.png
[4]: http://answers.opencv.org/question/18565/pose-estimation-produces-wrong-translation-vector/
[5]: http://i.stack.imgur.com/y9BkC.png

*Josep Bosch, Tue, 17 Dec 2013 03:47:49 -0600*

**Rectification problem stereo vision**
http://answers.opencv.org/question/25320/rectification-problem-stereo-vision/

Hello everybody.
I have a problem with my code. I am trying to create a 3D model with stereo vision. With one of my two test datasets everything works fine and I am able to build the 3D model. With the second dataset I have some problems in the rectification.
Here, for example, is the normal left image:
![image description](/upfiles/13867771521519052.png)
Here is the rectified image:
![image description](/upfiles/13867770243066246.png)
And here the code for calibration and rectification:
    cv::stereoCalibrate(objectPoints, imagePoints_l, imagePoints_r,
                        cameraMatrix_l, distCoeffs_l, cameraMatrix_r, distCoeffs_r,
                        imageSize, R, T, E, F,
                        TermCriteria(CV_TERMCRIT_ITER + CV_TERMCRIT_EPS, 100, 1e-5),
                        CV_CALIB_FIX_INTRINSIC);
where I set CV_CALIB_FIX_INTRINSIC because I calibrated the two cameras separately.
    cv::stereoRectify(cameraMatrix_l, distCoeffs_l, cameraMatrix_r, distCoeffs_r,
                      imageSize, R, T, R1, R2, P1, P2, Q, 0, 0);
For stereo rectification.
    remap(leftImg, undistortedLeftImg, map1_l, map2_l, INTER_LINEAR);
    remap(rightImg, undistortedRightImg, map1_r, map2_r, INTER_LINEAR);
For remapping
Is it possible that the wrong rectification depends on low-quality calibration images?
Thank you very much!!

*ed3dalltime, Wed, 11 Dec 2013 09:53:33 -0600*