How to use stereo camera calibration results to check the accuracy of the camera at long distances?

asked 2018-08-27 22:45:40 -0600 by astronaut, updated 2018-08-28 03:46:06 -0600

Hi

I'm using two Point Grey Chameleon3 mono cameras set up as master/slave and synchronized so they can work as a stereo camera. I then followed the OpenCV tutorial to calibrate the cameras. I collected 70 samples for the calibration, and these are the calibration results:

Left:
('D = ', [-0.20826996106865595, 0.18366155924086058, -0.0034661778577718466, 0.00307931151347718, 0.0])
('K = ', [432.82205088588205, 0.0, 272.95231180581044, 0.0, 435.6996693192078, 174.95641222266673, 0.0, 0.0, 1.0])
('R = ', [0.9746296173669449, 0.02700939091034212, -0.22218821245470002, -0.026808041682390916, 0.9996329035169494, 0.003922640364378138, 0.22221259607033478, 0.0021333093834195356, 0.974995903139473])
('P = ', [482.0586696457318, 0.0, 402.53031158447266, 0.0, 0.0, 482.0586696457318, 178.41748809814453, 0.0, 0.0, 0.0, 1.0, 0.0])

Right:
('D = ', [-0.20871659658963718, 0.13988041114304747, -0.0024096479983088267, 0.0031211255518143266, 0.0])
('K = ', [428.59279077571426, 0.0, 275.84270706306677, 0.0, 430.39539990687126, 189.6284029604295, 0.0, 0.0, 1.0])
('R = ', [0.9744460995294874, 0.030491070431326987, -0.22254234144476925, -0.03069272631969819, 0.9995256063000713, 0.002553213962635353, 0.2225146189867814, 0.004342461793354892, 0.9749196825188937])
('P = ', [482.0586696457318, 0.0, 402.53031158447266, -71.0404082822227, 0.0, 482.0586696457318, 178.41748809814453, 0.0, 0.0, 0.0, 1.0, 0.0])
('self.T ', [-0.14360295357921507, -0.004493432498569846, 0.03279579808809728])
('self.R ', [0.999992392164315, -0.003887570979931969, 0.0003200083856870595, 0.0038871258997246238, 0.9999914930203251, 0.0013799055115972527, -0.0003253701440041504, -0.0013786511006187285, 0.9999989967272028])


[image]

width
512

height
384

[narrow_stereo/left]

camera matrix
432.822051 0.000000 272.952312
0.000000 435.699669 174.956412
0.000000 0.000000 1.000000

distortion
-0.208270 0.183662 -0.003466 0.003079 0.000000

rectification
0.974630 0.027009 -0.222188
-0.026808 0.999633 0.003923
0.222213 0.002133 0.974996

projection
482.058670 0.000000 402.530312 0.000000
0.000000 482.058670 178.417488 0.000000
0.000000 0.000000 1.000000 0.000000


[image]

width
512

height
384

[narrow_stereo/right]

camera matrix
428.592791 0.000000 275.842707
0.000000 430.395400 189.628403
0.000000 0.000000 1.000000

distortion
-0.208717 0.139880 -0.002410 0.003121 0.000000

rectification
0.974446 0.030491 -0.222542
-0.030693 0.999526 0.002553
0.222515 0.004342 0.974920

projection
482.058670 0.000000 402.530312 -71.040408
0.000000 482.058670 178.417488 0.000000
0.000000 0.000000 1.000000 0.000000

Why does the right camera also have these T and R values?

('self.T ', [-0.14360295357921507, -0.004493432498569846, 0.03279579808809728])
('self.R ', [0.999992392164315, -0.003887570979931969, 0.0003200083856870595, 0.0038871258997246238, 0.9999914930203251, 0.0013799055115972527, -0.0003253701440041504, -0.0013786511006187285, 0.9999989967272028])

Then I captured some raw (not rectified) images of the same chessboard that I used to calibrate the cameras, at different distances such as 4 m, 8 m, 10 m, 20 m, 30 m and 40 m. I measured the distance from the camera to the chessboard very accurately with a laser rangefinder.

My question is how to use these calibration results, knowing the distance to the object, to see how accurate the camera is, i.e., to obtain the depth from the image knowing ...


Comments

You'll need to take the left and right images, undistort them using the camera matrices and distortion coefficients, and rectify them into epipolar form using the translation and rotation matrices. Then you could use the calib3d module's StereoBM class or related classes to create a disparity map. For details of the math and the methods, consult Chapter 18 of Learning OpenCV 3, Kaehler and Bradski, O'Reilly Media, 2017.

opalmirror ( 2018-08-28 14:03:46 -0600 )
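
A minimal sketch of that pipeline in Python/OpenCV, using the K and D values and the inter-camera self.R/self.T posted above; the file names left.png/right.png and the numDisparities/blockSize settings are assumptions, not the poster's code:

import cv2
import numpy as np

# Intrinsics and distortion from the calibration output above (left/right).
K_l = np.array([[432.822051, 0.0, 272.952312],
                [0.0, 435.699669, 174.956412],
                [0.0, 0.0, 1.0]])
D_l = np.array([-0.208270, 0.183662, -0.003466, 0.003079, 0.0])
K_r = np.array([[428.592791, 0.0, 275.842707],
                [0.0, 430.395400, 189.628403],
                [0.0, 0.0, 1.0]])
D_r = np.array([-0.208717, 0.139880, -0.002410, 0.003121, 0.0])

# Inter-camera rotation/translation (self.R, self.T above); the units of T follow
# whatever square size was used during calibration (metres assumed here).
R = np.array([ 0.99999239, -0.00388757,  0.00032001,
               0.00388713,  0.99999149,  0.00137991,
              -0.00032537, -0.00137865,  0.99999900]).reshape(3, 3)
T = np.array([-0.14360295, -0.00449343,  0.03279580])

size = (512, 384)  # image width, height from the calibration file

# Rectification: R1/R2 rotate each camera into the common rectified frame,
# P1/P2 are the new projection matrices, Q maps disparity to depth.
R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(K_l, D_l, K_r, D_r, size, R, T)

map1x, map1y = cv2.initUndistortRectifyMap(K_l, D_l, R1, P1, size, cv2.CV_32FC1)
map2x, map2y = cv2.initUndistortRectifyMap(K_r, D_r, R2, P2, size, cv2.CV_32FC1)

img_l = cv2.imread('left.png', cv2.IMREAD_GRAYSCALE)   # hypothetical file names
img_r = cv2.imread('right.png', cv2.IMREAD_GRAYSCALE)
rect_l = cv2.remap(img_l, map1x, map1y, cv2.INTER_LINEAR)
rect_r = cv2.remap(img_r, map2x, map2y, cv2.INTER_LINEAR)

# Block matching; numDisparities must be a multiple of 16, and StereoBM returns
# fixed-point disparities scaled by 16.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(rect_l, rect_r).astype(np.float32) / 16.0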

OK, thanks. First, I would just like to compute the depth from the camera to the object (chessboard) and see how accurate it is compared to the ground-truth measurement. Do I need everything you mentioned for that? And I don't understand what the disparity map will give me.

astronaut ( 2018-08-28 23:15:26 -0600 )

The API and formulas are described here: Calib3D API

For more detail, you'll need to read the books or papers and study the theory. The samples directory of the opencv source code is also helpful.

If your stereo depth subject is always a checkerboard, or a checkerboard ArUco board (ChArUco), or anything where your code algorithmically identifies the same physical feature points of the subject in the left and right images, then there is no need to do stereo matching (which compares left and right images of a general photographic subject). You will still have to use the calibration matrices to relate a point in real 3D space to a point in your left and right images.

Good luck!

opalmirror ( 2018-08-29 17:55:25 -0600 )
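
A rough sketch of that checkerboard-only shortcut, reusing K_l, D_l, K_r, D_r and the R1/R2/P1/P2 rectification output from the earlier sketch; the 9x6 inner-corner pattern size and the file names are assumptions:

import cv2
import numpy as np

pattern = (9, 6)  # assumed inner-corner count of the board; adjust to the real one

gray_l = cv2.imread('left_40m.png', cv2.IMREAD_GRAYSCALE)   # hypothetical names
gray_r = cv2.imread('right_40m.png', cv2.IMREAD_GRAYSCALE)

ok_l, corners_l = cv2.findChessboardCorners(gray_l, pattern)
ok_r, corners_r = cv2.findChessboardCorners(gray_r, pattern)
assert ok_l and ok_r, "board not detected in one of the images"

# Undistort and rectify only the detected corner points, not the whole image.
pts_l = cv2.undistortPoints(corners_l, K_l, D_l, R=R1, P=P1)
pts_r = cv2.undistortPoints(corners_r, K_r, D_r, R=R2, P=P2)

# Triangulate the matched corners in the rectified frame (homogeneous 4xN output).
X_h = cv2.triangulatePoints(P1, P2,
                            pts_l.reshape(-1, 2).T,
                            pts_r.reshape(-1, 2).T)
X = (X_h[:3] / X_h[3]).T  # Nx3 points, same units as the calibration T

print("mean distance to the board corners:", np.linalg.norm(X, axis=1).mean())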

My stereo depth subject is not always a checkerboard. Here I only want to check the accuracy of the stereo camera and from how far away I can detect the smallest object, because I will be doing object detection and collision avoidance at distances of around 30-40 m from the camera to the object. I also have a question: if I need to get the distance to a certain point (or small object) in the image, do I have to undistort and rectify the whole image? Do I need to rectify certain points after I undistort the images, or what is the procedure for this case? Do I need to project a point in real 3D space onto a point in my undistorted and rectified left and right images? Can you please help with this?

astronaut ( 2018-08-31 04:37:45 -0600 )

I rectified the images and created a disparity map with the calib3d module's StereoBM. But now the final step: how do I find the distance from the camera to the checkerboard in meters? I don't know how to go from the disparity to the depth. Any help?

astronaut ( 2018-09-05 05:04:59 -0600 )
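
A hedged sketch of that last step, reusing the disparity map and the Q matrix from the first sketch above; the fx and baseline numbers are read off the projection matrices posted in the question, and the sample pixel coordinates are made up:

import cv2
import numpy as np

# Option 1: reproject the whole disparity map; Q comes from cv2.stereoRectify.
points_3d = cv2.reprojectImageTo3D(disparity, Q)  # HxWx3, same units as T
depth = points_3d[:, :, 2]                        # Z per pixel (invalid where d <= 0)

# Option 2: the underlying relation Z = fx * B / d for a single pixel.
fx = 482.058670                      # rectified focal length (pixels), from P
baseline = 71.040408 / 482.058670    # B = -P_right[0,3] / fx, roughly 0.147 m
v, u = 192, 256                      # example pixel, e.g. a chessboard corner
d = disparity[v, u]
if d > 0:
    Z = fx * baseline / d            # depth along the optical axis, in metres
    print("depth at (%d, %d): %.2f m" % (u, v, Z))

Comparing Z (or the mean corner distance from the triangulation sketch) against the laser rangefinder readings at 4-40 m gives the accuracy figure being asked about; note that for a fixed disparity error the depth error grows roughly quadratically with distance.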