
Stereo Camera Calibration - World Origin

asked 2020-07-20 13:24:52 -0600

ConnorM

Hello, I have some questions about stereo calibration and I also wanted to verify that my understanding so far is correct.

I am working on a project where I want to get 3D coordinates from a stereo camera setup. I am working in C++ using the opencv_contrib aruco library for ChArUco board detection. So far I have calibrated both cameras' intrinsic parameters (a camera matrix and distortion coefficients for each) by detecting and refining ChArUco board corners, then calling cv::aruco::calibrateCameraCharuco. Next, I create a stereo pair by gathering new images in which both cameras can see the calibration board. I detect the ChArUco board corners from each camera's viewpoint and run cv::stereoCalibrate. Running stereoCalibrate gives me the extrinsic parameters (a rotation matrix and a translation vector) for the transformation from camera 1's coordinate system to camera 2's coordinate system (correct me if I'm wrong!).
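For reference, here is a rough sketch of those two calls. The container names (allCorners1, allIds1, stereoObjPts, stereoImgPts1/2) and the board dimensions are placeholders, not taken from your setup; they are assumed to be filled by the detection loops you describe.

    #include <opencv2/aruco/charuco.hpp>
    #include <opencv2/calib3d.hpp>
    #include <vector>

    // Assumed to be filled by the ChArUco detection loops described above:
    std::vector<std::vector<cv::Point2f>> allCorners1;   // ChArUco corners per image, camera 1
    std::vector<std::vector<int>>         allIds1;       // matching corner ids per image, camera 1
    std::vector<std::vector<cv::Point3f>> stereoObjPts;  // 3D board corners visible in BOTH views
    std::vector<std::vector<cv::Point2f>> stereoImgPts1, stereoImgPts2; // their 2D detections
    cv::Size imageSize(1920, 1080);                      // example resolution

    // Example board; use your real square/marker sizes and dictionary.
    cv::Ptr<cv::aruco::CharucoBoard> board = cv::aruco::CharucoBoard::create(
        5, 7, 0.04f, 0.02f, cv::aruco::getPredefinedDictionary(cv::aruco::DICT_5X5_100));

    // Per-camera intrinsics (repeat for camera 2 to get K2, D2).
    cv::Mat K1, D1, K2, D2;
    double rms1 = cv::aruco::calibrateCameraCharuco(allCorners1, allIds1, board,
                                                    imageSize, K1, D1);

    // Stereo extrinsics: R and T take a point expressed in camera 1's frame
    // and express it in camera 2's frame.
    cv::Mat R, T, E, F;
    double rmsStereo = cv::stereoCalibrate(stereoObjPts, stereoImgPts1, stereoImgPts2,
                                           K1, D1, K2, D2, imageSize, R, T, E, F,
                                           cv::CALIB_FIX_INTRINSIC);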

Now that I have the intrinsic parameters for each camera and the extrinsic parameters relating the two camera coordinate systems, I run cv::stereoRectify to obtain projection matrices for each camera that make the two camera views coplanar with one another. Next, I wanted to run cv::triangulatePoints to see how accurate my 3D point is. Running this triangulation function I do get a result back; however, I believe this 3D point is relative to the world origin. I am wondering how/where OpenCV sets the world origin, and I would like to know how to go about setting my own world origin.
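Continuing the sketch above (reusing K1, D1, K2, D2, R, T, imageSize), one detail worth noting is that the 2D points passed to cv::triangulatePoints must live in the same image space as the projection matrices, so they should be undistorted and rectified first. The point containers are again placeholders.

    // P1 and P2 project 3D points expressed in camera 1's *rectified* frame
    // into the rectified images of camera 1 and camera 2 respectively.
    cv::Mat R1rect, R2rect, P1, P2, Q;
    cv::stereoRectify(K1, D1, K2, D2, imageSize, R, T, R1rect, R2rect, P1, P2, Q);

    // Undistort + rectify the raw pixel measurements so they match P1/P2.
    std::vector<cv::Point2f> pts1, pts2;   // matched raw pixel coords (placeholders)
    std::vector<cv::Point2f> pts1r, pts2r;
    cv::undistortPoints(pts1, pts1r, K1, D1, R1rect, P1);
    cv::undistortPoints(pts2, pts2r, K2, D2, R2rect, P2);

    cv::Mat points4D;                      // 4xN homogeneous coordinates
    cv::triangulatePoints(P1, P2, pts1r, pts2r, points4D);

    // Divide by w to get Euclidean coordinates. These live in camera 1's
    // rectified frame (not a "world" frame); multiply by R1rect.t() to get
    // back into camera 1's original, unrectified frame if needed.
    points4D.convertTo(points4D, CV_64F);
    cv::Point3d p(points4D.at<double>(0, 0) / points4D.at<double>(3, 0),
                  points4D.at<double>(1, 0) / points4D.at<double>(3, 0),
                  points4D.at<double>(2, 0) / points4D.at<double>(3, 0));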

Here is a breakdown of my current workflow:

1. Gather a set of calibration images for each camera.
2. Use each camera's set of images to calibrate its intrinsic parameters.
3. Gather a new set of calibration images for the stereo pair, where both cameras can view the ChArUco board.
4. Use those images to calibrate the extrinsic parameters of the stereo pair.

And now the part I'm not sure how to do:

5. Use the calibrated cameras and extrinsic parameters to determine 3D points relative to the world origin in my workspace.

Thanks for the help!


1 answer


answered 2020-07-21 01:34:13 -0600

berak

I am wondering how/where OpenCV sets the world origin

It doesn't. There is no "world" here; it's all in camera space.

how I should go about setting my own world origin.

You would need some external reference, like GPS coordinates, plus another transformation from camera coordinates to world coordinates.
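One concrete way to do that (an assumption on my part, not something the answer prescribes): treat the ChArUco board itself as the world frame. Estimate the board's pose in camera 1 with cv::aruco::estimatePoseCharucoBoard, then move triangulated points from camera 1's frame into the board's frame. This reuses board, K1, D1 and the point p from the earlier sketches; the detection containers are placeholders.

    // ChArUco detections in a camera-1 image where the board marks the
    // desired world origin (placeholders, filled by your detection code).
    std::vector<cv::Point2f> charucoCorners1;
    std::vector<int>         charucoIds1;

    cv::Vec3d rvec, tvec;   // pose of the board expressed in camera 1's frame
    bool ok = cv::aruco::estimatePoseCharucoBoard(charucoCorners1, charucoIds1,
                                                  board, K1, D1, rvec, tvec);
    if (ok) {
        cv::Mat R_bc;
        cv::Rodrigues(rvec, R_bc);                   // 3x3 rotation, board -> camera 1

        // p: a triangulated point in camera 1's (unrectified) frame. If it came
        // from the rectified pipeline, rotate it back with R1rect.t() first.
        cv::Mat p_cam = (cv::Mat_<double>(3, 1) << p.x, p.y, p.z);

        // Invert the board->camera transform: X_world = R^T * (X_cam - t)
        cv::Mat p_world = R_bc.t() * (p_cam - cv::Mat(tvec));
    }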


Comments

The coordinates that I get back from cv::triangulatePoints, are they in reference to one of the cameras then? Do you know where OpenCV attempts to place the origin for the camera?

ConnorM (2020-07-21 11:44:21 -0600)

the first camera is the origin

berak (2020-07-22 02:06:06 -0600)
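To make that comment concrete, a small sanity check (reusing P1, points4D, and pts1r from the earlier sketches; this is an illustration, not part of the reply): re-projecting the triangulated homogeneous point with P1 should land back on the rectified camera-1 pixel it came from, which confirms the points are expressed in camera 1's (rectified) frame.

    cv::Mat P1d;
    P1.convertTo(P1d, CV_64F);                       // match points4D's CV_64F depth
    cv::Mat reproj = P1d * points4D.col(0);          // 3x1 homogeneous image point
    cv::Point2d uv(reproj.at<double>(0) / reproj.at<double>(2),
                   reproj.at<double>(1) / reproj.at<double>(2));
    // uv should be (very nearly) equal to pts1r[0]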
