Stereo Camera Calibration - World Origin

Hello, I have some questions about stereo calibration, and I'd also like to verify that my understanding so far is correct.

I am working on a project where I want to get 3D coordinates from a stereo camera setup. I am working in C++ using the opencv_contrib aruco library for ChArUco board detection. So far I have calibrated each camera's intrinsic parameters (camera matrix and distortion coefficients) by detecting and refining ChArUco board corners and then running cv::aruco::calibrateCameraCharuco. Next, I create a stereo pair by gathering new images in which both cameras can see the calibration board. I detect ChArUco board corners from each camera's viewpoint and run cv::stereoCalibrate, which gives me the extrinsic parameters (rotation matrix and translation vector) for the transformation from camera 1's coordinate system to camera 2's coordinate system (correct me if I'm wrong!).
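For reference, here is a stripped-down sketch of what my calibration code is doing (the dictionary, board dimensions, and helper/variable names below are placeholders rather than my exact code):

```
#include <opencv2/opencv.hpp>
#include <opencv2/aruco/charuco.hpp>
#include <vector>

// Placeholder board definition -- my real board uses different parameters.
static cv::Ptr<cv::aruco::Dictionary> dict =
    cv::aruco::getPredefinedDictionary(cv::aruco::DICT_6X6_250);
static cv::Ptr<cv::aruco::CharucoBoard> board =
    cv::aruco::CharucoBoard::create(5, 7, 0.04f, 0.02f, dict);

// Steps 1-2: intrinsics for one camera from its set of calibration images.
void calibrateIntrinsics(const std::vector<cv::Mat>& images,
                         cv::Mat& cameraMatrix, cv::Mat& distCoeffs)
{
    std::vector<std::vector<cv::Point2f>> allCorners;
    std::vector<std::vector<int>> allIds;
    cv::Size imageSize;

    for (const cv::Mat& img : images) {
        imageSize = img.size();
        std::vector<std::vector<cv::Point2f>> markerCorners;
        std::vector<int> markerIds;
        cv::aruco::detectMarkers(img, dict, markerCorners, markerIds);
        if (markerIds.empty()) continue;

        // Interpolate/refine the chessboard corners of the ChArUco board.
        std::vector<cv::Point2f> charucoCorners;
        std::vector<int> charucoIds;
        cv::aruco::interpolateCornersCharuco(markerCorners, markerIds, img, board,
                                             charucoCorners, charucoIds);
        if (charucoIds.size() >= 4) {
            allCorners.push_back(charucoCorners);
            allIds.push_back(charucoIds);
        }
    }

    std::vector<cv::Mat> rvecs, tvecs;
    cv::aruco::calibrateCameraCharuco(allCorners, allIds, board, imageSize,
                                      cameraMatrix, distCoeffs, rvecs, tvecs);
}

// Steps 3-4: stereo extrinsics. For each stereo image pair I keep only the
// ChArUco corner ids detected in BOTH views; objectPoints holds the matching
// 3D board corners (board->chessboardCorners[id]) for those ids.
void calibrateStereo(const std::vector<std::vector<cv::Point3f>>& objectPoints,
                     const std::vector<std::vector<cv::Point2f>>& imagePoints1,
                     const std::vector<std::vector<cv::Point2f>>& imagePoints2,
                     cv::Mat& cameraMatrix1, cv::Mat& distCoeffs1,
                     cv::Mat& cameraMatrix2, cv::Mat& distCoeffs2,
                     cv::Size imageSize, cv::Mat& R, cv::Mat& T)
{
    cv::Mat E, F;
    cv::stereoCalibrate(objectPoints, imagePoints1, imagePoints2,
                        cameraMatrix1, distCoeffs1,
                        cameraMatrix2, distCoeffs2,
                        imageSize, R, T, E, F,
                        cv::CALIB_FIX_INTRINSIC);
    // My understanding: R and T map a point expressed in camera 1's
    // coordinate system into camera 2's coordinate system.
}
```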

Now that I have the intrinsic parameters for each camera and the extrinsic parameters relating the two camera coordinate systems, I run cv::stereoRectify to obtain projection matrices for each camera that rectify the two views so their image planes are coplanar. Next, I wanted to run cv::triangulatePoints to see how accurate my 3D point is. Running this triangulation function does return a result; however, I believe this 3D point is relative to the world origin. I am wondering how/where OpenCV sets the world origin, and I would like to know how I should go about setting my own world origin.
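For the triangulation step, this is roughly what I am doing for a single matched corner (again, the names are placeholders); the comment at the end reflects my uncertainty about which frame the result lives in:

```
#include <opencv2/opencv.hpp>
#include <vector>

// Rectify and triangulate one matched ChArUco corner. cornerL/cornerR are the
// pixel coordinates of the same corner as seen by camera 1 and camera 2
// (placeholder names), and R/T are the extrinsics from stereoCalibrate.
cv::Point3d triangulateOne(const cv::Point2f& cornerL, const cv::Point2f& cornerR,
                           const cv::Mat& cameraMatrix1, const cv::Mat& distCoeffs1,
                           const cv::Mat& cameraMatrix2, const cv::Mat& distCoeffs2,
                           cv::Size imageSize, const cv::Mat& R, const cv::Mat& T)
{
    cv::Mat R1, R2, P1, P2, Q;
    cv::stereoRectify(cameraMatrix1, distCoeffs1, cameraMatrix2, distCoeffs2,
                      imageSize, R, T, R1, R2, P1, P2, Q);

    // Undistort and rectify the observations so they are expressed in the same
    // coordinates as the projection matrices P1 and P2.
    std::vector<cv::Point2f> ptsL{cornerL}, ptsR{cornerR}, rectL, rectR;
    cv::undistortPoints(ptsL, rectL, cameraMatrix1, distCoeffs1, R1, P1);
    cv::undistortPoints(ptsR, rectR, cameraMatrix2, distCoeffs2, R2, P2);

    cv::Mat points4D;
    cv::triangulatePoints(P1, P2, rectL, rectR, points4D);  // 4x1, homogeneous

    // Dehomogenize. As far as I can tell, the result is expressed relative to
    // whatever frame P1/P2 are defined in -- this is the "world origin" I am
    // asking about.
    cv::Mat p = points4D.col(0) / points4D.at<float>(3, 0);
    return cv::Point3d(p.at<float>(0), p.at<float>(1), p.at<float>(2));
}
```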

Here is a breakdown of my current workflow:

1. Gather a set of calibration images for each camera
2. Use each camera's set of images to calibrate its intrinsic parameters
3. Gather a new set of calibration images for the stereo pair, where both cameras can view the ChArUco board
4. Use those images to calibrate the extrinsic parameters of the stereo pair

And now the part I'm not sure how to do:

5. Use the calibrated cameras with the extrinsic parameters to determine 3D points relative to the world origin in my workspace (a rough sketch of what I imagine this looking like is below)
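To make step 5 concrete, here is the kind of thing I imagine doing, although this is exactly the part I'm unsure about: use the ChArUco board itself as the world origin by estimating its pose in camera 1 with cv::aruco::estimatePoseCharucoBoard, then transform the triangulated point into that frame. The helper name below is mine, and I'm assuming the triangulated point comes back in camera 1's rectified frame (hence the extra R1 rotation). Is this a reasonable way to define my own world origin?

```
#include <opencv2/opencv.hpp>
#include <opencv2/aruco/charuco.hpp>
#include <vector>

// My guess at step 5: express a triangulated point in a world frame anchored
// to the ChArUco board, using the board pose seen from camera 1.
// pointRect is a triangulated point assumed to be in camera 1's rectified
// frame, and R1 is the rectification rotation from stereoRectify for camera 1.
cv::Point3d toBoardFrame(const cv::Point3d& pointRect, const cv::Mat& R1,
                         const std::vector<cv::Point2f>& charucoCorners1,
                         const std::vector<int>& charucoIds1,
                         const cv::Ptr<cv::aruco::CharucoBoard>& board,
                         const cv::Mat& cameraMatrix1, const cv::Mat& distCoeffs1)
{
    // Undo the rectification rotation: X_cam1 = R1^T * X_rect.
    cv::Mat Xrect = (cv::Mat_<double>(3, 1) << pointRect.x, pointRect.y, pointRect.z);
    cv::Mat Xcam1 = R1.t() * Xrect;

    // Board pose in camera 1: X_cam1 = R_bc * X_board + t_bc.
    // (Should check the bool return value in real code.)
    cv::Vec3d rvec, tvec;
    cv::aruco::estimatePoseCharucoBoard(charucoCorners1, charucoIds1, board,
                                        cameraMatrix1, distCoeffs1, rvec, tvec);
    cv::Mat Rbc;
    cv::Rodrigues(rvec, Rbc);

    // Invert the pose to land in the board (= my world) frame.
    cv::Mat Xboard = Rbc.t() * (Xcam1 - cv::Mat(tvec));
    return cv::Point3d(Xboard.at<double>(0), Xboard.at<double>(1), Xboard.at<double>(2));
}
```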

Thanks for the help!