

OpenCV Stereo Calibration and triangulation in a user defined coordinate system

asked May 15 '14


How do you calibrate stereo cameras so that the output of triangulation is in a real-world coordinate system defined by known points?

OpenCV stereo calibration returns results relative to the pose of the left camera, which serves as the reference coordinate system.

I am currently doing the following:

Intrinsically calibrating both the left and right cameras using a chessboard. This gives the camera matrix A and the distortion coefficients for each camera.

Running stereoCalibrate, again using the chessboard, for both cameras. This returns the extrinsic parameters, but they are relative to the cameras, not to the coordinate system I would like to use.

How do I use known 3D point locations, together with their corresponding 2D pixel locations in both images, to extrinsically calibrate the cameras so that the output of triangulation is in my coordinate system?


1 answer


answered Jun 2 '15


One way to do it would be:

Part 1: Use solvePnP to map your known 3D point locations to their image coordinates in one of your two cameras, say camera A. This gives you an SE3 transform (a rigid-body transform, i.e. a rotation plus a translation) from your user-defined coordinate system into the coordinate system of camera A.

Part 2: Then, when you do your triangulation, you get 3D points in the coordinate system of either camera A or camera B. If they are in camera B, use the stereo calibration data to transform them into camera A. Once the triangulated 3D points are in camera A, apply the inverse of the transform from Part 1 to map them into your desired coordinate system.

