
How to use Multi-camera Calibration modules?

asked 2017-07-02 21:29:39 -0500 by Imkoe

updated 2017-07-02 22:01:39 -0500

Hi guys, I need some help.

I wrote stitching code for four GoPro camera inputs, but I need a more accurate camera calibration result, especially the camera poses. I found there is a Multi-camera Calibration class in the contrib modules and tried to use it. However, I ran it three times on the same camera rig, capturing a different test data set each time (about 200 images each), and got totally different results: the extrinsic matrices differ widely between tests.

What I'm actually trying to do is the following:

  • Generate a random pattern (resolution 1280x960) and print it out at A3 size
  • Capture about 200 test images (resolution 1920x1080)
  • Run Multi-camera Calibration (I use the pinhole model)
  • The reprojection error comes out at about 1~4
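For context on the last step: the reprojection error is the RMS pixel distance between detected pattern features and the same 3-D points reprojected through the calibrated camera model. A minimal NumPy sketch with a synthetic pinhole camera and made-up detection noise (all values hypothetical) shows how a couple of pixels of feature-localization noise alone already produces errors in the 1~4 range reported above:

```python
import numpy as np

# Hypothetical pinhole intrinsics for a 1920x1080 camera.
K = np.array([[1000.0,    0.0, 960.0],
              [   0.0, 1000.0, 540.0],
              [   0.0,    0.0,   1.0]])

def project(K, R, t, X):
    """Project Nx3 world points X through a pinhole camera x ~ K (R X + t)."""
    Xc = X @ R.T + t              # world -> camera coordinates
    x = Xc @ K.T                  # apply intrinsics
    return x[:, :2] / x[:, 2:3]   # perspective divide -> pixel coordinates

rng = np.random.default_rng(0)
X = rng.uniform([-1, -1, 4], [1, 1, 6], size=(100, 3))  # points in front of camera
R, t = np.eye(3), np.zeros(3)

reprojected = project(K, R, t, X)
# Simulated detections: ground truth plus ~2 px of feature noise per axis.
detected = reprojected + rng.normal(0.0, 2.0, size=(100, 2))

# RMS reprojection error in pixels.
rms = np.sqrt(np.mean(np.sum((detected - reprojected) ** 2, axis=1)))
print(round(rms, 2))
```

With 2 px per-axis noise the RMS comes out near 2.8 px, so an error of 1~4 is plausible for a blob/descriptor-based pattern detector, which localizes features less precisely than subpixel chessboard corners.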

Here is my reference:

Does anyone have an idea of how to achieve accurate multi-camera calibration, or any tricks for using this module? Each camera has two small overlap regions, on its left and right sides. Thank you!


1 answer


answered 2017-07-04 04:03:13 -0500 by pklab

The extrinsic matrix is the rotation-translation of the camera relative to the scene (or, equivalently, of the scene relative to the camera), so it is normal to get a different extrinsic matrix from a different scene with the same camera.
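One practical consequence: while the absolute extrinsics change whenever the calibration picks a different world frame, the relative pose between two rigidly mounted cameras should be the same across runs, so that is the quantity worth comparing between data sets. A NumPy sketch with made-up poses (all numbers hypothetical):

```python
import numpy as np

def rotz(a):
    """Rotation matrix about the z-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Run 1: two camera poses x_cam = R X + t in some world frame.
R1, t1 = rotz(0.1), np.array([0.0, 0.0, 0.0])
R2, t2 = rotz(0.4), np.array([0.5, 0.0, 0.0])

# Run 2: same rig, but the calibration chose a different world frame,
# i.e. world points are re-expressed as X = Rw X' + tw.
Rw, tw = rotz(1.2), np.array([3.0, -1.0, 0.5])
R1b, t1b = R1 @ Rw, R1 @ tw + t1
R2b, t2b = R2 @ Rw, R2 @ tw + t2

def relative(Ra, ta, Rb, tb):
    """Pose of camera b relative to camera a: x_b = R x_a + t."""
    R = Rb @ Ra.T
    return R, tb - R @ ta

Ra_rel, ta_rel = relative(R1, t1, R2, t2)
Rb_rel, tb_rel = relative(R1b, t1b, R2b, t2b)

# Absolute translations differ between runs; the relative pose does not.
print(np.allclose(Ra_rel, Rb_rel), np.allclose(ta_rel, tb_rel))  # True True
```

If the camera-to-camera relative poses also disagree between your runs, that points to a genuine calibration problem (coverage, detection quality) rather than just a different choice of world frame.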

The intrinsic matrix describes the camera itself and is supposed to be scene-independent (though it is affected by the reprojection error).

Finally, calibration based on a "random pattern" yields more features than "chessboard" calibration, but it is a bit less accurate, because it relies on a feature detector/descriptor/extractor such as AKAZE or the (non-free) SIFT/SURF, rather than the subpixel corner refinement used in the chessboard-based algorithm.

Check the related paper for details:

B. Li, L. Heng, K. Koeser and M. Pollefeys, "A Multiple-Camera System Calibration Toolbox Using A Feature Descriptor-Based Calibration Pattern", in IROS 2013.

In the end, always remember that the contrib modules quite often do not have a stable API and are not well tested.



Thanks for your reply. So, if I understood correctly: the four cameras are static and the random pattern is moved around them. After calibration I expect to get four intrinsic and four extrinsic matrices (one of the cameras serving as the world coordinate frame), which is my expected result. What I don't understand is why, when I capture another experimental data set with the cameras in exactly the same place, the translation terms of the extrinsics come out very different.

Imkoe (2017-07-05 01:19:45 -0500)

From the doc

  • The matrix of intrinsic parameters (focal length, distortion, optical centre) does not depend on the scene viewed
  • The joint rotation-translation matrix [R|t] is called a matrix of extrinsic parameters. It is used to describe the camera motion around a static scene, or vice versa, rigid motion of an object in front of a still camera.
pklab (2017-07-06 01:56:14 -0500)
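The two bullets from the doc correspond to the standard pinhole projection x ~ K [R|t] X: the intrinsic matrix K is fixed per camera, while [R|t] encodes the camera/scene motion. A minimal NumPy sketch with hypothetical values:

```python
import numpy as np

K = np.array([[1000.0,    0.0, 960.0],   # fx,  0, cx  (hypothetical intrinsics)
              [   0.0, 1000.0, 540.0],   #  0, fy, cy
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                            # camera axes aligned with the world
t = np.array([0.0, 0.0, 2.0])            # scene origin 2 m in front of camera

P = K @ np.hstack([R, t[:, None]])       # 3x4 projection matrix K [R|t]
X = np.array([0.1, -0.2, 0.0, 1.0])      # homogeneous world point
x = P @ X
u, v = x[:2] / x[2]                      # perspective divide -> pixels
print(u, v)                              # 1010.0 440.0
```

Changing the scene (or the chosen world frame) changes only [R|t]; K stays the same, which is why the intrinsics should agree across your runs even when the extrinsics do not.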

@Imkoe did you ever find a solution to lower the reprojection error using OpenCV's Multi Camera Calibration module? I am also working on something similar. Looking forward to your response. Thank you.

ShadowWalker197 (2017-11-23 02:54:51 -0500)





Seen: 1,458 times

Last updated: Jul 04 '17