Making use of MATLAB cameraParams in OpenCV program

Hello,

I have a MATLAB program that loads two images and returns two camera matrices and a cameraParams object with distortion coefficients, etc. I would now like to use this exact configuration to undistort points and so on, in an OpenCV program that triangulates points given their 2D locations in two different videos.

function [cameraMatrix1, cameraMatrix2, cameraParams] = setupCameraCalibration(leftImageFile, rightImageFile, squareSize)
    % Auto-generated by cameraCalibrator app on 20-Feb-2015
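One conversion detail worth keeping in mind when moving these values into OpenCV: MATLAB's `cameraParams.IntrinsicMatrix` uses the row-vector convention `[fx 0 0; skew fy 0; cx cy 1]`, i.e. it is the transpose of OpenCV's camera matrix `[fx skew cx; 0 fy cy; 0 0 1]`. A minimal plain-Java sketch of the transposition (the intrinsic values below are made up for illustration):

```java
// Plain-Java sketch, no OpenCV dependency; intrinsic values are hypothetical.
public class IntrinsicsLayout {
    static double[][] transpose(double[][] m) {
        double[][] t = new double[m[0].length][m.length];
        for (int i = 0; i < m.length; i++)
            for (int j = 0; j < m[0].length; j++)
                t[j][i] = m[i][j];
        return t;
    }

    public static void main(String[] args) {
        // As MATLAB reports it: [fx 0 0; skew fy 0; cx cy 1]
        double[][] matlabIntrinsicMatrix = {
                {800,   0, 0},
                {  0, 810, 0},
                {320, 240, 1}
        };
        // OpenCV expects [fx skew cx; 0 fy cy; 0 0 1]
        double[][] opencvCameraMatrix = transpose(matlabIntrinsicMatrix);
        System.out.println(opencvCameraMatrix[0][2]); // prints 320.0 (cx)
        System.out.println(opencvCameraMatrix[1][2]); // prints 240.0 (cy)
    }
}
```

The distortion coefficients likewise need reassembling: MATLAB keeps `RadialDistortion` (`[k1 k2]` or `[k1 k2 k3]`) and `TangentialDistortion` (`[p1 p2]`) separately, while OpenCV expects a single vector in the order `[k1, k2, p1, p2, k3]`.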

The problem is that undistortPoints returns different results in MATLAB and OpenCV, even when both are given the same arguments.

As an example:

>> undistortPoints([485, 502], defaultCameraParams)
ans = 485   502

In Java, the following test mimics the above (it passes).

public void testUnDistortPoints() {
    MatOfPoint2f src = new MatOfPoint2f(new Point(485d, 502d));
    MatOfPoint2f dst = new MatOfPoint2f();

    Mat defaultCameraMatrix = Mat.eye(3, 3, CvType.CV_64FC1);
    // Mat.zeros rather than new Mat: a freshly constructed Mat is uninitialized
    Mat defaultDistCoefficientMatrix = Mat.zeros(1, 4, CvType.CV_64FC1);

    Imgproc.undistortPoints(
            src,
            dst,
            defaultCameraMatrix,
            defaultDistCoefficientMatrix
    );

    // JUnit takes (expected, actual, delta) for doubles
    assertEquals(485d, dst.get(0, 0)[0], 1e-6);
    assertEquals(502d, dst.get(0, 0)[1], 1e-6);
}

However, say I change the first distortion coefficient (k1). In MATLAB:

>> changedDist = cameraParameters('RadialDistortion', [2 0 0])
>> undistortPoints([485, 502], changedDist)
ans = 4.8756    5.0465

In Java:

public void testUnDistortPointsChangedDistortion() {
    MatOfPoint2f src = new MatOfPoint2f(new Point(485d, 502d));
    MatOfPoint2f dst = new MatOfPoint2f();

    Mat defaultCameraMatrix = Mat.eye(3, 3, CvType.CV_64FC1);
    Mat distCoefficientMatrix = Mat.zeros(1, 4, CvType.CV_64FC1);
    distCoefficientMatrix.put(0, 0, 2); // k1 = 2

    Imgproc.undistortPoints(
            src,
            dst,
            defaultCameraMatrix,
            distCoefficientMatrix
    );

    System.out.println(dst.dump());

    assertEquals(4.8756d, dst.get(0, 0)[0], 1e-4);
    assertEquals(5.0465d, dst.get(0, 0)[1], 1e-4);
}

It fails with the following output:

[0.0004977131, 0.0005151587]

junit.framework.AssertionFailedError: 
Expected :4.8756
Actual   :4.977131029590964E-4

Why are the results different? I thought OpenCV's distortion coefficient matrix holds both the radial and tangential coefficients, in the order [k1, k2, p1, p2].
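One check that may narrow things down: assuming the single-coefficient radial model x_d = x * (1 + k1 * r^2) (which, with an identity camera matrix, applies directly to the pixel coordinates), re-distorting both outputs shows which one actually inverts the model for this point:

```java
// Sketch: re-apply the one-coefficient radial distortion model to each result.
// With an identity camera matrix, normalized and pixel coordinates coincide.
public class RedistortCheck {
    // Forward model: xd = x * (1 + k1 * r^2), where r^2 = x^2 + y^2
    static double[] distort(double x, double y, double k1) {
        double r2 = x * x + y * y;
        double s = 1 + k1 * r2;
        return new double[]{x * s, y * s};
    }

    public static void main(String[] args) {
        double k1 = 2;
        double[] fromMatlab = distort(4.8756, 5.0465, k1);
        double[] fromOpenCV = distort(4.977131029590964e-4, 5.151587e-4, k1);
        System.out.printf("MATLAB result re-distorts to (%.2f, %.2f)%n",
                fromMatlab[0], fromMatlab[1]);   // ~ (485.01, 502.01)
        System.out.printf("OpenCV result re-distorts to (%.6f, %.6f)%n",
                fromOpenCV[0], fromOpenCV[1]);   // stays near zero
    }
}
```

If that model is right, MATLAB's output re-distorts back to roughly (485, 502) while OpenCV's stays near the origin, which would suggest the two implementations invert the model differently for a distortion this large, rather than a coefficient-layout mismatch.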

Also, is CV_64FC1 a good choice of type for the camera / distortion coefficient matrices?

I also wanted to test the effect of changing the camera matrix itself (e.g. the value of f_x), but the 'IntrinsicMatrix' parameter cannot be set when constructing cameraParameters this way, so I want to solve the distortion coefficient problem first.

Any help would be greatly appreciated.