I’m trying to re-calibrate a stereo pair that was previously calibrated using standard chessboard images. Over time, temperature changes etc. shift the baseline/rotation between the two cameras. I'm trying to recover the stereo calibration from a single left/right image pair of a pattern (like a chessboard).
tl;dr: Recover ‘extrinsic’ params for a stereo pair using a single left/right calibration pattern image. Assume camera ‘intrinsics’ stay the same.
What seems like a straightforward series of steps is giving incorrect results:
// Get corresponding points in 'pixel' coordinates
vector<Point2f> points_left = ...;
vector<Point2f> points_right = ...;
undistortPoints(points_left, points_left, cameraMatrix[0], distCoeffs[0], noArray(), cameraMatrix[0]);
undistortPoints(points_right, points_right, cameraMatrix[1], distCoeffs[1], noArray(), cameraMatrix[1]);
auto E = findEssentialMat(points_left, points_right, cameraMatrix[0], RANSAC, 0.999, 1.0);
// We want to recover R, t
Mat R_new, t_new;
recoverPose(E, points_left, points_right, cameraMatrix[0], R_new, t_new);
// Compute new R1, R2, P1, P2, Q, as well as adjusted ‘valid ROIs’ r1, r2
Mat R1, R2, P1, P2, Q;
Rect r1, r2;
stereoRectify(cameraMatrix[0], distCoeffs[0], cameraMatrix[1], distCoeffs[1],
image_size, R_new, t_new, R1, R2, P1, P2, Q,
CALIB_ZERO_DISPARITY, 0, image_size, &r1, &r2);
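For reference, here is my understanding of the geometry the above relies on: findEssentialMat() estimates E such that x2ᵀ·E·x1 ≈ 0 for corresponding normalized points, and recoverPose() factors E back into R and t with E = [t]×·R. A minimal sketch without OpenCV types (skew, essential, and residual are my own helper names, not OpenCV API):

```cpp
#include <cassert>
#include <cmath>

// Build the skew-symmetric cross-product matrix [t]x from a 3-vector t.
void skew(const double t[3], double S[3][3]) {
    S[0][0] = 0;     S[0][1] = -t[2]; S[0][2] =  t[1];
    S[1][0] = t[2];  S[1][1] = 0;     S[1][2] = -t[0];
    S[2][0] = -t[1]; S[2][1] = t[0];  S[2][2] = 0;
}

// E = [t]x * R  -- the essential matrix for relative pose (R, t).
void essential(const double t[3], const double R[3][3], double E[3][3]) {
    double S[3][3];
    skew(t, S);
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) {
            E[i][j] = 0;
            for (int k = 0; k < 3; ++k) E[i][j] += S[i][k] * R[k][j];
        }
}

// Epipolar residual x2^T * E * x1 for normalized homogeneous points;
// this should be ~0 for a true correspondence.
double residual(const double E[3][3], const double x1[3], const double x2[3]) {
    double Ex1[3] = {0, 0, 0};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j) Ex1[i] += E[i][j] * x1[j];
    return x2[0] * Ex1[0] + x2[1] * Ex1[1] + x2[2] * Ex1[2];
}
```

Checking my recovered R, t against a few correspondences with this residual is how I've been judging whether the result is plausible.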
Problem 1) Even though the cameras have moved only very slightly, the values of R and t are quite different from the original R, t computed during the initial calibration. In particular, ‘t’ looks like a unit vector! The OpenCV documentation for recoverPose() says nothing about this.
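My working theory for the unit-norm t: the essential matrix is only defined up to scale, so the t that comes out of pose recovery can only be a direction, and the metric baseline has to be re-imposed from outside. A sketch of the rescaling I have in mind (rescale_translation is my own name; the assumption is that the physical baseline length from the original calibration is still valid and only the direction/rotation drifted):

```cpp
#include <cassert>
#include <cmath>

// Rescale a direction-only translation (e.g. unit-norm t from pose
// recovery) to metric units using a known baseline length, such as
// the norm of t from the original full stereo calibration.
void rescale_translation(const double t_unit[3], double baseline,
                         double t_metric[3]) {
    double n = std::sqrt(t_unit[0] * t_unit[0] +
                         t_unit[1] * t_unit[1] +
                         t_unit[2] * t_unit[2]);
    // Real code should guard against n == 0.
    for (int i = 0; i < 3; ++i)
        t_metric[i] = t_unit[i] / n * baseline;
}
```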
Problem 2) I tried using ‘normalized’ output from undistortPoints(), by passing noArray() as the last parameter instead of ‘cameraMatrix’, and then using focal length = 1 and principal point = (0, 0) in findEssentialMat(). But to no avail: the final result is still wrong.
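To spell out what I mean by ‘normalized’: with P = noArray(), undistortPoints() should return coordinates on the z = 1 plane, which for a distortion-free pinhole camera is just (u − cx)/fx, (v − cy)/fy. So focal = 1 and principal point = (0, 0) should then be the consistent intrinsics to hand findEssentialMat(), provided each view was normalized with its own camera matrix. A sketch of that conversion (Pt and normalize_point are my own names, zero distortion assumed):

```cpp
#include <cassert>
#include <cmath>

struct Pt { double x, y; };

// Pixel coordinates -> normalized image coordinates for a pinhole
// camera with no lens distortion: x_n = (u - cx)/fx, y_n = (v - cy)/fy.
// After this, the effective camera matrix is the identity
// (focal = 1, principal point = (0, 0)).
Pt normalize_point(Pt pixel, double fx, double fy, double cx, double cy) {
    return { (pixel.x - cx) / fx, (pixel.y - cy) / fy };
}
```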
Am I missing a step somewhere? Are my inputs to findEssentialMat() correct?