Problem with undistortPoints() function in image pose estimation

asked 2020-12-18 07:38:58 -0600

sigmoid90

I have written about my task here. I have a set of images with known poses, which were used for scene reconstruction, and a query image from the same space without a pose. I need to calculate the pose of the query image. I solved this problem using the essential matrix. Here is the code:

// Estimate the essential matrix; findEssentialMat expects RANSAC
// (FM_RANSAC belongs to findFundamentalMat) and takes the confidence
// before the distance threshold
Mat E = findEssentialMat(pts1, pts2, focal, pp, RANSAC, F_CONF, F_DIST, mask);

// Recover the relative pose (R, t) from the essential matrix
Mat R, t;
recoverPose(E, pts1, pts2, R, t, focal, pp, mask);

The only problem is that the OpenCV documentation states that findEssentialMat assumes points1 and points2 are feature points from cameras with the same camera intrinsic matrix. That's not the case for us: the scene images and the query image can be captured by cameras with different intrinsics. I suppose I should use the undistortPoints() function. According to the documentation, undistortPoints() takes two important parameters, distCoeffs and cameraMatrix. Both the scene images and the query image have calibration parameters (fx, fy, cx, cy) associated with them. I build the cameraMatrix parameter this way:

// Mat_<double> already creates a CV_64F matrix, so no type flag is
// needed; the stray CV_64F was being streamed in as an extra element
Mat K_v = (Mat_<double>(3, 3) <<
                    fx, 0, cx,
                    0, fy, cy,
                    0, 0, 1);

Is this correct? Moreover, I need to get the distCoeffs from somewhere. How can I obtain the distortion coefficients for the scene images and the query image? Or should I solve it another way?
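For reference, here is a minimal sketch of the whole approach as I understand it, assuming zero distortion (empty distCoeffs), since only fx, fy, cx, cy are available per image. The helper name and the normalized threshold value are my own choices, not from the documentation:

#include <vector>
#include <opencv2/calib3d.hpp>
using namespace cv;

// Hypothetical helper: normalize the matches with each camera's own
// intrinsics, then work in normalized coordinates where both views
// share the same (identity) intrinsics
void estimatePoseWithDifferentIntrinsics(const std::vector<Point2f>& pts1,
                                         const std::vector<Point2f>& pts2,
                                         const Mat& K1, const Mat& K2,
                                         Mat& R, Mat& t)
{
    // Empty distCoeffs means zero lens distortion is assumed
    std::vector<Point2f> npts1, npts2;
    undistortPoints(pts1, npts1, K1, Mat());
    undistortPoints(pts2, npts2, K2, Mat());

    // Normalized points behave as if focal = 1 and pp = (0, 0);
    // the RANSAC threshold must also be in normalized units
    Mat mask;
    Mat E = findEssentialMat(npts1, npts2, 1.0, Point2d(0, 0),
                             RANSAC, 0.999, 1e-3, mask);

    recoverPose(E, npts1, npts2, R, t, 1.0, Point2d(0, 0), mask);
}

If the lenses have noticeable distortion this will of course bias the estimate, but with only fx, fy, cx, cy available, zero distortion seems like the only workable assumption.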
