As the cv::fisheye::distortPoints description says: "Note that the function assumes the camera matrix of the undistorted points to be identity. This means if you want to transform back points undistorted with undistortPoints() you have to multiply them with P^-1."

This note (admittedly not very clear) means that cv::fisheye::distortPoints expects _normalized_ coordinates as input. In your code above the input points are in unnormalized (i.e. pixel) coordinates, so it won't work that way.
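In other words, before calling distortPoints you have to convert each pixel coordinate (u, v) into a normalized one. Assuming the usual pinhole form of K (focal lengths fx, fy, principal point cx, cy, and zero skew), multiplying by K^-1 boils down to:

x_n = (u - cx) / fx
y_n = (v - cy) / fy

which is exactly what the loop with camMat_inv in the code below does.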

So you should do something like this (a variation of code from my working software):

vector<Point2f> srcp; //Array of undistorted points (here: input points in pixel coordinates)
vector<Point2f> dstp; //Array of distorted points (output)
srcp.push_back(Point2f(100,100)); //Adding one point as an example, in pixel coordinates
Matx33d camMat = K; //K is my camera matrix estimated by cv::fisheye::calibrate
Matx33d camMat_inv = camMat.inv(); //Inverting the camera matrix
for(size_t i=0;i<srcp.size();i++) {
    Vec3d srcv = Vec3d(srcp[i].x, srcp[i].y, 1.0); //Creating a vector in homogeneous coords
    Vec3d dstv = camMat_inv*srcv; //Doing matrix by vector multiplication
    srcp[i].x = dstv[0]; //Extracting resulting normalised x coord
    srcp[i].y = dstv[1]; //Extracting resulting normalised y coord
}
cv::fisheye::distortPoints(srcp, dstp, K, D); //Performing distortion. D is the distortion vector

//Drawing the resulting example point:
circle(img_dist, Point2f(dstp[0].x, dstp[0].y), 3, CV_RGB(255,0,0), -1);
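
If you want to sanity-check the result, you can run the distorted point back through cv::fisheye::undistortPoints with P = K, which should give you back (approximately) the original pixel coordinates. This is just an illustrative sketch using the same K, D and dstp variables as above:

vector<Point2f> checkp; //Round trip: distorted pixel coords -> undistorted pixel coords (because P = K)
cv::fisheye::undistortPoints(dstp, checkp, K, D, cv::noArray(), K);
cout << "Should be close to (100,100): " << checkp[0] << endl; //Expect roughly the original point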