# Find direction from cameraMatrix and distCoeff

Hi Guys,

I have calibrated my camera by detecting checkerboard patterns and running calibrateCamera, which gives me the cameraMatrix and distortion coefficients. I can plug these into projectPoints alongside 3D positions in the camera's space and retrieve the UV coordinate where each point lands in the imperfect camera image.

I'm using a 3D point that projects near the top-left corner of my image.

This all works fine, but now I want to go the other way: convert a point in my distorted U,V coordinates into a direction vector pointing along the ray of all 3D points that would project to that UV coordinate.

I have tried playing around with the undistortPoints function to find the ideal point U,V, and from that used the cameraMatrix to find a point somewhere along the ray:

X = (U - C_x) / f_x
Y = (V - C_y) / f_y
Z = 1

But I can't seem to get a direction that points very close to the 3D point I started from.
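For what it's worth, the pinhole back-projection step can be sketched in Python/NumPy as below. Note that the Y component should use the v pixel coordinate and c_y, not u; the intrinsics here are made-up example values, not from a real calibration:

```python
import numpy as np

# Example intrinsics (hypothetical values, not from a real calibration)
fx, fy = 800.0, 810.0
cx, cy = 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# Undistorted (ideal) pixel coordinates, e.g. from undistortPoints
u, v = 100.0, 50.0

# Back-project: the Y component uses v and c_y, not u
direction = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
direction /= np.linalg.norm(direction)  # unit ray in camera coordinates

# Sanity check: re-projecting any point on the ray lands back on (u, v)
p = K @ (direction / direction[2])
```

Every 3D point `t * direction` for `t > 0` projects to the same (u, v), which is exactly the ray you're after.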

Any idea what I might be doing wrong?

kind regards

Jesper Taxbøl



Here's some sample code I use to get the line-of-sight (LOS) vector in camera coordinates and then transform it by rvec and tvec.

```cpp
// Find the undistorted locations of the point(s)
undistortPoints(ptsIn, ptsOut, cameraMatrix, distortionMatrix, noArray(), cameraMatrix);

double x = ptsOut[0].x;
double y = ptsOut[0].y;

// Find the location relative to the principal point
x -= cameraMatrix.at<double>(0,2);
y -= cameraMatrix.at<double>(1,2);

// Find the azimuth in radians in camera coordinates
double az = atan2(y, x);

// Turn the pixel values into angles via the formula for FOV
x *= (2 * atan(_size.width / (2 * cameraMatrix.at<double>(0, 0)))) / _size.width;
y *= (2 * atan(_size.height / (2 * cameraMatrix.at<double>(1, 1)))) / _size.height;

// Elevation is exactly 90 degrees at the principal point, and the
// angle away from the principal point is the combination of x and y
double el = sqrt(x*x + y*y);
el = CV_PI / 2.0 - el;

// Find the unit 3D line-of-sight vector matching the azimuth and elevation
Mat LOS(3,1,CV_64F);
LOS.at<double>(0) = cos(el)*cos(az);
LOS.at<double>(1) = cos(el)*sin(az);
LOS.at<double>(2) = sin(el);

// Transform the LOS from camera coordinates to world coordinates.
// rvec and tvec are the same as the output of calibrateCamera or solvePnP.
Mat cameraRotation, cameraTranslation;
Rodrigues(rvec, cameraRotation);
cameraRotation = cameraRotation.t();
cameraTranslation = (-cameraRotation * tvec);
LOS = cameraRotation*LOS;
```


I'm not sure if you needed that last bit from reading your question.
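That last camera-to-world step can be sketched in NumPy as follows. The rvec/tvec values are illustrative stand-ins for calibrateCamera/solvePnP output, and the hand-rolled `rodrigues` helper mirrors what cv::Rodrigues does in the real code:

```python
import numpy as np

def rodrigues(rvec):
    """Rotation vector -> rotation matrix (same convention as cv::Rodrigues)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = np.asarray(rvec, dtype=float) / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

# Example extrinsics (hypothetical, stand-ins for calibrateCamera/solvePnP output)
rvec = np.array([0.1, -0.2, 0.05])
tvec = np.array([1.0, 0.5, 3.0])

R = rodrigues(rvec)                  # world -> camera rotation
cam_pos = -R.T @ tvec                # camera centre in world coordinates
los_cam = np.array([0.0, 0.0, 1.0])  # example unit ray in camera coordinates
los_world = R.T @ los_cam            # same ray expressed in world coordinates
```

A world point on the ray is then `cam_pos + t * los_world` for `t > 0`.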


This looks interesting. I will try it out when I get access to my OpenCV box tomorrow.

Meanwhile, could you add some comments to the code? I have a hard time following the atan() parts. I assume az and el are azimuth and elevation, but I can't follow the atan with the focal lengths. I also assume LOS is line of sight?

Kind regards

Jesper

(2016-06-14 14:54:11 -0500)

The more I read the x *= ... line, the more I suspect there is a mistake. The atan computes the horizontal FOV, which is then applied linearly to x. I would think that needs to happen inside the atan function, since the mapping is not linear (one degree at the center of the image spans fewer pixels than one degree at the edge). I'm really unsure, so please comment. 😀

(2016-06-14 16:02:59 -0500)

I don't believe there is a mistake; it's just a reasonable approximation. It follows the formula for calculating the angle of view (FOV) from focal length. The results I get match the measured results for all the cameras I've tested it on, within experimental error (about 2 pixels). To be fair, that's only three cameras. If you have a particularly large FOV it can make a difference, and it's not hard to change.
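The size of that approximation error can be checked numerically. A quick sketch with a hypothetical focal length and image width, comparing the linear FOV scaling against the exact angle of the ray:

```python
import math

f = 800.0  # focal length in pixels (hypothetical)
w = 1280   # image width in pixels (hypothetical)

fov = 2 * math.atan(w / (2 * f))  # horizontal field of view in radians

for x in (10.0, 200.0, 600.0):    # pixel offsets from the principal point
    linear = x * fov / w          # linear approximation from the answer
    exact = math.atan(x / f)      # exact angle of the ray for that offset
    print(x, linear, exact, abs(linear - exact))
```

The two agree exactly at the image edge (where the FOV is defined) and the absolute error grows as you move from the edge toward intermediate offsets, so for narrow-FOV cameras the approximation stays small.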

(2016-06-14 18:04:15 -0500)

It works smoothly!

My dot products for a wide range of test points are above 0.98 :)

(2016-06-15 01:23:19 -0500)

And it works even more smoothly if I scale the pixel offset inside the atan() function:

```python
# x *= (2 * math.atan(w / (2 * camera_matrix[0][0]))) / w
# y *= (2 * math.atan(h / (2 * camera_matrix[1][1]))) / h
x = 2 * math.atan(x / (2 * camera_matrix[0][0]))
y = 2 * math.atan(y / (2 * camera_matrix[1][1]))
```


Then my worst dot product between the LOS and my original world point (normalized) is 0.999900007954.
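That level of agreement can be reproduced in a small self-contained sketch. The intrinsics and pixel offset below are hypothetical; the azimuth/elevation recipe follows the answer with the atan scaling, and the result is compared against the exact pinhole ray for the same pixel:

```python
import math

# Hypothetical intrinsics and an ideal (undistorted) pixel offset
fx, fy = 800.0, 800.0
x, y = 200.0, 100.0  # offsets from the principal point, in pixels

# Azimuth/elevation recipe, with the offsets scaled inside atan
az = math.atan2(y, x)
xa = 2 * math.atan(x / (2 * fx))
ya = 2 * math.atan(y / (2 * fy))
el = math.pi / 2 - math.sqrt(xa * xa + ya * ya)
los = (math.cos(el) * math.cos(az),
       math.cos(el) * math.sin(az),
       math.sin(el))

# Ground truth: the exact normalized pinhole ray for the same pixel
d = (x / fx, y / fy, 1.0)
n = math.sqrt(sum(c * c for c in d))
d = tuple(c / n for c in d)

dot = sum(a * b for a, b in zip(los, d))
```

With these values the dot product comes out just shy of 1, consistent with the figures reported in the thread.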

Kind regards

Jesper

(2016-06-15 01:34:17 -0500)

Alrighty, I'll add that change to the original source then. Glad I could help.

(2016-06-15 17:10:18 -0500)

Much appreciated 😀

(2016-06-15 18:09:21 -0500)
