Distance between Camera and Marker (calculate with Tvec)

asked 2019-09-10 13:18:34 -0500

So I have a marker of known size and set the world coordinate origin to the center of the marker. With solvePnP I calculated the corresponding rotation vector as well as the translation vector. Camera calibration was done in advance. When I project points given in world coordinates, they show up correctly on the display in pixel coordinates. Now I want to calculate the distance between the camera and the marker. If I understood everything correctly, the shift between the world coordinate system and the camera coordinate system is given by the translation vector. Accordingly, the distance between the marker and the camera should be the norm of the vector given by -Rvec(inverse)*Tvec. But if I do that, the distance is way too high (about 2.5×). Am I missing something here? How can I get the right distance between the camera and the marker?

Thanks in advance


Comments

I can't tell from your notation what you are doing, but if you are multiplying the two vectors component by component, I don't think that's what you want. Use cv::Rodrigues(Rvec) to get a rotation matrix, then multiply Tvec by Rmat(inverse) and see if that gives you the results you expect.

swebb_denver ( 2019-09-12 11:35:28 -0500 )

That is exactly what I'm doing (sorry for the misunderstanding). Even if I'm holding the smartphone parallel to the marker, so that the rotation matrix is the identity matrix, I get a translation vector that is too big. On the other hand, I don't understand why the projection of points from world coordinates to pixel coordinates works well. I have basically the same problem as someone on Stack Overflow: https://stackoverflow.com/questions/4...

Unfortunately, no one could give him an explanation for that.

Markus11123 ( 2019-09-12 13:54:57 -0500 )

My first guess would be that the intrinsic calibration is not very accurate (the focal length in particular). How many images did you use during intrinsic calibration? You need a good number of images, with a good amount of depth change among them (rotating the calibration target from one image to the next). What reprojection error did you achieve when you calibrated the camera?

But that's just a guess. Also, make sure that if the image points come from a distorted image, you call solvePnP with the distortion coefficients, and if the image points come from an undistorted image, you pass empty distortion coefficients to solvePnP.

Also, a smartphone camera's autofocus will change the focal length, and so will adjusting the zoom. Camera phones aren't the best for what you are doing.

swebb_denver ( 2019-09-12 16:59:58 -0500 )