OpenCV - Camera position relative to ArUco markers

asked 2015-08-24 10:37:35 -0600 by NicolasJamal

updated 2015-08-24 11:01:30 -0600 by LBerger

I've been playing around with ArUco markers for a while and now want to use them for a real purpose. I created a 2×2 board of markers, calibrated my camera, and printed out Rvec and Tvec (the rotation and translation vectors) of the detected board:

detectedBoard = TheBoardDetector.getDetectedBoard();
rvec = detectedBoard.Rvec;
tvec = detectedBoard.Tvec;

After that, I used a code snippet I found online to get the real position of the camera relative to the markers:

cv::Point3f CameraParameters::getCameraLocation(cv::Mat Rvec, cv::Mat Tvec)
{
    cv::Mat m33(3, 3, CV_32FC1);
    cv::Rodrigues(Rvec, m33);

    // copy the rotation matrix into the upper-left 3x3 block of a 4x4 matrix
    cv::Mat m44 = cv::Mat::eye(4, 4, CV_32FC1);
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            m44.at<float>(i, j) = m33.at<float>(i, j);

    // now, add translation information
    for (int i = 0; i < 3; i++)
        m44.at<float>(i, 3) = Tvec.at<float>(0, i);
    // invert the matrix
    m44.inv();
    return cv::Point3f(m44.at<float>(0, 0), m44.at<float>(0, 1), m44.at<float>(0, 2));
}

I tested that code, but I got seemingly random results: they change when I move the camera, and I don't know the unit of measurement. My questions:

1- Is that function correct? I don't know anything about the solvePnP or Rodrigues functions; I tried reading about them but didn't understand much.

2- Can I get the (x, y, z) coordinates of the camera relative to the markers, in meters?

3- How?

I know I might be a bit lazy for not testing those functions myself, but I guess this question will be a piece of cake for someone experienced in computer vision and OpenCV.

Thank you!

