ArUco tracking in OpenCV

I have been working on ArUco detection and tracking to get the location of a moving camera that is at a distance from a static marker. I used cv::solvePnP() to obtain the pose of the marker and, from that, the pose of the camera (Z-axis distance). I followed this "How to calculate the angle from rotation matrix" post to get the yaw, roll and pitch of the camera. The roll angle results are stable, but the yaw angle results are unstable. I have the following questions:

  1. As the distance of the camera from the marker increases, would there be any instability in detecting the corners? I am using a fairly large marker.
  2. The yaw angle results seem fine when the camera is close to the marker, but as the distance increases the "Z" axis flips, giving inverted sign values beyond a certain point. How do I handle this?
  3. Are there any other ways to calculate the yaw angle of the camera that would lead to stable results?

The code snippet:

    // Note: CV_ITERATIVE belongs in the flags argument; useExtrinsicGuess is false.
    cv::solvePnP(objectPoints, imagePoints, camMatrix, distCoeffs, rvecs1, tvecs1, false, CV_ITERATIVE);
    cv::Mat rotate;
    cv::Rodrigues(rvecs1, rotate);
    cv::Mat cam = rotate.t();        // rotation matrix (camera w.r.t. marker)
    cv::Mat camdist = -cam * tvecs1; // camera position in the marker frame
    double m11 = cam.at<double>(0, 0), m12 = cam.at<double>(0, 1), m13 = cam.at<double>(0, 2);
    double m21 = cam.at<double>(1, 0), m22 = cam.at<double>(1, 1), m23 = cam.at<double>(1, 2);
    double m31 = cam.at<double>(2, 0), m32 = cam.at<double>(2, 1), m33 = cam.at<double>(2, 2);
    double yaw = std::atan2(-m31, std::sqrt(m32 * m32 + m33 * m33));
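Since this kind of Euler-angle formula is easy to get subtly wrong, here is a small standalone sanity check (plain C++, no OpenCV; the ZYX rotation order and the angle values 0.3/0.4/0.2 are my own assumptions for illustration). It builds a rotation matrix from known angles and confirms that atan2(-m31, sqrt(m32² + m33²)) recovers the Y-axis angle:

    #include <cassert>
    #include <cmath>

    // Minimal 3x3 matrix, just enough to compose rotations.
    struct Mat3 { double m[3][3]; };

    Mat3 mul(const Mat3& A, const Mat3& B) {
        Mat3 C{};
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j)
                for (int k = 0; k < 3; ++k)
                    C.m[i][j] += A.m[i][k] * B.m[k][j];
        return C;
    }

    // Elementary rotations about the X, Y, and Z axes.
    Mat3 Rx(double a) { return {{{1, 0, 0}, {0, std::cos(a), -std::sin(a)}, {0, std::sin(a), std::cos(a)}}}; }
    Mat3 Ry(double a) { return {{{std::cos(a), 0, std::sin(a)}, {0, 1, 0}, {-std::sin(a), 0, std::cos(a)}}}; }
    Mat3 Rz(double a) { return {{{std::cos(a), -std::sin(a), 0}, {std::sin(a), std::cos(a), 0}, {0, 0, 1}}}; }

    int main() {
        // Compose R = Rz * Ry * Rx with a known Y-axis angle of 0.4 rad.
        Mat3 R = mul(Rz(0.3), mul(Ry(0.4), Rx(0.2)));
        double m31 = R.m[2][0], m32 = R.m[2][1], m33 = R.m[2][2];
        double yaw = std::atan2(-m31, std::sqrt(m32 * m32 + m33 * m33));
        assert(std::fabs(yaw - 0.4) < 1e-12); // recovers the middle (Y-axis) angle
        return 0;
    }

If the convention in the rotation-matrix reference you follow differs (e.g. Rx·Ry·Rz order), the element indices in the formula change accordingly, which could itself explain sign flips.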
 

Any help would be greatly appreciated.