Need explanation of the rvecs returned by solvePnP

asked 2017-03-14 08:48:04 -0600 by b2meer

I am using ArUco for pose estimation, and I want to get the global world coordinates of my camera from, say, a single detected ArUco marker. For this, I need to know the rotation of the camera with respect to the marker about the y-axis (the upward/downward axis). The output of ArUco / solvePnP gives me rvecs, which contains a rotation vector. I don't really understand how this rotation vector represents an angle of rotation. I can convert it to a rotation matrix using Rodrigues, but I still don't get the actual roll, pitch, and yaw angles (i.e. the rotations about the x, y, and z axes), which are what I really need.

So, can anyone explain how to interpret the rotation vector in rvecs, and how to get simple rotation angles about the three axes from it? Thanks.
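For reference, here is a minimal sketch (Python/OpenCV assumed, since no code is shown in the thread) of what rvec encodes and one way to turn it into Euler angles. The rvec is an axis-angle (Rodrigues) vector: its direction is the rotation axis and its norm is the rotation angle in radians. The helper name and the R = Rz(yaw) * Ry(pitch) * Rx(roll) decomposition order are my choices, not the only possible convention.

    import cv2
    import numpy as np

    def rvec_to_euler_degrees(rvec):
        """Convert a solvePnP rotation vector to (roll, pitch, yaw) in degrees,
        assuming the decomposition R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
        # The norm of rvec is the rotation angle; its direction is the axis.
        R, _ = cv2.Rodrigues(np.asarray(rvec, dtype=np.float64).reshape(3, 1))
        sy = np.hypot(R[0, 0], R[1, 0])
        if sy > 1e-6:
            roll  = np.arctan2(R[2, 1], R[2, 2])   # rotation about x
            pitch = np.arctan2(-R[2, 0], sy)       # rotation about y
            yaw   = np.arctan2(R[1, 0], R[0, 0])   # rotation about z
        else:                                      # gimbal lock (pitch near +/-90 deg)
            roll  = np.arctan2(-R[1, 2], R[1, 1])
            pitch = np.arctan2(-R[2, 0], sy)
            yaw   = 0.0
        return np.degrees([roll, pitch, yaw])

    # Example with a made-up rvec; in practice it comes from solvePnP / ArUco.
    print(rvec_to_euler_degrees([0.1, -2.9, 0.05]))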


Comments

Note: it is also possible to compute the Euler angles directly from the Rodrigues (axis-angle) rotation vector. You just need to find the correct equations.

Eduardo (2017-03-14 09:20:57 -0600)
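If SciPy is available (an assumption; it is not mentioned in the thread and the Rotation class needs scipy >= 1.2), it goes straight from the axis-angle rvec to Euler angles, which is one way to do what Eduardo describes without deriving the equations by hand:

    import numpy as np
    from scipy.spatial.transform import Rotation

    rvec = np.array([0.1, -2.9, 0.05])   # hypothetical solvePnP output as a flat 3-vector
    angles = Rotation.from_rotvec(rvec).as_euler("xyz", degrees=True)
    print(angles)                        # rotations about x, y, z for this axis order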

Check out Euclidean Space. It has explanations of, and equations for converting between, just about every rotation representation.

Tetragramm (2017-03-14 18:50:31 -0600)

I tried the matrix-to-Euler conversion as implemented in the link shared by @Eduardo. The attitude angle comes out fine, which in my case is the rotation about the z-axis (the line between the camera and the marker). The other two angles don't make sense. What I actually want is the rotation angle about the y-axis (the axis pointing up/down). Any idea where I am going wrong?

b2meer (2017-03-15 06:08:49 -0600)
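One thing worth checking at this point (a sketch, assuming the mismatch is only an Euler-convention issue): the heading/attitude/bank equations on euclideanspace.com assume a y-up frame, while OpenCV's camera frame has +Z forward and +Y down, so the same rotation matrix decomposed with different axis orders puts the "interesting" angle on different axes. Decomposing one matrix with several orders makes the mapping visible (SciPy assumed; Rotation.from_matrix needs scipy >= 1.4):

    import cv2
    import numpy as np
    from scipy.spatial.transform import Rotation

    rvec = np.array([[0.2], [-1.4], [0.1]])   # hypothetical rvec from solvePnP
    R, _ = cv2.Rodrigues(rvec)
    for order in ("xyz", "zyx", "yxz"):
        # Compare how each Euler order distributes the same rotation.
        print(order, Rotation.from_matrix(R).as_euler(order, degrees=True))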

It looks like you don't fully understand what these angles represent. Check out this page and try to understand what it means and how to reverse it. The drawAxis function from ArUco can be a useful tool.

Tetragramm (2017-03-15 17:18:33 -0600)
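For what it's worth, a hedged sketch of the detect-and-draw flow Tetragramm mentions (opencv-contrib Python bindings assumed; camera_matrix, dist_coeffs, and frame.png are placeholders, not values from the thread). drawAxis overlays the marker's x/y/z axes on the image, which makes it easy to see which physical direction each component of rvec refers to:

    import cv2
    import numpy as np

    camera_matrix = np.array([[800.0,   0.0, 320.0],
                              [  0.0, 800.0, 240.0],
                              [  0.0,   0.0,   1.0]])   # placeholder intrinsics
    dist_coeffs = np.zeros(5)                           # placeholder distortion

    frame = cv2.imread("frame.png")                     # placeholder input image
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(frame, dictionary)
    if ids is not None:
        # Marker side length 0.05 m; some contrib versions also return object points.
        result = cv2.aruco.estimatePoseSingleMarkers(corners, 0.05,
                                                     camera_matrix, dist_coeffs)
        rvecs, tvecs = result[0], result[1]
        for rvec, tvec in zip(rvecs, tvecs):
            cv2.aruco.drawAxis(frame, camera_matrix, dist_coeffs,
                               rvec, tvec, 0.03)        # 3 cm axes
    cv2.imshow("pose", frame)
    cv2.waitKey(0)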

Yeah, I'm a bit confused by these values since this is the first time I am doing pose estimation. My understanding so far is that I need the rotation angle about the y-axis to find the world coordinates of my camera. I am trying to find the rotation about the y-axis, but it is only the rotation angle about the z-axis that comes out correctly. Please correct me if my concept, or my approach for calculating the camera's world coordinates, is wrong. I have also gone through the articles you suggested, and I am already using ArUco's drawAxis function. Thanks.

b2meer (2017-03-16 07:59:18 -0600)

Don't know whether you got your answer yet, but bear in mind that (1) the rvec and tvec from solvePnP describe the transform that maps points from the object (marker) coordinate frame into the camera coordinate frame (X_cam = R * X_obj + tvec), so you have to invert it to get the camera's pose relative to the marker, and (2) - which always gets me - OpenCV defines the camera coordinate frame as +Z = forward (from the sensor through the lens into the world), +X = right, +Y = down.

WillC (2017-04-26 01:25:40 -0600)
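Building on that, a minimal sketch (the helper name is mine) of inverting the solvePnP transform to get the camera's position and orientation expressed in the marker's frame, which is what "the world coordinates of my camera" ultimately requires when a single marker defines the world frame:

    import cv2
    import numpy as np

    def camera_pose_in_marker_frame(rvec, tvec):
        # rvec/tvec map marker coordinates into the camera frame:
        #   X_cam = R @ X_marker + t
        R, _ = cv2.Rodrigues(np.asarray(rvec, dtype=np.float64).reshape(3, 1))
        R_inv = R.T                                        # camera -> marker rotation
        cam_pos = -R_inv @ np.asarray(tvec, dtype=np.float64).reshape(3, 1)
        cam_rvec, _ = cv2.Rodrigues(R_inv)                 # camera orientation as axis-angle
        return cam_pos, cam_rvec

Feeding cam_rvec into an Euler decomposition (as in the earlier sketch) then gives the camera's rotation about the marker's y-axis.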

This question, its comments, and its answers should also help point you in the right direction: http://answers.opencv.org/question/92211/solvepnp-inconsistency-when-rotating-markers-on-a-plane/

WillC (2017-04-26 01:29:46 -0600)