OpenCV Q&A Forum - RSS feed
http://answers.opencv.org/questions/
Copyright [OpenCV foundation](http://www.opencv.org), 2012-2018. Mon, 02 Dec 2019 13:54:19 -0600

Why does OpenCV use the Rodrigues rotation vector instead of Cayley's formula?
http://answers.opencv.org/question/222656/why-opencv-uses-rodrigues-rotation-vector-instead-of-cayleys-formula/

I was a fan of the Rodrigues rotation formula [1] until I learned about Cayley's formula for orthonormal matrices [2]. If I understand correctly, Rodrigues is not necessarily meant to produce a 3x3 rotation matrix, but to rotate a vector with a computationally light formula. So maybe OpenCV uses this formula internally to do all transformations, skipping the calculation of the rotation matrix?
But when it comes to producing a 3x3 rotation matrix it seems, at a glance, that Cayley's formula is much simpler.
So why not use Cayley instead of Rodrigues?
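For a concrete comparison (my own sketch, not part of the original post), here are both constructions in NumPy. They produce the same matrix for the same axis and angle; the practical difference is that Cayley's parameter is tan(θ/2) times the axis, which diverges as θ approaches π, so the Cayley parametrization cannot represent 180° rotations.

```python
import numpy as np

def skew(v):
    # cross-product (skew-symmetric) matrix of a 3-vector
    x, y, z = v
    return np.array([[0.0, -z, y], [z, 0.0, -x], [-y, x, 0.0]])

def rodrigues_matrix(axis, theta):
    # Rodrigues: R = I + sin(theta) K + (1 - cos(theta)) K^2, K = skew(unit axis)
    K = skew(np.asarray(axis, float) / np.linalg.norm(axis))
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def cayley_matrix(axis, theta):
    # Cayley: R = (I - A)^-1 (I + A), with A = tan(theta/2) * skew(unit axis);
    # undefined at theta = pi, where tan(theta/2) diverges
    A = np.tan(theta / 2.0) * skew(np.asarray(axis, float) / np.linalg.norm(axis))
    I = np.eye(3)
    return np.linalg.solve(I - A, I + A)

R_rod = rodrigues_matrix([1.0, 2.0, 3.0], 0.9)
R_cay = cayley_matrix([1.0, 2.0, 3.0], 0.9)
# away from theta = pi the two constructions agree
```

Either way the result is orthonormal; the choice is about the parametrization and its singularities, not about the matrix itself.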
[1] https://docs.opencv.org/ref/master/d9/d0c/group__calib3d.html#ga61585db663d9da06b68e70cfbf6a1eac
[2] Murray, R. M.; Li, Z. & Sastry, S. S., *A Mathematical Introduction to Robotic Manipulation*, CRC Press, 1994, pp. 29, 214-222.

JamesBaxter, Mon, 02 Dec 2019 13:54:19 -0600
http://answers.opencv.org/question/222656/

Rotation matrix to rotation vector (Rodrigues function)
http://answers.opencv.org/question/85360/rotation-matrix-to-rotation-vector-rodrigues-function/

Hello,
I have a 3x3 rotation matrix that I obtained from stereoCalibrate (using the ROS stereo calibration node). I need to obtain a rotation vector (1x3), so I used the Rodrigues formula. But when I checked the result in Matlab with the [rodrigues.m function by Pietro Perona (California Institute of Technology)](http://www.mathworks.com/matlabcentral/fileexchange/41511-deprecated-light-field-toolbox-v0-2-v0-3-now-available/content/LFToolbox0.2/SupportFunctions/CameraCal/rodrigues.m), I got two different results:
This is the C++ code:
#include <ros/ros.h>
#include <image_transport/image_transport.h>
#include <tf/transform_broadcaster.h>
#include <ros/param.h>
#include <iostream>
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/calib3d/calib3d.hpp>
#include <cv_bridge/cv_bridge.h>
using namespace std;
using namespace cv;
int main(int argc, char** argv)
{
    std::vector<double> T, R;
    double cols, rows;
    int i, j;
    cv::Mat rot_vec = Mat::zeros(1,3,CV_64F), rot_mat = Mat::zeros(3,3,CV_64F);

    ros::init(argc, argv, "get_extrinsic");
    ros::NodeHandle node;

    if(!node.getParam("translation_vector/cols", cols))
    {
        ROS_ERROR_STREAM("Translation vector (cols) could not be read.");
        return 0;
    }
    if(!node.getParam("translation_vector/rows", rows))
    {
        ROS_ERROR_STREAM("Translation vector (rows) could not be read.");
        return 0;
    }
    T.reserve(cols*rows);

    if(!node.getParam("rotation_matrix/cols", cols))
    {
        ROS_ERROR_STREAM("Rotation matrix (cols) could not be read.");
        return 0;
    }
    if(!node.getParam("rotation_matrix/rows", rows))
    {
        ROS_ERROR_STREAM("Rotation matrix (rows) could not be read.");
        return 0;
    }
    R.reserve(cols*rows);

    if(!node.getParam("translation_vector/data", T))
    {
        ROS_ERROR_STREAM("Translation vector could not be read.");
        return 0;
    }
    if(!node.getParam("rotation_matrix/data", R))
    {
        ROS_ERROR_STREAM("Rotation matrix could not be read.");
        return 0;
    }

    for(i=0; i<3; i++)
    {
        for(j=0; j<3; j++)
            rot_mat.at<double>(i,j) = R[i*3+j];
    }

    std::cout << "Rotation Matrix:" << endl;
    for(i=0; i<3; i++)
    {
        for(j=0; j<3; j++)
            std::cout << rot_mat.at<double>(i,j) << " ";
        std::cout << endl;
    }
    std::cout << endl;

    std::cout << "Rodrigues: " << endl;
    Rodrigues(rot_mat, rot_vec);
    // rot_vec is 1x3, so the row index must be 0; the original code used
    // rot_vec.at<double>(1,i), which reads past the end of the matrix and
    // explains the shifted values and the 4.94066e-324 in the output below
    for(i=0; i<3; i++)
        std::cout << rot_vec.at<double>(0,i) << " ";
    std::cout << endl;

    ros::spin();
    return 0;
}
And its output is:
Rotation Matrix:
-0.999998 -0.00188887 -0.000125644
0.0018868 -0.999888 0.014822
-0.000153626 0.0148217 0.99989
Rodrigues:
0.0232688 3.13962 4.94066e-324
But when I load the same rotation matrix in matlab and use the rodrigues function I get the following:
R =
-1.0000 -0.0019 -0.0001
0.0019 -0.9999 0.0148
-0.0002 0.0148 0.9999
>> rodrigues(R)
ans =
-0.0002
0.0233
3.1396
I can see that the numbers match, but they are in different positions, and there also seems to be an issue with the signs. Which implementation should I trust?

aripod, Mon, 25 Jan 2016 07:54:16 -0600
http://answers.opencv.org/question/85360/

Method for finding Orientation Error using Axis-Angle
http://answers.opencv.org/question/193675/method-for-finding-orientation-error-using-axis-angle/

Hi,
I have reference values for roll, pitch and yaw (Euler angles) and my estimates of the same, and I want to find the error between the two. If I convert each set of RPY values to a rotation matrix, I see some possible ways (see below) of finding the orientation error.
I recently came across this OpenCV function in the calib3d module: [get_rotation_error](https://github.com/opencv/opencv/pull/11506), which uses Rodrigues/axis-angle (I think they mean the same thing) for finding the error between two rotation matrices.
**I have 2 questions** -
1) In the method given in [get_rotation_error](https://github.com/opencv/opencv/pull/11506), it seems to "subtract" the two rotation matrices by transposing one (not sure what the negative sign is about)
error_mat = R_ref * R.transpose() * -1
error = cv::norm( cv::Rodrigues ( error_mat ))
**How are we supposed to interpret the output?** (I believe the output of cv::norm(rodrigues_vector) is the angle of the Rodrigues vector according to the OpenCV convention.) Does this mean I simply need to convert it to degrees to find the angle error (between the reference and my estimate) in degrees?
I would also like to mention that **this method keeps returning 3.14159** even for wildly different values of the reference and my estimates. Is there something that I'm missing?
======
2) I thought of another, slightly different method. What if I do the following:
my_angle = cv::norm (cv::Rodrigues ( R ))
reference_angle = cv::norm (cv::Rodrigues ( R_ref ))
error = reference_angle - my_angle
**Is there something wrong** with method 2? I have tried it, and it gives a different output compared to method 1.
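As a numeric illustration (my own sketch, not from the thread), the two methods can be compared on arbitrary example rotations; `rotation_angle` below is a plain-NumPy stand-in for cv::norm(cv::Rodrigues(R)):

```python
import numpy as np

def rot_x(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_z(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rotation_angle(R):
    # rotation angle of R, equal to cv::norm(cv::Rodrigues(R)) for angles in [0, pi]
    return np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))

R_ref = rot_z(0.5)   # example reference orientation
R_est = rot_x(0.5)   # example estimate: same angle, but about a different axis

# method 1 (geodesic distance): angle of the relative rotation
err_relative = rotation_angle(R_ref @ R_est.T)

# method 2: difference of the individual rotation angles -- zero here,
# even though the two orientations clearly differ
err_difference = rotation_angle(R_ref) - rotation_angle(R_est)
```

Method 2 compares only the magnitudes of the two rotation vectors and ignores their axes, so it can report zero error for clearly different orientations, while the relative-rotation angle of method 1 (taken without the extra -1 factor) does not.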
I would be very grateful if someone can answer the above queries or even point me in the right direction.
Thanks!

malharjajoo, Tue, 12 Jun 2018 21:05:54 -0500
http://answers.opencv.org/question/193675/

Rodrigues rotation
http://answers.opencv.org/question/163351/rodrigues-rotation/

I do not understand the difference between these two equations:
1. From Wikipedia:
![wiki Rodrigues formula](https://wikimedia.org/api/rest_v1/media/math/render/svg/14de5f7bfa4af6a7867008d8fd790d14e3a54530)
https://en.wikipedia.org/wiki/Rodrigues%27_rotation_formula
2. From the OpenCV docs:
![cv2 Rodrigues formal](http://docs.opencv.org/2.4/_images/math/8bffbe8d9297cebc136dc8ead9a40cad3940a640.png)
http://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html#void%20Rodrigues(InputArray%20src,%20OutputArray%20dst,%20OutputArray%20jacobian)
Where has the **cos(θ)** gone in formula 1 on the wiki page?
Shouldn't it be: v_{rot} = cos(θ)v + sin.... ?
Then, on the wiki page, there is no cos(θ) in the definition of R anymore...
Or did I miss something?
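One possible resolution (my own sketch, assuming formula 1 is the cross-product form of the Rodrigues formula): the expansion v + sin(θ)(k×v) + (1−cos(θ)) k×(k×v) has no explicit cos(θ)v term, yet it equals the form that does have one, because k×(k×v) = k(k·v) − v. A quick numeric check:

```python
import numpy as np

rng = np.random.default_rng(0)
k = rng.normal(size=3)
k /= np.linalg.norm(k)          # unit rotation axis
v = rng.normal(size=3)          # arbitrary vector to rotate
theta = 0.8

# cross-product form: no explicit cos(theta)*v term
v1 = (v + np.sin(theta) * np.cross(k, v)
        + (1.0 - np.cos(theta)) * np.cross(k, np.cross(k, v)))

# form with the explicit cos(theta)*v term
v2 = (np.cos(theta) * v + np.sin(theta) * np.cross(k, v)
        + (1.0 - np.cos(theta)) * k * np.dot(k, v))
```

The two expressions agree for every axis, angle, and vector; the cos(θ)v term is hidden inside the double cross product.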
swiss_knight, Sun, 02 Jul 2017 16:47:32 -0500
http://answers.opencv.org/question/163351/

Rotation and Translation question
http://answers.opencv.org/question/95782/rotation-and-translation-question/

I've used OpenCV's calibrateCamera function to get the rvecs and tvecs, to obtain the transformation from my camera at world coordinates (0,0,0) to a chessboard. I'm using it to find the corners of the chessboard with respect to the camera.
I'm confused as to how to apply the rvec and tvec to get the transformation. Do I just compute (0,0,0)*rvec + tvec? If so, how do I invert the transformation? I'll need the inverse for something else as well.
After doing some research online, I realized I may have to use the Rodrigues() function to get the rotation matrix from rvec, augment it with tvec, and then append the row (0,0,0,1) to get a 4x4 transformation matrix. That way I could compute the inverse. However, if that is the case, how do I multiply it by (0,0,0)? Am I just supposed to do (0,0,0,1)*(rotm|tvec, 0,0,0,1)?
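The 4x4 route described above can be sketched in NumPy (my own illustration; `rodrigues_to_matrix` is a plain-NumPy stand-in for cv::Rodrigues, and the rvec/tvec values are made up):

```python
import numpy as np

def rodrigues_to_matrix(rvec):
    """Axis-angle (rotation vector) to 3x3 rotation matrix, as cv::Rodrigues does."""
    rvec = np.asarray(rvec, dtype=float)
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    x, y, z = rvec / theta
    K = np.array([[0, -z, y], [z, 0, -x], [-y, x, 0]])  # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def make_transform(rvec, tvec):
    """Stack [R | t] and append the row (0,0,0,1) to get a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = rodrigues_to_matrix(rvec)
    T[:3, 3] = tvec
    return T

def invert_transform(T):
    """Closed-form inverse of a rigid transform: [R^T | -R^T t]."""
    Ti = np.eye(4)
    R = T[:3, :3]
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ T[:3, 3]
    return Ti

T = make_transform([0.1, -0.2, 0.3], [4.0, 5.0, 6.0])
# transforming the origin in homogeneous coordinates just yields tvec
origin_mapped = T @ np.array([0.0, 0.0, 0.0, 1.0])
```

In this column-vector convention the point goes on the right, so mapping the camera origin is T @ (0,0,0,1), which simply returns tvec; the closed-form inverse avoids a general 4x4 inversion.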
I'm not sure which of the two methods to use, and I'm unsure how to proceed with either of them.

sourmanb, Tue, 07 Jun 2016 01:19:52 -0500
http://answers.opencv.org/question/95782/

Rodrigues giving empty Matrix
http://answers.opencv.org/question/88581/rodrigues-giving-empty-matrix/

Hi,
I'm implementing marker-based tracking on Android, and I'm stuck at the rotation matrix. Up until solvePnP everything works fine, and I get results (rvec, tvec) that seem legit. But after passing the rvec to Rodrigues to obtain a rotation matrix, the result is a 3x3 matrix filled only with zeroes. I really don't get this, because there are definitely values in rvec, but Rodrigues just doesn't seem to use them. I hope somebody knows what's wrong.
Here's the code snippet:
    Mat rvec = new Mat(3, 1, CvType.CV_64F);
    Mat tvec = new Mat(3, 1, CvType.CV_64F);
    MatOfPoint3f markerPointsMat = new MatOfPoint3f(new Point3(0,0,0), new Point3(1,0,0),
                                                    new Point3(1,1,0), new Point3(0,1,0));
    MatOfPoint2f pointsMat = new MatOfPoint2f(points.get(0), points.get(1),
                                              points.get(2), points.get(3));

    // camera matrix, filled in by hand
    // note: the original called cam.zeros(...), but Mat.zeros() returns a new Mat
    // rather than zeroing in place, so the unset entries stayed uninitialized
    Mat cam = Mat.zeros(3, 3, CvType.CV_64F);
    cam.put(0, 0, 1400.533595140377);
    cam.put(0, 2, 175.5);
    cam.put(1, 1, 1400.533595140377);
    cam.put(1, 2, 143.5);
    cam.put(2, 2, 1.0);

    double[] a = {0.1155606860709641, -0.3046292381782507, 0, 0, 0};
    MatOfDouble dist = new MatOfDouble(a);
    Calib3d.solvePnP(markerPointsMat, pointsMat, cam, dist, rvec, tvec);

    // rotation vector to matrix (in the Java bindings this is Calib3d.Rodrigues)
    Mat rmat = Mat.zeros(3, 3, CvType.CV_64F);
    Calib3d.Rodrigues(rvec, rmat);

eqnx, Thu, 25 Feb 2016 09:28:12 -0600
http://answers.opencv.org/question/88581/