
Revision history

Revision 1 (initial version)

Wrong result in calibrateHandEye function

I am working on a robot application in which a camera is fixed to the robot gripper. To calculate the transformation matrix between the camera and the gripper, Hcg, I am using the new calibrateHandEye function provided in OpenCV 4.1.

The following picture shows the description of the function from the OpenCV documentation:

(image: calibrateHandEye — function description from the OpenCV documentation)
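
For anyone who cannot see the image: paraphrasing the documentation from memory (a summary, not a quote), calibrateHandEye estimates the camera-to-gripper transform such that the target-to-base chain comes out the same for every robot station,

    $$ {}^{b}T_{g}^{(i)} \; {}^{g}T_{c} \; {}^{c}T_{t}^{(i)} = \mathrm{const} \quad \text{for all } i, $$

which is the classical hand-eye equation $A_i X = X B_i$ with $X = {}^{g}T_{c}$, where the $A_i$ are relative gripper motions and the $B_i$ relative camera motions.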

To use this function I am editing the camera_calibration example from samples\cpp\tutorial_code\calib3d\.

I took 10 pictures of the chessboard with the camera mounted on the gripper and recorded the robot pose at the same time.

To get R_gripper2base I loaded the RPY angles from the robot and converted them to rotation matrices using a function, eulerAnglesToRotationMatrix, that I found on the Internet:

Vec3f theta_01{ deg2rad(-153.61), deg2rad(8.3),   deg2rad(-91.91) };
Mat robot_rot_01 = eulerAnglesToRotationMatrix(theta_01);
R_gripper2base.push_back(robot_rot_01); // and for the other 9 angles

Mat eulerAnglesToRotationMatrix(Vec3f &theta)
{
    // Calculate rotation about x axis
    Mat R_x = (Mat_<double>(3, 3) <<
        1, 0, 0,
        0, cos(theta[0]), -sin(theta[0]),
        0, sin(theta[0]), cos(theta[0])
        );
    // Calculate rotation about y axis
    Mat R_y = (Mat_<double>(3, 3) <<
        cos(theta[1]), 0, sin(theta[1]),
        0, 1, 0,
        -sin(theta[1]), 0, cos(theta[1])
        );
    // Calculate rotation about z axis
    Mat R_z = (Mat_<double>(3, 3) <<
        cos(theta[2]), -sin(theta[2]), 0,
        sin(theta[2]), cos(theta[2]), 0,
        0, 0, 1);
    // Combined rotation matrix
    Mat R = R_z * R_y * R_x;
    return R;
}
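
A caveat that matters for this question: the helper above composes R = R_z * R_y * R_x, i.e. rotations about the fixed X, then Y, then Z axes (equivalently intrinsic Z-Y'-X''). Robot controllers disagree on what "RPY" means, so if the robot reports its angles in the other common convention, the matching conversion is the reversed composition. A hedged sketch (the helper name is mine, purely for testing):

    // Sketch only: same single-axis factors as above, composed in the other
    // common order. Use it to test whether the robot's RPY convention is
    // actually X*Y*Z rather than Z*Y*X.
    Mat eulerAnglesToRotationMatrixXYZ(Vec3f &theta)
    {
        Mat R_x = (Mat_<double>(3, 3) <<
            1, 0, 0,
            0, cos(theta[0]), -sin(theta[0]),
            0, sin(theta[0]), cos(theta[0]));
        Mat R_y = (Mat_<double>(3, 3) <<
            cos(theta[1]), 0, sin(theta[1]),
            0, 1, 0,
            -sin(theta[1]), 0, cos(theta[1]));
        Mat R_z = (Mat_<double>(3, 3) <<
            cos(theta[2]), -sin(theta[2]), 0,
            sin(theta[2]), cos(theta[2]), 0,
            0, 0, 1);
        return R_x * R_y * R_z; // note the reversed multiplication order
    }

If neither order reproduces the orientation matrices the controller itself reports, the robot most likely uses yet another convention (e.g. Euler ZYZ), and its manual is the authority.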

To get t_gripper2base I loaded the translations as:

const Mat robot_tr_01 = (Mat_<float>(3, 1) << 781.2, 338.59, 903.48); //in mm
t_gripper2base.push_back(robot_tr_01); // the same for the other 9 positions

To get the last two parameters, R_target2cam and t_target2cam, I take the values from the bigmat matrix (which stores the rotation and translation of each view in the camera_calibration example) and use the Rodrigues function to obtain the rotation matrix for the chessboard pose in each picture.

cv::Mat mrot;

for (size_t i = 0; i < rvecs.size(); i++)
{
    Mat Rt = bigmat(Range(int(i), int(i + 1)), Range(0, 3)); //values in radians
    Mat Tr = bigmat(Range(int(i), int(i + 1)), Range(3, 6));

    Rodrigues(Rt, mrot);
    R_target2cam.push_back(mrot.clone()); // clone: mrot is reused every iteration, so push a deep copy
    t_target2cam.push_back(Tr);
}

Finally, I call the function with the following lines of code:

Mat R_cam2gripper = (Mat_<float>(3, 3));
Mat t_cam2gripper = (Mat_<float>(3, 1));

calibrateHandEye(R_gripper2base, t_gripper2base, R_target2cam, t_target2cam, R_cam2gripper, t_cam2gripper);
Vec3f R_cam2gripper_r = rotationMatrixToEulerAngles(R_cam2gripper);

The function gives me the following result:

    R_cam2gripper =
     [1, -6.352335846505455e-18, 7.59681797025031e-18;
     6.352335846505455e-18, 1, 2.469047932539178e-17;
     -7.59681797025031e-18, -2.469047932539178e-17, 1]

    R_cam2gripper_r =
    [-0.000, 0.000, 0.000]

    t_cam2gripper =
     [105.9198015726369;
     77.66945493316197;
     157.7887763879414]

While the translation vector is close to reality (it should be 103 mm, 84 mm and -132 mm; the sign of the last component comes out opposite), the rotation matrix is, in fact, the identity matrix, when I would expect something close to Rx = -1.4°, Ry = -0.8°, Rz = 178°.

The 'good' results I am expecting were provided by another software package. With its values the robot reaches the exact points I pick in the camera (I have a 3D camera, from which I get x, y, z in the camera frame), so they are certainly correct.
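
Since the expected translation mainly differs by a sign on the last component, one thing worth ruling out is that the other software reports the inverse transform (gripper-to-camera rather than camera-to-gripper). A hedged sketch of that check, using the output Mats from above:

    // Sketch: invert the estimated transform with inv([R|t]) = [R^T | -R^T t]
    // and compare both translations against the reference values.
    Mat R64, t64;
    R_cam2gripper.convertTo(R64, CV_64F);
    t_cam2gripper.convertTo(t64, CV_64F);

    Mat R_gripper2cam = R64.t();              // transpose == inverse for a rotation
    Mat t_gripper2cam = -R_gripper2cam * t64; // translation of the inverse transform

    cout << "t cam->gripper:\n" << t64 << endl;
    cout << "t gripper->cam:\n" << t_gripper2cam << endl;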

Sorry for the long introduction to my problem. Does anyone understand why the solution is not what it is supposed to be? My guess is that I misunderstand the angles or convert them incorrectly, but I could not find a way to solve this. Any hint is most welcome!

Latest revision

Wrong result in calibrateHandEye function

I am working on a robot application in which a camera is fixed to the robot gripper. To calculate the transformation matrix between the camera and the gripper, Hcg, I am using the new calibrateHandEye function provided in OpenCV 4.1.

The following picture shows the description of the function from the OpenCV documentation:

(image: calibrateHandEye — function description from the OpenCV documentation)

To use this function I am editing the camera_calibration example from samples\cpp\tutorial_code\calib3d\.

I took 10 pictures of the chessboard with the camera mounted on the gripper and recorded the robot pose at the same time.

I changed the code; it now looks like this:

// My_handeye.cpp : This file contains the 'main' function. Program execution begins and ends there.
//
#include "pch.h" // MSVC precompiled header; it must be the first #include in the file

#include <iostream>
#include <sstream>
#include <string>
#include <ctime>
#include <cstdio>

#include <opencv2/opencv.hpp>
#include <opencv2/core.hpp>
#include <opencv2/core/utility.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/calib3d.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/videoio.hpp>
#include <opencv2/highgui.hpp>

using namespace cv;
using namespace std;

Mat eulerAnglesToRotationMatrix(Vec3f &theta);
Vec3f rotationMatrixToEulerAngles(Mat &R);
float rad2deg(float radian);
float deg2rad(float degree);

int main()
{

    // Camera calibration information

    std::vector<double> distortionCoefficients(5);  // camera distortion
    distortionCoefficients[0] = 2.4472856611074989e-01;
    distortionCoefficients[1] = -8.1042032574246325e-01;
    distortionCoefficients[2] = 0;
    distortionCoefficients[3] = 0;
    distortionCoefficients[4] = 7.8769462320821060e-01;

    double f_x = 1.3624172121852105e+03; // Focal length in x axis
    double f_y = 1.3624172121852105e+03; // Focal length in y axis (usually the same?)
    double c_x = 960; // Camera primary point x
    double c_y = 540; // Camera primary point y

    cv::Mat cameraMatrix(3, 3, CV_32FC1);
    cameraMatrix.at<float>(0, 0) = f_x;
    cameraMatrix.at<float>(0, 1) = 0.0;
    cameraMatrix.at<float>(0, 2) = c_x;
    cameraMatrix.at<float>(1, 0) = 0.0;
    cameraMatrix.at<float>(1, 1) = f_y;
    cameraMatrix.at<float>(1, 2) = c_y;
    cameraMatrix.at<float>(2, 0) = 0.0;
    cameraMatrix.at<float>(2, 1) = 0.0;
    cameraMatrix.at<float>(2, 2) = 1.0;

    Mat rvec(3, 1, CV_32F), tvec(3, 1, CV_32F);
    //

    std::vector<Mat> R_gripper2base;
    std::vector<Mat> t_gripper2base;
    std::vector<Mat> R_target2cam;
    std::vector<Mat> t_target2cam;
    Mat R_cam2gripper = (Mat_<float>(3, 3));
    Mat t_cam2gripper = (Mat_<float>(3, 1));

    vector<String> fn;
    glob("images/*.bmp", fn, false);

    vector<Mat> images;
    size_t num_images = fn.size(); //number of bmp files in images folder
    Size patternsize(6, 8); // interior chessboard corners (per row, per column)
    vector<Point2f> centers; //this will be filled by the detected centers
    float cell_size = 30; // chessboard square size; must be in the same units as the robot translations (mm here)
    vector<Point3f> obj_points;

    R_gripper2base.reserve(num_images);
    t_gripper2base.reserve(num_images);
    R_target2cam.reserve(num_images);
    t_target2cam.reserve(num_images);

    for (int i = 0; i < patternsize.height; ++i)
        for (int j = 0; j < patternsize.width; ++j)
            obj_points.push_back(Point3f(float(j*cell_size),
                float(i*cell_size), 0.f));

    for (size_t i = 0; i < num_images; i++)
        images.push_back(imread(fn[i]));

    Mat frame;

    for (size_t i = 0; i < num_images; i++)
    {
        frame = imread(fn[i]); //source image
        bool patternfound = findChessboardCorners(frame, patternsize, centers);
        if (patternfound)
        {
            drawChessboardCorners(frame, patternsize, Mat(centers), patternfound);
            //imshow("window", frame);
            //int key = cv::waitKey(0) & 0xff;
            solvePnP(Mat(obj_points), Mat(centers), cameraMatrix, distortionCoefficients, rvec, tvec);

            Mat R;
            Rodrigues(rvec, R); // R is 3x3
            R_target2cam.push_back(R);
            t_target2cam.push_back(tvec.clone()); // clone: tvec is reused every iteration, so push a deep copy
            Mat T = Mat::eye(4, 4, R.type()); // T is 4x4
            T(Range(0, 3), Range(0, 3)) = R * 1; // copies R into T
            T(Range(0, 3), Range(3, 4)) = tvec * 1; // copies tvec into T

            cout << "T = " << endl << " " << T << endl << endl;

        }
        cout << patternfound << endl;
    }

    Vec3f theta_01{ deg2rad(-153.61), deg2rad(8.3),   deg2rad(-91.91) };
    Vec3f theta_02{ deg2rad(-166.71), deg2rad(3.04),  deg2rad(-93.31) };
    Vec3f theta_03{ deg2rad(-170.04), deg2rad(24.92), deg2rad(-88.29) };
    Vec3f theta_04{ deg2rad(-165.71), deg2rad(24.68), deg2rad(-84.85) };
    Vec3f theta_05{ deg2rad(-160.18), deg2rad(-15.94),deg2rad(-56.24) };
    Vec3f theta_06{ deg2rad(175.68),  deg2rad(10.95), deg2rad(180) };
    Vec3f theta_07{ deg2rad(175.73),  deg2rad(45.78), deg2rad(-179.92) };
    Vec3f theta_08{ deg2rad(-165.34), deg2rad(47.37), deg2rad(-166.25) };
    Vec3f theta_09{ deg2rad(-165.62), deg2rad(17.95), deg2rad(-166.17) };
    Vec3f theta_10{ deg2rad(-151.99), deg2rad(-14.59),deg2rad(-94.19) };

    Mat robot_rot_01 = eulerAnglesToRotationMatrix(theta_01);
    Mat robot_rot_02 = eulerAnglesToRotationMatrix(theta_02);
    Mat robot_rot_03 = eulerAnglesToRotationMatrix(theta_03);
    Mat robot_rot_04 = eulerAnglesToRotationMatrix(theta_04);
    Mat robot_rot_05 = eulerAnglesToRotationMatrix(theta_05);
    Mat robot_rot_06 = eulerAnglesToRotationMatrix(theta_06);
    Mat robot_rot_07 = eulerAnglesToRotationMatrix(theta_07);
    Mat robot_rot_08 = eulerAnglesToRotationMatrix(theta_08);
    Mat robot_rot_09 = eulerAnglesToRotationMatrix(theta_09);
    Mat robot_rot_10 = eulerAnglesToRotationMatrix(theta_10);

    const Mat robot_tr_01 = (Mat_<float>(3, 1) << 781.2, 338.59, 903.48);
    const Mat robot_tr_02 = (Mat_<float>(3, 1) << 867.65, 382.52, 884.42);
    const Mat robot_tr_03 = (Mat_<float>(3, 1) << 856.91, 172.99, 964.61);
    const Mat robot_tr_04 = (Mat_<float>(3, 1) << 748.81, 146.75, 1043.29);
    const Mat robot_tr_05 = (Mat_<float>(3, 1) << 627.66, 554.08, 920.85);
    const Mat robot_tr_06 = (Mat_<float>(3, 1) << 715.06, 195.96, 889.38);
    const Mat robot_tr_07 = (Mat_<float>(3, 1) << 790.9, 196.29, 1117.38);
    const Mat robot_tr_08 = (Mat_<float>(3, 1) << 743.5, 283.93, 1131.92);
    const Mat robot_tr_09 = (Mat_<float>(3, 1) << 748.9, 288.19, 910.58);
    const Mat robot_tr_10 = (Mat_<float>(3, 1) << 813.18, 400.44, 917.16);

    R_gripper2base.push_back(robot_rot_01);
    R_gripper2base.push_back(robot_rot_02);
    R_gripper2base.push_back(robot_rot_03);
    R_gripper2base.push_back(robot_rot_04);
    R_gripper2base.push_back(robot_rot_05);
    R_gripper2base.push_back(robot_rot_06);
    R_gripper2base.push_back(robot_rot_07);
    R_gripper2base.push_back(robot_rot_08);
    R_gripper2base.push_back(robot_rot_09);
    R_gripper2base.push_back(robot_rot_10);

    t_gripper2base.push_back(robot_tr_01);
    t_gripper2base.push_back(robot_tr_02);
    t_gripper2base.push_back(robot_tr_03);
    t_gripper2base.push_back(robot_tr_04);
    t_gripper2base.push_back(robot_tr_05);
    t_gripper2base.push_back(robot_tr_06);
    t_gripper2base.push_back(robot_tr_07);
    t_gripper2base.push_back(robot_tr_08);
    t_gripper2base.push_back(robot_tr_09);
    t_gripper2base.push_back(robot_tr_10);
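
    // Suggested sanity check (an addition for illustration, not in the original code):
    // calibrateHandEye needs as many camera poses as robot poses, and entry i of
    // each list must come from the same station. If findChessboardCorners failed
    // on any image above, the lists are out of step by the time we get here.
    CV_Assert(R_target2cam.size() == R_gripper2base.size());
    CV_Assert(t_target2cam.size() == t_gripper2base.size());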

    calibrateHandEye(R_gripper2base, t_gripper2base, R_target2cam, t_target2cam, R_cam2gripper, t_cam2gripper, CALIB_HAND_EYE_TSAI);

    Vec3f R_cam2gripper_r = rotationMatrixToEulerAngles(R_cam2gripper);

    cout << "R_cam2gripper = " << endl << " " << R_cam2gripper << endl << endl;
    cout << "R_cam2gripper_r = " << endl << " " << R_cam2gripper_r << endl << endl;
    cout << "t_cam2gripper = " << endl << " " << t_cam2gripper << endl << endl;
}

Mat eulerAnglesToRotationMatrix(Vec3f &theta)
{
    // Calculate rotation about x axis
    Mat R_x = (Mat_<double>(3, 3) <<
        1, 0, 0,
        0, cos(theta[0]), -sin(theta[0]),
        0, sin(theta[0]), cos(theta[0])
        );

    // Calculate rotation about y axis
    Mat R_y = (Mat_<double>(3, 3) <<
        cos(theta[1]), 0, sin(theta[1]),
        0, 1, 0,
        -sin(theta[1]), 0, cos(theta[1])
        );

    // Calculate rotation about z axis
    Mat R_z = (Mat_<double>(3, 3) <<
        cos(theta[2]), -sin(theta[2]), 0,
        sin(theta[2]), cos(theta[2]), 0,
        0, 0, 1);


    // Combined rotation matrix
    Mat R = R_x * R_y * R_z;

    return R;

}

float rad2deg(float radian) {
    return radian * (180.0 / CV_PI); // CV_PI instead of a hand-typed, truncated constant
}

float deg2rad(float degree) {
    return degree * (CV_PI / 180.0);
}

// Checks if a matrix is a valid rotation matrix.
bool isRotationMatrix(Mat &R)
{
    Mat Rt;
    transpose(R, Rt);
    Mat shouldBeIdentity = Rt * R;
    Mat I = Mat::eye(3, 3, shouldBeIdentity.type());

    return  norm(I, shouldBeIdentity) < 1e-6;

}

// Calculates rotation matrix to euler angles
// The result is the same as MATLAB except the order
// of the euler angles ( x and z are swapped ).
Vec3f rotationMatrixToEulerAngles(Mat &R)
{

    assert(isRotationMatrix(R));

    float sy = sqrt(R.at<double>(0, 0) * R.at<double>(0, 0) + R.at<double>(1, 0) * R.at<double>(1, 0));

    bool singular = sy < 1e-6; // near zero means gimbal lock (singular orientation)

    float x, y, z;
    if (!singular)
    {
        x = atan2(R.at<double>(2, 1), R.at<double>(2, 2));
        y = atan2(-R.at<double>(2, 0), sy);
        z = atan2(R.at<double>(1, 0), R.at<double>(0, 0));
    }
    else
    {
        x = atan2(-R.at<double>(1, 2), R.at<double>(1, 1));
        y = atan2(-R.at<double>(2, 0), sy);
        z = 0;
    }
    return Vec3f(x, y, z);

}

The function gives me the following result:

 R_cam2gripper =
 [0.3099803593003124, -0.8923086952824562, -0.3281727733547833;
 0.7129271761196039, 0.4465219155360299, -0.5406967916458927;
 0.6290047840821058, -0.0663579028402444, 0.7745641421680119]

R_cam2gripper_r =
 [-0.0854626, -0.680272, 1.16065]

t_cam2gripper =
 [-35.02063730299775;
 -74.80633768251272;
 -307.6725851251873]

I am getting 'good' reference results from other software. With them, the robot reaches the exact points I pick in the camera (I have a 3D camera, from which I get x, y, z in the camera frame), so they are certainly correct; but I am having trouble reproducing the same result with the OpenCV function.
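
Independently of which result is right, the calibration can be scored numerically: with a correct X = T_cam2gripper, the chain T_gripper2base(i) * X * T_target2cam(i) must give (nearly) the same target-to-base transform for all 10 stations. A hedged sketch of that check, with a small helper of my own (not part of the sample):

    // Sketch: build 4x4 homogeneous transforms and verify that target->base is
    // constant across stations. A large spread indicates wrong inputs or a wrong
    // angle convention; the same loop can score the other software's values.
    Mat homogeneous(const Mat& R, const Mat& t)
    {
        Mat T = Mat::eye(4, 4, CV_64F), R64, t64;
        R.convertTo(R64, CV_64F);
        t.convertTo(t64, CV_64F);
        R64.copyTo(T(Range(0, 3), Range(0, 3)));
        t64.copyTo(T(Range(0, 3), Range(3, 4)));
        return T;
    }

    // inside main(), after calibrateHandEye:
    Mat X = homogeneous(R_cam2gripper, t_cam2gripper);
    for (size_t i = 0; i < R_gripper2base.size(); i++)
    {
        Mat T_t2b = homogeneous(R_gripper2base[i], t_gripper2base[i]) * X
                  * homogeneous(R_target2cam[i], t_target2cam[i]);
        cout << "station " << i << ": t_target2base =\n"
             << T_t2b(Range(0, 3), Range(3, 4)) << endl;
    }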

Sorry for the long introduction to my problem. Does anyone understand why the solution is not what it is supposed to be? My guess is that I misunderstand the angles or convert them incorrectly, but I could not find a way to solve this. Any hint is most welcome!