
dahonora's profile - activity

2019-08-09 17:57:10 -0600 received badge  Self-Learner (source)
2019-08-09 12:42:12 -0600 marked best answer calibrateHandEye precision

Hi there!

I am using the calibrateHandEye function from the calib3d module to obtain the hand-eye calibration of a UR10 robot with a RealSense camera fixed to the end effector. After some initial problems I think I finally got the function working properly. But now I am seeing two things that do not quite fit, and I was hoping for a third opinion on them.

As you can see in the image below, I made four measurements for each of the five methods the function offers. For each method I used 10, 15, 18 and 20 robot poses with their respective pictures. From here I got a little confused:

  1. The first method (CALIB_HAND_EYE_TSAI) gives completely different values depending on the number of robot poses used. With the same input, the other four methods seem to converge to values reasonably close to each other. Why is that?
  2. Although the values of the last four methods seem to converge, there are still significant differences between them. Why is that?

Thank you for the help!

The camera intrinsic parameters are:

std::vector<double> distortionCoefficients(5);  // camera distortion (k1, k2, p1, p2, k3)
distortionCoefficients[0] = 9.6349551984637724e-02;
distortionCoefficients[1] = -3.3260675111130217e-01;
distortionCoefficients[2] = 0;
distortionCoefficients[3] = 0;
distortionCoefficients[4] = 2.5833277679122602e-01;

double f_x = 1.2993539019658076e+03; // Focal length in x axis
double f_y = 1.2993539019658076e+03; // Focal length in y axis (usually the same?)
double c_x = 960; // Camera principal point x
double c_y = 540; // Camera principal point y

// Build the matrix as CV_64F so the double values above are stored without narrowing
cv::Mat cameraMatrix = (cv::Mat_<double>(3, 3) <<
    f_x, 0.0, c_x,
    0.0, f_y, c_y,
    0.0, 0.0, 1.0);

Link to get the images and poses.

[image: table of calibrateHandEye results for the five methods]

2019-08-09 12:42:12 -0600 received badge  Scholar (source)
2019-08-09 12:42:11 -0600 edited answer calibrateHandEye precision

The problem was with the calibration of the camera itself. I was assuming, in the OpenCV calibration, that the aspect r

2019-08-09 12:41:04 -0600 answered a question calibrateHandEye precision

The problem was with the calibration of the camera itself. I was assuming, in the OpenCV calibration, that the aspect r

2019-08-07 19:29:07 -0600 received badge  Student (source)
2019-08-07 13:59:49 -0600 commented question calibrateHandEye precision

I am using a force sensor, there is no problem to hit the object. I am more concerned about the precision in the x, y co

2019-08-07 13:34:35 -0600 marked best answer Wrong result in calibrateHandEye function [answered]

I am working on a robot application in which I have a camera fixed to a robot gripper. In order to calculate the transformation matrix between the camera and the gripper, Hcg, I am using the new calibrateHandEye function provided in OpenCV 4.1.

In the next picture, you can see the description of the function from the OpenCV documentation

[image: calibrateHandEye description from the OpenCV documentation]

To use this function I am editing the camera_calibration example of samples\cpp\tutorial_code\calib3d\

I took 10 pictures of the chessboard with the camera mounted on the gripper and, at the same time, recorded the robot pose.

I changed the code; now it looks like:

// My_handeye.cpp : This file contains the 'main' function. Program execution begins and ends there.
//
#include "pch.h"  // must be the first include when MSVC precompiled headers are enabled

#include <iostream>
#include <sstream>
#include <string>
#include <ctime>
#include <cstdio>

#include <opencv2/opencv.hpp>
#include <opencv2/core.hpp>
#include <opencv2/core/utility.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/calib3d.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/videoio.hpp>
#include <opencv2/highgui.hpp>

using namespace cv;
using namespace std;

Mat eulerAnglesToRotationMatrix(Vec3f &theta);
Vec3f rotationMatrixToEulerAngles(Mat &R);
float rad2deg(float radian);
float deg2rad(float degree);

int main()
{

    // Camera calibration information

    std::vector<double> distortionCoefficients(5);  // camera distortion (k1, k2, p1, p2, k3)
    distortionCoefficients[0] = 2.4472856611074989e-01;
    distortionCoefficients[1] = -8.1042032574246325e-01;
    distortionCoefficients[2] = 0;
    distortionCoefficients[3] = 0;
    distortionCoefficients[4] = 7.8769462320821060e-01;

    double f_x = 1.3624172121852105e+03; // Focal length in x axis
    double f_y = 1.3624172121852105e+03; // Focal length in y axis (usually the same?)
    double c_x = 960; // Camera principal point x
    double c_y = 540; // Camera principal point y

    // Build the matrix as CV_64F so the double values above are stored without narrowing
    cv::Mat cameraMatrix = (cv::Mat_<double>(3, 3) <<
        f_x, 0.0, c_x,
        0.0, f_y, c_y,
        0.0, 0.0, 1.0);

    Mat rvec(3, 1, CV_32F), tvec(3, 1, CV_32F);
    //

    std::vector<Mat> R_gripper2base;
    std::vector<Mat> t_gripper2base;
    std::vector<Mat> R_target2cam;
    std::vector<Mat> t_target2cam;
    Mat R_cam2gripper = (Mat_<float>(3, 3));
    Mat t_cam2gripper = (Mat_<float>(3, 1));

    vector<String> fn;
    glob("images/*.bmp", fn, false);

    vector<Mat> images;
    size_t num_images = fn.size(); //number of bmp files in images folder
    Size patternsize(6, 8); //number of interior chessboard corners per column and row
    vector<Point2f> centers; //this will be filled by the detected corners
    float cell_size = 30; //chessboard square size
    vector<Point3f> obj_points;

    R_gripper2base.reserve(num_images);
    t_gripper2base.reserve(num_images);
    R_target2cam.reserve(num_images);
    t_target2cam.reserve(num_images);

    for (int i = 0; i < patternsize.height; ++i)
        for (int j = 0; j < patternsize.width; ++j)
            obj_points.push_back(Point3f(float(j*cell_size),
                float(i*cell_size), 0.f));

    for (size_t i = 0; i < num_images; i++)
        images.push_back(imread(fn[i]));

    Mat frame;

    for (size_t i = 0; i < num_images; i++)
    {
        frame = imread(fn[i]); //source image
        bool patternfound = findChessboardCorners(frame, patternsize, centers);
        if (patternfound)
        {
            drawChessboardCorners(frame, patternsize, Mat(centers), patternfound);
            //imshow("window", frame);
            //int key = cv::waitKey(0) & 0xff;
            solvePnP(Mat(obj_points), Mat(centers), cameraMatrix, distortionCoefficients, rvec, tvec);

            Mat R;
            Rodrigues(rvec, R); // R is 3x3
            R_target2cam.push_back ...
(more)
2019-08-07 12:37:28 -0600 commented question calibrateHandEye precision

What's the difference in the chessboard you are suggesting? I don't see any difference, besides the size of the squares


2019-08-07 12:25:33 -0600 commented question calibrateHandEye precision

Thank you so much for the feedback, Eduardo. Also for me, the results of the Tsai method were weird. It seems like a bug

2019-08-07 11:23:24 -0600 commented question calibrateHandEye precision

Sure, it's done! I got the calibration with the same images of the link above.


2019-08-07 11:22:37 -0600 edited question calibrateHandEye precision

calibrateHandEye precision Hi there! I am using the calibrateHandEye functions from the Calib3d library to get the hand

2019-08-07 08:15:52 -0600 edited question calibrateHandEye precision

calibrateHandEye precision Hi there! I am using the calibrateHandEye functions from the Calib3d library to get the hand

2019-08-07 08:15:32 -0600 commented question calibrateHandEye precision

Hi Eduardo! Good to see you again helping with this. I added to the original question a link to download the images and


2019-08-07 08:08:05 -0600 edited question calibrateHandEye precision

calibrateHandEye precision Hi there! I am using the calibrateHandEye functions from the Calib3d library to get the hand

2019-08-06 16:57:57 -0600 commented question calibrateHandEye precision

In the picture assume the ',' are '.' That's the Spanish convention, sorry for that. Values are in mm and degrees


2019-08-06 16:08:12 -0600 answered a question Wrong result in calibrateHandEye function [answered]

I actually managed to solve this problem. The general idea was correct, but: I was not understanding correctly the vec

2019-08-06 16:07:31 -0600 asked a question calibrateHandEye precision

calibrateHandEye precision Hi there! I am using the calibrateHandEye functions from the Calib3d library to get the hand

2019-08-01 10:34:01 -0600 received badge  Enthusiast
2019-07-12 09:17:27 -0600 commented question Wrong result in calibrateHandEye function [answered]

No, the input values are different. MVTec uses a different calibration board, so the images are different and the poses


2019-07-12 09:08:31 -0600 commented question Wrong result in calibrateHandEye function [answered]

I rechecked the information of the robot, it seems that the rotation matrix is calculated in the way ZYX: The RPY vecto


2019-07-12 08:41:14 -0600 commented question Wrong result in calibrateHandEye function [answered]

Yeah, I checked the frames many times now the best I could...

2019-07-12 08:39:38 -0600 commented question Wrong result in calibrateHandEye function [answered]

Hi Eduardo. Thanks for the interest. The other software is Halcon from MVTec. I got a trial license but is almost finis

2019-07-12 08:32:38 -0600 edited question Wrong result in calibrateHandEye function [answered]

Wrong result in calibrateHandEye function I am working in a robot application, in which I have a camera fixed to a robot

2019-07-12 08:29:15 -0600 edited question Wrong result in calibrateHandEye function [answered]

Wrong result in calibrateHandEye function I am working in a robot application, in which I have a camera fixed to a robot

2019-07-12 08:03:51 -0600 edited question Wrong result in calibrateHandEye function [answered]

Wrong result in calibrateHandEye function I am working in a robot application, in which I have a camera fixed to a robot

2019-07-12 08:02:38 -0600 commented question Wrong result in calibrateHandEye function [answered]

The robot doesn't give the quaternion or rotation matrix. I can only extract the information about the orientation as RP


2019-07-11 15:11:32 -0600 received badge  Editor (source)
2019-07-11 15:11:32 -0600 edited question Wrong result in calibrateHandEye function [answered]

Wrong result in calibrateHandEye function I am working in a robot application, in which I have a camera fixed to a robot

2019-07-10 14:34:03 -0600 asked a question Wrong result in calibrateHandEye function [answered]

Wrong result in calibrateHandEye function I am working in a robot application, in which I have a camera fixed to a robot