Why do the results calculated by calibrateCamera seem wrong?

asked 2018-01-14 20:56:00 -0600

I calibrated an eye-in-hand system with both OpenCV and MATLAB, and the two results turned out to be totally different. I checked the matrices computed by MATLAB using m[u v 1]' = A[R|t][x y z 1]' and they made sense, but I couldn't figure out what the results computed by OpenCV mean. Specifically, I want the rotation and translation from the camera coordinate system to the world coordinate system, which I assumed were the rvecs and tvecs returned by calibrateCamera, but I couldn't reproject points from world coordinates to image coordinates using the returned intrinsic and extrinsic matrices. How can I get the right results?
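For context, in OpenCV each entry of rvecs is a Rodrigues rotation *vector* (not a 3x3 matrix), and each (rvec, tvec) pair maps object (world) points *into* the camera frame. Below is a minimal NumPy sketch of the reprojection m ~ A[R|t][x y z 1]' under that convention; the intrinsic matrix and pose values are illustrative, not from a real calibration (in practice cv2.Rodrigues and cv2.projectPoints do this for you):

```python
import numpy as np

def rodrigues(rvec):
    # Convert a Rodrigues rotation vector (OpenCV's rvec format)
    # into a 3x3 rotation matrix.
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = (np.asarray(rvec, dtype=float) / theta).reshape(3)
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def reproject(object_points, rvec, tvec, A):
    # Apply m ~ A [R|t] [x y z 1]^T : world -> camera -> pixel coordinates
    # (lens distortion ignored for brevity).
    R = rodrigues(rvec)
    Xc = object_points @ R.T + np.asarray(tvec, dtype=float).reshape(1, 3)
    m = Xc @ A.T                  # project with the intrinsic matrix
    return m[:, :2] / m[:, 2:3]   # perspective divide

# Illustrative values (not from a real calibration):
A = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
rvec = np.zeros(3)                       # identity rotation
tvec = np.array([0.0, 0.0, 1000.0])      # camera 1000 units in front of the board
pts = np.array([[0.0, 0.0, 0.0],
                [100.0, 0.0, 0.0]])
print(reproject(pts, rvec, tvec, A))     # [[320. 240.] [400. 240.]]
```

If the reprojected pixels do not land near the detected corners, the usual culprit is treating (rvec, tvec) as the camera-to-world transform; the world-to-camera direction shown here is what calibrateCamera returns.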
