I am trying to use OpenCV with Python to calibrate a camera, and I am running some initial tests to confirm I am doing it right. I animated a moving chessboard in Blender 3D so that I could check the intrinsic matrix I compute against my known Blender camera setup. I loaded individual frames from the animation and used cv.FindChessboardCorners to find the corners. However, I am a little confused about how to check my answer. Using cv.CalibrateCamera2, I got this intrinsic matrix:
[[ 5.98555703e+04 0.00000000e+00 9.60743286e+02]
[ 0.00000000e+00 6.18404531e+04 5.36965637e+02]
[ 0.00000000e+00 0.00000000e+00 1.00000000e+00]]
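In case it matters, my calibration code is roughly the following (sketched here with the newer cv2-style calls for readability; my actual script uses the old cv module equivalents, and the board size and file pattern are just placeholders):

    import glob
    import numpy as np
    import cv2

    # Placeholder: number of inner corners on my board and where my frames live
    pattern_size = (9, 6)
    frames = sorted(glob.glob("frames/*.png"))

    # 3D chessboard corner positions in the board's own coordinate system
    objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)

    obj_points, img_points = [], []
    for fname in frames:
        img = cv2.imread(fname)
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern_size)
        if found:
            # Refine the detected corners to sub-pixel accuracy
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
            obj_points.append(objp)
            img_points.append(corners)

    # Returns RMS reprojection error, intrinsic matrix, distortion coefficients, and per-view poses
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    print(K)
    print(dist.ravel())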
The center coordinates (960, 536) make sense because the resolution of my picture is 1920x1080, but I do not know how to interpret the 5.98555703e+04 and 6.18404531e+04 values. In Blender, the camera's focal length is 35 mm. Can I use this information to check those two values? I know the entries at row 1, column 1 and row 2, column 2 depend on the focal length, but I did not understand any of the explanations I found on the web.
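From what I have pieced together so far, my guess is that fx = focal_length_mm * image_width_px / sensor_width_mm (and fy the same with the sensor height), where the sensor width is whatever the Blender camera uses. I am assuming the 32 mm default below, but I have not verified that. If that guess is right, the check would be something like:

    focal_mm = 35.0          # focal length set on the Blender camera
    sensor_width_mm = 32.0   # assumed Blender default sensor width; the real value is in the camera settings
    image_width_px = 1920

    fx_expected = focal_mm * image_width_px / sensor_width_mm
    print(fx_expected)       # 2100.0

That would put fx around 2100 pixels, which is nowhere near the 5.99e+04 I measured, so either my guess about the formula is wrong or my calibration is, and I cannot tell which.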
Also, the values for the distortion coefficients (k1,k2,p1,p2,k3) are:
[[ -1.36325574e+00]
[ 1.39697409e+00]
[ -4.34228778e-02]
[ 2.86565591e-02]
[ 5.75893035e-04]]
Are these values high or low? Blender's virtual camera should not have any distortion.
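To get a feel for whether these coefficients are significant, my plan is to undistort one of the frames with them and compare it to the original, roughly like this (the frame name is a placeholder, and K and dist are just the values above):

    import numpy as np
    import cv2

    K = np.array([[5.98555703e+04, 0.0, 9.60743286e+02],
                  [0.0, 6.18404531e+04, 5.36965637e+02],
                  [0.0, 0.0, 1.0]])
    dist = np.array([-1.36325574, 1.39697409,
                     -4.34228778e-02, 2.86565591e-02, 5.75893035e-04])

    img = cv2.imread("frame_0001.png")        # placeholder frame name
    undistorted = cv2.undistort(img, K, dist)

    # If the coefficients were effectively zero, the two images would be nearly identical
    diff = cv2.absdiff(img, undistorted)
    print(diff.max())

Is that a sensible way to judge whether the distortion the calibration found is meaningful?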
I really appreciate any help I can get! Thanks