
Calibrating Blender Camera as test

asked 2013-01-21 13:45:20 -0500 by amartin7211, updated 2013-01-21 13:51:03 -0500

I am trying to use OpenCV with Python to calibrate a camera, and I am running initial tests to confirm I am doing it right. I animated a chessboard moving in Blender 3D so that I could check the intrinsic matrix against my known Blender setup. I loaded individual frames from the animation and used cv.FindChessboardCorners to find the corners. However, I am a little confused about how to check my answer. Using cv.CalibrateCamera2, I got this intrinsic matrix:

[[  5.98555703e+04   0.00000000e+00   9.60743286e+02]
 [  0.00000000e+00   6.18404531e+04   5.36965637e+02]
 [  0.00000000e+00   0.00000000e+00   1.00000000e+00]]

The center coordinates (960, 536) make sense because the resolution of my picture is (1920, 1080), but I do not know how to interpret the 5.98555703e+04 and 6.18404531e+04 values. In Blender, the camera's focal length is 35mm. Can I use this information to check these two values? I know the values at row 1, col 1 and row 2, col 2 depend on the focal length, but I did not understand any of the explanations I found on the web.

Also, the values for the distortion coefficients (k1,k2,p1,p2,k3) are:

[[ -1.36325574e+00]
 [  1.39697409e+00]
 [ -4.34228778e-02]
 [  2.86565591e-02]
 [  5.75893035e-04]]

Is this high or low? Blender's camera should not have a lot of distortion.

I really appreciate any help I can get! Thanks



Did you resolve the issue? I am working on the same problem, and as far as I can see, these four parameters of the Blender camera may be a solution: matrix_basis, matrix_local, matrix_parent_inverse, matrix_world. Did you try them?

inferrna ( 2015-02-11 00:43:47 -0500 )

1 answer


answered 2015-05-09 18:34:25 -0500 by reliasolve

The field of view of the camera in Blender is determined by two parameters: the focal length (which defaults to 35mm) and the sensor size (which is set on the camera control panel, and defaults to 32 in my version). The sensor size is the horizontal size of the image sensor in mm, according to the pop-up info box. The model in OpenCV only has the focal length as a parameter, so the question becomes how to account for Blender's sensor size when computing the value OpenCV expects.

When I produce a video of objects in Blender and then track them in OpenCV, the fit is much better for an OpenCV focal-length setting of between 350 and 3500 than for a setting of 35.

The basic issue is that the camera matrix in OpenCV stores the focal length in pixel units, not millimeters, so the focal length must be converted from millimeters to pixels. For the default image sensor width of 32mm and an image resolution of 640x480, we compute the OpenCV focal length as follows: 35mm * (640 pixels / 32 mm) = 700 pixels. The resulting camera matrix would be:

[[ 700    0  320]
 [   0  700  240]
 [   0    0    1]]
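The conversion above can be sketched as a small helper (the function name is my own; the 35mm lens, 32mm sensor width, and 640x480 resolution are the defaults quoted in the answer):

```python
def focal_length_px(focal_mm, sensor_width_mm, image_width_px):
    """Convert a physical focal length to the pixel units OpenCV uses for fx."""
    return focal_mm * image_width_px / sensor_width_mm

# Defaults from the answer: 35mm lens, 32mm sensor width, 640x480 image
fx = focal_length_px(35.0, 32.0, 640)  # 700.0 pixels

camera_matrix = [
    [fx,  0.0, 320.0],  # cx = image width / 2
    [0.0, fx,  240.0],  # cy = image height / 2
    [0.0, 0.0,   1.0],
]
```

For the questioner's 1920-pixel-wide render, the same formula gives 35 * 1920 / 32 = 2100 pixels, which is a useful sanity check against the fx and fy values cv.CalibrateCamera2 returns.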



