The field of view of the camera in Blender is determined by two parameters: the focal length (which defaults to 35mm) and the sensor size (which is set on the camera control panel, and defaults to 32 in my version). According to the pop-up info box, this is the horizontal width of the image sensor in mm. The camera model in OpenCV has only the focal length as a parameter, so the question becomes how to set the sensor size in Blender to match what OpenCV is expecting.

When I produce a video of objects in Blender and then track them in OpenCV, the fit is much better for an OpenCV focal-length setting between 350 and 3500 than for a setting of 35.

The basic issue is that the camera matrix in OpenCV stores the focal length in pixel units, not millimeters (see http://docs.opencv.org/doc/tutorials/calib3d/camera_calibration/camera_calibration.html), so we need to convert from millimeters to pixels. For the default focal length of 35mm, the default sensor width of 32mm, and an image resolution of 640x480, the OpenCV focal length is 35 mm * (640 pixels / 32 mm) = 700 pixels. With the principal point at the image center (320, 240), the resulting camera matrix would be:

700   0 320
  0 700 240
  0   0   1
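The conversion above can be sketched in Python with NumPy (the helper name is mine; the numbers are Blender's defaults as described above):

```python
import numpy as np

def blender_to_opencv_focal(focal_mm, sensor_width_mm, image_width_px):
    """Convert a Blender focal length (mm) to OpenCV pixel units.

    OpenCV's camera matrix expects the focal length in pixels, so we
    scale by the pixel density of the sensor (pixels per mm).
    """
    return focal_mm * image_width_px / sensor_width_mm

# Blender defaults: 35 mm focal length, 32 mm sensor width, 640x480 render.
width, height = 640, 480
f_px = blender_to_opencv_focal(35.0, 32.0, width)  # 700.0 pixels

# Camera matrix with the principal point at the image center.
K = np.array([[f_px, 0.0,  width / 2],
              [0.0,  f_px, height / 2],
              [0.0,  0.0,  1.0]])
print(K)
```

This matrix can then be passed wherever OpenCV expects a `cameraMatrix` (e.g. `cv2.solvePnP` or `cv2.projectPoints`).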