Multi-camera calibration for tracking objects

Hello everyone,

I have some questions for more experienced OpenCV users about building a multi-camera tracking program. To quickly describe the problem: I want to track multiple objects with multiple cameras. The result I want to achieve is more or less something like this: https://www.youtube.com/watch?v=7Dy9co0mWY0

Eventually I came to the conclusion that I want to use a Kalman filter for tracking. The issues I want to ask about are:

  1. Is there a way to calibrate multiple cameras based on a dataset of videos like those in the link above? Can it be done automatically somehow? I know you can calibrate a camera using a chessboard (http://docs.opencv.org/3.3.0/dc/d43/tutorial_camera_calibration_square_chess.html), but that doesn't apply here, since there is no chessboard in the videos. There is also this: http://docs.opencv.org/master/d2/d1c/tutorial_multi_camera_main.html, but I guess it has the very same disadvantage.
  2. What would be the most efficient way to approach tracking? Should I run a Kalman filter on each view and try to merge the individual results, or should I first reconstruct the objects in 3D and then apply the filter?
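To make option 2 (a per-view filter) concrete, here is a minimal sketch of a 2D constant-velocity Kalman filter in plain NumPy. It has the same structure you would configure in OpenCV's `cv2.KalmanFilter(4, 2)` (state `[x, y, vx, vy]`, measurement `[x, y]`); all the noise covariances below are illustrative guesses, not tuned values.

```python
import numpy as np

dt = 1.0  # time step between frames (assumed)

# Constant-velocity model: state = [x, y, vx, vy], measurement = [x, y]
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)   # state transition matrix
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)    # measurement matrix
Q = np.eye(4) * 1e-2                         # process noise (assumed)
R = np.eye(2) * 1e-1                         # measurement noise (assumed)

x = np.zeros(4)   # state estimate
P = np.eye(4)     # estimate covariance

def predict():
    """Propagate the state one frame ahead; returns predicted position."""
    global x, P
    x = F @ x
    P = F @ P @ F.T + Q
    return x[:2]

def correct(z):
    """Fuse a measured position z = [x, y]; returns corrected position."""
    global x, P
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ (np.asarray(z, float) - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x[:2]

# Feed measurements of an object moving +1 px/frame along x:
for t in range(20):
    predict()
    est = correct([float(t), 0.0])
```

One such filter per tracked object per view would give you the per-view tracks you could then try to merge across cameras; fusing in 3D instead would mean measuring triangulated positions with a 6-state filter, but the update equations stay the same.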

Any suggestions are welcome. Thanks.