
Multi-camera registration using images taken from a target

Hi everyone

I am trying to find the most accurate way to calculate the orientation of images taken from several cameras that encircle a common target. My idea is to produce a cylinder whose surface carries equally spaced dots or squares (some kind of simple pattern). Several cameras are then positioned around this target and an image is captured from each camera. From each image I need to determine the 3-dimensional orientation of the corresponding camera relative to the target coordinate system. The most important quantities for me are the azimuth and inclination, but other information such as the pitch and roll angles would also be useful. Does anyone have information that would help me with this? I would be especially interested in existing algorithms that do something like this; I can adapt them to my specific case, but a starting point would be great.