# How to detect camera rotation

Hello,

I am working on a project involving a zeppelin that has to be able to navigate using a camera. The camera has to scan QR codes containing instructions (e.g. move x meters, rotate x degrees).

The TAs advised us to simply determine experimentally how much engine power we need to move a certain distance, but since we have a downward-pointing camera anyway, I figured we could use image analysis to make our movements more exact.

I am looking for a way to estimate how much the camera has rotated between two pictures. I have looked through the documentation, but since I have no real experience in image processing, it's quite overwhelming.

Preferably the method would also be able to account for small amounts of translation and height change of the camera.

Of course I'm not expecting anybody to give me a complete, ready-to-use method, but some pointers on what to look at first would be nice :)


You only need the angle of rotation between images?

( 2013-11-26 05:10:44 -0600 )


You've asked a fairly complex question, so instead of trying to give you the complete answer, I'll point you in the right direction (based on my limited perspective on the problem you are trying to solve).

Using Camshift, you can find an object's center, size, and orientation: http://docs.opencv.org/modules/video/doc/motion_analysis_and_object_tracking.html?highlight=camshift (you'll also find other operations there that will be of benefit)

1. Find a reference object. This assumes you can detect an object with OpenCV (or will investigate how to do it).
2. Once you rotate the camera (around one axis, I'm assuming), find the same object and run Camshift on it again. The orientation should give you the value you can use.

I don't know if you are trying to solve a more difficult problem that would require Jacobian matrices, though probably not if you take the approach of re-finding a reference object (which could be the same one in some circumstances).


You can check the paper "Using vanishing points to correct camera rotation in images". Be aware that estimating the rotation between two images this way is time-consuming, and the result is not accurate. Maybe you should consider using another device.


I don't think there is an image processing method to determine camera rotation, or at least not a reliable one. Will Stwart's answer assumes you'll always have detectable objects in your image and that you can always compare them between frames to determine the relative camera rotation. It also assumes that the object is not moving; otherwise, the rotation change you'd measure would be the object's, not the camera's.

It wouldn't even make much sense to use images to determine rotation, since that is a task that can be handled by simply adding a ~$100 sensor to your setup. IMU sensors can determine rotation very robustly, with far less error than any image processing method. They are even used for autonomous navigation, which seems to be exactly your goal. There are many low-cost solutions that are easy to integrate into a custom project.


A combination of image and inertial measurements would provide a means to confirm and refine the answers you would get from either alone (e.g., a 3D truth model). If all the objects in the field of view are moving (or rotating) on their own, then an inertial method alone would seem appropriate.

( 2013-11-29 14:10:54 -0600 )
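The fusion idea in the comment above is often implemented as a simple complementary filter: trust the gyro over short intervals and let the occasional vision-derived angle correct the long-term drift. A minimal sketch (the function name and parameters are hypothetical, not from any library):

```python
def fuse_heading(angle_prev, gyro_rate, dt, vision_angle=None, alpha=0.98):
    """Complementary filter for heading, in degrees.
    `gyro_rate` is the measured angular rate (deg/s); `vision_angle` is an
    occasional absolute estimate from the camera (None when unavailable)."""
    predicted = angle_prev + gyro_rate * dt  # integrate the gyro
    if vision_angle is None:
        return predicted                     # coast on the gyro alone
    # High alpha trusts the gyro short-term; the vision term fixes drift.
    return alpha * predicted + (1.0 - alpha) * vision_angle
```

With `alpha = 0.98`, a single vision update only nudges the estimate by 2%, so a noisy or occasionally wrong image-based angle cannot derail the filter, while a persistent gyro bias still gets corrected over many frames.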


## Stats

Asked: 2013-11-25 11:43:24 -0600

Seen: 6,902 times

Last updated: Apr 23 '14