
Wrong rotation matrix when using recoverPose between two very similar images

I'm trying to perform visual odometry with a camera mounted on top of a car. Basically I use FAST or goodFeaturesToTrack (I don't know yet which one is more convenient) and then I track those points with calcOpticalFlowPyrLK. Once I have both the previous and current points I call findEssentialMat and then recoverPose to obtain the rotation and translation. A rough sketch of what I'm doing is below.
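This is just a minimal sketch of the per-frame step I described; the intrinsics in K and the detector parameters are placeholder values, not my real calibration:

    import cv2
    import numpy as np

    # Placeholder intrinsics -- my real values come from calibration.
    K = np.array([[700.0,   0.0, 640.0],
                  [  0.0, 700.0, 360.0],
                  [  0.0,   0.0,   1.0]])

    def pose_between(prev_gray, curr_gray, K):
        # Detect corners in the previous frame (could also be FAST keypoints).
        prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=2000,
                                           qualityLevel=0.01, minDistance=7)
        # Track them into the current frame with pyramidal Lucas-Kanade.
        curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                          prev_pts, None)
        ok = status.ravel() == 1
        p0 = prev_pts.reshape(-1, 2)[ok]   # points in the previous frame
        p1 = curr_pts.reshape(-1, 2)[ok]   # same points in the current frame

        # Essential matrix with RANSAC, then recover R and t from it.
        E, mask = cv2.findEssentialMat(p0, p1, K, method=cv2.RANSAC,
                                       prob=0.999, threshold=1.0)
        _, R, t, mask = cv2.recoverPose(E, p0, p1, K, mask=mask)
        return R, t, p0, p1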

My program works quite well. It makes some errors on frames with strong sun/shadow on the sides, but the huge problem is WHEN THE CAR STOPS. When the car stops, or its speed is very low, consecutive frames look almost identical (or exactly the same) and the rotation matrix goes crazy (I guess the essential matrix does too, since with almost no motion between the frames there is hardly any parallax). I'm wondering if I need a check like the one sketched below.
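To make "very similar" concrete, one check I've been considering (the 1-pixel threshold is just a guess, not something I have tuned) is to measure the median displacement of the tracked points and skip the pose update when there is basically no motion:

    import numpy as np

    MIN_MEDIAN_FLOW_PX = 1.0  # assumed threshold, would need tuning

    def enough_parallax(p0, p1, min_flow=MIN_MEDIAN_FLOW_PX):
        # Median pixel displacement of the features tracked from the
        # previous frame (p0) to the current frame (p1), both Nx2 arrays.
        flow = np.linalg.norm(p1 - p0, axis=1)
        return np.median(flow) > min_flow

The idea would be to call this with the p0/p1 arrays from the tracking step and simply keep the last pose when it returns False, but I don't know if that is the right way to handle it.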

Does anyone know if this is a common problem? Any ideas on how to fix it?

I don't know what information you need to answer this, but it seems like a conceptual mistake on my side. I have achieved an accuracy of about 1° and 10 metres after a 3 km ride, but any time I stop... goodbye!

Thank you so much in advance