Wrong rotation matrix when using recoverPose between two very similar images

asked 2017-12-12 05:29:00 -0600

I'm trying to perform visual odometry with a camera on top of a car. Basically I use FAST or goodFeaturesToTrack (I don't know yet which one is more convenient) to detect features, and then I track those points with calcOpticalFlowPyrLK. Once I have both the previous and the current points I call findEssentialMat and then recoverPose to obtain the rotation matrix and translation vector.
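Roughly, the pipeline looks like this (a minimal sketch only; the intrinsics K and the detector/RANSAC parameters are placeholder values, not my real calibration):

```python
import numpy as np
import cv2

# Placeholder intrinsics -- replace with your own calibrated camera matrix.
K = np.array([[700.0, 0.0, 640.0],
              [0.0, 700.0, 360.0],
              [0.0, 0.0, 1.0]])

def relative_pose(prev_gray, curr_gray):
    # Detect corners in the previous frame (goodFeaturesToTrack; FAST would also work).
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=2000,
                                       qualityLevel=0.01, minDistance=7)
    # Track them into the current frame with pyramidal Lucas-Kanade.
    curr_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                      prev_pts, None)
    ok = status.ravel() == 1
    prev_pts, curr_pts = prev_pts[ok], curr_pts[ok]

    # Essential matrix with RANSAC, then recoverPose selects the physically valid (R, t).
    E, mask = cv2.findEssentialMat(prev_pts, curr_pts, K,
                                   method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, prev_pts, curr_pts, K, mask=mask)
    return R, t  # t is only known up to scale
```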

My program works quite well. It makes some errors when there is strong sun/shadow contrast at the sides of the image, but the huge problem is WHEN THE CAR STOPS. When the car stops, or its speed is very low, consecutive frames look very similar (or nearly identical) and the rotation matrix goes crazy (I guess the essential matrix does too).

Does anyone know if this is a common error? Any ideas on how to fix it?

I don't know what information you need to answer this, but it seems to be a conceptual mistake on my part. I have achieved an accuracy of 1° and 10 metres after a 3 km ride, but any time I stop.....goodbye!

Thank you so much in advance


Comments

Maybe check whether there is enough displacement between the two frames using the average optical flow (a minimal sketch of that check is below)? Maybe there is something in the literature to check whether the essential matrix estimate is degenerate? Another option is to predict / filter the updated pose, with something similar to a Kalman filter.

Eduardo ( 2017-12-13 07:06:45 -0600 )
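A minimal sketch of the displacement check suggested in the comment above (Python; the threshold value is an assumption to tune for your camera and frame rate):

```python
import numpy as np

MIN_MEDIAN_FLOW_PX = 1.5  # assumed threshold -- tune for your camera and frame rate

def enough_motion(prev_pts, curr_pts, min_flow=MIN_MEDIAN_FLOW_PX):
    # Median pixel displacement of the tracked features between the two frames.
    flow = np.linalg.norm(curr_pts.reshape(-1, 2) - prev_pts.reshape(-1, 2), axis=1)
    return np.median(flow) > min_flow

# Usage: only run findEssentialMat / recoverPose when there is enough baseline;
# otherwise keep the previous pose (the car is effectively stationary).
```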