Fast outlier detection

asked 2015-01-28 19:55:10 -0500

Mehdi

updated 2015-01-28 20:06:26 -0500

I have a situation where I am trying to recognize a scene my robot has already mapped. To do that, I match keypoints from the robot's current view against all images corresponding to the keyframes of my map. Just by looking at the matching results I can tell, as a human, whether it is a good match or a false positive, as in the images below where I superpose both images. I use BRISK descriptors, the goodFeaturesToTrack detector, and binary matching.

[image: superposed views with crossing match lines]

Here the two scenes are totally different but contain similar lamps; I can see that the matching lines cross each other, so it is a wrong match.

[image: superposed views with nicely aligned match lines]

Here, however, the matches are nicely aligned and I can tell this is the right scene being matched. Normally, even with the right scene, I still have some outliers; the results in the images shown here are after using cv2.findHomography and masking out as many outliers as possible. What I have tried until now is to count the remaining matches after outlier removal, and the image with the maximum number of filtered matches wins. This is, however, not very stable, with many false positives, and it is too slow even for offline processing.

My questions are: Is there a method to remove outliers without having to estimate the homography? How can I write a fast algorithm to detect whether the matches "flow" in one direction or whether they are degenerate? By flow I mean that they represent either a translation or an in-plane rotation, as my robot has a fixed camera looking at the ceiling and moves on wheels.
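One cheap heuristic for the translation case (a sketch under the assumption that a correct match set produces displacement vectors pointing in roughly one direction, while a false positive produces crossing, randomly oriented lines) is to measure the circular spread of the displacement angles:

```python
import numpy as np

def flow_consistency(src_pts, dst_pts):
    """Score how consistently matches 'flow' in one direction.

    src_pts, dst_pts: (N, 2) arrays of matched pixel coordinates.
    Returns the mean resultant length of the displacement angles:
    close to 1.0 for a clean translation, close to 0.0 for random
    (crossing) matches.
    """
    d = np.asarray(dst_pts, dtype=float) - np.asarray(src_pts, dtype=float)
    ang = np.arctan2(d[:, 1], d[:, 0])
    # circular mean resultant length: |mean of unit vectors at each angle|
    return float(np.hypot(np.cos(ang).mean(), np.sin(ang).mean()))
```

This is O(N) per keyframe with no iterative estimation, so it could serve as a fast pre-filter before running the full homography check; it does not, on its own, handle the in-plane rotation case.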



Maybe you could try to use the line angles between the pairs of matches. For example, you may have 2 big clusters at angles of 15° and 35°, and you could reject outliers by comparing against those angle values?

Eduardo ( 2015-01-29 05:09:22 -0500 )
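The angle-clustering idea in the comment above could be sketched like this (an illustrative example, not from the thread; the bin width is an arbitrary assumption):

```python
import numpy as np

def largest_angle_cluster(src_pts, dst_pts, bin_deg=10.0):
    """Keep only matches whose line angle falls in the dominant cluster.

    Bins the angle of each match line into a histogram and returns a
    boolean mask selecting the matches in the most populated bin.
    """
    d = np.asarray(dst_pts, dtype=float) - np.asarray(src_pts, dtype=float)
    ang = np.degrees(np.arctan2(d[:, 1], d[:, 0]))  # in (-180, 180]
    bins = np.floor((ang + 180.0) / bin_deg).astype(int)
    counts = np.bincount(bins)
    return bins == counts.argmax()
```

A refinement would be to also keep the neighboring bins, or to use a proper circular clustering method, since a true cluster can straddle a bin boundary.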