OpenCV : Running FLANN on a cluster

asked 2016-10-24 01:25:29 -0500

ashikaumanga

I am trying to scale the "matching_to_many_images.cpp" sample to a large set of images (20K+ images).

I use a FLANN-based matcher to match the images (with SURF keypoint detection and descriptor extraction). I am trying to follow the method described in this paper (section "Searching on a Compute Cluster").

I have a training image set C with a total of n images, C = {B(1)...B(n)}. I divide C into N "buckets", each containing n/N images. For each bucket I run "detectKeyPoints", "computeDescriptors" and "trainMatcher" separately. This means I have a separate "DescriptorMatcher" for each image bucket: N DescriptorMatchers in total.

Then, for the query image, I run "detectKeyPoints" and "computeDescriptors", and perform "match" against each of the N DescriptorMatchers. Finally, I get a list of DMatch objects from each "match" call.

My question is:

1) Am I following the correct steps according to the paper? I am trying to understand how the "reduce" step described in the paper is done.

2) There are two factors I can extract from the DMatch objects: the match "distance" and the total number of matches per training image. How can I use these two factors to find the closest matching image?



you want a flann::Index here, not the matcher.

the "matching_to_many" example is misleading here, as it assumes that all images show the same scene (just from different points of view). you simply cannot use the same approach for classifying different objects.

berak ( 2016-10-24 01:59:43 -0500 )

@berak, thank you. but doesn't FlannBasedMatcher use flann::Index under the hood?

ashikaumanga ( 2016-10-24 02:23:17 -0500 )

sort of, yes. but the way you use it differs significantly.

berak ( 2016-10-24 02:28:39 -0500 )