OpenCV : Running FLANN on a cluster
I am trying to scale "matching_to_many_images.cpp" to a large image set (20K+ images): https://github.com/kipr/opencv/blob/m...
I use a FLANN-based matcher to match the images (with SURF keypoint and descriptor extractors). I am trying to follow the method described in this paper (section "Searching on a Compute Cluster"): http://www.cs.ubc.ca/research/flann/u...
I have a training image set C with n images in total, C = {B(1)...B(n)}. I divide C into N "buckets", each containing n/N images. For each bucket I perform "detectKeyPoints", "computeDescriptors" and "trainMatcher" separately, so I end up with a separate "DescriptorMatcher" per bucket: N DescriptorMatchers in total.
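To make the setup concrete, here is a minimal sketch of my training step, assuming the OpenCV 2.4-style API used by the sample (the bucket container and function names are mine, not from the sample):

```cpp
#include <opencv2/core/core.hpp>
#include <opencv2/features2d/features2d.hpp>
#include <opencv2/nonfree/features2d.hpp>   // SURF lives in nonfree in 2.4

using namespace cv;
using namespace std;

// Build one FlannBasedMatcher per bucket of training images.
vector<Ptr<DescriptorMatcher> > trainBuckets(const vector<vector<Mat> >& buckets)
{
    SurfFeatureDetector detector(400);      // Hessian threshold
    SurfDescriptorExtractor extractor;
    vector<Ptr<DescriptorMatcher> > matchers;

    for (size_t b = 0; b < buckets.size(); ++b) {
        vector<Mat> descriptors;
        for (size_t i = 0; i < buckets[b].size(); ++i) {
            vector<KeyPoint> keypoints;
            Mat desc;
            detector.detect(buckets[b][i], keypoints);
            extractor.compute(buckets[b][i], keypoints, desc);
            descriptors.push_back(desc);    // one descriptor Mat per training image
        }
        Ptr<DescriptorMatcher> m = DescriptorMatcher::create("FlannBased");
        m->add(descriptors);
        m->train();                         // builds the FLANN index for this bucket
        matchers.push_back(m);
    }
    return matchers;
}
```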
Then, for the query image, I perform "detectKeyPoints" and "computeDescriptors", and run "match" against each of the N DescriptorMatchers. Finally I get a list of DMatch objects from each "match" call.
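The query side, continuing the sketch above (same assumptions):

```cpp
#include <opencv2/core/core.hpp>
#include <opencv2/features2d/features2d.hpp>
#include <opencv2/nonfree/features2d.hpp>

using namespace cv;
using namespace std;

// Extract SURF features from the query once, then match against every bucket.
vector<vector<DMatch> > matchAllBuckets(const Mat& queryImage,
                                        vector<Ptr<DescriptorMatcher> >& matchers)
{
    SurfFeatureDetector detector(400);
    SurfDescriptorExtractor extractor;
    vector<KeyPoint> keypoints;
    Mat queryDescriptors;
    detector.detect(queryImage, keypoints);
    extractor.compute(queryImage, keypoints, queryDescriptors);

    vector<vector<DMatch> > perBucket(matchers.size());
    for (size_t b = 0; b < matchers.size(); ++b)
        matchers[b]->match(queryDescriptors, perBucket[b]);
        // each DMatch carries distance, imgIdx (image within the bucket),
        // queryIdx and trainIdx
    return perBucket;
}
```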
My questions are:
1) Are my steps correct according to the paper? In particular, I am trying to understand how the "reduce" step described in the paper should be done.
2) There are two signals I can derive from the DMatch results: the match "distance" and the total number of matches per image. How can I combine these two factors to find the closest matching image?
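For context, this is the kind of reduce step I currently have in mind: vote per (bucket, image) pair, filtered by a distance threshold. I am not sure this is what the paper means, and the threshold value here is arbitrary:

```cpp
#include <map>
#include <vector>
#include <utility>
#include <opencv2/features2d/features2d.hpp>  // for cv::DMatch

using namespace cv;
using namespace std;

// Possible reduce: count matches per (bucket, image-within-bucket),
// keeping only matches below a distance threshold.
pair<int, int> reduceToBestImage(const vector<vector<DMatch> >& perBucket,
                                 float maxDistance /* e.g. 0.25f, a guess */)
{
    map<pair<int, int>, int> votes;          // (bucket, imgIdx) -> match count
    for (size_t b = 0; b < perBucket.size(); ++b)
        for (size_t i = 0; i < perBucket[b].size(); ++i)
            if (perBucket[b][i].distance < maxDistance)
                votes[make_pair((int)b, perBucket[b][i].imgIdx)]++;

    pair<int, int> best(-1, -1);
    int bestVotes = 0;
    for (map<pair<int, int>, int>::iterator it = votes.begin();
         it != votes.end(); ++it)
        if (it->second > bestVotes) { bestVotes = it->second; best = it->first; }
    return best;                             // (bucket, image-within-bucket)
}
```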
you want a flann::Index here, not the matcher.
the "matching_to_many" example is misleading here, as it assumes, that all images show the same scene (just from different pov). you simply cannot use the same approach for classifying different objects.
@berak, thank you. But doesn't FlannBasedMatcher use flann::Index under the hood?
sort of, yes. but the way you use it differs significantly.
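for example (a rough, untested sketch; the variable names are mine): stack all training descriptors into one CV_32F Mat, keep a row-to-image lookup, build a single index, knnSearch the query descriptors, and do the voting yourself:

```cpp
#include <map>
#include <vector>
#include <opencv2/core/core.hpp>
#include <opencv2/flann/miniflann.hpp>

using namespace cv;
using namespace std;

// allDescriptors: every training descriptor stacked row-wise (CV_32F).
// rowToImage:     rowToImage[r] = id of the image that row r came from.
int bestImage(const Mat& allDescriptors, const vector<int>& rowToImage,
              const Mat& queryDescriptors)
{
    flann::Index index(allDescriptors, flann::KDTreeIndexParams(4)); // 4 kd-trees
    Mat indices, dists;
    index.knnSearch(queryDescriptors, indices, dists, 2, flann::SearchParams(32));

    map<int, int> votes;                     // image id -> vote count
    for (int q = 0; q < indices.rows; ++q)
        votes[rowToImage[indices.at<int>(q, 0)]]++;  // nearest neighbour only

    int best = -1, bestVotes = 0;
    for (map<int, int>::iterator it = votes.begin(); it != votes.end(); ++it)
        if (it->second > bestVotes) { bestVotes = it->second; best = it->first; }
    return best;
}
```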