How to find a matching image

asked 2016-03-02 07:04:59 -0600 by jan

updated 2016-03-02 07:30:38 -0600 by berak

Hello,

I want to retrieve similar images using the vocabulary created by the code below. The code runs successfully and creates two descriptor files. What I don't understand is whether the file "descriptr.yml" contains the extracted features of the query image, or the query image's features matched against the vocabulary. Please help.

my code is...

#include "stdafx.h"
#include <opencv/cv.h>
#include <opencv/highgui.h>
#include <opencv2/nonfree/features2d.hpp>

using namespace cv;
using namespace std;

#define DICTIONARY_BUILD 0 // set DICTIONARY_BUILD 1 to do Step 1, otherwise it goes to step 2

int _tmain(int argc, _TCHAR* argv[])
{   
int minHessian = 400; //Hessian Threshold
#if DICTIONARY_BUILD == 1

    //Step 1 - Obtain the set of bags of features.

    //to store the input file names
    char filename[100] = {};
    //to store the current input image
    Mat input;  

    //To store the keypoints that will be extracted by SURF
    vector<KeyPoint> keypoints;
    //To store the SURF descriptor of current image
    Mat descriptor;
    //To store all the descriptors that are extracted from all the images.
    Mat featuresUnclustered;
    //The SURF feature extractor and descriptor 
    SurfDescriptorExtractor detector(minHessian,4,2,false);     

    //loop over the training images (1.jpg .. 24.jpg) and extract their feature descriptors to build the vocabulary
    for(int f=1;f<25;f++){
        //create the file name of an image
        sprintf(filename,"C:\\harshada\\OpenCV BoFSURF\\Release\\image\\%i.jpg",f);
        //open the file
        input = imread(filename, CV_LOAD_IMAGE_GRAYSCALE); //Load as grayscale
        if(input.empty())
        {
            cout << "Error: Image cannot be loaded !" << endl;
            system("Pause");
            return -1;
        }
        //detect feature points
        detector.detect(input, keypoints);
        //compute the descriptors for each keypoint
        detector.compute(input, keypoints,descriptor);      
        //put the all feature descriptors in a single Mat object 
        featuresUnclustered.push_back(descriptor);      
        //print the progress
        printf("%i of 24 images done\n",f);
    }   


    //Construct BOWKMeansTrainer
    //the size of the vocabulary (number of visual words / cluster centres)
    int dictionarySize=200;
    //define Term Criteria
    TermCriteria tc(CV_TERMCRIT_ITER,100,0.001);
    //retries number
    int retries=1;
    //necessary flags
    int flags=KMEANS_PP_CENTERS;
    //Create the BoW (or BoF) trainer
    BOWKMeansTrainer bowTrainer(dictionarySize,tc,retries,flags);
    //k-means needs CV_32F data, so convert the descriptors (if necessary) before adding them to the trainer
    if (featuresUnclustered.type() != CV_32F)
    {
        featuresUnclustered.convertTo(featuresUnclustered, CV_32F);
    }
    bowTrainer.add(featuresUnclustered);
    //cluster the feature vectors into the vocabulary
    Mat vocabulary = bowTrainer.cluster();
    //store the vocabulary
    FileStorage fs("C:\\harshada\\OpenCV BoFSURF\\Release\\image\\dictionary.yml", FileStorage::WRITE);
    fs << "vocabulary" << vocabulary;
    fs.release();

#else
    //Step 2 - Obtain the BoF descriptor for given image/video frame. 

    //prepare BOW descriptor extractor from the dictionary    
    Mat dictionary; 
    FileStorage fs("C:\\harshada\\OpenCV BoFSURF\\Release\\image\\dictionary.yml", FileStorage::READ);
    fs["vocabulary"] >> dictionary;
    fs.release();   

    //create a nearest neighbor matcher
    Ptr<DescriptorMatcher> matcher(new FlannBasedMatcher);
    //create SURF feature point extracter
    Ptr<FeatureDetector> detector(new SurfFeatureDetector(minHessian,4,2,false));
    //create SURF descriptor extractor
    Ptr<DescriptorExtractor> extractor(new SurfDescriptorExtractor(minHessian,4,2,false));  
    //create BoF (or BoW) descriptor extractor
    BOWImgDescriptorExtractor bowDE(extractor,matcher);
    //Set the dictionary with the vocabulary we created in the first step
    bowDE.setVocabulary(dictionary);

    //To store the image file name
    char * filename = new char[100 ...
(more)
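
For context, Step 2 in this kind of BoW pipeline usually finishes roughly like the sketch below (illustrative only, not the truncated code above; the query image path and the .yml file name are placeholders). The point is that what gets written out is the BoW descriptor (a 1 x dictionarySize histogram over the vocabulary), not the raw SURF features of the query image:

    //read a query image as grayscale (path is a placeholder)
    Mat img = imread("C:\\harshada\\OpenCV BoFSURF\\Release\\image\\1.jpg", CV_LOAD_IMAGE_GRAYSCALE);
    //detect SURF keypoints on the query image
    vector<KeyPoint> keypoints;
    detector->detect(img, keypoints);
    //project the query image onto the vocabulary
    Mat bowDescriptor;
    bowDE.compute(img, keypoints, bowDescriptor);
    //write the BoW descriptor (not raw SURF features) to a yml file
    FileStorage fs1("C:\\harshada\\OpenCV BoFSURF\\Release\\image\\descriptor.yml", FileStorage::WRITE);
    fs1 << "descriptor" << bowDescriptor;
    fs1.release();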

1 answer


answered 2016-03-02 07:42:47 -0600 by berak

updated 2016-03-04 11:42:48 -0600

if your vocabulary has 200 visual words (cluster centres), your bowDescriptor will have 200 numbers: a normalized histogram that records, for each word, how many of your image's SURF descriptors were matched to it.

to compare 2 images this way, you extract the bowFeatures for both images, and compare those (instead of comparing the images themselves, or the SURF descriptors)
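
for instance, with the 200-word vocabulary above, each image is reduced to a single row (a sketch; img and keypoints are assumed to come from your SURF detector, and bowDE is set up with the vocabulary as in your Step 2):

Mat bowDescriptor;
bowDE.compute(img, keypoints, bowDescriptor);
// bowDescriptor: 1 x 200, CV_32F (one normalized bin per visual word)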

the simplest algorithm for this is a nearest-neighbour search.

given that you have a vector of bowFeatures (your train set) and a test candidate, it's as simple as:

vector<Mat> images; // keep the train images for later

// ... train bow dictionary, and compute / collect bowFeatures from the train-set:
vector<Mat> bowTrain = ...;

Mat bowTest = computeBowFeatureFromTestImage(img);

size_t best = 0;
double minDist = DBL_MAX; // from <cfloat>
for (size_t i=0; i<bowTrain.size(); i++)
{
     double dist = norm(bowTrain[i], bowTest); // L2 distance between the two bow descriptors
     if (dist < minDist) // keep the one with the smallest distance
     {
          minDist = dist;
          best = i;
     }
}

Mat bestImage = images[best];

but, of course, this is a very primitive / blunt way to do it. in real life, you would want to train a more sophisticated classifier, like kNN or SVM.
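
for example, a minimal kNN version with the OpenCV 2.4 CvKNearest class might look like this (a sketch, not a drop-in solution; imageLabels (one integer class id per train image) and the value of K are illustrative assumptions):

#include <opencv2/ml/ml.hpp>

// stack the bow descriptors into one train matrix (one CV_32F row per image),
// with a float class label per row
Mat bowTrainData, labels;
for (size_t i=0; i<bowTrain.size(); i++)
{
    bowTrainData.push_back(bowTrain[i]);
    labels.push_back(float(imageLabels[i])); // imageLabels is assumed to exist
}

CvKNearest knn;
knn.train(bowTrainData, labels);

// classify the test bow descriptor by its K nearest train neighbours
int K = 3;
Mat results, neighborResponses, dists;
knn.find_nearest(bowTest, K, results, neighborResponses, dists);
float predictedLabel = results.at<float>(0,0);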


Comments

No, I want to check whether my query image is in my data set or not. If it is present, I want to show it on screen.

jan ( 2016-03-02 23:34:11 -0600 )

^^ sorry, but i don't understand a single word.

berak ( 2016-03-03 01:26:01 -0600 )

in short, I want content-based image retrieval. So how can I do that?

jan ( 2016-03-03 04:10:26 -0600 )

feed the bowFeatures from your train set into KNearest. then, for a test candidate, find the closest K, and take your pick

berak ( 2016-03-03 04:18:06 -0600 )

did you understand why BOW is applied here?

berak ( 2016-03-03 04:19:15 -0600 )

I don't understand which algorithm I should use to retrieve similar images. Will you please give me some ideas?

jan ( 2016-03-03 05:14:08 -0600 )

it's probably time to take a detour and look at machine learning, e.g. here or here. also see the edit above.

berak ( 2016-03-03 06:11:20 -0600 )

hello berak, I have one problem with "vector<Mat> bowTrain". In my code above I am clustering the dictionary into a Mat, so how can I use it in the "norm" function? please help

jan ( 2016-03-10 03:18:28 -0600 )

no, you don't use the vocabulary for the comparison directly.

instead, you setup your bowDE with the vocabulary, and "project" your SURF features into bow space, like here:

bowDE.compute(img,keypoints,bowDescriptor);

do that for all train features, and also your test features, then compare those bowDescriptors.

berak ( 2016-03-10 03:36:40 -0600 )
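
a minimal sketch of that collection step (assuming a vector<string> trainFiles with the train image paths, an illustrative name, and the detector / bowDE objects set up as in Step 2 of the question):

vector<Mat> bowTrain;
for (size_t i=0; i<trainFiles.size(); i++)
{
    Mat img = imread(trainFiles[i], CV_LOAD_IMAGE_GRAYSCALE);
    if (img.empty()) continue;

    vector<KeyPoint> keypoints;
    detector->detect(img, keypoints);
    if (keypoints.empty()) continue; // skip images with no keypoints, bowDE.compute can't build a descriptor from them

    Mat bowDescriptor;
    bowDE.compute(img, keypoints, bowDescriptor); // 1 x dictionarySize, CV_32F
    bowTrain.push_back(bowDescriptor);
}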

hi berak, I define bowTrain as vector<Mat> bowTrain and then try to push back the descriptor matrix onto bowTrain like bowTrain.push_back(bowDescriptor); is this the correct way? It's throwing the error "Unhandled exception at 0x0f345a40 (opencv_features2d2411.dll) in BoFSURF.exe: 0xC0000005: Access violation reading location 0x00000014." please help

jan ( 2016-03-18 04:15:06 -0600 )
