
What to do with DMatch values?

asked 2013-08-08 04:35:29 -0600

newBee

updated 2013-08-08 06:03:14 -0600

I have this code for image matching using ORB:

    FeatureDetector detector = FeatureDetector.create(FeatureDetector.ORB);
    DescriptorExtractor descriptor = DescriptorExtractor.create(DescriptorExtractor.ORB);
    DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
    File root = Environment.getExternalStorageDirectory();
    File file = new File(root, "nano1.jpg");
    File file2 = new File(root, "nano2.jpg");
    Log.d(LOG_TAG, "File " + file.exists() + " & " + file2.exists() + " " + root.getAbsolutePath());

    //first image
    Mat img1 = Highgui.imread(file.getAbsolutePath());
    Mat descriptors1 = new Mat();
    MatOfKeyPoint keypoints1 = new MatOfKeyPoint();

    detector.detect(img1, keypoints1);
    descriptor.compute(img1, keypoints1, descriptors1);

    //second image
    Mat img2 = Highgui.imread(file2.getAbsolutePath());
    Mat descriptors2 = new Mat();
    MatOfKeyPoint keypoints2 = new MatOfKeyPoint();

    detector.detect(img2, keypoints2);
    descriptor.compute(img2, keypoints2, descriptors2);


    //matcher should include 2 different image's descriptors
    MatOfDMatch  matches = new MatOfDMatch();             
    matcher.match(descriptors1,descriptors2,matches);
    Log.d(LOG_TAG, "size " + matches.size());

    //feature and connection colors
    Scalar RED = new Scalar(255,0,0);
    Scalar GREEN = new Scalar(0,255,0);
    //output image
    Mat outputImg = new Mat();
    MatOfByte drawnMatches = new MatOfByte();
    //this will draw all matches, works fine
    Features2d.drawMatches(img1, keypoints1, img2, keypoints2, matches, 
            outputImg, GREEN, RED,  drawnMatches, Features2d.NOT_DRAW_SINGLE_POINTS);
    int DIST_LIMIT = 80;
    List<DMatch> matchList = matches.toList();
    List<DMatch> matches_final = new ArrayList<DMatch>();
    for (int i = 0; i < matchList.size(); i++) {
        if (matchList.get(i).distance <= DIST_LIMIT) {
            matches_final.add(matchList.get(i));
        }
    }

    MatOfDMatch matches_final_mat = new MatOfDMatch();
    matches_final_mat.fromList(matches_final);
    for(int i=0; i< matches_final.size(); i++){
        Log.d(LOG_TAG,""+ matches_final.get(i));
    }

And in the matches I get:

08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=0, trainIdx=8, imgIdx=0, distance=63.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=1, trainIdx=81, imgIdx=0, distance=78.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=2, trainIdx=162, imgIdx=0, distance=73.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=3, trainIdx=189, imgIdx=0, distance=75.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=4, trainIdx=88, imgIdx=0, distance=77.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=5, trainIdx=89, imgIdx=0, distance=60.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=6, trainIdx=81, imgIdx=0, distance=78.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=7, trainIdx=68, imgIdx=0, distance=57.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=8, trainIdx=298, imgIdx=0, distance=48.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=9, trainIdx=12, imgIdx=0, distance=39.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=10, trainIdx=479, imgIdx=0, distance=66.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=11, trainIdx=480, imgIdx=0, distance=63.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=12, trainIdx=125, imgIdx=0, distance=56.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=13, trainIdx=298, imgIdx=0, distance=71.0]
08-08 14:59:29.569: D/FdActivity(9001): DMatch [queryIdx=14 ...
(more)

Comments


Discard some matches based on distance; it seems to be the only useful parameter in DMatch. Or apply an algorithm such as a symmetry test between matches (find matches for image1 in image2 and vice versa, then keep only those that are symmetrical), or try RANSAC, and so on.

mada ( 2013-08-08 07:08:02 -0600 )

@mada Can you give some lead on how to do that?

newBee ( 2013-08-09 01:50:44 -0600 )

@ximobayo explained it quite well. Using knnMatch will give you more options for discarding false matches; also check this out: http://answers.opencv.org/question/11840/false-positive-in-object-detection/

mada ( 2013-08-09 05:20:18 -0600 )
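The symmetry test mada mentions can be sketched in the question's Java, with a tiny stand-in for OpenCV's DMatch so the snippet runs on its own (a hedged sketch; the class and method names here are illustrative, not OpenCV API):

```java
import java.util.ArrayList;
import java.util.List;

public class SymmetryTest {
    // Minimal stand-in for org.opencv.features2d.DMatch.
    static class DMatch {
        int queryIdx, trainIdx;
        float distance;
        DMatch(int q, int t, float d) { queryIdx = q; trainIdx = t; distance = d; }
    }

    // Keep only matches confirmed in both directions (1->2 and 2->1):
    // a pair survives when each descriptor picks the other as its best match.
    static List<DMatch> symmetric(List<DMatch> m12, List<DMatch> m21) {
        List<DMatch> good = new ArrayList<>();
        for (DMatch a : m12) {
            for (DMatch b : m21) {
                if (a.queryIdx == b.trainIdx && a.trainIdx == b.queryIdx) {
                    good.add(a);
                    break;
                }
            }
        }
        return good;
    }

    public static void main(String[] args) {
        List<DMatch> m12 = new ArrayList<>();
        m12.add(new DMatch(0, 8, 63f));   // confirmed in the other direction
        m12.add(new DMatch(1, 81, 78f));  // not confirmed in the other direction
        List<DMatch> m21 = new ArrayList<>();
        m21.add(new DMatch(8, 0, 63f));
        m21.add(new DMatch(81, 6, 78f));
        System.out.println(symmetric(m12, m21).size()); // prints 1
    }
}
```

In real code the two match lists would come from calling `matcher.match` twice with the descriptor Mats swapped.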

2 answers


answered 2013-08-09 02:37:10 -0600

ximobayo

The struct DMatch tells, for a descriptor (feature) of the query set, which descriptor (feature) of the train set is the most similar. So it contains a queryIdx and a trainIdx (which pair of features the matcher decided is most similar) and a distance. The distance represents how far one feature is from the other under some metric (NORM_L1, NORM_HAMMING, etc.).
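For binary descriptors like ORB, matched with BRUTEFORCE_HAMMING as in the question, that distance is just the number of differing bits between the two descriptors. A self-contained sketch (not OpenCV code) of what the matcher computes:

```java
public class HammingDemo {
    // Hamming distance between two binary descriptors: count differing bits.
    static int hamming(byte[] a, byte[] b) {
        int d = 0;
        for (int i = 0; i < a.length; i++) {
            d += Integer.bitCount((a[i] ^ b[i]) & 0xFF);
        }
        return d;
    }

    public static void main(String[] args) {
        // Two toy 16-bit "descriptors" differing in 3 bit positions.
        byte[] d1 = { (byte) 0b1010_1010, (byte) 0b0000_0000 };
        byte[] d2 = { (byte) 0b1010_0000, (byte) 0b0000_0001 };
        System.out.println(hamming(d1, d2)); // prints 3
    }
}
```

So a DMatch distance of 63.0, as in the question's logs, means 63 of the 256 ORB descriptor bits differ.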

So you can decide whether a match is correct by setting a threshold on the distance. For example, with SURF features you can use knnMatch to retrieve the 2 nearest matches for each descriptor; you then get a matrix of matches, and you can keep as good matches only those whose best distance is well below the second-best one, i.e.:

    matcher->knnMatch(desc1, trainDesc, matches, 2, Mat(), false);
    for (size_t i = 0; i < matches.size(); i++) {
        // keep the match only if the best distance is well below the second-best
        if (matches[i][0].distance < matches[i][1].distance * 0.6) {
            goodMatches.push_back(i);
        }
    }
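The same ratio test in the question's Java, with a plain array standing in for the knnMatch output so it runs on its own (a hedged sketch; the data values are made up for illustration):

```java
import java.util.ArrayList;
import java.util.List;

public class RatioTest {
    public static void main(String[] args) {
        // For each query descriptor: distance to its best and second-best
        // train match, as knnMatch(..., k=2) would return them.
        float[][] knn = { { 30f, 80f }, { 70f, 75f }, { 40f, 90f } };
        List<Integer> goodMatches = new ArrayList<>();
        for (int i = 0; i < knn.length; i++) {
            // Ratio test: keep the match only when the best distance is
            // clearly smaller than the second-best one.
            if (knn[i][0] < knn[i][1] * 0.6f) {
                goodMatches.add(i);
            }
        }
        System.out.println(goodMatches); // prints [0, 2]
    }
}
```

The middle match is rejected because its two candidate distances (70 and 75) are nearly equal, i.e. the match is ambiguous.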

Comments

What is the push_back method in the line goodMatches.push_back(i);?

newBee ( 2013-08-09 06:38:31 -0600 )

answered 2013-08-09 07:29:56 -0600

ximobayo

goodMatches in that example is a vector used to record which features were matched, and push_back is a method of the vector class that appends one more element at the end of the array.



Stats

Asked: 2013-08-08 04:35:29 -0600

Seen: 19,639 times

Last updated: Aug 09 '13