
Does the matching work in OpenCV 2.4.4?

asked 2013-05-07 17:23:54 -0600 by MysticBE
updated 2013-05-14 06:34:12 -0600

The reason I ask this question is the following: when I match my two descriptor matrices, which are of size 64x241 and 64x258, with matcher.match(descriptorsa,descriptorsb,matches); the result contains 1x258 matches. How is this possible when the first image only has 241 descriptors?

EDIT:

These are the matches after calling matcher.match(descriptorsa,descriptorsb,matches);. Matches on the original picture:

Here I get 258 matches with the original image; an image that has nothing to do with the original one, after being filtered (see first); and the image that is almost identical to the original, which gives 61 matches after being filtered.

The code is the following:

    // matcher
    MatOfDMatch matches = new MatOfDMatch();
    matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
    matcher.match(descriptorsDB, descriptorsCamera, matches);
    Log.d("LOG!", "Number of good matches after .match = " + matches.size());
    MatOfDMatch goedematches = new MatOfDMatch();

    double max_dist = 0;
    double min_dist = 100000;

    if (descriptorsDB.cols() == descriptorsCamera.cols())
    {
        // match() returns one DMatch per descriptor in the first (query) argument
        DMatch[] matchArray = matches.toArray();

        // calculate the max and min distance between the matched keypoints
        for (int i = 0; i < descriptorsDB.rows(); i++)
        {
            double dist = matchArray[i].distance;
            if (dist < min_dist) min_dist = dist;
            if (dist > max_dist) max_dist = dist;
        }

        // keep only the good matches (distance <= 2 * min_dist)
        for (int i = 0; i < descriptorsDB.rows(); i++)
        {
            if (matchArray[i].distance <= 2 * min_dist)
            {
                MatOfDMatch temp = new MatOfDMatch();
                temp.fromArray(matchArray[i]);
                goedematches.push_back(temp);
            }
        }
    }
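A quick way to double-check what those sizes actually mean (a small diagnostic sketch, reusing the variable names from the code above): each descriptor occupies one row of its Mat, and Mat.size() is printed as width x height, i.e. cols x rows, so logging rows() and cols() directly removes the ambiguity.

    Log.d("LOG!", "DB descriptors: " + descriptorsDB.rows() + " rows x " + descriptorsDB.cols() + " cols");
    Log.d("LOG!", "Camera descriptors: " + descriptorsCamera.rows() + " rows x " + descriptorsCamera.cols() + " cols");
    // one DMatch per descriptor in the first (query) argument of match()
    Log.d("LOG!", "Matches returned: " + matches.toArray().length);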

3 answers


answered 2013-05-08 06:36:15 -0600 by Nyenna

It matches the features of the first image to the best correspondence in the second image, which means that if your first image has 241 descriptors, you will get 241 matches no matter what the second image is.

Try inverting your descriptors:

matcher.match(descriptorsa,descriptorsb,matches);

should not return the same number of matches as:

matcher.match(descriptorsb,descriptorsa,matches);

If the first image has 200 descriptors and the second only one, then you will get 200 matches: the best solution for each will be the unique descriptor of the second image!
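As a quick sanity check, here is a minimal sketch of that experiment (assuming descriptorsa and descriptorsb are the two descriptor Mats from the question, and using the same Hamming brute-force matcher as in the question's code):

    import android.util.Log;
    import org.opencv.core.MatOfDMatch;
    import org.opencv.features2d.DescriptorMatcher;

    DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);

    // match() returns one DMatch per descriptor in the FIRST (query) argument
    MatOfDMatch matchesAB = new MatOfDMatch();
    matcher.match(descriptorsa, descriptorsb, matchesAB);

    MatOfDMatch matchesBA = new MatOfDMatch();
    matcher.match(descriptorsb, descriptorsa, matchesBA);

    // the two counts should equal the number of descriptors in a and b respectively
    Log.d("LOG!", "a->b: " + matchesAB.toArray().length + "  b->a: " + matchesBA.toArray().length);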


Comments

So I should always describe the second image instead of the first to get a better result?

MysticBE (2013-05-08 06:40:04 -0600)

No. As I said in at least three answers to your posts: you will always get "good matches" (inliers) and "bad matches" (outliers). First of all, you should draw the matches to see whether you have good matches or not (but apparently you haven't done this or haven't shared the result with us). I don't know how you can say that this is not working if you haven't even drawn the matches in a comprehensible way...

Nyenna (2013-05-08 07:24:46 -0600)

I actually drew the matches already; I will share the results with you (check the edit on this post). I'm sorry for not sharing them earlier. I hope these images suffice, since I wasn't able to draw lines between the two images. These are the matches I get from the matcher.match(descriptorsa,descriptorsb,matches); call (not filtered). I hope those are the ones you want to see.

MysticBE (2013-05-08 12:24:25 -0600)

If you want to draw lines between your images, you will need to copy both of them into the same matrix first. I don't see why your results would be incorrect. The matches you get from the original image to the similar one are probably really coherent, while those with the other images aren't (but you would need to draw lines to see this). As I said, the algorithm gives the _best possible match w.r.t. the Hamming distance_, even though that match may be really bad. Your filtering step is also very naive; you should prefer a geometric validation using e.g. a homography matrix. But the first step would be to draw lines to convince yourself that the algorithms work.

Nyenna (2013-05-14 01:58:16 -0600)
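To illustrate both suggestions, here is a rough sketch. The names imgDB, imgCamera, keypointsDB and keypointsCamera (the images and the MatOfKeyPoint returned by the detector) are assumptions, not from the original post; goedematches is the filtered match set from the question's code. Features2d.drawMatches copies both images side by side into one output Mat and draws the lines, and findHomography with RANSAC is a simple geometric validation.

    import java.util.LinkedList;
    import java.util.List;
    import org.opencv.calib3d.Calib3d;
    import org.opencv.core.Mat;
    import org.opencv.core.MatOfPoint2f;
    import org.opencv.core.Point;
    import org.opencv.features2d.DMatch;      // org.opencv.core.DMatch in OpenCV 3.x and later
    import org.opencv.features2d.Features2d;
    import org.opencv.features2d.KeyPoint;    // org.opencv.core.KeyPoint in OpenCV 3.x and later

    // 1) draw the matches: both images end up side by side in outImg with lines between them
    Mat outImg = new Mat();
    Features2d.drawMatches(imgDB, keypointsDB, imgCamera, keypointsCamera, goedematches, outImg);

    // 2) geometric validation: estimate a homography from the matched keypoint coordinates
    KeyPoint[] kpDB = keypointsDB.toArray();
    KeyPoint[] kpCam = keypointsCamera.toArray();
    List<Point> ptsDB = new LinkedList<Point>();
    List<Point> ptsCam = new LinkedList<Point>();
    for (DMatch m : goedematches.toArray()) {
        ptsDB.add(kpDB[m.queryIdx].pt);    // queryIdx indexes the first descriptor set
        ptsCam.add(kpCam[m.trainIdx].pt);  // trainIdx indexes the second descriptor set
    }

    if (ptsDB.size() >= 4) {               // findHomography needs at least 4 point pairs
        MatOfPoint2f src = new MatOfPoint2f();
        src.fromList(ptsDB);
        MatOfPoint2f dst = new MatOfPoint2f();
        dst.fromList(ptsCam);
        Mat H = Calib3d.findHomography(src, dst, Calib3d.RANSAC, 3);
        if (H.empty()) {
            // RANSAC found no consistent transformation: the surviving matches are probably noise
        }
    }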

answered 2013-05-08 05:56:46 -0600 by StevenPuttemans

I do not completely agree with @NightLife's comment. Of course more matches can happen for a single point; that is why people use algorithms like RANSAC to counter that.

However, what happens here is that the matching functionality is a brute-force matcher, which means that it matches both ways between the data sets. Therefore the result must contain as many elements as the largest descriptor set; otherwise the matcher would try to access elements in a descriptor set that are out of the actual range. This is the reason for your 258 value, which is completely correct and how the algorithm should work in order not to create errors.


Comments

When I use the matches I get the following results: TLK = 258 matches (when I match the same image), TLK2 = 61 matches (when I match a similar image), TLK3 = 44 matches (has nothing to do with the original image), TLK4 = 251 matches (has NOTHING to do with the original image, yet gives back a lot of matches), TLK5 = 258 matches (same here, has NOTHING to do with the original, also gives back a full match count)... What is going on? :S I have the feeling the matcher has no clue what it is doing.

MysticBE (2013-05-08 06:11:12 -0600)

Except it does not do two-way matching. It matches the first set of descriptors to the second. In other words: matching A to B won't give the exact same results as matching B to A.

Nyenna (2013-05-08 06:37:37 -0600)
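If a two-way consistency check is what was expected, it can be done by hand: match in both directions and keep only the pairs that agree (a symmetric or "cross-check" test). A rough sketch, reusing the matcher and the descriptor names from the question's code:

    import java.util.LinkedList;
    import java.util.List;
    import org.opencv.core.MatOfDMatch;
    import org.opencv.features2d.DMatch;   // org.opencv.core.DMatch in OpenCV 3.x and later

    MatOfDMatch ab = new MatOfDMatch();
    MatOfDMatch ba = new MatOfDMatch();
    matcher.match(descriptorsDB, descriptorsCamera, ab);   // DB -> camera
    matcher.match(descriptorsCamera, descriptorsDB, ba);   // camera -> DB

    DMatch[] abArr = ab.toArray();
    DMatch[] baArr = ba.toArray();
    List<DMatch> symmetric = new LinkedList<DMatch>();
    for (DMatch m : abArr) {
        // keep the match only if the camera feature it points to matches back to the same DB feature
        if (baArr[m.trainIdx].trainIdx == m.queryIdx) {
            symmetric.add(m);
        }
    }
    MatOfDMatch crossChecked = new MatOfDMatch();
    crossChecked.fromList(symmetric);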

I would post NOTHING again ... seriously ... be patient and let us try to find the solution together.

StevenPuttemans (2013-05-08 06:43:57 -0600)

@Nyenna: I was expecting it to perform two-way matching, but it seems I was wrong. However, your suggestion seems legit. Thanks for the update, learnt something new today =)

StevenPuttemans (2013-05-08 06:46:47 -0600)

answered 2013-05-08 02:11:24 -0600 by NightLife
updated 2013-05-08 02:12:42 -0600

If you show the matches in one window, you can see that sometimes one feature point from the first or second image is matched to more than one point in the other image. So you can have more matches than 241.
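One way to actually inspect several candidate correspondences per feature is knnMatch, which returns the k closest descriptors for each query descriptor; combined with a ratio test it also discards the ambiguous ones. A rough sketch, using the descriptor names from the question's code (the 0.75 threshold is just a commonly used value, not something from this thread):

    import java.util.LinkedList;
    import java.util.List;
    import org.opencv.core.MatOfDMatch;
    import org.opencv.features2d.DMatch;   // org.opencv.core.DMatch in OpenCV 3.x and later
    import org.opencv.features2d.DescriptorMatcher;

    DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);

    // k = 2: for every descriptor in descriptorsDB, get its two closest descriptors in descriptorsCamera
    List<MatOfDMatch> knn = new LinkedList<MatOfDMatch>();
    matcher.knnMatch(descriptorsDB, descriptorsCamera, knn, 2);

    List<DMatch> filtered = new LinkedList<DMatch>();
    for (MatOfDMatch pair : knn) {
        DMatch[] two = pair.toArray();
        // keep the best candidate only if it is clearly better than the runner-up
        if (two.length >= 2 && two[0].distance < 0.75f * two[1].distance) {
            filtered.add(two[0]);
        }
    }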

