Keypoint matcher matches all keypoints from the reference (train) image to the query frame, irrespective of the scene?
I am using feature detection to recognize a known object in a real-time camera frame on Android, but the matcher reports that every keypoint from the object image is matched to a keypoint in the scene/camera frame.
The logcat output shows the keypoint and match info:
08-07 01:40:13.899: I/OCVSample::Activity(26175): mGray H/W : 320x240
08-07 01:40:13.899: I/OCVSample::Activity(26175): mRef H/W : 153x94
08-07 01:40:13.899: I/OCVSample::Activity(26175): keypoints H/W : 1x381
08-07 01:40:13.899: I/OCVSample::Activity(26175): keypointsRef H/W : 1x33
08-07 01:40:13.899: I/OCVSample::Activity(26175): descriptor Size: 32x381
08-07 01:40:13.899: I/OCVSample::Activity(26175): descriptorRef Size : 32x33
08-07 01:40:13.899: I/OCVSample::Activity(26175): Matches Size: 1x33
08-07 01:40:13.899: I/OCVSample::Activity(26175): matchesList Size: 33
08-07 01:40:13.899: I/OCVSample::Activity(26175): LIST D MATCH -- Max dist : 97.0 -- Min dist : 67.0
08-07 01:40:13.899: I/OCVSample::Activity(26175): good_matches Size: 33
08-07 01:40:13.899: I/OCVSample::Activity(26175): gm good_matches H/W : 1x33
08-07 01:40:13.899: I/OCVSample::Activity(26175): gm good_matches : Mat [ 33*1*CV_32FC4, isCont=true, isSubmat=false, nativeObj=0x6761d8, dataAddr=0x8bef80 ]
The code is:
FeatureDetector detector = FeatureDetector.create(FeatureDetector.ORB);
DescriptorExtractor extractor = DescriptorExtractor.create(DescriptorExtractor.ORB);
// ORB descriptors are binary, so a Hamming-distance matcher is appropriate here.
DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMINGLUT);
// Detect keypoints in the camera frame (mGray) and the reference image (mRef),
// then compute a descriptor for each keypoint.
detector.detect(mGray, keypoints);
detector.detect(mRef, keypointsRef);
extractor.compute(mGray, keypoints, descriptor);
extractor.compute(mRef, keypointsRef, descriptorRef);
// Query = reference descriptors, train = scene descriptors:
// one best match is returned for every reference descriptor.
matcher.match(descriptorRef, descriptor, matchs);
List<DMatch> matchesList = matchs.toList();
// Use primitive doubles rather than boxed Double, and start min_dist high so
// the true minimum is found (initializing it to 100.0 caps it: any run where
// all distances exceed 100 silently reports a minimum of 100).
double max_dist = 0.0;
double min_dist = Double.MAX_VALUE;
for (int i = 0; i < matchesList.size(); i++) {
    double dist = matchesList.get(i).distance;
    if (dist < min_dist)
        min_dist = dist;
    if (dist > max_dist)
        max_dist = dist;
}
LinkedList<DMatch> good_matches = new LinkedList<DMatch>();
MatOfDMatch gm = new MatOfDMatch();
// Keep only matches whose distance is within 3x the best distance seen.
for (int i = 0; i < matchesList.size(); i++) {
    if (matchesList.get(i).distance <= (3 * min_dist)) {
        good_matches.addLast(matchesList.get(i));
    }
}
gm.fromList(good_matches);
Log.i(TAG, "gm from good_matches Size: " + gm.size());
What is the problem with the matcher?
=============================== UPDATE ===============================
OK, now I use:
FeatureDetector detector = FeatureDetector.create(FeatureDetector.FAST);
DescriptorExtractor extractor = DescriptorExtractor.create(DescriptorExtractor.ORB);
// Note: plain BRUTEFORCE uses the L2 distance; for binary ORB descriptors,
// a Hamming-based matcher is the usual choice.
DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE);
I now get fewer good matches than the number of reference keypoints, depending on the condition below:
if (matchesList.get(i).distance <= (3 * min_dist)) {
good_matches.addLast(matchesList.get(i));
}
The logcat now shows (compare with the previous output):
Keypoints Size: 1x676
KeypointsRef Size : 1x139
descriptor Size: 32x629
descriptorRef Size : 32x32
Matches Size: 1x32
matchesList Size: 32
LIST D MATCH -- Max dist : 447.1934814453125 -- Min dist : 100.0
good matches size: 5
gm from good_matches Size: 1x5
obj Size: 1x5
scene Size: 1x5
Calib3d.findHomography hg Size: 3x3
A few output samples:
Frame - 1
hg : [-1.946750073879795, -2.153719667711051, 161.7932939049003;
-2.245372221173412, -2.194859343368184, 175.7583814286593;
-0.01283591450941816, -0.01266823891952267, 0.9999999999999999]
obj_corners : [0, 0; 153, 0; 153, 94; 0, 94]
scene_corners : [161.79329, 175.75838; 141.15591, 174.06831; 157.10199, 173.61986; 213.06747, 160.14717]
Frame-2
hg : [-1 ...
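The skewed scene_corners above suggest the homography itself is degenerate. One common sanity check (a sketch, not part of the original code) is to look at the determinant of the top-left 2x2 block of hg: if it is negative or near zero, the mapping flips or collapses the plane and the frame should be rejected. Applied to the Frame-1 matrix from the log, the check fails:

```java
public class HomographyCheck {
    // Rough sanity check on a 3x3 homography given as 9 row-major values:
    // the determinant of the top-left 2x2 block should be positive and not
    // vanishingly small, otherwise the mapping flips or collapses the plane.
    static boolean looksValid(double[] h) {
        double det = h[0] * h[4] - h[1] * h[3];
        return det > 0.01;
    }

    public static void main(String[] args) {
        // hg from Frame 1 of the log above.
        double[] hg = {
            -1.946750073879795, -2.153719667711051, 161.7932939049003,
            -2.245372221173412, -2.194859343368184, 175.7583814286593,
            -0.01283591450941816, -0.01266823891952267, 0.9999999999999999
        };
        // det of the 2x2 block is about -0.56, so the frame is rejected.
        System.out.println(looksValid(hg)); // prints false
    }
}
```

The 0.01 cutoff is an assumed rule-of-thumb value, not something from the original post; in practice the inlier count returned by findHomography with RANSAC is a stronger signal.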
You have 33 keypoints in the reference image, so the brute-force matcher will find the best match for every keypoint; of course you get 33 matches. Since your min distance is 67 and your max distance is 97, every match also satisfies the 3 * min_dist condition, so all of them end up in good_matches.
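To see why the 3 * min_dist filter keeps everything here, plug in the logged extremes (min 67, max 97): the threshold is 201, far above the largest distance. A small stand-alone sketch (with made-up sample distances bounded by those extremes):

```java
import java.util.Arrays;
import java.util.List;

public class FilterDemo {
    // Count how many distances survive the "<= 3 * min" filter.
    static long survivors(List<Double> dists) {
        double min = dists.stream().mapToDouble(Double::doubleValue).min().orElse(0);
        return dists.stream().filter(d -> d <= 3 * min).count();
    }

    public static void main(String[] args) {
        // Sample distances within the logged range [67, 97]; threshold = 3 * 67 = 201.
        List<Double> dists = Arrays.asList(67.0, 70.0, 85.0, 97.0);
        System.out.println(survivors(dists)); // prints 4: every match is "good"
    }
}
```

The filter only discriminates when the distance spread is wide; with a tight cluster of distances it passes everything.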
This project does real-time detection of a known object from the camera frame: http://answers.opencv.org/question/17723/android-application-to-recognize-known-object-and/
What if the reference image is not present in the scene? There should be no matches then, right? (When I point the camera towards a white background there are still matches, and the border lines drawn in the output are skewed and do not point to anything.)
All I need is to detect the presence of the reference object in the camera frame. Which feature detection, extraction, and matching should I use?
Is there any clear OpenCV documentation for object recognition?
No, even if the reference image is not in the scene, you will always get matches. Why? It's simple: you are using ORB and a brute-force matcher with the Hamming distance. The brute-force matcher just finds the best match according to the Hamming distance for every keypoint descriptor. So you will have to find a way to separate good matches from bad ones, or to decide that the object is not present at all.
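One standard way to separate good matches from bad ones (not used in the original code, which calls match() rather than knnMatch()) is Lowe's ratio test: retrieve the two nearest neighbours per query descriptor with DescriptorMatcher.knnMatch(query, train, matches, 2) and keep the best match only when it is clearly better than the second best. A minimal pure-Java sketch of the filtering step, with each float[] pair standing in for the two nearest-neighbour distances:

```java
import java.util.ArrayList;
import java.util.List;

public class RatioTest {
    // Keep pair[0] only if it is clearly better than pair[1] (Lowe's ratio
    // test). Each element stands for the two smallest distances returned by
    // knnMatch(..., 2) for one query descriptor.
    static List<Float> filter(List<float[]> knn, float ratio) {
        List<Float> good = new ArrayList<>();
        for (float[] pair : knn) {
            if (pair[0] < ratio * pair[1]) {
                good.add(pair[0]);
            }
        }
        return good;
    }

    public static void main(String[] args) {
        List<float[]> knn = new ArrayList<>();
        knn.add(new float[] {30f, 90f}); // distinctive: kept
        knn.add(new float[] {60f, 65f}); // ambiguous: dropped
        List<Float> good = filter(knn, 0.75f);
        // A presence decision can then require a minimum count of good matches
        // (e.g. at least 8-10) before attempting findHomography.
        System.out.println(good.size()); // prints 1
    }
}
```

On a white background the best and second-best distances are all similarly bad, so the ratio test drops nearly everything and the object can be declared absent.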
Using the FLANN-based matcher gives an error.
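That error is expected: the FLANN matcher's default KD-tree index works on float descriptors (CV_32F), while ORB produces binary CV_8U descriptors. One workaround (a fragment reusing the descriptor/matchs variables from the code above, and not verified against this exact setup) is to convert the descriptors to float first, at the cost of treating binary bytes as Euclidean coordinates:

```java
// FLANN's default index expects CV_32F descriptors; ORB outputs CV_8U.
descriptor.convertTo(descriptor, CvType.CV_32F);
descriptorRef.convertTo(descriptorRef, CvType.CV_32F);
DescriptorMatcher flann = DescriptorMatcher.create(DescriptorMatcher.FLANNBASED);
flann.match(descriptorRef, descriptor, matchs);
```

For binary descriptors the Hamming brute-force matcher is usually the better fit anyway; the conversion mainly helps when FLANN's approximate search is needed for speed on large descriptor sets.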