Inaccurate feature matching
Hi, I am currently developing an Android app using OpenCV4Android. The aim of this app is structure from motion on a somewhat larger scale. The concept of the app and the mathematical background are done. The problem is the matching of the feature points, but for a better understanding I will provide a bit more background.
The situation is that I have two images of the same scene. Between the images the camera has been translated, and possibly rotated as well. For the structure from motion I need the fundamental matrix, from which I derive the essential matrix for later use.
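(For reference, the relation I rely on is E = K^T * F * K, with K the camera matrix from the calibration. A minimal sketch of that step in the Java bindings, assuming F and K are already available as Mats:)

Mat tmp = new Mat();
Mat E = new Mat();
// tmp = K^T * F
Core.gemm(K.t(), F, 1, new Mat(), 0, tmp);
// E = K^T * F * K
Core.gemm(tmp, K, 1, new Mat(), 0, E);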
As OpenCV4Android does not provide SIFT or SURF feature matching, I am using ORB with the BRUTEFORCE_HAMMING matcher. The feature detector finds exactly 500 feature points per image (8 MP), but the matcher only yields between 0 and 3 matches... I need far more matches to get a good fundamental matrix.
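(For context, a ratio-test based matching step would look roughly like this; this is only a sketch, using the descriptor Mats desc_left/desc_right from the code below:)

List<MatOfDMatch> knnMatches = new LinkedList<MatOfDMatch>();
matcher.knnMatch(desc_left, desc_right, knnMatches, 2);
List<DMatch> ratioMatches = new LinkedList<DMatch>();
for (MatOfDMatch pair : knnMatches) {
    DMatch[] m = pair.toArray();
    // keep the best match only if it is clearly better than the second best
    if (m.length >= 2 && m[0].distance < 0.75f * m[1].distance) {
        ratioMatches.add(m[0]);
    }
}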
I am doing the whole process on undistorted images, using the camera calibration done beforehand.
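(The undistortion itself is nothing special; a sketch of what I mean, with cameraMatrix and distCoeffs standing in for my calibration results and Imgproc.undistort as in the OpenCV 2.4 Java bindings:)

Mat undistorted = new Mat();
Imgproc.undistort(distorted, undistorted, cameraMatrix, distCoeffs);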
Here is the code I am using to get the feature points and the matches:

// Create a feature detector which uses ORB features
FeatureDetector detector = FeatureDetector.create(FeatureDetector.ORB);
// Create an extractor that computes the ORB descriptors of the feature points
DescriptorExtractor extractor = DescriptorExtractor.create(DescriptorExtractor.ORB);
// Matches the described feature points of both images together using bruteforce_hamming
DescriptorMatcher matcher = DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
// Initial minimum distance between two matches
double min_dist = 80;
// Initial maximum distance between two matches
double max_dist = 10000;
/**
* Calculates the matches of KeyPoints between two images
* @param left the left image
* @param right the right image
* @param object_left output: matched points of the left image
* @param object_right output: matched points of the right image
* @return List<DMatch> list of matches between left and right
*/
private List<DMatch> computeGoodMatches(Mat left, Mat right, MatOfPoint2f object_left, MatOfPoint2f object_right){
// For the left image
MatOfKeyPoint key_left = new MatOfKeyPoint();
detector.detect(left, key_left);
publishProgress("Keypoints left: " + String.valueOf(key_left.size()));
Mat desc_left = new Mat();
extractor.compute(left, key_left, desc_left);
// For the right image
MatOfKeyPoint key_right = new MatOfKeyPoint();
detector.detect(right, key_right);
publishProgress("Keypoints right: " + String.valueOf(key_right.size()));
Mat desc_right = new Mat();
extractor.compute(right, key_right, desc_right);
// Calculate the matches (evaluating the good matches happens later in the code)
MatOfDMatch matches = new MatOfDMatch();
matcher.match(desc_left, desc_right, matches);
List<DMatch> matchesList = matches.toList();
publishProgress("Number matches: " + String.valueOf(matchesList.size()));
LinkedList<DMatch> good_matches = new LinkedList<DMatch>();
// Quick calculation of min and max distance between matches
for( int j = 0; j < desc_left.rows(); j++ ){
double dist = matchesList.get(j).distance;
if( dist < min_dist ) min_dist = dist;
if( dist > max_dist ) max_dist = dist;
}
// Only use good matches
// Good = 2*min_dist or 0.02
for(int j = 0; j < desc_left.rows(); j++){
if(matchesList.get(j).distance <= min_dist){
good_matches.addLast(matchesList.get(j));
}
}
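// (Sketch only, not part of my current code: the "Good = 2*min_dist or 0.02" rule
//  from the comment above would translate to something like
//      if (matchesList.get(j).distance <= Math.max(2 * min_dist, 0.02)) { ... }
//  whereas the loop above compares against min_dist directly.)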
List<KeyPoint> list_key_left = key_left.toList();
List<KeyPoint> list_key_right = key_right.toList();
LinkedList<Point> objList1 = new LinkedList<Point>();
LinkedList<Point> objList2 = new LinkedList<Point>();
for(int j = 0; j<good_matches.size(); j++){
objList1.addLast(list_key_left.get(good_matches.get(j).queryIdx).pt);
objList2.addLast(list_key_right.get(good_matches.get(j).trainIdx).pt);
}
object_left.fromList(objList1);
object_right.fromList(objList2);
return good_matches;
}
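With the two MatOfPoint2f filled, the next step is the robust estimation of the fundamental matrix, roughly like this (a sketch; the RANSAC parameters are only example values):

Mat F = Calib3d.findFundamentalMat(object_left, object_right, Calib3d.FM_RANSAC, 3, 0.99);

This is also why I need clearly more than the 0 to 3 correspondences I currently get: the eight-point/RANSAC estimation needs at least 8 matches, and many more to be reliable.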