
Features2D example

asked 2015-05-02 06:45:31 -0500

LBerger

updated 2015-05-03 15:07:17 -0500


I have started writing an example about features2d and matching, and I have got some questions. First question: is it a good program (calls in the right order, graphical result, dynamicCast, ...)? Second question, about the create method: I use ORB::create and BRISK::create. Is it possible to write something like create("ORB")? Third question: matches are saved in a file. On screen I can see

0       237     0       808.622
1       208     0       676.574
2       220     0       558.299
3       15      0       297.963

in the file I have got

  0 237 0 1145710537 1 208 0 1143547076 2 220 0 1141609254 3 15 0

I don't understand. Have you got the same results?

Thanks for your help.

#include <opencv2/opencv.hpp>
#include <vector>
#include <iostream>

using namespace std;
using namespace cv;

int main(void)
{
    vector<String> typeDesc;
    vector<String> typeAlgoMatch;
    vector<String> fileName;
    // Descriptors to test
    typeDesc.push_back("AKAZE");
    typeDesc.push_back("ORB");
    typeDesc.push_back("BRISK");
    // Match methods to test
    typeAlgoMatch.push_back("BruteForce");
    typeAlgoMatch.push_back("BruteForce-L1");
    typeAlgoMatch.push_back("BruteForce-Hamming");
    typeAlgoMatch.push_back("BruteForce-Hamming(2)");

    String  dataFolder("../data/");
    fileName.push_back("basketball1.png");  // images from the samples data folder
    fileName.push_back("basketball2.png");

    Mat img1 = imread(dataFolder+fileName[0], IMREAD_GRAYSCALE);
    Mat img2 = imread(dataFolder+fileName[1], IMREAD_GRAYSCALE);

    Ptr<Feature2D> b;
    vector<double> desMethCmp;      /*<! Cumulative distance for each descriptor/matcher pair */

    vector<String>::iterator itDesc;
// Descriptor loop
    for (itDesc = typeDesc.begin(); itDesc != typeDesc.end(); itDesc++)
    {
        Ptr<DescriptorMatcher> descriptorMatcher;
        vector<DMatch> matches;         /*<! Matches between img1 and img2 */
        vector<KeyPoint> keyImg1;       /*<! keypoints for img1 */
        vector<KeyPoint> keyImg2;       /*<! keypoints for img2 */
        Mat descImg1, descImg2;         /*<! Descriptors for img1 and img2 */
        vector<String>::iterator itMatcher = typeAlgoMatch.end();
        if (*itDesc == "AKAZE")
            b = AKAZE::create();
        else if (*itDesc == "ORB")
            b = ORB::create();
        else if (*itDesc == "BRISK")
            b = BRISK::create();
        try
        {
            // Two ways to get keypoints and descriptors: detect then compute...
            b->detect(img1, keyImg1, Mat());
            b->compute(img1, keyImg1, descImg1);
            // ...or both in one call
            b->detectAndCompute(img2, Mat(), keyImg2, descImg2, false);
 // Match method loop
            for (itMatcher = typeAlgoMatch.begin(); itMatcher != typeAlgoMatch.end(); itMatcher++)
            {
                descriptorMatcher = DescriptorMatcher::create(*itMatcher);
                descriptorMatcher->match(descImg1, descImg2, matches, Mat());
 // Keep best matches only to have a nice drawing
                Mat index;
                Mat tab((int)matches.size(), 1, CV_32F);
                for (int i = 0; i < (int)matches.size(); i++)
                    tab.at<float>(i, 0) = matches[i].distance;
                sortIdx(tab, index, SORT_EVERY_COLUMN + SORT_ASCENDING);
                vector<DMatch> bestMatches; /*<! best matches */
                for (int i = 0; i < 30; i++)
                    bestMatches.push_back(matches[index.at<int>(i, 0)]);

                Mat result;
                drawMatches(img1, keyImg1, img2, keyImg2, bestMatches, result);
                namedWindow(*itDesc + ": " + *itMatcher, WINDOW_AUTOSIZE);
                imshow(*itDesc + ": " + *itMatcher, result);
                FileStorage fs(*itDesc+"_"+*itMatcher+"_"+fileName[0]+"_"+fileName[1]+".xml", FileStorage::WRITE);
                fs << "Matches" << bestMatches;

                vector<DMatch>::iterator it;
                cout << "Index \tIndex \tdistance\n";
                cout << "in img1\tin img2\n";
                double cumSumDist2 = 0;
                for (it = bestMatches.begin(); it != bestMatches.end(); it++)
                {
                    cout << it->queryIdx << "\t" << it->trainIdx << "\t" << it->distance << "\n";
                    Point2d p = keyImg1[it->queryIdx].pt - keyImg2[it->trainIdx].pt;
                    cumSumDist2 += sqrt(p.x*p.x + p.y*p.y);
                }
                desMethCmp.push_back(cumSumDist2);
                waitKey();
            }
        }
        catch (Exception& e)
        {
            cout << e.msg << "\n";
            cout << "Feature : " << *itDesc << "\n";
            if (itMatcher != typeAlgoMatch.end())
                cout << "Matcher : " << *itMatcher << "\n";
        }
    }
// Summary: cumulative keypoint distance per descriptor / match method
    int i = 0;
    cout << "\n\t";
    for (vector<String>::iterator itMatcher = typeAlgoMatch.begin(); itMatcher != typeAlgoMatch.end(); itMatcher++)
        cout << *itMatcher << "\t";
    cout << "\n";
    for (itDesc = typeDesc.begin(); itDesc != typeDesc.end(); itDesc++)
    {
        cout << *itDesc << "\t";
        for (vector<String>::iterator itMatcher = typeAlgoMatch.begin(); itMatcher != typeAlgoMatch.end(); itMatcher++, i++)
            cout << desMethCmp[i] << "\t";
        cout << "\n";
    }
    return 0;
}



Does this version work for you instead of using dynamicCast?

Ptr<DescriptorMatcher> matcher = DescriptorMatcher::create("BruteForce");
Eduardo ( 2015-05-02 15:12:08 -0500 )

@Eduardo Yes it does

LBerger ( 2015-05-02 15:46:55 -0500 )

It is no longer possible in OpenCV 3.0 to use something like (link here and here):

Ptr<Feature2D> feature = Feature2D::create("ORB");

So your code is correct.

Eduardo ( 2015-05-02 16:01:54 -0500 )

are you trying to add a sample? good ;)

  • opencv2/opencv.hpp should be enough to include
  • rather use cv::String than std::string
  • your vector<> initialization requires c++11, not everyone has it.
  • gnu-style indentation of braces - they hate it (so do i :)
  • i get the same numbers in the filestorage, this looks like a bug (DMatch gets treated there as a vector of ints, written with "4i" as format string, so bogus value for the float DMatch::distance)
berak ( 2015-05-03 01:22:44 -0500 )

@berak I have modified the source file.

LBerger ( 2015-05-03 03:54:09 -0500 )

good! it also might be better to write the xml files to your local folder, not to samples/data

berak ( 2015-05-03 03:58:29 -0500 )

A few more comments:

  • you can keep the if / else if version when you check for the feature name
  • if you plan to use the same method for feature detection / extraction, you can use detectAndCompute directly (as you did for image 2), I think
  • be careful, not all descriptor matcher methods are suitable for every type of descriptor: you seem to use only binary descriptors for now, so you should use BruteForce-Hamming. The 2nd version of the Hamming distance is for certain configurations of ORB.
Eduardo ( 2015-05-03 10:18:34 -0500 )

@Eduardo About the two methods, it's only to show that there are two ways to do the same thing. About your last remark: I catch the exception, and there is no exception on my computer, so all matcher methods can be called for all descriptors. Maybe you're right that in the source file of this method hamming = hamming(2), I don't know. On the result images I can see some difference between hamming and hamming(2) for all descriptors; maybe the results are different for another reason. I have created a pull request for this sample. Must I delete hamming(2)?

LBerger ( 2015-05-03 12:55:59 -0500 )

Some late comments after I saw the pull request. I still don't see the point of testing the different binary descriptors with all the matching methods. For me, it is nonsense to use a matching method other than BruteForce-Hamming with these binary descriptors, as we will always end up intrinsically with a wrong result. It will only confuse a novice user.

Other point: you have the cumulative distance (train/query point distance) for each matching method. How could you decide which one is the most appropriate? If you really want to show the possible matching methods, I think the best is to choose two images with a known homography and to show that the distance error between the train match point and the true corresponding point is bigger with the inappropriate matching methods.

Eduardo ( 2015-05-07 05:07:23 -0500 )

2 answers


answered 2015-05-06 05:44:16 -0500

berak

updated 2015-05-06 05:45:10 -0500

hmm, as long as OpenCV can't serialize the DMatches properly, either don't save them, or come up with a better way?

void write(FileStorage& fs, const String& objname, const std::vector<DMatch>& matches) {
    fs << objname << "[";
    for (size_t i=0; i<matches.size(); i++) {
        const DMatch &dm = matches[i];
        cv::write(fs, dm.queryIdx);
        cv::write(fs, dm.trainIdx);
        cv::write(fs, dm.imgIdx);
        cv::write(fs, dm.distance);
    }
    fs << "]";
}

void read(FileStorage& fs, const String& objname, std::vector<DMatch>& matches) {
    FileNode pnodes = fs[objname];
    for (FileNodeIterator it=pnodes.begin(); it!=pnodes.end(); ) {
        DMatch dm;
        *it >> dm.queryIdx; it++;
        *it >> dm.trainIdx; it++;
        *it >> dm.imgIdx;   it++;
        *it >> dm.distance; it++;
        matches.push_back(dm);
    }
}

ostream& operator << (ostream &out, const std::vector<DMatch>& matches) {
    out << "[" ;
    for (size_t i=0; i<matches.size(); i++) {
        const DMatch &dm = matches[i];
        out << dm.queryIdx << "," << dm.trainIdx << "," << dm.imgIdx << "," << dm.distance << ";" << endl;
    }
    out << "]" << endl;
    return out;
}

//// optionally, read it back in:
//int main2(void)
//{
//    vector<DMatch> matches2;
//    FileStorage fs2("some.xml", FileStorage::READ);
//    read(fs2,"Matches",matches2);
//    cout << matches2 << endl;
//    fs2.release();
//    return 0;
//}


Additional info: for custom classes, I just found this tutorial (File Input and Output using XML and YAML files) on how to serialize to XML or YAML.

Eduardo ( 2015-05-06 08:49:11 -0500 )

@berak I have created a pull request with the xml file saved. In my opinion I should write a comment about this bug and hope that it will be fixed.

LBerger ( 2015-05-06 12:47:38 -0500 )

yes, seen that. (did not want to inflate the comments there further, steven already had his share, hehe ;))

(hmm, should we pull straws now, who's to do the fix?)

berak ( 2015-05-06 13:10:58 -0500 )

I don't understand everything you write (my English is bad), but about my sample: I have just pushed a comment in the code and changed some file names in the yml path. This example is about ORB... so the yml is not useful; maybe I can delete these three lines. About bug 4308, I have tried to debug VecWriterProxy in persistence.hpp. I don't understand these two lines:

   int _fmt = DataType<_Tp>::fmt;
   char fmt[] = { (char)((_fmt >> 8) + '1'), (char)_fmt, '\0' };

How can we have 4i and 4f in fmt? Actually there is only 4i.

LBerger ( 2015-05-06 13:30:50 -0500 )

your diagnosis is correct.

it can't have different types in the same datatype (that's the real bug behind it)

berak ( 2015-05-06 13:45:01 -0500 )

answered 2015-05-03 16:23:45 -0500

Eduardo

updated 2015-05-03 16:40:48 -0500

I will try to elaborate a little more on my comment about matching methods. It could be useful for someone else reading this. I hope I won't say too many wrong things.

First of all, a descriptor can be seen basically as an array of numbers that encode the local information around the corresponding keypoint.

For floating point descriptors (like SIFT or SURF), we can compute the distance between them with the:

Euclidean distance

This is the classical Euclidean distance between two vectors.

Binary descriptors use a binary string to encode the local information, and the suitable distance to use is the Hamming distance (the number of different bits):

Hamming distance

One of the advantages of binary descriptors over the previous method is the CPU cost involved in the distance computation, and thus in the matching process. The XOR operation on two arrays followed by a bit count should be less expensive than computing the L2 norm on current CPU architectures.

For example, descriptor 1 is 10010 and descriptor 2 is 11010. If we treat the binary string as an array of 5 numbers, the Euclidean distance would be 1, and the Hamming distance would also be 1.

While in theory we get the same result (before the square root operation), it can be different on a computer, depending on how the numbers are represented. In OpenCV, this binary string seems to be represented by a vector of uchar (8 bits) (CV_8U): 10010 becomes 18 and 11010 becomes 26. The Euclidean distance would then be 8, while the Hamming distance would still be 1.

For example, we have a descriptor1=00010010 (18 dec) and we want to match it to the closest descriptors, descriptor2=00011010 (26 dec) or descriptor3=00010111 (23 dec).

normL2(descriptor1, descriptor2) //== 8
normL2(descriptor1, descriptor3) //== 5
normHamming(descriptor1, descriptor2) //==1
normHamming(descriptor1, descriptor3) //==2

This would lead to match descriptor1 to a different descriptor according to the matching method.

Some test code:

#include <iostream>
#include <opencv2/opencv.hpp>    

int main() {
  uchar desc1 = 0x12; //18 dec ; 10010
  uchar desc2 = 0x1A; //26 dec ; 11010
  uchar desc3 = 0x17; //23 dec ; 10111

  cv::Mat M1 = (cv::Mat_<float>(1,1) << desc1);
  cv::Mat M2 = (cv::Mat_<float>(1,1) << desc2);
  cv::Mat M3 = (cv::Mat_<float>(1,1) << desc3);
  std::cout << "NORM_L2=" << cv::norm(M1, M2, cv::NORM_L2) << std::endl; //8
  std::cout << "NORM_L2=" << cv::norm(M1, M3, cv::NORM_L2) << std::endl; //5

  cv::Mat M4 = (cv::Mat_<uchar>(1,1) << desc1);
  cv::Mat M5 = (cv::Mat_<uchar>(1,1) << desc2);
  cv::Mat M6 = (cv::Mat_<uchar>(1,1) << desc3);
  std::cout << "NORM_HAMMING=" << cv::norm(M4, M5, cv::NORM_HAMMING) << std::endl; //1
  std::cout << "NORM_HAMMING=" << cv::norm(M4, M6, cv::NORM_HAMMING) << std::endl; //2

  cv::Ptr<cv::DescriptorMatcher> matcher = cv::DescriptorMatcher::create("BruteForce");
  std::vector<cv::DMatch> matches;
  matcher->match(M4, M5, matches);
  std::cout << "Distance=" << matches[0].distance << std::endl; //8

  matcher = cv::DescriptorMatcher::create("BruteForce-Hamming");
  matcher->match(M4, M5, matches);
  std::cout << "Distance=" << matches[0].distance << std::endl; //1

  return 0;
}

PS1: It could be easy to add ...


Seen: 2,233 times

Last updated: May 06 '15