Matching thermographic / non-thermographic images with OpenCV feature detectors

asked 2013-05-16 16:06:43 -0600

bschlinker

(This is a cross-post from StackOverflow, but I wanted to post here also)

I’m currently working on building software which can match infrared and non-infrared images taken from a fixed point using a thermographic camera.

The use case is the following: a photographer uses a tripod to take two pictures of a fixed point, one with an infrared thermographic camera and one with a standard camera. After taking the pictures, the photographer wants to match the images from each camera. In some scenarios an image is taken with only one camera because the other image type is unnecessary. Yes, it may be possible to match the images using timestamps, but the end user requires that they be matched using computer vision.

I've looked at other image matching posts on StackOverflow -- they often focus on histogram matching and feature detectors. Histogram matching is not an option here, as we cannot match colors between the two image types. As a result, I've developed an application which does feature detection. In addition to standard feature detection, I've added logic that rejects a pair of keypoints if they are not within a certain margin of each other (a keypoint on the far left of the query image cannot match a keypoint on the far right of the candidate image) -- this filtering occurs in stage 3 of the code below.

To give you an idea of the current output, here are a valid and an invalid match it produced -- note that the thermographic image is on the left in each case. My objective is to improve the accuracy of the matching process.

Valid match: (image)

Invalid match: (image)

Here is the code:

    // for each candidate image specified on the command line, compare it against the query image
        Mat img1 = imread(argv[1], CV_LOAD_IMAGE_GRAYSCALE); // loading query image
        for(int candidateImage = 0; candidateImage < (argc - 2); candidateImage++) {
            Mat img2 = imread(argv[candidateImage + 2], CV_LOAD_IMAGE_GRAYSCALE); // loading candidate image
            if(img1.empty() || img2.empty())
            {
                printf("Can't read one of the images\n");
                return -1;
            }

            // detecting keypoints
            SiftFeatureDetector detector;
            vector<KeyPoint> keypoints1, keypoints2;
            detector.detect(img1, keypoints1);
            detector.detect(img2, keypoints2);

            // computing descriptors
            SiftDescriptorExtractor extractor;
            Mat descriptors1, descriptors2;
            extractor.compute(img1, keypoints1, descriptors1);
            extractor.compute(img2, keypoints2, descriptors2);

            // matching descriptors
            BFMatcher matcher(NORM_L1);
            vector< vector<DMatch> > matches_stage1;
            matcher.knnMatch(descriptors1, descriptors2, matches_stage1, 2);

            // use nndr to eliminate weak matches
            float nndrRatio = 0.80f;
            vector< DMatch > matches_stage2;
            for (size_t i = 0; i < matches_stage1.size(); ++i)
            {
                if (matches_stage1[i].size() < 2)
                    continue;
                const DMatch &m1 = matches_stage1[i][0]; // best match
                const DMatch &m2 = matches_stage1[i][1]; // second-best match
                if(m1.distance <= nndrRatio * m2.distance)
                    matches_stage2.push_back(m1);
            }

            // eliminate points which are too far away from each other
            vector<DMatch> matches_stage3;
            for(size_t i = 0; i < matches_stage2.size(); i++) {
                Point queryPt = keypoints1.at(matches_stage2.at(i).queryIdx).pt;
                Point trainPt = keypoints2.at(matches_stage2.at(i).trainIdx).pt;

                // determine the lowest number here
                int lowestXAxis;
                int greaterXAxis;
                if(queryPt.x <= trainPt.x) { lowestXAxis = queryPt.x; greaterXAxis = trainPt.x; }
                else { lowestXAxis = trainPt.x; greaterXAxis = queryPt.x; }

                int lowestYAxis;
                int ...