OpenCV returning NULL Descriptor Matrix
I am trying to use OpenCV to generate descriptors at keypoints in an image on iOS. I have done the feature detection with my own algorithm and now want to extract the descriptors at those points.
I implemented the following code to do this:
// Convert the BGRA camera frame to a single-channel grayscale cv::Mat.
cv::Mat frame = LH_ImageProcessing::extractGrayFromBGRA(Img, 480, 640);

std::vector<cv::KeyPoint> keyPoints;
cv::Mat descriptors;

// Wrap my own detector's corner coordinates as cv::KeyPoint with size 5.
for (int i = 0; i < cornersDetected; i++) {
    keyPoints.push_back(cv::KeyPoint((float) cornerArray[i*2], (float) cornerArray[i*2+1], 5));
}

cv::FREAK extractor;
extractor.compute(frame, keyPoints, descriptors);
However, the descriptors Mat is always empty after I run the "compute" function: all of its data pointers are NULL. I can clearly see that the keyPoints vector has shrunk after the call, which means compute is removing keypoints it cannot extract descriptors for.
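To rule out a debugger display issue, a quick check like this (reusing frame, keyPoints and descriptors from the snippet above; the printout itself is just an illustrative sketch) can confirm whether the matrix is genuinely empty rather than only appearing so in the debugger:

#include <iostream>

// Inspect what compute() actually produced instead of relying on how
// the debugger renders the cv::Mat internals.
std::cout << "keypoints after compute: " << keyPoints.size() << std::endl;
std::cout << "descriptors: " << descriptors.rows << " x " << descriptors.cols
          << " (empty: " << descriptors.empty() << ")" << std::endl;
// FREAK drops keypoints whose sampling pattern falls outside the image
// border, so descriptors.rows should match the reduced keypoint count.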
I thought the problem was my implementation, so I used a built-in detector (SurfDetector) and copied the setup from the OpenCV FREAK example, but I end up with the same result.
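For reference, the detect-then-compute structure I am mirroring looks roughly like this. This is only a sketch wrapped in a hypothetical detectAndDescribe function: I have swapped in FastFeatureDetector from features2d so it builds without the nonfree SURF module, and the threshold value is illustrative:

#include <vector>
#include <opencv2/core/core.hpp>
#include <opencv2/features2d/features2d.hpp>

// Standard OpenCV 2.4 detect-then-compute flow with a built-in detector.
void detectAndDescribe(const cv::Mat& gray)
{
    std::vector<cv::KeyPoint> keyPoints;
    cv::Mat descriptors;

    // Built-in detector supplies keypoints with sizes FREAK can handle.
    cv::FastFeatureDetector detector(40); // threshold chosen arbitrarily
    detector.detect(gray, keyPoints);

    // FREAK removes keypoints it cannot describe (e.g. too close to the
    // border) and writes one 64-byte row per surviving keypoint.
    cv::FREAK extractor;
    extractor.compute(gray, keyPoints, descriptors);
}

If this flow also leaves descriptors with zero rows, the problem is unlikely to be in how I wrap my own corners into cv::KeyPoint.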
Is anyone else having issues, or have I missed something fundamental in OpenCV?
Can you step into the compute function with a debugger?
Nope. The OpenCV library provided for iOS is precompiled without debug flags. I have tried numerous times to compile it from source, but it never comes out right, and to be honest I just don't have the time.