2014-01-25 05:20:12 -0600 | commented question | Feature Descriptors always the same ? Found the problem: I was not able to read out the "data" field because it is just a pointer into memory. You can read it out if you write the cv::Mat to an XML file |
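The workaround above can be sketched without OpenCV: a minimal, self-contained illustration of serializing raw descriptor bytes to text so they can be inspected or stored, much like dumping a cv::Mat to XML with cv::FileStorage. The helper names `toHex`/`fromHex` are hypothetical, not OpenCV API, and binary ORB rows of arbitrary length are assumed.

```cpp
#include <cassert>
#include <cstdint>
#include <iomanip>
#include <sstream>
#include <string>
#include <vector>

// Serialize one binary descriptor row (e.g. a 32-byte ORB row) to a hex
// string, so the raw bytes behind the Mat's "data" pointer become readable.
std::string toHex(const std::vector<uint8_t>& desc) {
    std::ostringstream out;
    for (uint8_t b : desc)
        out << std::hex << std::setw(2) << std::setfill('0') << int(b);
    return out.str();
}

// Parse the hex string back into descriptor bytes (the reverse direction,
// e.g. when loading a stored descriptor from a database).
std::vector<uint8_t> fromHex(const std::string& hex) {
    std::vector<uint8_t> desc;
    for (size_t i = 0; i + 1 < hex.size(); i += 2)
        desc.push_back(uint8_t(std::stoi(hex.substr(i, 2), nullptr, 16)));
    return desc;
}
```

A round trip (`fromHex(toHex(d)) == d`) is what a database store/load of descriptors has to guarantee.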
2014-01-25 05:18:51 -0600 | asked a question | iOS Image Recognition Hello OpenCV developers,
Do you have any ideas about which algorithms I could use to get better recognition of the image? (SURF/SIFT does not perform better) |
2014-01-21 01:07:57 -0600 | commented question | Feature Descriptors always the same ? The data above is from before converting it for the database; it is taken right after extraction with the ORBDescriptorExtractor. I know that it works, because the matching works. But which part of the data is important for the matching? Everything looks like it is the same |
2014-01-20 16:09:24 -0600 | asked a question | Feature Descriptors always the same ? Hey, I am new to OpenCV and I'm working on an image-recognition/matching iPhone app. The aim is to take a picture, compare it against other pictures, and find the correct one. To save computation time I wanted to store the descriptors of the images in a database, so I created some small classes to handle that. But when I looked at my database, every entry was exactly the same. I don't think this was a fault in my code: I compared the cv::Mat descriptor output for all images and it looks exactly equal. I know that my image recognition works quite well, but how can that work without different descriptors? I am using ORB feature detection and ORB descriptor extraction. In my picture you can see the output for my image. Which of these values are important for the BFMatcher or the FlannBasedMatcher to compare the images? Here's the output of a different image: As you can see, it's exactly the same. So how can the matcher compare two exactly identical descriptors? Here's some sample code: |
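A plausible explanation for "everything looks the same" is that a debugger or log only shows the cv::Mat header (rows, cols, type), which really is identical for every ORB extraction, while the matcher compares the underlying byte buffer behind the data pointer. A minimal stand-in sketch (`DescriptorMat` is a hypothetical struct for illustration, not OpenCV's cv::Mat; type 0 stands for CV_8U):

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Minimal stand-in for the parts of a cv::Mat a debugger usually shows:
// the header fields plus the data buffer they point at.
struct DescriptorMat {
    int rows, cols, type;       // header: same for every ORB run (e.g. N x 32, CV_8U)
    std::vector<uint8_t> data;  // the actual descriptor bytes
};

// Two descriptor sets from different images typically share an identical
// header, so header fields alone make them "look the same".
bool sameHeader(const DescriptorMat& a, const DescriptorMat& b) {
    return a.rows == b.rows && a.cols == b.cols && a.type == b.type;
}

// Only the data buffer distinguishes them -- and that buffer is what
// BFMatcher / FlannBasedMatcher actually compares.
bool sameData(const DescriptorMat& a, const DescriptorMat& b) {
    return a.data == b.data;
}
```

So identical-looking headers do not imply identical descriptors; dumping the data buffer (e.g. via FileStorage, as noted in the later comment) is the way to check.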