
OpenCV Distance Metrics

asked 2012-11-27 02:47:13 -0600

aliciadominica

Hello everyone, I'm running a benchmark test on keypoint detectors, descriptors, and the matching between them for a research project. I'm aware that there are great benchmark tests out there, but this one is for a specific experimental environment. The descriptors will be ORB and FREAK, and the detectors are ORB, SURF, and maybe FAST. I will use BruteForceMatcher.

So, to make the test fair, I decided to use approximately the same number of keypoints and the same distances. But I can't seem to find a detailed explanation of the distance metrics used in OpenCV.

For instance, when I use `BruteForceMatcher<L2<float> > matcher;` with ORB as both keypoint detector and descriptor, it gives me a correct match between two points with coordinates point1(130,339) and point2(130,340). Obviously the distance between those two points is 1, but when I look at the matches vector, the distance value is 272.71964, which is very confusing to me.

My question is: is there any documentation that explains why this is the case? I googled it but haven't found a decent explanation. If not, I would really appreciate it if you could explain this.

Thank you


2 answers


answered 2012-11-27 05:26:53 -0600

Ben

Venky gave you the right answer; I will just explain it in a little more detail. If you were comparing pixel locations, a best match would always have the same pixel coordinates. That only works when the images you are comparing are identical or nearly identical. But suppose one image is cropped: your best match would then have an offset. So you don't want to compare pixel locations to decide whether a match is good or not.

What you really want is to compare the corresponding descriptor vectors. Every keypoint has a location and a descriptor, which describes the surroundings of that pixel. The descriptor vectors have a fixed length to make them easy to compare. When the distance between two descriptor vectors is very small, it means the surroundings of the two keypoints are very similar. Typical distance metrics are, for example, Euclidean distance for float vectors and Hamming distance for binary descriptor vectors.

I hope this helps you to understand how keypoint matches are computed.



Thanks, that was very informative. I was comparing consecutive video frames continuously until the end of the video, and I have switched to using Euclidean distance, which performs better in terms of accuracy. For the purpose of my experiment I think Euclidean distance is a better fit than what you just explained, but the distance between descriptor vectors seems very useful for other purposes, where images differ from each other more than two consecutive video frames do. Thanks again.

aliciadominica ( 2012-11-27 05:42:49 -0600 )

I don't know what you are trying to do, but you might want to have a look at the calcOpticalFlow* methods (e.g. calcOpticalFlowPyrLK) and/or DescriptorMatcher::radiusMatch.

Ben ( 2012-11-27 06:34:02 -0600 )

Thanks, I did use radiusMatch, and it improved the results over simple matching. I needed the optical flow info too, so thanks for that :)

aliciadominica ( 2012-11-27 06:38:40 -0600 )

answered 2012-11-27 05:12:13 -0600

venky

Hi, I am guessing, but I think the distance it is showing is the distance between the two descriptors (vectors), not between the pixel locations. Note: I have not used any of the functions above. Thanks, Venky



Last updated: Nov 27 '12