
K-Nearest Neighbors, SURF and classifying images.

asked 2013-11-17 12:52:38 -0600

r3xus


I've been tasked with the problem of creating a system capable of classifying two sets of images. These images are going to be either natural (landscapes) or man-made (buildings). Initially I'm supplied with a set of N images from one class and N from the other - the training set.

The way I imagine this system working is to extract features with something like SURF and classify them using KNN. The thing is, I don't quite get how to combine these two algorithms.

What I understand is that I can detect features with the detect function of SurfFeatureDetector. I can also extract descriptors of these features with SurfDescriptorExtractor's compute function. What I don't understand is how to send these features to KNN.

I presume that when one wants to use KNN in OpenCV, the extracted feature descriptors are the trainData parameter of KNN, but what is the responses parameter? What do I pass there?

Also, if my approach of using SURF is not optimal, I would be very grateful if somebody suggested something different. Just bear in mind that I'm very new to the field of Computer Vision, so anything terribly complex probably wouldn't suit me.



1 answer


answered 2013-11-17 14:05:38 -0600

Guanta

Okay, maybe let's first answer the KNN question. If you have a matrix of feature vectors and corresponding labels, you can train your KNN classifier like any other classifier with the train() method. Afterwards, you can use the findNearest() method on an unknown test feature to predict its label, specifying how many k nearest neighbors you want to consider (majority vote).

The responses parameter is a one-dimensional matrix (i.e. just one row or one column) which contains the labels for your classes. For your problem, choose either (-1, 1) or (0, 1) as class labels. For each descriptor, one corresponding class label needs to be assigned.
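To make the trainData/responses pairing concrete, here is a minimal NumPy sketch of the same idea (the feature values and labels are made up for illustration; OpenCV's KNN classifier consumes the same two matrices, one feature row and one label per image):

```python
import numpy as np

def knn_predict(train_data, responses, sample, k=3):
    """Majority vote among the k nearest training vectors (Euclidean distance)."""
    dists = np.linalg.norm(train_data - sample, axis=1)
    nearest = responses[np.argsort(dists)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]

# trainData: one row per image feature vector; responses: one label per row.
train_data = np.array([[0.1, 0.2], [0.0, 0.3], [0.9, 0.8], [1.0, 0.7]], dtype=np.float32)
responses = np.array([0, 0, 1, 1], dtype=np.int32)   # 0 = natural, 1 = man-made

print(knn_predict(train_data, responses, np.array([0.95, 0.75]), k=3))  # -> 1
```

The key point is the row-for-row correspondence: row i of trainData is described by entry i of responses.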

So, because you don't want to do this for every local feature of one image (otherwise your result will be very inaccurate!), don't use local features directly here but global ones, e.g. global histogram features or the popular GIST feature. Alternatively (the state of the art), you can use local features (typically densely sampled) to form a Bag-of-(visual)-Words descriptor (short: BoW or BoVW). OpenCV also has BoW integrated; see the corresponding classes in the features2d module.
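The BoW quantization step itself is simple once a vocabulary exists. Here is a NumPy sketch, with a tiny hand-picked toy vocabulary standing in for the k-means cluster centres you would normally learn from training-set descriptors (OpenCV's BOW classes wrap the same idea): each local descriptor is assigned to its nearest visual word, and the word counts become one global histogram per image.

```python
import numpy as np

def bow_histogram(descriptors, vocabulary):
    """Assign each local descriptor to its nearest visual word,
    then count the words -> one global, normalized histogram."""
    # Pairwise distances, shape (n_descriptors, n_words).
    d = np.linalg.norm(descriptors[:, None, :] - vocabulary[None, :, :], axis=2)
    words = np.argmin(d, axis=1)
    hist = np.bincount(words, minlength=len(vocabulary)).astype(np.float32)
    return hist / hist.sum()

# Toy vocabulary of 3 "visual words" (in practice: k-means centres of SURF descriptors).
vocab = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]], dtype=np.float32)
desc = np.array([[0.1, 0.0], [0.9, 0.1], [1.1, -0.1], [0.0, 0.9]], dtype=np.float32)
print(bow_histogram(desc, vocab))  # word counts 1, 2, 1 -> [0.25, 0.5, 0.25]
```

That histogram is then the single global feature vector you feed to KNN, one row per image.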



Thank you very much for your wonderful answer! I now have a grasp on how KNN works. Just for the sake of it, I'm trying to extract the number of straight lines in each image, the number of corners, and the histogram mean, and classify with that feature vector. If I get that implemented, I'm going to move on to what you suggest (BoW). Thanks again, dude!

r3xus (2013-11-18 07:55:26 -0600)

Hi, I am using KNN with BoW for a classification problem. My question is how I can add a decision threshold to the KNN classifier. I mean: if the value is higher than a threshold, consider that the classification has been done well; if not, show the best two classes of the results.

aee (2016-05-26 05:01:50 -0600)

You can get the neighbor labels and distances from findNearest()'s output parameters, but you'd need to analyze them on your own.
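As a sketch of that analysis (NumPy only; the distance threshold value is made up and would need tuning on your data): accept the majority vote when the mean neighbour distance is below the threshold, otherwise fall back to reporting the two most frequent neighbour classes.

```python
import numpy as np

def knn_with_reject(train_data, responses, sample, k=3, max_mean_dist=0.6):
    """kNN majority vote, but fall back to the top-2 candidate classes
    when the neighbours are too far away to be trusted."""
    dists = np.linalg.norm(train_data - sample, axis=1)
    order = np.argsort(dists)[:k]
    labels, counts = np.unique(responses[order], return_counts=True)
    ranked = labels[np.argsort(counts)[::-1]]        # classes, most frequent first
    if dists[order].mean() <= max_mean_dist:
        return ranked[0], None                       # confident: single class
    return tuple(ranked[:2])                         # uncertain: best two classes

train_data = np.array([[0.0, 0.0], [0.1, 0.1], [1.0, 1.0], [1.1, 0.9]], dtype=np.float32)
responses = np.array([0, 0, 1, 1], dtype=np.int32)

print(knn_with_reject(train_data, responses, np.array([0.05, 0.05])))  # -> (0, None)
print(knn_with_reject(train_data, responses, np.array([0.5, 0.5])))    # -> (0, 1)
```

With OpenCV you would get the equivalent of `dists` and the neighbour labels from findNearest()'s output arguments instead of computing them by hand.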

Guanta (2016-06-08 16:11:36 -0600)

Hi, can anyone help me with code for BoVW with KNN?

aqsa (2017-11-07 11:40:12 -0600)
