# How to interpret predict parameters obtained from KNN?

I have trained KNN to recognise object A as class 1.

When I call `knn->findNearest(data, m, response, neighbor, dist);`

I get a response of 0, with neighbours `[1, 0, ...]` and distances `[0, 0.23333]`.

What does this mean? Do I have to use Euclidean distance to do the matching?


( 2016-04-07 11:59:54 -0500 )

I have already read that. What I am asking is: if the distance to class 1 is 0, why am I getting a response of class 0 instead of class 1?

( 2016-04-07 12:07:57 -0500 )

That wasn't what you asked; you just threw out some numbers and asked what they meant. Anyway, the docs say "In case of classification, the class is determined by voting" (I assume you're doing classification). Take a look at this tutorial to get a better grasp of KNN, and of why a distance of zero to one class does not necessarily mean the sample belongs to that class.

( 2016-04-07 12:24:31 -0500 )

I have not thrown out random numbers. I am working with KNN and ran into this problem, which is why I am asking. I do not understand how to get the correct label. Thanks for the link. It is the very sample with which I trained the data, so KNN should output label 1.

( 2016-04-07 12:29:46 -0500 )

Are you using that very same sample code? If so, given that both the training and test data are randomly generated, I don't see why you should expect the same results at all. And about the other issue: just saying that you should post clear, detailed questions in the future.

( 2016-04-07 12:40:51 -0500 )

if you get mispredictions here:

• probably K is too large
( 2016-04-07 23:48:53 -0500 )


KNN uses the K nearest neighbors. So if you are using a k of, say, 7, and the labels of your sample's nearest neighbors are [1, 0, 0, 0, 0, 0, 0], it's going to return a label of 0, even if the distance to the first point is 0. KNN is designed to be noise tolerant, which includes tolerating occasionally mislabeled points.
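To make the voting concrete, here is a minimal pure-Python sketch of KNN majority voting. This is not OpenCV's implementation, and the point set and labels are made up to mirror the situation above: one training point of class 1 lies exactly on the query, surrounded by class-0 points.

```python
from collections import Counter

def knn_predict(train_pts, train_labels, query, k):
    # Sort training points by squared Euclidean distance to the query.
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(p, query)), lbl)
        for p, lbl in zip(train_pts, train_labels)
    )
    # The predicted class is the majority vote among the k nearest labels.
    votes = Counter(lbl for _, lbl in dists[:k])
    return votes.most_common(1)[0][0], dists[:k]

# One class-1 point sits exactly on the query; six class-0 points surround it.
train_pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1),
             (0.2, 0.0), (0.0, 0.2), (0.2, 0.2)]
train_labels = [1, 0, 0, 0, 0, 0, 0]

label, nearest = knn_predict(train_pts, train_labels, (0.0, 0.0), k=7)
print(label)       # 0: six class-0 votes beat the single class-1 vote
print(nearest[0])  # (0.0, 1): the nearest neighbour has distance 0 and label 1

label1, _ = knn_predict(train_pts, train_labels, (0.0, 0.0), k=1)
print(label1)      # 1: with k=1 only the zero-distance point gets a vote
```

This also illustrates the earlier comment that K may be too large: the same query that votes to 0 with k=7 returns 1 with k=1.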

Some implementations weight the votes by distance, but there are problems with that (for example: what weight do you use for a distance of 0, how do you handle discrete spaces, and how do you weight different dimensions differently?), and I don't think OpenCV's KNN does that.
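For contrast, distance-weighted voting can be sketched as below. This is not what OpenCV does; the epsilon is a made-up workaround for the zero-distance problem, and the data is the same invented example as above.

```python
def weighted_knn_predict(train_pts, train_labels, query, k, eps=1e-9):
    # Sort training points by squared Euclidean distance to the query.
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(p, query)), lbl)
        for p, lbl in zip(train_pts, train_labels)
    )
    scores = {}
    for d, lbl in dists[:k]:
        # 1/d blows up at d == 0, so a small epsilon is added; choosing it
        # (like scaling the dimensions) is exactly the kind of arbitrary
        # decision that makes distance weighting problematic.
        scores[lbl] = scores.get(lbl, 0.0) + 1.0 / (d + eps)
    return max(scores, key=scores.get)

# Made-up data: one zero-distance class-1 point, six nearby class-0 points.
train_pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1),
             (0.2, 0.0), (0.0, 0.2), (0.2, 0.2)]
train_labels = [1, 0, 0, 0, 0, 0, 0]
print(weighted_knn_predict(train_pts, train_labels, (0.0, 0.0), k=7))  # 1
```

Here the zero-distance neighbour gets an enormous weight, so the weighted vote picks class 1 where the plain majority vote picked class 0.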



That's clearly explained in the previously linked tutorial.

( 2016-04-08 01:27:49 -0500 )
