# KNearest slow? C++

Hi guys, I need to use KNearest from OpenCV in C++. I read the docs, but I didn't understand them well.

I have a confidence map (a float `Mat`) and a set of 40 confidence feature maps (an array of 40 float `Mat`s) associated with it, so each value of the confidence map is associated with 40 feature values.

For each pixel of the confidence map, I need to find the mean value of its k nearest neighbours, using the confidence features as the distances for the k-NN search.
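To make the goal concrete, here is a small brute-force sketch in plain C++ of the computation I mean. The function name, the per-dimension weighting, and all parameters are my own illustration, not from OpenCV or the paper:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <utility>
#include <vector>

// Each pixel has a feature vector; for a query pixel, find the k pixels
// nearest in (weighted) feature space and average their confidence values.
float weightedKnnMean(const std::vector<std::vector<float>>& features,
                      const std::vector<float>& confidences,
                      const std::vector<float>& dimWeights,
                      std::size_t query, int k)
{
    const std::size_t n = features.size();
    std::vector<std::pair<float, std::size_t>> dist(n);
    for (std::size_t i = 0; i < n; ++i) {
        float d = 0.f;
        for (std::size_t j = 0; j < features[i].size(); ++j) {
            float diff = features[i][j] - features[query][j];
            d += dimWeights[j] * diff * diff;  // weighted squared L2 distance
        }
        dist[i] = {d, i};
    }
    // Keep only the k smallest distances (the query itself is included).
    std::partial_sort(dist.begin(), dist.begin() + k, dist.end());
    float sum = 0.f;
    for (int i = 0; i < k; ++i)
        sum += confidences[dist[i].second];
    return sum / static_cast<float>(k);
}
```

This is O(n) per pixel, so O(n²) over the whole map, which is why a naive implementation is so slow on a full image.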

I tried to implement this in C++. It works, in a way, but it is very slow (roughly 1-2 hours to complete a run).

    Ptr<ml::KNearest> knn(ml::KNearest::create());
    knn->setIsClassifier(false);

    int num_samples = rows * cols;

    Mat samples;   // Each row holds the 40 features of one pixel
    samples.create(num_samples, 40, CV_32F);

    Mat response;  // Row i of samples corresponds to the (0, i) value of this array (or not?)
    response.create(1, num_samples, CV_32F);

    // Fill the samples and responses
    num_samples = 0;
    for (int r = 0; r < rows; r++) {
        for (int c = 0; c < cols; c++) {

            response.at<float>(0, num_samples) = _confidence_map.at<float>(r, c);

            for (int k = 0; k < 40; k++) {
                samples.at<float>(num_samples, k) = feature_array[k].at<float>(r, c);
            }
            num_samples++;
        }
    }

    // Training is very fast
    knn->train(samples, cv::ml::ROW_SAMPLE, response);

    Mat result;
    knn->findNearest(samples, 20, result);  // This call is the slow part


Am I doing something wrong? Thanks for the help.


And for each pixel I need to find the 20 nearest neighbours.

maybe this needs some more explanation? do you mean colors? positions?

what are you trying to achieve, in general, here?

also, we need some numbers. how many train samples do you have? and how many classes? and why 20?

(e.g.: if you only have 3 samples per class, using a K of 20 won't work)

( 2018-09-21 05:58:29 -0500 )

Yeah, sorry, I wasn't very clear. I edited the question; I hope it's clearer now.

I'm trying to implement this from a paper that is not very clear, so I'm a bit confused, sorry.

( 2018-09-21 06:38:58 -0500 )

somewhat better, but hmmm.

do you have a link to the paper, maybe? (so we can see for ourselves?)

KNearest is a classification mechanism, so, what are your classes/responses/labels ?

(i'm somewhat thinking, you wanted (unlabelled) clustering instead, but no idea without further explanation)

( 2018-09-21 06:41:57 -0500 )

I'm sorry, but it's paywalled; I'll leave the link here anyway: https://ieeexplore.ieee.org/document/...

( 2018-09-21 06:55:23 -0500 )

From the paper: "The features are then enhanced through adaptive filtering in the feature domain. In addition, the resulting confidence map, estimated using the confidence features with a random regression forest, is further improved through a K-nearest neighbor based aggregation scheme on both the pixel and superpixel level."