KNearest slow? C++

asked 2018-09-21 05:51:59 -0500 by alibi12

updated 2018-09-22 03:27:20 -0500

Hi guys, I need to use KNearest from OpenCV in C++. I read the docs, but I didn't understand them well.

I have a confidence map (a Mat of float) and a set of 40 confidence feature maps (Mat_&lt;float&gt;[40]) associated with it, so each value of the confidence map is associated with 40 feature values.

I need to find the mean value of the K nearest neighbours of each pixel of the confidence map, and I need to use the confidence features as the distance features for the KNN search.

I tried to implement this in C++. It works, in a way, but it is very slow (maybe 1-2 hours to complete a run).

    Ptr<ml::KNearest> knn(ml::KNearest::create());

    int num_samples = rows * cols;

    Mat samples;  // each row has 40 features and represents a pixel
    samples.create(num_samples, 40, CV_32F);

    Mat response; // row i of samples is associated with the (0, i) value of this array (or not?)
    response.create(1, num_samples, CV_32F);

    // here i just fill the samples and the response
    // (confidence_map is the Mat_<float> described above)
    num_samples = 0;
    for(int r = 0; r < rows; r++){
        for(int c = 0; c < cols; c++){
            response.at<float>(0, num_samples) = confidence_map.at<float>(r, c);
            for(int k = 0; k < 40; k++){
                samples.at<float>(num_samples, k) = feature_array[k].at<float>(r, c);
            }
            num_samples++;
        }
    }

    // the training is very fast
    knn->train(samples, cv::ml::ROW_SAMPLE, response);

    Mat result;
    // this call is the slow part
    knn->findNearest(samples, 20, result);

Am I doing something wrong? Thanks for the help.



and for each pixel i need to find the 20-Nearest neighbour.

maybe this needs some more explanation ? do you mean colors ? positions ?

what are you trying to achieve, in general here ?

also, we need some numbers. how many train samples do you have ? and how many classes ? and why the 20 ?

(e.g.: if you only have 3 samples per class, using a K of 20 won't work)

berak ( 2018-09-21 05:58:29 -0500 )

Yeah, sorry, I wasn't very clear. I edited the question; I hope it is clearer now.

I'm trying to implement this from a paper that is not very clear, so I'm a bit confused, sorry.

alibi12 ( 2018-09-21 06:38:58 -0500 )

somewhat better, but hmmm.

do you have a link to the paper, maybe ? (so we can see for ourselves ?)

KNearest is a classification mechanism, so, what are your classes/responses/labels ?

(i'm somewhat thinking, you wanted (unlabelled) clustering instead, but no idea without further explanation)

berak ( 2018-09-21 06:41:57 -0500 )

I'm sorry, but it is not openly accessible; I'll leave the link here anyway:

alibi12 ( 2018-09-21 06:55:23 -0500 )

The features are then enhanced through adaptive filtering in the feature domain. In addition, the resulting confidence map, estimated using the confidence features with a random regression forest, is further improved through K-nearest neighbor based aggregation scheme on both pixel- and superpixel-level.

this is your problem ?

berak ( 2018-09-21 07:18:22 -0500 )

Yes, the part where it says that the confidence map is further improved. The KNN is used to compute the filtered confidence maps Qi (pixel level) and Qn (superpixel level).

The next step is just a simple weighted sum between Qi(x,y) and Qn(x,y) to get the final value.

alibi12 ( 2018-09-21 07:26:42 -0500 )

I edited with some code and a better explanation; I hope it is clear now.

alibi12 ( 2018-09-22 03:23:35 -0500 )