# Unable to get Mahalanobis distance

I am trying to find the Mahalanobis distances between a test sample image and a few training data images (AT&T database). I took http://answers.opencv.org/question/34... as a reference for my code. When I run the code, I get the following error:

"OpenCV Error: Assertion failed (type == v2.type() && type == icovar.type() && sz == v2.size() && len == icovar.rows && len == icovar.cols) in Mahalanobis, file /home/opencv-2.4.9/modules/core/src/matmul.cpp, line 2244 "

Please find the code snippet for the Mahalanobis distance here: http://pastebin.com/Mg8DbFQJ

    Mat covar, invcovar, mean;
    for (size_t sampleIdx = 0; sampleIdx < _projections.size(); sampleIdx++) {
        // Covariance matrix of a single projection row
        calcCovarMatrix(_projections[sampleIdx], covar, mean, CV_COVAR_SCRAMBLED | CV_COVAR_ROWS, CV_64F);
        // Inverse covariance matrix
        invert(covar, invcovar, DECOMP_SVD);
        double dist = Mahalanobis(_projections[sampleIdx], q, invcovar);
        // Add to the resulting distance array:
        if (distances.needed()) {
            distances.getMat().at<double>(sampleIdx) = dist;
        }
        if ((dist < minDist) && (dist < _threshold)) {
            minDist = dist;
            minClass = _labels.at<int>((int)sampleIdx);
        }
    }



I modified the code as in the link below:

http://pastebin.com/pZGEYNwb

    // "_projections" -> training images (40x10), "q" -> test sample
    Mat covar, invcovar, mean;
    // Covariance matrix over all projections
    calcCovarMatrix(_projections, covar, mean, CV_COVAR_NORMAL | CV_COVAR_ROWS, CV_64F);
    // Inverse covariance matrix
    invert(covar, invcovar, DECOMP_SVD);

    // Mahalanobis distance of the test sample to every projection
    for (size_t sampleIdx = 0; sampleIdx < _projections.size(); sampleIdx++) {
        double dist = Mahalanobis(_projections[sampleIdx], q, invcovar);
        // Add to the resulting distance array:
        if (distances.needed()) {
            distances.getMat().at<double>(sampleIdx) = dist;
        }
        if ((dist < minDist) && (dist < _threshold)) {
            minDist = dist;
            minClass = _labels.at<int>((int)sampleIdx);
        }
    }


With this I get the distance vector below:

http://pastebin.com/9KCr7LTs

The above values are the distances between a particular test image and the 400 training images, where the test image is itself one of the training images.

So, as per your comment, have I gone wrong somewhere in my modification? Could you please tell me whether the distance vector I got is correct?


ah, nice, so not the covars of a single projection, but from all of them, right ?

how long did it take? (you're calling predict() per test sample, so the expensive covariance calculation/inversion should go to a different place, e.g. after the training)

"Can you please tell me whether the distance vector I got is proper or not." - not really. you will have to loop over your vector to find the smallest element, and see if labels[indexOfSmallest] is the correct one.

(2015-09-15 00:19:45 -0500)

It took only a fraction of a second to compute the distance vector. And the smallest distance corresponds to the matching training image.

(2015-09-15 01:39:05 -0500)

again, apologies for being misleading before; i did not get that you're doing this on PCA/LDA projections, which are fairly small.

(2015-09-15 02:00:00 -0500)

i tried your solution, and found that it needed a normalization of the covariance (to avoid negative numbers in Mahalanobis()):

    calcCovarMatrix(features, covar, _mean, CV_COVAR_NORMAL | CV_COVAR_ROWS, CV_32F);
    covar /= (features.rows - 1);
    invert(covar, icovar, DECOMP_SVD);

(2015-09-15 06:14:48 -0500)

Could you please tell me the type of data for which you got negative Mahalanobis distances? The dataset I tried was the AT&T database with 40 classes, where each class has 10 grayscale sample images. For this data I didn't get negative values.

(2015-09-15 23:29:39 -0500)

yale. but i already strolled far away from the opencv code, so i probably missed something elsewhere.

did your results improve (compared to, say, L2)? definitely here.

(2015-09-16 00:59:13 -0500)

Apologies if my doubt is silly; I'm not aware of how to do the comparison. The Mahalanobis distance is within the range 0-1, while the Euclidean distance is around 10^3, so how can we see the improvement? I tried reconstruction and then comparing similarity. Is that one way of doing it?

(2015-09-18 03:33:48 -0500)

no idea if i understand you right, but the value range of the distances should not matter at all for a nearest-neighbour search.

was more asking: did you get fewer false predictions using Mahalanobis than L2?

trying to compare the reconstructed images might be a cute idea; have not tried that so far.

(2015-09-18 03:40:00 -0500)
