Issues with OpenCV train_HOG c++ sample code? [closed]
I am trying to train a HOG+SVM on the INRIA dataset using the OpenCV sample code train_HOG.cpp. Firstly, I am confused about why Support Vector Regression (EPS_SVR) is used for what is clearly a classification problem. I've tried changing it to C_SVC, but I get the following runtime error with every kernel other than LINEAR (the same happens in the regression case) when testing on the training set itself:
OpenCV Error: Assertion failed (alpha.total() == 1 && svidx.total() == 1 && sv_total == 1) in get_svm_detector
terminate called after throwing an instance of 'cv::Exception'
  what(): HOGsvm.cpp:41: error: (-215) alpha.total() == 1 && svidx.total() == 1 && sv_total == 1 in function get_svm_detector
Aborted (core dumped)
Any idea on why it is happening and how to resolve it?
EDIT: Here's the code snippet that raises the error:
Ptr<SVM> svm = SVM::create();
svm->setTermCriteria(TermCriteria(CV_TERMCRIT_ITER + CV_TERMCRIT_EPS, 100, 1e-3));
svm->setType(SVM::C_SVC);            // classification instead of the sample's EPS_SVR
svm->setKernel(SVM::RBF);            // non-linear kernel
svm->setC(0.01);
svm->setGamma(0.1);
svm->train(train_data, ROW_SAMPLE, Mat(labels));

HOGDescriptor myHog;
myHog.winSize = Size(64, 128);
vector<float> hogDetector;

Mat sv = svm->getSupportVectors();
const int sv_total = sv.rows;
Mat alpha, svidx;
double rho = svm->getDecisionFunction(0, alpha, svidx);
CV_Assert( alpha.total() == 1 && svidx.total() == 1 && sv_total == 1 );   // fails here
I don't think you're allowed to change the training parameters (kernel or SVM type); otherwise the prediction later (which is just a hardcoded dot product) will crash, because your "compressed support vector" will have the wrong shape.
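For reference, here is roughly what the sample's get_svm_detector() does (a condensed sketch, not the exact source): it copies the single compressed support vector of a linear SVM into the detector vector and appends -rho as the bias, which is exactly the shape that assertion guards.

#include <cstring>
#include <opencv2/ml.hpp>
#include <opencv2/objdetect.hpp>
using namespace cv;
using namespace cv::ml;
using namespace std;

// Condensed sketch of train_HOG.cpp's get_svm_detector():
// flatten the single compressed support vector of a *linear* SVM, plus -rho,
// into the vector<float> that HOGDescriptor::setSVMDetector() expects.
static void get_svm_detector(const Ptr<SVM>& svm, vector<float>& hog_detector)
{
    Mat sv = svm->getSupportVectors();      // a linear SVM yields ONE compressed row
    Mat alpha, svidx;
    double rho = svm->getDecisionFunction(0, alpha, svidx);
    CV_Assert(alpha.total() == 1 && svidx.total() == 1 && sv.rows == 1);
    hog_detector.resize(sv.cols + 1);
    memcpy(&hog_detector[0], sv.ptr(), sv.cols * sizeof(hog_detector[0]));
    hog_detector[sv.cols] = (float)-rho;    // appended bias term
}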
For prediction, the SVM associated with a HOGDescriptor object is set explicitly using the class's setSVMDetector() method. I imagine that as long as the feature vectors in the training and test sets are the same size, this should work, right?
Yes, exactly. The setSVMDetector() method sets the single (compressed) support vector of a linear SVM.
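For completeness, a minimal usage sketch (hog_detector is the detector vector built from a linear SVM as above; img is any test image; both names are placeholders):

// Usage sketch: hand the flattened linear-SVM coefficients to the HOGDescriptor
// and let detectMultiScale() do the sliding-window dot product.
HOGDescriptor hog;
hog.winSize = Size(64, 128);            // must match the window the SVM was trained on
hog.setSVMDetector(hog_detector);

vector<Rect> detections;
hog.detectMultiScale(img, detections);  // each Rect is a detected (scaled) 64x128 window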
The issue I am facing is before the prediction phase. It happens when I am trying to get the support vectors and decision function from the trained SVM. Here's the code snippet where the assertion is failing.
Mat sv = svm->getSupportVectors();
const int sv_total = sv.rows;
Mat alpha, svidx;
double rho = svm->getDecisionFunction(0, alpha, svidx);
CV_Assert( alpha.total() == 1 && svidx.total() == 1 && sv_total == 1 );
Can you help me pinpoint the exact issue?
Please add some lines of code to your question.
The problem is still the same: you cannot use any kernel other than LINEAR.
Do you mean I cannot use any kernel other than LINEAR on HOG data? How do I experiment with other kernels and different parameters for HOG+SVM? Can you point me to appropriate resources? Are you aware of what dataset and method were used to train the defaultPeopleDetector?
Ha, wait: you're restricted to LINEAR (and SVR) only if you want to train a support vector for the HOGDescriptor (to use detectMultiScale() later). That is what train_HOG.cpp does.
Of course you can use any kernel if you use the SVM on its own, e.g. make it a multi-class classification, use svm->predict(), and use hog.compute() to build the train/test features from small image patches.
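A minimal sketch of that second route (assuming you crop your own fixed-size 64x128 patches; patches, labels and test_patch are placeholders for your data):

// Sketch: use the SVM on its own with any kernel; HOG is only the feature extractor.
// "patches"/"labels" are your own 64x128 training crops and their class ids.
HOGDescriptor hog;
hog.winSize = Size(64, 128);

Mat train_data;                              // one HOG descriptor per row
for (size_t i = 0; i < patches.size(); ++i)
{
    vector<float> desc;
    hog.compute(patches[i], desc);           // patch must be exactly 64x128
    train_data.push_back(Mat(desc).reshape(1, 1));
}

Ptr<SVM> svm = SVM::create();
svm->setType(SVM::C_SVC);                    // plain (multi-class) classification
svm->setKernel(SVM::RBF);                    // any kernel works in this setup
svm->setC(1.0);
svm->setGamma(0.1);
svm->train(train_data, ROW_SAMPLE, Mat(labels));

// classify a new 64x128 patch
vector<float> test_desc;
hog.compute(test_patch, test_desc);
float predicted_class = svm->predict(Mat(test_desc).reshape(1, 1));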
Thanks for clarifying that. Can you tell me in which part of the code the restriction to SVR + a LINEAR kernel is enforced? Also, the assertion fails at getSupportVectors() and getDecisionFunction(), well before setting the SVM detector for a HOGDescriptor object. Printing the values gives alpha.total() = 200, svidx.total() = 200 and sv_total = 200. Can you help me understand what's happening here?
The assertion is from train_HOG.cpp, not from getDecisionFunction(). It checks that you have a single 1-D support vector and 1-D indices/alpha; those are required for the dot product in detectMultiScale() later.
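So if the goal is a detector for detectMultiScale(), the practical fix is to go back to a LINEAR kernel. A sketch of the training setup under that assumption (the parameter values are illustrative, not necessarily the sample's exact settings):

// With a LINEAR kernel OpenCV compresses everything into a single support vector,
// so alpha.total() == 1, svidx.total() == 1 and sv.rows == 1, and the assertion
// in get_svm_detector() passes.
Ptr<SVM> svm = SVM::create();
svm->setTermCriteria(TermCriteria(CV_TERMCRIT_ITER + CV_TERMCRIT_EPS, 1000, 1e-3));
svm->setKernel(SVM::LINEAR);     // the only kernel usable with setSVMDetector()
svm->setType(SVM::EPS_SVR);      // as in train_HOG.cpp; a two-class C_SVC should also compress
svm->setP(0.1);                  // epsilon for EPS_SVR (illustrative value)
svm->setC(0.01);
svm->train(train_data, ROW_SAMPLE, Mat(labels));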