Support Vector Regression prediction values

asked 2016-09-19 12:14:32 -0600

Angulu

I am training a regression model to predict a label given a feature vector. The training and testing samples are drawn from 10 classes. I have trained an SVM for regression as shown in the code below:

void train::trainSVR(Mat data, vector<int> labels){
    //SVR expects float responses, so convert the integer labels
    Mat responses;
    Mat(labels).convertTo(responses, CV_32F);
    Ptr<TrainData> trainData = TrainData::create(data, ml::ROW_SAMPLE, responses);
    Ptr<SVM> svr = SVM::create();
    svr->setType(SVM::Types::NU_SVR);//nu-Support Vector Regression
    svr->setKernel(SVM::KernelTypes::POLY);//gamma, degree and coef0 below are poly-kernel parameters
    svr->setGamma(10.0);//for poly
    svr->setDegree(0.1);//for poly
    svr->setCoef0(0.0);//for poly
    svr->setNu(0.5);//nu must be in (0, 1]

    cout << "Training Support Vector Regressor..." << endl;
    bool trained = svr->train(trainData);
    if (trained){
        cout << "SVR Trained. Saving..." << endl;
        svr->save(".\\Trained Models\\SVR Model.xml");
        cout << "SVR Model Saved." << endl;
    }
}

I run prediction with the function shown below:

void train::predictSVR(Mat data, vector<int> labels){
    //Create SVM smart pointer and load trained model
    Ptr<ml::SVM> svr = Algorithm::load<ml::SVM>(".\\Trained Models\\SVR Model.xml");
    ofstream prediction("SVR Prediction.csv", std::fstream::app);//open file for writing predictions in append mode
    for (int i = 0; i < (int)labels.size(); i++){
        float pred = svr->predict(data.row(i));
        prediction << labels[i] << "," << pred << endl;
    }
}


The prediction gives me some value against every label. My questions are:

  1. What does the value against the label signify?
  2. How do I retrieve the mean squared error (MSE), the support vectors, and the constants of the regression function y = wx + b? How do I get w and b? Kindly advise.


I don't think regression is what you need at all...

LorenaGdL ( 2016-09-19 13:58:14 -0600 )

What would you advise I use? My feature vector is a 2D Mat with 5 samples from each of the 10 classes. My labels are of the form 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, ..., 10, 10, 10, 10, 10.

Angulu ( 2016-09-20 02:27:18 -0600 )

That's simply a typical multi-class classification problem, not a regression one. Use C_SVC (or NU_SVC).

LorenaGdL ( 2016-09-20 03:14:28 -0600 )

Thank you. I have tried multi-class classification using one-vs-all, but I am confused about how to combine the N classifiers for prediction. Could you advise me on how to approach that? I am grateful for your input.

Angulu ( 2016-09-20 03:20:08 -0600 )

You don't need to create N classifiers, just one with as many labels as classes. Prediction is handled internally, using one-vs-one boundaries and outputting the class with the highest number of votes. The code is exactly the same as for a simple 2-class classification problem.

LorenaGdL ( 2016-09-20 03:33:18 -0600 )

OK, thank you. Maybe I need to ask a different question, because I have done multi-class classification and got very low accuracy. I am grateful.

Angulu ( 2016-09-20 03:38:42 -0600 )

This is the training code I am using for multi-class classification. Kindly advise how to improve its training accuracy:

void train::trainSVM(Mat hists, vector<int> labels){
    Ptr<TrainData> trainData = TrainData::create(hists, ml::ROW_SAMPLE, labels);
    Ptr<SVM> svm = SVM::create();
    svm->setType(SVM::Types::C_SVC);
    svm->setKernel(SVM::KernelTypes::RBF);//RBF needs only 2 parameters, C and gamma
    svm->train(trainData);
}
Angulu ( 2016-09-20 03:41:50 -0600 )