
How can I get weights from the trained SVM (regression)?

asked 2019-01-31 17:30:18 -0600 by AlexB

How can I get the bias and the array of weights from a trained SVM::Types::NU_SVR model, to perform the prediction by myself?

For example, I trained an SVM for regression with a linear kernel function, and I want to get the trained weights to perform this part of the code by myself: https://github.com/cjlin1/libsvm/blob...

I want to get double sv_coef[] and double bias from the trained Ptr<SVM> svr:

double train::trainSVR(Mat data, vector<int> labels, Mat unlabled_data)
{
    Ptr<TrainData> trainData = TrainData::create(data, ml::ROW_SAMPLE, labels);
    Ptr<SVM> svr = SVM::create();
    svr->setType(SVM::Types::NU_SVR);  //For regression, to predict value
    svr->setKernel(SVM::LINEAR);  // linear kernel function
    svr->setTermCriteria(TermCriteria(TermCriteria::MAX_ITER, 100, 1e-6));
    svr->setNu(0.1);

    cout << "Training Support Vector Regressor..." << endl;
    //svr->trainAuto(trainData);
    svr->train(trainData);
    bool trained = svr->isTrained();
    double sum = 0;
    if (trained)
    {
        //svr->save(".\\Trained Models\\SVR Model.xml");

        // how to get: double sv_coef[] ?
        // how to get: double bias (rho) ?

        // custom prediction
        for(int i=0; i<unlabled_data.cols; i++) {
                sum += sv_coef[i] * unlabled_data.at<double>(0,i);
        }
        sum -= bias;
    }
    return sum;
}

Comments

imho, your for-loop is wrong. it should iterate over the support vectors, not over the query data.

berak ( 2019-02-01 02:41:25 -0600 )

1 answer


answered 2019-02-01 02:25:05 -0600 by berak

updated 2019-02-01 02:51:03 -0600

hmmm, there are differences between libsvm and opencv's SVM.

e.g. coef0 is a single constant, and it's not even used in the case of a linear SVM.

while you can get at the data, you would end up doing something different in the prediction pass than in the training pass.
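
if you do want to dig out the raw data, there are getSupportVectors() and getDecisionFunction() -- a rough, untested sketch (assuming a trained linear NU_SVR in svr, and a single CV_32F query row called query; the exact sign convention of alpha / rho is best checked against svr->predict() on a few samples):

Mat sv = svr->getSupportVectors();        // one support vector per row, CV_32F
Mat alpha, svidx;
double rho = svr->getDecisionFunction(0, alpha, svidx); // dual coefficients and bias
alpha.convertTo(alpha, CV_64F);           // make sure we can read it as double

double sum = 0;
for (int i = 0; i < sv.rows; i++)         // iterate over the support vectors, not the query
{
    double dot = 0;                       // linear kernel: plain dot product
    for (int j = 0; j < sv.cols; j++)
        dot += sv.at<float>(i, j) * query.at<float>(0, j);
    sum += alpha.at<double>(i) * dot;
}
sum -= rho;                               // subtract the bias term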

it's probably a better idea to implement a custom kernel:

struct XKernel : public ml::SVM::Kernel
{
    virtual ~XKernel() {}

    //! your distance metric between one of the support vecs(sample)
    //   and your query(another) goes here
    float per_elem(int var_count, const float* sample, const float* another) { return 0.f; /* fill in your metric */ }

    //! post-process results array (if necessary, e.g. apply bias)
    virtual void post(int vcount, float* results) {}

    void calc(int vcount, int var_count, const float* vecs, const float* another, float* results)
    {
        for (int j=0; j<vcount; j++)
        {
            const float* sample = &vecs[j*var_count];
            results[j] = per_elem(var_count, sample, another);
        }
        post(vcount, results);
    }

    int getType(void) const
    {
        return -1; // SVM::CUSTOM -- we're special.
    }

};
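
for a linear kernel, per_elem() would just be the dot product between the support vector and the query, something like (goes into XKernel above):

float per_elem(int var_count, const float* sample, const float* another)
{
    float s = 0;
    for (int k = 0; k < var_count; k++)   // dot product: support vector . query
        s += sample[k] * another[k];
    return s;
}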

Ptr<ml::SVM> svm = ml::SVM::create();
Ptr<XKernel> kern = makePtr<XKernel>();
svm->setCustomKernel(kern);  // hook for a custom SVM::Kernel
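
the nice thing about this route is that the prediction pass then uses exactly the same metric as the training pass, and you don't have to extract any coefficients at all.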
