OpenCV Q&A Forum - RSS feed (http://answers.opencv.org/questions/), OpenCV answers. Copyright OpenCV foundation (http://www.opencv.org), 2012-2018. Thu, 31 Jan 2019 17:30:18 -0600

How can I get weights from the trained SVM (regression)?
http://answers.opencv.org/question/208243/how-can-i-get-weights-from-the-trained-svm-regression/

How can I get the bias and the array of weights from a trained `SVM::Types::NU_SVR` model, so I can perform the prediction myself?
For example, I trained an SVM for regression with a linear kernel, and I want to extract the trained weights so I can run this part of the code myself: https://github.com/cjlin1/libsvm/blob/3648ef34f4b7a869ca8b9c064c48ef1d56904c7f/svm.cpp#L2509-L2512
I want to get `double sv_coef[]` and `double bias` from trained `Ptr<SVM> svr`:
```cpp
double train::trainSVR(Mat data, vector<int> labels, Mat unlabled_data)
{
    Ptr<TrainData> trainData = TrainData::create(data, ml::ROW_SAMPLE, labels);
    Ptr<SVM> svr = SVM::create();
    svr->setType(SVM::Types::NU_SVR); // for regression, to predict a value
    svr->setKernel(SVM::LINEAR);      // linear kernel function
    svr->setTermCriteria(TermCriteria(TermCriteria::MAX_ITER, 100, 1e-6));
    svr->setNu(0.1);

    cout << "Training Support Vector Regressor..." << endl;
    //svr->trainAuto(trainData);
    svr->train(trainData);
    bool trained = svr->isTrained();

    double sum = 0;
    if (trained)
    {
        //svr->save(".\\Trained Models\\SVR Model.xml");
        // how to get: double sv_coef[] ?
        // how to get: double bias (rho) ?
        // custom prediction
        for (int i = 0; i < unlabled_data.cols; i++) {
            sum += sv_coef[i] * unlabled_data.at<double>(0, i);
        }
        sum -= bias;
    }
    return sum;
}
```

Asked by AlexB, Thu, 31 Jan 2019 17:30:18 -0600
http://answers.opencv.org/question/208243/

Implementation of ERTrees
http://answers.opencv.org/question/181147/implementation-of-ertrees/

Hi OpenCV Community,
I've been scrolling through the source code of OpenCV, mainly the old_ml version (still OpenCV 2), and I got stuck because I couldn't find the source of the *ERTrees* implementation.
Anyone know where is the implementation of ERTrees?
I wanted to check the differences between the ERTrees and RTrees implementations in OpenCV, but so far I have only found a header.
Thank you for your time and contribution. Any answer is appreciated.
Best,

Asked by flatron walter, Wed, 27 Dec 2017 06:32:54 -0600
http://answers.opencv.org/question/181147/

Boosted Regression Trees OpenCV 3.1
http://answers.opencv.org/question/102817/boosted-regression-trees-opencv-31/

Hi
I'm a user of gradient boosted trees for regression in OpenCV 2.4.x.
http://docs.opencv.org/2.4/modules/ml/doc/gradient_boosted_trees.html
These do not appear to exist in OpenCV 3.1. Is there any alternative in OpenCV 3.1, or any specific reasoning behind their exclusion from the latest codebase?
I have been trying to use the cv::Boost class in OpenCV 3.1 for regression but it seems that there is no working codepath to support this. Can anyone advise on whether Boosted regression is possible in OpenCV 3.1?
Thanks in advance
Mike

Asked by Mike, Thu, 22 Sep 2016 13:01:20 -0500
http://answers.opencv.org/question/102817/

Support Vector Regression prediction values
http://answers.opencv.org/question/102574/support-vector-regression-prediction-values/

I am training a regression model to predict a label given a feature vector. Training and testing samples are drawn from 10 classes. I have trained an SVM for regression as shown in the code below:
```cpp
void train::trainSVR(Mat data, vector<int> labels)
{
    Ptr<TrainData> trainData = TrainData::create(data, ml::ROW_SAMPLE, labels);
    Ptr<SVM> svr = SVM::create();
    svr->setKernel(SVM::KernelTypes::POLY);
    svr->setType(SVM::Types::NU_SVR); // for an n-class problem with imperfect class separation
    //svr->setC(5);
    //svr->setP(0.01);
    svr->setGamma(10.0); // for poly
    svr->setDegree(0.1); // for poly
    svr->setCoef0(0.0);  // for poly
    svr->setNu(0.1);

    cout << "Training Support Vector Regressor..." << endl;
    //svr->trainAuto(trainData);
    svr->train(trainData);
    bool trained = svr->isTrained();
    if (trained)
    {
        cout << "SVR Trained. Saving..." << endl;
        svr->save(".\\Trained Models\\SVR Model.xml");
        cout << "SVR Model Saved." << endl;
    }
}
```
I make predictions with the model as shown in the function below:
```cpp
void train::predictSVR(Mat data, vector<int> labels)
{
    Mat_<float> output;
    vector<int> predicted;

    // Create an SVM smart pointer and load the trained model
    Ptr<ml::SVM> svr = Algorithm::load<ml::SVM>(".\\Trained Models\\SVR Model.xml");

    ofstream prediction;
    prediction.open("SVR Prediction.csv", std::fstream::app); // open file for writing predictions, append mode
    for (size_t i = 0; i < labels.size(); i++)
    {
        float pred = svr->predict(data.row((int)i));
        prediction << labels[i] << "," << pred << endl;
    }
}
```
The prediction gives me some value against every label. My questions are:
1. What does the value against the label signify?
2. How do I retrieve the mean squared error (MSE), the support vectors, and the constants of the regression function `y = wx + b`? How do I get `b` and `w`?
Kindly advise.

Asked by Angulu, Mon, 19 Sep 2016 12:14:32 -0500
http://answers.opencv.org/question/102574/

Trouble with Knn
http://answers.opencv.org/question/95746/trouble-with-knn/

I've built a kNN model for regression, but it isn't working.
When I call for a prediction with a set of numbers, the answer is always the same value, like this:
> In = 0.742781 || Out = 0.917355
> In = 0.557086 || Out = 0.917355
> In = 0.19518 || Out = 0.917355
> In = 0.9759 || Out = 0.917355
> In = 0.09759 || Out = 0.917355
The model is set to regression (`setIsClassifier(false)`) and the algorithm type is brute force (KdTree doesn't work either; it shows an error).
Please, I really need help with this.
@berak, the code:

```cpp
// Loading the model, trained earlier
Ptr<KNearest> knn = Algorithm::load<KNearest>(str.str());

// Loading the value for regression
sampleMat.at<float>(0,0) = cloud_normals->points[i/3].normal_x;
knns[i]->findNearest(sampleMat, 5, response);

// Saving the regression
tmp.x = response.at<float>(0,0);
```
Visual results: after training, I tested the model.
![Results from the train. The first cloud, the regression. The second, the target](/upfiles/14653078989108575.png)
OpenCV v2 (2.4), Ubuntu 14.04
Asked by Lucas Amparo Barbosa, Mon, 06 Jun 2016 11:33:17 -0500
http://answers.opencv.org/question/95746/

Logistic Regression on MNIST dataset
http://answers.opencv.org/question/94405/logistic-regression-on-mnist-dataset/

In [this](https://www.simplicity.be/article/recognizing-handwritten-digits) post you can find a very good tutorial on how to apply an SVM classifier to the MNIST dataset. I was wondering if I could use logistic regression instead of the SVM classifier, so I searched for logistic regression in OpenCV and found that the syntax for both classifiers is almost identical. So I guessed that I could just comment out these parts:
```cpp
cv::Ptr<cv::ml::SVM> svm = cv::ml::SVM::create();
svm->setType(cv::ml::SVM::C_SVC);
svm->setKernel(cv::ml::SVM::POLY); // LINEAR, RBF, SIGMOID, POLY
svm->setTermCriteria(cv::TermCriteria(cv::TermCriteria::MAX_ITER, 100, 1e-6));
svm->setGamma(3);
svm->setDegree(3);
svm->train(trainingMat, cv::ml::ROW_SAMPLE, labelsMat);
```
and replace it with:
```cpp
cv::Ptr<cv::ml::LogisticRegression> lr1 = cv::ml::LogisticRegression::create();
lr1->setLearningRate(0.001);
lr1->setIterations(10);
lr1->setRegularization(cv::ml::LogisticRegression::REG_L2);
lr1->setTrainMethod(cv::ml::LogisticRegression::BATCH);
lr1->setMiniBatchSize(1);
lr1->train(trainingMat, cv::ml::ROW_SAMPLE, labelsMat);
```
But first I got this error:

```
OpenCV Error: Bad argument (data and labels must be a floating point matrix)
```

Then I changed

```cpp
cv::Mat labelsMat(labels.size(), 1, CV_32S, labelsArray);
```

to:

```cpp
cv::Mat labelsMat(labels.size(), 1, CV_32F, labelsArray);
```

And now I get this error:

```
OpenCV Error: bad argument (data should have atleast two classes)
```
I have 10 classes (0, 1, ..., 9), but I don't know why I get this error. My code is almost identical to the code in the mentioned tutorial.

Asked by lino, Tue, 17 May 2016 11:32:04 -0500
http://answers.opencv.org/question/94405/

How can I regress multiple variables with a single random forest in OpenCV?
http://answers.opencv.org/question/78662/how-can-i-regress-multiple-variables-by-single-random-forest-with-opencv/

I can use CvRTrees to regress one parameter. But how can I regress multiple parameters simultaneously? Thanks.

Asked by zhjw1988, Wed, 09 Dec 2015 00:13:02 -0600
http://answers.opencv.org/question/78662/

RTrees regression to multiple output responses
http://answers.opencv.org/question/75236/rtrees-regression-to-multiple-output-responses/
I am using RTrees for regression. However, I can't find a way to make it work with multiple output variables. I am using code similar to the tree_engine.cpp sample.
My features have dimension 300, and I want to do joint regression to 3 output variables, so I have 3 ordered responses per feature vector. For example, 1 data row has indices 0-299=features, 300-302=responses.
I have tried the following settings:

```cpp
const char* filename = "my_data.csv";
int response_idx = 300;
int response_size = 3;
std::string typespec = "ord[0-302]";
Ptr<TrainData> data = TrainData::loadFromCSV(filename, 0, response_idx, response_idx + response_size, typespec);
```
However, I get the error:

```
OpenCV Error: Assertion failed (d == 2 && (sizes[0] == 1 || sizes[1] == 1 || sizes[0]*sizes[1] == 0)) in create, file /Users/xavisuau/Downloads/opencv-3.0-2.0/modules/core/src/matrix.cpp, line 2294
libc++abi.dylib: terminating with uncaught exception of type cv::Exception: /Users/xavisuau/Downloads/opencv-3.0-2.0/modules/core/src/matrix.cpp:2294: error: (-215) d == 2 && (sizes[0] == 1 || sizes[1] == 1 || sizes[0]*sizes[1] == 0) in function create
```
Which looks quite deep...
Can anyone confirm that regression works with multiple responses? Or is this maybe a bug?
Thanks!
Asked by xavisuau, Tue, 03 Nov 2015 03:02:34 -0600
http://answers.opencv.org/question/75236/

Approximating a set of points from a binary image
http://answers.opencv.org/question/59145/approximating-a-set-of-points-from-a-binary-image/

My task is to plot a line that represents the (linear, I guess) approximation of a set of points.
Example:
![image description](/upfiles/14282855632031481.jpg)
Draw a line that best fits the white points in this image.
I am aware of functions such as fitLine but how can I use them when I represent images as Mat?
Asked by exzamp, Sun, 05 Apr 2015 21:02:59 -0500
http://answers.opencv.org/question/59145/

Discover depth of the bite of an apple
http://answers.opencv.org/question/3456/discover-depth-of-the-bite-of-an-apple/

Is there any technique that can be used to discover the depth of the bite of an apple?
![image description](http://i.imm.io/J6uq.jpeg)
Thank you!

Asked by beowulf, Wed, 24 Oct 2012 21:38:51 -0500
http://answers.opencv.org/question/3456/

How to build a regression tree over binary variables?
http://answers.opencv.org/question/228/how-to-build-a-regression-tree-over-binary-variables/

Let's say I have a training set {y<sub>i</sub>, x<sub>i</sub>}<sub>N</sub> with y<sub>i</sub> > 0 and x<sub>i</sub> a vector of binary variables x<sub>ik</sub> ∈ {0,1}.
Which is the best way to build a regression tree with selected variables? Should I use CvDTree or implement my own regression tree?

Asked by Niu ZhiHeng, Tue, 10 Jul 2012 07:18:04 -0500
http://answers.opencv.org/question/228/