# Revision history


Here is what I used recently for the random forest:

```cpp
Mat train_features; // one row per lbp feature, float (CV_32F)!
Mat train_labels;   // one row (containing 1 float element with the id) for each feature

CvRTrees tree;
CvRTParams cvrtp; // set params, play with those!! (the numbers below were for my problem)
cvrtp.max_depth = 25;
cvrtp.min_sample_count = 6;
cvrtp.max_categories = 2;
cvrtp.term_crit.max_iter = 100;

tree.train(train_features, CV_ROW_SAMPLE, train_labels, cv::Mat(), cv::Mat(), cv::Mat(), cv::Mat(), cvrtp);

// later:
float id = tree.predict(sample); // where sample is an lbp feature, again
```


But again, since your problem is binary classification rather than a multi-class one, another classifier, like an SVM or a decision tree, might give better results.

If you look at the LBP FaceRecognizer, it's not even using any of those, just a plain chi-square based kNN. (And seemingly, results did not improve using an SVM or such.)

Please have a look here for a nice tutorial on machine learning in OpenCV.