It is quite simple to train an AdaBoost classifier. The following steps are needed:

  cv::Ptr<cv::ml::Boost> boost_ = cv::ml::Boost::create();
  cv::Ptr<cv::ml::TrainData> data = cv::ml::TrainData::create(train_data, cv::ml::ROW_SAMPLE, train_labels);
  • train_data is a matrix of size NxF, where N is the number of samples and F is the number of features. The type of the matrix has to be CV_32FC1.
  • train_labels is a matrix of size Nx1, where N is again the number of samples/labels. This matrix contains the label belonging to each sample. If you are using a binary classifier, the labels can be, for example, 0 and 1. The type of this matrix has to be CV_32SC1. A sketch of how both matrices can be filled follows below.
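
For illustration, here is a minimal sketch of how these two matrices could be built. The feature values and labels are made up (N = 4 samples, F = 2 features):

  #include <opencv2/core.hpp>
  #include <opencv2/ml.hpp>

  // N = 4 samples, F = 2 features (dummy values, only for illustration)
  cv::Mat train_data(4, 2, CV_32FC1);   // NxF, one sample per row
  cv::Mat train_labels(4, 1, CV_32SC1); // Nx1, one integer label per row

  float features[4][2] = { {0.1f, 0.7f}, {0.2f, 0.8f}, {0.9f, 0.1f}, {0.8f, 0.2f} };
  int   labels[4]      = { 0, 0, 1, 1 };
  for (int i = 0; i < 4; ++i) {
      train_labels.at<int>(i, 0) = labels[i];
      for (int j = 0; j < 2; ++j)
          train_data.at<float>(i, j) = features[i][j];
  }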

Train the model by using:

  boost_->train(data);
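
Optionally, the boosting parameters can be set on the model before calling train(). A sketch with purely illustrative values (not recommendations):

  boost_->setBoostType(cv::ml::Boost::GENTLE); // DISCRETE, REAL, LOGIT or GENTLE
  boost_->setWeakCount(100);                   // number of weak learners
  boost_->setMaxDepth(1);                      // depth of each weak tree (1 = decision stumps)
  boost_->train(data);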

After training you can predict samples by passing a matrix of size NxF and type CV_32FC1 (same as above) to the predict function; a single sample is simply a 1xF row vector:

  boost_->predict(sample);
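
For example, sticking with the two-feature setup sketched above, a single query could look like this:

  #include <iostream>

  // A single sample is a 1xF row vector of type CV_32FC1
  cv::Mat sample(1, 2, CV_32FC1);
  sample.at<float>(0, 0) = 0.15f;
  sample.at<float>(0, 1) = 0.75f;

  // predict() returns the predicted label as a float (here 0 or 1)
  float response = boost_->predict(sample);
  std::cout << "predicted label: " << response << std::endl;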

There is API documentation and also a short tutorial available.


Regarding:

I am trying to implement my own weak classifiers in connection with AdaBoost

You can pass any kind of feature to the algorithm, e.g. the number of green pixels in an image, or whatever else you like.
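
For example, a hand-crafted "number of green pixels" feature could be computed like this (the thresholds are arbitrary and only for illustration) and then written into one column of train_data:

  #include <opencv2/core.hpp>
  #include <opencv2/imgproc.hpp>

  // Count roughly "green" pixels in a BGR image and return the count as a feature value
  float countGreenPixels(const cv::Mat& bgr)
  {
      cv::Mat mask;
      // Arbitrary thresholds: keep pixels whose green channel is high and blue/red are low
      cv::inRange(bgr, cv::Scalar(0, 100, 0), cv::Scalar(80, 255, 80), mask);
      return static_cast<float>(cv::countNonZero(mask));
  }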