Feature selection using AdaBoost
I have implemented a feature extraction and classification pipeline based on LBP (as the texture descriptor) and SVM (as the classifier).
[It probably isn't important, but I don't use OpenCV's LBP code; I use my own implementation of uniform LBPs. For classification I use LIBSVM.]
The feature vector for each image consists of 1,000 to 3,000 values, depending on the number of cells. I want to apply AdaBoost to select the most important features and then apply SVM for classification (i.e. train the SVM on the boosted LBP features). The goal is to increase the number of extracted features without paying a penalty in the classification step.
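For context, here is roughly how I plan to train the booster on the LBP feature matrix. This is only a sketch of my intention: `trainData`, `labels` and the parameter values are placeholders I made up, not something from my actual code.

```cpp
#include <opencv2/core/core.hpp>
#include <opencv2/ml/ml.hpp>

// Train AdaBoost with depth-1 trees (decision stumps) so that every weak
// learner splits on exactly one LBP feature. Assumed layout (placeholder
// names): trainData holds one uniform-LBP feature vector per row
// (CV_32F, numSamples x numFeatures); labels holds a two-class label per
// sample (CV_32S, numSamples x 1).
void trainBoostedStumps(CvBoost& boost,
                        const cv::Mat& trainData,
                        const cv::Mat& labels,
                        int numWeakLearners = 200)
{
    // CvBoost needs a variable-type mask: all inputs numerical,
    // the response (last entry) categorical.
    cv::Mat varType(trainData.cols + 1, 1, CV_8U,
                    cv::Scalar(CV_VAR_NUMERICAL));
    varType.at<uchar>(trainData.cols) = CV_VAR_CATEGORICAL;

    CvBoostParams params(CvBoost::REAL,   // Real AdaBoost
                         numWeakLearners, // upper bound on selected features
                         0.95,            // weight trim rate
                         1,               // max depth 1 -> stumps
                         false, 0);       // no surrogates, no priors

    boost.train(trainData, CV_ROW_SAMPLE, labels,
                cv::Mat(), cv::Mat(), varType, cv::Mat(), params);
}
```

My reasoning for max depth 1 is that each stump then uses a single variable, so the trained booster doubles as a feature selector.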
I think I should use CvBoost::get_weak_predictors() to obtain the most important features, but I don't know how. Maybe I'm wrong about how AdaBoost works.
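This is what I imagine the extraction would look like. The member names (get_weak_predictors(), get_root(), split->var_idx) come from my reading of the OpenCV 2.x decision-tree API, so I may well be misusing them; please correct me if so.

```cpp
#include <set>
#include <opencv2/core/core.hpp>
#include <opencv2/core/core_c.h>
#include <opencv2/ml/ml.hpp>

// Collect the indices of the features actually used by the boosted stumps.
// With max depth 1, each weak predictor is a CvBoostTree whose root split
// references a single variable (var_idx); the set of those indices would be
// the "selected" feature subset to feed into LIBSVM.
std::set<int> selectedFeatureIndices(CvBoost& boost)
{
    std::set<int> indices;
    CvSeq* weak = boost.get_weak_predictors();
    if (!weak)
        return indices;

    CvSeqReader reader;
    cvStartReadSeq(weak, &reader);
    for (int i = 0; i < weak->total; ++i)
    {
        CvBoostTree* tree;
        CV_READ_SEQ_ELEM(tree, reader);           // next weak learner
        const CvDTreeNode* root = tree->get_root();
        if (root && root->split)
            indices.insert(root->split->var_idx); // feature used by this stump
    }
    return indices;
}
```

Afterwards I would keep only those columns of the LBP vectors when building the LIBSVM training and test sets.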
If anyone knows how to do this, please give me some tips. Thanks in advance.
NOTE: I have found a "similar" question here. Sorry if this is a duplicate.
Interesting question! The more natural way to do feature selection would be LDA. However, it should work with boosting, too.
Indeed an interesting question; I'll follow along to hear the conclusions about it :)
Looking forward to the answers. I have exactly the same problem.
Just thinking out loud: wouldn't LDA or KL divergence help rather than going for AdaBoost? Is there a special motivation behind choosing AdaBoost?