Feature selection using AdaBoost

asked 2013-04-25 05:40:17 -0500

updated 2013-04-26 07:12:20 -0500

I have implemented a feature extractor based on LBP (as a texture descriptor) and SVM (for classification).

[I don't think it is really important, but I don't use OpenCV's LBPs; I use my own implementation, with uniform LBPs. For classification I use LIBSVM.]

The feature vector for each image consists of 1,000 to 3,000 values, depending on the number of cells. I want to apply AdaBoost to select the most important features and then apply SVM for classification (i.e., SVM on the boosted LBP features). That way I could increase the number of extracted features without paying a penalty in the classification step.

I think I should use CvBoost::get_weak_predictors() to obtain the most important features, but I don't know how. Maybe I'm wrong about how AdaBoost works.
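For intuition, here is a minimal pure-Python sketch (not the OpenCV API) of discrete AdaBoost with depth-1 decision stumps: each weak learner splits on exactly one feature, so collecting the split indices of the weak predictors amounts to feature selection. If I read the OpenCV 2.x API right, the analogous information with CvBoost would be the split variable of each weak tree returned by get_weak_predictors(); all names below are illustrative, not OpenCV's.

```python
import math

def train_stump(X, y, w):
    """Find the (error, feature, threshold, polarity) stump minimizing weighted error."""
    d = len(X[0])
    best = (float("inf"), 0, 0.0, 1)
    for j in range(d):
        for thr in sorted({x[j] for x in X}):
            for pol in (1, -1):
                # predict pol if x[j] >= thr, else -pol; sum weights of mistakes
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (pol if xi[j] >= thr else -pol) != yi)
                if err < best[0]:
                    best = (err, j, thr, pol)
    return best

def adaboost_selected_features(X, y, rounds=5):
    """Run AdaBoost with stumps; return the feature index chosen in each round."""
    n = len(X)
    w = [1.0 / n] * n
    selected = []
    for _ in range(rounds):
        err, j, thr, pol = train_stump(X, y, w)
        err = max(err, 1e-10)          # avoid log(0) on a perfect stump
        if err >= 0.5:                 # weak learner no better than chance
            break
        alpha = 0.5 * math.log((1 - err) / err)
        selected.append(j)             # this round "selected" feature j
        # reweight: misclassified samples gain weight for the next round
        for i in range(n):
            pred = pol if X[i][j] >= thr else -pol
            w[i] *= math.exp(-alpha * y[i] * pred)
        s = sum(w)
        w = [wi / s for wi in w]
    return selected

# Toy usage: feature 1 separates the classes, feature 0 is noise,
# so boosting should keep picking feature 1.
X = [[7, 1], [3, 2], [4, 8], [6, 9]]
y = [-1, -1, 1, 1]
print(adaboost_selected_features(X, y, rounds=3))  # → [1, 1, 1]
```

The selected indices (possibly with repeats, which you can treat as a weight) would then define the reduced feature vector you feed to LIBSVM.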

If anyone knows about this, please give me some tips. Thanks in advance.

NOTE: I have found a similar question here. Sorry if this is a duplicate.




Interesting question! The more natural way for feature selection would be to use LDA. However, it should work with boosting too.

Guanta ( 2013-04-25 08:25:21 -0500 )
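Regarding the LDA suggestion above: for a two-class problem, a cheap per-feature variant is the Fisher score, i.e. the 1-D LDA criterion applied to each feature independently; features can then be ranked by it. A minimal sketch (illustrative pure Python, not tied to any library):

```python
def fisher_scores(X, y):
    """Two-class Fisher score per feature for labels +1/-1:
    J(j) = (mean_pos_j - mean_neg_j)^2 / (var_pos_j + var_neg_j)."""
    d = len(X[0])
    scores = []
    for j in range(d):
        pos = [x[j] for x, yi in zip(X, y) if yi == 1]
        neg = [x[j] for x, yi in zip(X, y) if yi == -1]
        m1, m2 = sum(pos) / len(pos), sum(neg) / len(neg)
        v1 = sum((v - m1) ** 2 for v in pos) / len(pos)
        v2 = sum((v - m2) ** 2 for v in neg) / len(neg)
        scores.append((m1 - m2) ** 2 / (v1 + v2 + 1e-12))
    return scores

# Toy usage: feature 1 separates the classes, feature 0 does not,
# so feature 1 should get the higher score.
X = [[7, 1], [3, 2], [4, 8], [6, 9]]
y = [-1, -1, 1, 1]
scores = fisher_scores(X, y)
best = max(range(len(scores)), key=scores.__getitem__)  # → 1
```

Keeping the top-k indices by score gives a reduced feature set without training any classifier, which makes it a useful baseline to compare against the boosting-based selection.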

Indeed interesting question, will follow in order to hear conclusions about it :)

StevenPuttemans ( 2013-04-25 08:42:47 -0500 )

Looking forward to the answers. I have exactly the same problem.

Pedro Batista ( 2013-05-02 12:55:36 -0500 )

Just thinking out loud: won't LDA or KL divergence help, rather than going for AdaBoost? Is there a special motivation behind going for AdaBoost?

Prasanna ( 2013-07-03 07:47:15 -0500 )