Feature selection using AdaBoost

I have implemented a feature extractor based on LBP (as the texture descriptor) and an SVM (for classification).

[I don't think it is really important, but I don't use OpenCV's LBPs; I use my own implementation, with uniform LBPs. For classification I use LIBSVM.]

The feature vector for each image is composed of 1,000 to 3,000 values, depending on the number of cells. I want to apply AdaBoost to select the most important features and then apply the SVM for classification (i.e., train the SVM on the boosted LBP features). The goal is to be able to extract more features without paying a penalty in the classification step.
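
For context, this is roughly the training step I have in mind (a minimal sketch, assuming OpenCV 2.4's legacy CvBoost API; the matrices `features` and `labels` are placeholders for my LBP vectors and class labels):

```cpp
#include <opencv2/core/core.hpp>
#include <opencv2/ml/ml.hpp>

// Train AdaBoost on the LBP vectors with depth-1 trees (stumps), so that
// each weak learner tests exactly one component of the feature vector.
// 'features' is CV_32F with one row per image; 'labels' is CV_32S with
// one class label per row.
void trainBoostSelector(CvBoost& boost,
                        const cv::Mat& features,
                        const cv::Mat& labels,
                        int numWeakLearners = 100)
{
    // Mark all input variables as ordered and the response as categorical.
    cv::Mat varType(features.cols + 1, 1, CV_8U, cv::Scalar(CV_VAR_ORDERED));
    varType.at<uchar>(features.cols) = CV_VAR_CATEGORICAL;

    CvBoostParams params(CvBoost::REAL,   // boosting type
                         numWeakLearners, // number of stumps
                         0.95,            // weight trim rate
                         1,               // max depth 1 -> decision stumps
                         false,           // no surrogate splits
                         0);              // no class priors

    boost.train(features, CV_ROW_SAMPLE, labels,
                cv::Mat(), cv::Mat(), varType, cv::Mat(), params);
}
```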

I think I should use CvBoost::get_weak_predictors() to obtain the most important features, but I don't know how. Maybe I'm wrong about how AdaBoost works.
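
Concretely, this is what I was going to try with get_weak_predictors() (again just a sketch, under the assumption that with stumps each weak tree's root split points at exactly one component of the feature vector):

```cpp
#include <set>
#include <opencv2/core/core_c.h>  // CvSeqReader, cvStartReadSeq
#include <opencv2/ml/ml.hpp>

// Collect the indices of the feature-vector components that the boosted
// stumps actually split on. With max depth 1 every weak tree has a single
// split at its root, so its var_idx should be the selected feature. Since
// no varIdx subset was passed at training time, var_idx should correspond
// directly to a column of the feature matrix.
std::set<int> selectedFeatureIndices(CvBoost& boost)
{
    std::set<int> selected;
    CvSeq* weak = boost.get_weak_predictors();
    if (!weak)
        return selected;

    CvSeqReader reader;
    cvStartReadSeq(weak, &reader);
    for (int i = 0; i < weak->total; ++i)
    {
        CvBoostTree* tree;
        CV_READ_SEQ_ELEM(tree, reader);  // sequence stores CvBoostTree pointers
        const CvDTreeNode* root = tree->get_root();
        if (root && root->split)
            selected.insert(root->split->var_idx);
    }
    return selected;
}
```

If that is not how the split indices are meant to be read (for example, if they get remapped somewhere), a correction would be very welcome.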

If anyone knows about this, please give me some tips. Thanks in advance.

NOTE: I have found a "similar" question here. Sorry if this is a duplicate.