2017-02-16 10:09:53 -0600 | commented question | Add weights to samples in the training set? @StevenPuttemans Thanks for your response. Not particularly. I was following the guidelines described to me to set up a base implementation for pedestrian detection. I explored the HOG + Linear SVM approach described by Dalal to set up an initial (weak) classifier. Only after implementing the classifier and getting initial results did I decide to use AdaBoost to further refine the result and reduce the number of false positives, unaware that an SVM is generally a poor choice for a weak classifier. The work I'm doing is part of my master's thesis, which explores and evaluates various features from RGB and depth images on the KITTI dataset. |
2017-02-16 06:57:33 -0600 | asked a question | Add weights to samples in the training set? I'd like to implement an ensemble of linear SVM classifiers using a custom implementation of AdaBoost. Is it possible to assign weights to individual samples in the training set before training a linear SVM on it? Note: I'm not referring to class weights used to correct imbalanced TP/TN training data, but to setting the importance of individual samples as the AdaBoost algorithm requires. If that's not possible, how can I simulate giving more importance to certain samples? Would it be enough to duplicate a sample x times in the training set? Should I add the sample's weight as an extra feature? Thanks in advance |
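One common workaround when a learner only accepts unweighted data is weighted bootstrap resampling: draw a new training set in which each sample's selection probability is proportional to its AdaBoost weight, which approximates the duplication idea mentioned in the question without requiring integer repetition counts. A minimal sketch, assuming NumPy; the function name `resample_by_weight` and all parameters are illustrative, not part of any library:

```python
import numpy as np

def resample_by_weight(samples, labels, weights, n_draws=None, seed=0):
    """Simulate per-sample importance for a learner that only accepts
    unweighted data: bootstrap-sample rows with probability proportional
    to their (e.g. AdaBoost) weights."""
    weights = np.asarray(weights, dtype=np.float64)
    p = weights / weights.sum()                 # normalise to a distribution
    n = len(samples) if n_draws is None else n_draws
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(samples), size=n, replace=True, p=p)
    return samples[idx], labels[idx]

# Toy example: 4 samples; the last one carries half the total weight,
# so it should dominate the resampled training set.
X = np.arange(8, dtype=np.float32).reshape(4, 2)
y = np.array([0, 0, 1, 1])
Xr, yr = resample_by_weight(X, y, weights=[0.1, 0.2, 0.2, 0.5], n_draws=1000)
```

The resampled `(Xr, yr)` can then be fed to any unweighted SVM trainer. Plain duplication (repeating each sample round(w * N) times) works too, but resampling handles fractional weights more gracefully; adding the weight as an extra feature is not equivalent, since it changes the feature space rather than the loss each sample contributes.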