Geronimo's profile - activity

2015-09-01 09:20:35 -0600 commented answer Where is adaboost/cvBoost for opencv 3.0?

Hey, does update work with the new cv::ml::Boost?

2015-09-01 09:17:06 -0600 received badge Supporter
2015-09-01 09:16:31 -0600 commented question Opencv boost update gives unhandled exception

I have the same issue and have never successfully run with update = true. A bug report is a good idea. Perhaps this problem is solved in 3.0? Does anybody know?

2015-03-06 02:07:23 -0600 received badge Self-Learner
2015-03-05 17:40:22 -0600 answered a question learning model size

I realized that with 1000 weak learners and a depth of 20, that's potentially 2^20 * 1000 ≈ 10^9 learning parameters, i.e. about a billion, which at roughly a byte per parameter is on the order of a gigabyte. So it turns out the learning model really does need all of that space just to store the trees.

To reduce the size I must lower the tree depth and/or the number of learners. For example, reducing the tree depth to 5 used only 21 MB (though it seemed to take around the same amount of time to build the learning model). Perhaps decreasing the weight trim rate would result in more trees being pruned before reaching depth 20 (and thus reduce memory size as well); I haven't tested this yet. A sketch of the reduced-depth setup is below.
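In case it helps anyone, here's roughly how I set the smaller depth with the OpenCV 2.x API. Treat it as a sketch: the data setup and the filename are placeholders, and you'd fill samples/responses with your real training data.

    #include <opencv2/core/core.hpp>
    #include <opencv2/ml/ml.hpp>

    int main()
    {
        // Placeholder training data; fill with your real samples.
        cv::Mat samples;    // CV_32F, one row per sample
        cv::Mat responses;  // one class label per row

        // CvBoostParams(boost_type, weak_count, weight_trim_rate,
        //               max_depth, use_surrogates, priors)
        CvBoostParams params(CvBoost::REAL,
                             1000,   // weak learners; lower to shrink further
                             0.95,   // weight trim rate
                             5,      // max depth: going 20 -> 5 took my
                                     // model from ~1 GB down to ~21 MB
                             false,  // no surrogate splits
                             0);     // no class priors

        CvBoost boost;
        boost.train(samples, CV_ROW_SAMPLE, responses,
                    cv::Mat(), cv::Mat(), cv::Mat(), cv::Mat(), params);

        boost.save("small_boost_model.xml");  // placeholder filename
        return 0;
    }

The saved file size tracks (number of trees) * (nodes per tree), so either knob works; depth just shrinks it exponentially rather than linearly.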

Case closed.

2015-02-24 15:23:57 -0600 received badge Student
2015-02-24 15:21:37 -0600 asked a question learning model size

I'm using OpenCV's CvBoost to classify. I've trained the classifier with several gigabytes of data and then saved it off. The model is an ensemble of 1000 weak learners (trees) with a depth of 20 (the default settings). Now I want to load it up to predict classes in real-time production code. However, the size of the learning model is HUGE (nearly a gigabyte). I believe this is because the save function stores all of the data used for learning so the training model can be properly updated later. However, I don't need this functionality at run time; I just want to use the fixed parameters (1000 weak learners, etc.), which shouldn't be much data.

Is there a way to save off and load just the weak learner parameters into CvBoost?

Does anyone have experience reducing the learning model data size with this or another OpenCV learning model? Note: CvBoost inherits from CvStatModel, which provides the save/load functions. A rough sketch of my save/load round trip is below.
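For context, this is roughly what my workflow looks like; the filename is a placeholder, and the training itself happens elsewhere:

    #include <opencv2/core/core.hpp>
    #include <opencv2/ml/ml.hpp>

    // save() is inherited from CvStatModel; the resulting XML/YAML
    // file is what ends up being nearly a gigabyte for my model.
    void saveModel(CvBoost& boost)
    {
        boost.save("boost_model.xml");  // placeholder filename
    }

    // load() is also inherited from CvStatModel. This is the part I
    // need at run time, where I only care about prediction.
    float predictFromDisk(const cv::Mat& sampleRow)
    {
        CvBoost loaded;
        loaded.load("boost_model.xml");
        return loaded.predict(sampleRow);
    }

Ideally the file written by save() would contain only what predict() needs, not everything required to resume training.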