learning model size

I'm using OpenCV's CvBoost to classify. I trained the classifier on several gigabytes of data and then saved the model. The model is an ensemble of 1000 weak learners (trees) with a depth of 20 (the default settings). Now I want to load it in real-time production code to predict classes. However, the saved learning model is huge (nearly a gigabyte). I believe this is because the save function writes out all of the data used during training so that the model can be properly updated later. I don't need that functionality at run time, though; I just want to use the fixed parameters (the 1000 weak learners, etc.), which shouldn't amount to much data.
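
For reference, this is roughly what I'm doing, using the OpenCV 2.x C++ API (the matrix sizes and random data here are synthetic stand-ins for my real training set):

```cpp
#include <opencv2/core/core.hpp>
#include <opencv2/ml/ml.hpp>

int main()
{
    // Synthetic stand-in for the real training data: N samples x D features.
    cv::Mat samples(1000, 16, CV_32F);
    cv::Mat responses(1000, 1, CV_32S);
    cv::randu(samples, 0.0, 1.0);
    cv::randu(responses, 0, 2);       // two class labels: 0 and 1

    CvBoostParams params;
    params.weak_count = 1000;         // 1000 weak learners
    params.max_depth  = 20;           // each weak tree grown up to depth 20

    CvBoost boost;
    boost.train(samples, CV_ROW_SAMPLE, responses,
                cv::Mat(), cv::Mat(), cv::Mat(), cv::Mat(), params);

    // This is the step that produces the huge file.
    boost.save("boost_model.xml");
    return 0;
}
```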

Is there a way to save just the weak learner parameters and load them back into a CvBoost for prediction?
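
For clarity, the run-time side only needs to do something like this (a sketch; `boost_model.xml` is the file saved above):

```cpp
#include <opencv2/core/core.hpp>
#include <opencv2/ml/ml.hpp>

int main()
{
    CvBoost boost;
    boost.load("boost_model.xml");    // load() is inherited from CvStatModel

    cv::Mat sample(1, 16, CV_32F);    // one feature vector, same width as training
    cv::randu(sample, 0.0, 1.0);

    float response = boost.predict(sample);
    return (int)response;
}
```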

Does anyone have experience reducing the learning model data size, with this or another OpenCV learning model? Note: CvBoost inherits from CvStatModel, which provides the save/load functions.
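
One avenue I've been looking at (untested, and it shrinks the ensemble itself rather than stripping stored training data) is CvBoost::prune, which removes a slice of the weak classifiers; the smaller model could then be re-saved:

```cpp
#include <opencv2/core/core.hpp>
#include <opencv2/ml/ml.hpp>

int main()
{
    CvBoost boost;
    boost.load("boost_model.xml");

    // Remove weak learners 500..999, keeping only the first 500.
    // This changes the classifier, so accuracy would need re-checking.
    boost.prune(cvSlice(500, 1000));

    boost.save("boost_model_pruned.xml");
    return 0;
}
```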