Training a classifier if feature size is greater than RAM size

Hello,

How can we train a classifier if the Mat containing the features is larger than the available RAM? Can the classifier be trained incrementally (for example, an SVM classifier)? Is it possible to update the classifier using the previously saved XML file?

The project is a forgery detection algorithm.

The feature matrix for one image is about [18157 × 300] floats ≈ 20 MB. For 500 images that comes to roughly 10 GB, while the RAM size is 6 GB. This leads to an out-of-memory error.
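The estimate above can be checked quickly (assuming 32-bit floats, i.e. 4 bytes per element, as with CV_32F):

```python
# rough memory estimate for the feature matrices
# assumption: 32-bit floats (4 bytes each), as with an OpenCV CV_32F Mat
per_image = 18157 * 300 * 4        # bytes for one image's feature matrix
total = per_image * 500            # all 500 images held at once

print(per_image / 2**20)           # ~20.8 MiB per image
print(total / 2**30)               # ~10.1 GiB total, well over 6 GiB of RAM
```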

Is there a way to train the classifier on features of this size without resorting to dimensionality-reduction methods such as PCA?
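For reference, one common out-of-core approach is stochastic gradient descent on the hinge loss, which trains a linear SVM while loading only one image's features into memory at a time. The sketch below is a minimal NumPy illustration of that idea, not OpenCV's SVM API (OpenCV's `cv::ml::SVM` trains in one batch); the batch generator and toy data are hypothetical stand-ins for per-image feature loading:

```python
import numpy as np

def train_incremental_svm(batches, n_features, lr=0.01, lam=1e-4):
    """Linear SVM trained by SGD on the hinge loss, one batch at a
    time, so only one image's features are ever held in RAM."""
    w = np.zeros(n_features)
    b = 0.0
    for X, y in batches:                      # labels y are in {-1, +1}
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1:         # sample violates the margin
                w = (1 - lr * lam) * w + lr * yi * xi
                b += lr * yi
            else:                             # only shrink w (regularizer)
                w = (1 - lr * lam) * w
    return w, b

# toy usage: two synthetic "images" of linearly separable features
rng = np.random.default_rng(0)
def make_batch():
    X = rng.normal(size=(200, 10))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
    return X, y

w, b = train_incremental_svm((make_batch() for _ in range(2)), n_features=10)
X_test, y_test = make_batch()
accuracy = np.mean(np.sign(X_test @ w + b) == y_test)
```

In practice the per-image features would be read from disk inside the generator instead of being synthesized, so peak memory stays at roughly one image's 20 MB rather than the full 10 GB.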