LBPHFaceRecognizer model
Hello everyone,
I am trying to train an LBPHFaceRecognizer model with photos of a lot of persons. The problem is that at some point I get an OutOfMemory error. I looked at the model after saving it to a file, and it seems to store the histogram of every photo; I think this is the problem. From my point of view it should save a representative set of features per person (if I have 50 photos with the same label, it should save 1 representative histogram, maybe the centroid of the histogram cluster, not 50). Maybe I am wrong, but if there is someone who knows better what happens in the training and predict phases of this algorithm, or someone who has used it on a big dataset, please reply.
Thank you, Bogdan
this is correct (and this is all that happens in the training phase; the prediction is just a 1-nearest-neighbour search over that)
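to make the "store every histogram, then 1-nearest-neighbour" behaviour concrete, here is a pure-numpy toy sketch of what the recognizer does internally (the 8x8 grid, 256-bin histograms and chi-square distance are OpenCV's defaults; the class and function names here are made up for illustration, not part of the cv2 API):

```python
import numpy as np

def chi_square(h1, h2, eps=1e-10):
    # LBPH prediction compares histograms with the chi-square distance
    return float(np.sum((h1 - h2) ** 2 / (h1 + h2 + eps)))

class TinyLBPH:
    """Toy model of what LBPHFaceRecognizer stores: one histogram per IMAGE,
    never a merged per-person representation."""
    def __init__(self):
        self.hists = []
        self.labels = []

    def train(self, hists, labels):
        # training is just appending; nothing is summarized or clustered
        self.hists.extend(hists)
        self.labels.extend(labels)

    def predict(self, hist):
        # plain linear 1-nearest-neighbour search over every stored histogram
        dists = [chi_square(hist, h) for h in self.hists]
        best = int(np.argmin(dists))
        return self.labels[best], dists[best]

# two fake "persons", three synthetic histograms each
# (8x8 grid * 256 bins = 16384 values per histogram)
rng = np.random.default_rng(0)
model = TinyLBPH()
for label in (0, 1):
    base = rng.random(16384)
    model.train([base + rng.normal(0, 0.01, 16384) for _ in range(3)], [label] * 3)
    if label == 0:
        probe = base + rng.normal(0, 0.01, 16384)

label, dist = model.predict(probe)
```

this also shows why memory grows with the number of training images, not the number of persons.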
i don't think this is feasible (but please try and report back!)
how many? again, since it's a linear search (and not building a "global model"), you could split it up into several instances
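because the search is linear and there is no global model, splitting is safe: run the same 1-NN search per chunk (each chunk could be its own recognizer instance or process) and keep the globally smallest distance. a sketch of the merge logic (pure numpy, using an L1 distance as a stand-in for chi-square; not tied to the cv2 API):

```python
import numpy as np

def nn_search(hists, labels, probe):
    # the same 1-NN search LBPH does, restricted to one chunk of the data
    d = np.abs(np.asarray(hists) - probe).sum(axis=1)
    i = int(np.argmin(d))
    return labels[i], float(d[i])

def predict_over_chunks(chunks, probe):
    """Search each chunk independently, then keep the smallest distance.
    Equivalent to one big search over the concatenated data."""
    results = [nn_search(h, l, probe) for h, l in chunks]
    return min(results, key=lambda r: r[1])

rng = np.random.default_rng(2)
chunk_a = (rng.random((10, 128)), list(range(10)))
chunk_b = (rng.random((10, 128)) + 5.0, list(range(10, 20)))
probe = chunk_a[0][3] + 0.001  # very close to sample 3 of chunk a
label, dist = predict_over_chunks([chunk_a, chunk_b], probe)
```

the min-distance merge gives exactly the same answer as a single search over all the data, so only the peak memory per instance changes.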
Thank you for the reply. I am trying to train the model on a dataset that contains about 1500 persons and something like 50-100 photos per person. I guess using so many photos with this LBPHFaceRecognizer is not the best approach.
well you certainly need a few. maybe you should do some cross-fold validation to find out the minimum number needed.
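one way to run that validation: hold out a few images per person, then sweep the number of kept training images per person and watch when accuracy starts to drop. a toy sketch with synthetic feature vectors (pure numpy 1-NN, all names invented; real LBP histograms would replace the random vectors):

```python
import numpy as np

def accuracy_with_k_images(train, test, k):
    """1-NN accuracy when only the first k training images per person are kept."""
    kept_h, kept_l = [], []
    for label, hists in train.items():
        kept_h.extend(hists[:k])
        kept_l.extend([label] * k)
    kept_h = np.asarray(kept_h)
    correct = 0
    for label, hists in test.items():
        for h in hists:
            d = np.abs(kept_h - h).sum(axis=1)  # L1 stand-in for chi-square
            correct += kept_l[int(np.argmin(d))] == label
    return correct / sum(len(v) for v in test.values())

# 5 synthetic "persons": well-separated centers plus per-image noise
rng = np.random.default_rng(3)
persons = {l: rng.random(64) + 3 * l for l in range(5)}
train = {l: [c + rng.normal(0, 0.05, 64) for _ in range(20)] for l, c in persons.items()}
test = {l: [c + rng.normal(0, 0.05, 64) for _ in range(5)] for l, c in persons.items()}

# sweep k to find the smallest per-person count that still holds up
scores = {k: accuracy_with_k_images(train, test, k) for k in (1, 5, 10, 20)}
```

on real face data the curve would degrade as k shrinks; the knee of that curve is the minimum number of photos worth storing.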
maths time: 8x8 grid cells x 256 bins x 4 bytes = 64 x 256 x 4 = 64 kb per image. for 1500 persons and 50 imgs each, that's 75,000 x 64 kb, so we're at roughly 5 gb of mem, that's quite a lot.
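the same estimate spelled out as a quick arithmetic check (the 64 cells come from LBPH's default 8x8 grid_x/grid_y; float32 bins assumed):

```python
# back-of-the-envelope memory estimate for the stored histograms
GRID_CELLS = 8 * 8  # default 8x8 spatial grid
BINS = 256          # histogram bins per cell
BYTES = 4           # float32 per bin

per_image = GRID_CELLS * BINS * BYTES  # bytes for one stored histogram
per_image_kb = per_image / 1024        # -> 64 KB per image

total = 1500 * 50 * per_image          # 1500 persons, 50 images each
total_gb = total / 1e9                 # ~4.9 GB, i.e. the ~5 GB figure above
```

with 100 photos per person the total doubles to roughly 10 GB, which is consistent with the OutOfMemory error in the original question.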
Try training a neural network (CNN) instead.