Possible bug in traincascade's consumption of negative samples. [closed]

asked 2014-06-24 18:56:11 -0600

CvCascadeClassifier::train() calls updateTrainingSet(), which returns the leaf false alarm rate (tempLeafFARate). This is then compared to the required leaf false alarm rate (requiredLeafFARate). updateTrainingSet() calls fillPassedSamples() twice: once for positives and once for negatives. fillPassedSamples() returns only after it has collected count samples that pass the existing stages of the cascade.

Suppose that the existing stages of the classifier are perfect, i.e. they reject every negative window. Won't traincascade then either hang forever or exhaust its negative samples and fail? Shouldn't there be a check that, even when no negative samples passing the existing stages can be found, recognizes that the required leaf false alarm rate has already been satisfied and stops collecting?


Closed for the following reason: "question is not relevant or outdated" by sturkmen
close date 2020-11-18 04:44:36.203983