Does BackgroundSubtractorKNN respect a learning rate of 0 correctly?

I'm getting a strange result: it looks like the BackgroundSubtractorKNN algorithm may still be learning even when the learning rate parameter passed to the apply() function is set to zero.

Looking at the source code for apply(), I see:

    learningRate = learningRate >= 0 && nframes > 1 ? learningRate : 1./std::min( 2*nframes, history );
    Kshort=(int)(log(0.7)/log(1-learningRate))+1;
    Kmid=(int)(log(0.4)/log(1-learningRate))-Kshort+1;
    Klong=(int)(log(0.1)/log(1-learningRate))-Kshort-Kmid+1;

This strikes me as odd: when learningRate is 0 (and nframes > 1, so the ternary leaves it unchanged), the denominator log(1 - learningRate) = log(1) = 0, so a division by zero occurs in the Kshort/Kmid/Klong calculations. Can anyone confirm or correct me?