there is a (c++ specific) trap here:
Mat result = net.forward();
result is only a shallow copy of the net's last output layer, so if you call your eval() function twice, both results will point to the very same data: that of the last evaluation!
the remedy is to clone() the result, like:
Mat result = net.forward().clone(); // "deep" copy !
(and yeah, i've been bitten by that before, too. painful learning curve, there ..)