DNN onnx model with variable batch size
Hi, if I have a Caffe model with an input and output batch size of 1 and I pass it a blob containing multiple images (batch_size > 1), e.g.
batch_size = 2
blob = cv.dnn.blobFromImages([img_normalized]*batch_size, size=(224,224))
net.setInput(blob)
net.forward()
then I get a result for both images.
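For reference, this is how I read the batched result back per image (assuming a classification model, so the output has one row of class scores per input image):
out = net.forward()              # shape: (batch_size, num_classes)
for i in range(batch_size):
    class_id = out[i].argmax()   # prediction for the i-th image
    print(i, class_id, out[i][class_id])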
If I use an ONNX model with an input and output batch size of 1, exported from PyTorch as
model.eval()
dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy_input, onnx_name,
                  do_constant_folding=True,
                  input_names=['input'],    # the model's input names
                  output_names=['output'])  # the model's output names
and pass a single image as
blob = cv.dnn.blobFromImage(img_normalized, size=(224,224))
net.setInput(blob)
net.forward()
then I again get the correct result. If, however, I pass more than one image as above, I get the following error:
error: OpenCV(4.2.0-dev) \modules\dnn\src\layers\reshape_layer.cpp:113: error: (-215:Assertion failed) total(srcShape, srcRange.start, srcRange.end) == maskTotal in function 'cv::dnn::computeShapeByReshapeMask'
because I have changed the batch size.
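From what I understand, this assertion usually points at a Reshape node whose target shape was baked in at export time, e.g. a flatten that hard-codes the batch dimension. A minimal sketch of what I mean (a hypothetical model, not my actual code):
import torch

class Net(torch.nn.Module):               # hypothetical model, just to illustrate the flatten
    def forward(self, x):
        # x.view(1, -1) would bake batch size 1 into the exported Reshape node;
        # deriving the batch dimension from the input keeps it batch-agnostic
        return x.view(x.size(0), -1)      # or: torch.flatten(x, 1)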
I have tried to export the ONNX model with a dynamic batch size:
torch.onnx.export(model, dummy_input, onnx_name,
                  do_constant_folding=True,
                  input_names=['input'],    # the model's input names
                  output_names=['output'],  # the model's output names
                  dynamic_axes={'input': {0: 'batch_size'},     # variable length axes
                                'output': {0: 'batch_size'}})
but the model fails to import:
net = cv.dnn_ClassificationModel(onnx_name)
error: OpenCV(4.2.0-dev) \modules\dnn\src\layers\reshape_layer.cpp:149: error: (-215:Assertion failed) dstTotal != 0 in function 'cv::dnn::computeShapeByReshapeMask'
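To check whether the dynamic axis actually made it into the exported file, the graph input can be inspected with the onnx package (a quick sketch, assuming onnx is installed and onnx_name is the path used above):
import onnx

m = onnx.load(onnx_name)
onnx.checker.check_model(m)
# the batch dimension should appear as a named parameter ('batch_size'), not a fixed 1
print(m.graph.input[0].type.tensor_type.shape)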
What am I doing wrong, and how can I use an ONNX model with a dynamic batch size?