
DNN ONNX model with variable batch size

asked 2020-01-07 04:22:43 -0600

updated 2020-01-07 04:24:59 -0600

Hi, if I have a Caffe model with an input and output batch size of 1 and I pass it a blob containing multiple images (batch_size > 1), e.g.

batch_size = 2
blob = cv.dnn.blobFromImages([img_normalized] * batch_size, size=(224, 224))
net.setInput(blob)
net.forward()

then I get a result for both images.

If I use an ONNX model with an input and output batch size of 1, exported from PyTorch as

model.eval()
dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(model, dummy_input, onnx_name,
                  do_constant_folding=True, 
                  input_names = ['input'],   # the model's input names
                  output_names = ['output'])

and pass a single image as

blob = cv.dnn.blobFromImage(img_normalized, size=(224, 224))
net.setInput(blob)
net.forward()

then I again get the correct result. If however I pass more than one image as above then I get the following error

error: OpenCV(4.2.0-dev) \modules\dnn\src\layers\reshape_layer.cpp:113: error: (-215:Assertion failed) total(srcShape, srcRange.start, srcRange.end) == maskTotal in function 'cv::dnn::computeShapeByReshapeMask'

because I have changed the batch size.

I have tried to export the ONNX model with a dynamic batch size

torch.onnx.export(model, dummy_input, onnx_name,
                  do_constant_folding=True,
                  input_names=['input'],    # the model's input names
                  output_names=['output'],
                  dynamic_axes={'input' : {0 : 'batch_size'},    # variable length axes
                                'output' : {0 : 'batch_size'}})

but the model fails to import

net = cv.dnn_ClassificationModel(onnx_name)

error: OpenCV(4.2.0-dev) \modules\dnn\src\layers\reshape_layer.cpp:149: error: (-215:Assertion failed) dstTotal != 0 in function 'cv::dnn::computeShapeByReshapeMask'

What am I doing wrong, and how can I use an ONNX model with a dynamic batch size?


1 answer


answered 2020-05-26 01:58:58 -0600

Asmita Khaneja

updated 2020-07-10 08:16:06 -0600

Hi,

As per the example given in torch documentation https://pytorch.org/tutorials/advance...

Your code should be:

batch_size = 1  # random initialization

dummy_input = torch.randn(batch_size, 3, 224, 224)
dynamic_axes = {'input' : {0 : 'batch_size'},
                'output' : {0 : 'batch_size'}}
torch.onnx.export(model, dummy_input, onnx_name,
                  do_constant_folding=True,
                  input_names=['input'],    # the model's input names
                  output_names=['output'],
                  dynamic_axes=dynamic_axes)

Comments

From memory I am sure that is what I would have done, I just didn't include the line

dummy_input = torch.randn(batch_size, 3, 224, 224)

in the question. Can you confirm that you can successfully import an ONNX model with a dynamic batch size and run forward on it?

cudawarped (2020-05-26 11:48:42 -0600)

Yes, you can successfully export an ONNX model with a dynamic batch size. I have achieved the same in my case.

Asmita Khaneja (2020-07-10 08:14:48 -0600)
