Batch inference not working with InferenceEngine as backend

Hi,

I am using the OpenCV DNN module for NN inference. Inference on a single image (batch size of 1) works fine, but when I try to put more than one image in an inference batch, it fails with the following error:

what(): OpenCV(4.0.1-openvino) /home/jenkins/workspace/OpenCV/OpenVINO/build/opencv/modules/dnn/src/op_inf_engine.cpp:553: error: (-215:Assertion failed) Failed to initialize Inference Engine backend: Input blob size is not equal network input size (1350000!=270000). in function 'initPlugin'

I am using blobFromImages instead of blobFromImage. The blob contains 'batch_size' images, but inference just doesn't work.
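For reference, the numbers in the error message line up with a batched blob: 1350000 is exactly 5 × 270000, so the blob appears to hold five images while the backend was initialized for a batch of one. Below is a minimal sketch of the shapes involved, assuming a hypothetical 3-channel 300×300 network input (3 × 300 × 300 = 270000; the real model's dimensions may differ). The numpy code only mimics the NCHW layout that blobFromImages produces:

```python
import numpy as np

# Hypothetical stand-in for cv2.dnn.blobFromImages: five 300x300 BGR images
# stacked into a single NCHW blob (batch, channels, height, width).
images = [np.zeros((300, 300, 3), dtype=np.float32) for _ in range(5)]

# HWC -> CHW per image, then stack along a new batch axis.
blob = np.stack([img.transpose(2, 0, 1) for img in images])

print(blob.shape)            # (5, 3, 300, 300)
print(blob.size)             # 1350000 -- the size reported in the error
print(1 * 3 * 300 * 300)     # 270000  -- what a batch-1 network expects
```

If this reading is right, the mismatch is not in the blob itself but in the backend still expecting batch size 1 when the network input was configured.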

I know the OpenVINO API has a SetBatch() method.

Should I switch to the OpenVINO API directly for inference instead?

Thanks for your time.