dnn resize_layer assertion in 'getMemoryShapes'
Hello! I was testing a model with the OpenCV DNN module, with the aim of implementing the whole algorithm from https://github.com/wywu/LAB, but I get the following error:
opencv-master/modules/dnn/src/layers/resize_layer.cpp:200: error: (-215:Assertion failed) inputs.size() == 1 in function 'getMemoryShapes'
I can read the network and print its details, but it crashes during the forward pass. Has anyone encountered something similar?
Thanks and greetz!
Hi @aguila, can you be a bit more specific about what you are doing here? There must be a prebuilt model, code, etc.
Hi @berak, I downloaded the prebuilt model from the project page https://wywu.github.io/projects/LAB/s... and there is a Caffe implementation at https://github.com/wywu/LAB/blob/mast... I based my code on that one and on https://github.com/HandsomeHans/Easy-LAB.
I started something like this:
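Roughly the standard dnn flow below (a minimal sketch only; the file names, scale factor, mean and channel order are placeholders, not necessarily the exact preprocessing from the repo):

    import cv2

    # placeholder file names -- the deploy prototxt / caffemodel downloaded from the LAB repo
    net = cv2.dnn.readNetFromCaffe("lab_deploy.prototxt", "lab.caffemodel")

    img = cv2.imread("face.jpg")
    # build a 256x256 input blob; the scale factor, mean and channel order are guesses
    blob = cv2.dnn.blobFromImage(img, scalefactor=1.0 / 255, size=(256, 256),
                                 mean=(0, 0, 0), swapRB=False, crop=False)

    net.setInput(blob)
    out = net.forward()  # <-- this is where the getMemoryShapes assertion is raised
    print(out.shape)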
The author uses a customized Interp layer (see https://github.com/wywu/LAB/blob/mast...). It resizes the first input blob to the size of the second one. If your model always works with a 1x1x256x256 input, you can easily compute the destination shape and use the original Interp layer introduced in https://github.com/cdmh/deeplab-public.
Hello @dkurt, thank you for your help. Sorry, I did not get to check the two different layers during the past week. I am now looking at the differences between the Interp layer in OpenCV (the one from deeplab) and the author's custom layer.
In my case, I will always have the same input size, 256x256.
So, if I understand correctly, I should explicitly give the output size as a parameter of the Interp layer, no? Something like this:
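A rough sketch of what I mean (the layer and blob names and the 64x64 size are just placeholders, not the real values from the author's prototxt; interp_param is the parameter from the deeplab caffe.proto):

    layer {
      name: "upsample"          # placeholder name
      type: "Interp"
      bottom: "low_res_blob"    # single bottom: the blob to be resized
      top: "upsampled_blob"     # the second bottom (the size reference) is dropped
      interp_param {
        height: 64              # fixed output size; since the network input is
        width: 64               # always 256x256, it can be computed once
      }
    }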
@aguila, Try this:
OK, I will try it later today and post my results here. Thanks!