I need to run inference on a TensorFlow model in C++ with OpenCV 4.0.1.
My model is a custom, proprietary Faster-RCNN model (so I can't paste its code here). It's a large model developed from scratch, but it still does object detection and is based on the Faster-RCNN architecture. It's not very recent: it was developed with TensorFlow 1.8.
I found several pieces of documentation about loading a TF model with the OpenCV DNN module:
https://github.com/opencv/opencv/wiki/Deep-Learning-in-OpenCV says that "The provided API (for C++ and Python) is very easy to use, just load the network and run it. Multiple inputs/outputs are supported"
https://github.com/opencv/opencv/wiki/TensorFlow-Object-Detection-API shows how to do it.
I first tried to load and run a model from the documentation (Faster-RCNN ResNet-50), and it works well with the corresponding .pbtxt file. So there is no problem with my OpenCV install or my C++ code, which is inspired by that example (a minimal version is sketched below).
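For reference, here is roughly what my working code does, following the wiki sample. The file names, input size, and confidence threshold are just what I use locally, not anything prescribed by the documentation:

```cpp
#include <opencv2/dnn.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/imgproc.hpp>
#include <iostream>

int main() {
    // Frozen graph + text graph, as in the wiki example (paths are mine).
    cv::dnn::Net net = cv::dnn::readNetFromTensorflow(
        "frozen_inference_graph.pb",
        "faster_rcnn_resnet50_coco_2018_01_28.pbtxt");

    cv::Mat img = cv::imread("input.jpg");

    // Preprocessing as in the wiki sample: no scaling or mean subtraction,
    // BGR->RGB swap, fixed input size (adjust for the model).
    cv::Mat blob = cv::dnn::blobFromImage(img, 1.0, cv::Size(800, 600),
                                          cv::Scalar(), /*swapRB=*/true,
                                          /*crop=*/false);
    net.setInput(blob);
    cv::Mat out = net.forward();

    // Detection output is a [1, 1, N, 7] tensor:
    // [batchId, classId, score, x1, y1, x2, y2], coordinates in [0, 1].
    cv::Mat detections(out.size[2], out.size[3], CV_32F, out.ptr<float>());
    for (int i = 0; i < detections.rows; ++i) {
        float score = detections.at<float>(i, 2);
        if (score < 0.5f) continue;
        cv::Point p1(static_cast<int>(detections.at<float>(i, 3) * img.cols),
                     static_cast<int>(detections.at<float>(i, 4) * img.rows));
        cv::Point p2(static_cast<int>(detections.at<float>(i, 5) * img.cols),
                     static_cast<int>(detections.at<float>(i, 6) * img.rows));
        cv::rectangle(img, p1, p2, cv::Scalar(0, 255, 0), 2);
        std::cout << "class " << detections.at<float>(i, 1)
                  << " score " << score << std::endl;
    }
    cv::imwrite("output.jpg", img);
    return 0;
}
```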
With my own model, the first problem appears: I only have a .pb file. The code used for training doesn't generate a .pbtxt, and modifying it is not an option for the moment. According to the OpenCV documentation, readNetFromTensorflow() can load a model without the .pbtxt file, since that is an optional parameter; it seems the .pbtxt only helps OpenCV load the graph. With only my .pb, I get this error:
```
terminate called after throwing an instance of 'cv::Exception'
  what():  OpenCV(4.0.1) /opencv-4.0.1/modules/dnn/src/tensorflow/tf_importer.cpp:1377: error: (-215:Assertion failed) scaleMat.type() == CV_32FC1 in function 'populateNet'
```
I found this error reported here, with no answer.
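The call that triggers the assertion is simply the single-argument overload (the model file name here is a placeholder for my own .pb):

```cpp
#include <opencv2/dnn.hpp>

int main() {
    // Only the frozen graph, no .pbtxt: readNetFromTensorflow() accepts this,
    // but for my model it throws the assertion in populateNet() shown above.
    cv::dnn::Net net = cv::dnn::readNetFromTensorflow("my_custom_faster_rcnn.pb");
    return 0;
}
```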
So maybe the solution is to create a .pbtxt. The documentation provides a script to generate one for Faster-RCNN models, but it needs a configuration file. It says exactly: "Pass a configuration file which was used for training to help script determine hyper-parameters." I have no such file, because my model is not from the TF model zoo but a custom model created from scratch. Should I create a config file like these in order to use the script and generate a .pbtxt for my model?
TL;DR: the documentation doesn't specify whether the dnn module can only load models from the TF model zoo, as is always the case in the examples. It does say "You can build your own model", but the documentation that follows doesn't match: the script that generates the .pbtxt needs a config file, which is something from the TF model zoo.
Can anyone explain what I'm not understanding?