Compile the dnn module against the system version of opencv

asked 2018-09-26 08:52:30 -0600

Mehdi

I am using Ubuntu Bionic (18.04), which comes with opencv 3.2.0. Unfortunately this version doesn't contain the dnn module that I need for importing tensorflow models and running inference (supported in opencv 3.4.1). I am tied to this opencv version and can't install a newer one, nor can I install tensorflow. So my only option is to compile dnn separately.

I downloaded the source of the latest opencv and tried to find a way to compile only the dnn module while linking it against the opencv core and imgproc libraries that are already installed (from opencv 3.2.0), but I ran into a lot of CMake magic that I haven't been able to understand so far.

Could anyone give me some hints on how to do that? And is it possible at all to compile dnn from 3.4.1 and link it to opencv 3.2.0?


Comments

please don't try to mix versions.

if you can't (or don't want to) get rid of the preinstalled 3.2.0 version, installing the recent one to a custom location (e.g. somewhere in your home folder) would be a clean way to do it. (CMAKE_INSTALL_PREFIX=XXXXX)

(also, we're at 3.4.3 already)

berak ( 2018-09-26 08:59:28 -0600 )
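A minimal sketch of the custom-prefix approach suggested above, assuming a checkout of the 3.4.x sources under `~/src/opencv` (the paths and version are placeholders, adjust to your setup):

```shell
# Build a recent OpenCV into a private prefix instead of /usr,
# leaving the system 3.2.0 packages untouched.
cd ~/src/opencv                 # assumed location of the 3.4.x source checkout
mkdir -p build && cd build
cmake -DCMAKE_INSTALL_PREFIX="$HOME/opencv-3.4" \
      -DCMAKE_BUILD_TYPE=Release ..
make -j"$(nproc)"
make install

# Point your own project at the private install at configure time:
# cmake -DOpenCV_DIR="$HOME/opencv-3.4/share/OpenCV" <your-project-dir>
```

Nothing here touches the Debian-managed 3.2.0 packages, which is the whole point: the two installs coexist, and only projects configured with `OpenCV_DIR` see the newer one.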

Due to strict controls on the software on the target product, as well as on the deployment pipeline, this is unfortunately not feasible. The only possible way is to compile dnn against 3.2.0 and make a Debian package out of it. A more recent opencv will probably ship with Ubuntu 20.04, around May 2020; I don't feel like waiting until then.

Mehdi ( 2018-09-26 09:08:21 -0600 )

And as for my goal, it is to do forward inference in order to segment an image, without having to install tensorflow. The only two other options are ahead-of-time compilation (tfcompile), which didn't work for complex models, or running inference in the cloud (which I do now, but it is internet dependent). I think it is awesome that the opencv community came up with its own model parser that can run forward inference on any model, but the lag of the Debian community in packaging the newest versions is a bit unfortunate.

Mehdi ( 2018-09-26 09:10:57 -0600 )
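For context, the forward-inference workflow described above would look roughly like this with the dnn module (>= 3.4.1). The model and config file names are placeholders, not files from this thread:

```python
# Hypothetical file names -- replace with your own frozen TensorFlow graph
# and the matching .pbtxt config.
MODEL = "frozen_inference_graph.pb"
CONFIG = "graph.pbtxt"


def segment(image_path, model=MODEL, config=CONFIG):
    """Run forward inference on one image with OpenCV's dnn module."""
    import cv2  # requires an OpenCV build that includes dnn (>= 3.4.1)

    # Parse the TensorFlow graph without needing tensorflow installed.
    net = cv2.dnn.readNetFromTensorflow(model, config)

    # Preprocess: resize to the network's input size and swap BGR -> RGB.
    img = cv2.imread(image_path)
    blob = cv2.dnn.blobFromImage(img, size=(300, 300), swapRB=True, crop=False)

    net.setInput(blob)
    return net.forward()  # raw network output, e.g. per-pixel class scores
```

The key point is that `readNetFromTensorflow` is opencv's own parser for frozen graphs, which is why no tensorflow installation is needed at inference time.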

no, again, waste of time. it'll never work.

get VirtualBox, and build the later version in a VM.

berak ( 2018-09-26 09:11:30 -0600 )

", without having to install tensorflow. " -- did you know, that you can use opencv & tensorflow from google's http://colab.research.google.com/ (even with gpu support !) ?

berak ( 2018-09-26 09:13:08 -0600 )

I don't see any great advantage of colab compared to an AWS EC2 instance with a dedicated Tesla K80, am I missing something?

Mehdi ( 2018-09-26 09:54:09 -0600 )

it's for free, that's all there is to it. (but i'm probably only trolling you now...)

berak ( 2018-09-26 10:13:26 -0600 )

The trolling was already clear with the virtual machine suggestion.

Mehdi ( 2018-09-26 10:30:07 -0600 )

Please provide a source for "importing tensorflow models and running inference (supported in opencv 3.4.1)".

dkurt ( 2018-09-26 11:29:11 -0600 )