DNN performance on mobile platforms

asked 2017-08-22 06:39:32 -0500

Jaykob

updated 2017-08-22 06:40:01 -0500

Hi, I'm in the process of deciding whether to run my TensorFlow model on iOS and Android using OpenCV's dnn module or directly with TensorFlow. The advantage of using OpenCV would be that I don't need an additional library (TensorFlow), since I'm using OpenCV anyway. The disadvantage is that my model doesn't run out of the box due to some not-yet-supported layer types, but I think that could be solved somehow.

The main criterion, though, would be performance. I've read that the dnn module got significantly faster with the 3.3.0 release, but I couldn't find any comparison against TensorFlow itself. Does anybody have experience with both methods and could give me some hints?


1 answer

answered 2017-08-22 06:57:28 -0500

dkurtaev

Hi, @Jaykob! Unfortunately, I haven't experimented with DNN on mobile platforms, but I can tell you that TensorFlow (with the Eigen computational backend by default) is definitely faster on CPU. For example, Inception-5h takes 17.9 ms in TF versus 19.58 ms in DNN. On the other hand, you might want to run your model on the GPU (using OpenCL, if possible) to save power, for instance. DNN is going to have some OpenCL backends (libdnn and Halide). I don't know whether TensorFlow supports OpenCL. We'd be interested to know which layers you still need, so we can extend the TensorFlow importer as quickly as possible. Please keep us posted on your decision. Maybe it will be the first time I run a model on a mobile device, to help you =)
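A minimal sketch of how a CPU timing comparison like the one above can be reproduced with OpenCV's dnn module (OpenCV >= 3.3). The file name `"graph.pb"` and the 224x224 input size are placeholders; substitute your own frozen TensorFlow graph and its expected input shape. The `time_forward` helper is generic and can be pointed at a TensorFlow session's run call the same way for a side-by-side number.

```python
# Sketch: timing a forward pass of a frozen TensorFlow graph loaded via
# OpenCV's dnn module. "graph.pb" and the input size are placeholders.
import time


def time_forward(forward, runs=10):
    """Return the median wall-clock time (in seconds) of forward() over `runs` calls."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        forward()
        samples.append(time.perf_counter() - start)
    return sorted(samples)[len(samples) // 2]


try:
    import cv2  # requires OpenCV built with the dnn module (>= 3.3)
    import numpy as np
except ImportError:
    cv2 = None

if cv2 is not None:
    # Load the frozen graph and prepare a dummy 224x224 RGB input blob;
    # adjust the size to whatever your model actually expects.
    net = cv2.dnn.readNetFromTensorflow("graph.pb")
    blob = cv2.dnn.blobFromImage(np.zeros((224, 224, 3), np.uint8))
    net.setInput(blob)
    print("median forward time: %.2f ms" % (1000 * time_forward(net.forward)))
```

Median over several runs is used rather than a single measurement, since the first forward pass typically includes one-time allocation costs.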
