
How can opencv_traincascade use the GPU?

asked 2017-12-05 23:58:24 -0600

Derick

I have tried to train my classifier with opencv_traincascade on my machine. When I run it, it uses 100% of my CPU but does not use my GPU. I have installed OpenCV 3.x on Ubuntu 16.04, and I have a GeForce GTX 1080 Ti/PCIe/SSE2. I successfully installed the driver with CUDA 8.0. How can I use the GPU instead of the CPU? I run the command below in a terminal to train the model:

opencv_traincascade -data data -vec positives.vec -bg bg.txt -numPos 37000 -numNeg 756 -numStages 20 -w 20 -h 20

Is there any configuration I need to set to use the GPU?


Comments


If you have a GTX 1080 Ti, then skip the idea of a cascade classifier altogether and go for deep learning. Tons of efficient deep learning frameworks for object detection are freely available.

StevenPuttemans (2017-12-06 04:27:07 -0600)
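For illustration, here is a minimal sketch of that route using OpenCV's own dnn module, assuming a pre-trained MobileNet-SSD Caffe model (the file names below are placeholders). Note that in OpenCV 3.x the dnn module runs inference on the CPU; GPU-accelerated training and inference would happen in the deep learning framework itself.

    #include <opencv2/dnn.hpp>
    #include <opencv2/imgcodecs.hpp>

    int main()
    {
        // load a pre-trained Caffe detector (file names are placeholders)
        cv::dnn::Net net = cv::dnn::readNetFromCaffe("MobileNetSSD_deploy.prototxt",
                                                     "MobileNetSSD_deploy.caffemodel");

        // preprocess into the 300x300 blob MobileNet-SSD expects
        cv::Mat img = cv::imread("test.jpg");
        cv::Mat blob = cv::dnn::blobFromImage(img, 0.007843, cv::Size(300, 300),
                                              cv::Scalar(127.5, 127.5, 127.5));
        net.setInput(blob);

        // detections come back as a 1x1xNx7 blob:
        // [image_id, class_id, confidence, x1, y1, x2, y2] per row
        cv::Mat detections = net.forward();
        return 0;
    }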

1 answer


answered 2017-12-06 02:11:37 -0600

berak

Sadly, there is no GPU code for cascade training at all (it's not a configuration problem).

Also, while there is CUDA-optimized detection code, you'd have to use the "old format" style cascades, generated by the opencv_haartraining tool (which only exists in the outdated 2.4 branch).
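For reference, a minimal sketch of that CUDA detection path in OpenCV 3.x (detection only, not training). It assumes a cascade file in the old haartraining format (e.g. one of the stock haarcascade_*.xml files), an OpenCV build with the cudaobjdetect module, and placeholder file names.

    #include <opencv2/core.hpp>
    #include <opencv2/imgcodecs.hpp>
    #include <opencv2/cudaobjdetect.hpp>
    #include <vector>

    int main()
    {
        // load a grayscale image and upload it to the GPU
        cv::Mat img = cv::imread("test.jpg", cv::IMREAD_GRAYSCALE);
        cv::cuda::GpuMat d_img(img);

        // the CUDA classifier only loads old-format cascades
        cv::Ptr<cv::cuda::CascadeClassifier> cascade =
            cv::cuda::CascadeClassifier::create("haarcascade_frontalface_alt.xml");

        // detect on the GPU, then convert the result buffer to rectangles
        cv::cuda::GpuMat d_found;
        cascade->detectMultiScale(d_img, d_found);

        std::vector<cv::Rect> found;
        cascade->convert(d_found, found);
        return 0;
    }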


Comments


And even then, it seems that recent 2.4 branches have issues with the old models, so better to skip the idea altogether :D

StevenPuttemans (2017-12-06 04:26:18 -0600)

Even if I turn on CUDA when building OpenCV 3.3, it still won't use the GPU?

Derick (2017-12-06 18:32:13 -0600)

No way for the cascade training.

berak (2017-12-06 19:46:18 -0600)
