2017-09-29 11:42:34 -0500 | received badge | ● Teacher (source) |
2017-09-29 11:42:30 -0500 | received badge | ● Student (source) |
2017-05-31 08:02:57 -0500 | received badge | ● Necromancer (source) |
2017-05-31 08:02:57 -0500 | received badge | ● Self-Learner (source) |
2017-05-25 10:59:21 -0500 | received badge | ● Enthusiast |
2017-05-24 14:29:36 -0500 | answered a question | Unable to get HOG confidence using GPU implementation (OpenCV 3) Answering my own question here: setting the group threshold to 0 (group_threshold = 0) makes the GPU HOG fill in the confidences. |
2017-05-24 14:28:25 -0500 | answered a question | Retrieve GPU HOG detector scores For the GPU version of the HOG, pass a std::vector<double> as the third parameter (confidences) of detectMultiScale. Note: you need to set the group threshold to 0, otherwise the confidences are not computed. Documentation link: http://docs.opencv.org/3.2.0/de/da6/c... Below is a minimal working example in case you are interested: |
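The example code attached to that answer did not survive extraction. The following is a minimal sketch of what it likely looked like, assuming OpenCV 3.x built with the CUDA modules; the input file name people.jpg is a placeholder:

    #include <opencv2/core.hpp>
    #include <opencv2/imgcodecs.hpp>
    #include <opencv2/cudaobjdetect.hpp>
    #include <iostream>
    #include <vector>

    int main()
    {
        // Load a test image as single-channel 8-bit (the CUDA HOG accepts
        // CV_8UC1 or CV_8UC4) and upload it to the GPU.
        // "people.jpg" is a placeholder file name.
        cv::Mat img = cv::imread("people.jpg", cv::IMREAD_GRAYSCALE);
        if (img.empty()) { std::cerr << "could not read image" << std::endl; return 1; }
        cv::cuda::GpuMat d_img(img);

        // Create the CUDA HOG descriptor and load the default people detector.
        cv::Ptr<cv::cuda::HOG> hog = cv::cuda::HOG::create();
        hog->setSVMDetector(hog->getDefaultPeopleDetector());

        // The group threshold must be 0, otherwise the confidences are not filled.
        hog->setGroupThreshold(0);

        // Pass a pointer to a std::vector<double> as the third argument to
        // receive the per-detection SVM scores (confidences).
        std::vector<cv::Rect> found;
        std::vector<double> confidences;
        hog->detectMultiScale(d_img, found, &confidences);

        for (size_t i = 0; i < found.size(); ++i)
            std::cout << found[i] << "  score = " << confidences[i] << std::endl;

        return 0;
    }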
2017-05-23 19:43:50 -0500 | commented question | Retrieve GPU HOG detector scores I've asked a similar question before: http://answers.opencv.org/question/11... AFAIK, this is still not resolved. Edit: Looking at my own question again, you actually can retrieve the confidences (detector scores) by setting group_threshold to 0. It's embarrassing that I missed it. |
2016-11-09 15:12:55 -0500 | asked a question | Unable to get HOG confidence using GPU implementation (OpenCV 3) I am confused about how to get the confidence level when using the HOG GPU implementation in OpenCV 3. According to the OpenCV 3.1 cv::cuda::HOG::detectMultiScale documentation (signature replicated below), you can pass a pointer to a std::vector<double> as the third argument to receive the confidences. However, looking at the actual implementation (hog.cpp:385), it is checking the group threshold before filling them, so it is unclear how to get the scores out. The same question has also been asked on StackOverflow and hasn't been addressed. Please advise. |
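For reference, the documented signature referred to in that question is, to the best of my recollection of the OpenCV 3.x cudaobjdetect header (treat this as an approximation rather than a verbatim copy):

    // cv::cuda::HOG, OpenCV 3.x (approximate reproduction)
    virtual void detectMultiScale(InputArray img,
                                  std::vector<Rect>& found_locations,
                                  std::vector<double>* confidences = NULL) = 0;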