detectMultiScale on Gpu problem with minNeighbors parameter

Hi, for my thesis I'm trying to make a program that detects an object and assigns a score to each detection, using the GPU. The score represents the quality of the detection.

Using the CPU version of detectMultiScale, I created a metric for assigning a score: I go through the list of Rect objects returned by the function, search for rectangles that are similar in area and position, and then group them into a single rectangle that contains the detected object. This only works if I set the parameter minNeighbors = 0, because then the CPU detectMultiScale does not group the rectangles itself and I can apply my own metric.

The problem appears when I try to do the same thing with the GPU version of detectMultiScale: if I set minNeighbors = 0 the classifier does not detect anything at all, but if I set minNeighbors = 1 or greater it detects the same objects as the CPU version. My supervisor requires the use of the GPU, and I don't understand why the two versions of the classifier behave differently. Can anyone help me resolve this problem?
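To make the comparison concrete, here is a minimal sketch of the two calls, assuming OpenCV 2.4's gpu module; `cascade.xml` and `frame.png` are placeholders for my classifier and input image:

```cpp
#include <opencv2/objdetect/objdetect.hpp>
#include <opencv2/gpu/gpu.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <vector>

int main()
{
    cv::Mat frame = cv::imread("frame.png", CV_LOAD_IMAGE_GRAYSCALE);

    // CPU: minNeighbors = 0 returns every raw candidate rectangle,
    // which is what my scoring metric needs.
    cv::CascadeClassifier cpuCascade("cascade.xml");
    std::vector<cv::Rect> cpuRects;
    cpuCascade.detectMultiScale(frame, cpuRects, 1.1, /*minNeighbors=*/0);

    // GPU: the same call with minNeighbors = 0 detects nothing,
    // while minNeighbors = 1 or greater matches the CPU detections.
    cv::gpu::CascadeClassifier_GPU gpuCascade("cascade.xml");
    cv::gpu::GpuMat gpuFrame(frame), objBuf;
    int numDetections = gpuCascade.detectMultiScale(gpuFrame, objBuf,
                                                    1.1, /*minNeighbors=*/0);

    // Download the detected rectangles back to the host.
    if (numDetections > 0)
    {
        cv::Mat objHost;
        objBuf.colRange(0, numDetections).download(objHost);
        const cv::Rect* gpuRects = objHost.ptr<cv::Rect>();
        // ... apply the same grouping metric to gpuRects ...
    }
    return 0;
}
```

With minNeighbors = 0 the first call fills cpuRects with all the ungrouped candidates, while numDetections from the GPU call is always 0 on my machine.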

I uploaded two images that show what I mean: grouped.png and ungrouped.png. ungrouped.png is the output of the CPU classifier, and grouped.png is the result of my metric, where 17 is the detection score.

PS: sorry for my English :-)