Ask Your Question

Revision history [back]

click to hide/show revision 1
initial version

Problems getting point set for convex hull

I have a function that takes a double (proximity) that determines how far a contour must be from another to be considered close enough to add together. When the contours are added a convex hull is then drawn. When the function takes 2 as its parameter it produces this image: https://i.imgur.com/G5sX6Jc.png.

As you can see from the image something very funky is happening that I cannot figure out. If someone could tell me where I have gone wrong in my code I would be very grateful.

for (size_t i = 0; i < contours.size(); i++)
{
    minDistance = 9999999999999999999;
    for (size_t j = 0; j < contours[i].size(); j++)
    {
        for (size_t a = i + 1; a < contours.size();)
        {
            for (size_t b = 0; b < contours[a].size(); b++)
            {
                euclidean = norm(contours[i][j] - contours[a][b]);
                if (euclidean < minDistance)
                {
                    minDistance = euclidean;
                }
            }
            if (minDistance <= proximity)
            {
                for (size_t q = 0; q < contours[a].size(); q++)
                {
                    contours[i].push_back(contours[a][q]);
                }
                contours.erase(remove(begin(contours), end(contours), contours[a]), contours.end());
            }
            else
            {
                a++;
            }
        }
    }
}