Hi all,
I have been working for the past couple of months on a hand gesture application, in both C++ and Android. It works as long as the filtered image contains the correct shape of the hand: if the contour is there, it is drawn properly and the algorithm detects the fingers quite accurately, etc.
But, as you might expect, the image sometimes (especially on Android) is not filtered properly, and the result is either a bad contour or, in some cases, no contour at all.
I would very much appreciate some ideas on how to proceed, or on anything I might be missing; anything is of value.
My current ideas are:
1. Track some labels attached to the fingers (see the sketch after this list).
I must mention that I don't have much knowledge of how to do this.
2. Keep looking for a better way of filtering the image.
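For idea 1, roughly, I imagine sticking small colored markers on the fingertips and tracking them as color blobs instead of relying on skin segmentation. Something like the sketch below is what I have in mind (assuming bright green stickers; the HSV bounds, blob-size cutoff, and kernel size are guesses that would need tuning per marker and camera):

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <vector>

// Find the centers of colored fingertip markers in a BGR frame.
// The HSV range below assumes bright green stickers; tune it for
// whatever markers are actually used.
std::vector<cv::Point2f> findMarkers(const cv::Mat& frameBgr)
{
    cv::Mat hsv, mask;
    cv::cvtColor(frameBgr, hsv, cv::COLOR_BGR2HSV);
    cv::inRange(hsv, cv::Scalar(45, 80, 80), cv::Scalar(75, 255, 255), mask);

    // Morphological opening removes small speckles of noise.
    cv::morphologyEx(mask, mask, cv::MORPH_OPEN,
                     cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5)));

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    std::vector<cv::Point2f> centers;
    for (const auto& c : contours) {
        if (cv::contourArea(c) < 30.0) continue;   // skip tiny blobs
        cv::Moments m = cv::moments(c);
        centers.emplace_back(static_cast<float>(m.m10 / m.m00),
                             static_cast<float>(m.m01 / m.m00));
    }
    return centers;   // one point per visible marker, ideally one per finger
}
```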
My current solution (part of it, but the important stuff) can be found at:
pastebin.
To give a clear picture of what happens: I draw some small rectangles onto the screen (as in the original opencv4android sample) for finding the thresholds. I place my hand so that it covers the rectangles and hold it there for a few seconds, so that I end up with good threshold values. Once I have the contour, I use convexityDefects and some geometry calculations, which leads me to the desired result: counting the number of fingers shown on the screen.
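For context, the finger-counting step is along the lines of the sketch below (simplified, not my exact code; the defect-depth and angle thresholds are illustrative values that need tuning):

```cpp
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <algorithm>
#include <cmath>
#include <vector>

// Count extended fingers from a binary hand mask.
// Returns -1 if no contour is found at all.
int countFingers(const cv::Mat& mask)
{
    cv::Mat work = mask.clone();   // findContours may modify its input
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(work, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty()) return -1;

    // Assume the hand is the largest contour in the mask.
    auto hand = std::max_element(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
            return cv::contourArea(a) < cv::contourArea(b);
        });

    std::vector<int> hullIdx;
    cv::convexHull(*hand, hullIdx, false);
    if (hullIdx.size() < 3) return 0;

    std::vector<cv::Vec4i> defects;
    cv::convexityDefects(*hand, hullIdx, defects);

    int valleys = 0;
    for (const cv::Vec4i& d : defects) {
        cv::Point start = (*hand)[d[0]];
        cv::Point end   = (*hand)[d[1]];
        cv::Point far   = (*hand)[d[2]];
        float depth = d[3] / 256.0f;   // defect depth in pixels

        // Angle at the defect point between the two hull points.
        double a = std::hypot(start.x - far.x, start.y - far.y);
        double b = std::hypot(end.x - far.x,   end.y - far.y);
        double c = std::hypot(start.x - end.x, start.y - end.y);
        double cosA = std::max(-1.0, std::min(1.0,
                          (a * a + b * b - c * c) / (2.0 * a * b)));
        double angle = std::acos(cosA) * 180.0 / CV_PI;

        // A deep, narrow defect is the valley between two fingers.
        // 20 px and 90 degrees are illustrative thresholds to tune.
        if (depth > 20.0f && angle < 90.0) ++valleys;
    }
    // Heuristic: n valleys between fingers means n + 1 extended fingers.
    return valleys > 0 ? valleys + 1 : 0;
}
```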
My idea for segmenting and filtering the hand came from this blog: hand tracking and recognition.
On the PC side it seems to work acceptably, but on the Android side it doesn't work very well.
If the viewpoint happens to change, on either platform, it barely works or, most of the time, doesn't work at all.