Though this is not a complete answer, I will try my best to give you something helpful. I will summarize and be very explicit.
This is my current topic, and it is not easy at all. I have tried several things and concluded that it is too big a task to attempt directly on Android, so I am building a good hand gesture detector on the PC first and will worry about Android later.
My first approach (though the last one I tried) is to build a good skin classifier (still a work in progress), extract the hand contour from the resulting mask, and then use convexity defects plus some geometric calculations (angles, distances) to count the number of fingers shown on screen. Right now, when the contour is correct, I detect the fingers quite accurately. However, because I am using fixed thresholds in RGB, HSV, or YCrCb, the segmentation does not work well under all conditions, so I need something considerably better.
My starting point was:
http://www.morethantechnical.com/2013/03/05/skin-detection-with-probability-maps-and-elliptical-boundaries-opencv-wcode/
If you like this approach, I would suggest studying hand segmentation techniques first, and once you have that working, move on to the next step: finding the number of fingers shown on screen.
My second approach (and the first one I tried) was based on a concept called visual categorization with a bag of keypoints, but it did not work well. Starting point:
Simple Object Classifier using Bag of Words
I tried hard to make it work, but because a hand is far more deformable than static objects, I could not. After a lot of investigation I realized that one solution might be to train on shape context descriptors instead of raw keypoints, but I did not try it for lack of time, so I cannot say for sure that it will work; you would have to research that on your own. After several tests and "tunings", the best accuracy I got was 46%, at about 2 FPS on a PC.
I hope my answer is at least of some help to you.

Good luck
