Suitable algorithm for Emotion Detection?
I have a working gender classification application (it detects from a webcam feed) and I'm working on emotion detection right now. I'm training the two classifiers with different training sets, but emotion detection seems to return the same result no matter what facial expression I make. I'll eventually need to detect age groups as well, so I want to make sure that what I'm doing now won't make it difficult to extend later.
Am I going about this wrongly? For example, should I train both classifiers from a single face database instead of using one database for gender and another for expressions?
My code is based on the OpenCV tutorial on face recognition in videos; I simply repeated the same training and recognition steps twice, once per classifier (a simplified sketch of what I mean is below). If there is another approach that works better with two training datasets, do let me know!
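To clarify the structure, here is a simplified sketch, not my actual code: it assumes the OpenCV 2.4 contrib `FaceRecognizer` API used in that tutorial, Fisherfaces for both models, and placeholder names for the image/label containers.

```cpp
// Simplified sketch: two independent FaceRecognizer models, one per
// training set. Assumes OpenCV 2.4.x with the contrib module.
#include <opencv2/core/core.hpp>
#include <opencv2/contrib/contrib.hpp>
#include <vector>

// Train two completely independent recognizers, one per dataset.
// Fisherfaces requires all training images to be grayscale and equally sized.
void trainBoth(const std::vector<cv::Mat>& genderImages, const std::vector<int>& genderLabels,
               const std::vector<cv::Mat>& emotionImages, const std::vector<int>& emotionLabels,
               cv::Ptr<cv::FaceRecognizer>& genderModel,
               cv::Ptr<cv::FaceRecognizer>& emotionModel)
{
    genderModel  = cv::createFisherFaceRecognizer();
    emotionModel = cv::createFisherFaceRecognizer();
    genderModel->train(genderImages, genderLabels);
    emotionModel->train(emotionImages, emotionLabels);
}

// At runtime, the same preprocessed face ROI (detected, cropped, resized,
// grayscale) is passed to both models.
void classifyFace(const cv::Mat& face,
                  const cv::Ptr<cv::FaceRecognizer>& genderModel,
                  const cv::Ptr<cv::FaceRecognizer>& emotionModel,
                  int& gender, int& emotion)
{
    gender  = genderModel->predict(face);
    emotion = emotionModel->predict(face);
}
```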
If you think my code is causing the problem (returning the same result no matter what expression I make), let me know too, and I'll post it so we can find out what's wrong.
Thank you in advance :>
---------- EDIT ----------
I have changed the algorithm used for emotion detection: I originally used Fisherfaces, and now I'm using LBPH. The results are no longer stuck on a single label, but the output is very shaky; it changes constantly from frame to frame and isn't particularly accurate. Is there a way to stabilize it, or is there another algorithm better suited to emotion detection?
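One idea I've been considering (not in my code yet) is to smooth the raw per-frame prediction, for example with a majority vote over the last few frames. A minimal sketch; the window size of 15 is an arbitrary guess on my part and would need tuning against the frame rate:

```cpp
// Sketch of temporal smoothing: report the label that occurs most often
// in a sliding window of recent per-frame predictions.
#include <deque>
#include <map>
#include <cstddef>

class PredictionSmoother {
public:
    explicit PredictionSmoother(std::size_t window = 15) : window_(window) {}

    // Push the latest raw prediction and return the most frequent label
    // in the current window.
    int update(int label) {
        history_.push_back(label);
        if (history_.size() > window_)
            history_.pop_front();

        std::map<int, int> counts;
        for (std::deque<int>::const_iterator it = history_.begin(); it != history_.end(); ++it)
            ++counts[*it];

        int best = label;
        int bestCount = 0;
        for (std::map<int, int>::const_iterator it = counts.begin(); it != counts.end(); ++it) {
            if (it->second > bestCount) {
                best = it->first;
                bestCount = it->second;
            }
        }
        return best;
    }

private:
    std::size_t window_;
    std::deque<int> history_;
};

// Usage per frame:
//   int raw = emotionModel->predict(face);
//   int stable = smoother.update(raw);   // display 'stable' instead of 'raw'
```

Would something like this be a reasonable way to steady the output, or does the shakiness point to a deeper problem with LBPH for expressions?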