Facial Action Coding System implementation in iOS
I am developing an iOS app. Can someone guide me on how I can achieve this in iOS using OpenCV?
What I want is an image-processing geometry component that outputs codes for the facial expressions detected in an image (happy, sad, and so on). An image color processor would handle lighting and colors (for example, lipstick color) and return expression codes to the front-end system based on appearance.
I am new to OpenCV, so please guide me or give me any suggestions for implementing something like the concept above.
Start with face detection, then face landmark detection, and then interpretation (e.g. classifiers that map landmark geometry to expression codes). A rough sketch of the first two steps is below.
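As an illustration only, here is a minimal C++ sketch of that pipeline using OpenCV's `CascadeClassifier` for detection and the `Facemark` API from the opencv_contrib `face` module for landmarks. It assumes you have built opencv_contrib into your iOS framework and bundled a Haar cascade plus a pre-trained LBF model with the app; the file names used here are placeholders, and the final "interpretation" step is left to your own classifier.

```cpp
// Sketch: face detection -> facial landmarks with OpenCV (C++).
// Assumes opencv_contrib's face module is available and that
// haarcascade_frontalface_alt2.xml and lbfmodel.yaml ship with the app bundle.
#include <opencv2/objdetect.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/face.hpp>
#include <vector>

std::vector<std::vector<cv::Point2f>> detectLandmarks(const cv::Mat &bgrFrame)
{
    static cv::CascadeClassifier faceDetector("haarcascade_frontalface_alt2.xml");
    static cv::Ptr<cv::face::Facemark> facemark = cv::face::createFacemarkLBF();
    static bool modelLoaded = false;
    if (!modelLoaded) {
        facemark->loadModel("lbfmodel.yaml");   // pre-trained LBF landmark model
        modelLoaded = true;
    }

    // Step 1: face detection on a grayscale, histogram-equalized copy.
    cv::Mat gray;
    cv::cvtColor(bgrFrame, gray, cv::COLOR_BGR2GRAY);
    cv::equalizeHist(gray, gray);

    std::vector<cv::Rect> faces;
    faceDetector.detectMultiScale(gray, faces, 1.1, 3, 0, cv::Size(80, 80));

    // Step 2: fit landmarks (68 points per face with the LBF model).
    std::vector<std::vector<cv::Point2f>> landmarks;
    if (!faces.empty()) {
        facemark->fit(bgrFrame, faces, landmarks);
    }

    // Step 3 (not shown): feed landmark geometry (mouth corners, eyebrow
    // distances, etc.) into your own classifier to produce expression codes.
    return landmarks;
}
```

On iOS you would typically call a function like this from an Objective-C++ (.mm) wrapper, converting a `UIImage` or camera buffer to `cv::Mat` first.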
Thanks for your help. Is there any documentation or tutorial for the OpenCV framework on iOS that I can use for face landmarks and interpretation?
For landmarks you can see Mastering OpenCV (Chapters 6 and 7).