Tracking facial features in motion
Hi,
We are trying to create an app that tracks individual facial features in a live feed from the front camera of a phone. We are looking at OpenCV to achieve this. I was wondering if there are tutorials or sample code available for something like this.
Detailed Description: Our app is meant to track a person's facial features (eyes, lips, nose, etc.) while the face is turning. The tracking will run on a live video stream while the user is looking at the phone, so it has to be very fast and seamless. For example, if the face is turned to the side and only half the lips are visible, the algorithm should still recognize them as lips and continue tracking.
Can you please tell me if you know any algorithms/sample code for this?
I think a combination of cascade classifiers for multiple views (frontal and profile), combined with an optical flow tracker on the retrieved detections, could get you pretty far. You can add a keypoint descriptor on the detected regions to improve the tracker's results. So yes, I think OpenCV can achieve this. The biggest challenge will be getting it to real-time performance!