
Tracking facial features in motion


We are trying to create an app that will let us track individual facial features in a live feed from the front camera of a phone. We are looking at OpenCV to achieve this. Are there tutorials or sample code available for something like this?

Detailed Description: Our app is meant to track a person's facial features (eyes, lips, nose, etc.) while the face is turning. This tracking will be done on a live video stream while the user is looking at the phone, so it has to be very fast and seamless. For example, if the face is turned to one side and only half of the lips are visible, the algorithm should still recognize them as lips and continue tracking.

Can anyone point me to algorithms or sample code for this?