Tracking on facial features in motion

asked 2014-11-30 14:45:52 -0500

Jabali


We are trying to create an app which will let us track individual facial features for a live feed from the front camera of a phone. We are looking at OpenCV to achieve this. I was wondering if there are tutorials or sample code available for something like this.

Detailed Description: Our app is meant to track a person's facial features, such as eyes, lips, and nose, while the face is turning. This tracking will be done on a live video stream while the user is looking at the phone, so the tracking has to be super fast and seamless. For example, if the face is turned to the side and only half the lips are visible, the algorithm should still recognize them as lips and continue tracking.

Can you please tell me if you know any algorithms/sample code for this?



I think a combination of cascade classifiers for multiple views (frontal and profile), combined with an optical flow tracker on the retrieved detections, could get you pretty far. You can add a keypoint descriptor on the detected regions to improve the tracker results. So yes, I think OpenCV can achieve this. The biggest challenge will be getting it to real-time performance!

StevenPuttemans ( 2014-12-01 06:50:38 -0500 )