Python Wrapper for FaceRecognizer
The Python wrapper for cv::FaceRecognizer now works. First of all, thanks to Vadim and Eric for writing the Python wrapper for the contrib module in the first place! With some help from Vadim it then turned out to be an easy fix (I only needed to change one parameter!), so if you update and compile your git repository, it should work.
The example in this tutorial shows how to work with the Eigenfaces model in OpenCV's Python wrapper. You are going to learn how to train a FaceRecognizer on your data, get a prediction (with the associated confidence) from the model, and access the model's internal data. You'll also see how to plot the Eigenfaces, because that's what most people want to see.
You'll find the demo in the samples/python2 folder.
I'll add this tutorial to the official documentation as soon as my job allows. Feel free to comment if you have questions or any other requests.
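To give you a feeling for the API before diving in, here is a minimal sketch of training and predicting with the wrapper. It assumes an OpenCV 2.4.x build with the contrib module; the random arrays are only stand-ins for real face images and the labels are made up:

import cv2
import numpy as np

# Stand-in data: four random "images" for two persons (labels 0 and 1).
X = [np.random.randint(0, 256, (100, 100)).astype(np.uint8) for _ in range(4)]
y = np.asarray([0, 0, 1, 1], dtype=np.int32)

# Create and train the Eigenfaces model.
model = cv2.createEigenFaceRecognizer()
model.train(np.asarray(X), y)

# Predict the label of a sample; the wrapper also returns a confidence value.
label, confidence = model.predict(X[0])
print("Predicted label = %d (confidence = %.2f)" % (label, confidence))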
Getting the Data right
We aren't doing a toy example, so you'll need some image data. For the sake of simplicity I have assumed that the images (the faces of the persons you want to recognize) are given in folders. So imagine I have a folder (a dataset) images, with the subfolders person1, person2 and so on:
philipp@mango:~/facerec/data/images$ tree -L 2 | head -n 20
.
|-- person1
| |-- 1.jpg
| |-- 2.jpg
| |-- 3.jpg
| |-- 4.jpg
|-- person2
| |-- 1.jpg
| |-- 2.jpg
| |-- 3.jpg
| |-- 4.jpg
[...]
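Reading such a hierarchy boils down to walking the subfolders and giving every subfolder one integer label. The actual helper comes from the guide used in the example below; this is only a rough sketch of how it could look (the name read_images and the optional resize parameter sz are my own choices here):

import os
import cv2
import numpy as np

def read_images(path, sz=None):
    # Every subfolder of `path` is treated as one person; folder order gives the label.
    X, y = [], []
    label = 0
    for subdirname in sorted(os.listdir(path)):
        subject_path = os.path.join(path, subdirname)
        if not os.path.isdir(subject_path):
            continue  # skip stray files, e.g. the README in the AT&T Facedatabase
        for filename in sorted(os.listdir(subject_path)):
            im = cv2.imread(os.path.join(subject_path, filename), 0)  # 0 = grayscale
            if im is None:
                continue  # not a readable image
            if sz is not None:
                im = cv2.resize(im, sz)
            X.append(np.asarray(im, dtype=np.uint8))
            y.append(label)
        label = label + 1
    return [X, y]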
One of the publicly available datasets that already comes in such a folder structure is the AT&T Facedatabase, available at:
Once unpacked it is going to look like this (on my filesystem it is unpacked to /home/philipp/facerec/data/at/; your path will differ!):
philipp@mango:~/facerec/data/at$ tree .
.
|-- README
|-- s1
| |-- 1.pgm
| |-- 2.pgm
[...]
| `-- 10.pgm
|-- s2
| |-- 1.pgm
| |-- 2.pgm
[...]
| `-- 10.pgm
|-- s3
| |-- 1.pgm
| |-- 2.pgm
[...]
| `-- 10.pgm
...
40 directories, 401 files
Example: Eigenfaces
I am going to use some functions from my Guide To Face Recognition with Python. These functions make it much easier to work with the image data, such as reading the folder hierarchy. The script outputs a prediction, the names of the available model data, the mean face and the first (at most) 16 Eigenfaces:
- mean.png The mean face of the given data.
- eigenfaces_i.png The eigenfaces calculated from the given data.
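How those two outputs come about is roughly the following. Again, this is only a sketch: it assumes the OpenCV 2.4.x interface, where the model's internal data is reachable through getMat, and both normalize_0_255 and save_mean_and_eigenfaces are hypothetical helper names of mine, not part of OpenCV or the demo script:

import os
import cv2
import numpy as np

def normalize_0_255(mat):
    # Scale an arbitrary float matrix to 0..255 so it can be written as an image.
    mat = np.asarray(mat, dtype=np.float64)
    mat = mat - mat.min()
    if mat.max() > 0:
        mat = mat / mat.max()
    return np.asarray(mat * 255.0, dtype=np.uint8)

def save_mean_and_eigenfaces(X, y, out_dir="."):
    # X, y as returned by a read_images-style helper; all images share one size.
    height, width = X[0].shape

    model = cv2.createEigenFaceRecognizer()
    model.train(np.asarray(X), np.asarray(y, dtype=np.int32))

    # The model's internal data is exposed by name through getMat.
    mean = model.getMat("mean")
    eigenvectors = model.getMat("eigenvectors")

    # mean.png: reshape the mean row vector back to image size and save it.
    cv2.imwrite(os.path.join(out_dir, "mean.png"),
                normalize_0_255(mean.reshape(height, width)))

    # eigenfaces_i.png: every column of the eigenvector matrix is one Eigenface.
    for i in range(min(16, eigenvectors.shape[1])):
        eigenface = eigenvectors[:, i].reshape(height, width)
        cv2.imwrite(os.path.join(out_dir, "eigenfaces_%d.png" % i),
                    normalize_0_255(eigenface))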
You can either show the results or save them to disk. Run the application with:
python facerec_demo.py </path/to/your/images/> [</path/to/store/results/at>]
So if I have extracted the AT&T Facedatabase to /home/philipp/facerec/data/at and I want to show the results, I call it like this:
python facerec_demo.py /home/philipp/facerec/data/at
If I also want to store the results to disk, I additionally pass the folder to write them to as the second argument.