2013-03-13 12:54:16 -0600 | asked a question | Tracking of detected faces using OpenCV on iOS

First, a quick bit of background: I am fairly new to iOS and am attempting to detect faces using OpenCV on an iOS device. I was able to get the iOS OpenCV sample code working fine using the sample code here: This results in a useful method that is called for each frame polled from the camera (sketched below). In this example, the sample inverts each camera frame and displays it on the device. This is useful, as I can substitute my own OpenCV C++ code in here for whatever image processing I want to do with the frame.

Now, I wish to get face tracking implemented. OpenCV 2.4.2 onwards ships a header for a detection-based tracker, “opencv2/contrib/detection_based_tracker.hpp”, which defines a class called DetectionBasedTracker. The tracking mechanism it defines uses Haar cascades in the background to detect objects. The reason I wish to use this temporal tracking method rather than frame-by-frame face detection is that the tracking is much faster than the OpenCV Haar implementation. A guide on how to implement it is demonstrated here: http://bytesandlogics.wordpress.com/2012/08/23/detectionbasedtracker-opencv-implementation/

I had success implementing this code in C++ on an Android device. The main setup code is shown in the first tracker sketch below, and for each frame I can detect the bounding boxes of faces using the lines in the second tracker sketch.

Now, the issue: in Objective-C, how do I create the “DetectionBasedTracker obj” object so that it can be used in the per-frame processing method? Thanks for your help!
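
For reference, the per-frame method mentioned above looks roughly like this; a sketch based on the OpenCV iOS video-processing tutorial, where CvVideoCamera hands each frame to its delegate's processImage: method:

```objc
// Sketch of the CvVideoCameraDelegate callback from the OpenCV iOS sample.
// The file must be Objective-C++ (.mm) so cv::Mat is available, and the
// view controller must adopt CvVideoCameraDelegate (cap_ios.h).
#ifdef __cplusplus
- (void)processImage:(cv::Mat &)image
{
    cv::Mat image_copy;
    cv::cvtColor(image, image_copy, CV_BGRA2BGR);

    // Invert the frame, then convert back so it can be displayed.
    cv::bitwise_not(image_copy, image_copy);
    cv::cvtColor(image_copy, image, CV_BGR2BGRA);
}
#endif
```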
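
First tracker sketch: the setup I used on Android was along these lines, assuming the OpenCV 2.4.2 contrib API; the parameter values are illustrative and the cascade path is an assumption (bundle the file with the app):

```cpp
#include <opencv2/contrib/detection_based_tracker.hpp>

DetectionBasedTracker::Parameters param;
param.maxObjectSize      = 400;  // largest face (in pixels) to report
param.minObjectSize      = 20;   // smallest face to look for
param.scaleFactor        = 1.1;  // cascade image-pyramid scale step
param.minNeighbors       = 3;    // detection confidence threshold
param.maxTrackLifetime   = 20;   // frames to keep tracking a lost face
param.minDetectionPeriod = 7;    // frames between full re-detections

// The cascade file path is hypothetical; point it at your bundled XML.
DetectionBasedTracker obj("lbpcascade_frontalface.xml", param);
obj.run();  // starts the background detection thread
```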
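
Second tracker sketch: per frame, the detection comes down to something like the following, where grayFrame is assumed to be the current camera frame already converted to grayscale:

```cpp
cv::Mat grayFrame;            // current frame, single-channel grayscale
std::vector<cv::Rect> faces;  // receives one bounding box per tracked face

obj.process(grayFrame);       // update the tracker with the new frame
obj.getObjects(faces);        // read back the current face rectangles
```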