OpenCV iOS Swift 2: How to implement the CvVideoCameraDelegate protocol to process video frames?

asked 2016-05-28 13:11:09 -0600

Okay, so I've been using OpenCV and want to use it on iOS with Swift 2.0. I implemented it successfully and tested it with a few functions/examples, and it worked fine. But the application I have in mind is live-camera object detection, which I plan to do with a cascade classifier. The thing I'm having trouble with is hooking the CvVideoCameraDelegate protocol up to my ViewController. I'm trying to follow this tutorial/example (http://docs.opencv.org/2.4/doc/tutorials/ios/video_processing/video_processing.html) to set it up, but I can't get it working in Swift. Can someone please advise the correct way to set it up?
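A note on why this is awkward in Swift: CvVideoCameraDelegate declares `- (void)processImage:(cv::Mat&)image;`, and `cv::Mat&` is a C++ type that Swift cannot see, so a Swift class cannot conform to the protocol directly. The usual workaround is a small Objective-C++ wrapper (a `.mm` file) that owns the CvVideoCamera, conforms to CvVideoCameraDelegate itself, and forwards results to Swift through a plain Objective-C interface. Below is a minimal sketch of the Swift side only, assuming a hypothetical wrapper named `OpenCVCameraWrapper` exposed through the bridging header; the wrapper's interface (shown in the comment) is an assumption for illustration, not part of OpenCV.

```swift
import UIKit

// Hypothetical Objective-C++ wrapper, exposed to Swift via the bridging
// header. It owns the CvVideoCamera, conforms to CvVideoCameraDelegate
// itself (so the cv::Mat handling stays in the .mm file), and forwards
// each processed frame to Swift as a UIImage. Assumed interface:
//
//   @protocol OpenCVCameraWrapperDelegate
//   - (void)processedFrame:(UIImage *)image;
//   @end
//
//   @interface OpenCVCameraWrapper : NSObject
//   @property (weak) id<OpenCVCameraWrapperDelegate> delegate;
//   - (instancetype)initWithParentView:(UIImageView *)view;
//   - (void)start;
//   @end

class ViewController: UIViewController, OpenCVCameraWrapperDelegate {

    @IBOutlet weak var previewView: UIImageView!
    var camera: OpenCVCameraWrapper!

    override func viewDidLoad() {
        super.viewDidLoad()
        camera = OpenCVCameraWrapper(parentView: previewView)
        camera.delegate = self
    }

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)
        camera.start()   // frames now arrive via processedFrame(_:)
    }

    // Called by the wrapper after the cv::Mat work is done in Objective-C++
    func processedFrame(image: UIImage!) {
        // update the UI with detection results here
    }
}
```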


Comments

Due to a lot of spam, this site has to be moderated, so please be patient and don't post duplicates.

berak ( 2016-05-29 02:42:38 -0600 )

Instead of fighting iOS, use it, and only bring in OpenCV when you actually need it. For example, drive the video with iOS itself (AVCaptureSession), then grab a frame and hand just that frame/pic to OpenCV for processing (dispatching the work with GCD). I have a couple of apps that use OpenCV for image processing. Did I use OpenCV to pull up a camera interface and so on? No, I use UIImagePickerController, take the pic from it, and process it with OpenCV. The more C++ you insist on using for OpenCV, the less Swift you can use for iOS, since Swift can't call C++ directly. For me it is more rewarding to figure stuff out. Instead of bombing this place with a question for every step of your development, spend a month or so sitting in front of your computer for 16 hours a day. You will figure it out... and will be better off for it.

jmbapps ( 2016-05-29 09:20:34 -0600 )
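For what it's worth, the approach the comment above describes might look roughly like the following in Swift 2: AVFoundation delivers the frames, and each one is handed off to OpenCV through an Objective-C++ bridge. `OpenCVBridge.detectObjectsInBuffer` in the comment is a hypothetical wrapper method, not a real OpenCV API; everything else is standard AVFoundation.

```swift
import UIKit
import AVFoundation

class CameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {

    let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.sessionPreset = AVCaptureSessionPreset640x480

        // Plain AVFoundation camera setup; no OpenCV involved yet.
        guard let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo),
              input = try? AVCaptureDeviceInput(device: device) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true
        output.setSampleBufferDelegate(self,
            queue: dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL))
        session.addOutput(output)

        session.startRunning()
    }

    // Called on the background queue for every captured frame
    func captureOutput(captureOutput: AVCaptureOutput!,
                       didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                       fromConnection connection: AVCaptureConnection!) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Hand the pixel buffer to OpenCV through an Objective-C++ wrapper,
        // e.g. OpenCVBridge.detectObjectsInBuffer(pixelBuffer) -- hypothetical.
        _ = pixelBuffer
    }
}
```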

Ok, I may have to retract my above advice. I'm working on this exact thing for OS X. I fired up Xcode for the first time in months to work on it; I didn't spend a lot of time, but I have yet to find the missing link.

jmbapps ( 2016-05-31 09:05:44 -0600 )

AVCaptureStillImageOutput is what I was thinking of. Now if I can just update Apple's code example to make it work correctly, I can use that still image for processing.

jmbapps ( 2016-05-31 16:42:11 -0600 )
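In Swift 2, the AVCaptureStillImageOutput route mentioned above could be sketched like this. `captureStillForProcessing()` is just an illustrative name, and the OpenCV hand-off is left as a comment since it would go through an Objective-C++ wrapper:

```swift
import UIKit
import AVFoundation

class StillCaptureViewController: UIViewController {

    let session = AVCaptureSession()
    let stillOutput = AVCaptureStillImageOutput()

    override func viewDidLoad() {
        super.viewDidLoad()
        if let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo),
           input = try? AVCaptureDeviceInput(device: device) {
            session.addInput(input)
        }
        stillOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        session.addOutput(stillOutput)
        session.startRunning()
    }

    // Grab one still frame and turn it into a UIImage for processing.
    func captureStillForProcessing() {
        guard let connection = stillOutput.connectionWithMediaType(AVMediaTypeVideo) else { return }
        stillOutput.captureStillImageAsynchronouslyFromConnection(connection) { buffer, error in
            guard error == nil && buffer != nil else { return }
            guard let data = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer),
                  image = UIImage(data: data) else { return }
            // `image` can now be handed to OpenCV (e.g. a cascade classifier)
            // through an Objective-C++ wrapper.
            _ = image
        }
    }
}
```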