Hi all,
OpenCV 3.1, iOS App
processImage fires, but the Mat it delivers seems to contain garbage and breaks ORB. I can read the column count (.cols), and its type() matches that of a known-good Mat, but it crashes during detection, or when I try to print its data pointer (.data):
detector->detect(image, keypoints_2);
NSLog(@"pullFrame %s",image.data);
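A side note on that logging line, since it can crash on its own: image.data points at raw pixel bytes, not a NUL-terminated C string, so NSLog(@"%s", image.data) keeps reading memory until it happens to hit a zero byte and can itself trigger EXC_BAD_ACCESS even when the Mat is healthy. A bounded hex dump is a safer sanity check; here is a plain C++ sketch (hexPreview is a hypothetical helper name, not an OpenCV API):

```cpp
#include <cstdint>
#include <cstdio>
#include <string>

// Format the first n bytes of a raw buffer as hex. Safe for binary pixel
// data, unlike "%s", which scans for a NUL terminator and can run past
// the end of the allocation.
std::string hexPreview(const uint8_t* data, size_t n) {
    std::string out;
    char buf[4];
    for (size_t i = 0; i < n; ++i) {
        std::snprintf(buf, sizeof buf, "%02x ", data[i]);
        out += buf;
    }
    return out;
}
```

Usage in the delegate would then look like NSLog(@"pullFrame %s", hexPreview(image.data, 16).c_str()); — bounded, so it never walks off the end of the frame.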
The error I get in Xcode:
Thread 1: EXC_BAD_ACCESS (code=1,address=0x104504010)
Most likely unrelated to the actual bug, but here is what appears in the console when I start CvVideoCamera:
WARNING: -[<AVCaptureConnection: 0x4887af20> isVideoMinFrameDurationSupported] is deprecated. Please use AVCaptureDevice activeFormat.videoSupportedFrameRateRanges
2016-03-15 18:51:33.112 imageRecognition[1163:437343] WARNING: -[<AVCaptureConnection: 0x4887af20> setVideoMinFrameDuration:] is deprecated. Please use AVCaptureDevice setActiveVideoMinFrameDuration
2016-03-15 18:51:33.113 imageRecognition[1163:437343] WARNING: -[<AVCaptureConnection: 0x4887af20> isVideoMaxFrameDurationSupported] is deprecated. Please use AVCaptureDevice activeFormat.videoSupportedFrameRateRanges
2016-03-15 18:51:33.113 imageRecognition[1163:437343] WARNING: -[<AVCaptureConnection: 0x4887af20> setVideoMaxFrameDuration:] is deprecated. Please use AVCaptureDevice setActiveVideoMaxFrameDuration
If you understand what's going on, or you know a better and faster way to do real-time image-in-image tracking with OpenCV, please let me know.
Thank you very much,
Aidan