I'm doing live video processing on iOS using OpenCV, without using CvVideoCamera. My app is crashing due to memory pressure.
The AVCaptureVideoDataOutput delegate method below is called every time a frame is captured:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Convert the frame to a UIImage:
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];

    // Convert the UIImage to a Mat:
    Mat srcMat = [self cvMatFromUIImage:image];

    // Process the Mat:
    Mat dst, cdst;
    Canny(srcMat, dst, 50, 200, 3);
    cvtColor(dst, cdst, COLOR_GRAY2BGR);

    std::vector<Vec4i> lines;
    HoughLinesP(dst, lines, 1, CV_PI/180, 50, 50, 10);
    for (size_t i = 0; i < lines.size(); i++)
    {
        Vec4i l = lines[i];
        cv::line(cdst, cv::Point(l[0], l[1]), cv::Point(l[2], l[3]),
                 Scalar(0, 0, 255), 3, LINE_AA);
    }
}
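For context, the capture output is configured in the usual way, roughly like this (the session and queue names are just placeholders for my actual setup):

    // Sketch of the capture setup (names hypothetical):
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                  @(kCVPixelFormatType_32BGRA) };
    // Delegate callbacks arrive on this serial background queue:
    dispatch_queue_t queue = dispatch_queue_create("video_queue", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:self queue:queue];
    [session addOutput:output];

So captureOutput:didOutputSampleBuffer:fromConnection: runs on that background queue, once per frame.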
The app crashes about 15 seconds after capture starts, due to memory pressure.
The calls to imageFromSampleBuffer and cvMatFromUIImage don't cause any memory leaks; it's the processing I do after // Process the Mat: that's responsible. But I can't figure out why...
It looks like the Mat objects are staying in memory, but aren't Mat objects released automatically?
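My understanding is that cv::Mat is reference-counted, and that its pixel buffer is freed as soon as the last Mat referring to it goes out of scope, e.g.:

    {
        Mat a(480, 640, CV_8UC3);  // allocates the pixel buffer, refcount = 1
        Mat b = a;                 // shares the same buffer, refcount = 2
    }                              // both go out of scope: buffer is freed

If that's right, srcMat, dst, and cdst should all be released at the end of each callback, so I don't see where the memory is accumulating.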