LGS's profile - activity

2014-09-16 10:09:48 -0600 asked a question CV_CAP_PROP_POS_FRAMES not working w/iOS

I am developing a project in which I am supposed to read frames from a video file; in my case, a .mov file straight from an iPhone capture.

This application is supposed to work on both Mac OS X and iPhone.

Unfortunately, seeking into the file does not work on iOS, although it works on Mac OS (each build is linked against its respective OpenCV framework).

In the following code, setting the frame position always restarts the file from the beginning, and get always returns 1, even though frames are actually sought properly, as I checked by displaying them.

Maybe the iOS OpenCV framework is simply not at the same feature level as the Mac OS one?

VideoCapture *videoCapture = new VideoCapture(filePath);
videoCapture->set(CV_CAP_PROP_POS_FRAMES, numframe);

...

while (videoCapture->read(grabbedFrame)) {
    int pos = (int)videoCapture->get(CV_CAP_PROP_POS_FRAMES);
    ...
}

Notes :

  • Currently my target is the iOS simulator. I have not yet checked with a real iPhone.

  • I have the same behaviour for MSEC and FRAMES modes.
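If set(CV_CAP_PROP_POS_FRAMES, n) really is a no-op in the iOS build, one workaround is to advance to the target frame by grabbing and discarding frames sequentially. A minimal sketch of that fallback (plain C++, no OpenCV dependency; `grabNext` stands in for videoCapture->grab(), and the function name is mine):

```cpp
#include <functional>

// Hypothetical fallback when the backend ignores frame-position seeks:
// grab and discard frames until the desired index is reached.
// `grabNext` returns false at end of stream; the function returns the
// position actually reached, so the caller can detect a short stream.
int seekByGrabbing(int target, const std::function<bool()>& grabNext) {
    int pos = 0;
    while (pos < target && grabNext())
        ++pos;
    return pos;
}
```

On a real capture this would be called as `seekByGrabbing(numframe, [&]{ return videoCapture->grab(); })`. Sequential grabbing is slow for large offsets, but it does not rely on a seek that the iOS backend may not implement.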

2014-09-11 05:36:02 -0600 received badge  Editor (source)
2014-09-11 04:01:51 -0600 asked a question cvCreateMat overwrites another Mat with objective C ?

Hi,

I am facing a strange issue. The code below is supposed to display an OpenCV image (matFrame) in an NSImageView control and then allocate another Mat (matAux) right after. The issue is that writing into the newly allocated Mat (matAux) overwrites matFrame: the displayed frame is no longer the loaded one. Moreover, the display is blue rather than green, which suggests a channel mismatch as well. It looks like the data of the new matAux overlaps matFrame's, which should not happen unless I missed something.

Things I have checked :

  • Removing the data initialization of matAux (setTo, or any other write) does not produce the problem.

  • Using copyTo instead of cvCreateMat works, but that is not what I want: the new matAux has nothing to do with matFrame.

  • Allocating matAux as a smaller Mat partly fills matFrame with blue.

  • The code is triggered by a button press. If the creation and initialization of matAux are triggered from a separate button, it works...

  • If matAux is created at application start (in applicationDidFinishLaunching), it works. But that is not what I want either...

Any help, or an eye-opener on a Mac OS / OpenCV beginner's mistake, would really be appreciated.

Thanks in advance.

@implementation AppDelegate
Mat matFrame;
Mat matAux;

- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
    // Insert code here to initialize your application

}



- (IBAction)go:(id)sender {
    // Loads image to be displayed
    matFrame = imread("/Users/.../image.bmp");

    // Display image in control
    NSImage* img = nil;
    NSBitmapImageRep* bitmapRep = nil;
    Mat dispImage;
    cvtColor(matFrame, dispImage, CV_BGR2RGB);
    bitmapRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:&dispImage.data pixelsWide:dispImage.cols pixelsHigh:dispImage.rows bitsPerSample:8 samplesPerPixel:3 hasAlpha:NO isPlanar:NO colorSpaceName:NSCalibratedRGBColorSpace bytesPerRow:dispImage.step bitsPerPixel:0];
    img = [[NSImage alloc] initWithSize:NSMakeSize(dispImage.cols, dispImage.rows)];

    [img addRepresentation:bitmapRep];
    [_imageView setImage:img];

    dispImage.release();
    bitmapRep = nil;
    img = nil;

    // Allocate another Mat
    matAux = cvCreateMat(matFrame.rows, matFrame.cols, CV_8UC3);
    // Fill Mat with green
    matAux.setTo(Scalar(0,255,0));
}
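The overlap suspicion can at least be tested directly. Below is a minimal diagnostic sketch (plain C++, no OpenCV; the helper name is mine) that checks whether two byte ranges intersect:

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical diagnostic for the overlap suspicion above: returns true
// if the half-open byte ranges [a, a+aLen) and [b, b+bLen) intersect.
bool buffersOverlap(const std::uint8_t* a, std::size_t aLen,
                    const std::uint8_t* b, std::size_t bLen) {
    const std::uintptr_t ai = reinterpret_cast<std::uintptr_t>(a);
    const std::uintptr_t bi = reinterpret_cast<std::uintptr_t>(b);
    return ai < bi + bLen && bi < ai + aLen;
}
```

Called right after the allocation, e.g. as `buffersOverlap(matFrame.data, matFrame.total() * matFrame.elemSize(), matAux.data, matAux.total() * matAux.elemSize())`, it would confirm or rule out the overlap. Note also that cvCreateMat belongs to the legacy C API (it returns a CvMat*); in the C++ API the usual way to allocate matAux would be `matAux.create(matFrame.rows, matFrame.cols, CV_8UC3)`.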

2014-08-08 05:32:01 -0600 asked a question Drawing mat in NSImageView

Hi,

I am facing a strange issue regarding image drawing in NSImageView controls from Objective-C. The code below is the actual snippet that reproduces the issue (just set a valid path for the image):

I create two Mat objects: one by reading an image file; the second one is of the same size, but black.

A timer is supposed to update both images in their respective NSImageView control at the same time.

Drawing is done by converting the file image to RGB, since OpenCV stores images in BGR order.

Hence, the program is supposed to display the file image in the first control, and the black image in the second one, each time the timer is triggered.

For testing purpose, I defined 2 options :

SYNCHRONIZE : both drawings are done on the same timer trigger. If not defined, the controls are refreshed one after the other, on two successive timer triggers.

USE_CONVERT : converts the original image from BGR to RGB and draws the converted Mat. If not defined, the original Mat is drawn directly.

My goal is to have both USE_CONVERT and SYNCHRONIZE active. The issue is that, in that case, the file image is displayed in both controls, whereas I would like the file image in one control and the black image in the other.

Moreover, if USE_CONVERT is defined but SYNCHRONIZE is not, it works correctly!

As soon as I do not use cvtColor (USE_CONVERT not defined), I do get the black frame in the second control and the file image in the first, but of course the file image is then displayed with the wrong colours.

It looks like some object is still alive (and reused) when the two draws happen in the same timer call, but properly released when they happen on different timer calls. Can someone explain to me what is wrong with this implementation?

Thanks in advance.

#import "AppDelegate.h"

#import "opencv2/opencv.hpp"
using namespace cv;

#define USE_CONVERT
#define SYNCHRONIZE

@implementation AppDelegate
NSTimer *timer = nil;
Mat matFrame1;
Mat matFrame2;
bool first = false;

- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
    // Insert code here to initialize your application
    matFrame1 = imread("/Users/..../image.bmp");
    matFrame2 = Mat::zeros(matFrame1.rows, matFrame1.cols, CV_8UC3);

    timer = [NSTimer scheduledTimerWithTimeInterval:.1 target:self selector:@selector(timerGUI) userInfo:nil repeats:YES];

}

-(void)timerGUI
{
    NSLog(@"TimerGUI");
#ifndef SYNCHRONIZE
    if (first)
#endif
        [self drawImage : matFrame2 : self.imageView2];
#ifndef SYNCHRONIZE
    else
#endif
        [self drawImage : matFrame1 : self.imageView1];

    first = !first;
}


- (void)drawImage : (Mat)matImage : (NSImageView *)View{

    NSImage* img = nil;
    NSBitmapImageRep* bitmapRep = nil;
#ifdef USE_CONVERT
    Mat dispImage;
    cvtColor(matImage, dispImage, CV_BGR2RGB);
    bitmapRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:&dispImage.data pixelsWide:dispImage.cols pixelsHigh:dispImage.rows bitsPerSample:8 samplesPerPixel:3 hasAlpha:NO isPlanar:NO colorSpaceName:NSCalibratedRGBColorSpace bytesPerRow:dispImage.step bitsPerPixel:0];
    img = [[NSImage alloc] initWithSize:NSMakeSize(dispImage.cols, dispImage.rows)];
#else
    bitmapRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:&matImage.data pixelsWide:matImage.cols pixelsHigh:matImage.rows bitsPerSample:8 samplesPerPixel:3 hasAlpha:NO isPlanar:NO colorSpaceName:NSCalibratedRGBColorSpace bytesPerRow:matImage.step bitsPerPixel:0];
    img = [[NSImage alloc] initWithSize:NSMakeSize(matImage.cols, matImage.rows)];
#endif

    [img addRepresentation:bitmapRep];
    [View setImage:img];

#ifdef USE_CONVERT
    dispImage ...
(more)
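A likely culprit (an assumption on my part, based on Apple's documented behaviour rather than anything shown above): when initWithBitmapDataPlanes: is given a non-NULL planes pointer, the image rep does not copy the pixel data, so once the local dispImage goes out of scope its buffer is freed while the rep still points at it. The next cvtColor call can then allocate the very same buffer, leaving both NSImageViews backed by identical pixels. The general cure is to hand the consumer its own copy of the pixels (or keep the Mat alive as long as the rep). A plain C++ sketch of the copy-before-hand-off pattern (no AppKit involved, the name is mine):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Safe hand-off: give the display side its own copy of the pixels,
// owned for as long as the image is shown, instead of a raw pointer
// into a temporary buffer that will be freed when the function returns.
std::vector<std::uint8_t> copyPixels(const std::uint8_t* src, std::size_t n) {
    return std::vector<std::uint8_t>(src, src + n);
}
```

In the Objective-C code the equivalent would be either keeping dispImage alive (for example as an instance variable), or passing NULL for the planes so that NSBitmapImageRep allocates its own storage, then memcpy-ing the converted pixels into [bitmapRep bitmapData].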