
High Dynamic Range Imaging using openCV on iOS producing bad output

asked 2015-11-30 19:05:26 -0500 by artandmath

I've asked this question on Stack Overflow, but maybe someone in the OpenCV community has come across the same issue:

http://stackoverflow.com/questions/34...

I'm trying to use OpenCV 3 on iOS to produce an HDR image from multiple exposures that will eventually be output as an EXR file. I noticed I was getting garbled output when I tried to create an HDR image. Thinking I had made a mistake in creating the camera response, I started from scratch and adapted the HDR imaging tutorial material from opencv.org to iOS, but it produces similar results. The following C++ code returns a garbled image:

cv::Mat mergeToHDR (vector<Mat>& images, vector<float>& times)
{
    Mat response;
    //Ptr<CalibrateDebevec> calibrate = createCalibrateDebevec();
    //calibrate->process(images, response, times);

    Ptr<CalibrateRobertson> calibrate = createCalibrateRobertson();
    calibrate->process(images, response, times);

    // create HDR
    Mat hdr;
    Ptr<MergeDebevec> merge_debevec = createMergeDebevec();
    merge_debevec->process(images, hdr, times, response);

    // create LDR
    Mat ldr;
    Ptr<TonemapDurand> tonemap = createTonemapDurand(2.2f);
    tonemap->process(hdr, ldr);

    // create fusion
    Mat fusion;
    Ptr<MergeMertens> merge_mertens = createMergeMertens();
    merge_mertens->process(images, fusion);

    /*
    Uncomment to choose which tonemapped or HDR image to return.
    Returning one of the images in the input array produces ungarbled
    output, so the problem is unlikely to be in the OpenCV to UIImage
    conversion.
    */

    //give back one of the images from the image array
    //return images[0];

    //give back one of the hdr images
    return fusion * 255;
    //return ldr * 255;
    //return hdr;
}

This is what the image looks like:

Bad image

I've analysed the image, tried various colour space conversions, but the data appears to be junk.

The OpenCV framework is the latest compiled 3.0.0 version from the opencv.org website. The RC and alpha produce the same results, and the current version won't build (for iOS or OS X). I was thinking my next steps would be to try to compile the framework from scratch, or to get the example working on another platform to see whether the issue is platform-specific or lies with the OpenCV HDR functions themselves. But before I do that I thought I would throw the issue up on Stack Overflow to see if anyone had come across the same issue or if I am missing something blindingly obvious.

I have uploaded the example Xcode project here:

https://github.com/artandmath/openCVH...

The HDR tutorial from OpenCV:

http://docs.opencv.org/master/d3/db7/...

Getting OpenCV to work with Swift was done with help from user foundry on GitHub.


Comments

If I noticed it correctly you fixed it on SO? Could you add your results here too?

StevenPuttemans (2015-12-01 04:08:57 -0500)

Yes. I've found a fix. As a new user, the system says I must wait 2 days to answer my own question. I'll post the answer soon.

artandmath (2015-12-01 05:08:08 -0500)

I just increased your karma by 25 points. Now you should be able to answer immediately!

StevenPuttemans (2015-12-01 06:00:04 -0500)

Unfortunately the system won't let me post the answer and warns me new users can't answer their own question for two days. I think it's because I signed up on the forum today.

artandmath (2015-12-01 06:28:45 -0500)

Might be possible too :D oh well enjoy the karma boost :)

StevenPuttemans (2015-12-01 06:31:14 -0500)

1 answer


answered 2015-12-03 21:29:37 -0500 by artandmath, updated 2015-12-04 16:43:29 -0500

Thanks foundry for pointing me in the right direction. The UIImage+OpenCV class extension expects 8 bits per colour channel, however the HDR functions output 32 bits per channel (which is actually what I want). Converting the image matrix back to 8 bits per channel for display purposes, before converting it to a UIImage, fixes the issue.

Here is the resulting image:

The expected result!

Here is the fixed function:

cv::Mat mergeToHDR (vector<Mat>& images, vector<float>& times)
{
    Mat response;
    //Ptr<CalibrateDebevec> calibrate = createCalibrateDebevec();
    //calibrate->process(images, response, times);

    Ptr<CalibrateRobertson> calibrate = createCalibrateRobertson();
    calibrate->process(images, response, times);

    // create HDR
    Mat hdr;
    Ptr<MergeDebevec> merge_debevec = createMergeDebevec();
    merge_debevec->process(images, hdr, times, response);

    // create LDR
    Mat ldr;
    Ptr<TonemapDurand> tonemap = createTonemapDurand(2.2f);
    tonemap->process(hdr, ldr);

    // create fusion
    Mat fusion;
    Ptr<MergeMertens> merge_mertens = createMergeMertens();
    merge_mertens->process(images, fusion);

    /*
     Uncomment to choose which tonemapped or HDR image to return.
     Convert back to 8 bits per channel because that is what
     the UIImage+OpenCV class extension is expecting.
    */

    // tone mapped
    /*
    Mat ldr8bit;
    ldr = ldr * 255;
    ldr.convertTo(ldr8bit, CV_8U);
    return ldr8bit;
    */

    // fusion
    Mat fusion8bit;
    fusion = fusion * 255;
    fusion.convertTo(fusion8bit, CV_8U);
    return fusion8bit;

    // hdr
    /*
    Mat hdr8bit;
    hdr = hdr * 255;
    hdr.convertTo(hdr8bit, CV_8U);
    return hdr8bit;
    */
}

Alternatively, here is a fix for the - (id)initWithCVMat:(const cv::Mat&)cvMat method in the UIImage+OpenCV class extension, based on one of the tutorials in the iOS section on opencv.org:

http://docs.opencv.org/2.4/doc/tutori...

When creating a new CGImageRef with floating-point data, it needs to be explicitly told that it is receiving floating-point data, and the byte order of the image data from OpenCV needs to be reversed. Now iOS has the float data! It's a bit of a hacky fix, because the method still only handles 8-bit channels or alpha-less 32-bit float channels, and doesn't take into account every kind of image that could be passed from Mat to UIImage.

- (id)initWithCVMat:(const cv::Mat&)cvMat
{
    NSData *data = [NSData dataWithBytes:cvMat.data length:cvMat.elemSize() * cvMat.total()];
    CGColorSpaceRef colorSpace;

    size_t elemSize = cvMat.elemSize();
    size_t elemSize1 = cvMat.elemSize1();

    size_t channelCount = elemSize/elemSize1;
    size_t bitsPerChannel = 8 * elemSize1;
    size_t bitsPerPixel = bitsPerChannel * channelCount;

    if (channelCount == 1) {
        colorSpace = CGColorSpaceCreateDeviceGray();
    } else {
        colorSpace = CGColorSpaceCreateDeviceRGB();
    }

    // Tell CGImageRef to use different bitmap info if handed 32-bit data
    uint32_t bitmapInfo = kCGImageAlphaNone | kCGBitmapByteOrderDefault;

    if (bitsPerChannel == 32 ){
        bitmapInfo = kCGImageAlphaNoneSkipLast | kCGBitmapFloatComponents | kCGBitmapByteOrder32Little;
    }

    CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);

    // Creating CGImage from cv::Mat
    CGImageRef imageRef = CGImageCreate(cvMat.cols,                                 //width
                                        cvMat.rows,                                 //height
                                        bitsPerChannel,                             //bits per component
                                        bitsPerPixel,                               //bits per pixel
                                        cvMat.step[0],                              //bytesPerRow
                                        colorSpace,                                 //colorspace
                                        bitmapInfo,                                 // bitmap info
                                        provider,                                   //CGDataProviderRef
                                        NULL,                                       //decode
                                        false,                                      //should interpolate
                                        kCGRenderingIntentDefault                   //intent
                                        );

    // Getting UIImage from CGImage
    self = [self initWithCGImage:imageRef];
    CGImageRelease(imageRef);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(colorSpace);

    return self;
}

Comments

if you think of it, your problem has absolutely nothing to do with ios or your bloody hardware.

imho, all references to that should be removed.

berak (2015-12-04 10:22:59 -0500)

@berak - I think it has everything to do with iOS, but you are right, nothing to do with the camera. I've expanded my answer.

artandmath (2015-12-04 16:41:11 -0500)


Stats

Asked: 2015-11-30 19:05:26 -0500

Seen: 865 times

Last updated: Dec 04 '15