
Simulate long exposure from video frames with OpenCV

I am trying to simulate a long-exposure photo by blending video frames into a single image, weighting each frame by a preset alpha. I am doing this on an iPhone, and I currently have the length of the video set to 1 second (30 frames). The alpha is meant to be 1.0/frameCount, but for now I have hard-coded 30 to represent one second of 30 FPS video capture, and I stop processing once one second of video (30 frames) has been handled. The idea is that the user can set a timer for x seconds and I will do the math to figure out how many frames to allow.
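The math for that is simply frameCount = seconds * FPS and alpha = 1.0 / frameCount. A minimal sketch of what I mean (the constant, ivar, and method names here are just for illustration, not from my actual project):

    static const double kCaptureFPS = 30.0;   // my camera is configured for 30 FPS

    - (void)setExposureDuration:(double)seconds
    {
        // e.g. 2 s -> 60 frames
        _targetFrameCount = (int)(seconds * kCaptureFPS + 0.5);

        // per-frame blend weight
        _alpha = 1.0 / _targetFrameCount;
    }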

Here is the code I am using:

- (void)processImage:(Mat&)image
{
    if (_isRecording) {

        // blend weight, hard-coded for 30 frames (1 second at 30 FPS)
        double alpha = 1.0/30;

        if (_frameCount == 0) {

            // first frame: clone so we don't keep a reference to the camera's buffer
            _exposed = image.clone();
            _frameCount++;
        } else {

            // blend the accumulated image with the incoming frame
            Mat exposed = _exposed.clone();
            addWeighted(exposed, alpha, image, 1.0 - alpha, 0.0, _exposed);
            _frameCount++;
        }

        // after 30 frames, stop and save the image
        if (_frameCount == 30) {
            _isRecording = NO;
            _frameCount = 0;

            cvtColor(_exposed, _exposed, CV_BGRA2RGB);
            UIImage *exposed = [LEMatConverter UIImageFromCVMat:_exposed];
            UIImageWriteToSavedPhotosAlbum(exposed, nil, nil, nil);
            NSLog(@"saved");
        }
    }
}

When I run this code, I basically get back a still image that looks as if it is a single frame. Here is an example:

[example output image]

Does anyone know how I can produce the desired effect of a long-exposure image from video frames, given that I know how many frames there will be?
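For reference, the equal-weight average I am ultimately after could, I think, be computed by summing the frames into a floating-point buffer and dividing by the frame count at the end. A rough, untested sketch of that idea (the _accumulator ivar and the method names are made up for illustration):

    - (void)accumulateFrame:(const Mat&)frame
    {
        if (_accumulator.empty()) {
            // float accumulator so repeated additions don't clip at 255
            _accumulator = Mat::zeros(frame.size(), CV_32FC(frame.channels()));
        }

        Mat frameF;
        frame.convertTo(frameF, CV_32F);   // promote the 8-bit frame to float
        _accumulator += frameF;            // running sum of all frames
        _frameCount++;
    }

    - (Mat)finishExposure
    {
        Mat averaged;
        // mean of the accumulated frames, converted back to 8-bit
        _accumulator.convertTo(averaged, CV_8U, 1.0 / _frameCount);
        return averaged;
    }

I am not sure whether this is the right direction, so any guidance on the blending itself would be appreciated.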