
highgui.VideoCapture buffer introducing lag

asked 2014-03-14 03:23:17 -0500

InakiE

Dear all,

I am working with a BeagleBone Black on a project intended for future use on a different machine with lower capabilities. I have a kernel timer that fires a trigger every X seconds (the user can change this value). When the program starts I open the webcam with VideoCapture, and at every tick of the timer I get a frame either with "camera >> cameraFrame" or with ", 0)".

The problem is that the VideoCapture buffer introduces lag: every new frame I get is not the very latest frame but one of the last few, so I see a lag of about 10 seconds (variable).

I have read that one possible solution is to create a new thread that continuously reads camera frames into a cv::Mat, so that when the main thread needs the latest frame it just asks for it. The problem with this solution is that we need to avoid this kind of process: when the timer fires I should get a frame, and everything else should stop.

So, is there any way to modify the VideoCapture buffer, or even to clear it before acquiring a new frame?

Thanks in advance for your help!




I ended up with a separate thread and a dual-buffer system, but since my target is an i7 PC it has plenty of time and RAM for multiple image copies. In a timer-based single-thread system you need a non-blocking way to check whether a new frame has come in. If you use cv::Mat, make sure you use clone() or copyTo() so that the actual pixel buffer is copied, to prevent threading problems.

GrumbleLion ( 2014-03-14 13:29:22 -0500 )

3 answers


answered 2014-04-10 02:27:10 -0500

Tytan

Same problem here. I didn't find a proper solution, just a hack that does what I want: I found that this buffer accumulates a constant number of images if you don't read them. In my case, the buffer always contains 5 images. So every time I needed an image I did something (wrong) like this: "for(i=0; i<6; i++) { cam >> img; }".

Reading from the buffer doesn't take long, and this doesn't seem to cause any problems as long as you don't read too often from your camera. As I said, if you constantly read from VideoCapture the buffer is empty, and if you then try to read 5 images while the buffer is empty, you'll have to wait for the camera to capture them. What I mean is: if you only need an image from time to time, just use this hack, but if (like me) you sometimes need to do real-time analysis, it may be a bit more complicated.



I have to say thank you: you saved my life!

n5ken ( 2017-10-10 02:11:32 -0500 )

answered 2016-09-26 15:02:32 -0500

If you only display the frames that come with a delay, you will get a low frame rate. Not sure if this is what Valdaine already meant, but based on his solution I came up with a way to remove the delay while keeping the original frame rate.

First you have to flush the buffer, and then you start capturing frames faster than the stream's frame rate (use cv::waitKey(1), or ideally a loop without any interval).

VideoCapture camera("host-address:port");
Mat frame;

while (cv::waitKey(1) != 27)
{
    camera >> frame;
    cv::imshow("IP Camera", frame);
}

The first frame read from the buffer has a ~2 ms delay, so the loop must only be escaped after reading the second frame that shows a delay.

void flush(VideoCapture& camera)
{
    Mat frame;
    int delay = 0;
    QElapsedTimer timer;

    int framesWithDelayCount = 0;

    while (framesWithDelayCount <= 1)
    {
        timer.start();
        camera >> frame;
        delay = timer.elapsed();

        if (delay > 0)
            framesWithDelayCount++;
    }
}

answered 2014-07-27 13:44:26 -0500

Dear all, I experienced the same problem as InakiE. Thanks to Tytan, I tried the solution of reading several images from the buffer, and I tried an enhancement: measuring the time each capture takes. I noticed that it takes around 4 ms to capture from the buffer and then around 30 ms when the buffer is empty. I wrote the following code (I use the Qt library, but it can easily be adapted):

   int delai = 0;
   QTime h;
   do
   {    h.start();             // Start the chronometer
        cap >> AcquisitionFrame; // Declarations not included in this example
        delai = h.elapsed();   // Delay in ms since the start command
        qDebug() << delai;     // Display the delay for test
                               // and measurement purposes in debug window
   } while (delai < 15);       // Loop while reads come from the buffer
                               // (threshold between the ~4 ms and ~30 ms cases)
With that configuration, the number of frames to discard is independent of the time it takes to process the image. We are "just in time".

I hope this can help others.


Question Tools


Asked: 2014-03-14 03:23:17 -0500

Seen: 12,960 times

Last updated: Sep 26 '16