# highgui.VideoCapture buffer introducing lag

Dear all,

I am working with a BeagleBone Black on a project intended for later use on a machine with lower capabilities. I have a kernel timer that fires a trigger every X seconds (the user can change this value). When the program starts I open the webcam with VideoCapture, and at every tick of the timer I grab a frame with either `camera >> cameraFrame` or `camera.read(cameraFrame, 0)`.

The problem is that the VideoCapture buffer introduces lag: each frame I get is not the very latest one but one of the last few, so I see a delay of about 10 seconds (it varies).

I have read that one possible solution is to create a separate thread that continuously reads camera frames into a cv::Mat, so that when the main thread needs the latest frame it just asks for it. The problem with this solution is that we need to avoid this kind of always-running process: when the timer fires I should grab one frame, and then everything should go idle again.

So, is there any way to change the size of the VideoCapture buffer, or even to clear it before acquiring a new frame?



I ended up with a separate thread and a dual-buffer system, but since my target is an i7 PC it has plenty of time and RAM for multiple image copies. In a timer-based single-thread system you need a non-blocking way to check whether a new frame has come in. If you use cv::Mat, make sure you use clone() or copyTo() so that the actual buffer is copied, to prevent threading problems.

(2014-03-14 13:29:22 -0500)



I have to say thank you: you saved my life!

(2017-10-10 02:11:32 -0500)

If you only display frames that arrive with a delay, you will get a low frame rate. I'm not sure if this is what Valdaine already meant, but based on his solution I came up with a way to remove the delay while keeping the original frame rate.

First you have to flush the buffer, and then you capture frames faster than the stream's frame rate (use cv::waitKey(1), or ideally a loop without any delay).

    VideoCapture camera("host-address:port");
    Mat frame;

    flush(camera);

    while (cv::waitKey(1) != 27)
    {
        camera >> frame;
        cv::imshow("IP Camera", frame);
    }


Even the first frame read from the buffer shows a ~2 ms delay, so the flush loop must only exit after reading the second frame that arrives with a measurable delay.

    void flush(VideoCapture& camera)
    {
        int delay = 0;
        QElapsedTimer timer;
        int framesWithDelayCount = 0;

        while (framesWithDelayCount <= 1)
        {
            timer.start();
            camera.grab();            // discard one frame
            delay = timer.elapsed();  // ms spent waiting for the frame

            if (delay > 0)            // buffered frames return in ~0 ms
            {
                framesWithDelayCount++;
            }
        }
    }


Dear all, I experienced the same problem as InakiE. Thanks to Tytan, I tried the solution of reading several images from the buffer, and I added an enhancement: measuring the time it takes to capture each image. I noticed that it takes around 4 ms to read a frame from the buffer, and then around 30 ms once the buffer is empty. I wrote the following code (it uses the Qt library but can easily be adapted):

    int delai = 0;
    QTime h;
    do
    {
        h.start();                // start the chronometer
        cap >> AcquisitionFrame;  // declarations not included in this example
        delai = h.elapsed();      // delay in ms since start()
        qDebug() << delai;        // display the delay for test and
                                  // measurement purposes in the debug window
    }
    while (delai < 10);


With this approach, the number of frames to drop is independent of the time it takes to process an image. We are "just in time".

I hope this helps others.



## Stats

Seen: 17,921 times

Last updated: Sep 26 '16