Delay when grabbing frames from webcam
I want to use an external trigger to take a single photo from a webcam. However, my webcam behaves differently than I expected. I placed the webcam in front of a digital clock, took a photo every 2 minutes, and saved each photo as a .jpg with the current timestamp in the filename. The clock in the first photo matches the first timestamp. The next four photos, however, also show the clock at the first timestamp (although they were taken at 2 min, 4 min, 6 min and 8 min). The clock in the 6th photo shows the time of the second timestamp (the clock shows 2 min whereas the jpg was saved at 10 min), the 7th photo shows the time of the 3rd timestamp, and so on.
Timestamp of jpg    Clock in the photo
00:00               00:00
00:02               00:00
00:04               00:00
00:06               00:00
00:08               00:00
00:10               00:02
00:12               00:04
00:14               00:06
00:16               00:08
I have tried 2 webcams, 2 operating systems, and different time intervals between 10 seconds and 2 minutes. I have also tried the OpenCV function "read" as well as "grab" and "retrieve" and see no difference.
Code:
import cv2
import time

cap = cv2.VideoCapture(0)
steps = [120, 120, 120, 10, 10, 10, 60, 60, 10, 10, 10, 10, 10, 10, 0]
for n in range(len(steps)):
    t = time.strftime("%Y-%m-%d_%H-%M-%S")
    #ret, frame = cap.read()
    ret = cap.grab()
    ret, frame = cap.retrieve()
    cv2.imwrite('test_' + t + '.jpg', frame)
    time.sleep(steps[n])
cap.release()
Is this the normal expected behaviour of a webcam or is something broken here? Am I using the right functions for my purpose?
Thank you for your help!
My first guess, like @berak said: your webcams have an internal frame buffer, so retrieve() hands you an old buffered frame rather than a current one. Many such cameras ship with their own SDK that lets you shrink the buffer to a single frame; with that setting you always get the latest frame. A quick and dirty workaround is to grab several frames each time (e.g. 5) and only keep the last one, since a typical frame buffer holds about 2-5 frames.
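A minimal sketch of that workaround, adapted from your loop (the flush count of 5 and the helper name grab_latest_frame are my assumptions; cv2.CAP_PROP_BUFFERSIZE is a real OpenCV property but is only honoured by some capture backends, so the set() call may simply have no effect):

import cv2
import time

cap = cv2.VideoCapture(0)

# Ask the driver to keep only one frame in its buffer.
# Not all backends support this; set() returns False if it is ignored.
cap.set(cv2.CAP_PROP_BUFFERSIZE, 1)

def grab_latest_frame(cap, flush=5):
    # Discard a few buffered frames so the following retrieve() is current.
    # 5 is a guess based on typical buffer sizes of 2-5 frames.
    for _ in range(flush):
        cap.grab()
    return cap.retrieve()

steps = [120, 120, 120, 10, 10, 10, 60, 60, 10, 10, 10, 10, 10, 10, 0]
for n in range(len(steps)):
    t = time.strftime("%Y-%m-%d_%H-%M-%S")
    ret, frame = grab_latest_frame(cap)
    if ret:
        cv2.imwrite('test_' + t + '.jpg', frame)
    time.sleep(steps[n])
cap.release()

If the photos still lag behind the clock, increase the flush count until the timestamps line up; that tells you roughly how deep the camera's buffer actually is.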