2020-11-16 08:06:55 -0600 | received badge | ● Famous Question (source) |
2018-12-13 11:26:32 -0600 | received badge | ● Notable Question (source) |
2018-01-18 03:40:08 -0600 | received badge | ● Popular Question (source) |
2016-04-29 00:39:02 -0600 | asked a question | Is OpenCV's VideoCapture.read() function skipping frames? The Problem I'm writing a video tracking application (in Python) that requires me to calculate the amount of time an object has spent in a particular region of the frame. What I've done is count the number of frames the object has been in that region and multiply that count by 1/FPS of the video. However, I've noticed that the time I calculate is incorrect even for very simple test cases. I think I've tracked the problem down to OpenCV's VideoCapture.read() function: it doesn't seem to grab all the frames available in the video. I've tested this with the following code: The output of this block of code for my "Test 1.avi" file is: As you can see, the number of frames I've counted and read is not the same as the number of frames in the video file. In fact, the number of frames I count is about half the number of frames in the video. FYI: "Test 1.avi" is 2.5 mins long, and 5171 * 0.0291557 sec = 2.5 mins, so 5171 is an accurate count of the number of frames in "Test 1.avi." So ... why is this happening? Is OpenCV's VideoCapture.read() function skipping frames? Software Information I am running:
|