
Is it possible to get frame timestamps for live streaming video frames on Linux?

asked 2015-05-03 08:18:10 -0500

visiony2

I'm using OpenCV version 2.3 with a USB camera, a Logitech C920, on Linux (Ubuntu).

I'm able to open the video stream; however, querying CV_CAP_PROP_POS_MSEC (current position in milliseconds) does not work.

I've used other wrappers around V4L where similar calls have worked (e.g. via V4L4J).

Can someone clarify:

  • Is there indeed no way to use OpenCV's video API with a USB camera and get timestamps on each frame?

    • Workarounds I can imagine: a flag to build OpenCV with; a newer version of the OpenCV API; some configuration call on the lower-level V4L drivers; etc.

I'm trying to work out the latency in a processing pipeline and coordinate cameras across machines, and without timestamps it's a real challenge.

I see numerous informal answers suggesting OpenCV doesn't offer timestamps on live-streaming video, but nothing definitive.

At the same time, there are hints that there are ways to do this.

So, can anyone clarify how to get each frame's timestamp, or confirm that it cannot be done?

Thank you!!



I don't know why, but there is cap_v4l.cpp (which keeps timestamps) and cap_libv4l.cpp (which does not).

berak ( 2015-05-03 09:47:39 -0500 )

@berak - thank you for the pointer; I'll try it later this week and report what I find.

visiony2 ( 2015-05-07 05:27:19 -0500 )

1 answer


answered 2015-05-09 07:48:14 -0500

visiony2

updated 2015-05-09 21:27:47 -0500

So, after many hours of digging into berak's pointer to the "cap_v4l.cpp" vs. "cap_libv4l.cpp" distinction, the answer is: yes ---> you can get it working... but you'll need to patch 'cap_libv4l.cpp'.

You can download the modified one from:

History and Usage Tips:

Sometime before OpenCV version 2.4, someone copied and pasted 'cap_v4l.cpp' to create 'cap_libv4l.cpp' (cap_v4l.cpp isn't even built anymore). However, the port was incomplete.

Using cap_v4l.cpp as a guide, I modified the 3.0 rc1 version of cap_libv4l.cpp, first adding support for CV_CAP_PROP_POS_MSEC, and then also for CV_CAP_PROP_POS_FRAMES and CV_CAP_PROP_FPS.

It definitely strikes me that these changes ought to be upstream, and that 'cap_v4l.cpp' should be removed to avoid future confusion.

Here's how to use the mods in your code, with output examples:

cv::VideoCapture videoCap(0);   // open the default camera
cv::Mat image;
videoCap >> image;

printf( "CV_CAP_PROP_POS_MSEC:   %ld\n", (long) videoCap.get( CV_CAP_PROP_POS_MSEC ) );
printf( "CV_CAP_PROP_POS_FRAMES: %ld\n", (long) videoCap.get( CV_CAP_PROP_POS_FRAMES ) );  // <-- the v4l2 'sequence' field
printf( "CV_CAP_PROP_FPS:        %f\n",  videoCap.get( CV_CAP_PROP_FPS ) );

For a Logitech C920, this gave:

CV_CAP_PROP_FPS:  30.000000
CV_CAP_PROP_POS_MSEC:   28834084     ;  diffFromLastFrame 36

where 'diffFromLastFrame' is simply the difference in 'CV_CAP_PROP_POS_MSEC' between two successive frames.
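('diffFromLastFrame' isn't an OpenCV property; it's a value I compute in my own test code. A minimal sketch of how it could be done, with the function name being my own invention:)

```cpp
// Hypothetical helper (not part of OpenCV): compute the gap between
// successive CV_CAP_PROP_POS_MSEC readings. Returns -1 on the first
// call, since there is no previous frame to compare against.
long diffFromLastFrame(long posMsec) {
    static long prevMsec = -1;
    long diff = (prevMsec < 0) ? -1 : posMsec - prevMsec;
    prevMsec = posMsec;
    return diff;
}
```

Called once per captured frame, at a steady 30 FPS this should hover around 33 ms.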

Incidentally, V4L binds the timestamp to CLOCK_MONOTONIC, which isn't Unix epoch time. Mostly it's the time elapsed since the OS booted, but if your machine has hibernated it's more complex. To simplify everything (e.g. for distributed camera processing), you'll need to add a constant offset to convert it to Unix time.

The following code does it -- pass in "(long) videoCap.get( CV_CAP_PROP_POS_MSEC )"

#include <time.h>
#include <sys/time.h>
#include <math.h>

long monotonicToEpochOffset_ms = -1; // <-- initialize in global scope

long convertToEpoch( long v4l_ts_ms ){

    if( monotonicToEpochOffset_ms == -1 ){
        struct timeval epochtime;  gettimeofday( &epochtime, NULL );
        struct timespec vsTime;    clock_gettime( CLOCK_MONOTONIC, &vsTime );

        long uptime_ms = vsTime.tv_sec * 1000 + (long) round( vsTime.tv_nsec / 1000000.0 );
        long epoch_ms  = epochtime.tv_sec * 1000 + (long) round( epochtime.tv_usec / 1000.0 );

        // add this quantity to CV_CAP_PROP_POS_MSEC to get unix-time-stamped frames
        monotonicToEpochOffset_ms = epoch_ms - uptime_ms;
    }

    return monotonicToEpochOffset_ms + v4l_ts_ms;
}

After this conversion, you can subtract the epoch-converted timestamp from the current time to get the lag between the driver's timestamp and your code. This is particularly helpful when working with the frame data from Python or Java, or on a distributed system. In my test case I'm printing:

CV_CAP_PROP_POS_MSEC - epoched:   1431173328429 ; diffFromNow: 29


berak ( 2015-05-09 08:08:02 -0500 )

Yup, I agree with @berak; a pull request contributing your solution to the official source code would be really nice.

theodore ( 2015-05-09 14:39:28 -0500 )

@berak and @theodore: good to know! I created the pull request.

visiony2 ( 2015-05-09 21:46:44 -0500 )

OpenCV's buildbots are quite picky about trailing whitespace ;(

(if you have something like Sublime Text 2 around, just saving from there will remove them all)

berak ( 2015-05-10 02:14:52 -0500 )