
visiony2's profile - activity

2017-07-10 00:51:48 -0600 received badge  Famous Question (source)
2016-12-21 05:55:18 -0600 received badge  Notable Question (source)
2016-08-02 00:31:39 -0600 received badge  Enlightened (source)
2016-08-02 00:31:39 -0600 received badge  Good Answer (source)
2016-06-13 02:46:32 -0600 received badge  Popular Question (source)
2015-06-02 02:09:28 -0600 answered a question OpenCV Stereo Calibration and triangulation in a user defined coordinate system

One way to do it would be:

Part 1: Use solvePnP to map your known 3D point locations onto their image coordinates in one of your two cameras, say cameraA. This gives you an SE3 (i.e. a rigid-body transform, a rotation plus a translation) from your desired 'user defined coordinate system' into the coordinate system of cameraA.

Part 2: Then, when you triangulate, you'll get 3D points in either the coordinate system of cameraA or cameraB. If it's cameraB, use the stereo calibration data to transform them into cameraA. Once the triangulated 3D points are in cameraA's coordinate system, apply the inverse of the Part 1 transform to reach your desired result.
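A minimal sketch of both parts, assuming you already have cameraA's intrinsics from calibration and your triangulated points expressed in cameraA's frame; the function and variable names here are illustrative, not part of any OpenCV API:

#include <opencv2/core/core.hpp>
#include <opencv2/calib3d/calib3d.hpp>
#include <vector>

// Maps triangulated 3D points from cameraA's frame into the user-defined frame.
// userPoints3d / imagePointsA: known 3D locations and their pixel coordinates in cameraA.
// cameraMatrixA / distCoeffsA: cameraA intrinsics from calibration.
std::vector<cv::Point3d> toUserFrame(
        const std::vector<cv::Point3f>& userPoints3d,
        const std::vector<cv::Point2f>& imagePointsA,
        const cv::Mat& cameraMatrixA, const cv::Mat& distCoeffsA,
        const std::vector<cv::Point3d>& triangulatedInCamA )
{
    // Part 1: solvePnP gives the rigid transform user-frame -> cameraA,
    // such that X_camA = R * X_user + t
    cv::Mat rvec, tvec;
    cv::solvePnP(userPoints3d, imagePointsA, cameraMatrixA, distCoeffsA, rvec, tvec);

    cv::Mat R;
    cv::Rodrigues(rvec, R);   // rotation vector -> 3x3 rotation matrix

    // Part 2: apply the inverse transform to each triangulated point:
    // X_user = R^T * (X_camA - t)
    std::vector<cv::Point3d> out;
    for (size_t i = 0; i < triangulatedInCamA.size(); ++i) {
        const cv::Point3d& p = triangulatedInCamA[i];
        cv::Mat X_camA = (cv::Mat_<double>(3, 1) << p.x, p.y, p.z);
        cv::Mat X_user = R.t() * (X_camA - tvec);
        out.push_back(cv::Point3d(X_user.at<double>(0),
                                  X_user.at<double>(1),
                                  X_user.at<double>(2)));
    }
    return out;
}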

2015-05-13 00:39:31 -0600 received badge  Enthusiast
2015-05-10 07:01:45 -0600 received badge  Nice Answer (source)
2015-05-09 21:46:44 -0600 commented answer Is it possible to get frame timestamps for live streaming video frames on linux?

@berak and @theodore: good to know! I created the pull request

2015-05-09 21:27:47 -0600 received badge  Editor (source)
2015-05-09 21:26:10 -0600 received badge  Scholar (source)
2015-05-09 10:06:07 -0600 received badge  Teacher (source)
2015-05-09 08:04:17 -0600 received badge  Self-Learner (source)
2015-05-09 07:48:14 -0600 answered a question Is it possible to get frame timestamps for live streaming video frames on linux?

So, after many hours of digging into berak's pointer about the "cap_v4l.cpp" vs. "cap_libv4l.cpp" distinction, the answer is: yes, you can get it working, but you'll need to patch 'cap_libv4l.cpp'.

You can download the modified one from: https://github.com/msandler/opencv/bl...

History and Usage Tips:

Sometime before OpenCV version 2.4, someone copied and pasted 'cap_v4l.cpp' to create 'cap_libv4l.cpp' (cap_v4l.cpp isn't even built anymore). However, the port was incomplete.

Using cap_v4l.cpp as a guide, I modified the 3.0 rc1 version of cap_libv4l.cpp: first I added support for CV_CAP_PROP_POS_MSEC, and then also for CV_CAP_PROP_POS_FRAMES and CV_CAP_PROP_FPS.

It definitely strikes me that these changes ought to go upstream, and that 'cap_v4l.cpp' should be removed to avoid future confusion.

Here's how to use the mods in your code, with output examples:

cv::VideoCapture videoCap;
....
// after:
videoCap >> image;

printf( "CV_CAP_PROP_POS_MSEC:   %ld \n", (long) videoCap.get( CV_CAP_PROP_POS_MSEC ) );
printf( "CV_CAP_PROP_POS_FRAMES: %ld \n", (long) videoCap.get( CV_CAP_PROP_POS_FRAMES ) );  // <-- the v4l2 'sequence' field
printf( "CV_CAP_PROP_FPS:  %f\n", videoCap.get( CV_CAP_PROP_FPS ) );

for a Logitech C920, this gave:

CV_CAP_PROP_POS_FRAMES: 28
CV_CAP_PROP_FPS:  30.000000
CV_CAP_PROP_POS_MSEC:   28834084     ;  diffFromLastFrame 36

where 'diffFromLastFrame' is simply the difference between two successive frames for 'CV_CAP_PROP_POS_MSEC'
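For reference, diffFromLastFrame can be computed by keeping the previous frame's timestamp around inside the capture loop; a rough sketch (prevMsec is an illustrative name, not part of the patch):

// Tracks the CV_CAP_PROP_POS_MSEC delta between successive frames.
static long prevMsec = -1;
long curMsec = (long) videoCap.get( CV_CAP_PROP_POS_MSEC );
if ( prevMsec >= 0 )
    printf( "diffFromLastFrame %ld\n", curMsec - prevMsec );
prevMsec = curMsec;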


Incidentally, V4L binds the timestamp to CLOCK_MONOTONIC, which is not the unix epoch. Mostly it's the time elapsed since the OS booted, but if your machine has hibernated it's more complex than that. To simplify everything (e.g. for distributed camera processing), you'll need to add a constant offset to convert it to unix time.

The following code does it -- pass in "(long) videoCap.get( CV_CAP_PROP_POS_MSEC )"

#include <time.h>       // clock_gettime, CLOCK_MONOTONIC
#include <sys/time.h>   // gettimeofday
#include <math.h>       // round

long monotonicToEpochOffset_ms = -1; // <-- initialize in global scope

long convertToEpoch( long v4l_ts_ms ){

    if( monotonicToEpochOffset_ms == -1 ){
        // sample the wall clock and the monotonic clock as close together as possible
        struct timeval  epochtime;  gettimeofday( &epochtime, NULL );
        struct timespec vsTime;     clock_gettime( CLOCK_MONOTONIC, &vsTime );

        long uptime_ms = vsTime.tv_sec * 1000 + (long) round( vsTime.tv_nsec / 1000000.0 );
        long epoch_ms  = epochtime.tv_sec * 1000 + (long) round( epochtime.tv_usec / 1000.0 );

        // add this quantity to CV_CAP_PROP_POS_MSEC to get unix-time-stamped frames
        monotonicToEpochOffset_ms = epoch_ms - uptime_ms;
    }

    return monotonicToEpochOffset_ms + v4l_ts_ms;
}
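For example, a minimal usage sketch (assuming videoCap is the open VideoCapture from above; this helper code is illustrative, not part of the patch):

// Convert the driver's monotonic timestamp to unix time, then compare it to "now"
// to estimate how far behind the driver this code is running.
long frame_ms = convertToEpoch( (long) videoCap.get( CV_CAP_PROP_POS_MSEC ) );

struct timeval now;  gettimeofday( &now, NULL );
long now_ms = now.tv_sec * 1000 + (long) round( now.tv_usec / 1000.0 );

printf( "CV_CAP_PROP_POS_MSEC - epoched: %ld ; diffFromNow: %ld\n",
        frame_ms, now_ms - frame_ms );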

After this conversion, you can subtract the epoch-converted timestamp from the current time to get the lag of your code behind the driver's timestamp. This is particularly helpful when working with the frame data from python or java, or on a distributed system. In my test case I'm printing:

CV_CAP_PROP_POS_MSEC - epoched:   1431173328429 ; diffFromNow: 29

2015-05-07 05:27:19 -0600 commented question Is it possible to get frame timestamps for live streaming video frames on linux?

@berak - thank you for the pointer, I'll try it later this week and report what I find

2015-05-03 09:50:19 -0600 received badge  Student (source)
2015-05-03 09:25:34 -0600 asked a question Is it possible to get frame timestamps for live streaming video frames on linux?

I'm using OpenCV version 2.3 with a USB camera (Logitech C920) on Linux (Ubuntu).

I'm able to open the video stream; however, the property CV_CAP_PROP_POS_MSEC (current position in milliseconds) does not work.

I've used other wrappers on V4L and similar calls have worked (e.g. via V4L4J).

Can someone clarify:

  • Is there indeed no way to use OpenCV's video API with a USB camera and get timestamps on each frame?

  • Workarounds I can imagine: a flag to build OpenCV with; a newer version of the OpenCV API; some configuration call on the lower-level v4l drivers; etc.

I'm trying to work out latency in a processing pipeline, and coordinate cameras across machines, and without timestamps it's a real challenge.

I see numerous informal answers to questions suggesting OpenCV doesn't offer timestamps on live-streaming video, but nothing is definitive.

At the same time, there are hints that there are ways to do this.

So, can anyone clarify how to get the frame's timestamp, or confirm that it cannot be done?

Thank you!!