Latency with mjpg

asked 2015-09-01 07:46:44 -0600 by LBerger

updated 2015-09-08 14:00:35 -0600

Hi,

I want to calibrate my IP cam using the calibration sample, but I get about 1 s of latency when I run the program, and sometimes 10 s (ten seconds!) once I start the calibration.

I wrote this small program (the stream URL is elided here) to try to understand what happens:

#include <iostream>
#include <vector>
#include <opencv2/opencv.hpp>

using namespace cv;
using namespace std;

int main()
{
    VideoCapture capture("<stream URL>"); // MJPG URL of the IP cam
    vector<int64> tps(100);               // one tick stamp per grabbed frame
    Mat view0;
    for (size_t i = 0; i < tps.size(); i++)
    {
        if (capture.isOpened())
        {
            tps[i] = getTickCount();
            while (!capture.grab());      // spin until a frame has been grabbed
            capture.retrieve(view0);      // decode the grabbed frame
            //view0.copyTo(view);
        }
    }
    // interval between consecutive grabs, in seconds
    for (size_t i = 1; i < tps.size(); i++)
        cout << (tps[i] - tps[i-1]) / getTickFrequency() << "\n";
    return 0;
}

It gives me these results: 22 values around 0.04 s (the first images), then one value of 0.852972 s, and then many values around 0.5 s (when the buffer is empty?).

I understand that MJPG is a stream and that you have to read the whole stream to get the latest image. Is that right?

Is it possible to flush the stream before reading?
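One common workaround (a sketch added here, not from the original thread; the helper name and the buffer depth of 5 are guesses):

    #include <opencv2/opencv.hpp>

    // Grab-and-drop a few frames so the following retrieve() decodes the
    // most recent image. grab() pulls a frame off the stream without the
    // JPEG decode, so draining the buffer is cheap.
    cv::Mat latestFrame(cv::VideoCapture& capture)
    {
        for (int k = 0; k < 5; ++k)   // 5 is a guess at the buffer depth
            capture.grab();
        cv::Mat frame;
        capture.retrieve(frame);      // decode only the last grabbed frame
        return frame;
    }

Some backends also honor capture.set(cv::CAP_PROP_BUFFERSIZE, 1), but that property is not guaranteed to be supported.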

Now I have a switch, so my network is clean. I recorded a short video where you can see:

  1. my IP cam displayed with Mozilla (upper left)
  2. my IP cam displayed with OpenCV (bottom left)
  3. a webcam (Logitech 270, upper right)

As you can see, the latency of each link is different. It seems that OpenCV with FFmpeg has a higher latency than Mozilla and than the webcam (for the webcam this is logical).

Do you get the same latency?

Thanks for your answers.


Comments

I'm not sure I fully understand the problem. Does the latency occur when you retrieve the IP camera stream online, or when you read an MJPEG video file?

I think it is the first case. Anyway, when I read an MJPEG video file (no display or waitKey, using the overloaded >> operator), I get:

Mean time=0.00746769 s ; std=0.00100821 s

The video:

Metadata:
    encoder         : Lavf55.19.104
  Duration: 00:00:10.53, start: 0.000000, bitrate: 58551 kb/s
    Stream #0:0: Video: mjpeg (MJPG / 0x47504A4D), yuvj420p(pc), 1280x720 [SAR 1:1 DAR 16:9], 58727 kb/s, 30 tbr, 30 tbn, 30 tbc
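A minimal sketch of how such a per-frame timing could look (a reconstruction, not the commenter's actual code; the file name is hypothetical):

    #include <cmath>
    #include <iostream>
    #include <vector>
    #include <opencv2/opencv.hpp>

    int main()
    {
        cv::VideoCapture cap("test.mjpg");  // hypothetical file name
        cv::Mat frame;
        std::vector<double> dt;
        for (;;)
        {
            int64 t0 = cv::getTickCount();
            cap >> frame;                   // the overloaded >> operator
            if (frame.empty()) break;
            dt.push_back((cv::getTickCount() - t0) / cv::getTickFrequency());
        }
        if (dt.empty()) return 1;
        double mean = 0.0, var = 0.0;
        for (double d : dt) mean += d;
        mean /= dt.size();
        for (double d : dt) var += (d - mean) * (d - mean);
        std::cout << "Mean time=" << mean << " s ; std="
                  << std::sqrt(var / dt.size()) << " s\n";
        return 0;
    }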

I may be wrong, but I don't think the latency comes from the video codec; maybe it comes from some network latency or the camera fps?

Eduardo ( 2015-09-02 13:26:58 -0600 )

Thanks for your answer.

It's a Samsung SmartCam HDPro IP cam (http://user:[email protected]/cgi-b...) with a resolution of 1280x960 at this address.

I am wondering: at time t I use capture >> frame1 and then do some processing. Later (0.5 s later, for example) I do a new capture >> frame2.

If the real time of frame1 is t, what is the real time of frame2? Is it t + 1/25 s (for 25 fps) or is it t + 0.5 s?
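One way to answer that empirically (a sketch added here, with a placeholder URL): if the second read returns almost instantly, the frame came from a buffer and is only 1/25 s newer than frame1; if it takes up to a frame period, it is a live frame from 0.5 s later.

    #include <chrono>
    #include <iostream>
    #include <thread>
    #include <opencv2/opencv.hpp>

    int main()
    {
        cv::VideoCapture cap("<stream URL>");  // placeholder URL
        cv::Mat frame1, frame2;
        cap >> frame1;
        // simulate 0.5 s of processing between the two reads
        std::this_thread::sleep_for(std::chrono::milliseconds(500));
        auto t0 = std::chrono::steady_clock::now();
        cap >> frame2;
        std::chrono::duration<double> dt = std::chrono::steady_clock::now() - t0;
        // near 0 s    -> frame2 was buffered (real time ~ t + 1/25 s)
        // near 1/25 s -> frame2 is live (real time ~ t + 0.5 s)
        std::cout << "second read took " << dt.count() << " s\n";
        return 0;
    }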

As for the network, I am waiting for a switch so that I have my private network and can test the latency.

LBerger ( 2015-09-02 14:24:08 -0600 )

With ffplay I get this:

[mjpeg @ 00000000024bbe80] Format mjpeg detected only with low score of 25, misdetection possible!
Input #0, mjpeg, from 'http://admin:[email protected]/cgi-bin/video.cgi?msubmenu=mjpg&resolution=2':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 1280x960 [SAR 96:96 DAR 4:3], 25 tbr, 1200k tbn, 25 tbc
[swscaler @ 00000000033b26a0] deprecated pixel format used, make sure you did set range correctly
  10.40 M-V:  5.407 fd=   2 aq=    0KB vq=    0KB sq=    0B f=0/0
LBerger ( 2015-09-02 14:25:24 -0600 )

just saying, getTickCount() measures cpu time only, not wall time.

so your timing includes neither i/o nor sleeps, blocking sockets or the like.

to measure network latency, you need a realtime clock, e.g. from std::chrono
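A minimal sketch of such a wall-clock measurement (an addition, reusing the grab/retrieve pair from the question; the URL is a placeholder):

    #include <chrono>
    #include <iostream>
    #include <opencv2/opencv.hpp>

    int main()
    {
        cv::VideoCapture capture("<stream URL>");  // placeholder URL
        cv::Mat view0;
        auto t0 = std::chrono::steady_clock::now();
        capture.grab();                  // includes any blocking/network wait
        capture.retrieve(view0);
        std::chrono::duration<double> dt = std::chrono::steady_clock::now() - t0;
        std::cout << dt.count() << " s of wall time\n";
        return 0;
    }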

berak ( 2015-09-09 05:51:16 -0600 )