Why is there a delay between a .read and the moment the picture is taken?

Hi, first I'm sorry for my poor English; thank you for your indulgence. I've written a simple program in C++, on my RPi3 running Linux, with the OpenCV 2 library. The goal is to take a picture from a USB camera every 100 ms.

There is a software interrupt every 100 ms, which seems to work perfectly: with an oscilloscope I check the moments of entry into and exit from this interrupt, and I measure one entry every 100 ms (±100 µs), with less than 5 µs between entry and exit. Then I added a cap.read() inside the interrupt to capture a frame every 100 ms. The interrupt still fires every 100 ms, and the time between entry and exit becomes about 10 to 20 ms (the duration of .read()). I suppose this duration depends on the brightness of the target.
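
For context, here is a minimal sketch of this setup (not my exact code: the 100 ms tick is done with a Linux timerfd here, and the camera index 0 is an assumption):

    // Minimal sketch: 100 ms periodic tick via Linux timerfd, one cap.read() per tick.
    #include <opencv2/opencv.hpp>
    #include <sys/timerfd.h>
    #include <unistd.h>
    #include <cstdint>

    int main() {
        cv::VideoCapture cap(0);                      // USB camera assumed at index 0
        if (!cap.isOpened()) return 1;

        int tfd = timerfd_create(CLOCK_MONOTONIC, 0); // periodic timer on the monotonic clock
        itimerspec spec{};
        spec.it_value.tv_nsec    = 100000000;         // first expiry after 100 ms
        spec.it_interval.tv_nsec = 100000000;         // then every 100 ms
        timerfd_settime(tfd, 0, &spec, nullptr);

        cv::Mat frame;
        for (int i = 0; i < 14; ++i) {
            uint64_t expirations;
            read(tfd, &expirations, sizeof expirations); // blocks until the next 100 ms tick
            cap.read(frame);                             // grab + decode one frame
            cv::imwrite(cv::format("pict%02d.png", i), frame);
        }
        close(tfd);
        return 0;
    }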

Next, my target is the stopwatch of my smartphone, and I realize that the pictures don't show the expected times (I keep monitoring the timing of the interrupts, which remains perfect). A typical example:

pict 00 : 10sec 85
pict 01 : 10sec 95
pict 02 : 11sec 08 (+30ms too much)
pict 03 : 11sec 11 !!!!!
pict 04 : 11sec 15
pict 05 : 11sec 32
pict 06 : 11sec 37
pict 07 : 11sec 46
pict 08 : 11sec 56
pict 09 : 11sec 66
pict 10 : 11sec 76
pict 11 : 11sec 86
pict 12 : 11sec 96
pict 13 : 11sec 06

Is the stopwatch of my smartphone at fault? I think that's unlikely. Is there a latency between a .read and the real moment when the picture is taken?
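
For completeness, here is a minimal sketch of how I could log the read timing in software instead of with the oscilloscope (again an assumption-level sketch with camera index 0, not my original program). If a read returns much faster than the camera's frame interval, the frame might have been queued by the driver earlier, which could explain the offset:

    // Minimal sketch: pace reads at 100 ms and log how long each cap.read() blocks.
    // A read that returns much faster than the camera's frame interval suggests
    // the frame was already queued (i.e. exposed earlier than the read call).
    #include <opencv2/opencv.hpp>
    #include <chrono>
    #include <thread>
    #include <cstdio>

    int main() {
        cv::VideoCapture cap(0);                      // USB camera assumed at index 0
        if (!cap.isOpened()) return 1;

        cv::Mat frame;
        auto next = std::chrono::steady_clock::now();
        for (int i = 0; i < 14; ++i) {
            next += std::chrono::milliseconds(100);   // 100 ms period, like the interrupt
            std::this_thread::sleep_until(next);

            auto before = std::chrono::steady_clock::now();
            cap.read(frame);
            auto after  = std::chrono::steady_clock::now();

            std::printf("pict %02d : read blocked %.1f ms\n", i,
                        std::chrono::duration<double, std::milli>(after - before).count());
        }
        return 0;
    }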

Can you explain this phenomenon to me?

Best