Why is there a delay between a .read and the moment the picture is taken?

asked 2017-03-11 04:40:53 -0600 by niconol

updated 2017-03-11 12:50:33 -0600 by Tetragramm

Hi, first I'm sorry for my poor English; thank you for your indulgence. I've written a simple program in C++ on my RPi3 (Linux OS) with the OpenCV 2 library. The goal is to take a picture from a USB cam every 100 ms.

There is a software interrupt every 100 ms, which seems to work perfectly: with an oscilloscope I check the moments of entry into and exit from this interrupt. I measure one entry every 100 ms (+/- 100 µs), and the duration between entry and exit is less than 5 µs. Then I added a cap.read inside the interrupt to capture a frame every 100 ms. The interrupt still fires every 100 ms, and the duration between entry and exit becomes about 10 to 20 ms (the duration of the .read). I suppose this duration depends on the brightness of the target. A simplified sketch of the setup is shown below.
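
To make the setup concrete, here is a simplified sketch of the kind of capture loop I mean (not my exact code: my real program uses a timer interrupt, which I replace here with a sleep-based loop, and the variable names are only illustrative):

    #include <opencv2/opencv.hpp>
    #include <chrono>
    #include <thread>
    #include <iostream>

    int main()
    {
        cv::VideoCapture cap(0);              // USB cam (device 0 assumed)
        if (!cap.isOpened())
            return 1;

        cv::Mat frame;
        auto next = std::chrono::steady_clock::now();

        for (int n = 0; n < 14; ++n)
        {
            next += std::chrono::milliseconds(100);   // 100 ms period
            std::this_thread::sleep_until(next);      // stands in for the real timer interrupt

            auto t0 = std::chrono::steady_clock::now();
            cap.read(frame);                          // this call takes about 10-20 ms on the RPi3
            auto t1 = std::chrono::steady_clock::now();

            std::cout << "pict " << n << " : read took "
                      << std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count()
                      << " ms" << std::endl;

            // the frames are then saved (e.g. with cv::imwrite) and the chronometer
            // is read off the images afterwards
        }
        return 0;
    }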

Next, the target (what the camera films) is the chronometer of my smartphone, and I realise that the pictures don't show the expected times (I keep monitoring the timing of the interrupts, which stays perfect). A typical example:

pict 00 : 10sec 85
pict 01 : 10sec 95
pict 02 : 11sec 08 (+30ms too much)
pict 03 : 11sec 11 !!!!!
pict 04 : 11sec 15
pict 05 : 11sec 32
pict 06 : 11sec 37
pict 07 : 11sec 46
pict 08 : 11sec 56
pict 09 : 11sec 66
pict 10 : 11sec 76
pict 11 : 11sec 86
pict 12 : 11sec 96
pict 13 : 11sec 06

Is the chronometer of my smartphone simply not accurate? I think that's unlikely. Is there a latency between a .read and the real moment when the picture is taken?

Can you explain this phenomenon to me?

Best
