Well hello, future internet person with the same problem as me.
I ran out of patience and fell back to a UNIX pipe solution. What I did was something like this:
gst-launch-1.0 v4l2src device=/dev/video1 ! 'image/jpeg,width=640,height=480,framerate=15/1' ! filesink buffer-size=0 location=/dev/stdout | ./camera_app
i.e. I open the camera with GStreamer, ask for an image/jpeg stream (forcing it to use MJPEG), and send it to a filesink that writes to stdout. I pipe that into my camera app, which I tell to VideoCapture.open the file /dev/stdin. This is rather ugly and it does add some delay, so I'd prefer not to do it this way, but it works for three USB cameras in parallel on a UDOO board (similar to Wandboard etc.).
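For reference, a minimal sketch of the reading side (my actual camera_app does more than this; the window name and error handling are just illustrative):

    // Reads the MJPEG stream arriving on the pipe instead of opening a real device.
    // Build against OpenCV, e.g.: g++ camera_app.cpp -o camera_app `pkg-config --cflags --libs opencv4`
    #include <opencv2/opencv.hpp>
    #include <cstdio>

    int main()
    {
        cv::VideoCapture cap;
        // Open the pipe end; OpenCV/FFmpeg demuxes and decodes the MJPEG frames.
        if (!cap.open("/dev/stdin")) {
            std::fprintf(stderr, "Could not open /dev/stdin\n");
            return 1;
        }

        cv::Mat frame;
        while (cap.read(frame)) {
            // ... process the frame here ...
            cv::imshow("camera", frame);
            if (cv::waitKey(1) == 27)   // Esc to quit
                break;
        }
        return 0;
    }

For multiple cameras you just start one gst-launch-1.0 | camera_app pair per /dev/videoN device; each runs in its own process.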