Hi,
I'm working on a project that requires input from three webcams, all connected over USB, to be used with OpenCV. The code works fine with any single camera on its own, but as soon as I try to use two cameras at the same time I run into USB bandwidth issues. Using kernel module quirk modes helps somewhat, but not enough.
I've figured out that the problem does not arise when streaming from the three webcams in guvcview, because there I can opt for MJPEG compression of the video feed from the cameras. Indeed, if I select YUYV as the pixel format in guvcview, I hit the same bandwidth problems as soon as I try to stream from two or more cameras.
Therefore I want to get MJPEG streams from the cameras into OpenCV. One way to do this would be to use FIFOs and gstreamer, but that solution seems rather clunky to me. Is there a more OpenCV-esque way of doing this, by telling V4L to request an MJPEG stream from each camera and then decompressing it in-band?
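To make it clearer what I mean, here is a minimal sketch in Python of what I was hoping for, assuming a reasonably recent OpenCV that accepts an apiPreference argument to VideoCapture. The part I'm unsure about is whether setting CAP_PROP_FOURCC to MJPG on a V4L2-backed capture actually propagates down to the driver as a pixel-format request:

```python
import cv2

def open_mjpeg_camera(index, width=640, height=480):
    """Open a webcam through the V4L2 backend and ask it for MJPEG frames."""
    cap = cv2.VideoCapture(index, cv2.CAP_V4L2)
    # Request MJPG instead of raw YUYV -- the hope is that this lowers the
    # USB bandwidth per camera, the same way it does in guvcview.
    cap.set(cv2.CAP_PROP_FOURCC, cv2.VideoWriter_fourcc(*"MJPG"))
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, width)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, height)
    return cap

# Open all three cameras and read them in a round-robin loop.
caps = [open_mjpeg_camera(i) for i in range(3)]
while True:
    frames = []
    for cap in caps:
        ok, frame = cap.read()  # frames come back already decoded to BGR
        if not ok:
            break
        frames.append(frame)
    if len(frames) < len(caps):
        break
    for i, frame in enumerate(frames):
        cv2.imshow(f"camera {i}", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

for cap in caps:
    cap.release()
cv2.destroyAllWindows()
```

Is CAP_PROP_FOURCC the right property to set here, or does the V4L backend ignore the fourcc for capture devices?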
Thanks in advance!