OpenCV + WebCam + H.264 Streaming
Hi, I have the following problem. I want to grab frames from a webcam (I know how to do that in OpenCV, I am familiar with the library), then take each frame, encode it with H.264 and send it to another computer. The other computer needs to decode each frame with OpenCV and process it.
My main problems are:
- I do not know if it is possible to encode H.264 frame by frame (I think it may only be possible with a buffer of frames).
- I do not know the best way to send the video frame by frame to another computer over the network.
- I want to keep the framerate as high as possible.
I have searched in lots of forums, but every new post made me more confused...
I am working on Linux with OpenCV 3 on a Raspberry Pi (but my question is about Linux in general, not specific to the RPi).
Thanks in advance, and new ideas and proposals are welcome :)
If your OpenCV build has GStreamer support built in, you could use OpenCV's VideoWriter with a custom (server) pipeline; see the sketch below.
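A minimal sketch of that idea, assuming OpenCV (3.4+ for the `apiPreference` constructor) built with GStreamer support and the `x264enc` / `avdec_h264` GStreamer plugins installed; the host address, port, bitrate and camera index are placeholders you would adapt to your setup:

```python
import cv2

# --- sender (Raspberry Pi) ---
HOST, PORT = "192.168.1.50", 5000          # placeholder receiver address

cap = cv2.VideoCapture(0)                  # webcam
fps = 30
width  = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

# appsrc takes BGR frames from OpenCV, x264enc encodes them to H.264
# (tune=zerolatency keeps per-frame latency low), rtph264pay/udpsink
# send RTP packets to the receiver.
send_pipeline = (
    "appsrc ! videoconvert ! "
    "x264enc tune=zerolatency speed-preset=ultrafast bitrate=800 ! "
    "rtph264pay config-interval=1 pt=96 ! "
    "udpsink host={} port={}".format(HOST, PORT)
)
out = cv2.VideoWriter(send_pipeline, cv2.CAP_GSTREAMER, 0, fps, (width, height), True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    out.write(frame)                       # each frame is encoded and sent

cap.release()
out.release()
```

```python
import cv2

# --- receiver (other computer) ---
# Depayload the RTP stream, decode H.264, and hand raw frames to appsink
# so VideoCapture.read() returns ordinary BGR frames.
recv_pipeline = (
    'udpsrc port=5000 caps="application/x-rtp, media=video, '
    'encoding-name=H264, payload=96" ! '
    "rtph264depay ! avdec_h264 ! videoconvert ! appsink"
)
cap = cv2.VideoCapture(recv_pipeline, cv2.CAP_GSTREAMER)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # ... process frame with OpenCV here ...
```

This answers the "frame by frame" worry: you keep feeding single frames to the writer, and the encoder handles any internal buffering for you.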
Also look up MJPEG streaming: you can send single (JPEG- or PNG-encoded) frames via TCP; a minimal sketch follows.
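A minimal sketch of the per-frame JPEG approach, assuming a plain TCP socket with a simple length-prefix framing of my own choosing (host, port and JPEG quality are placeholders):

```python
import cv2
import socket
import struct

HOST, PORT = "192.168.1.50", 6000          # placeholder receiver address

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect((HOST, PORT))

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # JPEG-compress the frame; lower quality -> smaller packets, higher framerate.
    ok, jpg = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 80])
    if not ok:
        continue
    data = jpg.tobytes()
    # Prefix each frame with its byte length so the receiver can split the stream.
    sock.sendall(struct.pack(">I", len(data)) + data)
```

On the other computer you read the 4-byte length, then that many bytes, and rebuild the frame with `cv2.imdecode(np.frombuffer(data, np.uint8), cv2.IMREAD_COLOR)`. This is simpler than H.264 but uses noticeably more bandwidth, since every frame is compressed independently.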