Android - Read Streaming from FFmpeg
Hello!
I receive streamed frames on Android (from a GoPro). I need to restream the frames locally and read them with OpenCV to display them on the screen! How can I do this? What protocol can I use to restream them locally?
Thank you very much!
Best Regards!
once you have the images in memory, it should be easy to translate those to an opencv Mat.
please show (the relevant part of) your code!
So your suggestion is to take the GoPro frames from FFmpeg, save them to memory, and then read them with OpenCV, is that right? Would that also work for displaying the LIVE stream?
again, we need to know what you have in order to help.
I'm sorry. I have the FFmpeg library receiving frames from the GoPro, and I can restream those frames locally over any protocol (UDP, RTP, RTSP, etc.). I only need OpenCV to read the frames from this local stream URL and display the live stream on the screen... Is it possible to know how OpenCV can read this local stream (and which protocol it expects), and how to display it on the screen?
Thank you very much!
opencv cannot read from ffmpeg directly.
on android, it's not possible for opencv to read any kind of stream (be it internal or external); there is no backend (like ffmpeg, again, or gstreamer) for this functionality.
you'll have to find your own way here.
maybe you can start as simply as saving the current image to disk and trying to read it using imread(); later, if you find a better approach, improve on that.
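a minimal sketch of that first step, in plain Java (no opencv needed yet): dump one encoded frame to a file, which you could then load on the opencv side. the class and method names (`FrameDump`, `dumpFrame`) are made up for illustration, and the byte array stands in for a real JPEG from your stream:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class FrameDump {
    // Write one encoded frame (e.g. a JPEG taken from the GoPro stream)
    // to disk. On the OpenCV side you could then load it with
    // Imgcodecs.imread(file.toString()) and show it, replacing the file
    // for each new frame.
    static Path dumpFrame(byte[] encodedFrame, Path dir) throws IOException {
        Path file = dir.resolve("frame.jpg");
        Files.write(file, encodedFrame);
        return file;
    }

    public static void main(String[] args) throws IOException {
        // placeholder bytes, NOT a real JPEG - just to show the roundtrip
        byte[] fakeJpeg = {(byte) 0xFF, (byte) 0xD8, 0x00, (byte) 0xFF, (byte) 0xD9};
        Path dir = Files.createTempDirectory("frames");
        Path f = dumpFrame(fakeJpeg, dir);
        System.out.println(Files.readAllBytes(f).length); // 5
    }
}
```

it's slow (disk i/o per frame), but it gets you a working display loop to improve on.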
(if you're halfway good with sockets, trying to implement an mjpeg or (raw) udp client might be another idea)
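if you go the mjpeg route, the core of such a client is just scanning the incoming byte stream for JPEG frame boundaries: SOI (FF D8) marks the start and EOI (FF D9) the end of a frame. a sketch of that splitter (class/method names are made up; each extracted frame could then be handed to opencv via Imgcodecs.imdecode()):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class MjpegSplitter {
    // Scan a raw byte buffer for complete JPEG frames delimited by the
    // SOI (FF D8) and EOI (FF D9) markers. In a real client you would feed
    // this from a socket and keep leftover bytes for the next read; each
    // returned frame could be decoded with Imgcodecs.imdecode() into a Mat.
    static List<byte[]> extractFrames(byte[] stream) {
        List<byte[]> frames = new ArrayList<>();
        int start = -1; // index of the current frame's SOI marker, or -1
        for (int i = 0; i + 1 < stream.length; i++) {
            if ((stream[i] & 0xFF) == 0xFF && (stream[i + 1] & 0xFF) == 0xD8) {
                start = i; // start of a new frame
            } else if (start >= 0
                    && (stream[i] & 0xFF) == 0xFF && (stream[i + 1] & 0xFF) == 0xD9) {
                // end of frame: copy SOI..EOI inclusive
                frames.add(Arrays.copyOfRange(stream, start, i + 2));
                start = -1;
            }
        }
        return frames;
    }

    public static void main(String[] args) {
        // two fake frames back to back (payload bytes between the markers)
        byte[] data = {(byte) 0xFF, (byte) 0xD8, 1, 2, (byte) 0xFF, (byte) 0xD9,
                       (byte) 0xFF, (byte) 0xD8, 3, (byte) 0xFF, (byte) 0xD9};
        System.out.println(extractFrames(data).size()); // 2
    }
}
```

this works because inside JPEG entropy-coded data, literal FF bytes are stuffed as FF 00, so the markers don't occur mid-frame.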