Android: Read Streaming from FFmpeg

asked 2018-03-14 04:51:21 -0500

chrichri


I receive streaming frames on Android (from a GoPro). I need to restream the frames locally and read them with OpenCV to display on the screen. How can I do this? What protocol can I use to restream them locally?

Thank you very much!

Best Regards!



once you have the images in memory, it should be easy to translate them to an OpenCV Mat.

please show (the relevant part of) your code !

berak ( 2018-03-14 04:56:02 -0500 )

So your suggestion is to take the GoPro frames from FFmpeg, save them to memory, and then read them with OpenCV, is that right? Would that also work for displaying the LIVE stream?

chrichri ( 2018-03-14 04:59:15 -0500 )

again, we need to know what you have in order to help.

berak ( 2018-03-14 05:01:22 -0500 )

I'm sorry. I have the FFmpeg library, which receives frames from the GoPro, and I can restream those frames over any protocol as a local stream (UDP, RTP, RTSP, etc.). I only need OpenCV to read those frames from this local stream URL and display the live stream on the screen. Is it possible to find out how OpenCV can read this local stream (and which protocol it expects), and how to display it on the screen?

Thank you very much!

chrichri ( 2018-03-14 05:11:40 -0500 )

OpenCV cannot read from FFmpeg directly.

on Android, it's not possible for OpenCV to read any kind of stream (be it internal or external); there is no backend (like FFmpeg, again, or GStreamer) for this functionality.

you'll have to find your own way here.

maybe you can start as simply as saving the current image to disk and trying to read it with imread(); later, if you find a better idea, improve on that.

(if you're halfway good with sockets, implementing an MJPEG or (raw) UDP client might be another idea)
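The socket-client idea could start as simply as scanning the incoming byte stream for JPEG start/end markers. A rough sketch in plain Python (stdlib only; a real client would fill buf from a socket, and each extracted frame could then be decoded as above — note the naive scan can be fooled by an 0xFFD9 byte pair occurring inside compressed data):

```python
def extract_jpegs(buf: bytes):
    """Yield complete JPEG frames found in buf by scanning for the
    SOI (FF D8) and EOI (FF D9) markers; trailing partial data is ignored."""
    start = 0
    while True:
        soi = buf.find(b"\xff\xd8", start)       # start-of-image marker
        if soi < 0:
            return
        eoi = buf.find(b"\xff\xd9", soi + 2)     # end-of-image marker
        if eoi < 0:
            return                               # frame not complete yet
        yield buf[soi:eoi + 2]
        start = eoi + 2
```

Each yielded buffer is one candidate JPEG frame, ready to be handed to a decoder.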

berak ( 2018-03-14 07:25:54 -0500 )