Revision history

initial version

Opening Gstreamer pipeline (from raspberry pi)

Hi all,

This is my first post here, and I'm quite new to Linux/C++/GStreamer etc., so go easy on me :)

I've been searching for an answer for a couple of weeks now, but have yet to find a good one. I have partial solutions which do not add up.

I am trying to stream the camera feed from a Raspberry Pi to my PC and process it in OpenCV. What I have succeeded in doing so far:

Step 1: Stream with netcat and open in mplayer

Sender: raspivid -n -t 0 -w 1280 -h 720 -fps 49 -o - | nc 10.x.x.x 5000

Receiver: nc -l -p 5000 | mplayer -fps 120 -cache 1024 -

Step 2: Stream with netcat and pipe into my VidCap C++ program: nc -l -p 5001 | ./VidCap

The VideoCapture object worked with the following line in C++:

cv::VideoCapture cap("/dev/stdin");

and displayed the video! :) However, a huge lag (3-5 sec) was involved in both cases.

Next I turned to GStreamer, which is said to be faster, and here is where I need help. Again, step 1 was streaming to a player, according to tutorials I found:

Sender:

raspivid -n -vs -awb auto -ex auto -t 0 -b 5000000 -w 1280 -h 720 -fps 49 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=10.x.x.x port=5000

Receiver: gst-launch-1.0 -v tcpclientsrc host=10.x.x.x port=5000 ! gdpdepay ! rtph264depay ! video/x-h264, format=I420, width=1280, height=720, framerate=120/1 ! avdec_h264 ! videoconvert ! autovideosink sync=false

which works great without lag.

Step 2: use in OpenCV. This time I tried to open the stream with the VideoCapture object like this:

std::string filename1 = "tcpclientsrc host=10.x.x.x port=5000 ! gdpdepay ! rtph264depay ! video/x-h264, format=I420, width=1280, height=720, framerate=49/1 ! avdec_h264 ! appsink sync=false";

cv::VideoCapture cap(filename1);

cv::Mat img;
for (int i = 0; i <= 100; i++) {
    cap >> img;
    if (!img.empty()) { cv::imshow("frame", img); cv::waitKey(1); }
}

The pipeline opens, but no images are displayed (the frames come back empty). I assume it is a decoding problem, but I have no idea how to solve it. I have tried many variations of the capture string, but the truth is I don't really understand it.

I installed the OpenCV 3 beta without ffmpeg support but with GStreamer 1.0.

Any help would be highly appreciated!

Thanks, Dan

most recent version

Streaming video from Rpi to PC using gstreamer/netcat

Hi all,

Updated: I solved this issue, so I'm documenting it for future generations :) It took me a week or two of searching and studying gst-launch!

I was trying to stream the video of the Raspberry Pi camera to my PC (running Ubuntu) and process it in OpenCV.

Streaming with netcat produces a lot of lag (3-5 sec). Pi:

raspivid -n -t 0 -w 1280 -h 720 -fps 49 -o - | nc 10.x.x.x 5000

PC:

nc -l -p 5000 | ./VidCap

and in my program

cv::VideoCapture cap("/dev/stdin");

Next I tried GStreamer:

Pi:

raspivid -n  -t 0 -b 5000000 -w 1280 -h 720 -fps 49 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=10.x.x.x port=5000

PC: this time I run my OpenCV program directly from the command line. In the program I use VideoCapture with the following string:

"tcpclientsrc host=10.x.x.x port=5000 ! gdpdepay ! rtph264depay ! video/x-h264, width=1280, height=720, format=YUY2, framerate=49/1 ! ffdec_h264 ! autoconvert ! appsink sync=false"

Note that OpenCV must receive ffdec_h264 and not avdec_h264 (when opening the video from the command line it is the other way around). With avdec_h264 my OpenCV capture produced a mosaic of B&W images: http://i61.tinypic.com/fkzksp.png

Good Luck, Dan