
Could someone incorporate a GStreamer pipeline into the opencv_contrib ArUco sample, please?

asked 2019-04-22 23:01:35 -0600

updated 2019-04-23 22:21:34 -0600 by supra56

#include <opencv2/opencv.hpp>

using namespace cv;
using namespace std;

int main(int argc, char** argv)
{
    // Open an RTSP stream through a GStreamer pipeline
    // (requires an OpenCV build with GStreamer support)
    VideoCapture cap("rtspsrc location=rtsp://192.168.0.?:8554/test latency=30 ! decodebin ! nvvidconv ! appsink");

    if (!cap.isOpened())
    {
        cout << "Failed to open camera." << endl;
        return -1;
    }

    for (;;)
    {
        Mat frame;
        cap >> frame;
        if (frame.empty())          // stream ended or a frame was dropped
            break;
        // nvvidconv delivers NV12 buffers, so convert to BGR for display
        Mat bgr;
        cvtColor(frame, bgr, COLOR_YUV2BGR_NV12);
        imshow("original", bgr);
        if (waitKey(1) == 27)       // Esc quits
            break;
    }

    cap.release();
    return 0;
}

Comments

I don't think it's possible. GStreamer is a plugin-based framework licensed under the LGPL; OpenCV is under the 3-clause BSD license. Maybe you can try a PR for a tutorial.

LBerger (2019-04-23 01:58:22 -0600)

1 answer


answered 2019-04-23 02:15:09 -0600

berak

OpenCV's video capabilities and ArUco are entirely unrelated, and ArUco does not (need to) know anything about video handling.

If you are able to open your VideoCapture with a GStreamer pipeline like the one above, it will work with any ArUco-related program, too. If you can't, again, don't blame it on ArUco.

  • please check cv::getBuildInformation() to see if you actually have support for GStreamer built in.

Comments

Hi berak. Thank you for following up! I did build OpenCV with GStreamer support explicitly enabled. Moreover, I can play a stream with the binary built from the code above; it displays a stream from a network device. But putting it into the sample file https://github.com/opencv/opencv_cont... seems beyond my abilities at the moment, which is why I asked you guys to help me out with that. As I understand it, only one line in the example needs to be substituted, line 163, right? And it needs to be replaced with a single line such as VideoCapture cap("rtspsrc location=rtsp://192.168.0.?:8554/test latency=30 ! decodebin ! nvvidconv ! appsink");, right?

Andrey_ (2019-04-23 03:48:31 -0600)

you probably don't need to change anything in the sample; you can pass it as a cmdline arg, just make sure to escape it correctly, like:

detect_diamonds ... -v "my gstreamer pipeline"
berak (2019-04-23 03:59:27 -0600)

Like /opencv-3.4.6/build/bin/example_aruco_detect_diamonds -v "gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! queue ! decodebin ! videoconvert ! xvimagesink"? Or rather like /opencv-3.4.6/build/bin/example_aruco_detect_diamonds -v "rtspsrc location=rtsp://127.0.0.1:8554/test latency=30 ! decodebin ! nvvidconv ! appsink"? Probably I am missing some arguments, because it won't work, but I will make more attempts. Thank you.

Andrey_ (2019-04-23 04:08:09 -0600)

Thank you! That worked, but it shows a static image from the camera and not a video stream:

./aruco_simple "rtspsrc location=rtsp://127.0.0.1:8554/test latency=30 ! decodebin ! nvvidconv ! appsink"

I will try to implement it with detect_diamonds as well, but it seems to have too many arguments, which looks complicated to me.

Andrey_ (2019-04-23 04:16:48 -0600)

did your stream work with another (maybe simpler?) opencv example?

if you only tried with the gstreamer cmdline -- you can't use some of those args with opencv

berak (2019-04-23 04:18:48 -0600)

The stream works with the provided OpenCV example, in case it is initiated with this sequence:

  sudo apt-get install libgstrtspserver-1.0 libgstreamer1.0-dev
wget https://gstreamer.freedesktop.org/src/gst-rtsp/gst-rtsp-server-1.14.1.tar.xz
tar -xvf gst-rtsp-server-1.14.1.tar.xz
cd  gst-rtsp-server-1.14.1
cd examples
gcc test-launch.c -o test-launch $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-rtsp-server-1.0)
./test-launch "nvarguscamerasrc ! video/x-raw(memory:NVMM), format=NV12, width=1920, height=1080, framerate=30/1 ! nvvidconv ! video/x-raw, width=640, height=480, format=NV12, framerate=30/1 ! omxh265enc ! rtph265pay name=pay0 pt=96 config-interval=1"

However, folks on the OpenCV IRC determined that the sample processes just one image, not a stream, by design.

Andrey_ (2019-04-23 05:25:55 -0600)

The issue is that, with that approach, the ArUco sample seems to process only the first image of the network stream.

Andrey_ (2019-04-24 00:44:11 -0600)

And what about this file? The aruco_test that normally takes arguments as ./aruco_test live:2 [number of the video camera] (source: https://pastebin.com/DkMjtAsV) won't take the argument in quotes, as I understand, will it?

Andrey_ (2019-04-24 08:09:55 -0600)
