GStreamer challenges

Help. My vision code compiles, but it is “hanging” on the GStreamer syntax. To test, I’ve broken the syntax I am using out into a couple of scripts (one as a single pipeline, and one as a pipeline with a split). The following script works:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! videoconvert ! video/x-raw,format=BGR ! videoconvert ! omxh264enc bitrate=300000 ! video/x-h264, stream-format=byte-stream ! h264parse ! rtph264pay ! udpsink host=10.0.0.171 port=5806

When I try to split the pipeline and display one branch on my Jetson console, I get a single frame (on the console) and apparently no stream to my desktop. This script hangs (with no error):

gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! videoconvert ! video/x-raw,format=BGR !tee name=split split. ! queue ! videoconvert ! omxh264enc bitrate=300000 ! video/x-h264, stream-format=byte-stream ! h264parse ! rtph264pay ! udpsink host=10.0.0.171 port=5806 split. ! queue ! autovideoconvert ! xvimagesink

I have tested a simple pipeline directly to the Jetson console, and this also works:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw, width=640, height=480 ! xvimagesink

Any thoughts on why the middle script is not working? Thanks!

As a follow-up, it seems to be something in here:

omxh264enc bitrate=300000 ! video/x-h264, stream-format=byte-stream ! h264parse ! rtph264pay ! udpsink host=10.0.0.171 port=5806

If I replace this whole segment with just “xvimagesink”, I get two playback windows on my Jetson Nano, but if I keep the encoder chain and replace only the udpsink with xvimagesink, I start to get errors.

I’ve found there are way too many errors with GStreamer; you solve one and you get two more. If you are trying to get a stream from the Jetson to the dashboard, just install VS Code and Java on the Jetson and use cscore. It’s much easier and works the same.
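For example, with the cscore Python bindings (the Java API is analogous), the whole stream is only a few lines. This is just a sketch; the camera name, server name, and port are placeholders:

import time
from cscore import UsbCamera, MjpegServer

# Expose /dev/video0 as an MJPEG stream the dashboard can open directly.
camera = UsbCamera("USB Camera 0", 0)
camera.setResolution(640, 480)
server = MjpegServer("dash-stream", 5806)  # placeholder name and port
server.setSource(camera)

while True:
    time.sleep(1)  # cscore serves on its own threads; just stay alive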

If all you’re trying to do is stream from a Jetson to a dashboard, you could use something like rusty-engine, which handles all the pipeline creation for you.

This pipeline can (to my knowledge) be simplified to:

v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! videoconvert ! omxh264enc bitrate=300000 ! video/x-h264 ! rtph264pay ! udpsink host=10.0.0.171 port=5806

Try that instead. You can also set GST_DEBUG=3 to get more debugging information (e.g. GST_DEBUG=3 gst-launch-1.0 your_pipeline_here).

Thanks, I’ll try the debug flag. Streaming is not the problem; the script is an example of what is happening in my code. I need to split the stream to get both a video stream to the driver’s station and data for reading the contours, then send positioning data back to the cRIO via NetworkTables. The code basically worked last year with a TX2, but we’ve switched to a Nano (and different cameras), and right now this is giving me fits.

I know I can simplify, but the base OpenCV code I’m using needs the frames converted to BGR - at least that’s what the comments say. For what it’s worth, the failure happens even if I don’t add the BGR conversion. It’s the split and the conversion to H.264 that seem to be causing the problems.

Normally in my example, the split branch gets piped to appsink, not xvimagesink. I was using xvimagesink and these scripts to try to figure out where in the pipeline it is failing.
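For context, the in-code version of the split looks roughly like this (a minimal sketch, assuming an OpenCV build with GStreamer support; it just mirrors the scripts above with appsink on the processing branch):

import cv2

# One tee branch streams H.264 over RTP to the driver's station; the other
# ends in appsink so OpenCV can pull BGR frames for contour detection.
# drop=true max-buffers=1 keeps appsink from stalling the pipeline if the
# consumer falls behind.
pipeline = (
    "v4l2src device=/dev/video0 "
    "! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 "
    "! tee name=split "
    "split. ! queue ! videoconvert ! omxh264enc bitrate=300000 "
    "! video/x-h264,stream-format=byte-stream ! h264parse ! rtph264pay "
    "! udpsink host=10.0.0.171 port=5806 "
    "split. ! queue ! videoconvert ! video/x-raw,format=BGR "
    "! appsink drop=true max-buffers=1"
)

cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
while cap.isOpened():
    ok, frame = cap.read()  # frame arrives as a BGR numpy array
    if not ok:
        break
    # ... find contours and publish positioning data to NetworkTables ...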

Thanks.

This is just an example (a stream script) of what I’m trying to do on board the Jetson. A GStreamer script works without issue to send camera data to the driver’s station. The issue appears when I split the stream to send not only video to the DS but also frames to OpenCV to find and manipulate the contours. If I don’t do the split, the scripts work fine.

So you’re trying to read video from the camera, process it, and simultaneously stream H.264 to the driver?

Obligatory OpenSight plug.

You can also look at something like the upgraded-engineer Python module, which lets you essentially call the write_frame function on one of the provided EngineWriters to stream a frame. (It depends on rusty-engine for the underlying streaming, but the effect is similar to the plain RTP streaming you’re already doing.)

Somewhat solved. I still want to try OpenSight, but I’m having challenges with the Jetson Nano (see the Discourse conversation in Help).

The debug output helped me figure out that the omxh264enc element has been deprecated. I replaced it with “nvvidconv ! nvv4l2h264enc” (both elements are necessary) and voilà, streaming seems to work on the split. I’ve got a queuing issue, but that seems solvable.
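For reference, the middle script with that substitution applied looks like this (an untested sketch; everything else is unchanged from the original):

gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! videoconvert ! video/x-raw,format=BGR ! tee name=split split. ! queue ! videoconvert ! nvvidconv ! nvv4l2h264enc bitrate=300000 ! video/x-h264,stream-format=byte-stream ! h264parse ! rtph264pay ! udpsink host=10.0.0.171 port=5806 split. ! queue ! autovideoconvert ! xvimagesink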

Thanks for the input!! Hopefully I will have both solutions to compare in the near future!


Try changing “!tee name=split split. !” to “! tee name=split !”; otherwise you are using the second branch in two places.
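That is, the original split script with that change applied would read (an untested sketch; only the tee wiring differs):

gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! videoconvert ! video/x-raw,format=BGR ! tee name=split ! queue ! videoconvert ! omxh264enc bitrate=300000 ! video/x-h264,stream-format=byte-stream ! h264parse ! rtph264pay ! udpsink host=10.0.0.171 port=5806 split. ! queue ! autovideoconvert ! xvimagesink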
