GStreamer error with multiple USB cameras - Jetson TX2

Trying to get 3 simultaneous streams (scripted to start) from USB cameras. I have a simple script (below), but I'm getting errors about "no space left on device" and buffer allocation. So far searching hasn't turned up much. Any thoughts would be appreciated.

Script:
#!/bin/bash

gst-launch-1.0 -e v4l2src device=/dev/video1 -v ! \
'video/x-raw, format=(string)I420, width=(int)640, height=(int)480' ! \
omxh264enc bitrate=300000 ! 'video/x-h264, stream-format=(string)byte-stream' ! \
queue max-size-buffers=0 ! h264parse ! rtph264pay ! udpsink host=192.168.0.23 port=5805 &

gst-launch-1.0 -e v4l2src device=/dev/video2 -v ! \
'video/x-raw, format=(string)I420, width=(int)320, height=(int)240' ! \
omxh264enc bitrate=150000 ! 'video/x-h264, stream-format=(string)byte-stream' ! \
queue max-size-buffers=0 ! h264parse ! rtph264pay ! udpsink host=192.168.0.23 port=5806 &

gst-launch-1.0 -e v4l2src device=/dev/video3 -v ! \
'video/x-raw, format=(string)I420, width=(int)320, height=(int)240' ! \
omxh264enc bitrate=150000 ! 'video/x-h264, stream-format=(string)byte-stream' ! \
queue max-size-buffers=0 ! h264parse ! rtph264pay ! udpsink host=192.168.0.23 port=5807 &

Error:
libv4l2: error turning on stream: No space left on device
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Could not read from resource.
Additional debug info:
gstv4l2bufferpool.c(1054): gst_v4l2_buffer_pool_poll (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
poll error 1: No space left on device (28)
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Failed to allocate a buffer
Additional debug info:
gstv4l2src.c(884): gst_v4l2src_create (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.

Those errors are due to running out of USB bandwidth. I haven't read much about v4l2src, but generally speaking USB cameras don't support the I420 format, only YUYV or MJPG, so it's unclear to me what mode the camera is actually being set to. Many Linux drivers tend to over-allocate bandwidth in MJPG mode, so that may also be part of the problem if the camera is ending up in that mode, even if the resolution is being set correctly.
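One way to check (a sketch, assuming the v4l-utils package is installed on the TX2): while one of the pipelines above is running, ask the driver which pixel format and resolution it actually negotiated. The -v flag already on gst-launch-1.0 also prints the negotiated caps on every pad, which should show where the conversion to I420 is happening.

# Hypothetical check; assumes v4l2-ctl (from v4l-utils) is installed.
# Run this while a pipeline is streaming to see the format the driver
# actually programmed into the camera:
v4l2-ctl -d /dev/video1 --get-fmt-video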

Thanks @Peter_Johnson. Any thoughts on a better format/caps string to use? Looking here: https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good/html/gst-plugins-good-plugins-v4l2src.html it looks like I could select something other than x-raw, or pick a different format from the list. I'm a bit of a noob here... :)

Run v4l2-ctl -d /dev/video[0,1,whatever] --list-formats-ext to see what your particular cameras support.
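A minimal sketch (assuming v4l-utils is installed) to dump what every /dev/video* node advertises in one go:

#!/bin/bash
# List the pixel formats, resolutions, and frame rates each camera supports.
# Assumes v4l2-ctl from the v4l-utils package is installed.
for dev in /dev/video*; do
    echo "=== $dev ==="
    v4l2-ctl -d "$dev" --list-formats-ext
done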

FWIW I've had poor luck running multiple USB cameras on a single controller. It's not the bandwidth you actually use that matters, but what some (possibly dumb) allocation routine thinks you might use.

@buchanan - running the command on the Lifecam 3000s gives me YUYV 4:2:2 and MJPG. On the Logitech C20 I get those two as well, plus H.264 (I have a mix of cameras right now for testing). I tried YUY2 (packed 4:2:2 YUV), but when I run that using the script below I get "WARNING: erroneous pipeline: could not link v4l2src0 to omxh264enc-omxh264enc0".

Script:
gst-launch-1.0 -e v4l2src device=/dev/video2 -v ! \
'video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240' ! \
omxh264enc bitrate=150000 ! 'video/x-h264, stream-format=(string)byte-stream' ! \
queue max-size-buffers=0 ! h264parse ! rtph264pay ! udpsink host=192.168.0.23 port=5806 &

I doubt I'm fluent enough with GStreamer to debug pipelines via CD, but I'd suggest using gst-inspect-1.0 to see what formats the src and the sink that aren't linking each support. GStreamer tries to find common ground between them, and if there isn't any you get errors like this. Sometimes dropping a videoconvert in between helps.
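For example (a sketch, not tested on a TX2; element names taken from the pipelines above): compare the pad templates of the two elements, and if omxh264enc doesn't list YUY2 on its sink pad, let videoconvert do the conversion to something it does accept, such as I420.

# Compare what the source can produce with what the encoder can accept:
gst-inspect-1.0 v4l2src | grep -A 20 'SRC template'
gst-inspect-1.0 omxh264enc | grep -A 20 'SINK template'

# If YUY2 isn't in the encoder's sink caps, convert it first:
gst-launch-1.0 -e v4l2src device=/dev/video2 ! \
'video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240' ! \
videoconvert ! 'video/x-raw, format=(string)I420' ! \
omxh264enc bitrate=150000 ! 'video/x-h264, stream-format=(string)byte-stream' ! \
queue max-size-buffers=0 ! h264parse ! rtph264pay ! udpsink host=192.168.0.23 port=5806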

To your original question: your first problem really isn't with GStreamer, it's with the underlying USB support either running out of resources or allocating them unwisely. I've personally given up on trying to run multiple USB cameras, but as others have suggested, reducing bandwidth from the camera (reducing it later in the pipeline won't help with this) is probably your best shot. Using H.264 on the camera that supports it might be worth a look in that regard.
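A sketch of what that could look like for the camera that advertises H.264 (untested; the exact caps, device node, and frame rate your camera offers may differ): pull the already-compressed stream straight from v4l2src so the USB link only carries H.264, and skip the on-board encoder entirely.

gst-launch-1.0 -e v4l2src device=/dev/video3 ! \
'video/x-h264, width=(int)320, height=(int)240, framerate=30/1' ! \
h264parse ! rtph264pay ! udpsink host=192.168.0.23 port=5807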

Thanks for the advice. After a little playing around, I got the 3 streams to function by adjusting the frame rate and bitrate at the source. I'm looking to do a fairly good-resolution "front" camera and a couple of "side mirrors" for sandstorm mode. This script works. Now on to figuring out the coding side, since I don't really want to do this as a script and need some image processing as well... :)

Script:
gst-launch-1.0 -e v4l2src device=/dev/video1 -v ! \
'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=10/1' ! \
omxh264enc bitrate=150000 ! 'video/x-h264, stream-format=(string)byte-stream' ! \
queue max-size-buffers=0 ! h264parse ! rtph264pay ! udpsink host=192.168.0.23 port=5805 &

gst-launch-1.0 -e v4l2src device=/dev/video2 -v ! \
'video/x-raw, format=(string)I420, width=(int)640, height=(int)480, framerate=30/1' ! \
omxh264enc bitrate=600000 ! 'video/x-h264, stream-format=(string)byte-stream' ! \
queue max-size-buffers=0 ! h264parse ! rtph264pay ! udpsink host=192.168.0.23 port=5806 &

gst-launch-1.0 -e v4l2src device=/dev/video3 -v ! \
'video/x-raw, format=(string)I420, width=(int)320, height=(int)240, framerate=10/1' ! \
omxh264enc bitrate=150000 ! 'video/x-h264, stream-format=(string)byte-stream' ! \
queue max-size-buffers=0 ! h264parse ! rtph264pay ! udpsink host=192.168.0.23 port=5807 &

Dropping the frame rate on the source side was the key. I'm pushing 3 cameras with < 1 Mbps of throughput (measured somewhat unscientifically using Windows Task Manager).
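For anyone following along, a sketch of a matching receive pipeline for the 192.168.0.23 side (one per port; the caps and the avdec_h264 decoder are assumptions, not something from the scripts above):

gst-launch-1.0 udpsrc port=5805 caps='application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96' ! \
rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false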
