Vision Questions


I am very interested in using this camera. I would like to be able to stream at a framerate higher than 30 for sandstorm / teleop vision assisted driving.

After plugging the camera into the roboRIO and using the CameraServer to stream it to the driver station, despite changing the camera's settings, it still pulls ~21 Mbps and tries to run at the maximum framerate it can.

I assume the framerate spikes are due to a lack of processing power on the roboRIO. If I were to stream the camera from a Raspberry Pi and compress the images, would it be possible to get a usable 45-60 FPS stream under 4 Mbps?

In addition to this, I wanted to stream a LifeCam HD-3000 from the same RPi. My plan of action was to grab a single frame per request, calculate a turn angle, and send the value over NetworkTables to the roboRIO. To calculate the angle, I was planning on just finding the center point between the 2 pieces of tape, comparing it to the center of the image, then converting the difference into an angle using the camera's FOV.
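A minimal sketch of that offset-to-angle conversion (the function name, the 640-pixel width, and the ~60° horizontal FOV are my assumptions, not values from the thread):

```python
# Hypothetical sketch of the angle-from-pixel-offset math described above.
def pixel_offset_to_angle(target_cx, image_width, hfov_deg):
    """Convert a target's horizontal pixel position into a turn angle.

    Assumes a simple linear pixel-to-angle mapping, which is a reasonable
    approximation near the center of a low-distortion lens.
    """
    center = image_width / 2.0
    degrees_per_pixel = hfov_deg / image_width
    return (target_cx - center) * degrees_per_pixel

# Example: two tape strips centered at x=200 and x=280 in a 640-wide frame,
# assuming roughly a 60 degree horizontal FOV.
tape_center = (200 + 280) / 2.0                       # 240.0
angle = pixel_offset_to_angle(tape_center, 640, 60.0)
print(round(angle, 2))                                # -7.5 -> turn left
```

A linear mapping drifts near the edges of the frame; an `atan`-based version using the camera's focal length in pixels would be more accurate, but for centering on a target this approximation is usually enough.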

Would this be a viable solution?


Do you intend to send the camera stream to the driver station, or are all calculations performed on the Rio/coprocessor only?
The stream bandwidth is determined by a few parameters: frame rate, compression, and image size. Try tweaking them and see what best suits your needs. How much control do you have over those parameters?
You can configure your radio to enforce the 4 Mbps limit to test whether everything works.


You have not said what image size you are trying to use. A 320x240 image with moderate compression, streamed as MJPEG, uses about 1-2 Mbps at 30 FPS, so you can't go much higher than twice that. If you were getting 21 Mbps on the Rio, that is not going to change if you switch to an RPi unless you drastically change what you are sending.
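A back-of-the-envelope check of those numbers (the ~7 KB per compressed frame is an assumed ballpark for moderate JPEG compression at 320x240, not a measured value):

```python
# MJPEG sends every frame as an independent JPEG, so the bitrate is simply
# compressed-frame-size * frames-per-second.
def mjpeg_mbps(bytes_per_frame, fps):
    return bytes_per_frame * fps * 8 / 1e6  # bytes/s -> Mbps

# Assuming ~7 KB per 320x240 frame at moderate compression:
print(round(mjpeg_mbps(7_000, 30), 2))  # 1.68 -> matches the 1-2 Mbps figure
print(round(mjpeg_mbps(7_000, 60), 2))  # 3.36 -> 60 FPS barely fits under 4 Mbps
```

This is why H.264 helps so much: it only encodes the differences between frames instead of a full JPEG every frame.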


For streaming on the raspberry pi, you can use gstreamer:

gst-launch-1.0 autovideosrc ! videoscale ! videoconvert ! queue ! omxh264enc target-bitrate=3000000 control-rate=variable ! video/x-h264,width=840,height=480,framerate=30/1,profile=high ! rtph264pay ! gdppay ! udpsink host=<YOUR IP> port=1234

On your computer: gst-launch-1.0 udpsrc port=1234 ! gdpdepay ! rtph264depay ! avdec_h264 ! autovideosink sync=false

Change the resolution, framerate, and port to whatever you want. You will get much lower bitrate at better quality because it uses h.264, not MJPEG.

You’ll have to install gstreamer.


Agreed: use gstreamer and H.264 to stream to the driver station. You have to install gstreamer on the driver station as well, and it'll be a separate script/executable that has to run, because to the best of my knowledge you can't integrate it with SmartDashboard.

With multiple USB streams (more than one camera), be careful about your input framerate and resolution into the Raspberry Pi - you can saturate the USB bus and shut down one or more cameras if you're not careful. I've got 3 cameras running - one at 640x480 @ 30 FPS with a high bitrate, and two at 320x240 @ 15 FPS with a lower bitrate - and I'm pushing less than 1.5 Mbps at the moment (though I'm using a Jetson, not an RPi).
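For a rough sense of why the USB bus saturates, uncompressed YUYV frames (2 bytes per pixel, a common USB webcam format) add up fast; the numbers below are illustrative, not measurements from this setup:

```python
# Raw USB throughput for an uncompressed YUYV camera:
# width * height * 2 bytes/pixel * frames/second.
def yuyv_mbps(width, height, fps):
    return width * height * 2 * fps * 8 / 1e6  # bytes/s -> Mbps

one_big = yuyv_mbps(640, 480, 30)
total = one_big + 2 * yuyv_mbps(320, 240, 15)
print(round(one_big, 1))  # ~147.5 Mbps for a single 640x480 @ 30 FPS camera
print(round(total, 1))    # ~184.3 Mbps for the three-camera setup above
```

In practice cameras are often configured to send MJPEG over USB instead of raw frames, which is how several streams can share one bus; the network bitrate after re-encoding to H.264 is far smaller than either figure.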


Not exactly sure where those parameters would be used.

Are there any guides available for using gstreamer on the RPi?

Can gstreamer be used with the WPILib RPi image?

If so, would I just integrate gstreamer into the provided example project, rebuild, and upload using the WPILib web dashboard? Not exactly sure how I would get or use the gstreamer libraries.


Tried out gstreamer, your command returns this error:

WARNING: erroneous pipeline: no element "autovideosrc"


Do you have all the gstreamer plugins?

We aren’t using it with the FRC pi image, though I don’t see any reason why it wouldn’t work.


I installed the gstreamer good, bad, and ugly plugin packages and ended up getting:
WARNING: erroneous pipeline: no element "omxh264enc"

Do you have the commands you used to install everything for gstreamer?