mjpg-streamer running on roboRIO

Here at the Orlando Regionals while waiting in the pits for our robot to start working, I ported mjpg-streamer to the roboRIO. It was pretty funky since the source doesn’t lend itself to cross-compilation. So, the answer? Compile it on a Beaglebone Black running Angstrom and copy the files over.

Result? A working camera feed that takes only 1% of the CPU. And, with a little Javascript magic, you can have multiple cameras running at the same time. The only hitch was that the libjpeg libraries aren’t available on the roboRIO. The fix? Simply copy them from the BBB into /usr/lib. Open source is a wonderful thing.

Then, untar the mjpg-streamer code and run it per the instructions.



mjpg-streamer-built.tgz (1.47 MB)
jpeg.tgz (197 KB)


Multiple cameras? The last time I tried to use mjpg-streamer for anything productive, I found that only a select few models didn’t hog the entire USB bandwidth on their own (causing a second camera to fail because there “isn’t enough bandwidth”). Most cameras that are “UVC compatible” really aren’t, thanks to half-finished implementations by the companies that make them. What cameras are you using?

The Logitech C920 does hardware H.264 encoding. mjpg-streamer understands that and doesn’t repeat the process, which significantly lowers CPU utilization.



This sounds great. I’ve been contemplating finding something like this and compiling it for the roboRIO (though I was considering ffmpeg et al); now I don’t have to! :)

I’ve been nervous about running the streaming stuff in the same process as our robot code (particularly since we’re in python and there are GIL contention issues), so having a standalone executable sounds great.

Have you guys run this during a competition yet? I would be concerned about the bandwidth taken by two simultaneous video streams. In the past, we’ve accidentally saturated the feed between our laptop and the robot with a single 640 x 480 stream.
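For a rough sense of how quickly MJPEG streams eat into the radio link, here’s a back-of-the-envelope sketch in Python. The average compressed frame size here is an assumption, not a measured value; it varies with scene complexity and the -q quality setting.

```python
# Back-of-the-envelope MJPEG bandwidth estimate.
# avg_frame_kb is an assumed average compressed frame size;
# ~20-40 KB is plausible for 640x480 at moderate JPEG quality.
def mjpeg_bandwidth_mbps(fps, avg_frame_kb):
    """Approximate stream bandwidth in megabits per second."""
    return fps * avg_frame_kb * 1000 * 8 / 1e6

# Example: 10 fps at an assumed ~35 KB/frame
print(mjpeg_bandwidth_mbps(10, 35))  # prints 2.8
```

Under those assumptions a single 640x480 stream is already close to 3 Mbps, so two simultaneous streams at that resolution could plausibly saturate the field link; dropping resolution or frame rate scales the bandwidth down roughly linearly.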

How do you display it on the PC, through SmartDashboard or VLC? And does the FMS allow communications over the ports used?

Actually, you drop a Javascript page into the ./www directory that has a button that allows you to switch between the streams. So, you’re not getting both streams simultaneously unless you want them.



Actually, a simple web browser will do the display just fine. You can compile mjpg-streamer for VLC, UDP, RTSP or one of several other outputs. However, we’ve just been using the output_http.so plugin and Chrome, Firefox or IE to display it on the DS. You can even make it start automatically, just like the dashboard normally would, if you want to.



We’re actually doing our DS in HTML/JS, with a python interface to forward networktables to/from the HTML page, so the mjpg-streamer is exactly what we’re looking for. We played with it a little bit today and it’s pretty sweet, but haven’t wired it into our interface yet.

I’m curious, what javascript magic do you need to do to switch streams? Is there a delay? We set up two servers and were able to stream two cameras that way. Didn’t try the JS magic yet.

We tested two LifeCam HD-3000 cameras this afternoon running two servers, and at 160x120 were at about 5% CPU usage. 320x160 was roughly the same. We haven’t tried measuring the network bandwidth yet.

I’m trying to set this up to test it. Where did you guys install the ./www pages on the roboRIO?

Easiest thing that I can see is just to put everything under /var/local/natinst/www, since that would avoid having to edit the web config file (which seems to be at /etc/natinst/NISystemWebServer.conf).

The other alternative would be to add a second document root, but while that NISystemWebServer.conf bears a resemblance to Apache .conf files (perhaps version 1.x?), it pretty clearly has proprietary NI directives in it, and I’m hesitant to make changes without real documentation (which I suppose I could hunt for on NI’s site).

Also, did anyone notice the mjpeg-stream stuff at /usr/camera_server/mjpg-streamer-r63? It appears that’s where the NI-IMAQdx routines get their USB webcam support from. It seems to be an older(?) version of the same MJPEG streaming libraries you guys are playing with.

When I try to run this, I’m getting an error opening the /dev/video0 device. I have a Logitech C930e plugged in, which should certainly support the requested mode.

admin@roboRIO-2877:/var/local/natinst/www/mjpg# source start.sh
MJPG Streamer Version: svn rev:
 i: Using V4L2 device.: /dev/video0
 i: Desired Resolution: 640 x 480
 i: Frames Per Second.: 5
 i: Format............: MJPEG
Unable to set format: 1196444237 res: 640x480
 Init v4L2 failed !! exit fatal
 i: init_VideoIn failed

admin@roboRIO-2877:/var/local/natinst/www/mjpg# ls -l /dev/video0
crw-rw-rw-    1 admin    ni         81,   0 Dec 31  1969 /dev/video0

What you want to execute is something like this:

LD_LIBRARY_PATH=`pwd` ./mjpg_streamer -i  "./input_uvc.so --device /dev/video0 -f 10 -r 160x120" -o "./output_http.so --port 5800 -w www"

Of course, you’re going to want to write scripts and such to make it easy to run, and more importantly start on robot boot. I did all of this already, so I put install instructions + the scripts I created into a gist, you can access it at https://gist.github.com/virtuald/c8835244759e53314211

Feedback/comments welcome!

Actually, it turns out the default shell script was working for me. I had forgotten to shut down FRCProgram, which also opened the camera!

But your scripts for starting, etc. will be helpful if we decide to go this route.
Right now at 640 x 480, a stream from my Logitech C930e is taking 2.8 Mbps, which is higher than the stream from our robot program, which uses the NI-IMAQdx libraries to send a 640 x 360 stream at 24 fps and only takes 2.0 Mbps. I’m still leery of how this will perform through the FMS.

But you said something about doing an entire Dashboard in JavaScript. That really intrigues me. And a Python implementation of NetworkTables. Is that running server side or client side?

Check out the latest RobotPy project, pynetworktables2js. It forwards networktables traffic to/from an HTML page via a websocket. It hasn’t really had a proper release announcement, but we’ll be using it in our dashboard at our comp next week.

We run it on the driver station.

It’s worse than I thought. My earlier comparison was with the MJPG streamer at 10 fps and my robot program at 24 fps. The MJPG streamer uses about three times the bandwidth of the WPILib streamer running on the robot.

It looks like I’m getting an MJPEG stream from my Logitech C930e (similar to the Logitech C920, but with a wider field of view), rather than an H.264 stream.

Do you know if there’s any way I can persuade input_uvc.so to get an H.264 or MPEG-4 stream from the camera?

What quality setting are you using? That could affect it. (-q option). Also, I’m curious, what are you using to measure the bandwidth usage?

For anyone who wants to read data from mjpg-streamer, I’ve posted python code that can read from the stream using urllib + opencv.
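As a rough illustration of what such a reader has to do, here’s a minimal sketch (not the linked code) that slices complete JPEG frames out of a raw byte buffer by scanning for the JPEG start/end markers. A real client would feed this bytes read from urllib and hand each frame to OpenCV’s imdecode; note also that a naive marker scan like this can in principle be fooled by marker bytes appearing inside the compressed data.

```python
# JPEG start-of-image and end-of-image markers.
SOI = b"\xff\xd8"
EOI = b"\xff\xd9"

def extract_jpeg_frames(buf):
    """Return complete JPEG frames found in a raw byte buffer.

    Multipart boundary lines and HTTP headers between frames are
    skipped implicitly, since they contain no JPEG markers.
    """
    frames = []
    pos = 0
    while True:
        start = buf.find(SOI, pos)
        if start == -1:
            break
        end = buf.find(EOI, start + 2)
        if end == -1:
            break  # incomplete frame; wait for more data
        frames.append(buf[start:end + 2])
        pos = end + 2
    return frames
```

Each returned chunk is a self-contained JPEG that could be written to disk or passed to a decoder as-is.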


We ended up not using any image processing at our competition, but because our dashboard was created using HTML/js, it was trivially easy to connect to the camera and display its output. Highly recommend mjpg-streamer as a solution if you only need to display output on the Driver Station.

I believe I tried a few -q choices, but it was a while ago now.

I measured the bandwidth using the Windows task manager. Since there was nothing else on the WiFi between the laptop and the robot radio (the usual D-Link DAP-1522 Rev B), I believe the bandwidth measurements are reliable.

FYI, I packaged mjpg-streamer into an IPK file that can be installed by opkg. It includes an init script that automatically starts mjpg-streamer when the roboRIO boots.

Installation: https://github.com/robotpy/roborio-packages
Notes: https://github.com/robotpy/roborio-packages/tree/2016/ipkg/mjpg-streamer