potential-engine: A simple RTSP server for FRC (or anywhere else)

potential-engine is a Real Time Streaming Protocol server created in the hopes of getting better video feeds to drivers while using less bandwidth.

potential-engine streams using the h.264 video codec. h.264 video is (generally) far smaller than Motion JPEG, which is the default (and, to my personal knowledge, the only) “codec” supported by the WPI Camera Server.

h.264 is technically more difficult to produce, but hardware acceleration exists! And costs USD$35!

We saw this in the manual:

and ran with it. :man_shrugging:t2:

How do I use it?
Did you read the README and install all the dependencies first?

  1. git clone https://github.com/BHSSFRC/potential-engine.git && cd potential-engine
  2. mkdir build && cd build && cmake ..
  3. make
  4. Retrieve your user’s manual with ./potential-engine --help

If you want to make the program available from anywhere on the system (i.e. install it) just use sudo make install from within the build directory.
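Once built, a launch line might look something like this. (The flag names below are assumptions based on the options mentioned in this thread; run ./potential-engine --help for the authoritative list.)

```shell
# Hypothetical launch: serve on the default port/URL with OpenMAX
# hardware encoding enabled (-o / --use_omx, recommended for USB cameras).
./potential-engine --port 1181 --url /stream -o
```

Clients would then connect to rtsp://<pi-address>:1181/stream.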

To retrieve video, you’ll need software beyond what FIRST provides. (We’re working on a version of SmartDashboard with this functionality, but it’s not 100% ready for distribution yet.) If you have FFmpeg on your driver station, you can display the video with ffplay -fflags nobuffer "rtsp://SE.R.V.ER:1181/stream" (assuming you left the port and url options at their defaults.)
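If latency matters more to you than startup robustness, ffplay has a few generic flags worth experimenting with. (These are standard FFmpeg/ffplay options, nothing specific to potential-engine.)

```shell
# -fflags nobuffer : don't buffer input before display
# -flags low_delay : ask the decoder to minimize its own latency
# -probesize 32    : spend almost no time probing the stream before playback
ffplay -fflags nobuffer -flags low_delay -probesize 32 "rtsp://SE.R.V.ER:1181/stream"
```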

Can I make it better?
Yes, absolutely! Please note that this project is licensed under the GNU Lesser General Public License v3.

Pull requests are also entirely welcome, of course!

Known Issues

  • The address option will not accept IPv6 addresses.

Why was this tagged as pi?
The server has explicit support for the Raspberry Pi camera module, and we imagine most teams will use a Raspberry Pi to run this server on their robot as well. (Props to anyone crazy enough to install it onto a roboRIO though.)


Any particular reason to choose the GPL? That automatically is a huge turn off for many people, and with much of the FRC software chain being bsd licensed it is an incompatible license, which might be confusing to some teams and people.


The software doesn’t use anything in the FRC toolchain, so that particular licensing nastiness shouldn’t be an issue. (Our SmartDashboard hack is just a GitHub fork right now, so it’s still under the FIRST BSD license.)

I tend to put “large” software endeavors under the GPL just as M.O. (I also personally like that it requires people to open-source their changes because I’m not a fan of having trade secrets in a high school robotics competition.) If it really is an issue for people we can definitely “downgrade” to something less aggressive.


Note that GPL licensing does not require teams to open-source their changes if they only use those changes on their own robot. The GPL only kicks in if a team shares their binaries with someone (in which case, yes, they need to share their source too). So in general it’s kind of useless in the FRC environment (e.g. it does not achieve your stated goal of not having team trade secrets).



We’ll look into re-licensing in the morning. I still believe that it is a discouragement from taking our code and turning it into a secret, but it’s quite late over here and thus a very bad time to try to dig into this.

Thanks for the advice!


Ever solve a problem and feel less intelligent after the fact? Yeah, that’s me right about now.

Proudly(?) presenting SmartHackboard. (I’m very good at naming things.) This build of SmartDashboard adds a new element, FFmpegStreamViewer, that can view almost any video stream given its address. The backend for this process is essentially just ffplay, so any options you can use with ffplay are available through the “Extra ffplay options” property. These options are set with key:value,key:value notation. (For example, to get the same effect as ffplay -x 256 -y 144 ... you would set this property to x:256,y:144.) fflags is automatically set to nobuffer and loglevel is set to quiet, before user options are applied, if “FFPlay expanded debug” is left unchecked.
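As a sketch of the notation (the property value here is hypothetical; the flags are standard ffplay options):

```shell
# Setting "Extra ffplay options" to:   x:256,y:144,rtsp_transport:tcp
# expands to roughly this invocation under the hood:
ffplay -x 256 -y 144 -rtsp_transport tcp -fflags nobuffer -loglevel quiet "rtsp://SE.R.V.ER:1181/stream"
```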

These builds have not been rigorously tested! You have been warned! They should function for all operating systems, but have only been “tested” on Windows (which, for most teams, should be fine?)


Requirements:

  • Java 11 (any JRE should work, but we’ve only tested with Oracle’s)
  • Willingness to void more warranties than you already have

If you want to replace the WPI SmartDashboard so that the Driver Station can open this version like the “real” thing, just replace C:\Users\Public\frc2019\tools\SmartDashboard.jar with the provided SmartDashboard-all.jar. You do need to rename it from SmartDashboard-all.jar to just SmartDashboard.jar.
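The swap might look like this from a Windows Command Prompt (the download path is a placeholder; adjust it to wherever you saved the jar):

```shell
rem Back up the WPI original, then drop in the SmartHackboard build
cd C:\Users\Public\frc2019\tools
ren SmartDashboard.jar SmartDashboard.jar.bak
copy C:\path\to\SmartDashboard-all.jar SmartDashboard.jar
```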

And before someone yells at me about source code releases, have our GitHub fork. This build was built off of the feature/multi-track-drifting branch.


Why not use uv4l instead of gstreamer?

We could’ve gotten this job done with any number of frameworks. FFmpeg has an API with bindings in many languages, and even had a fully functional server of its own (RIP FFserver?)

UV4L doesn’t have RTSP in its streaming server plugin, which I assume is what you’re talking about as far as replacing GStreamer. UV4L is nice (I have heard glowing reviews on RPi stack exchange), but GStreamer gets the job done and is far from difficult to work with. (The most difficult problem we had to deal with was relearning how #include works.)

Also, I hadn’t heard of UV4L before we were maybe a quarter of the way in and was sticking to the plan at that point.

I’m having an issue with using potential-engine. It seems like my webcam doesn’t even turn on when the program runs, and aside from that it says it’s streaming on a blank address with port -1, which I think is just a typo.

Gonna hazard a guess and say you cloned this today, so I won’t ask about what version you’re using.

The blank address part is correct; it basically lets your OS bind to whatever address if you don’t give a specific preference (as I understand it). The port being -1 is definitely an error, though. What options are you using to launch the server?
Some easy reasons it would show -1: something else is already running on that port (make sure nothing is using port 1181, or whatever you’ve specified), or the port you’ve given is < 1024 and you’re not running as root.
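A few standard Linux commands for checking whether something is already squatting on a port (nothing here is specific to potential-engine):

```shell
# Show any listening socket on port 1181 and the process that owns it
sudo ss -tlnp | grep 1181
# Alternative, if lsof is installed
sudo lsof -i :1181
# Binding ports below 1024 requires root; this prints 0 when you are root
id -u
```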

Your camera will not turn on until a connection comes in, for what it’s worth. I would encourage you to use “raw” ffplay or similar to test connecting, since it’s much faster to start than our SmartDashboard hack.

I was launching it without arguments and it still showed port -1, but what you say about the port already being in use makes sense, since it only shows port -1 once I relaunch the program, not on first start. Maybe it’s because I’m using a keyboard interrupt to exit it and the program doesn’t free the port?

I was already using ffplay, and the program was version 1.1.

Edit: Just tried it again after a fresh boot and it still showed the port as -1. But when I specified port 1810 or 1812 in the command, for some reason it showed the port as 1811.

I have another question: is the IP address in the command supposed to be the Raspberry Pi’s, or the receiving driver station’s?

That’s very strange, I’ll have to investigate this. You shouldn’t need to reboot to restart the server, CTRL+C (SIGINT) is good enough to stop it and free the port. You can use various methods to see if something is still squatting the port after CTRL+C.

It should be the address of the Pi. You do not need to specify this option at all, the default behavior is “bind to whatever’s available.” There should be no need to specify the address option for most teams, and it’s really only there for people who are doing crazy things with networking.

Also, if you specify an address the server doesn’t “own”, the port fails to bind and you get -1.
The simplest way to combat this is to not specify an address in the first place and let it bind everywhere, so the server is accessible at any address the Pi “owns.”

I tried to get potential-engine working yesterday and had trouble. I built the program using the instructions in the README, and ran it with no options. The server started fine, but when I connected using ffplay it did not display any video and only showed errors about timings being missed. I tried to connect using VLC and only one frame was shown.

Any ideas? We are running on the latest FRCVision image.


David Fiel
Team 5401

Can you run the server with GST_DEBUG=3 prepended to the rest of the launch line (i.e. GST_DEBUG=3 ./potential-engine ...) and post the log output from both the server and client side?
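Concretely, that would look something like this on each end (GST_DEBUG is GStreamer’s standard logging variable; -v is a standard ffplay option):

```shell
# Server side (on the Pi): GST_DEBUG=3 raises GStreamer's log verbosity
GST_DEBUG=3 ./potential-engine

# Client side (driver station): make ffplay chatty instead of quiet
ffplay -v debug -fflags nobuffer "rtsp://SE.R.V.ER:1181/stream"
```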
My best guess off the top of my head is that because you’re using FRCVision, some background process has a hold of the camera and potential-engine can’t get to it. However, if I cause this on my own machine, FFplay emits a 503 Service Unavailable error (as it should in this scenario.) I’ve also literally never used or dug into FRCVision, unfortunately, so my knowledge of any strange particulars of it is limited.

At its core FRCVision is a stripped down Raspbian. It uses standard Raspbian packages for everything but OpenCV and the FRC libraries, so there shouldn’t be any differences in gstreamer for example.

That’s probably the most likely case. By default there’s a streaming server that is started. To temporarily disable this you can click the “Down” button in the Vision Status tab of the webdash. Removing or changing the /home/pi/runCamera script will disable it / change what is actually run.


I have it working today. Not sure what changed. The only problem is that there is a 2-3 second delay on the stream. I tried the ffplay command from above and the SmartHackboard from your post above; both have the same delay.

We’ve recently discovered significant delay when using ffplay/SmartHackboard. We’re looking into why, but for now using ffmpeg directly seems to suffer less.

Try ffmpeg -i rtsp://server:port/stream -f sdl -.

Also, if you’re using a USB camera, please use OpenMAX with the -o or --use_omx flag on the server. You’ll get far better (i.e. less) latency that way.

Using the -o flag and ffmpeg directly has us down to 1-2s latency. Anything else you can think of?

Not much off the top of my head, but I’ll keep you in the loop. I could have sworn we had better times, even on robot networking hardware.

Make sure your radio was configured with the latest version of the tool, maybe?

I’m assuming you’re using a USB webcam - if you have a Raspberry Pi Camera Module, those will most certainly get lower latency.


So we won Innovation in Control at Tippecanoe for this system, which is cool.

We also learned that, for some strange reason, the FMS does not like UDP traffic on port 1181? This is worrying/troublesome, since the FMS whitepaper notes ports 1180 - 1190 as open to TCP and UDP for camera data. :thonk: We will aim to test this further in the future. Setting up an FMS simulator (i.e. a VLAN with appropriate port config) isn’t really accurate enough for this type of testing IMO.
We circumvented this by streaming over TCP. Our drivers did not report any issues (e.g. greatly increased latency) in spite of this. Your mileage may vary.
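On the client side, forcing TCP is a standard option of FFmpeg’s RTSP demuxer:

```shell
# Interleave RTP over the RTSP TCP connection instead of using UDP
ffplay -rtsp_transport tcp -fflags nobuffer "rtsp://SE.R.V.ER:1181/stream"
```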

There are plans for much future expansion here, such as support for multiple cameras (not necessarily the same execution as CameraServer, but a similar principle) and maybe OpenCV integration. Maybe. We’re also planning on helping fellow Indiana teams configure this: if you’re at Center Grove, bring a Raspberry Pi (plus power cable and Ethernet), a Lifecam or Raspberry Pi Camera Module, and a microSD card with Raspbian Lite, and we’ll do our best to help you get everything set up. A network switch is not mandatory, but is highly recommended.

If you plan to take advantage of this opportunity, you should do your best to ensure that your microSD card not only has Raspbian Lite, but the dependencies listed in README.md as well. You can install them all with the following commands:

# install apt dependencies
sudo apt update && sudo apt upgrade && sudo apt install git cmake pkg-config gstreamer1.0-omx-rpi gstreamer1.0-omx libgstreamer1.0-dev libgstrtspserver-1.0-dev gstreamer1.0-rtsp
# install {fmt}, a modern C++ formatting library
cd # go home
git clone https://github.com/fmtlib/fmt.git # download the source
cd fmt && mkdir build && cd build # create build files directory
cmake .. # create build files
sudo make install # build and install {fmt}

We will do our best to accommodate teams without these dependencies, but it will be drastically more difficult and we may not be able to help you fully without them.