A Guide to Relieve your H.264 Streaming Needs


I’ve seen some chatter about H.264 here and elsewhere, and since our team now has a reliable method of creating a stream, I thought I’d share a blog post I wrote on the topic:


On a less lyrical note, any team considering switching streaming protocols (or curious about why you might want to) should find this oversized blog post at least marginally insightful and/or useful. Or rather, I hope it proves marginally insight/use-ful. As a mild warning, the solution we settled on requires some technical skill (you’ll be compiling some C code and installing libraries if you follow along). If you’re lucky enough to be competing at championships and happen to be bored, perhaps you’ll be inspired to start a new programming project.

If your team has also developed or found great resources on setting up H.264 on your bot, I’d love to hear about and link to them.



This is amazing, thanks so much for posting this!



Plug time!

You beat me to the solution for OpenCV, gotta give you that. And wow is this guide comprehensive.
The test-launch example program is serviceable if you feel comfortable with GStreamer’s pipeline syntax. If you don’t want to bother with that, that’s what we created p-e for.
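For anyone curious what that pipeline syntax looks like in practice, here’s a sketch of serving a webcam with gst-rtsp-server’s test-launch example. The device path, resolution, and bitrate are assumptions you’d tune for your camera:

```shell
# Sketch of an RTSP H.264 server using gst-rtsp-server's test-launch
# example. /dev/video0, 640x480@30, and the 1000 kbps bitrate are
# assumptions -- adjust for your hardware. test-launch requires the
# RTP payloader element to be named pay0.
PIPELINE="v4l2src device=/dev/video0 \
 ! video/x-raw,width=640,height=480,framerate=30/1 \
 ! x264enc tune=zerolatency speed-preset=ultrafast bitrate=1000 \
 ! rtph264pay name=pay0 pt=96"

# Serves rtsp://<host>:8554/test when the compiled example is present.
[ -x ./test-launch ] && ./test-launch "( $PIPELINE )" || true
```

Note that x264enc’s bitrate property is in kbit/s, so 1000 here targets roughly 1 Mbps.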



Wow, this looks awesome! It’s like mjpg-streamer but with a bandwidth-efficient codec! :slight_smile:

I’m about to add a section about it. Now I could really use a similarly convenient project for viewing the stream with GStreamer. Maybe one day we’ll see a dashboard that supports H.264.
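In the meantime, a plain gst-launch receive pipeline gets you a bare-bones viewer in one line. The address here is a placeholder, and latency=0 / sync=false are my guesses at the lowest-lag settings:

```shell
# One-line H.264 RTSP viewer with stock GStreamer. The URL is a
# placeholder (10.TE.AM.11 is the usual FRC coprocessor convention);
# latency=0 and sync=false trade smoothness for lower lag.
URL="rtsp://10.TE.AM.11:8554/test"
command -v gst-launch-1.0 >/dev/null \
  && gst-launch-1.0 rtspsrc location="$URL" latency=0 \
       ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false \
  || echo "gst-launch-1.0 not installed (or stream unreachable)"
```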



Motion for the RPi wraps ffmpeg with a bunch of cool tools, including the ability to stream via RTSP (which I didn’t set up). The web controls are plain but very functional from the DS. We had two ELP H.264 fisheye cameras streaming simultaneously below the bandwidth limit. FMS issues meant it only worked about 75% of the time (we never got to try adding a switch or other solutions, though we did use a static IP), which led us back to a direct connection to the RIO, switching between the two streams on a controller button, for 100% reliability.



Interesting use! I once set up Motion with my security cameras for fun, since the software they came with kept alerting us that leaves were blowing in the wind. I’ve never seen it used for live streaming though. Out of curiosity, how much lag did you have with it? And if you didn’t use RTSP, what protocol did you use? If it supports H.264 over HTTP, that would be incredible.



We used a web browser (two instances) on the DS to connect directly to the IP address; UDP. Lag was really low, better than the direct connection we’re using now (with only one stream), which is MJPEG. Load on the RPi wasn’t bad at all. We also had the RPi feeding Pixy2 vector data to NetworkTables at the same time. The power load required special treatment.



All I can find on Motion’s site is how to view MJPEG streams. Maybe I missed something? :man_shrugging: It’s nice to hear the lag was low.

How would it transfer over UDP, though? Was there some sort of plugin or Java applet the website used? Otherwise it would have to be HTTP over TCP.



Sorry, you’re right: HTTP over TCP. This was the inspiration: https://www.instructables.com/id/How-to-Make-Raspberry-Pi-Webcam-Server-and-Stream-/



Funny you should ask… We took a shot at that too :smiley:

It’s way less than perfect, but it’s a start. We’ll be developing this more in the off-season.

Also, just as I’m combing through the post: most teams are probably going to think “Raspberry Pi” when they hear “co-processor,” and in the process of creating potential-engine we discovered that Raspberry Pis, as awesome as they are, really don’t have the power to run x264enc elements quickly. A Jetson might (but if you have one of those available, I’m going to guess none of this is news to you).

You can use OpenMAX (Debian package gstreamer1.0-omx, element omxh264enc) to get hardware-accelerated video encoding, which is plenty faster.
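To make that concrete, here’s roughly what the hardware-encoded variant of a serving pipeline looks like. The property names are from memory, so double-check them with gst-inspect-1.0 on your Pi; the device path and bitrate are assumptions:

```shell
# Hardware-accelerated H.264 on a Raspberry Pi via OpenMAX
# (gstreamer1.0-omx). Device path and bitrate are assumptions; verify
# property names with: gst-inspect-1.0 omxh264enc
PIPELINE="v4l2src device=/dev/video0 \
 ! video/x-raw,width=640,height=480,framerate=30/1 \
 ! omxh264enc target-bitrate=1000000 control-rate=variable \
 ! video/x-h264,profile=baseline \
 ! rtph264pay name=pay0 pt=96"

# Same test-launch invocation as for the software encoder.
[ -x ./test-launch ] && ./test-launch "( $PIPELINE )" || true
```

If memory serves, omxh264enc’s target-bitrate is in bits per second (so 1000000 is about 1 Mbps), unlike x264enc’s kilobit-per-second bitrate property.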



@MGoGolf are you sure that’s using H.264? The tutorial gives me the feeling it’s using MJPEG (for example, the stream_quality option, which defines how much compression the JPEG encoder applies).

@tkdberger :man_facepalming: This is what I get for not reading the full thread. For me, FFmpeg always adds 1-2 seconds of lag (even after trying to disable buffering as you suggested :slightly_frowning_face: ), so maybe a GStreamerStreamViewer (sorry for the capitalization) would work better for me, unless you fixed the problem with your Java code. Then again, I tried mixing GStreamer and Swing as a project a year or two ago, and that combo was also noticeably slower than GStreamer’s Direct3D-accelerated viewer. Since d3dvideosink supports reusing windows to draw streams, I’m itching to see one day whether I can use a child window to display the stream and some off-the-shelf Windows API library to draw buttons and dashboard-y things. Or maybe I don’t need fancy buttons and could just use normal windows.



SmartHackboard is largely suspended for 2019… we’ll revisit it in the off-season, along with other components of potential-engine.
For now my “official” recommendation is marked as the solution in that thread: ffmpeg -i rtsp://server:port/stream -f sdl -. For us, this brings latency to nigh-zero, but GStreamer is also totally fine. (We honestly just didn’t get GStreamer installed on our driver station in time for competition.)
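Spelled out a little more, the viewer side looks like this; the URL is a placeholder, and adding -fflags nobuffer is an optional extra that may shave off a bit more latency:

```shell
# Low-latency viewing of an RTSP stream with plain ffmpeg and its SDL
# output device. The URL is a placeholder; -fflags nobuffer is an
# optional extra that may reduce latency slightly further.
URL="rtsp://server:port/stream"
command -v ffmpeg >/dev/null \
  && ffmpeg -fflags nobuffer -i "$URL" -f sdl - \
  || echo "ffmpeg not available here"
```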



The OP’s link discusses how ffmpeg is similar to GStreamer. Motion wraps/uses ffmpeg to encode (as needed, i.e. if the camera doesn’t do it itself). If you go through the complete config file for Motion, there are ffmpeg settings; you can select various MP4 codecs, including H.264. I tried others, including H.265, but H.264 is better for our purposes. I assume the browser is decoding/playing. Maybe I am misunderstanding.
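For reference, the relevant knobs in Motion’s config file look something like the fragment below. Option names shift between Motion versions, so treat these as approximations and check your local motion.conf:

```
# Approximate motion.conf fragment -- option names vary by Motion
# version, so verify against your own config file.
ffmpeg_video_codec mp4     # mp4 container with H.264 (older versions)
# movie_codec mp4          # the equivalent option in newer versions
stream_port 8081           # the browser-viewable live stream
stream_quality 50          # JPEG quality for that stream
```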



Hmm, so ffmpeg works better than ffplay despite being based on the same libraries? :thinking: This makes less sense to me than the stock market, but your SDL trick works much better than the ffplay commands I tried, so thanks a lot for that, @tkdberger. It’s still a little laggy compared to GStreamer on my laptop (see below), but this will be so helpful all the times I’m on a machine without GStreamer.

The top is the current time, the left is GStreamer, and the right is ffmpeg, all encoding on my laptop (i.e. lag would be much better if I used a C920 or raspicam that outputs H.264 directly).

@MGoGolf It sounds more like I did the misunderstanding here. I’ve discovered I really have no idea how Motion works, I’ve made more assumptions than there are words in this paragraph, and I should really just play with it before I talk more. So I’m stopping myself now. Thanks for all the resources on it.



We used an H.264 cam. It served up fine, with performance probably similar to GStreamer’s.



Yes. FFplay is meant as a higher-level video player, so it has some features, like an internal buffer, that FFmpeg does not. (Even with -fflags nobuffer it still buffers a little bit.)
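Something like the following trims most of that ffplay buffering, though a bit always remains. The extra flags here (low_delay, framedrop) are general options worth experimenting with, not a guaranteed fix, and the URL is a placeholder:

```shell
# ffplay with its buffering trimmed as far as the flags allow. The URL
# is a placeholder; even with these options a little buffering remains.
URL="rtsp://server:port/stream"
command -v ffplay >/dev/null \
  && ffplay -fflags nobuffer -flags low_delay -framedrop "$URL" \
  || echo "ffplay not available here"
```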