So our team decided to use a Jetson to compress and send multiple camera streams. I’m thinking we should use NetworkTables, but I have no clue how to start doing that. Where should we begin?
Look into mjpg-streamer (https://github.com/jacksonliam/mjpg-streamer). It makes it really simple to open the stream from a web browser.
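To give a feel for the setup, a typical launch on the Jetson looks something like the following. The device path, resolution, frame rate, and port are placeholder values you’d adjust for your setup; `input_uvc.so` and `output_http.so` are mjpg-streamer’s stock USB-camera input and HTTP output plugins.

```shell
# Serve /dev/video0 as MJPEG over HTTP on port 8080
# (device path, resolution, fps, and port are example values)
./mjpg_streamer \
  -i "input_uvc.so -d /dev/video0 -r 640x480 -f 30" \
  -o "output_http.so -p 8080 -w ./www"
```

The stream is then viewable in a browser at `http://<jetson-ip>:8080/?action=stream` (that `?action=stream` endpoint is mjpg-streamer’s standard MJPEG URL).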
I found this, and since we’re using shuffleboard it will make our setup look really nice. Does mjpg-streamer support streaming to http?
Yes, mjpg-streamer serves MJPEG over HTTP. You can use the CameraServer class on the roboRIO to add an HttpCamera pointing at the Jetson’s mjpg-streamer stream; that publishes the camera info to NetworkTables so Shuffleboard can pick it up.
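A sketch of the roboRIO side, assuming the common 10.TE.AM.xx addressing and port 8080 (both assumptions; use whatever address your Jetson actually serves on). The URL builder is plain Java so you can sanity-check it off-robot; the WPILib registration calls are shown in a comment since they only run inside a robot program.

```java
public class JetsonCameraUrl {
    // Build the mjpg-streamer URL for a host on the robot network.
    // The 10.TE.AM.xx scheme and port 8080 are assumptions for
    // illustration; "/?action=stream" is mjpg-streamer's standard
    // MJPEG endpoint.
    static String streamUrl(int teamNumber, int hostOctet, int port) {
        return String.format("http://10.%d.%d.%d:%d/?action=stream",
                teamNumber / 100, teamNumber % 100, hostOctet, port);
    }

    public static void main(String[] args) {
        // Team 254 and host .11 are hypothetical example values.
        System.out.println(streamUrl(254, 11, 8080));
        // prints "http://10.2.54.11:8080/?action=stream"
    }
}

// Inside a robot program you would then register the stream, roughly:
//
//   HttpCamera jetsonCam = new HttpCamera("Jetson",
//       streamUrl(254, 11, 8080),
//       HttpCamera.HttpCameraKind.kMJPGStreamer);
//   CameraServer.getInstance().addCamera(jetsonCam);
//
// addCamera() publishes the camera's stream URL to NetworkTables without
// re-serving the video through the roboRIO, which is all Shuffleboard
// needs to display it.
```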
Alternatively, you can try building WPILib (which includes CameraServer and NetworkTables) from source on the Jetson using CMake. It’s a bit more setup work, but it gives you the same CameraServer features you get on the roboRIO.
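The build itself is the standard CMake flow; the repo URL below is the real allwpilib source, but the configure options vary by release, so treat this as a sketch and check the repo’s own build docs:

```shell
# Fetch and build WPILib natively on the Jetson
git clone https://github.com/wpilibsuite/allwpilib.git
cd allwpilib
mkdir build && cd build
cmake ..     # configure; add options per the repo's CMake docs
make -j4     # compile (this can take a while on a Jetson)
```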
So how does this communicate with the roboRIO? Will Shuffleboard be able to view this version on the Jetson? And how can we compress it enough to fit the 4 Mbps bandwidth limit on the router?
You would use NetworkTables to communicate with the roboRIO. Shuffleboard can view any MJPEG-over-HTTP stream, whether it comes from an Axis camera, the cscore MjpegServer, or the mjpg-streamer application. Compression-wise, the smaller the resolution and the lower the quality, the less bandwidth it will use, although it will also vary with scene brightness and detail because JPEG is lossy. You’ll really have to test to find settings that fit.
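As a rough back-of-the-envelope for that testing: MJPEG sends each frame as a standalone JPEG, so bandwidth scales with resolution × frame rate × compressed bits-per-pixel. The bits-per-pixel range used below (roughly 0.2 to 1.5, depending on quality setting and scene detail) is an assumption for illustration, not a spec:

```python
def mjpeg_bandwidth_mbps(width, height, fps, bits_per_pixel):
    """Rough MJPEG bandwidth estimate in megabits per second.

    bits_per_pixel is the *compressed* average per pixel -- JPEG
    typically lands somewhere around 0.2-1.5 bits/pixel depending on
    the quality setting and scene detail (assumed range, not a spec).
    """
    return width * height * fps * bits_per_pixel / 1_000_000

# A low-res, low-quality stream fits easily under a 4 Mbps cap...
print(mjpeg_bandwidth_mbps(320, 240, 15, 0.5))   # -> 0.576
# ...while full VGA at 30 fps and higher quality blows past it.
print(mjpeg_bandwidth_mbps(640, 480, 30, 1.0))   # -> 9.216
```

Note that halving the resolution in both dimensions quarters the bandwidth, which is usually a much bigger lever than dropping the quality setting, so it’s the first knob worth turning.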