Beta Release of FRCVision Raspberry Pi Image

I’m pleased to announce the availability of the first beta release of FRCVision, an off-the-shelf Raspberry Pi 3 image for FRC use!

This Raspbian-based image includes the WPILib and RobotPy C++, Java, and Python libraries required for FRC vision coprocessor development (e.g. OpenCV, cscore, ntcore, pynetworktables, robotpy-cscore, Java 11, etc.). It bundles a default application that streams multiple cameras, along with example C++, Java, and Python programs to use as a basis for vision processing code, and it ties into NetworkTables so cameras are easy to use from FRC dashboards such as Shuffleboard and the LabVIEW dashboard.
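To give a flavor of the NetworkTables side, a vision program on the Pi can publish results with just a few lines of pynetworktables. This is a minimal sketch; the server address, table name, and keys are illustrative, not something defined by the image:

```python
# Minimal sketch: publish vision results to NetworkTables with
# pynetworktables. The address and key names are illustrative only.
from networktables import NetworkTables

# Connect to the NetworkTables server on the roboRIO
# (replace with your robot's actual address).
NetworkTables.initialize(server="10.0.0.2")

table = NetworkTables.getTable("vision")
table.putNumber("targetX", 0.0)    # e.g. horizontal offset of a target
table.putBoolean("hasTarget", False)
```

A dashboard such as Shuffleboard can then display these values directly.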

A web dashboard is also included to configure and monitor the Raspberry Pi (e.g. changing network settings), monitor the vision program (console output, restart), change CameraServer and NetworkTables settings, and upload vision processing applications, all without needing SSH. The image is also designed to be robust to hard power-offs: the filesystem defaults to read-only mode.

Additional documentation will be coming soon on the WPILib ScreenSteps pages, but getting started instructions are on the above release page.

So far, this has only been tested with the Raspberry Pi 3 Model B. If you run into issues, feel free to open an issue on the above repository or ask questions on the WPILib gitter (wpilibsuite/wpilib).

Happy holidays!

But how does this compare to the Limelight?

Relax, I’m completely and totally joking. I really just want to say thank you for everything you folks are doing to make this technology more accessible to more teams, and I’m eager to play around with this. Thank you for releasing it!!!

One day I’ll be cool enough to submit a PR for adding ZeroMQ support.

EDIT: Also, any teams looking to use this - remember that USB battery packs are now legal to keep your Pi up and running!

This looks great; this is a big step in getting vision processing onto more robots. Thanks to Peter and the rest of the WPILib team that worked on this.

Thank you for releasing this before kickoff, it’s appreciated.

Excellent! Thank you!

What cameras have you tested it with? (Does it work with the Raspberry Pi camera?)

I believe you have to enable the pi camera in settings and then ‘modprobe bcm2835-v4l2’ if you want to use it with GRIP, but they may have already done this for you.

Yes, the settings step is running “sudo raspi-config”, then option 5, then option 1. I haven’t done anything special to enable it by default in the image build, so that probably needs to be done as a manual step for now. I’ll open an issue and aim to fix it in the next release.
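For anyone trying the Pi camera, the manual steps above boil down to something like the following. This is a sketch, not tested on the image itself; note that since the image defaults to a read-only filesystem, you may need to make it writable before editing /etc/modules:

```shell
# Enable the camera interface (interactive: Interfacing Options -> Camera)
sudo raspi-config

# Load the V4L2 driver so the Pi camera appears as /dev/video0
sudo modprobe bcm2835-v4l2

# To load the driver on every boot, append the module name to /etc/modules
# (requires the filesystem to be writable)
echo "bcm2835-v4l2" | sudo tee -a /etc/modules
```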

I’ve tested it with about 10 different Logitech and Microsoft USB cameras, but not the Pi camera.

Just as a note for anybody else trying it: in “-v4l2”, the character between the “4” and the “2” is a lowercase letter L, not the digit one.

Do you have any examples showing it in action?

Not yet. We’ll be working on documentation next week, which will include lots of screenshots, etc. The image includes a default streaming program, so no code is required if you just want to stream to the dashboard.
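For a sense of scale, a standalone streaming program with robotpy-cscore can be just a few lines. This is a minimal sketch of the idea, not the image’s actual default program (which handles multiple cameras and configuration):

```python
# Minimal sketch of a camera-streaming program using robotpy-cscore.
# Illustrative only -- the FRCVision image bundles a more complete
# default program.
from cscore import CameraServer

def main():
    cs = CameraServer.getInstance()
    cs.enableLogging()
    # Start capturing from the first USB camera and serve an MJPEG stream.
    camera = cs.startAutomaticCapture()
    camera.setResolution(320, 240)
    # Block forever so the stream keeps serving.
    cs.waitForever()

if __name__ == "__main__":
    main()
```

The resulting MJPEG stream can then be viewed from Shuffleboard or a browser.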