I’m exploring the possibility of using a Raspberry Pi as a substitute for the RoboRIO in a non-competition robot (for offseason testing, demos, and learning purposes). I know the RoboRIO is required for official FRC events, but for prototyping, I’m curious how far we can go with a more affordable and customizable setup.
I’m particularly looking for:
Examples of Raspberry Pi controlling FRC-style hardware (PWM, CAN, encoders, sensors, etc.)
Projects where WPILib or ROS was adapted for Raspberry Pi use
Solutions for joystick input, wireless control, or dashboard integration
Any gotchas or limitations you’ve run into
If you have codebases, GitHub repos, wiring diagrams, blog posts, or just insights from your own experience, I’d really appreciate anything you can share.
This came up in a different thread, but I have personally used the above method for non-FRC projects with CTRE hardware. It does work with a Raspberry Pi or a Jetson Nano.
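To give a rough idea of what that looks like in practice, here is a minimal sketch of driving a CTRE motor controller from a Linux SBC, assuming CTRE's Phoenix 6 Python package (pip install phoenix6) and a Talon FX on a CANivore-attached bus. The device ID and bus name are placeholders, and you should double-check the enable-feeding details against CTRE's current non-FRC documentation.

```python
# Rough sketch (not a drop-in): spin a Talon FX at 25% output from a Linux SBC
# using CTRE's Phoenix 6 Python API. Device ID 1 and the "*" bus name (match any
# attached CANivore) are placeholders for your own setup.
import time

from phoenix6 import unmanaged
from phoenix6.controls import DutyCycleOut
from phoenix6.hardware import TalonFX

motor = TalonFX(1, "*")

while True:
    # Outside of an FRC robot program there is no Driver Station, so the enable
    # signal has to be fed manually or the controller stays disabled.
    unmanaged.feed_enable(0.1)              # keep actuators enabled for the next 100 ms
    motor.set_control(DutyCycleOut(0.25))   # open-loop, 25% duty cycle
    time.sleep(0.02)                        # ~50 Hz control loop
```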
ROS on a Pi is doable (though it can be slow), but depending on what you are trying to do in ROS, one of the newer Jetsons or even the original Jetson Nano may work better. My former job was R&D on a small-footprint AMR (autonomous mobile robot) using ROS and COTS hardware. For things like vision processing, a Jetson device with CUDA cores will outperform other CPU-based SBCs. Both CNNs like YOLO and even traditional OpenCV pipelines see a noticeable performance advantage when they are able to use the CUDA cores.
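As a rough illustration of where that CUDA advantage comes from, here is a sketch that takes OpenCV's CUDA path when a device is present and falls back to the CPU otherwise. It assumes an OpenCV build compiled with CUDA support (the stock pip wheel is CPU-only), so treat it as illustrative rather than a benchmark.

```python
# Sketch: use OpenCV's CUDA module when available, fall back to the CPU path.
# Requires an OpenCV build compiled with CUDA support (e.g. on a Jetson).
import cv2

def blur_frame(frame):
    if cv2.cuda.getCudaEnabledDeviceCount() > 0:
        # GPU path: upload to device memory, process with cv2.cuda, download result.
        gpu = cv2.cuda_GpuMat()
        gpu.upload(frame)
        gray = cv2.cuda.cvtColor(gpu, cv2.COLOR_BGR2GRAY)
        gauss = cv2.cuda.createGaussianFilter(cv2.CV_8UC1, cv2.CV_8UC1, (7, 7), 0)
        return gauss.apply(gray).download()
    # CPU fallback - this is the path a Raspberry Pi will take.
    return cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (7, 7), 0)

cap = cv2.VideoCapture(0)        # first attached camera
ok, frame = cap.read()
if ok:
    result = blur_frame(frame)
    print("processed frame:", result.shape)
```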
Additionally, I'd recommend looking up the ZebROS project; that should lead you to the whitepaper their team (FRC 900, the Zebracorns) wrote about their FRC experience with ROS. It is absolutely chock-full of good technical information.
Additionally, any of the approaches teams are using with ROS and CTRE's USB-to-CAN adapter (the CANivore) will work, since CTRE provides the packages for that device natively on Linux. If you want to run REV motors, I suggest using the HALSIM_SocketCAN HAL extension and downloading the REVLib files from REV's GitHub repo.
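Whichever route you take, the first thing worth checking on the Pi is that the adapter actually shows up as a SocketCAN interface and that frames are arriving. A quick sanity check with the python-can package (pip install python-can) might look like this; "can0" is whatever your adapter enumerates as.

```python
# Quick sanity check that a USB-CAN adapter is visible as a SocketCAN interface
# and that traffic (e.g. from a motor controller) is arriving. Assumes the
# interface has already been brought up, e.g.:
#   sudo ip link set can0 up type can bitrate 1000000
import can

bus = can.interface.Bus(channel="can0", interface="socketcan")
msg = bus.recv(timeout=2.0)      # wait up to 2 s for any frame
if msg is None:
    print("no CAN traffic seen - check wiring, termination, and bitrate")
else:
    print(f"got frame: id=0x{msg.arbitration_id:X} data={msg.data.hex()}")
bus.shutdown()
```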