Just out of curiosity, what USB/IP cameras are teams using on their robots? I know there is the Limelight or Pi-with-PhotonVision solution, but our team is against it because it puts the expensive camera right in the line of fire. (A Limelight has to sit right on the turret, or, like this season, two inches inside the bumpers, and you can't really shield it.)
Our first choice was actually a Pi Zero W. Using showmewebcam, you combine a standard CSI Pi camera, a mini-CSI cable, the tiniest SD card you can find (64 MB is plenty), and a Pi Zero, and you get a very fast, stable USB camera; print a case around it and you have a webcam for ~$30. It held a stable 1080p30, but you can't push the FPS any higher. Our biggest fear was Pi Zero availability, so we scrapped it.
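(For anyone who wants to try it anyway: on the Rio side it enumerates as a plain UVC device, so the usual CameraServer calls are all you need. A rough sketch, assuming the gadget actually advertises an MJPEG 1080p30 mode, with the camera name being a placeholder:)

```java
import edu.wpi.first.cameraserver.CameraServer;
import edu.wpi.first.cscore.UsbCamera;
import edu.wpi.first.cscore.VideoMode.PixelFormat;
import edu.wpi.first.wpilibj.TimedRobot;

public class Robot extends TimedRobot {
  @Override
  public void robotInit() {
    // The Pi Zero shows up as a normal UVC webcam, so grab it like any USB camera.
    UsbCamera driverCam = CameraServer.startAutomaticCapture("driverCam", 0);
    // Pin the mode we actually want instead of letting CameraServer pick a default.
    // (Format/resolution/FPS must match something the gadget advertises.)
    driverCam.setVideoMode(PixelFormat.kMJPEG, 1920, 1080, 30);
  }
}
```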
Then we tried a JeVois A33 camera. We figured that for $50 you get a really compact USB camera that could be mounted easily. We didn't need the smart features (we had a PhotonVision Pi buried in the robot); we'd just use it as a standard webcam… Well, the JeVois has a steep learning curve: the default module initializes at a resolution you don't need (and shows a sample video), PhotonVision can't switch resolutions over NetworkTables to get to the one you want, and even after changing the startup scripts to the right defaults, CameraServer could only connect to it about a third of the time. It just became a messy project. I even found this old post from 2017 showing the JeVois connected to the Rio, and while I could get the serial data out, the camera feed was just lost.
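For what it's worth, the serial side at least is just WPILib's SerialPort on the Rio's USB port. Something roughly along these lines; the baud rate and the (non-existent) parsing are placeholders for whatever your JeVois module actually emits:

```java
import edu.wpi.first.wpilibj.SerialPort;
import edu.wpi.first.wpilibj.TimedRobot;

public class Robot extends TimedRobot {
  private SerialPort jevois;

  @Override
  public void robotInit() {
    // JeVois serial-over-USB shows up on the Rio's USB port.
    // (115200 is the usual JeVois setting; USB-CDC mostly ignores the baud anyway.)
    jevois = new SerialPort(115200, SerialPort.Port.kUSB);
  }

  @Override
  public void robotPeriodic() {
    // Messages arrive as plain text lines from whatever serout the module is doing.
    String data = jevois.readString();
    if (!data.isEmpty()) {
      System.out.println("JeVois: " + data);
    }
  }
}
```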
We ended up using an Arducam 4K 8MP IMX219. I mean, it worked; we got a decent driver cam out of it, but nothing special. It had some frame-rate issues, holding closer to a stable 20-22 FPS. When we used it for AprilTag identification with the PhotonVision detection pipeline, our pose estimate was jumping 1-2 feet from where we thought we were. I think that was because we never tuned the pipeline properly (e.g. trying different PoseStrategy choices), a.k.a. user error. We never went very far down this path, so your mileage may vary (I think we need to try two cameras in the offseason to triangulate targets).
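If we do revisit it, the first thing I want to try is explicitly picking a PoseStrategy on PhotonLib's PhotonPoseEstimator rather than living with the default. A rough sketch against the 2023-era API (the constructor has shifted between PhotonLib versions, and the camera name and robot-to-camera transform below are made up):

```java
import edu.wpi.first.apriltag.AprilTagFields;
import edu.wpi.first.math.geometry.Rotation3d;
import edu.wpi.first.math.geometry.Transform3d;
import edu.wpi.first.math.geometry.Translation3d;
import java.util.Optional;
import org.photonvision.EstimatedRobotPose;
import org.photonvision.PhotonCamera;
import org.photonvision.PhotonPoseEstimator;
import org.photonvision.PhotonPoseEstimator.PoseStrategy;

public class VisionPose {
  private final PhotonCamera camera = new PhotonCamera("arducam");

  // Measure this carefully; a sloppy robot-to-camera transform by itself
  // can explain a foot or two of pose error.
  private final Transform3d robotToCam =
      new Transform3d(new Translation3d(0.30, 0.0, 0.50), new Rotation3d(0.0, 0.0, 0.0));

  private final PhotonPoseEstimator estimator =
      new PhotonPoseEstimator(
          AprilTagFields.k2023ChargedUp.loadAprilTagLayoutField(),
          PoseStrategy.MULTI_TAG_PNP, // one of several strategies worth comparing
          camera,
          robotToCam);

  // Feed the result into the drivetrain's pose estimator each loop, if you have one.
  public Optional<EstimatedRobotPose> getEstimate() {
    return estimator.update();
  }
}
```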
My lesson out of all this: finding webcams for robots is tough, mainly finding ones compatible with Linux/CameraServer… I'm actually thinking about running Windows next year just for the camera drivers (it really pains me to say it…). UVC compliance means a camera "should" work on Linux, but half the time it doesn't. Also, don't use the HD LifeCams; they have a terrible processor in them and will drop to 7 FPS on you with any movement.
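One thing worth doing before trusting a spec sheet: dump what modes the camera actually advertises to cscore. A quick sketch:

```java
import edu.wpi.first.cameraserver.CameraServer;
import edu.wpi.first.cscore.UsbCamera;
import edu.wpi.first.cscore.VideoMode;

public class CameraProbe {
  public static void dumpModes() {
    UsbCamera cam = CameraServer.startAutomaticCapture(0);
    // Print every format/resolution/FPS combo the Linux driver actually exposes.
    // This list is often shorter than what the spec sheet promises.
    for (VideoMode mode : cam.enumerateVideoModes()) {
      System.out.printf("%s %dx%d @ %d fps%n",
          mode.pixelFormat, mode.width, mode.height, mode.fps);
    }
  }
}
```

If the mode you want isn't in that list, no amount of setResolution()/setFPS() calls will get you there.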
PS: Why do people run Limelights at 90 FPS? Most robot code runs at a 20 ms loop time (50 Hz), so anything faster seems like a waste. I could see a stable 60 FPS being a good target, since a ~16.7 ms frame period means there's fresh data every loop.