I wanted to type this in for posterity in the hopes that it helps someone else with vision problems.
History:
We are using a Limelight for targeting when shooting. We also wanted a driver camera to find balls behind the goal. Initially we used the Limelight by itself, purchased an ELP camera from Amazon, plugged it into the roboRIO, and tried to stream it over the standard LabVIEW dashboard.
We found that the ELP camera, which streamed H.265/H.264, would crash the dashboard over and over due to an underlying code problem. (The issue has been replicated by NI, though it isn't necessarily an NI issue.)
We purchased a different ELP camera that streamed via MJPEG to rectify that problem.
We plugged the ELP into the Limelight's USB port to use its picture-in-picture feature, and the stream appeared almost instantly; we thought we were in business. However, once the ELP was streaming through the Limelight, we began seeing Limelight disconnects, crashes, and periods where the ELP would not stream at all. We made sure we were up to date on all our versions and patches, but the problem continued through our first competition.
This became one of our key "next steps" to improve robot performance. We have specific location shots programmed into the robot that don't need a Limelight, but it still significantly hampers our cycle time when we have to drive to a location to shoot.
The first thing we tried on returning was plugging the MJPEG ELP into the roboRIO USB port. This streamed to the dashboard on the first try without a problem. But when we ran our first auto routine, the robot overshot all its marks and ended up crashing into the driver station wall. Driving it in teleop proved nearly impossible due to the delayed response.
We know that we are on the hairy edge with both our CAN bus communications and our CPU. Our CPU has been running at 78-85%, and we work very hard to never run above that due to previous experience. Our CAN bus utilization averages around 80%, and we've pulled motors off of the CAN bus and put them on PWM until the errors mostly stopped.
Putting the ELP camera on the roboRIO increased our CPU usage by 10-15%, and the added load caused the CAN bus to start throwing errors as well. That dropped odometry information, and things got ugly.
We pulled the ELP camera off the roboRIO USB port and tried a Microsoft LifeCam to verify it was the same situation; we saw a similar result with the CPU usage.
Our next step was to try a coprocessor. We grabbed a Raspberry Pi 3B and put the WPILib Pi image on it. We were able to configure it easily and got the camera streaming in the web interface, but the dashboard could not see the stream. We found (by accident) that if we plugged two cameras into the Pi, they both became active and we could stream to the dashboard, but a buggy option isn't really an option.
We loaded up a couple of Axis cameras and got them streaming, but they weren't an ideal option since we'd have to attach a fisheye lens to compensate for their narrow field of view. Likewise, we grabbed an old Limelight 1 out of storage but hit the same narrow field of view problem. Still not a good option.
Finally, we took the same Raspberry Pi from before and loaded up PhotonVision. It took about 20 minutes. We plugged in the ELP camera and it streamed over the web interface. We connected it to the robot, and it streamed over the dashboard within another 20 minutes. So we plugged the Pi into the roboRIO USB port for power, connected the Pi by Ethernet to our switch, and everything has been good since. I wish I'd known how quickly and easily PhotonVision would integrate. It was seamless.
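In case it helps anyone replicating this setup: with PhotonVision on the Pi, the dashboard just needs the stream URL. A minimal sketch, assuming PhotonVision's documented defaults (web UI on port 5800, the first camera's MJPEG stream starting at port 1181); verify these against your own install's web interface:

```
# Assumed PhotonVision defaults — confirm on your own install
Web interface:        http://photonvision.local:5800
Driver camera stream: http://photonvision.local:1181/stream.mjpg
```

Since the Pi talks to the robot network over Ethernet, the roboRIO USB connection in this setup is power only, so the stream never touches the roboRIO's CPU.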
Unfortunately, we only have one Raspberry Pi; I would love to have a spare for competition. But if you are looking for an inexpensive way to stream a driver camera to the driver station, a Raspberry Pi running PhotonVision is likely your best bet.
Likewise, if you can't afford a Limelight, the PhotonVision software on a Pi with a USB camera and an LED ring seems to be a VERY good solution.