My team added vision processing to our autonomous code between competitions to supplement our dual camera setup that was already in use at our first competition.
The cameras worked fairly well for our drivers, especially once we added cross-hairs to the feed for easier alignment. Lining up still took a bit of time, but the cross-hairs did a great job of keeping our shooting on target. We were able to send a USB camera feed and an image from an IP camera (for processing) with little trouble at events.
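For anyone wanting to try the cross-hairs trick: overlaying them on a frame before it goes to the dashboard is only a few lines. This is just an illustrative Python/numpy sketch (not our actual pipeline; frame size and color are placeholders):

```python
import numpy as np

def draw_crosshairs(frame, color=(0, 255, 0), thickness=1):
    """Overlay centered cross-hairs on an HxWx3 image array (modifies in place)."""
    h, w = frame.shape[:2]
    cy, cx = h // 2, w // 2
    frame[cy - thickness + 1:cy + thickness, :] = color   # horizontal line
    frame[:, cx - thickness + 1:cx + thickness] = color   # vertical line
    return frame

# Example: draw on a black 120x160 test frame
frame = np.zeros((120, 160, 3), dtype=np.uint8)
draw_crosshairs(frame)
```

In a real setup you would run this on each frame grabbed from the camera before re-streaming it to the driver station.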
Our vision tracking worked twice throughout our second competition; most of our troubles were rooted in not having a carpeted area large enough to test on. We elected to process the image on our driver station laptop and encountered quite a bit of lag through the IP camera and NetworkTables, but we were still able to make it accurate enough for two working shots in competition.
We did not have problems with the network (besides the lag) as long as we accessed the camera via its name, not IP address (e.g., axis-camera.local, not 10.32.38.11).
I’m sure our system could have worked from the defenses if our shooter had that capability.
We use LabVIEW and the Microsoft camera. At first it was off, but after tuning it has been 100% reliable. At the first district event it would not work correctly for about 90% of the event; toward the end, after refinement, it started to work, and at the second district event it didn't miss.
We use it for all shooting. It makes the auto high goal 100% of the time. It missed once because the ball bounced out of the bot crossing over, but that was not the shooter's fault, obviously.
Ours is retroreflective targeting with cross-hairs. In auto it looks for a match above a threshold and then shoots. In teleop, it puts a box up and the drive team aligns the cross-hairs and shoots.
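Our actual code is LabVIEW, but the threshold-match idea sketches easily in Python with numpy. All the numbers below (color bounds, match fraction, centering tolerance) are made-up placeholders, not our tuned values:

```python
import numpy as np

# Hypothetical tuning values -- real numbers come from on-field calibration.
MIN_MATCH_FRACTION = 0.01   # fraction of pixels that must match the target color
MAX_CENTER_ERROR_PX = 10    # how close the blob centroid must be to the cross-hairs

def should_shoot(frame):
    """Return (shoot, error_x) given an HxWx3 BGR frame of the lit target."""
    b = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    r = frame[..., 2].astype(int)
    # Bright-green mask: green channel strong and dominant (retroreflective glow).
    mask = (g > 200) & (g > b + 50) & (g > r + 50)
    if mask.mean() < MIN_MATCH_FRACTION:
        return False, 0.0                 # no target visible -- hold fire
    xs = np.nonzero(mask)[1]
    error_x = xs.mean() - frame.shape[1] / 2   # pixels off the vertical cross-hair
    return abs(error_x) <= MAX_CENTER_ERROR_PX, error_x
```

In auto this decides when to fire; in teleop the same `error_x` could drive the box overlay the drivers line up on.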
To make the adjustments, the shooter is on a turret that does azimuth and elevation. The software makes the turret track the target so the bot does not have to move; we just need to be in a general area, maybe ±3-4 feet. The margin can be bigger than that, but to be most accurate we keep it in that range.
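The turret-tracking loop is essentially proportional control on the pixel error. Here is a minimal Python sketch of the idea (again, our real code is LabVIEW; the gains, deadband, and frame size are invented placeholders):

```python
# Hypothetical proportional turret control: map the target's pixel offset from
# image center to azimuth/elevation motor commands. Gains are made-up values.
KP_AZIMUTH = 0.004     # motor output per pixel of horizontal error
KP_ELEVATION = 0.004   # motor output per pixel of vertical error
DEADBAND_PX = 2        # ignore tiny errors so the turret doesn't hunt

def turret_command(target_x, target_y, frame_w=320, frame_h=240):
    """Return (azimuth, elevation) motor outputs in [-1, 1] from target pixel position."""
    err_x = target_x - frame_w / 2
    err_y = target_y - frame_h / 2
    az = 0.0 if abs(err_x) <= DEADBAND_PX else max(-1.0, min(1.0, KP_AZIMUTH * err_x))
    el = 0.0 if abs(err_y) <= DEADBAND_PX else max(-1.0, min(1.0, KP_ELEVATION * err_y))
    return az, el
```

Run once per frame, this nudges the turret toward the target until the error falls inside the deadband.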
Thank you so much! We don't have a turret, so we will have to adjust the robot's position, but I would think it would be a matter of just telling different motor controllers to do something. But I am a mechanical mentor, so I may be off base with that thought.
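That intuition is basically right: without a turret you feed the same vision error to the drivetrain instead, spinning the robot in place until the target is centered. A hedged Python sketch of one iteration of that loop (gain, friction floor, and tolerance are all made-up numbers to tune on carpet):

```python
# Hypothetical no-turret alignment: turn the whole robot using tank drive.
KP_TURN = 0.005        # made-up gain: motor output per pixel of error
MIN_OUTPUT = 0.15      # minimum command needed to overcome drivetrain friction
TOLERANCE_PX = 3       # close enough to the cross-hairs to stop and shoot

def align_step(error_x_px):
    """Return (left, right) tank-drive outputs for one control iteration."""
    if abs(error_x_px) <= TOLERANCE_PX:
        return 0.0, 0.0                      # aligned -- stop turning
    out = KP_TURN * error_x_px
    # Keep enough output to actually move, capped at full throttle.
    out = max(MIN_OUTPUT, min(1.0, abs(out))) * (1 if error_x_px > 0 else -1)
    return out, -out                         # opposite sides -> spin in place
```

The friction floor matters in practice: a pure proportional command too small to turn the drivetrain leaves the robot stuck just outside tolerance.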
Very accurate. Typically when we miss a shot it isn’t due to the vision.
The biggest issue we have had on the field is the co-processor (a BeagleBone Black) not booting up. We think we have determined the root cause, so that shouldn't happen anymore. One other problem we had while tuning on the field, though it wasn't a problem during matches, was the sponsor loop running on the screen behind the goal. The sponsor loop should never run during matches, but the bright white that dominates the screen made tuning a little challenging until the AV crew switched over to the field camera for us.
A green LED ring bouncing light off of the retroreflective tape.
Yes. It was designed specifically to work while touching the Outer Works.