So my team has a Limelight that we are using for high-goal tracking with good success. We have the capability to do this with both the PhotonVision image for the Limelight and the official Limelight image.
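For context, on the RIO side we just read the standard targeting entries off NetworkTables; a minimal sketch looks something like this (the "limelight" table name and the tv/tx keys are the stock defaults):

```java
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class LimelightReader {
  // "limelight" is the default table name; tv/tx are standard Limelight keys.
  private final NetworkTable table =
      NetworkTableInstance.getDefault().getTable("limelight");

  /** True when the current pipeline sees a valid target. */
  public boolean hasTarget() {
    return table.getEntry("tv").getDouble(0.0) >= 1.0;
  }

  /** Horizontal offset from crosshair to target, in degrees. */
  public double getTx() {
    return table.getEntry("tx").getDouble(0.0);
  }
}
```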
We are also considering doing ball tracking. Using PhotonVision, we can run pipelines on the USB camera plugged into the Limelight, and we have a solid USB camera for doing this.
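On the PhotonVision side, PhotonLib makes the USB camera's pipeline results similarly easy to consume; a rough sketch (the camera name "USB_Camera" is a placeholder for whatever you named it in the PhotonVision UI):

```java
import org.photonvision.PhotonCamera;
import org.photonvision.targeting.PhotonPipelineResult;
import org.photonvision.targeting.PhotonTrackedTarget;

public class CargoCamera {
  // Name must match the camera name configured in the PhotonVision UI.
  private final PhotonCamera camera = new PhotonCamera("USB_Camera");

  /** Returns the yaw to the best target, or null if nothing is seen. */
  public Double getTargetYaw() {
    PhotonPipelineResult result = camera.getLatestResult();
    if (!result.hasTargets()) {
      return null;
    }
    PhotonTrackedTarget target = result.getBestTarget();
    return target.getYaw();
  }
}
```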
But ball tracking has a lot of risks. The algorithm messes up and hits a partner; now you are a DNP. And tuning it to track the balls instead of bumpers makes the tracking less reliable, to the point where the ball has to be super close to the camera. In addition, with both pipelines running, the Limelight gets pretty close to its CPU limit.
I can’t imagine that ball tracking would be much faster than a driver in teleop, and it would definitely be much less reliable. So really the only time we would be using it would be during auto, where we know exactly where the ball is. And we’re pretty close on trajectory following, so we should be able to reach those balls anyway, most of the time.
As it is, I’m strongly considering abandoning ball tracking. But I’m looking for any suggestions: extra reasons I should or shouldn’t abandon it.
It sounds like you have a pretty good start on a decision matrix. List the pros and cons, and then, considering the cons, ask yourself: are there better things you could be spending your time on? Would you get a bigger benefit from working on something else?
Did any top teams use ball tracking in 2020 (or any game-piece tracking in previous years)? Whether a robot is tracking game pieces, following pre-planned auto paths, or being driven manually isn’t something that can be easily determined from the stands or on video.
This is a valid point; the key is to develop an auto routine that strikes the right balance between scoring points and avoiding collisions.
This hasn’t been an issue for us. How are you tuning your pipelines?
Why would you run several pipelines in parallel? AFAIK this isn’t even possible with the Limelight software.
Why not use both? Use a pre-defined trajectory to get you “pretty close” to the cargo’s starting location, and then switch to the vision system to correct for any remaining error. For added safety, you could cancel the routine if your robot starts moving too far away from the expected location.
I would continue to look into it. Start with a simple routine that only uses pre-defined trajectories, and if that works well for you, great. If not, see how using the Limelight for feedback can help improve your routine’s accuracy; one way to structure that handoff is sketched below.
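A rough command-based sketch of the handoff, assuming your drivetrain already provides odometry and you have your own trajectory-following and vision-chase commands (followTrajectory, visionChase, and robotPose are all placeholder names here):

```java
import java.util.function.Supplier;

import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.wpilibj2.command.Command;

public class AutoCargoPickup {
  /**
   * Follow the pre-planned trajectory, then let vision take over, but end
   * the chase if odometry says we've strayed too far from where the cargo
   * is supposed to be.
   */
  public static Command pickupRoutine(
      Command followTrajectory,
      Command visionChase,
      Supplier<Pose2d> robotPose,
      Pose2d expectedCargoPose,
      double maxErrorMeters) {
    return followTrajectory.andThen(
        visionChase.until(() ->
            robotPose.get().getTranslation()
                .getDistance(expectedCargoPose.getTranslation())
                > maxErrorMeters));
  }
}
```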
Have you tested with a single set of blue/red bumpers, or with multiple shades of both? Teams use a variety of fabrics and shades, especially blues, so I’m wondering whether the problem is as bad in all cases as what you’re currently seeing.
In general, this sounds like a feature that’s really fun to play with, and fun to explain. I’d weigh the novelty and joy as a factor in your decisions, especially if you’re able to use it in the specific context where it’s most valuable.
The combination of high level strategic thinking and exciting technical execution will place you very high in consideration for controls awards.
What we’ve found is that the brighter the lights (we’re testing out off-road car lights), the easier it is to get contrast between the ball and everything else, which makes it easier to use contour detection to decide whether we’re looking at a bumper or a ball.
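For anyone curious what the bumper-vs-ball contour test can look like if you prototype it outside PhotonVision’s built-in pipeline, here’s a rough sketch using the OpenCV that ships with WPILib. The HSV bounds and the circularity/area thresholds are made-up numbers you’d have to tune for your own camera and lighting:

```java
import java.util.ArrayList;
import java.util.List;

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint;
import org.opencv.core.MatOfPoint2f;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;

public class BallContourFilter {
  // Placeholder HSV bounds for a red cargo ball; tune these on real footage.
  private static final Scalar RED_LOW = new Scalar(0, 120, 120);
  private static final Scalar RED_HIGH = new Scalar(10, 255, 255);

  /** Keeps only contours that are round enough to be a ball, not a bumper. */
  public static List<MatOfPoint> findBallContours(Mat bgrFrame) {
    Mat hsv = new Mat();
    Imgproc.cvtColor(bgrFrame, hsv, Imgproc.COLOR_BGR2HSV);

    Mat mask = new Mat();
    Core.inRange(hsv, RED_LOW, RED_HIGH, mask);

    List<MatOfPoint> contours = new ArrayList<>();
    Imgproc.findContours(mask, contours, new Mat(),
        Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

    List<MatOfPoint> balls = new ArrayList<>();
    for (MatOfPoint contour : contours) {
      double area = Imgproc.contourArea(contour);
      double perimeter =
          Imgproc.arcLength(new MatOfPoint2f(contour.toArray()), true);
      if (perimeter <= 0) {
        continue;
      }
      // Circularity = 4*pi*area / perimeter^2; 1.0 for a perfect circle.
      // A bumper's long rectangular blob scores much lower than a ball.
      double circularity = 4 * Math.PI * area / (perimeter * perimeter);
      if (circularity > 0.7 && area > 200) {
        balls.add(contour);
      }
    }
    return balls;
  }
}
```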