Google Coral

The easiest and most broadly applicable solution for game piece detection is probably the Limelight, which has worked quite well for many teams. However, if you wish to build, train, and use your own models without the Limelight, there are several repositories out there using Python or C++ that can serve as inspiration.

Links:
GitHub - wpilibsuite/DetectCoral: Object detection for the FIRST Robotics Competition (WPI)
GitHub - ssysm/FTC-TFoD: FTC/FRC TensorFlow Object Detection Training Toolbox
GitHub - Arbitrary-2912/RaspberryPiCustomObjectDetection: Custom dataset object detection using TensorFlow Lite
… others that are just a search away

If you’re interested in going down this path, then I’d also suggest reading this post: Machine Learning or Computer Vision for Object Detection

Additionally, something I’ll point out is that standard contouring methods can also be used to detect game elements if the shape isn’t too intricate. I know a few teams used retroreflective contouring to detect cones this year. But if you decide machine learning models are the path for you, then in my experience a Raspberry Pi or Limelight isn’t fully sufficient on its own, and the results come with very low fps. If that’s the case, a Coral is a great way to supplement your existing hardware and expand your potential.
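To make the contour route concrete, here’s a minimal sketch using the OpenCV Java bindings that ship with WPILib. The HSV bounds and class name are placeholders to tune for your own game piece and lighting, not a drop-in pipeline:

```java
import org.opencv.core.*;
import org.opencv.imgproc.Imgproc;

import java.util.ArrayList;
import java.util.List;

// Minimal HSV-threshold + contour pipeline for a brightly colored game piece.
// The HSV bounds below are placeholders; tune them for your piece and lighting.
public class ContourDetector {
    private static final Scalar HSV_LOWER = new Scalar(10, 100, 100);  // example: yellow-ish cone
    private static final Scalar HSV_UPPER = new Scalar(35, 255, 255);

    /** Returns the bounding box of the largest matching contour, or null if none found. */
    public static Rect findLargestTarget(Mat bgrFrame) {
        // Convert to HSV and keep only pixels inside the color range.
        Mat hsv = new Mat();
        Imgproc.cvtColor(bgrFrame, hsv, Imgproc.COLOR_BGR2HSV);
        Mat mask = new Mat();
        Core.inRange(hsv, HSV_LOWER, HSV_UPPER, mask);

        // Trace the outlines of each colored blob.
        List<MatOfPoint> contours = new ArrayList<>();
        Imgproc.findContours(mask, contours, new Mat(),
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

        // Assume the largest blob is the game piece.
        Rect best = null;
        double bestArea = 0;
        for (MatOfPoint contour : contours) {
            double area = Imgproc.contourArea(contour);
            if (area > bestArea) {
                bestArea = area;
                best = Imgproc.boundingRect(contour);
            }
        }
        return best;  // null if nothing matched the color threshold
    }
}
```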


Ah, yeah, I use TensorFlow just because it’s more familiar. I get about 15 fps using the TPU for simple object detection.

I find TensorFlow very difficult to use for custom models. For basic detection, grabbing a MobileNet and transfer-learning it is fine, but making a heatmapping model is painful, and using transformer architectures is actually impossible on the Coral.


Did you guys use the neural classifier or the neural detector?

If you guys did use the detector models, that means Limelight already released them, which I didn’t know about before. The docs still say the detector models are WIP.

Used the model posted here!

RC might’ve had a different experience than us, but my team had a relatively narrow intake for cones, so it took a while to align properly with ground cones dropped at the chute or anywhere else on the field.

By using the LL with the Coral, we were able to detect cones on the floor and auto-rotate the robot so that the intake faced them. Then my drive joystick would change to just a forward/backward throttle to drive straight into the cone. In-match, ground pickup with it was about 3x as fast for us as without it.


Cool! Thanks.

Can you explain how you guys did this, if you don’t mind? Or at least any documents/sources you used to accomplish this?

The Limelight documentation is here: Getting Started with Neural Networks — Limelight 1.0 documentation

Our code is really simple: a rotational PID loop on the Limelight’s tx value.
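Roughly, a sketch of that in WPILib Java (the gains, class name, and tolerance here are illustrative assumptions, not our exact code):

```java
import edu.wpi.first.math.controller.PIDController;
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

// Rotate the chassis until the Limelight's tx (horizontal offset, degrees) reads 0,
// while the driver keeps control of forward/backward throttle.
public class AlignToTarget {
    private final NetworkTable limelight =
        NetworkTableInstance.getDefault().getTable("limelight");

    // kP/kI/kD are placeholders; tune on your own drivetrain.
    // The output sign may need flipping depending on your drive convention.
    private final PIDController rotationPid = new PIDController(0.03, 0.0, 0.001);

    public AlignToTarget() {
        rotationPid.setTolerance(1.0);  // degrees of acceptable error
    }

    /** Call periodically; feed the result into your drive's rotation input. */
    public double rotationOutput() {
        boolean hasTarget = limelight.getEntry("tv").getDouble(0.0) >= 1.0;
        if (!hasTarget) {
            return 0.0;  // no detection: don't spin
        }
        double tx = limelight.getEntry("tx").getDouble(0.0);
        return rotationPid.calculate(tx, 0.0);  // drive tx toward zero
    }
}
```

In teleop you’d pair this with the driver’s throttle, e.g. `arcadeDrive(throttle, rotationOutput())`, which matches the forward/backward-only joystick behavior described above.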

Can you explain how you created an algorithm to align with the detected game pieces? Any special placement of the Limelight, or use of ranging? And how did you accomplish this in code?

The Limelight returns a value, in degrees, of how far the target is from the horizontal center of the image. Therefore, to align to it, you must drive this value to 0. That is exactly what we did: run a PID loop on the robot’s rotation until the tx value is 0.

Here’s the code.

Placement of the Limelight was simple. I don’t have a picture on hand, but it was mounted on our upper linear arm (whose motion is dictated by a linear lower arm) at roughly a 45° angle behind vertical. This was ideal, as our infeed didn’t get in the way, and when the arms and infeed were in position to pick up cones and cubes it had a good view of the field.

In terms of the Limelight software, little to no tuning was required; a little adjustment of the confidence filter and we were good.

Thanks, but how does the Limelight estimate distance to the game pieces? Don’t you need a separate ranging algorithm for that?

This can be done through trigonometry, but we forwent it, as simply having the driver control forward movement is much more robust.
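For anyone curious, the trig version would look roughly like this; every constant below is a hypothetical placeholder for your own mounting geometry:

```java
// Classic fixed-camera ranging: with a known camera mount height and pitch,
// the Limelight's ty (vertical offset, degrees) gives distance to a target
// of known height. All constants here are placeholders for your robot.
public class RangeEstimator {
    private static final double CAMERA_HEIGHT_METERS = 0.50;   // lens height off the floor
    private static final double TARGET_HEIGHT_METERS = 0.15;   // e.g. center of a cone on the ground
    private static final double CAMERA_PITCH_DEGREES = -20.0;  // mount angle, negative = tilted down

    /** Horizontal floor distance from camera to target, given ty in degrees. */
    public static double distanceMeters(double tyDegrees) {
        double angleToTargetRad = Math.toRadians(CAMERA_PITCH_DEGREES + tyDegrees);
        // Height difference over the tangent of the total angle to the target.
        return (TARGET_HEIGHT_METERS - CAMERA_HEIGHT_METERS) / Math.tan(angleToTargetRad);
    }
}
```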

If I want to use a Google Coral on two Limelights, how many Corals should I buy, one or two?

2 Corals, one per LL.

Thanks!

We did something similar, albeit a lot less sophisticated.

Will it be possible to align accurately and efficiently using a tank drive? Were you guys using swerves?

What about your team? Did you use tank, or do you believe it would be implementable on tank?

It was decently accurate. We used tank drive that year.