Power Cell Detection Video

Here’s an example of what you can do with the WPILib Machine Learning tool DetectCoral. This is my team’s (FRC team 190) model, which was trained to 1500 epochs using all 5000 labelled images of the dataset provided by WPILib.

This video was automatically generated by the tool, using the testing.ipynb SageMaker Notebook. If you’ve tried testing your models, feel free to reply to this thread with your own video!


Just to add, this model runs on a Raspberry Pi 4 at 20-30fps, and outputs detection data to NetworkTables and the inference video to an MJPEG stream.
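Since the detection data lands in NetworkTables, the robot-side code mostly just has to turn bounding boxes into aiming targets. Here's a minimal sketch of that conversion step; the flat `[xmin, ymin, xmax, ymax, ...]` layout and the frame size are assumptions for illustration, not DetectCoral's actual published format.

```python
# Hypothetical sketch: converting flattened bounding-box data (as a
# coprocessor might publish it to NetworkTables) into normalized target
# centers for aiming. The array layout and frame size are assumptions,
# not DetectCoral's documented output format.

def boxes_to_centers(boxes, img_width=320, img_height=240):
    """Convert a flat [xmin, ymin, xmax, ymax, ...] list into (cx, cy)
    centers normalized to [-1, 1], with (0, 0) at the frame center."""
    centers = []
    for i in range(0, len(boxes) - 3, 4):
        xmin, ymin, xmax, ymax = boxes[i:i + 4]
        cx = ((xmin + xmax) / 2) / img_width * 2 - 1
        cy = ((ymin + ymax) / 2) / img_height * 2 - 1
        centers.append((cx, cy))
    return centers

# A Power Cell box centered in a 320x240 frame maps to (0.0, 0.0),
# which a drive loop could feed straight into a turn-to-target controller.
print(boxes_to_centers([140, 100, 180, 140]))  # [(0.0, 0.0)]
```

On the robot you'd fill `boxes` from whatever NetworkTables entry the Pi publishes and pass `cx` to a steering controller; the normalization keeps that controller independent of the camera resolution.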

Nice results! If you want to post the ipynb, I might be able to suggest a change or two to get that model more robust.


Go for it! You can find the training script run by AWS SageMaker here: https://github.com/wpilibsuite/DetectCoral/blob/dev/container/coral/train

The notebook can be found here: https://github.com/wpilibsuite/DetectCoral/blob/master/training.ipynb

Feel free to submit a Pull Request; if you’d rather just DM me or reply here with a change, that works too. It was really tough to get this whole tool working and available to literally any FRC team that’s interested, so if you have a better idea of how something should work, go for it. I can also link the additional scripts that are downloaded when the Docker image is built.

The main issue with our model is that our dataset did not label Power Cells that were partially occluded by other Power Cells. Supplementing the dataset with pictures of Power Cells under brighter ceiling lights would also help.
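One way to cover the brighter-lighting case without reshooting images is to augment the existing dataset with brightness-scaled copies. This is just a sketch of the idea, not something DetectCoral's training pipeline does itself; in practice you'd apply it with an image library (e.g. Pillow's `ImageEnhance.Brightness`), but the core operation is a per-pixel scale with clamping:

```python
# Hypothetical sketch of a brightness augmentation pass, to synthesize
# "brighter ceiling lights" variants of existing dataset images before
# retraining. Not part of DetectCoral's actual pipeline.

def adjust_brightness(pixels, factor):
    """Scale 8-bit pixel values by `factor`, clamping to [0, 255]."""
    return [min(255, int(p * factor)) for p in pixels]

# A 1.5x-brightened copy of the same labelled image teaches the model
# that Power Cells stay Power Cells under harsher lighting; dark pixels
# stay dark and bright pixels saturate, as they do in a real camera.
print(adjust_brightness([0, 100, 200], 1.5))  # [0, 150, 255]
```

Because the bounding boxes don't move under a brightness change, the existing labels can be reused verbatim for the augmented copies, which makes this one of the cheapest ways to grow the dataset.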
