Here’s an example of what you can do with the WPILib Machine Learning tool DetectCoral. This is my team’s (FRC Team 190) model, trained to 1500 epochs with all 5000 images of the dataset provided by WPILib labelled.
This video was automatically generated by the tool, using the testing.ipynb SageMaker Notebook. If you’ve tried testing your models, feel free to reply to this thread with your own video!
Feel free to submit a Pull Request, or if you’d rather, just DM me or reply here with a change; that works too. It was really tough to get this whole tool working and available to literally any FRC team that’s interested, so if you have a better idea of how something should work, go for it. I can also link the additional scripts that are downloaded when the Docker image is built.
The main issue with our model is that, in our dataset, we did not label Power Cells that were partially occluded by other Power Cells. Supplementing our dataset with pictures of Power Cells taken under brighter ceiling lights would also help.
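If you don’t have time to collect new images under brighter lights, a rough stopgap is to brighten copies of the images you already have. This is just a sketch and not part of the WPILib tool; the directory names and the 1.4 brightness factor are made up for illustration, and real photos under brighter lighting would still be better.

```python
# Hypothetical sketch: brighten existing dataset images to roughly
# approximate brighter ceiling lights. Not part of DetectCoral;
# SRC_DIR/DST_DIR are placeholder paths.
import os
from PIL import Image, ImageEnhance

SRC_DIR = "dataset/images"         # assumed location of original images
DST_DIR = "dataset/images_bright"  # assumed output location
os.makedirs(DST_DIR, exist_ok=True)

for name in os.listdir(SRC_DIR):
    if not name.lower().endswith((".jpg", ".png")):
        continue
    img = Image.open(os.path.join(SRC_DIR, name))
    # Scale brightness up by 40%; existing bounding-box labels stay valid
    # because only pixel values change, not geometry.
    bright = ImageEnhance.Brightness(img).enhance(1.4)
    bright.save(os.path.join(DST_DIR, name))
```

You’d then copy the matching label files over for the brightened images before retraining.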