During the off-season, some of the students on 9016 were experimenting with the Google Coral, combining it with the Limelight to build real-time object detection, classification, and tracking.
What is Google Coral?
Google Coral is a line of specialized hardware from Google built around the Edge TPU, an accelerator designed to run machine-learning inference, specifically TensorFlow Lite models, quickly and efficiently on embedded systems.
The Fusion of Limelight & Google Coral:
Here is how to integrate the Google Coral into your robot, blending the Limelight's tracking capabilities with the Coral's object detection and classification.
- Train the AI Model with Google's Teachable Machine: Start at https://teachablemachine.withgoogle.com/ to train your model. Feed it images or videos for your specific detection task (e.g., 9016 fed in several pictures of cones and cubes to train a classifier network). Once you are ready, export the model in TFLite format.
- Upload your TFLite model to your robot: First, download the necessary files from Downloads – Limelight Vision. Then navigate to http://10.XX.XX.24:5801/ (using your team number) and select the “Classifier” pipeline. Go to the uploads tab and upload both the .txt labels file and the .tflite file for your trained model.
- Real-time Detection in Action: Once integrated, the Coral runs your trained model on frames grabbed from the Limelight, classifying the objects in each frame, while the Limelight identifies and locates the object in real time. This combo offers both precise detection and consistent tracking!
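Once the Classifier pipeline is running, the Limelight exposes its results to robot code (for example through NetworkTables keys such as tclass, or as a JSON results dump). As a rough illustration, here is a minimal Python sketch of pulling the top classification out of that JSON. The key names ("Results", "Classifier", "class", "conf") and the sample payload are assumptions modeled on Limelight's documented results format, so verify the exact shape against your firmware before relying on it:

```python
import json

# Hypothetical sample payload, shaped like Limelight's JSON results dump.
# Check your own Limelight's results output for the real key names.
SAMPLE = """
{
  "Results": {
    "Classifier": [
      {"class": "cone", "classID": 0, "conf": 0.94},
      {"class": "cube", "classID": 1, "conf": 0.05}
    ],
    "tv": 1
  }
}
"""

def best_class(results_json: str):
    """Return (label, confidence) for the top classifier result, or None."""
    results = json.loads(results_json).get("Results", {})
    entries = results.get("Classifier", [])
    if not entries:
        return None
    # Pick the entry with the highest confidence score.
    top = max(entries, key=lambda e: e.get("conf", 0.0))
    return top["class"], top["conf"]

print(best_class(SAMPLE))  # ('cone', 0.94)
```

In real robot code you would poll this each loop iteration and pair the class label with the Limelight's tx/ty targeting values to drive toward the detected game piece.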
Video demonstration from the spring: https://www.youtube.com/watch?v=6EYJzMOUVr0&ab_channel=LisulElvitigala