Limelight Neural Network Note Detection

Does anyone know when (or if) Limelight is releasing a new neural detection model for 2024 notes? Or if any good third-party models are being made?

As of writing this, only 2023 is listed on their downloads page:

3 Likes

Being discussed here:

Also, there is a link on the Limelight website for uploading images for a 2024 dataset:

2 Likes

@Brandon_Hjelstrom Will Limelight be publicly releasing the images used this year?

When I do train a .tflite model, I’ll make sure it fits with whatever Limelight ends up doing, as long as someone else can test it, since my team isn’t using the Coral with a Limelight.

I just want to point out that one can certainly do “find notes on the ground” AT LEAST as accurately with a straight OpenCV pipeline as with a convnet. If your learners are doing an ML unit anyway, more power to you, but I think for a lot of folks it’d be a questionable allocation of effort.

It’s fairly straightforward to find orange/red things on the ground. It’s more difficult to distinguish notes from bumpers.
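For anyone who wants to try it, here’s a rough sketch of the kind of pipeline I mean: an HSV threshold for the note’s orange, plus a couple of shape checks to reject most bumper hits. All of the bounds and cutoffs below are guesses you’d have to tune on your own footage.

import cv2
import numpy as np

# Rough HSV bounds for the note's orange -- tune these on your own camera/footage.
HSV_LOW = np.array([5, 120, 120])
HSV_HIGH = np.array([20, 255, 255])

def find_notes(bgr_frame):
    """Return bounding boxes (x, y, w, h) of likely notes in a BGR frame."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, HSV_LOW, HSV_HIGH)

    # Clean up speckle before finding contours.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    notes = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h < 400:  # ignore tiny blobs
            continue
        aspect = w / float(h)
        fill = cv2.contourArea(c) / float(w * h)
        # A note on the floor looks like a wide, partly hollow ellipse;
        # bumpers tend to be long, solid rectangles. These cutoffs are guesses.
        if 1.2 < aspect < 5.0 and fill < 0.85:
            notes.append((x, y, w, h))
    return notes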

3 Likes

Dataset Colab allows FRC teams to upload their object detection / vision datasets and it automatically creates a combined dataset that is free to download in the COCO format (.TFRecord coming soon)! It currently has over 7,000 images that can be used to train models.

In the meantime, while that functionality is being implemented, I’ve been told that you can use a Pascal VOC Dataloader to consume Dataset Colab’s COCO dataset and train models for the Limelight.
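If it’s useful in the meantime, here’s a rough, untested sketch of one way to bridge the gap: converting the COCO annotation file into per-image Pascal VOC XML files that a VOC-style dataloader can read. The file and directory names are just placeholders for whatever your download actually looks like.

import json
import os
import xml.etree.ElementTree as ET

def coco_to_voc(coco_json_path, out_dir):
    """Write one Pascal VOC XML file per image from a COCO annotation file."""
    with open(coco_json_path) as f:
        coco = json.load(f)

    categories = {c["id"]: c["name"] for c in coco["categories"]}
    images = {img["id"]: img for img in coco["images"]}

    # Group annotations by image.
    by_image = {}
    for ann in coco["annotations"]:
        by_image.setdefault(ann["image_id"], []).append(ann)

    os.makedirs(out_dir, exist_ok=True)
    for image_id, anns in by_image.items():
        img = images[image_id]
        root = ET.Element("annotation")
        ET.SubElement(root, "filename").text = img["file_name"]
        size = ET.SubElement(root, "size")
        ET.SubElement(size, "width").text = str(img["width"])
        ET.SubElement(size, "height").text = str(img["height"])

        for ann in anns:
            x, y, w, h = ann["bbox"]  # COCO boxes are [x, y, width, height]
            obj = ET.SubElement(root, "object")
            ET.SubElement(obj, "name").text = categories[ann["category_id"]]
            box = ET.SubElement(obj, "bndbox")
            ET.SubElement(box, "xmin").text = str(int(x))
            ET.SubElement(box, "ymin").text = str(int(y))
            ET.SubElement(box, "xmax").text = str(int(x + w))
            ET.SubElement(box, "ymax").text = str(int(y + h))

        name = os.path.splitext(img["file_name"])[0] + ".xml"
        ET.ElementTree(root).write(os.path.join(out_dir, name))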

1 Like

New Google Coral-Compatible TensorFlow Models

Dataset Colab now has TensorFlow object detection models (SSD MobileNet v2 & EfficientDet) that can be used with the Google Coral and a variety of other accelerators! To get started, download a model from our models page and follow the guide on our documentation page.

  • SSD Mobilenet v2 (92.20% mAP50)
  • EfficientDet (94.50% mAP50)
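If you’re running one of the models above on a Coral outside of a Limelight, inference looks roughly like this with the pycoral library; the model, label, and image file names below are placeholders for whatever you downloaded.

# Minimal Edge TPU inference sketch using pycoral (file names are placeholders).
from PIL import Image
from pycoral.adapters import common, detect
from pycoral.utils.dataset import read_label_file
from pycoral.utils.edgetpu import make_interpreter

interpreter = make_interpreter("ssd_mobilenet_v2_edgetpu.tflite")
interpreter.allocate_tensors()
labels = read_label_file("labels.txt")

image = Image.open("frame.jpg").convert("RGB")
# Resize the frame to the model's expected input size.
_, scale = common.set_resized_input(
    interpreter, image.size, lambda size: image.resize(size, Image.LANCZOS))

interpreter.invoke()
for obj in detect.get_objects(interpreter, score_threshold=0.4, image_scale=scale):
    print(labels.get(obj.id, obj.id), obj.score, obj.bbox)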

New Only-Note and Only-Robot Models

In addition to these models, we also trained only-note and only-robot models. These models are more specialized and perform better. To download and use them, visit our models page.

Recommend New Models

If there are any other models that you would like to see on Dataset Colab, DM me and we would be happy to train them for you!

1 Like

Hi, when I tried to use the EfficientDet and SSD MobileNet v2 models from Dataset Colab, I could not get them to run well on a Limelight 2+ with a Google Coral. The FPS capped at around 10 and would not change regardless of the resolution or pipeline configs I tried. I also had this problem with Limelight’s official models and with models trained in Limelight’s Model Trainer. Is there a reason these models top out at 10 fps or lower? I really appreciate any help you can provide.

I was having similar issues. I found this neural network earlier on another thread and was unable to find it again afterward (luckily I had saved it to Google Drive). I don’t know what magic sauce is in it, but we have been getting ~80 fps at 640x480 on the Limelight 3. Make sure to upload the labels file first.
note_labels.txt (5 Bytes)

2 Likes

Wow, this model is much better. I think it achieves such high performance because none of its operations fall back to the CPU at runtime; everything is likely executed entirely on the Edge TPU.

Edge TPU Compiler version 16.0.384591198
Input: /content/limelight_neural_detector_8bit.tflite
Output: limelight_neural_detector_8bit_edgetpu.tflite

Operator                       Count      Status

CONCATENATION                  2          Mapped to Edge TPU
RESHAPE                        13         Mapped to Edge TPU
ADD                            10         Mapped to Edge TPU
CUSTOM                         1          Operation is working on an unsupported data type
CONV_2D                        55         Mapped to Edge TPU
LOGISTIC                       1          Mapped to Edge TPU
QUANTIZE                       1          Mapped to Edge TPU
DEQUANTIZE                     2          Operation is working on an unsupported data type
DEPTHWISE_CONV_2D              17         Mapped to Edge TPU

This was the log from compiling an 8-bit integer-quantized model in Limelight Model Trainer. As you can see, some operations are not mapped to the Edge TPU and instead run on the CPU, which is much slower and hurts the model’s latency and fps.

Model successfully compiled but not all operations are supported by the Edge TPU. A percentage of the model will instead run on the CPU, which is slower. If possible, consider updating your model to use only operations supported by the Edge TPU. For details, visit g.co/coral/model-reqs.
Number of operations that will run on Edge TPU: 99
Number of operations that will run on CPU: 3

This was also in the terminal log from the same Edge TPU compile. So the fast model from Google Drive probably has everything mapped to the Edge TPU, which eliminates that bottleneck and improves performance. So far, though, I haven’t found a way to get everything mapped to the Edge TPU myself.
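In case it helps anyone chasing the same thing: the DEQUANTIZE ops are usually what you get when the converted model still has float32 inputs/outputs, and (if I understand correctly) the CUSTOM op is the SSD detection post-processing step, which stays on the CPU either way. Below is roughly the full-integer conversion setup I’d try next, assuming you have the original TensorFlow SavedModel and some sample images; it’s a sketch I haven’t verified on the Limelight, and load_sample_images is a placeholder for your own loader.

import tensorflow as tf

def representative_dataset():
    # Yield a few hundred preprocessed sample frames shaped like the model input.
    for image in load_sample_images():  # placeholder: your own image loader
        yield [image]

converter = tf.lite.TFLiteConverter.from_saved_model("exported_model/saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force every op to its int8 kernel and keep the I/O tensors integer as well,
# so the converter doesn't have to insert float QUANTIZE/DEQUANTIZE ops.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.allow_custom_ops = True  # the SSD post-processing op is a custom op
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

with open("detector_8bit.tflite", "wb") as f:
    f.write(converter.convert())

# Then compile for the accelerator:
#   edgetpu_compiler detector_8bit.tflite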

Thank you for the insight. Have you been able to run this model on your Limelight yet?

Yes, the model I trained that didn’t run fully on the Edge TPU got 10 fps, but the model shared on Google Drive was able to get 20 fps (720) on a Limelight 2+.