Does anyone know how the Google Coral is? Would you recommend it?
We want to take our program one step up and don't know exactly what we need.
I personally do not recommend it, as you need to stick to older TensorFlow models. If you do any development in frameworks like PyTorch it's unusable, and developing in TensorFlow still results in a large number of operations running on the CPU. If you really want to do ML, either using the NPU in the Orange Pi 5 or getting your hands on a newer NVIDIA Jetson is going to be a lot easier in terms of actually deploying models.
The Coral AI site has some documentation that might be helpful: TensorFlow models on the Edge TPU | Coral
When you say your program, are you talking about your robotics program as a whole, or a software program you’ve developed?
Our robot program, we want more automation
I'd recommend taking a platform with PyTorch support and training up a YOLO model; they tend to be robust with small datasets and are very easy to set up, with lots of documentation.
They're great and easy to use with a Limelight. There are better places to improve your robot programming, but that's a separate topic.
I'm sure they're easy with Limelight's supported models designed to work with a Coral, but if you're developing systems on your own, you'll notice most new developments rely on operations that the Coral's TPU ASIC does not support due to its age.
Alright, that helps. What is your goal with wanting more automation? (Ie, what do you want to automate?)
We used a Coral with our Limelight 2(+). Detection of game pieces worked exceptionally well and using it to auto-align was a great bonus feature. Almost entirely plug and play with minimal tuning required.
To take load off the driver and be more consistent.
Taking last season as an example: identifying game pieces and then automatically picking them up,
and finding our location on the field (during autonomous too).
The Google Coral is primarily useful when paired with a Limelight, as that unlocks ML capability. All you need is a Limelight 2 or above (I'm pretty sure), and you can run object detection + classification models on the Limelight through the same interface as the regular pipelines!
Used a coral + limelight 3. Worked fantastic and would recommend to any team.
Hey, man! Did you guys use the model provided by Limelight, or did y'all train your own model?
Just used whatever LL had.
thanks for the quick response!
I believe the Limelight models are also in a standard ML model format, since they're sourced from community data. If that's the case, you should be able to run them on any hardware that's powerful enough.
I haven't run into this compatibility problem, but I've only done pretty simple things. Could you say more about what hasn't worked for you?
What does "worked fantastic" mean? Can you provide any data showing it made a difference competitively?
So this isn't as much of a problem if you develop in TensorFlow, but I developed all my models in PyTorch. PyTorch requires going .pt → .onnx → .pb frozen graph → .tflite → int8-quantized .tflite, and this creates a lot of compatibility issues. A lot of operations have no TPU support and run on the CPU, some models will downright refuse to convert due to TPU constraints, and the other issue is the 4 MB RAM limit.