[Work in Progress] Vision Tracking Protocol for Arduino

So I’ve been working on a little thing that I call the Vision Tracking Protocol (VTP). The idea is to use a CMU camera that can detect objects (the Pixy by Charmed Labs) with an Arduino and the RoboRIO to let the robot self-aim. The Pixy sends detection info to the Arduino, which the Arduino uses to tell the RoboRIO where the target is. The description is here; it’s a work in progress, though. Tell me what you think.
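Roughly, the Arduino side would look something like this. Just a minimal sketch: I’m assuming the stock Pixy Arduino library over SPI, a plain serial link to the RoboRIO, and a made-up "VTP,x,y,d" message; the distance estimate from blob width is a placeholder until I work out calibration.

```cpp
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;  // the Pixy talks to the Arduino over SPI by default

void setup() {
  Serial.begin(115200);  // link to the RoboRIO (exact transport still TBD)
  pixy.init();
}

void loop() {
  uint16_t blocks = pixy.getBlocks();  // number of blobs detected this frame
  if (blocks > 0) {
    // The library reportedly sorts blocks largest-first, so take block 0
    uint16_t x = pixy.blocks[0].x;
    uint16_t y = pixy.blocks[0].y;
    // Placeholder: use apparent width as a stand-in for distance D
    uint16_t d = pixy.blocks[0].width;
    Serial.print("VTP,");
    Serial.print(x);
    Serial.print(',');
    Serial.print(y);
    Serial.print(',');
    Serial.println(d);
  }
  delay(20);  // the Pixy updates at roughly 50 frames per second
}
```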

Great idea. Our team has offloaded vision to external processors a couple of times and it works well.

As there are frequently multiple targets, I wonder if a richer interface than X, Y, D would be more desirable, one that returned information about each detected target.
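Something like the sketch below, for instance, with one record per detected block. The header line and CSV framing here are invented for illustration; they aren’t part of VTP as written. This way the RoboRIO could pick which target to aim at.

```cpp
#include <SPI.h>
#include <Pixy.h>

// Hypothetical multi-target framing (not part of VTP as written):
//   VTP,<n>                    header: number of targets this frame
//   T,<sig>,<x>,<y>,<w>,<h>    one line per detected target
void sendTargets(Pixy &pixy, uint16_t blocks) {
  Serial.print("VTP,");
  Serial.println(blocks);
  for (uint16_t i = 0; i < blocks; i++) {
    Serial.print("T,");
    Serial.print(pixy.blocks[i].signature);
    Serial.print(',');
    Serial.print(pixy.blocks[i].x);
    Serial.print(',');
    Serial.print(pixy.blocks[i].y);
    Serial.print(',');
    Serial.print(pixy.blocks[i].width);
    Serial.print(',');
    Serial.println(pixy.blocks[i].height);
  }
}
```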

Steve

Cool project.

We’re always looking for new techniques and technologies to add to our own Arduino libraries.

Vision is currently missing, but would be a great addition, especially now that we have a lot more firepower in our new Gorgon controller.

Good luck.

Is an Arduino even powerful enough to implement tracking? I still occasionally use Arduino, but nowadays I try to use XMOS for the speed and multithreading. On the other hand, XMOS is pretty tricky to set up communication with if you’re not up for hand-coding SPI protocols straight out of a datasheet.
Arduino definitely has the best community of all the microcontrollers, so if you can get this working it would be crazy easy for other teams to adopt. I hope this goes well!

EDIT: The Google Doc has commenting turned on.

Thanks for the feedback, guys!
Like I said, it’s a work in progress, so anything could change at this point.
As for whether it’s powerful enough: in this design, the vision processing is actually done onboard the camera. It’s a smart little thing called a Pixy, and it does blob detection quite well.
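For a sense of how little the Arduino has to do: each blob the Pixy reports is already reduced to a handful of fields. If I remember the library header right, they look roughly like this (check TPixy.h in the library for the authoritative definition; the angle field only applies to color codes, I believe):

```cpp
#include <stdint.h>

// Per-blob data as exposed by the Pixy Arduino library (from memory)
struct Block {
  uint16_t signature;  // which trained color signature matched
  uint16_t x;          // blob center x, 0..319
  uint16_t y;          // blob center y, 0..199
  uint16_t width;      // blob width in pixels
  uint16_t height;     // blob height in pixels
  uint16_t angle;      // rotation angle; color codes only, newer library versions
};
```

So the Arduino never touches pixels; it just reads these little structs and repackages them for the RoboRIO.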