There is a new vision sample program included with the latest update of WPILibJ (just posted on the update site). In addition, the Documents section of the WPILib project now has a paper describing how it works, some sample images to play with, and the Vision Assistant script that was used to create the sample code.
Please provide some feedback on this sample and the paper.
This has already helped us a lot. Our team had been struggling for quite a while trying to figure out how to track but the paper and the sample code simplified it drastically! Thanks a bunch!
Thanks, Brad - this is very helpful. We are still in the process of figuring out whether we want to do onboard or offboard image processing, but at least with this update, doing it onboard is a viable option!
If we use a red LED ring, the sample code should work, correct? Is there anything we have to change for it to run correctly? We are having some trouble getting it to work…
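One thing worth checking with a red ring (this is a general color-thresholding gotcha, not something specific to the sample): red sits at the wrap-around point of the hue circle, so a single low/high hue threshold pair that works for green or blue may not match red pixels. A minimal sketch of the idea, with the hue range values purely illustrative:

```java
// Sketch only: the class name, hue ranges, and helper are hypothetical,
// not part of the WPILibJ sample. It illustrates why red thresholding
// often needs TWO hue ranges (near 360 and near 0).
public class RedThreshold {
    // Returns hue in degrees [0, 360) for an RGB pixel (standard conversion).
    static double hue(int r, int g, int b) {
        double rf = r / 255.0, gf = g / 255.0, bf = b / 255.0;
        double max = Math.max(rf, Math.max(gf, bf));
        double min = Math.min(rf, Math.min(gf, bf));
        double d = max - min;
        if (d == 0) return 0;            // gray pixel: hue undefined, use 0
        double h;
        if (max == rf)      h = ((gf - bf) / d) % 6;
        else if (max == gf) h = (bf - rf) / d + 2;
        else                h = (rf - gf) / d + 4;
        h *= 60;
        return h < 0 ? h + 360 : h;
    }

    // Red spans both ends of the hue circle, e.g. [340, 360) and [0, 20].
    static boolean isRed(int r, int g, int b) {
        double h = hue(r, g, b);
        return h >= 340 || h <= 20;
    }

    public static void main(String[] args) {
        System.out.println(isRed(255, 10, 10)); // pure red -> true
        System.out.println(isRed(10, 255, 10)); // green -> false
    }
}
```

If the sample's threshold values were generated from a Vision Assistant script tuned for a different color, regenerating or editing those thresholds for your red ring (keeping the wrap-around in mind) is the first thing to try.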
I have a question regarding the sample program: once we have an image, what do we do with it? How can I begin to translate the image into motor movement?
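A common starting point (a sketch, not the sample's own method) is proportional steering: take the x position of the detected target, measure how far it is from the image center, and turn harder the further off-center it is. The class name, image width, and gain below are all assumptions to tune for your setup:

```java
// Hypothetical sketch of turning a vision result into a drive command.
public class AimSketch {
    static final int IMAGE_WIDTH = 320; // assumed camera resolution
    static final double KP = 2.0;       // proportional gain (tune this)

    // targetCenterX: x pixel of the detected target's center.
    // Returns a turn command in [-1, 1]: negative = left, positive = right.
    static double turnCommand(double targetCenterX) {
        // Normalized error: -1 at left edge, 0 at center, +1 at right edge.
        double error = (targetCenterX - IMAGE_WIDTH / 2.0) / (IMAGE_WIDTH / 2.0);
        double cmd = KP * error;
        return Math.max(-1.0, Math.min(1.0, cmd)); // clamp to motor range
    }

    public static void main(String[] args) {
        System.out.println(turnCommand(160)); // centered -> 0.0
        System.out.println(turnCommand(240)); // right of center -> 1.0 (clamped)
    }
}
```

The resulting command could then be fed to your drive code as the rotation value of an arcade-style drive call each loop iteration, so the robot turns until the target is centered.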
The plugin update came at the same time as the new sample.
If your NetBeans isn't set to check for updates on every startup/day, go into the Plugins dialog and reload the Updates page.