Bill thanks for posting this...

I want to open this thread with some of the difficulties we had in pulling off a task like this... as posted in the cheesy thread.
This was our attempt at it... getting the tracking detection working proved to be somewhat of a challenge. If you look at my avatar you'll see I have last year's bumper around my waist... I was on my way to test that bumper against a blue ball to see how the color would impact the geometry detection of the ball. We found a solution that could splice the bumper out from the ball, but it (using imaq calls from NI Vision) was expensive in processing time.
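The color-splicing idea above can be sketched as a simple threshold step that keeps only "ball-colored" pixels before any shape detection runs. Our real pipeline used NI Vision imaq calls; this numpy version is just an illustration of the idea, and all the threshold values here are made up, not our tuned numbers.

```python
import numpy as np

# Illustrative only: band-threshold an RGB image to keep "ball blue" pixels
# and drop red-bumper pixels. The lo/hi bounds are invented for this sketch;
# real values would come from tuning against field lighting.
def ball_mask(rgb, lo=(0, 0, 120), hi=(90, 90, 255)):
    """Boolean mask of pixels whose RGB values fall inside the lo..hi band."""
    img = np.asarray(rgb)
    lo = np.array(lo)
    hi = np.array(hi)
    # A pixel survives only if every channel is inside its band.
    return np.all((img >= lo) & (img <= hi), axis=-1)
```

The geometry/circle detection would then run only on the masked pixels, so a red bumper in frame never gets counted as part of the ball.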
Here is our project... it has been ongoing since 2012. As I write this it is due for a source update, so I'll get it updated soon and post back when it has the ball tracking.
Ok, so that is one issue... for the most part we can track the ball, but then how do we move to the angle... this is not so trivial, at least for us. We have been pushing the idea of moving to h264 to keep the bandwidth down, and this year it was finally successful, as shown.
Here we get about 1.5 megabits per second or less pushing 800x600 at unlimited framerate on the M1013, as shown in these video clips. The only drawback is the latency, which for camera tracking is a bit challenging: it can cause oscillations. Our solution is to wait out the latency in bursts. I'm not sure it is the best solution, but it was effective in our tests. I think the most attractive piece of this solution is the ability to re-use the same resources that are otherwise used for live streaming feedback for the drive team.
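The "wait out the latency in bursts" idea can be sketched as a loop that issues one discrete correction, then deliberately pauses past the camera latency before trusting the next measurement. Everything here (names, gain, latency figure) is illustrative, not our actual code:

```python
# Sketch of burst-style alignment: turn, wait past the stream latency,
# re-measure, repeat. Pausing between bursts means each measurement
# reflects a completed move, which avoids the oscillation a continuous
# loop would get from acting on stale frames. Numbers are made up.

CAMERA_LATENCY_S = 0.3   # assumed end-to-end h264 latency
BURST_GAIN = 0.7         # fraction of the measured error corrected per burst

def burst_align(initial_error_deg, tolerance_deg=1.0, max_bursts=20):
    """Close the heading error in discrete bursts; returns (error, bursts)."""
    error = initial_error_deg
    bursts = 0
    while abs(error) > tolerance_deg and bursts < max_bursts:
        correction = BURST_GAIN * error
        # ...issue the timed turn here, then wait at least CAMERA_LATENCY_S
        # before reading the next tracking result...
        error -= correction  # stand-in for the freshly re-measured error
        bursts += 1
    return error, bursts
```

The trade-off is obvious: each burst eats a full latency period, so alignment takes a few camera frames instead of being continuous, but it converges without hunting.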
Even if there were no latency, it is tricky to tell a robot to turn a given delta angle... for us in this demo it was open loop, with no gyro and no encoders: all turning was "timed" using a curve-fitted polynomial that was empirically tuned at 90, 45, 22.5 degrees, etc.... once again, trying to work with the minimal number of sensors. It's funny how many ways there are to solve the same problem, but we didn't want this feature to have any overhead mechanically speaking, so we worked with what was otherwise available.
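The curve-fitted-polynomial approach looks roughly like this: collect (angle, turn time) pairs at a few calibration angles, fit a polynomial, and evaluate it for any requested delta angle. The calibration numbers below are hypothetical placeholders, not our tuned values:

```python
import numpy as np

# Hypothetical calibration points: requested turn angle (degrees) vs the
# empirically timed duration (seconds) that produced that turn open-loop.
angles = np.array([22.5, 45.0, 90.0, 180.0])
times = np.array([0.18, 0.32, 0.60, 1.15])

# Second-order least-squares fit of turn time as a function of angle.
coeffs = np.polyfit(angles, times, 2)

def turn_time(angle_deg):
    """Open-loop turn duration to command for a requested delta angle."""
    return float(np.polyval(coeffs, angle_deg))
```

Because it's open loop, anything between the calibration points is interpolated by the fit, and battery voltage or carpet changes would call for re-tuning the table.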
One other footnote: we have been using NetworkTables for all communication between the robot and driver station (e.g. sending ball tracking information), so this may introduce some additional latency, but it should be minimal. The good news is that we can reuse the same code for all network traffic, including all feedback of issued voltages, other sensor feedback, hotspot detection, autonomous tweaking, ball count to perform in autonomous, etc.
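Publishing the tracking results over NetworkTables boils down to flattening each detection into a few numeric entries. The key names and the commented pynetworktables calls below are assumptions for illustration, not our actual table layout:

```python
# Sketch of packaging one ball detection for NetworkTables publication.
# Table/key names ("BallTracking", "x", etc.) are invented for this example.

def tracking_entries(x_px, y_px, radius_px, frame_time_s):
    """Flatten one detection into the key/value pairs to publish."""
    return {
        "BallTracking/x": float(x_px),
        "BallTracking/y": float(y_px),
        "BallTracking/radius": float(radius_px),
        "BallTracking/timestamp": float(frame_time_s),
    }

# With pynetworktables (assumed API) the publish side would look like:
#   from networktables import NetworkTables
#   NetworkTables.initialize(server="10.TE.AM.2")  # placeholder robot address
#   table = NetworkTables.getTable("BallTracking")
#   for key, value in tracking_entries(320, 240, 18, 0.0).items():
#       table.putNumber(key.split("/")[-1], value)
```

Since every other subsystem (voltage feedback, hotspot detection, autonomous tweaks) rides the same table mechanism, adding the tracking feed is just a few more keys rather than a new transport.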