Re: 30fps Vision Tracking on the RoboRIO without Coprocessor
No one knows what vision processing will be needed in the future. For this year we found that feeding the results of processing directly into a control loop did not work well. Instead, we take a picture and calculate the degrees of offset from the target, then use this offset and the IMU to rotate the robot. We take another frame and check that we are on target; if not, rotate and check again. If on target, shoot (a rough sketch of this loop is at the end of this post). We did not need a high frame rate and it worked very well. I'll note that our biggest problem was not the vision but the control loop to rotate the bot. There was a thread on this earlier.

We hosted MAR Vision Day this past weekend. It has become very apparent that most teams are struggling with vision. While it's nice to see work like this, I would like to see more of an effort to bring vision to the masses. GRIP helped a lot this year.
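For anyone trying to picture the aim-then-verify approach described above, here is a minimal Java sketch. The helper methods (getVisionOffsetDegrees, rotateDegrees, shoot), the tolerance, and the retry count are all assumptions for illustration, not real WPILib calls or our actual code; wire them up to your own GRIP pipeline and drivetrain.

```java
// Minimal sketch of the take-a-frame, rotate, re-check, shoot loop.
// Hypothetical helpers: getVisionOffsetDegrees() = one processed frame returning
// degrees off target; rotateDegrees() = IMU-closed turn by that amount; shoot().
public class AimAndShoot {
    private static final double TOLERANCE_DEG = 1.0; // "on target" threshold (assumed)
    private static final int MAX_ATTEMPTS = 5;       // give up after a few corrections

    public boolean aimAndShoot() {
        for (int attempt = 0; attempt < MAX_ATTEMPTS; attempt++) {
            double offset = getVisionOffsetDegrees(); // one frame: degrees of offset from target
            if (Math.abs(offset) <= TOLERANCE_DEG) {
                shoot();                              // on target: fire
                return true;
            }
            rotateDegrees(offset);                    // rotate the robot using the IMU
            // loop back: take another frame and check again
        }
        return false; // never converged on the target
    }

    // --- hypothetical robot interfaces, replace with your own implementations ---
    private double getVisionOffsetDegrees() { return 0.0; }
    private void rotateDegrees(double degrees) { }
    private void shoot() { }
}
```

Because each correction waits for a fresh frame and a completed IMU turn, the camera frame rate matters far less than the quality of the rotate-to-angle control loop, which matches our experience.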