12-06-2016, 01:32
AustinSchuh
FRC #0971 (Spartan Robotics) #254 (The Cheesy Poofs)
Re: FRC971 Spartan Robotics 2016 Release Video

Quote:
Originally Posted by AirplaneWins
Could you explain your vision tracking process this year? I heard you guys used 2 cameras. And what coprocessor did you use, if any?
Sorry for the delayed response. Life got in the way of robots again.

As Travis said, we wanted to do stereo, but didn't get around to verifying that it worked well enough to start using the distance it reported. One side effect of the stereo cameras was that we didn't need the transforms required to compensate for a single camera that isn't centered. Our shooter didn't have any space above or below the ball for a camera: the bottom of the shooter rested on the bellypan, and the top just cleared the low bar.

We did the shape detection on the Jetson TK1 and passed a list of detected U shapes back to the roboRIO over UDP in a protobuf, including the coordinates of the 4 corners from each camera. We found that we didn't need color thresholding, just intensity thresholding followed by shape detection. This ran at 20 Hz at 1280x1024 (I think), all on the CPU. The roboRIO then matched up the targets between the two cameras based on the angle of the bottom of the U.
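To make the matching step concrete, here's a minimal sketch of pairing up detections from the two cameras by the angle of the U's bottom edge. All names, the corner ordering (bottom-left, bottom-right first), and the greedy nearest-angle strategy are my assumptions for illustration, not 971's actual code:

```python
import math

def bottom_angle(corners):
    """Angle (radians) of the line through the two bottom corners.

    Assumes corners are ordered (bottom-left, bottom-right, top-right,
    top-left) as (x, y) tuples.
    """
    (x0, y0), (x1, y1) = corners[0], corners[1]
    return math.atan2(y1 - y0, x1 - x0)

def match_targets(left_targets, right_targets, max_angle_diff=0.1):
    """Greedily pair left/right camera detections whose bottom edges
    have similar angles; detections with no close match are dropped."""
    pairs = []
    remaining = list(right_targets)
    for lt in left_targets:
        la = bottom_angle(lt)
        best = min(remaining,
                   key=lambda rt: abs(bottom_angle(rt) - la),
                   default=None)
        if best is not None and abs(bottom_angle(best) - la) < max_angle_diff:
            pairs.append((lt, best))
            remaining.remove(best)
    return pairs
```

Pairing by bottom-edge angle is a cheap way to associate the same physical target across two views without needing a full epipolar match.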

We were very careful to record timestamps through the whole system: the timestamp at which v4l2 reported the image was received by the kernel, the timestamp at which it was received by userspace on the Jetson, the timestamp at which it was sent to the roboRIO, and the timestamp at which the processed result was received on the roboRIO. That let us back out the projected time the image was captured on the Jetson, in the roboRIO's clock, to within a couple ms. We also saved all the gyro headings over the last second and the times at which they were measured, and used those two pieces of data to interpolate the heading at the moment the image was taken, and therefore the current heading of the target. This, along with our well-tuned drivetrain control loops, let us stabilize on the target very quickly.
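The latency-compensation idea above can be sketched in a few lines. This is my own simplified illustration, not 971's code: the clock offset here is estimated from a single send/receive pair (ignoring network transit time), and the heading buffer is linearly interpolated at the projected capture time:

```python
import bisect

def project_capture_time(capture_jetson, send_jetson, receive_roborio):
    """Project a Jetson-clock capture timestamp into the roboRIO clock.

    Simplifying assumption: network transit time is negligible, so the
    clock offset is approximately receive_roborio - send_jetson.
    """
    offset = receive_roborio - send_jetson
    return capture_jetson + offset

def interpolate_heading(samples, t):
    """Linearly interpolate a gyro heading at time t.

    samples: list of (time, heading) pairs sorted by time, e.g. the
    last second of gyro readings. Clamps outside the sampled range.
    """
    times = [s[0] for s in samples]
    i = bisect.bisect_left(times, t)
    if i == 0:
        return samples[0][1]
    if i == len(samples):
        return samples[-1][1]
    (t0, h0), (t1, h1) = samples[i - 1], samples[i]
    return h0 + (h1 - h0) * (t - t0) / (t1 - t0)
```

With the capture time projected into the roboRIO clock, the interpolated heading gives the robot's orientation at the instant the image was actually taken, which is what makes the target heading usable despite the processing latency.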

Ask any follow-on questions that you need.