#14   12-02-2012, 09:29
tickspe15
Purdue University
AKA: Spencer Tickman
FRC #1747 (Harrison Boiler Robotics)
Team Role: Mentor
 
Join Date: Nov 2011
Rookie Year: 2009
Location: Issaquah, Washington
Posts: 252
Re: Team 3142: Week 5 preview

Quote:
Originally Posted by oswaldonfire
Yep, the Kinect works - if you look closely, you can see how we put wax paper over the infrared laser projector, effectively blurring the light into a homogeneous field and taking advantage of the Kinect's infrared camera. When coupled with the retroreflective tape on the targets, it gives us a tracking system that is completely immune to changes in visible light. The Kinect is connected to an onboard computer, which does a huge amount of image processing to send a distance value (accurate to the inch) and information on how to move the turret (preliminary testing shows <1 degree accuracy) to the cRIO.

In addition to running the Kinect, the onboard computer processes a feed from a second webcam which is pointed down at the field in front of the robot (not attached in this picture) and sends an augmented-reality video feed back to the driver station, highlighting the closest ball in green (or any other color) and overlaying information to help the driver line the robot up with the ball to pick it up.
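Before getting into our setup, here is roughly what that kind of Kinect pipeline can look like on the onboard computer. This is only a sketch, not 3142's actual code: the threshold, field of view, and target width are placeholder numbers, and it assumes the IR frame arrives as a single-channel 8-bit image with the retroreflective tape showing up as the brightest blob.

Code:
import math
import cv2

IMAGE_WIDTH_PX = 640
FOV_DEG = 57.0            # placeholder horizontal field of view
TARGET_WIDTH_IN = 24.0    # placeholder real-world width of the vision target

# Pinhole-camera focal length in pixels, derived from the field of view.
FOCAL_PX = (IMAGE_WIDTH_PX / 2.0) / math.tan(math.radians(FOV_DEG / 2.0))

def find_target(ir_frame):
    """Return (distance_in, azimuth_deg) for the brightest blob, or None."""
    # Retroreflective tape lit by the diffused IR projector is far brighter
    # than the background, so a fixed threshold isolates it.
    _, mask = cv2.threshold(ir_frame, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    target = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(target)

    # Apparent width shrinks linearly with distance (pinhole model).
    distance_in = TARGET_WIDTH_IN * FOCAL_PX / w

    # Horizontal offset of the blob center from the image center,
    # converted to the angle the turret would need to rotate.
    center_offset_px = (x + w / 2.0) - IMAGE_WIDTH_PX / 2.0
    azimuth_deg = math.degrees(math.atan2(center_offset_px, FOCAL_PX))
    return distance_in, azimuth_deg

The distance and angle would then get shipped to the cRIO however you like (NetworkTables, a raw socket, etc.); that part is left out here.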
We (1318) are doing a very similar thing, but with Axis cameras instead of a Kinect, because we did not want to run Windows on our onboard computer and Microsoft has rules about using the Kinect with non-Windows devices.
We are also using a PICO-ITX P830 as the onboard computer.
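For anyone curious how the Axis feed gets onto a non-Windows coprocessor, here is a minimal sketch using OpenCV. The IP address and MJPEG path are placeholders, not our actual configuration, and it assumes OpenCV was built with FFmpeg support so it can open network streams.

Code:
import cv2

# Placeholder address; Axis cameras expose an MJPEG stream over HTTP,
# but check your camera's IP and stream settings.
STREAM_URL = "http://10.13.18.11/mjpg/video.mjpg"

cap = cv2.VideoCapture(STREAM_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # frame is an ordinary BGR image, so a threshold/contour pass like the
    # Kinect example above can run here instead.
    # ... process frame, then send results to the cRIO ...
cap.release()

From there, the image processing side is the same idea as the Kinect version.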