#7, 04-06-2013, 18:19
JamesTerm
Terminator
AKA: James Killian
FRC #3481 (Bronc Botz)
Team Role: Engineer
 
Join Date: May 2011
Rookie Year: 2010
Location: San Antonio, Texas
Posts: 298
Re: Using a Raspberry Pi for camera tracking

Quote:
Originally Posted by Aaron.Graeve View Post
I know it is possible and a good idea. Don't quote me on it but I believe 118 used the Raspberry Pi's enhanced I/O equivalent, a BeagleBone, for the vision tracking on their 2012 robot.
Here is what I know, having spoken with their developer about this:

They used a BeagleBoard (http://beagleboard.org/) running embedded Linux, with Ethernet and USB interfaces.

They fetched camera images with HTTP GET calls via the libcurl library, processed them with OpenCV, and finally sent UDP packets to the cRIO as input for the control loops.

One thing we didn't discuss, which I may want to talk about at some point, is the danger of sending UDP packets when the robot is not listening for them. These packets can flood the receive buffers and disrupt the TCP/IP stack, causing the driver station to lose its connection. The solution we tried for this issue is to open the listener immediately, on its own thread (task) that starts at power-up. This should work because the camera takes about 30 seconds to power on, which is well after the cRIO has powered up and started listening.

Oh, and we both use Wind River C++.