  #5
23-08-2012, 14:26
Caboose
Programmer - LabVIEW, C++, C#, Java
AKA: James Parks
FRC #0900 (Zebracorns)
Team Role: Alumni
 
Join Date: Jan 2009
Rookie Year: 2008
Location: Seattle, WA
Posts: 72
Re: Who used Driver Station for Vision?

Quote:
Originally Posted by RyanCahoon
I'm pretty sure the robot control data doesn't take more than about 1KB per update, at around 50 Hz. This gives you 400 Kbps for robot control data.

To transmit a single frame of video at 320x240 resolution and 24-bit color takes 320*240*24 ≈ 1.8 Mb. Note that the Axis cameras use MJPEG compression as well, so this is a gross overestimate. For targeting purposes, given the network lag that's going to be inherent in the system, you shouldn't need more than 10 fps. Maybe 15 if you want smoother-looking video for display to your drivers.

That's still only 30 Mbps (even with uncompressed video).
Note the word ideally: 300 Mbps is the theoretical bandwidth for 802.11n with the FMS router and robot radio in channel-bonding mode (which, as far as I know, doesn't actually happen). Real throughput can drop below 130 Mbps, which would give each team ~21 Mbps or less. Also, to do nice detailed image processing my team used 640x480 resolution on two cameras in stereo, pushing us to ~14 megabits (7 Mb * 2) for both images at ~15 FPS, excluding network overhead. According to the FMS people in Raleigh, this was getting close to taxing the FMS even though we were the only team using vision on the field during that match.
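As a sanity check on the numbers in this thread, here's a minimal back-of-envelope sketch (my own illustration, not anything from FIRST). It assumes uncompressed 24-bit frames; MJPEG compression on the Axis cameras would shrink these figures considerably:

```python
# Back-of-envelope bandwidth math for FRC camera streams.
# Assumes raw (uncompressed) 24-bit-color frames; real MJPEG streams
# are much smaller, so these are worst-case numbers.

def frame_bits(width, height, bits_per_pixel=24):
    """Raw size of one uncompressed frame, in bits."""
    return width * height * bits_per_pixel

def stream_mbps(width, height, fps, cameras=1):
    """Uncompressed stream bandwidth in megabits per second."""
    return frame_bits(width, height) * fps * cameras / 1e6

# Single 320x240 camera at 15 fps (the quoted scenario):
print(round(stream_mbps(320, 240, 15), 1))   # ~27.6 Mbps, i.e. "only 30 Mbps"

# One raw 640x480 frame, in megabits (x2 for a stereo pair):
print(round(frame_bits(640, 480) / 1e6, 1))  # ~7.4 Mb per frame
```

Running these shows why a 640x480 stereo setup eats bandwidth so much faster than a single 320x240 feed: each frame is four times larger, and there are two of them.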
__________________
navX Labview Library

"Robots are aluminum shavings, held together by zip-ties."

myManga