23-01-2002, 21:01
patrickrd
Registered User
AKA: Patrick Dingle
no team
Team Role: Engineer
 
Join Date: May 2001
Rookie Year: 1999
Location: Medford, MA
Posts: 349
Speed loss due to internal friction

Does anyone have general rules they follow when choosing the gear ratios for their drive system? Specifically, I'm interested in how teams account for internal friction in the robot. For example, the specs say the drill motor output shaft turns at 300 RPM in low gear. I can gear this to a wheel so that the theoretical ground speed comes out to, say, 6 ft/sec. But how much slower will the robot actually move once it's on the ground, with all the frictional forces acting on it? I know this depends heavily on bearings, chain tension, and overall design, but are there any general approximations teams use when making these determinations?
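To make the question concrete, here is a rough sketch of the calculation I'm describing (Python). The wheel diameter, the gear reduction, and especially the 85% efficiency figure are placeholder assumptions; that efficiency number is exactly the kind of rule-of-thumb value I'm asking about.

[code]
import math

# Known from the motor specs:
MOTOR_RPM = 300.0          # drill motor output shaft, low gear

# Placeholder assumptions for the sake of the example:
WHEEL_DIAMETER_IN = 6.0    # assumed wheel diameter, inches
GEAR_REDUCTION = 1.31      # assumed motor-to-wheel reduction, chosen so the
                           # theoretical speed comes out near 6 ft/sec

wheel_rpm = MOTOR_RPM / GEAR_REDUCTION

# Circumference in feet times wheel revolutions per second gives ft/sec.
theoretical_fps = (math.pi * WHEEL_DIAMETER_IN / 12.0) * (wheel_rpm / 60.0)

# The open question: what fraction of the theoretical speed survives
# bearing drag, chain friction, etc.? 0.85 here is a pure guess.
DRIVETRAIN_EFFICIENCY = 0.85
actual_fps = theoretical_fps * DRIVETRAIN_EFFICIENCY

print(f"theoretical speed: {theoretical_fps:.2f} ft/sec")
print(f"with {DRIVETRAIN_EFFICIENCY:.0%} efficiency: {actual_fps:.2f} ft/sec")
[/code]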

Thanks
Patrick
__________________
Systems Engineer - Kiva Systems, Woburn MA
Alumni, Former Mechanical Team Leader - Cornell University Robocup - 1999, 2000, 2002, 2003 World Champions
Founder - Team 639 - Ithaca High School / Cornell University
Alumni - Team 190 - Mass Academy / WPI