Does anyone have any general rules they follow when choosing gear ratios for their drive system? What I'm specifically interested in is how teams account for internal friction in the robot. For example, the specs say the drill motor's output shaft turns at 300 RPM in low gear. I can gear this down to a wheel speed that works out to, say, 6 ft/sec. But how much slower will the robot actually move once it's on the ground, with all sorts of frictional forces acting on it? I know this depends heavily on bearings, chain tension, and the overall design, but are there any general approximations teams use when making these determinations?
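For concreteness, here's a rough sketch (in Python) of the back-of-the-envelope calculation I mean. The parameter names and the 85% efficiency number are just placeholders I made up; that fudge factor standing in for all the friction losses is exactly the thing I'm asking how teams estimate:

```python
import math

def ground_speed_ft_per_s(motor_rpm, gear_reduction, wheel_dia_in, efficiency=0.85):
    """Estimate ground speed from drivetrain parameters.

    motor_rpm      -- motor output shaft speed (e.g. 300 RPM, drill motor in low gear)
    gear_reduction -- total reduction from motor shaft to wheel (>1 slows the wheel)
    wheel_dia_in   -- wheel diameter in inches
    efficiency     -- assumed fraction of free speed actually reached on the floor;
                      0.85 is a placeholder guess, not a measured value
    """
    wheel_rpm = motor_rpm / gear_reduction
    circumference_ft = math.pi * wheel_dia_in / 12.0   # inches -> feet
    free_speed = wheel_rpm * circumference_ft / 60.0   # ft/min -> ft/s
    return free_speed * efficiency

# Example: 300 RPM motor, 1.3:1 reduction, 6 in wheels
print(ground_speed_ft_per_s(300, 1.3, 6))  # ~5.1 ft/s with the assumed 85% factor
```

Is a flat derating factor like that reasonable, or do teams model the losses some other way?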
Thanks
Patrick