Speed loss due to internal friction

Does anyone have any general rules they follow when choosing the gear ratios for their drive system? What I am specifically interested in is how teams account for internal friction in the robot. For example, the specs say the drill motor output shaft turns at 300 RPM in low gear. I can gear this down to a wheel speed that works out to, say, 6 ft/sec. However, how much slower will the robot actually move once it's on the ground with all sorts of frictional forces acting on it? I know this depends a lot on bearings, chain tension, and design, but are there any general approximations teams use when making these determinations?
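To make the numbers concrete, here is the back-of-the-envelope gearing math I'm doing (a rough sketch; the 6-inch wheel diameter is just an assumed example, not a spec):

```python
import math

# Back-of-the-envelope gearing math (wheel diameter is an assumed example).
motor_rpm = 300.0        # drill motor output, low gear (free speed)
wheel_diameter_in = 6.0  # assumed wheel diameter in inches
target_speed_fps = 6.0   # desired ground speed in ft/sec

wheel_circumference_ft = math.pi * wheel_diameter_in / 12.0   # ft traveled per wheel rev
wheel_rpm = target_speed_fps * 60.0 / wheel_circumference_ft  # wheel RPM needed
reduction = motor_rpm / wheel_rpm                             # motor:wheel gear ratio

print(f"Wheel must turn at {wheel_rpm:.0f} RPM -> ~{reduction:.2f}:1 reduction")
```

The question is how far below that theoretical 6 ft/sec the real robot ends up.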

Thanks
Patrick

Come up with a value for the torque requirement, i.e., the load your robot presents. (If necessary, pull 130 pounds of potatoes on a mock-up of your wheel system and measure it.)

By comparing the motor's free speed with the geared-down output speed (multiplied back up through the gear ratio to the equivalent motor speed), you can see how much "friction" the gearbox itself represents.

Then, with the speed-torque curve for the motor in question in front of you, enter on the torque axis, go up to the curve, and from that point move horizontally to the left and read the speed off the speed axis. :wink:
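If it helps, here's that lookup as a rough sketch in code, assuming the usual linear speed-torque approximation for a DC motor (the stall torque, load torque, and friction numbers below are placeholders for illustration, not real drill-motor specs):

```python
# Rough sketch of the speed-torque lookup, assuming a linear motor curve.
# All numeric values are placeholders, not actual drill-motor specs.

free_speed_rpm = 300.0    # motor free speed in low gear
stall_torque = 50.0       # stall torque (same units as the torques below)
load_torque = 12.0        # torque the robot's load reflects back to the motor
friction_torque = 3.0     # extra torque eaten by gearbox, bearings, chain

def speed_at_torque(torque, free_speed=free_speed_rpm, stall=stall_torque):
    """Linear speed-torque curve: speed falls from free speed at zero torque
    down to zero at stall torque."""
    return max(0.0, free_speed * (1.0 - torque / stall))

ideal_rpm = speed_at_torque(load_torque)
actual_rpm = speed_at_torque(load_torque + friction_torque)

print(f"Speed under load alone:     {ideal_rpm:.0f} RPM")
print(f"Speed with friction added:  {actual_rpm:.0f} RPM")
print(f"Speed loss due to friction: {100 * (1 - actual_rpm / ideal_rpm):.1f}%")
```

Plug in the torque you measured (potatoes and all) and the friction torque you backed out from the free-speed comparison, and the ratio of the two speeds gives you a percentage to knock off your theoretical 6 ft/sec.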