You Probably Geared Your Drivetrain Wrong: Definitive Proof

Well, maybe. Three events don’t really make a representative sample, and this only applies to this year’s game, but graphs!

@Tom_Boehm_329 and I were talking about drivetrain speeds (as we assume everyone does all summer long on here) and got to thinking: Zebra Dart data was collected for 3 events this season so far, so why not histogram every velocity of every robot at all those events and see where the percentiles lie for achieved velocity on the field? ~2 million data points later (and Tom’s vacation time), we did it. All speeds under 2 ft/s were lumped together and ignored for the graphs below; the rest were broken out into 0.5 ft/s buckets. Our reasoning was that standing still is boring*, so let’s measure robots when they’re actually trying to go somewhere.
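For anyone who wants to replicate the bucketing on their own Zebra exports, a minimal sketch of the scheme described above (2 ft/s floor, 0.5 ft/s buckets, nearest-rank percentiles); the velocities here are made up, not real Dart numbers:

```python
from collections import Counter
import math

def bucket_velocities(velocities_fps, floor=2.0, width=0.5):
    """Drop speeds under `floor`, then count the rest into `width` ft/s buckets."""
    moving = [v for v in velocities_fps if v >= floor]
    counts = Counter(math.floor(v / width) * width for v in moving)
    return moving, dict(sorted(counts.items()))

def percentile(sorted_vals, p):
    """Nearest-rank percentile of an already-sorted list."""
    k = max(0, math.ceil(p / 100 * len(sorted_vals)) - 1)
    return sorted_vals[k]

# Made-up example data, not real Zebra numbers
velocities = [0.1, 1.9, 2.3, 4.0, 5.5, 5.6, 7.2, 9.9, 11.4, 12.0]
moving, hist = bucket_velocities(velocities)
p99 = percentile(sorted(moving), 99)
```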





All compared

Ignore the chart title in this picture; it should be (All 2019), not (SBPLI1 2019).


  1. Even at an event like IRI, 99% of velocities over 2 ft/s were 12 ft/s or less.
  2. The percentage of data below 2 ft/s was 51.14% for IRI, 59% for SBPLI2, and 62.84% for SBPLI1, so less time was spent “standing still” at IRI. *Perhaps this is pick-and-place time, or climbing early?
  3. Past 12 ft/s, you are probably better served minimizing the time you spend standing still on pick or place than trying to push from the 99th percentile of velocities to the 99.9th.

This seems to agree with some of the common knowledge posted on here in the past, but it’s always fun to see data that backs it up.


I think it’s mainly this game, where the sprints are very short and defense requires you to switch directions a lot. I would guess that if you ran the numbers on a game such as 2017, 2013, 2008, or any other game where long sprints are frequent, the data will tell a different story.
That being said, of course teams should consider the game when gearing their drivetrain.


You didn’t happen to be geared at 12 ft/s did you?


It would be interesting to see the same histogram for a single team, say 2468.


While this is very interesting and well compiled data, I’m not sure this can be used to determine what the optimum speed was for the 2019 game. The optimum speed could have been way faster, but all you see here is what speed the robots were actually moving. If you are trying to quantify whether teams chose the correct top speeds for their drive trains, you could look at the maximum speed of each robot and percentage of time the robot was operating at max speed. I think you would find that there were a group of teams who were operating at their top speed for a high percentage of time, which would point to needing to gear a bit faster, and a group of teams who had a high top speed but only utilized it a small percentage of time, which would mean they probably got it about right.


What speed do you propose gearing?

To achieve 12 fps free speed, I’d wager teams are gearing anywhere from 14-19 fps based on their efficiencies.
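For reference, the usual back-of-napkin free-speed math. The motor specs, gear ratio, and the 85% efficiency factor below are illustrative assumptions, not measured values from any of these robots:

```python
import math

def free_speed_fps(motor_free_rpm, gear_ratio, wheel_diameter_in):
    """Theoretical no-load drivetrain speed in ft/s."""
    wheel_rpm = motor_free_rpm / gear_ratio
    wheel_circumference_ft = math.pi * wheel_diameter_in / 12
    return wheel_rpm * wheel_circumference_ft / 60

# Illustrative numbers: CIM-class motor, single-speed gearbox, 6" wheels
free = free_speed_fps(motor_free_rpm=5330, gear_ratio=10.71, wheel_diameter_in=6)
achieved = free * 0.85  # assumed ~85% of free speed actually reached on carpet
```

So a drivetrain geared for roughly 13 ft/s free speed lands near 11 ft/s actually achieved under that assumed efficiency, which is consistent with the 14-19 fps range quoted above for hitting 12 fps.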

What is the sampling method for velocity with Dart? If they’re looking at position over time with a coarse enough sample rate, you’re going to miss some peak achieved velocities, which would skew these numbers lower than what the robots actually hit.


How could this possibly be, assuming most robots were geared for an after-losses speed above 12 FPS (which is pretty common)?

I don’t think most teams know what their after-losses speeds are, and they likely overpredict them.

I wouldn’t be surprised if many teams geared far faster can’t break 12 fps

It’s not especially hard for any team with encoders and a reasonable estimate of their drive wheel diameter to determine this. I thought this was a fairly common practice among “top” teams, as that number is useful for planning purposes and future gearing decisions.
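Sketching what that measurement looks like in code (the tick counts, CPR, and wheel size here are hypothetical; on a real robot you’d log your encoder’s rate during a full-field sprint rather than hard-code samples):

```python
import math

def encoder_to_fps(ticks_per_100ms, ticks_per_rev, wheel_diameter_in):
    """Convert an encoder rate sample (ticks per 100 ms) to wheel surface speed in ft/s,
    assuming the encoder is on the wheel axle (1:1)."""
    wheel_revs_per_sec = ticks_per_100ms * 10 / ticks_per_rev
    return wheel_revs_per_sec * math.pi * wheel_diameter_in / 12

# Hypothetical rate log from a full-speed sprint (ticks per 100 ms), 256 CPR, 6" wheels
samples = [40, 120, 180, 205, 211, 212, 212, 211]
top_speed = max(encoder_to_fps(s, ticks_per_rev=256, wheel_diameter_in=6) for s in samples)
```

The max of the rate samples once the velocity curve flattens out is your after-losses top speed, which you can compare directly against your theoretical free speed.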

Agreed. What percentage of teams actually do that, though? 5%?


That is probably a good assumption, but the data here doesn’t prove that.

I agree completely with Adam here: what are you suggesting teams should gear at instead?

It’s entirely possible that, with otherwise identical robots, the one geared at 19fps could reach 12fps faster than the one geared at 14fps.

Similarly, if you look at the differential equations, you will see that it’s impossible to spend any large amount of time at 15+ fps given the size of an FRC field. But if long cycles are part of the game design, you may be leaving performance on the table by not gearing appropriately.

Can you explain how this is entirely possible?

Friction and the linear speed-torque relationship of motors.


Here’s an example of a velocity vs time graph for the situation I’m describing, though I’m not saying that this is the exact graph for 14fps and 19fps

When I get back to my personal computer, I can graph the solutions to the differential equations for some real robot models and gear ratios.
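In the meantime, here’s a quick numerical check of the claim under the simplest possible model: drive force falls linearly from stall to zero at free speed, no traction limit, and the same motors in both robots, so stall force scales inversely with the free-speed gearing. All numbers are in arbitrary normalized units; this is a sketch, not a validated drivetrain model:

```python
def time_to_reach(v_target, v_free, f_stall_over_m, dt=1e-4):
    """Euler-integrate dv/dt = (F_stall/m) * (1 - v/v_free) until v_target is reached."""
    v, t = 0.0, 0.0
    while v < v_target:
        v += f_stall_over_m * (1.0 - v / v_free) * dt
        t += dt
    return t

# Same motors, so stall force scales as 1/v_free. Normalize the 14 fps robot to F/m = 1.
t_14 = time_to_reach(12.0, 14.0, 1.0)
t_19 = time_to_reach(12.0, 19.0, 14.0 / 19.0)
# Closed form for comparison: t = (m * v_free / F_stall) * ln(v_free / (v_free - v_target))
```

Under these assumptions the 19 fps robot does reach 12 fps slightly sooner (about 25.7 vs 27.2 in these normalized time units): near 12 fps it still has meaningful torque left, while the 14 fps robot is crawling toward its asymptote. Per the traction-limit point raised in this thread, capping early acceleration at the friction limit would only favor the faster gearing further, since it erases the slower gearing’s low-speed force advantage.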


Those don’t appear to take into account that basically all FRC robots are traction-limited.

You are correct, though that only helps the robot geared for the faster speed, because the robot geared for the lower speed can’t fully take advantage of the low-end torque the motors can produce.

Dart data is based on 100 ms intervals, so this data is really “average measured velocity over 100 ms intervals”: any velocity held for less than 100 ms gets reported as a lower one. Based on this, increasing the granularity of the samples might capture higher peak velocities, but the overall percentages probably wouldn’t change much. I suspect hitting 17 ft/s for 10 ms a few times in a match wasn’t worth it for the vast majority of teams in this year’s game.
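To put a number on that: a 10 ms spike to 17 ft/s in the middle of a 12 ft/s cruise nearly vanishes once it’s averaged into 100 ms windows. A purely synthetic trace, just to show the mechanism:

```python
def window_average(samples, n):
    """Average consecutive groups of n samples (drops any ragged tail)."""
    return [sum(samples[i:i + n]) / n for i in range(0, len(samples) - n + 1, n)]

# Synthetic 10 ms velocity trace: steady 12 ft/s with one 10 ms spike to 17 ft/s
fine_trace = [12.0] * 9 + [17.0] + [12.0] * 10
coarse_trace = window_average(fine_trace, 10)  # 100 ms windows, like Dart
```

The fine-grained trace peaks at 17 ft/s, but the 100 ms averaged trace peaks at only 12.5 ft/s, so brief spikes barely move the reported histogram.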

This is true, and there are times where hitting 15 ft/s can be make-or-break, but given the velocities actually achieved, it seems like the majority of teams geared for a “real speed” over 12 ft/s were leaving torque/acceleration on the table, given how they ended up playing the game. It’s also fair to say I don’t know whether their speeds were driven by gearing decisions or by game play.

That’s true, but unfortunately we don’t have data on what each team was actually geared for. It might even be the case that teams geared for 14 only hit 12 for 99% of their driving, while gearing for 13 could pull the 99th percentile up to 12.5, or, through increased acceleration, change the shape of the graphs from smooth slopes to big spikes at max velocity.

Either way, I feel this reinforces the effort-vs-reward point in my 3rd takeaway. Trying to go faster than 12 ft/s takes a lot of consideration and design work, especially if you want to maintain low-end torque and acceleration. Even at IRI, 95% of speeds over 2 ft/s were still 10 ft/s or less for this year’s game. The way I look at it: for this year’s game, knowing the play style that developed, I could spend very little time on the drive system, find a motor/weight/gearing configuration that gets me ~12 ft/s, never think about it again, and be at 99% of what the best teams were doing in that category. I wish there were something that simple for scoring. If you stop worrying about gearing at that point, there is probably low-hanging fruit in scoring that takes less effort than squeezing another 0.5% out of your drivetrain.

@Tom_Boehm_329 because you are on vacation and I am not… jerk


Is this considered gearing wrong? I think 88 FPS is reasonable…


Speaking of “jerk” - well, not quite - how about trying some acceleration calcs and distributions from the data? It seems pretty well established that, for this field and game, sprints maxed out around 10-12 FPS, but were there any standouts for acceleration rates? And even though I’m sure the data gets messy quickly, I wonder if you can even see the relative deceleration rates for controlled stopping vs. hitting a field element vs. head-on “defense”.
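Differencing consecutive 100 ms velocity samples would give a (noisy) acceleration estimate; a sketch of that computation on a made-up sprint-then-collision trace:

```python
def accelerations(velocities_fps, dt=0.1):
    """Finite-difference acceleration (ft/s^2) from evenly spaced velocity samples."""
    return [(b - a) / dt for a, b in zip(velocities_fps, velocities_fps[1:])]

# Made-up trace sampled every 100 ms: accelerate, cruise, then an abrupt stop
v = [0.0, 1.5, 3.5, 6.0, 8.0, 9.5, 10.5, 11.0, 11.0, 4.0, 0.5]
a = accelerations(v)
peak_accel = max(a)  # hardest acceleration in the trace
peak_decel = min(a)  # hardest stop; a collision would show up here
```

A histogram of these values across all robots would answer the standout-accelerations question, and large negative outliers should separate collisions from controlled braking.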

Thanks for sharing this data. Very interesting reading.