First, a fundamental concept: with DC motors, a robot loses acceleration as the motor approaches its maximum loaded speed. We can calculate this maximum loaded speed, but strictly speaking, certain designs never reach it on an FRC field. Also remember that since we’re estimating friction in the drive train, the maximum loaded speed could be off a bit; some physical drive train builds are great at reducing friction, and others not so much. So as a compromise between strictly detecting acceleration = 0 ft/s^2 and the realities of designs fielded by high-performing teams, I created a threshold for determining when a robot is at max speed.
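To illustrate the concept, here’s a minimal sketch of why acceleration decays toward zero: a permanent-magnet DC motor’s torque falls off roughly linearly with speed. This is not the calculator’s exact model, and every number below (the `a0` and `v_max` values) is a made-up placeholder, not a value from any real design.

```python
def accel(v, a0=30.0, v_max=15.9):
    """Acceleration (ft/s^2) at speed v (ft/s), assuming motor torque
    (and thus acceleration) falls linearly from a0 at rest to zero at
    the loaded max speed v_max. Both constants are placeholders."""
    return a0 * (1.0 - v / v_max)

# Euler-step the robot from a stop and watch acceleration shrink
# as speed approaches the loaded max speed.
v, dt = 0.0, 0.001
samples = []
for step in range(int(5.0 / dt)):        # simulate 5 seconds
    a = accel(v)
    v += a * dt
    if step % 1000 == 0:                 # record once per second
        samples.append((round(v, 2), round(a, 2)))

for speed, a in samples:
    print(f"v = {speed:6.2f} ft/s, a = {a:6.2f} ft/s^2")
```

Under this simple model the robot gets ever closer to its loaded max speed without quite reaching it, which is why a strict acceleration = 0 check never fires.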
The acceleration distance is the distance an initially-stopped robot travels before its acceleration (ft/s^2) drops below a threshold. By default, that threshold is 1% of the theoretical 12V free speed. In your example, the robot takes 15.6 feet to accelerate to the point where the total motor output torque generates less than 0.159 ft/s^2 of acceleration. There is a way to adjust this threshold, but as-is it seems like a good rule of thumb for the 2015-2019 games.
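The threshold logic above can be sketched like this. Again, this uses the same simplified linear-torque model rather than the calculator’s actual math, and `a0`/`v_free` are placeholder numbers, so the resulting distance won’t match the 15.6 ft figure from your example.

```python
def acceleration_distance(a0=30.0, v_free=15.9, threshold_frac=0.01, dt=1e-4):
    """Distance (ft) traveled from rest until acceleration drops below
    threshold_frac * v_free -- the 1% rule of thumb described above
    (e.g. 0.159 ft/s^2 for a 15.9 ft/s free speed).
    a0 and v_free are assumed placeholder values."""
    threshold = threshold_frac * v_free
    v = x = 0.0
    a = a0
    while a >= threshold:
        a = a0 * (1.0 - v / v_free)   # linear torque-speed assumption
        v += a * dt
        x += v * dt
    return x

d = acceleration_distance()
print(f"acceleration distance ~ {d:.1f} ft")
```

Tightening the threshold (a smaller `threshold_frac`) makes the robot travel farther before it is declared “at max speed,” which is the knob the adjustable threshold exposes.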
The “Under-Geared” flag is just a warning that the robot will likely take longer to accelerate to full speed than the designer intends (in your example, longer than the 16-foot Sprint Distance input). For a team that values good throttle response and tends to gear for specific periods of a match rather than an overall speed (like mine), keeping acceleration distance well under sprint distance is a good thing.
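The flag itself reduces to a simple comparison. This is my reading of the described behavior, not the tool’s exact source:

```python
def under_geared(accel_distance_ft, sprint_distance_ft):
    """Flag a design whose acceleration distance exceeds the sprint
    distance, i.e. the robot is still accelerating at the end of the
    sprint the designer cares about."""
    return accel_distance_ft > sprint_distance_ft

# The example from the post: 15.6 ft of acceleration distance against a
# 16 ft sprint -- not flagged, but with very little margin to spare.
print(under_geared(15.6, 16.0))   # False
```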
On the right-hand side, the middle chart shows the impact that changing the gear ratio has on acceleration distance and cruise distance within a single sprint. This kind of math helped my team figure out how to adjust gearing for a meta-game we didn’t expect in 2018, so I’m piloting that chart publicly this year in the hopes that it helps more teams.
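A crude sketch of the idea behind that chart: for each candidate reduction ratio, split a fixed sprint into the distance spent accelerating and the distance spent cruising at (effectively) max speed. The linear-torque model, the assumption that ratio scales torque up and top speed down proportionally, and all constants here are placeholders, not the calculator’s real gearing math.

```python
def sprint_split(ratio, sprint=16.0, base_a0=15.0, base_vmax=31.8,
                 threshold_frac=0.01, dt=1e-4):
    """Return (accel_distance, cruise_distance) in feet for a given
    reduction ratio. Assumption: a higher ratio multiplies wheel torque
    (so a0 goes up) and divides top speed (so v_max goes down)."""
    a0, v_max = base_a0 * ratio, base_vmax / ratio
    threshold = threshold_frac * v_max    # same 1% rule as above
    v = x = 0.0
    a = a0
    while a >= threshold and x < sprint:
        a = a0 * (1.0 - v / v_max)
        v += a * dt
        x += v * dt
    accel_d = min(x, sprint)
    return accel_d, sprint - accel_d

for ratio in (1.0, 1.5, 2.0, 3.0):
    a_d, c_d = sprint_split(ratio)
    print(f"ratio {ratio:>4}: accel {a_d:5.2f} ft, cruise {c_d:5.2f} ft")
```

Even with toy numbers, the trend matches the intuition in the post: gearing lower (a bigger ratio) shortens the acceleration distance and leaves more of the sprint spent cruising, at the cost of a lower top speed.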