Quote:
Originally Posted by asid61
One thing to note is that acceleration apparently barely changes from 10fps to 20fps. One parent on our team made me a spreadsheet (with graphs) that detailed acceleration given motor specs and robot weight, although it did not include friction in the calculations. It showed that for speeds up to ~30fps the distance vs. time was almost the same. By going at lower speeds you would get an advantage on the order of a few inches.
What model for acceleration are you using? What factors does it take into account? Are these results backed up by empirical data? There are models available for acceleration in FRC, but they are just that: models. There isn't a difference "on the order of a few inches" in acceleration between 10 and 20 FPS, unless you perhaps mean the difference in position at the end of a 40 ft sprint (and if the difference is only a few inches there, the 20 FPS drive is extremely inefficient and performs worse at close range).
I've made the mistake before of using spreadsheets to design drivetrains without really understanding what the values on the screen meant. I'm not saying that's what you're doing here, but that model seems to conflict with my empirical data so I'm curious how it works.
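For what it's worth, here's a minimal sketch of the kind of frictionless model being discussed: a drivetrain whose tractive force falls off linearly from stall to free speed, integrated forward in time. All the numbers (150 lb robot, 400 lbf of stall force at a 10 FPS gearing) are made-up placeholders, not measured values; the only structural assumption is that with the same motors, halving the geared free speed roughly doubles the stall force.

```python
def sprint_time(distance_ft, free_speed_fps, stall_force_lbf, weight_lb):
    """Time to cover distance_ft from rest, frictionless permanent-magnet
    DC motor model: F(v) = F_stall * (1 - v / v_free).
    Simple Euler integration; units are ft, s, lbf, lb."""
    g = 32.174                     # ft/s^2
    mass_slugs = weight_lb / g     # convert weight (lbf) to mass (slugs)
    dt = 0.001
    x = v = t = 0.0
    while x < distance_ft:
        force = stall_force_lbf * max(0.0, 1.0 - v / free_speed_fps)
        v += (force / mass_slugs) * dt
        x += v * dt
        t += dt
    return t

# Hypothetical drivetrain: same motors, two gearings.
# Halving free speed (20 -> 10 FPS) roughly doubles stall force.
t_10fps = sprint_time(40, 10, 400, 150)
t_20fps = sprint_time(40, 20, 200, 150)
print(f"40 ft sprint: 10 FPS gearing {t_10fps:.2f} s, "
      f"20 FPS gearing {t_20fps:.2f} s")
```

Even this crude no-friction model shows the 20 FPS gearing winning a 40 ft sprint by a couple of seconds, not a few inches, while the 10 FPS gearing only wins at very short distances before it saturates at top speed. That's why I'd like to see how the spreadsheet arrives at such a small gap.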