Quote:
Originally Posted by Mark McLeod
There's a body of anecdotal randomness...
Engineers measure things...
I'd have to disagree with you semantically here, Mark. If team A loses comms or their cRIO resets during match 1 at a certain location, that's a data point. If they don't during match 2 at the same location, that's another data point. If team B loses comms at a different location, that's another data point... specific events and their frequency of occurrence are data. Pass/fail tests are data as well - "it always worked when we did this." The problem is that engineers tend to ignore things that can't be quantified (qualitative data). If a jet engine on the test stand starts spitting parts out the back, it doesn't really matter whether the sensors indicated something went wrong or not - the evidence speaks for itself.
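To put it concretely: even simple pass/fail observations become usable data once you tally them. Here's a minimal sketch of the idea - the teams, matches, and field locations are made up purely for illustration:

```python
from collections import Counter

# Hypothetical log of observations: (team, match, field location, lost comms?)
observations = [
    ("A", 1, "far corner", True),
    ("A", 2, "far corner", False),
    ("B", 3, "near driver station", True),
    ("C", 4, "far corner", True),
]

# Frequency of comms loss by location -- this is data, even without a root cause
losses_by_location = Counter(loc for _, _, loc, lost in observations if lost)
print(losses_by_location)  # e.g. Counter({'far corner': 2, 'near driver station': 1})
```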
The FMS's job is to communicate with the robots, and hundreds or thousands of times per season that communication isn't happening. Are the vast majority of the problems robot issues? Probably, but multiple teams have done extensive troubleshooting and have no issues when they're not running through the FMS. If it's performing as designed, then it's designed wrong. If it has limitations, either they aren't being communicated well to the teams or no one is interested in looking for them.