I'm doing some tests on the reliability of the Driver Station. The first one logs the timing jitter of the DS packets.
Jitter is the uncontrolled variation in a timing source. The Driver Station sends a packet every 40 ms; if it did this with complete reliability, there would be zero jitter. Here's a graph of the time between consecutive packets as the robot receives them from the DS.

This test took place over about 5 hours and 30 minutes.
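For anyone who wants to reproduce the numbers outside LabVIEW, here's a minimal sketch of the calculation behind the graph: take the receive timestamps, compute the gap between consecutive packets, and compare each gap to the nominal 40 ms period. This is not the VI itself, just an illustration; the timestamp values are made up.

```python
# Minimal sketch (not the LabVIEW VI): given packet receive times in seconds,
# compute inter-arrival intervals and their deviation from the nominal 40 ms
# DS period. The example timestamps below are made up for illustration.
NOMINAL_PERIOD_MS = 40.0

def jitter_stats(receive_times_s):
    intervals_ms = [
        (b - a) * 1000.0 for a, b in zip(receive_times_s, receive_times_s[1:])
    ]
    deviations = [abs(dt - NOMINAL_PERIOD_MS) for dt in intervals_ms]
    return {
        "mean_interval_ms": sum(intervals_ms) / len(intervals_ms),
        "mean_jitter_ms": sum(deviations) / len(deviations),
        "max_jitter_ms": max(deviations),
    }

if __name__ == "__main__":
    # Mostly on-time packets, with one arriving 15 ms late.
    times = [0.000, 0.040, 0.081, 0.120, 0.175, 0.215]
    print(jitter_stats(times))
```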
The VI I'm using is a modified Robot Main, which you should be able to pull into any normal LabVIEW FRC Robot Project and run. It logs to a file called "jitter.txt" in the main directory of the cRIO. You can look at the data manually, or use the attached "extract jitter data.vi" to make a graph like the one above.
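If you'd rather process "jitter.txt" without LabVIEW, something like the sketch below would do the job. It assumes the log contains one inter-packet interval per line, in milliseconds; that format is a guess on my part, so adjust the parsing to match what the VI actually writes.

```python
# Rough stand-in for "extract jitter data.vi": read jitter.txt and plot a
# histogram of inter-packet intervals. Assumes one interval (ms) per line,
# which is an assumption about the log format, not a documented fact.
import matplotlib.pyplot as plt

def load_intervals(path="jitter.txt"):
    intervals = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                intervals.append(float(line))
    return intervals

if __name__ == "__main__":
    intervals = load_intervals()
    plt.hist(intervals, bins=100)
    plt.xlabel("Time between DS packets (ms)")
    plt.ylabel("Packet count")
    plt.title("DS packet inter-arrival times")
    plt.show()
```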
I'm running this test over a direct Ethernet connection to establish a baseline for a similar test over wireless, which will measure the reliability of the wireless link. Many of us have seen robots stop dead on the field, so I'm starting this testing to aid in troubleshooting and diagnostics. The issue is commonly blamed on the placement of the robot radio or the quality of the Ethernet cables, but I've seen no testing to back that up.
Here's a general idea of what I'm trying to answer:
- What is the reliability of each point of the control system, from the joystick to the Jaguar?
- What external factors affect the reliability of the control system?
- How can these factors be measured and controlled?