Consider a standard tank-style drivetrain with an encoder mounted on each side along with a gyro. When developing a method for driving straight to a desired distance, a sensible approach is to use the encoders to track the distance traveled and the gyro to keep the robot on heading. Under this setup, is it best to use the min, max, or average of the two encoder values to represent your distance? Is there another option that is better than any of these?
Here are my thoughts:
Min: The loop doesn't finish until both encoders have reached the setpoint. Assuming the encoder readings closely track real-world distance, the faster side will have already passed the target by then, so this leads to an overshoot.
Max: The loop ends as soon as the first encoder reaches the setpoint (not accounting for coasting), so the slower side will likely still be short of the target. This leads to an undershoot.
Average: The loop ends when the mean of the two readings hits the setpoint, so either both sides are exactly on target or one is over and one is under. I have a hard time imagining what this implies in practice; it probably depends heavily on implementation details.
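For concreteness, here is a minimal sketch of the setup I'm describing, using the average of the two encoders for distance and a simple proportional term on the gyro for heading. All the names and gains (`kP_DIST`, `kP_HEADING`, `drivePowers`) are made up for illustration, not any particular library's API, and the gyro sign convention (positive = drifted clockwise) is an assumption:

```java
// Hypothetical sketch: drive-straight loop using the average of two
// encoder distances plus a gyro P term. Gains and sign conventions are
// illustrative assumptions, not a real library API.
public class DriveStraight {
    static final double kP_DIST = 0.05;    // proportional gain on distance error
    static final double kP_HEADING = 0.02; // proportional gain on heading error

    static double clamp(double x) {
        return Math.max(-1.0, Math.min(1.0, x));
    }

    // Returns {leftPower, rightPower} given the two encoder distances (same
    // units as target) and the gyro angle in degrees (0 = desired heading,
    // positive assumed to mean the robot has drifted clockwise).
    static double[] drivePowers(double leftDist, double rightDist,
                                double target, double gyroAngle) {
        double distance = (leftDist + rightDist) / 2.0; // average of both encoders
        double forward = clamp(kP_DIST * (target - distance)); // toward setpoint
        double turn = -kP_HEADING * gyroAngle; // steer back toward heading 0
        return new double[] { clamp(forward + turn), clamp(forward - turn) };
    }

    public static void main(String[] args) {
        // Robot has drifted clockwise (positive gyro angle): the left side
        // should get less power than the right to steer back on heading.
        double[] p = drivePowers(40.0, 44.0, 100.0, 3.0);
        System.out.printf("left=%.3f right=%.3f%n", p[0], p[1]);
    }
}
```

Note that with the heading correction active, the two sides are constantly being pushed back toward the same heading, which may change how much the min/max/average choice matters at the stopping condition.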
