I am frustrated by the nearly exclusive use of digital sensors in the kit -- sensors that output either 1 or 0. The Banner sensors are like that, and the IR sensors in the trackers are like that. Why not analog?
Some may think that "1" or "0" is a much cleaner, noise-free signal than an analog signal, but it is nearly information-free as well!
I know one team used constant tracker hunting to get a better idea of the direction of the IR, and that is a great solution given one-bit sensors. But if each tracker's IR sensor put out an analog value proportional to the "brightness" of the IR it was seeing, then small pointing errors could be read directly as the difference between the two sensors' values, without having to hunt back and forth with the servo. Integrate the sensor difference and apply it to the servo -- or something like that.
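Something like the following, a minimal sketch in C assuming two analog IR sensors on the tracker head. The functions read_left_ir(), read_right_ir(), and set_servo() are hypothetical stand-ins for whatever analog I/O the kit would provide, stubbed here so the sketch compiles:

    /* Integrating tracker loop: steer the servo by accumulating the
     * left/right IR brightness difference. All hardware functions are
     * hypothetical placeholders, not real kit calls. */

    #include <stdio.h>

    static int read_left_ir(void)  { return 512; }  /* 0..1023 brightness */
    static int read_right_ir(void) { return 500; }
    static void set_servo(int position) { printf("servo -> %d\n", position); }

    int main(void)
    {
        int servo_pos = 512;              /* start at mid-travel        */
        const int K = 8;                  /* integration gain divisor   */

        for (int i = 0; i < 100; i++) {   /* would be while(1) on a bot */
            int error = read_right_ir() - read_left_ir();
            servo_pos += error / K;       /* integrate the difference   */
            if (servo_pos < 0)    servo_pos = 0;
            if (servo_pos > 1023) servo_pos = 1023;
            set_servo(servo_pos);
        }
        return 0;
    }

When both sensors see the same brightness the error is zero and the servo holds still; any imbalance nudges the servo toward the beacon until the readings match again -- no hunting needed.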
If two analog line sensors were pointed, "defocused", at the left and right edges of the line, so that each output was a measure of how far off its edge the sensor had drifted, then, again, the small error signal would be the difference of the two outputs. Negative means "too far left", positive means "too far right", and zero means right on.
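Here is what that error computation might look like. read_left_edge() and read_right_edge() are hypothetical placeholders for the analog reads, stubbed so the example compiles, and which subtraction order gives "negative = too far left" depends on how the sensors are mounted -- the convention below is just an assumption:

    /* Two-edge line sensor error: the signed difference of the two
     * defocused edge readings. Hardware reads are hypothetical stubs. */

    #include <stdio.h>

    static int read_left_edge(void)  { return 300; }  /* 0..1023 */
    static int read_right_edge(void) { return 340; }

    int main(void)
    {
        /* Sign convention per the post; flip the subtraction if the
         * mounting reverses it. */
        int error = read_left_edge() - read_right_edge();

        if (error < 0)
            printf("too far left (error %d)\n", error);
        else if (error > 0)
            printf("too far right (error %d)\n", error);
        else
            printf("right on\n");
        return 0;
    }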
Yes, the outputs would have some noise in them -- but filtering that out is part of the fun.
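For instance, a first-order low-pass filter (an exponential moving average) is about the simplest thing that would knock the noise down. read_sensor() below is a hypothetical placeholder, stubbed with synthetic noise so the sketch runs on its own:

    /* Minimal noise-filtering sketch: exponential moving average with
     * alpha = 1/4, done in integer math (note the truncation). */

    #include <stdio.h>
    #include <stdlib.h>

    static int read_sensor(void)
    {
        return 500 + (rand() % 41) - 20;  /* true value 500, +/-20 noise */
    }

    int main(void)
    {
        int filtered = read_sensor();     /* seed with the first reading */

        for (int i = 0; i < 20; i++) {
            int raw = read_sensor();
            filtered += (raw - filtered) / 4;  /* low-pass update */
            printf("raw %4d  filtered %4d\n", raw, filtered);
        }
        return 0;
    }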
I think part of the reason that the gyro approach works well is that it gives an analog measure of the error.
Comments anyone?