Denz, the math and the algorithms show there are several different things that affect accuracy. The camera can be mounted at the maximum robot height and still give plenty of accuracy if the math is understood and a few undocumented values are known. Whichever part of the vertical vs. distance accuracy you are referring to, here is a list of things to consider:
- Height of the camera center: max allowed is near 60".
- Height to the center of the target: see the rules. It is 130".
- Degrees per servo step: measure this experimentally. You can derive a simple formula to convert pwm values to degrees (see the sketch after this list).
- Camera sensor size in pixels: the vertical resolution is given by #define IMAGE_HEIGHT, which is 240 pixels.
- Camera field of view in degrees: undocumented, but it can be measured experimentally. People have reported 34-36 degrees vertically; ours measured 35 degrees.
- Target centroid x,y in pixels: undocumented. These are the mx, my values in the T_Packet_Data_Type struct.
- "allowable error" in pixels. This is in the #define TILT_ALLOWABLE_ERROR_DEFAULT and the default is 6 pixels. You can read tracking.h and tracking.c to see what this does. The camera stops moving when the target center is within that many pixels of the camera center.
The allowable error can be reduced so that the camera settles with its center closer to the actual target center.
The camera tilt pwm values can be used for a coarse angle measurement, and the target centroid pixels can then be used for a fine correction; see the sketch below for how the two combine into a distance.
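
Putting it together, here is a minimal sketch of the range calculation, assuming the coarse_tilt_deg() and fine_tilt_deg() helpers from the earlier sketch and the heights listed above (the helper names and exact constants are my assumptions, not part of the default code):

#include <math.h>

#define CAMERA_HEIGHT_IN   60.0    /* inches, camera lens center above the floor */
#define TARGET_HEIGHT_IN  130.0    /* inches, target center above the floor      */
#define DEG_TO_RAD         (3.14159265 / 180.0)

/* Horizontal distance to the target, in inches, from the tilt pwm value and
 * the target centroid y (my) reported in the T_Packet_Data_Type struct. */
float distance_to_target_in(unsigned char tilt_pwm, unsigned char my)
{
    /* total elevation angle = coarse servo angle + fine pixel correction */
    float elevation_rad = (coarse_tilt_deg(tilt_pwm) + fine_tilt_deg(my)) * DEG_TO_RAD;

    /* a near-horizontal angle would blow up the tangent; report "no range" */
    if (elevation_rad < 0.01)
        return -1.0;

    return (TARGET_HEIGHT_IN - CAMERA_HEIGHT_IN) / tan(elevation_rad);
}

In practice you would only trust the result once the allowable-error test says the camera has stopped moving and settled on the target.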