Our team is trying to calculate the distance between our Limelight camera and the vision target on the outer port, following the method described in the Limelight documentation (https://docs.limelightvision.io/en/latest/cs_estimating_distance.html).
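For reference, here is a minimal sketch of the calculation we are doing, following the fixed-camera-angle formula from that page, d = (h2 - h1) / tan(a1 + a2). The mount angle and heights below are placeholder values for illustration, not our actual measurements:

```python
import math

def estimate_distance(ty_deg, mount_angle_deg, camera_height_m, target_height_m):
    """Horizontal distance to the target from the reported vertical angle ty.

    Implements d = (h2 - h1) / tan(a1 + a2), where a1 is the fixed camera
    mount angle above horizontal and a2 is the Limelight's ty reading.
    """
    angle = math.radians(mount_angle_deg + ty_deg)
    return (target_height_m - camera_height_m) / math.tan(angle)

# Placeholder numbers (not our real setup): camera 0.60 m up at 25 degrees,
# target center 2.30 m up, Limelight reporting ty = 10 degrees.
d = estimate_distance(ty_deg=10.0, mount_angle_deg=25.0,
                      camera_height_m=0.60, target_height_m=2.30)
```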
Our Limelight camera is fixed to the rim of a turret, and we rotate the rim while the turret body stays stationary. A side view of our setup matches the diagram in the Limelight documentation above. A top-down view of our setup looks like this (not drawn to scale):
However, we’ve found significant differences between the distances we measured and the distances calculated with the method above. Does anyone have any idea why this might be happening?
Here’s our testing data. This data was taken with the center of the turret offset by about 10 degrees from the normal line of the vision target wall: https://docs.google.com/spreadsheets/d/1s3X6dH46xIil8N7J8F060AYJkjOUdWufw4LJfr8xBcw/edit?usp=sharing
We expected the calculated distance to be at a minimum at tx = 0 and to increase as tx moves away from 0. Instead, our data shows ty increasing as tx moves away from 0, which causes the calculated distance to decrease rather than increase.
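One geometric effect we suspect might explain the direction of this, assuming a simple pinhole model (this is our hypothesis, not something confirmed from the docs): tx and ty are reported as independent angles, so for a target off-center horizontally, the direction to the target is roughly (tan(tx), tan(ty), 1), and the true elevation above the camera's horizontal plane satisfies tan(elev) = tan(ty) * cos(tx). Raw ty would then overstate the elevation whenever tx != 0, which matches ty appearing to grow as tx moves away from 0:

```python
import math

def corrected_elevation_deg(tx_deg, ty_deg):
    """True elevation of the target ray above the camera's horizontal plane.

    Assumes a pinhole model where the target direction in camera coordinates
    is (tan(tx), tan(ty), 1); then tan(elevation) = tan(ty) * cos(tx).
    This correction is our assumption, not an official Limelight formula.
    """
    tx = math.radians(tx_deg)
    ty = math.radians(ty_deg)
    return math.degrees(math.atan(math.tan(ty) * math.cos(tx)))
```

If this model is right, feeding the corrected elevation into the distance formula in place of raw ty should remove the dependence on tx; we have not yet verified this against our spreadsheet data.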
Is this expected behavior? If so, is there a way to account for it in our calculations so we can find the distance accurately?