Looking for help/suggestions on a problem where we are stumped. We are using a Limelight 3 and have been looking at the accuracy of the pose data the Limelight reports from the AprilTags. We positioned the robot at two different positions, both straight on to the tag: one at about 100 cm and the other at about 290 cm. See the picture below for the details.
Our Limelight is mounted at the back of the robot, so we have the yaw set to 180 degrees. The camera is tilted so the pitch is 40 degrees and the roll is 0 degrees. The Limelight's position is 16 cm up, -35.5 cm back, and 0 cm across.
EDIT: Yes, the settings in the web UI are 0.16 and -0.355 (meters).
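In case it helps anyone reproduce our setup, here is a minimal sketch of the equivalent camera-pose configuration in robot code (we normally just set these in the web UI; this assumes the standard LimelightHelpers class and that I have the parameter order right, so treat it as illustrative only):

```java
import frc.robot.LimelightHelpers; // LimelightHelpers.java dropped into our robot project

public class VisionConfig {
    // Camera pose in robot space, matching our web UI settings:
    // forward = -0.355 m (camera is behind robot center), side = 0.0 m, up = 0.16 m,
    // roll = 0 deg, pitch = 40 deg, yaw = 180 deg (camera faces backward).
    public static void configureLimelight() {
        LimelightHelpers.setCameraPose_RobotSpace(
            "limelight", // camera name as configured in the web UI
            -0.355,      // forward (meters)
            0.0,         // side (meters)
            0.16,        // up (meters)
            0.0,         // roll (degrees)
            40.0,        // pitch (degrees)
            180.0        // yaw (degrees)
        );
    }
}
```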
Our driver station is set to blue, and we are sending the correct (0 degree) orientation offset to MegaTag 2 via the SetRobotOrientation() API. We provide the yaw value and zero for all of the other values. The robot is completely stationary.
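For reference, a minimal sketch of how we feed the orientation to MegaTag 2 each loop (again assuming the standard LimelightHelpers class; the gyro call is a placeholder for however you read your heading):

```java
// Called periodically. Yaw is field-relative degrees (blue-alliance origin);
// the yaw rate and all pitch/roll terms are sent as zero.
double robotYawDegrees = gyro.getYaw(); // placeholder for your heading source

LimelightHelpers.SetRobotOrientation(
    "limelight",      // camera name from the web UI
    robotYawDegrees,  // yaw (degrees)
    0.0,              // yaw rate (deg/s)
    0.0, 0.0,         // pitch, pitch rate
    0.0, 0.0          // roll, roll rate
);
```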
From the picture below you can see that the original MegaTag pose is OK at 300 cm, but is off by more than I would expect (9.7 cm) at 100 cm. In both cases the MegaTag 2 pose is off by a significant amount.
Any ideas? I have read somewhere that the straight-on case is harder to solve, and we will try at an angle tomorrow, but it seems that while sitting completely still the values should be more accurate than what we are seeing.
I noted that the limit on camera roll is +/- 44 degrees; is there any limit on pitch?
We have tried swapping the Limelight for a different one, with the same results.
We have tried changing the frame of reference so that the camera looks like it is on the front of the robot.
Note: we are reading these values from the Limelight web interface, not from the robot pose after the vision samples are fed back into the swerve drive odometry.
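If it matters, this is roughly how we would read the same estimates in robot code instead of the web UI (assuming the standard LimelightHelpers class), just to rule out the dashboard display:

```java
// MegaTag 1 (tags-only) estimate in the blue-alliance field frame.
LimelightHelpers.PoseEstimate mt1 =
    LimelightHelpers.getBotPoseEstimate_wpiBlue("limelight");

// MegaTag 2 (orientation-assisted) estimate; requires SetRobotOrientation()
// to have been called with the current robot yaw.
LimelightHelpers.PoseEstimate mt2 =
    LimelightHelpers.getBotPoseEstimate_wpiBlue_MegaTag2("limelight");

if (mt1 != null && mt2 != null && mt2.tagCount > 0) {
    System.out.printf("MT1: %s  MT2: %s%n", mt1.pose, mt2.pose);
}
```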
Thanks
Butch Griffin
Error Code Xero, 1425