How do I determine distance from camera/robot to AprilTag?

I have some code (Java) up and running that can detect AprilTags via a USB Camera connected to the RoboRIO.

However, I don’t really know how to determine the distance from the camera to the AprilTag (preferably in inches, since I’m American).

I wrote some code like this first:

public double determineDistance(double tagHeight, double cameraHeight, double cameraAngle) {
    // convert the camera's mounting angle to radians
    double angleRadians = Math.toRadians(cameraAngle);
    // right-triangle trig: opposite side (height difference) over tan(angle)
    double distance = (tagHeight - cameraHeight) / Math.tan(angleRadians);
    return distance;
}

But this wouldn’t work: if the camera moved back and forth, I would still get the same result from the method, since it’s still being fed the same fixed parameters.

Looking back at some old code, I wrote a method that does this properly (even accounting for when the robot moves), for the LimeLight:

import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableEntry;
import edu.wpi.first.networktables.NetworkTableInstance;

public double estimateDistance(double limelightMountAngleDegrees, double limelightLensHeightInches, double goalHeightInches) {
    NetworkTable table = NetworkTableInstance.getDefault().getTable("limelight");
    NetworkTableEntry ty = table.getEntry("ty");
    // ty is the vertical offset of the target from the crosshair, in degrees
    double targetOffsetAngle_Vertical = ty.getDouble(0.0);
    double angleToGoalDegrees = limelightMountAngleDegrees + targetOffsetAngle_Vertical;
    double angleToGoalRadians = Math.toRadians(angleToGoalDegrees);

    // calculate distance
    double distanceFromLimelightToGoalInches = (goalHeightInches - limelightLensHeightInches) / Math.tan(angleToGoalRadians);
    return distanceFromLimelightToGoalInches;
}

To my understanding, this code works because we’re able to get the vertical offset of the target from the camera (the Limelight’s ty), and as the robot moves around, this offset changes, and thus our computed distance changes.

But then, I don’t know how I can retrieve the vertical offset of a camera connected to the RoboRIO.

I’ve googled and googled, but haven’t found a way to get this vertical offset from a camera connected to the RoboRIO.

Perhaps there are other, easier ways of doing this, or the answer is floating in front of my face, and I just can’t see it.

But right now I’m stumped. :frowning_face:

Any help is greatly appreciated, manso


But this wouldn’t work because if the robot moves, I would still get the same results from the method, as it’s still being fed the same params.

You need one more parameter, the detected angle (pitch) of the apriltag from the camera. The mounting pitch of the camera plus the detected apriltag pitch determines the angle that you would use in this case for solving the adjacent triangle side length (distance).
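Put together, that "one more parameter" version might look like the following sketch. The method and parameter names here are illustrative, and `tagPitchDegrees` stands in for a value your detection pipeline would report per frame (the original code doesn't compute it yet):

```java
// Sketch: distance to a tag from the camera's fixed mounting pitch plus the
// tag's detected pitch in the image. Names are illustrative, not a real API.
class TagDistance {
  /**
   * @param tagHeightInches     height of the tag center above the floor
   * @param cameraHeightInches  height of the camera lens above the floor
   * @param cameraPitchDegrees  fixed mounting pitch of the camera (up is positive)
   * @param tagPitchDegrees     detected vertical angle of the tag in the image;
   *                            this is what changes as the robot drives
   */
  public static double determineDistance(double tagHeightInches,
                                         double cameraHeightInches,
                                         double cameraPitchDegrees,
                                         double tagPitchDegrees) {
    double totalAngleRadians = Math.toRadians(cameraPitchDegrees + tagPitchDegrees);
    // Solve the right triangle: opposite = height difference, angle = total pitch.
    return (tagHeightInches - cameraHeightInches) / Math.tan(totalAngleRadians);
  }
}
```

This is the same formula as the Limelight method above, with ty replaced by a tag pitch you compute yourself.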

But then, I don’t know how I can retrieve the vertical offset of a camera connected to the RoboRIO.

You’ll need to measure the height and pitch of your camera from the ground.

I’ve gotta ask: what’s your motivation for running AprilTag detection on the RoboRIO instead of something like a Limelight or a coprocessor running PhotonVision? As far as I know, getting usable framerates on the RoboRIO is difficult. Both Limelight and PhotonVision also have more complete tag pose estimation using solvePnP techniques, which will help you find the distance to the tag (and more) easily.


Thanks for the swift reply! I already have the pitch and height of the camera; I just have to figure out how to get the angle of the AprilTag from the camera.

As for your last question, I’m running this on the RoboRIO because I don’t have a co-processor, and we’re currently using our LimeLight for something else for now (can’t really explain, sorry).

Edit:
(forgot to say, but) With some optimizations, I was able to get my AprilTag detection code to work pretty well: I can get 15 fps from the camera (which is OK by my standards), detection runs on a separate thread to improve performance, and I’ve tweaked my code to use less memory.
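For anyone curious, the separate-thread pattern mentioned above is usually a background loop that publishes its latest result through an `AtomicReference`, so the main robot loop never blocks on the camera. This is a generic sketch (not the poster's actual code); `grabFrameAndDetect()` is a hypothetical stand-in for the real capture-and-detect call:

```java
import java.util.concurrent.atomic.AtomicReference;

// Generic sketch of running detection off the main loop.
class VisionThread {
  private final AtomicReference<Double> latestTagPitch = new AtomicReference<>();
  private volatile boolean running = true;

  public void start() {
    Thread t = new Thread(() -> {
      while (running) {
        // Slow camera + detection work stays off the main robot loop.
        Double pitch = grabFrameAndDetect();
        if (pitch != null) {
          latestTagPitch.set(pitch);
        }
        try {
          Thread.sleep(20); // simulate per-frame pacing
        } catch (InterruptedException e) {
          return;
        }
      }
    });
    t.setDaemon(true); // don't keep the JVM alive just for vision
    t.start();
  }

  /** Called from the main robot loop; returns the newest result without blocking. */
  public Double getLatestTagPitch() {
    return latestTagPitch.get();
  }

  public void stop() {
    running = false;
  }

  // Hypothetical placeholder for the real camera-capture + AprilTag-detection call.
  private Double grabFrameAndDetect() {
    return 0.0;
  }
}
```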

I should be able to get my hands on a LimeLight soon… our team is just very busy, and we are also smaller than last year.

You can also use PhotonVision on a Raspberry Pi or something similar. You can even run PhotonVision locally on your laptop. PhotonLib is a pretty nice API to work with as well, and also provides vision simulation with AprilTags, so you can work without having access to your robot.

Yes, as a biased PhotonVision dev I would recommend grabbing a coprocessor to use for vision. It should be easy/cheap to get quickly since you can grab a popular option like a Raspberry Pi 4/5 or Orange Pi 5.

As for finding the pitch of the detected AprilTag in your setup, you’ll need to know at least the vertical field of view of your camera first. The most accurate way of determining this is to calibrate your camera, perhaps through OpenCV or a tool like calibdb.

With an estimated vertical focal length in pixels (if you’re not calibrating, this can be calculated from a given vertical FOV and the sensor resolution), you can do another trig calculation with the vertical focal length and the detected AprilTag’s center pixel y-coordinate to find the pitch angle through the camera’s focal point.
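The two steps just described might be sketched like this. The FOV and resolution values in the test are placeholders; plug in your own camera's numbers. Note that image y grows downward, so a pixel above the image center yields a positive (upward) pitch:

```java
// Sketch: approximate focal length from FOV, then pixel row -> pitch angle.
class PixelPitch {
  /** Approximate vertical focal length in pixels from vertical FOV and image height. */
  public static double verticalFocalLength(double verticalFovDegrees, int imageHeightPx) {
    return (imageHeightPx / 2.0) / Math.tan(Math.toRadians(verticalFovDegrees) / 2.0);
  }

  /**
   * Approximate pitch (degrees) through the focal point for a given pixel row.
   * Image y grows downward, so rows above center give a positive pitch.
   */
  public static double pixelToPitchDegrees(double pixelY, int imageHeightPx,
                                           double focalLengthPx) {
    double centerY = (imageHeightPx - 1) / 2.0;
    return Math.toDegrees(Math.atan((centerY - pixelY) / focalLengthPx));
  }
}
```

Adding this pitch to the camera's mounting pitch gives the angle to feed into the height-difference trig from earlier in the thread.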

Note that this is an approximation and won’t represent the exact pitch angle for several reasons. Here’s how this is calculated in PhotonVision for reference. If you want something more accurate, you’ll have to look into solvePnP pose estimation of fiducials, which is what is used in PhotonVision and Limelight software.

