Limelight distance calculation process question

I have a general idea of the process we need to follow to get our final distance calculation equation:

1. Get working thresholds and contours (done)
2. Create a table in Excel mapping LL area to actual distance
3. Create a regression trendline
4. Implement that equation in code
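As a sketch of steps 2-4, one alternative to a single trendline equation is a calibration table with linear interpolation between measured points. This is only an illustration; the class name, the TreeMap approach, and the sample area/distance pairs are my own placeholders, not values from the Limelight docs:

```java
import java.util.Map;
import java.util.TreeMap;

// Hypothetical sketch: look up distance from a measured area -> distance
// calibration table instead of (or as a sanity check on) a regression fit.
public class AreaDistanceTable {
    // Keys: target area (% of image); values: measured distance (feet).
    // These sample calibration points are made up for illustration.
    private static final TreeMap<Double, Double> TABLE = new TreeMap<>();
    static {
        TABLE.put(0.34, 8.0);
        TABLE.put(0.28, 9.0);
        TABLE.put(0.22, 10.0);
    }

    public static double distanceForArea(double area) {
        // Clamp to the calibrated range.
        if (area <= TABLE.firstKey()) return TABLE.firstEntry().getValue();
        if (area >= TABLE.lastKey()) return TABLE.lastEntry().getValue();
        // Linearly interpolate between the two nearest calibration points.
        Map.Entry<Double, Double> lo = TABLE.floorEntry(area);
        Map.Entry<Double, Double> hi = TABLE.ceilingEntry(area);
        double t = (area - lo.getKey()) / (hi.getKey() - lo.getKey());
        return lo.getValue() + t * (hi.getValue() - lo.getValue());
    }
}
```

A table like this is easy to extend: every time you take a new measurement, you just add a point instead of re-fitting a trendline.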

So I’ve gotten working thresholds and contours; however, there’s a problem. The reported area seems almost completely random depending on the orientation of the Limelight. For example, at 8 feet, the LL’s area consistently hovers around 0.34, then dips to around 0.28 at 9 feet.

However, at 10 feet, it seems almost completely random. At certain orientations it ranges anywhere from 0.15 to 0.5, frequently landing on 0.26 or 0.34.

At the moment, I’m using a laser rangefinder to get the distance from the Limelight’s camera to the goal, and have the Limelight mounted at 45 degrees on a makeshift wooden mount. However, it seems nearly impossible for the mount to keep the exact same orientation relative to the target.

I need some potential ways to keep the orientation exactly the same. I theorized that a PID loop on tx could work, but I’m not entirely sure how accurate it would be.

Any ideas?

I would recommend you take a look into this: Case Study: Estimating Distance — Limelight 1.0 documentation

It’s a fairly simple but useful method for calculating the distance to a target. If you are using PhotonVision, the functions are already implemented in their library.

Also, you would use a PID controller that takes the yaw reported by the camera as its measurement, with a setpoint of 0 (assuming your camera and shooter are aligned). You would use the PID’s output to control the robot’s angular velocity.

This is one way to do it. However, in many cases you’ll find that your view gets blocked, etc. Then it’s a good idea to record how far off you are and servo to that rather than constantly relying on the camera.
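To make the aiming loop concrete, here is a minimal proportional-only sketch of the yaw loop described above. A real robot would typically use WPILib's PIDController instead; the class name, gain, deadband, and output clamp below are hypothetical placeholders, not tuned numbers:

```java
// Minimal proportional-only sketch of aiming with the camera's yaw (tx).
// All constants here are illustrative placeholders, not tuned values.
public class AimController {
    private static final double KP = 0.03;          // output per degree of error
    private static final double DEADBAND_DEG = 0.5; // stop chasing camera noise
    private static final double MAX_OUTPUT = 0.4;   // cap the turn rate

    /** Returns an angular-velocity command from the yaw error (setpoint 0). */
    public static double turnCommand(double txDegrees) {
        if (Math.abs(txDegrees) < DEADBAND_DEG) return 0.0;
        double out = KP * txDegrees;
        return Math.max(-MAX_OUTPUT, Math.min(MAX_OUTPUT, out));
    }
}
```

The deadband keeps the robot from oscillating on camera noise once it is close enough to aimed, and the clamp keeps a large initial error from commanding a violent turn.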

Is PhotonVision compatible hardware-wise with Limelights? I know it’s still an RPi internally, but I don’t know about the LEDs.

Also, I already tried this. After tuning it for 3 hours I gave up, because it’s so crude that it can realistically only give an accurate (±1ft) distance within an extremely small area (2-3ft max). We need it to work in an area of around 7-8 feet.

About PID: like I said, I considered it, but I don’t believe it would be accurate enough. It might be as accurate as a fixed mount on a swerve drivetrain, if not less so because of overshooting.

It is; we are currently using a Limelight with PhotonVision. If you are not doing anything too fancy, the Limelight’s default image should work well enough, but here is the guide to install PhotonVision: Installing PhotonVision on a Limelight - PhotonVision Docs

Regarding the accuracy of that method, which resolution are you using? We have used the trigonometry approach in previous years and it was quite accurate, although you do need to have the camera aimed at the target before taking the distance measurement.

Using a PID to control the heading based on the Yaw reported by the camera is a good way to keep the robot aimed at the target. Here is an example of an implementation

(It used Chameleon Vision with a Lifecam, but the general algorithm works with any vision system that can report Yaw)

I recommend estimating distance with trig instead, using angles and heights (draw a right triangle!).

We are using that currently and it has worked pretty well.

I already tried that, and it didn’t work, like I said. Maybe I’ll try a slightly modified trig approach.

Their docs explain how to set it up (already linked above), but they also have a built-in distance calculation in the API that is based on the trig formulation.

I’m not sure what setup you’re using, but I can guarantee you that either trig or a regression based on the vertical angle (ty) returned from the Limelight will work.

You don’t need to use the exact formula if that isn’t how your setup works; just draw a right triangle whose angle is some combination of your mounting angle and ty, and whose height is some combination of target height and mounting height, then solve for your range.

This has made me realize that we need better code samples for flywheels. The docs have been updated with the following distance calculation example. One important point that was not previously called out is the conversion from degrees to radians:

double targetOffsetAngle_Vertical = ty.getDouble(0.0);

// how many degrees back is your limelight rotated from perfectly vertical?
double limelightMountAngleDegrees = 25.0;

// distance from the center of the Limelight lens to the floor
double limelightLensHeightInches = 20.0;

// distance from the target to the floor
double goalHeightInches = 60.0;

double angleToGoalDegrees = limelightMountAngleDegrees + targetOffsetAngle_Vertical;
double angleToGoalRadians = angleToGoalDegrees * (3.14159 / 180.0);

// calculate distance
double distanceFromLimelightToGoalInches = (goalHeightInches - limelightLensHeightInches)/Math.tan(angleToGoalRadians);

https://docs.limelightvision.io/en/latest/cs_estimating_distance.html

Noticed that as well, thanks for updating.

Additionally, in Java you can just use Math.toRadians() for the conversion, and tan is available as Math.tan as well.
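For illustration, the docs sample could be condensed using Math.toRadians. This sketch reuses the same mounting numbers from the example above (25° mount angle, 20 in lens height, 60 in goal height); only the wrapper class and method name are my own:

```java
// Same calculation as the docs sample, using Math.toRadians instead of
// multiplying by 3.14159 / 180.0 by hand.
public class DistanceEstimate {
    public static double distanceInches(double tyDegrees) {
        double limelightMountAngleDegrees = 25.0; // mount tilt from vertical
        double limelightLensHeightInches = 20.0;  // lens center to floor
        double goalHeightInches = 60.0;           // target center to floor
        double angleToGoalRadians = Math.toRadians(limelightMountAngleDegrees + tyDegrees);
        return (goalHeightInches - limelightLensHeightInches) / Math.tan(angleToGoalRadians);
    }
}
```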

TRIGONOMETRY WORKED WITHIN 4-8 INCHES. This is perfectly good enough given the size of the goal, plus our good shooter tables.

Thanks for the help

This might be a little overkill considering your desired accuracy, but you may want to look into correcting for the camera’s perspective when it isn’t pointing straight at the target:

To verify, is that saying the equation should be:

double dist = (kTargetHeight - kMountHeight) / (Math.tan(Math.toRadians(ty)) * Math.cos(Math.toRadians(tx)))?

Though we’d have to switch ty and tx, as it’s mounted vertically.
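To see what the cos(tx) term does, here is a small sketch comparing the plain trig estimate with the corrected one. The class name, heights, and angles are made-up test values, not measurements from this thread:

```java
// Sketch comparing the plain-trig distance estimate with the cos(tx)
// skew correction. All constants are hypothetical test values.
public class SkewCorrection {
    static final double TARGET_HEIGHT = 60.0; // inches (hypothetical)
    static final double MOUNT_HEIGHT = 20.0;  // inches (hypothetical)

    /** Plain estimate: ignores horizontal offset to the target. */
    static double plainDistance(double tyDeg) {
        return (TARGET_HEIGHT - MOUNT_HEIGHT) / Math.tan(Math.toRadians(tyDeg));
    }

    /** Corrected estimate: divides by cos(tx) to account for skew. */
    static double correctedDistance(double tyDeg, double txDeg) {
        return (TARGET_HEIGHT - MOUNT_HEIGHT)
                / (Math.tan(Math.toRadians(tyDeg)) * Math.cos(Math.toRadians(txDeg)));
    }
}
```

At tx = 0 the two estimates agree exactly; at tx = 25° the plain estimate comes out about 9% low, since 1/cos 25° ≈ 1.10.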

Yes, although I haven’t tested it for myself yet

I just tested it with this year’s target, and the first equation was about 9-10% too low when the target was on either edge of the screen. The second equation was spot on, though; the distance value it gave didn’t change when I changed which way the camera was facing.

@Prateek_M in case you never tested this for yourself, your equation works!

Why exactly do you need the area of the target? If you are getting distance based on the ratio of the target’s area to the rest of the image, it’s always going to be pretty unreliable. Truthfully, I don’t remember exactly why it tends to be less reliable, but I know there was a good reason. I’ll look into it and see if I can find something.

I would suggest targeting the middle of the target (the Limelight documentation shows you how to do this). From there it’s just trig.

You could also try to improve the tuning of the Limelight. Do you by chance have the numbers from your tuning? From my own experience, the tuning of the LL can greatly affect the numbers you get back.