Using Limelight and target area to estimate distance

I am trying to estimate the distance to the target on the rocket after the robot seeks it, so I can convert encoder ticks to that distance and have the robot drive up to it. The Limelight is not mounted at an angle, so I was wondering how target area is used, since the site doesn’t explain it much. Thank you.

I wouldn’t recommend using target area to estimate distance, because if the target is heavily skewed with respect to the robot, your calculations will be way off. I’ve tested the target-area values with the vision target skewed, and they were off by as much as 30%.

Yeah, I think using target height is the best bet.

I don’t think you can use target area to estimate distance directly, since the two are not linearly related.
We are either going to do the math or just use a lookup table: put the robot at 15 feet and record the area, then move to 14.5 feet and record it again, and so on. Depending on the accuracy you need, you could interpolate between entries for any target area, but even then accuracy will suffer.
It all depends on what your end goal is.
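
For what it’s worth, here is a minimal Java sketch of that table approach; the class name and the distance/area pairs are placeholders you would replace with measurements from your own robot:

public class AreaDistanceTable {
    // Calibration points, ordered by increasing area (i.e. decreasing distance).
    // These numbers are placeholders, not real measurements.
    private static final double[] AREAS_PCT    = { 1.5, 2.0, 2.8, 4.0, 6.5 };
    private static final double[] DISTANCES_FT = { 15.0, 14.5, 13.0, 11.0, 8.0 };

    /** Linearly interpolate a distance estimate from a target-area reading. */
    public static double estimateDistanceFeet(double area) {
        if (area <= AREAS_PCT[0]) {
            return DISTANCES_FT[0];
        }
        for (int i = 1; i < AREAS_PCT.length; i++) {
            if (area <= AREAS_PCT[i]) {
                double t = (area - AREAS_PCT[i - 1]) / (AREAS_PCT[i] - AREAS_PCT[i - 1]);
                return DISTANCES_FT[i - 1] + t * (DISTANCES_FT[i] - DISTANCES_FT[i - 1]);
            }
        }
        return DISTANCES_FT[DISTANCES_FT.length - 1];
    }
}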

TA is the target area. We used the following code to drive “very close” to the rocket, but it’s not perfect; it’s enough to eliminate much of the driver’s error, and we’re still working on it.

Our code is roughly:

while (ta <= 30.0) { // ta = Limelight target area, % of image
    driveForward();
}

I think we used < (less than). I’ll have to double-check, but I’m pretty sure the closer you are to the target, the higher the TA is, so 95 was with our robot smashed against the wall.

The target area (ta) returns the percentage of the image that the target fills (http://docs.limelightvision.io/en/latest/networktables_api.html).
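
For reference, a minimal Java sketch of reading those values through WPILib’s NetworkTables API; “limelight” is the default table name, and the 0.0 arguments are just fallback values when nothing is published:

import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class LimelightReader {
    private final NetworkTable table =
            NetworkTableInstance.getDefault().getTable("limelight");

    /** Target area as a percentage of the image (0-100). */
    public double getTargetArea() {
        return table.getEntry("ta").getDouble(0.0);
    }

    /** Horizontal offset from the crosshair to the target, in degrees. */
    public double getHorizontalOffset() {
        return table.getEntry("tx").getDouble(0.0);
    }

    /** Whether the Limelight currently sees a valid target. */
    public boolean hasTarget() {
        return table.getEntry("tv").getDouble(0.0) >= 1.0;
    }
}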

The limelight documentation has a good case study for 2019, with example software: http://docs.limelightvision.io/en/latest/cs_drive_to_goal_2019.html

In the case study, they are using the target area to generate a drive signal, and the x offset from the crosshair to generate a steer signal. If you have any familiarity with PID or control, you’ll notice the drive and steer signals are generated by running a P controller on the error term.
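
Sketched in Java, the same idea looks roughly like this; the gains, the desired-area setpoint, and the class name are made-up values you would tune on your own robot:

import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class DriveToTarget {
    // Placeholder gains and setpoint; tune on your own robot.
    private static final double KP_STEER = 0.03;
    private static final double KP_DRIVE = 0.26;
    private static final double DESIRED_AREA = 13.0; // ta (%) at scoring distance
    private static final double MAX_DRIVE = 0.7;

    /** Returns {left, right} open-loop drive commands from the Limelight data. */
    public static double[] calculate() {
        NetworkTable table = NetworkTableInstance.getDefault().getTable("limelight");
        double tx = table.getEntry("tx").getDouble(0.0); // degrees off the crosshair
        double ta = table.getEntry("ta").getDouble(0.0); // target area, % of image

        double steer = KP_STEER * tx;                  // P controller on the x offset
        double drive = KP_DRIVE * (DESIRED_AREA - ta); // P controller on the area error
        drive = Math.min(drive, MAX_DRIVE);            // cap the forward command

        return new double[] { drive + steer, drive - steer };
    }
}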

The tricky thing with using the target area is that this error signal is nonlinear. If you take a reading x units away, then x-1 units away, then x-2 units away, you’ll notice that the target area does not increase linearly; it grows roughly with the inverse square of the distance, ballooning as you get close. You’ll need to introduce some additional correction to account for this.
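
One rough way to linearize it, assuming the target is viewed mostly head-on so its projected area falls off approximately with the inverse square of distance: estimate distance as k / sqrt(ta), with k calibrated once at a known distance. A hedged sketch:

public class AreaToDistance {
    /**
     * Rough distance estimate from target area.
     * Calibrate k once: k = knownDistance * Math.sqrt(taAtThatDistance).
     */
    public static double estimateDistance(double ta, double k) {
        if (ta <= 0.0) {
            return Double.POSITIVE_INFINITY; // no target / unusable reading
        }
        return k / Math.sqrt(ta);
    }
}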

You can try running the same controller on the bounding box height instead, as that error signal is much better behaved (a lot closer to linear over the working range) as a function of distance.
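
A sketch of that variation, assuming the tvert entry (the vertical side length of the rough bounding box, in pixels) as the height measurement; the gain and setpoint are placeholders you would record with the robot parked in scoring position:

import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class HeightDrive {
    private static final double KP_HEIGHT = 0.01;           // placeholder gain
    private static final double DESIRED_HEIGHT_PX = 120.0;  // placeholder setpoint

    /** Forward command from the bounding-box height error (shrinks to 0 as you arrive). */
    public static double driveCommand() {
        NetworkTable table = NetworkTableInstance.getDefault().getTable("limelight");
        double tvert = table.getEntry("tvert").getDouble(0.0); // box height in pixels
        return KP_HEIGHT * (DESIRED_HEIGHT_PX - tvert);
    }
}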


Have you tried using the newer on-board features in the 2019.5 image? (Disclaimer: I have very little working experience with the Limelight and none with these newer features, but in lab testing the distance calculations were fairly accurate.)

We’ve had pretty good luck just driving toward a specified “area”: put your robot in scoring position, record the area, and change your code to drive until it’s at that desired area. Be careful about the speed, though!
