We tried to use formulas to find the distance to the high target. When we checked it physically, we noticed that the calculated distance was off by approximately 2 meters.

I’ll explain our formula first – it’s the standard formula that WPILib suggests using:
We know the target height in inches – 20 inches for the high goal.
We know the target height in pixels (from vision processing).
We know the image width in pixels (it’s basically the resolution) – our resolution was 640 pixels.
So we can calculate the field-of-view width in inches. We know the camera’s horizontal angle, which for our camera is 47 degrees, so we can calculate the distance to the target using the tangent.

So the formula is:
distance = (targetHeightInches * imageWidthPixels) / (2 * targetHeightPixels * tan(cameraHorizontalAngle / 2));
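As a sanity check, the formula can be wrapped in a small Java method (the method and parameter names here are ours, not from WPILib):

```java
public class DistanceCalc {
    /**
     * Width-based distance estimate (inches), assuming square pixels:
     * the target's known height vs. its pixel height gives the
     * field-of-view width in inches, then the tangent of half the
     * horizontal FOV converts that width into a distance.
     */
    public static double distance(double targetHeightInches,
                                  double imageWidthPixels,
                                  double targetHeightPixels,
                                  double cameraHorizontalFovDeg) {
        return (targetHeightInches * imageWidthPixels)
                / (2.0 * targetHeightPixels
                   * Math.tan(Math.toRadians(cameraHorizontalFovDeg / 2.0)));
    }

    public static void main(String[] args) {
        // Example: 20" target spanning 40 px in a 640 px wide image, 47 deg horizontal FOV
        System.out.println(distance(20, 640, 40, 47)); // ~368 inches
    }
}
```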

I can think of a number of reasons for our errors; I’d be glad if you’d comment and tell me whether I’m right or whether there are other reasons.

First, maybe the camera’s horizontal angle is not accurate (should we measure it ourselves?).

Second, the distance we measured physically was parallel to the ground. In the formula, the distance makes some angle with the ground – alpha – so the calculated distance is parallelDistance / cos(alpha). I think this is negligible (alpha would be something like 10 degrees, right?).
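For alpha around 10 degrees, that cosine correction is indeed tiny; a quick check with illustrative numbers (not our measured values):

```java
public class SlantCheck {
    /** Slant distance corresponding to a ground-parallel distance,
     *  given the elevation angle alpha of the line of sight. */
    public static double slantDistance(double parallelDistanceIn, double alphaDeg) {
        return parallelDistanceIn / Math.cos(Math.toRadians(alphaDeg));
    }

    public static void main(String[] args) {
        // 120" measured along the floor, line of sight tilted up 10 degrees:
        System.out.println(slantDistance(120.0, 10.0)); // ~121.85" -> only ~1.5% longer
    }
}
```

So even at 10 degrees, this explains centimeters of error, not 2 meters.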

Another reason might be that the calculation assumes we are at the height of the target, while our camera is about 15 inches off the ground and the high target is much higher, so the field of view is not a square but some quadrilateral. The calculation assumes the width of the FOV is constant, but it changes – the lower pixels might cover less real-world width than the higher pixels. We put the target in the center of our image – do these wrong assumptions still matter and cause those errors?

Maybe it will never be precise with a formula, and we need to take physical measurements for each distance?

Assuming the image element you are measuring is the retroreflective rectangle, you need to reconsider its physical height. I believe it is 12" inside the tape plus 4" tape on each side.

As mentioned, there are other sources of error, but I suspect those will be pretty small, definitely not 2 meters of error.

My fault – I was mistaken and wrote 12 inches, but in the code we used 20 inches (I’ve now edited the thread). Sorry about that…
Still, we had this error of ~2 meters.

Has anyone here tried this kind of code, and did it work with small errors?

I looked at the LV code and the camera FOV values being used are 48deg for the Axis 206 and 43.5deg for the M1011. I don’t have an M1013 to measure.

I modified the FOV numbers from the data sheet last year as well.

I don’t recall whether the data sheet indicated that this was a diagonal, vertical, or horizontal measurement. And I’m not sure how it was measured or how accurate it was. I didn’t test it at many distances this year, but last year’s seemed to be accurate at a number of distances once the FOV was corrected.

Obviously this approach is not accounting for barrel distortion and planar distortions when the camera sensor isn’t parallel with the wall, but those aren’t expected to be significant and this was an attempt to keep the math somewhat simple.

Do you have a way to tell the angle and height of the camera? If so, you know the height of the tallest goal, and the angle relative to the camera, so you can use tangent to figure it out. Something like this:
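The snippet from the original post isn’t shown here; as a hedged sketch, using the names relativeTargetAngle and cameraAngle that the substitutions later in the thread refer to (the constants in main are illustrative, not real field measurements):

```java
public class AngleDistance {
    /** Horizontal distance to a target of known height, from the camera's
     *  mounting height, its pitch, and the target's angle in the image.
     *  Names and values are illustrative, not from the original post. */
    public static double range(double targetHeightIn, double cameraHeightIn,
                               double cameraAngleDeg, double relativeTargetAngleDeg) {
        return (targetHeightIn - cameraHeightIn)
                / Math.tan(Math.toRadians(cameraAngleDeg + relativeTargetAngleDeg));
    }

    public static void main(String[] args) {
        // e.g. a 104" goal, camera mounted 15" up and pitched 20 deg,
        // target centered vertically in the image (relative angle 0):
        System.out.println(range(104, 15, 20, 0)); // ~244.5 inches
    }
}
```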

I used a similar height-based algorithm last year and had the same sort of problem with error. I found this angle-based formula by looking at Miss Daisy’s vision code from last year.

I found Miss Daisy’s code for the distance calculation:

double x = square.getX() + (square.getWidth() / 2);   // target center x, in pixels
x = (2 * (x / size.width())) - 1;                     // normalize x to [-1, 1]
double y = square.getY() + (square.getHeight() / 2);  // target center y, in pixels
y = -((2 * (y / size.height())) - 1);                 // normalize y to [-1, 1], +1 at the top
double range = (kTopTargetHeightIn - kCameraHeightIn)
        / Math.tan((y * kVerticalFOVDeg / 2.0 + kCameraPitchDeg) * Math.PI / 180.0);

As I understood from their code, they have a fixed camera, and from the normalized Y position of the target’s center of mass they find out how far they are from the target (their distance is horizontal, not diagonal). Their camera is pitched at kCameraPitchDeg, and as Y ranges from -1 to 1 they can recover the actual angle of the target relative to the camera (in other words, the angle between the line connecting the target to the camera and a line parallel to the ground). They know a height and an angle – so they can find the horizontal distance to the target!
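To make that reading concrete, here is the same pipeline with illustrative numbers (the constants are ours for the example, not Miss Daisy’s actual values):

```java
public class DaisyPipeline {
    /** Horizontal range from the target's pixel-row center, following the
     *  normalize-then-tangent steps in Miss Daisy's snippet. */
    public static double range(double targetCenterYPx, double imageHeightPx,
                               double vFovDeg, double pitchDeg,
                               double targetHeightIn, double cameraHeightIn) {
        // Normalize: top of image -> +1, bottom -> -1 (note the sign flip)
        double y = -((2.0 * (targetCenterYPx / imageHeightPx)) - 1.0);
        // Angle of the target above horizontal, in degrees
        double angleDeg = y * vFovDeg / 2.0 + pitchDeg;
        return (targetHeightIn - cameraHeightIn) / Math.tan(Math.toRadians(angleDeg));
    }

    public static void main(String[] args) {
        // Target centered 120 px from the top of a 480 px tall image,
        // 36.13 deg vertical FOV, camera pitched 20 deg and 15" up, 104" goal:
        System.out.println(range(120, 480, 36.13, 20.0, 104.0, 15.0)); // ~160 inches
    }
}
```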

Am I right?

If so, another question arises. We calculated the distance only when the target was in the middle of the image, because the camera’s lens makes things look bigger the farther they are from the image center, and then the height of the target was bigger (in pixels) than it should have been.
They use only the Y coordinate of the target’s center, so maybe that makes this lens distortion negligible?

If I was right about my understanding of this code, and the lens distortion doesn’t matter much, I think it might be a better solution for calculating distance.

**Has anyone here tried this solution, and did it work without major errors, unlike the other method for calculating distance?**

double range = (kTopTargetHeightIn-kCameraHeightIn)/Math.tan((y*kVerticalFOVDeg/2.0 + kCameraPitchDeg)*Math.PI/180.0);

is the same as the formula I provided before, if you substitute y*kVerticalFOVDeg/2.0 for relativeTargetAngle, substitute kCameraPitchDeg for cameraAngle, then convert it to radians before calculating tan() (multiplying by Math.PI/180.0 does this).

This method has some serious advantages over using height and linear perspective; you don’t have to worry about distortions or weird angles, and these cameras are pretty accurate when it comes to relative angles. Make sure you use the right values for your horizontal and vertical fields of view, though. The M1011 has a 47-degree horizontal field of view, and this stackexchange question provides a calculation for the vertical field of view, yielding about 36.13 degrees.
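That 36.13-degree figure can be reproduced from the horizontal FOV and the sensor’s 4:3 aspect ratio, assuming square pixels (a sketch, not the stackexchange answer’s exact code):

```java
public class FovCalc {
    /** Vertical FOV derived from horizontal FOV and image aspect ratio,
     *  assuming square pixels. */
    public static double verticalFovDeg(double horizontalFovDeg,
                                        double imageWidthPx, double imageHeightPx) {
        double halfWidth = Math.tan(Math.toRadians(horizontalFovDeg / 2.0));
        double halfHeight = halfWidth * (imageHeightPx / imageWidthPx);
        return 2.0 * Math.toDegrees(Math.atan(halfHeight));
    }

    public static void main(String[] args) {
        // 47 deg horizontal FOV at 640x480:
        System.out.println(verticalFovDeg(47, 640, 480)); // ~36.1 degrees
    }
}
```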

When I worked on our distance calculations (over the summer), the first thing I did was get the actual horizontal and vertical FOV angles. The FOV angle in the manual is usually the diagonal FOV, which is useful to almost nobody. So, my first recommendation would be to measure your camera’s specifics. We counted the number of bricks visible on the exterior wall and then measured them.

Some of the other tricks we used are not applicable this year. However, if you got the smallest outer bounding box that held the tape and the biggest inner bounding box that fit inside the tape, you could average the heights and compare that to 16" (inner box height + 2 * (half the tape width)). This should counteract any parallax from not being centered on the target.
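As a sketch of that averaging trick (the 16" figure comes from the post above; the method name and pixel values are ours for illustration):

```java
public class BoxAverage {
    /** Average of the outer and inner bounding-box heights in pixels.
     *  The outer box spans the full target, the inner box the opening,
     *  so the average corresponds to a 16" physical height. */
    public static double averagedHeightPx(double outerBoxHeightPx,
                                          double innerBoxHeightPx) {
        return (outerBoxHeightPx + innerBoxHeightPx) / 2.0;
    }

    public static void main(String[] args) {
        // e.g. outer box 50 px, inner box 30 px at some range:
        System.out.println(averagedHeightPx(50, 30)); // 40.0 px, corresponding to 16"
    }
}
```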

Could you post a copy of your code? Someone might be able to find why you’re getting numbers that are so far off. You should be able to calculate within 2-3" or so.

If we use the values of our current camera, the vertical angle would be around 33 degrees, not 36…

And to DELurker (this also connects to Ginto8):
we will check the vertical and horizontal FOV angles ourselves, and not rely on what the PDF says.

As for our code, it just calculated the average height of the target from its 4 outer points; the rest is the formula I showed at the beginning, using some constants (which we will correct as you suggested).

It didn’t work, so we thought about trying Miss Daisy’s code; we haven’t tried it yet, but we will soon.
We will also retry the code that had the ~2 meter errors, with a corrected angle and with a 16-point average instead of the 8 points we used before.

*Thank you very much; we will update you when we’ve checked it.
If you have more suggestions, or if you tried one of these codes and it worked, please tell us!*