View Full Version : Angle Calculation from image of goal


Rflax40
11-04-2016, 15:28
Hello, our team (1124) is looking to implement an improved vision system, but we are having trouble calculating the angle between where our robot is pointing and the center of the goal. If anyone could share how they accomplished this, it would be great. The information we have is the position of the center of our camera and the position of the center of the goal.

TheOtherGuy
11-04-2016, 15:42
Hello, our team (1124) is looking to implement an improved vision system, but we are having trouble calculating the angle between where our robot is pointing and the center of the goal. If anyone could share how they accomplished this, it would be great. The information we have is the position of the center of our camera and the position of the center of the goal.

We plan on doing this soon, so I'll just share how I expect we'll do it. We'll double-check the horizontal field of view (in degrees) of our camera, then find an equation that maps the goal position to degrees. For example, if the camera has a 90° horizontal FoV and the resolution is 640 pixels horizontally, then a simple equation would be

angle = (goalx - 640/2) * 90/640

We plan on closing the loop with a gyro so we won't have to rely on a quick framerate to lock onto the goal.

If you don't know the viewing angle, or want to do it empirically to be more accurate, you could mark a piece of paper with several angles, put the camera flat on the paper and measure pixel distances at different angles.
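
For concreteness, here's the simple equation above written out in Java, using the example 90° FoV and 640 px width:

// Linear pixel-to-angle approximation from the example above.
// Positive result means the goal is to the right of image center.
public static double angleToGoalDegrees(double goalX) {
    final double hFovDegrees = 90.0;   // horizontal field of view
    final double imageWidthPx = 640.0; // horizontal resolution
    return (goalX - imageWidthPx / 2.0) * (hFovDegrees / imageWidthPx);
}
// e.g. angleToGoalDegrees(480.0) returns 22.5 (160 px right of center)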

Tottanka
11-04-2016, 15:55
We tried doing it and ended up with awkward results.
Eventually what we did is just take the difference between the two measurements, divide it by the measured distance to the goal (you have that in GRIP), and use that as a fake "angle". For each "degree" of that angle we turn a certain number of encoder ticks.
After some calibration it works very well, and fast.
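
Roughly, in Java (the names are illustrative, and kTicksPerFakeDegree comes out of the calibration):

// "Fake angle" approach: pixel offset scaled by distance, converted to an
// encoder setpoint with an empirically tuned constant (names illustrative).
public static double turnTicksFromTarget(double goalX, double distanceToGoal,
                                         double imageWidthPx, double kTicksPerFakeDegree) {
    double centerX = imageWidthPx / 2.0;
    double fakeAngle = (goalX - centerX) / distanceToGoal; // pseudo-angle, not real degrees
    return kTicksPerFakeDegree * fakeAngle;
}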

Jared Russell
11-04-2016, 16:29
For example, if the camera has a 90° horizontal FoV and the resolution is 640 pixels horizontally, then a simple equation would be

angle = (goalx - 640/2) * 90/640

This can be a decent enough approximation, but there is a more correct way to do this conversion:


horizontal_angle_to_goal = atan((goal_x - center_x) / focal_length_pixels)

where:
focal_length_pixels =
.5 * image_width_pixels / tan(horizontal_field_of_view / 2)


The idea of a focal length is a little unintuitive at first, but is explained here: https://en.wikipedia.org/wiki/Angle_of_view

Typically, unless you calibrated your camera to compensate for manufacturing imperfections (total overkill for FRC):

center_x = (image_width_pixels / 2 - .5)


The -.5 compensates for the fact that if there are an even number of columns/rows in your image, the center is actually on the border between two of them (and we start counting rows/cols from 0 typically).

Note that these equations do give slightly different answers! (See attached image...red is the correct equation, blue is the approximate linear one)

Also note that this angle is relative to the camera...you need to whip out some more trig depending on the angle of the camera mount relative to the robot.
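
In code, using the equations above (same 90°/640 example for a sanity check):

// Pinhole-model pixel-to-angle conversion, per the equations above.
public static double horizontalAngleToGoal(double goalX, double imageWidthPx,
                                           double hFovRadians) {
    double focalLengthPx = 0.5 * imageWidthPx / Math.tan(hFovRadians / 2.0);
    double centerX = imageWidthPx / 2.0 - 0.5; // pixel indices start at 0
    return Math.atan((goalX - centerX) / focalLengthPx); // radians, relative to the camera
}
// Sanity check: with a 90 deg FOV and 640 px width, the right edge of the
// image (goalX = 639.5) comes out to 45 deg, i.e. half the FOV.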

Al Skierkiewicz
11-04-2016, 16:49
You need to know the specifics of the camera and lens you are using to be truly accurate. If you know the focal length of the lens and the size of the pickup device, you can use trig to determine the angle of the field of view. Once you have that, you need to know the distance to the target, or you can back into the distance by calculating a known target size as a percentage of the field of view. This might be a frustrating exercise, since the pickup and focal length of these lenses are so small; that allows a lot of error to creep into the calculation.

Jared Russell
11-04-2016, 16:55
If you know the focal length of the lens and the size of the pickup device, you can use trig to determine the angle of the field of view. Once you have that, you need to know the distance to the target, or you can back into the distance by calculating a known target size as a percentage of the field of view. This might be a frustrating exercise, since the pickup and focal length of these lenses are so small; that allows a lot of error to creep into the calculation.

There is a reason why our vision system this season uses a device that provides APIs to fetch pixel array size and focal length information from per-device factory calibration :)

TheOtherGuy
11-04-2016, 17:58
This can be a decent enough approximation, but there is a more correct way to do this conversion:


horizontal_angle_to_goal = atan((goal_x - center_x) / focal_length_pixels)

where:
focal_length_pixels =
.5 * image_width_pixels / tan(horizontal_field_of_view / 2)



Are you doing the linear approximation or using the focal length in real life? I figure that since atan approximates a line when the camera is nearly aligned with the goal, the linear approximation would work more or less identically at any moderate framerate. This is certainly useful if the camera isn't aligned with the robot, though!
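
As a quick check of how close they stay, here are both formulas at a few offsets for the 90°/640 example; they drift apart mid-frame, though re-measuring every frame while closing the loop on a gyro would wash that out:

// Linear map vs. atan for a 90 deg FOV, 640 px wide image (f = 320 px).
public static void main(String[] args) {
    double f = 320.0; // 0.5 * 640 / tan(45 deg)
    for (double offsetPx : new double[] {40, 80, 160, 320}) {
        double linearDeg = offsetPx * 90.0 / 640.0;
        double exactDeg = Math.toDegrees(Math.atan(offsetPx / f));
        System.out.printf("%4.0f px: linear %5.2f deg, atan %5.2f deg%n",
                          offsetPx, linearDeg, exactDeg);
    }
    // 40 px: linear 5.62 deg, atan 7.13 deg -- they only match at 0 and the edge
}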

There is a reason why our vision system this season uses a device that provides APIs to fetch pixel array size and focal length information from per-device factory calibration :)

What's your setup for both the camera and processor?

Hitchhiker 42
11-04-2016, 21:37
Here is some LabVIEW code that tackles this problem...

The way the VI works is that it takes your current gyro heading and does some trigonometry to find the needed gyro heading and the current distance from the goal (which might be helpful if you have a certain distance range you can shoot from).

Make sure to set the constants in the code (I've mostly commented it, but the important ones are the goal target width in feet, the camera's horizontal angle of view in degrees, and the image width in pixels).
Disregard the Disabled code at the bottom.
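
For anyone not on LabVIEW, the distance piece of that trig can be written out roughly like this (a sketch of the idea, not the attached VI):

// Distance from apparent target width (sketch; the attached VI may differ).
// Similar triangles: targetWidthFeet / distance == targetWidthPx / focalLengthPx.
public static double distanceToGoalFeet(double targetWidthFeet, double targetWidthPx,
                                        double imageWidthPx, double hFovRadians) {
    double focalLengthPx = 0.5 * imageWidthPx / Math.tan(hFovRadians / 2.0);
    return targetWidthFeet * focalLengthPx / targetWidthPx;
}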

Hope this is helpful, feel free to PM me with any questions.

Maxwellfire
11-04-2016, 22:53
I believe they are using a Nexus 5 with on-board vision processing :D

GeeTwo
13-04-2016, 22:55
If you've found the answer above, please disregard, but here is an approach that has worked well for us:

Position the robot roughly aligned with the goal (that is, eyeball it).

Do a test launch. Note where the ball ends up, in terms of inches/feet to the left or right of the goal.

Rotate the robot a known amount (probably measured in terms of encoder counts on the left and right drive systems).

Do another test launch. Note where the ball ends up (same criteria).

Based on the two measurements above, calculate a "target point" that will result in a goal, and a "proportionality constant" to get there if the robot is pointed somewhere else.
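
In code form, that last step might look like this (names are made up; rotations in encoder counts, misses in signed inches):

// Two-shot calibration: fit a line through (rotation, miss) and solve for
// the rotation that zeroes the miss. Assumes miss is linear in rotation.
public static double[] calibrate(double rot1, double miss1,
                                 double rot2, double miss2) {
    double countsPerInch = (rot2 - rot1) / (miss2 - miss1); // proportionality constant
    double aimRotation = rot1 - miss1 * countsPerInch;      // the "target point"
    return new double[] {aimRotation, countsPerInch};
}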

pipsqueaker
19-04-2016, 16:07
Also note that this angle is relative to the camera...you need to whip out some more trig depending on the angle of the camera mount relative to the robot.

Could you elaborate more on what trig you'd need to use? I've been wondering about this, and it seems to me that the angle the camera is mounted at should only affect the y coordinate; since the angle calculation only takes x coordinates, the mount angle shouldn't affect the result.

Jared Russell
19-04-2016, 16:47
Could you elaborate more on what trig you'd need to use? I've been wondering about this, and it seems to me that the angle the camera is mounted at should only affect the y coordinate; since the angle calculation only takes x coordinates, the mount angle shouldn't affect the result.

I will try to post a more detailed explanation tonight, but for now: it affects both coordinates.

Imagine a camera that is looking straight up. What does the x coordinate mean with respect to the robot? What does the y coordinate mean?
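
For now, a sketch of the correction: treat the pixel as a ray in camera coordinates, and rotate that ray by the mount pitch before taking the angle (names are illustrative):

// Robot-relative yaw to the target for a camera pitched up by mountPitchRad.
// Camera coordinates: x right, y up, z out along the optical axis.
public static double robotYawToTarget(double px, double py, double cx, double cy,
                                      double focalLengthPx, double mountPitchRad) {
    double x = (px - cx) / focalLengthPx;
    double y = (cy - py) / focalLengthPx; // image rows grow downward
    double z = 1.0;
    // Rotate the ray about the camera's x axis into the robot frame.
    double zRobot = -y * Math.sin(mountPitchRad) + z * Math.cos(mountPitchRad);
    // zRobot depends on y, so the yaw depends on BOTH pixel coordinates.
    return Math.atan2(x, zRobot);
}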

Camilo86
19-04-2016, 22:53
We actually took the time to do the trig for the offset. You can take a look at our function at https://github.com/FRC125/NU16/blob/boston/src/com/nutrons/stronghold/AngleCalculator.java#L31; you will just need to know the FoV, the dimensions of the image, the height of the camera, the x/y offset of the camera in inches, and the angle of the camera.

Your offset basically changes based on the distance to the target. You use the y coordinate of the target to calculate this distance.
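
A simplified sketch of that distance-from-y idea (not the linked function itself, which also folds in the camera offset):

// Ground distance from the target's y pixel, camera pitch, and known heights
// (illustrative names, assuming the camera is pitched up toward a higher target).
public static double groundDistance(double targetY, double centerY, double focalLengthPx,
                                    double cameraPitchRad,
                                    double cameraHeight, double targetHeight) {
    // Vertical angle from the optical axis up to the target.
    double pixelPitch = Math.atan((centerY - targetY) / focalLengthPx);
    return (targetHeight - cameraHeight) / Math.tan(cameraPitchRad + pixelPitch);
}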

cjl2625
21-04-2016, 20:04
Basically it is giving out smaller angles than it should right now. Would the camera being offset ~2 inches from our center of rotation be enough to cause these issues? :confused:

Our camera is offset ~5" from the shooter wheel and we don't have that problem. If you're using a calculation that uses the camera's field of view, try increasing the value for the field of view until you get angles that look better. (that's what I did, anyway)

The horizontal field of view that you'd find on the camera's data sheet doesn't always agree perfectly with the observed results; it might take a bit of experimental tweaking.
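
If you'd rather measure than tweak, one known angle/pixel pair is enough to back out an effective FoV (a sketch, ignoring lens distortion):

// Back-solve the effective horizontal FOV from one measurement: place the
// target at a known angle off the optical axis and read its pixel offset.
public static double effectiveHFovRadians(double knownAngleRad,
                                          double pixelOffsetFromCenter,
                                          double imageWidthPx) {
    double focalLengthPx = pixelOffsetFromCenter / Math.tan(knownAngleRad);
    return 2.0 * Math.atan(0.5 * imageWidthPx / focalLengthPx);
}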

Rflax40
21-04-2016, 20:08
I will try to post a more detailed explanation tonight, but for now: it affects both coordinates.

Imagine a camera that is looking straight up. What does the x coordinate mean with respect to the robot? What does the y coordinate mean?

I'm not sure this holds true when working with a "flat" image; we just tried rotating the camera and it did not change the x value of the target.