02-04-2016, 20:26
Oblarg
Registered User
AKA: Eli Barnett
FRC #0449 (The Blair Robot Project)
Team Role: Mentor
 
Join Date: Mar 2009
Rookie Year: 2008
Location: Philadelphia, PA
Posts: 1,050
Re: How can angle offset of a shooter be calculated from pixel offset?

Quote:
Originally Posted by simon-andrews View Post
No. That function isn't what I need because it doesn't return an angle that can be plugged into the turntable. The program I'm supposed to write should be able to look at an image and figure out "Alright, turntable. You need to turn 45 degrees if you want to be on-target."

If we look at this drawing from an earlier post, both lines (which represent pixel offsets) are 100px long. With the function from before, both would return the same angle offset since the function doesn't account for distance away from the target.
You're slightly confused.

Distance from the target doesn't matter; that's precisely why you measure in angle (a dimensionless quantity) rather than in distance.

Imagine the camera at the origin, pointing in the +y direction, and place your target off to the side somewhere. How far off-center the target appears in pixels is determined by the angle between the y-axis and the line from the origin to the target (for an ideal pinhole camera, the pixel offset is proportional to the tangent of that angle), not by the target's distance from the y-axis. To see this, place one target at (1, 1) and another at (1, 2). Both are the same distance from the y-axis, which is the centerline of the camera's FOV, but they will clearly appear at different distances from center when viewed from the camera, because the camera is a point at the origin and only sees along rays that pass through it.
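A quick sketch of that two-target example in plain Python (nothing FRC-specific, just the geometry):

```python
import math

def angle_from_centerline(x, y):
    """Angle in degrees between the +y axis (the camera's centerline)
    and the ray from the origin (the camera) to the point (x, y)."""
    return math.degrees(math.atan2(x, y))

# Both targets sit 1 unit from the centerline, but at different ranges.
print(angle_from_centerline(1.0, 1.0))  # 45.0 degrees off-center
print(angle_from_centerline(1.0, 2.0))  # ~26.57 degrees off-center
```

Same lateral distance, very different apparent (angular) offsets, which is exactly why the camera sees them at different pixel positions.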

For an even better demonstration: Cover one of your eyes, align your other (open) eye with the center of your computer screen. Place your finger between your face and your computer screen, so that it overlaps with the edge of the screen in your vision. Notice that your finger is closer to the centerline between your eye and the center of the screen than the edge of the monitor is, despite the fact that they appear the same "distance" away in the 2d image that you see.

The fundamental confusion here is that you're thinking in terms of rays parallel to the center-line of the camera. But that's incorrect, because those rays never reach the camera. Think of the camera as a point, not as a screen. All the rays that the camera sees must pass through the point. Thinking this way, it is clear that "distance" in the image the camera sees corresponds to *angular* distance. To preserve apparent "distance" between two objects from the point of view of the camera, you must preserve their relative angular positions.
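To make this concrete, here is a minimal sketch of the pixel-to-angle conversion under the pinhole-camera assumption (no lens distortion). The function name and parameters here are made up for illustration; you'd plug in your own camera's resolution and field of view:

```python
import math

def pixel_offset_to_angle(px_offset, image_width_px, horizontal_fov_deg):
    """Convert a horizontal pixel offset from image center into an angle
    (degrees) off the camera's centerline, assuming an ideal pinhole camera.

    The focal length in pixels follows from the horizontal field of view:
    tan(hfov / 2) = (image_width / 2) / focal_length.
    """
    focal_px = (image_width_px / 2) / math.tan(math.radians(horizontal_fov_deg) / 2)
    return math.degrees(math.atan(px_offset / focal_px))

# Example: a 640 px wide image with a 60 degree horizontal FOV.
# A target 100 px right of center is ~10.2 degrees off the centerline,
# regardless of how far away the target physically is.
print(pixel_offset_to_angle(100, 640, 60))
print(pixel_offset_to_angle(-100, 640, 60))  # same angle, opposite direction
```

Note that the result is *not* simply `(px_offset / image_width) * hfov` (a straight linear scaling would give 9.375 degrees here); the tangent matters more as targets get farther from the image center.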
__________________
"Mmmmm, chain grease and aluminum shavings..."
"The breakfast of champions!"

Member, FRC Team 449: 2007-2010
Drive Mechanics Lead, FRC Team 449: 2009-2010
Alumnus/Technical Mentor, FRC Team 449: 2010-Present
Lead Technical Mentor, FRC Team 4464: 2012-2015
Technical Mentor, FRC Team 5830: 2015-2016

Last edited by Oblarg : 02-04-2016 at 20:51.