#4 — Today, 09:50
Greg McKaskle
Registered User
FRC #2468 (Team NI & Appreciate)
 
Join Date: Apr 2008
Rookie Year: 2008
Location: Austin, TX
Posts: 4,770
Re: Distance Measurements to Target

The distance estimate based on the optics of the camera is only valid if the camera's image plane is parallel to the plane of the object you are measuring. When you tilt the camera up or down, you introduce a perspective distortion that depends on the Y value where the size was measured.
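For reference, here is a minimal sketch of the flat-on pinhole model the post is describing, where distance follows from apparent pixel size. This is an illustration in Python, not the LabVIEW example's code; the camera numbers (640 px wide, 47° horizontal FOV) are assumptions for the example, not from the post.

```python
import math

def distance_from_width(target_width_m, width_px, image_width_px, horizontal_fov_deg):
    """Estimate range to a target whose plane is parallel to the image plane."""
    # Focal length in pixels, derived from the horizontal field of view.
    f_px = (image_width_px / 2) / math.tan(math.radians(horizontal_fov_deg / 2))
    # Similar triangles: target_width_m / distance = width_px / f_px
    return target_width_m * f_px / width_px

# Example: a 0.5 m wide target spanning 100 px in a 640 px image with a 47 deg FOV.
print(round(distance_from_width(0.5, 100, 640, 47.0), 2))
```

This is exactly the estimate that goes wrong once the camera tilts: the measured `width_px` shrinks with perspective, so the computed distance grows even though the target hasn't moved.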

This is a somewhat new wrinkle to FRC vision, as we have short robots and tall targets this year.

It is possible to calibrate this out of the system; that is why the LabVIEW example referenced a calibration file and included a step to load it, so that NIImaq could correct the distance with that information. But there are other approaches that will work as well.

Essentially, a 10 px extent measured at the bottom of the camera image, in the middle, and at the top will not correspond to the same real-world distance. I'd suggest making some empirical measurements and seeing how you can normalize for this. Alternatively, look at the perspective distortion transform and figure out how to invert it (you may not want to apply this to every pixel, however), or use a training utility.
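The empirical route above might look like this sketch: measure the pixel width of a known target at several image heights, fit a y-dependent scale factor, and divide it out of future measurements. The sample data points are made up for illustration; you would collect your own at a fixed, known distance.

```python
import numpy as np

# (y_pixel, measured_px_width) pairs collected at one fixed real distance.
# With a tilted camera, apparent width varies with the row where you measure.
samples = [(40, 88.0), (120, 94.0), (240, 100.0), (360, 106.0), (440, 111.0)]
ys = np.array([s[0] for s in samples], dtype=float)
widths = np.array([s[1] for s in samples])

# Normalize against the image-center sample (y = 240 here), then fit scale(y).
scale = widths / widths[ys == 240][0]
coeffs = np.polyfit(ys, scale, 1)  # a linear fit is often enough for small tilts

def normalized_width(width_px, y_px):
    """Undo the y-dependent perspective scaling before estimating distance."""
    return width_px / np.polyval(coeffs, y_px)

# A 90 px measurement near the top of the image normalizes back toward
# the value it would have had at image center.
print(round(normalized_width(90.0, 60), 1))
```

Whether a linear fit suffices depends on your tilt angle; repeat the measurements at a second distance to check that one correction curve holds.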

With any of these approaches, it is key to keep the camera height and angle consistent, or to come up with a way to compensate based on some field element that is also in the image.
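If the mount really is fixed, there is also a purely geometric compensation: with known camera height, target height, and tilt angle, range follows from the target's vertical position in the image alone. A hedged sketch, with all the heights, angles, and FOV numbers assumed for illustration:

```python
import math

CAMERA_HEIGHT_M = 0.5      # lens height above the floor (assumed)
TARGET_HEIGHT_M = 2.5      # target center height above the floor (assumed)
MOUNT_ANGLE_DEG = 20.0     # fixed upward tilt of the camera (assumed)
VERT_FOV_DEG = 36.0        # vertical field of view (assumed)
IMAGE_HEIGHT_PX = 480

def distance_from_y(target_y_px):
    """Ground distance to the target from its pixel row (0 = top of image)."""
    # Angle of the target above the optical axis; this linear pixels-to-degrees
    # mapping is an approximation (an atan over focal length is more exact).
    offset_px = (IMAGE_HEIGHT_PX / 2) - target_y_px
    pitch = math.radians(offset_px / IMAGE_HEIGHT_PX * VERT_FOV_DEG)
    total = math.radians(MOUNT_ANGLE_DEG) + pitch
    return (TARGET_HEIGHT_M - CAMERA_HEIGHT_M) / math.tan(total)

print(round(distance_from_y(120), 2))
```

This sidesteps the size measurement entirely, which is why keeping the height and angle consistent matters so much: both are baked into the formula.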

Greg McKaskle