View Full Version : Measuring distance with camera


XXShadowXX
04-01-2009, 17:12
If the camera on your robot measures that the height of a marker is " p " pixels at a distance of " c ", and after you move the marker appears " p' " pixels tall, then the distance to the object would be

d = c + [(p - p') * sec(1.79)]
where 1.79 is in degrees, rounded (1 degree, 47 minutes, 23.68 seconds rounds to 1 degree, 47 minutes, 24 seconds)
right?
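For reference, a simple pinhole-camera model predicts an inverse relation between apparent size and distance rather than a linear correction like the one above. A minimal sketch (all numbers hypothetical, lens distortion ignored):

```python
def distance_from_pixel_height(c, p, p_prime):
    """Pinhole-camera estimate: apparent height scales as 1/distance,
    so a marker that was p pixels tall at distance c appears p' pixels
    tall at distance d = c * p / p' (ignoring lens distortion)."""
    return c * p / p_prime

# Marker measured 100 px tall at 2.0 m; it now appears 50 px tall.
d = distance_from_pixel_height(c=2.0, p=100, p_prime=50)
# Halving the apparent height doubles the distance: d == 4.0
```

This is only the idealized geometry; as the replies below note, real lenses distort the image, so a measured calibration curve is more trustworthy than the bare formula.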

Luke Pike
04-01-2009, 18:48
I wasn't going to try to determine the distance with the camera, instead I was going to use an ultrasonic sensor pointed in the same direction as the camera. I don't think the difference in the height would be great enough, and you really need an accurate measure of distance in order to shoot a ball at it.

GaryVoshol
04-01-2009, 19:11
The vision target on the trailer is at a fixed height. Your camera will be mounted on your robot at a fixed height; make it lower than the vision target. If you measure the angle (above horizontal) that your camera is at to point at the target, you can use trig to figure out how far away you are from the target.
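The trig approach described above reduces to one right triangle: the known height difference is the opposite side, and the camera's tilt angle gives the rest. A minimal sketch (heights and angle are hypothetical example values):

```python
import math

def distance_from_tilt(target_height_m, camera_height_m, tilt_deg):
    """Horizontal distance to a target at a known height, given the
    camera's elevation angle above horizontal when it is centered on
    the target. Requires camera mounted below the target."""
    rise = target_height_m - camera_height_m
    return rise / math.tan(math.radians(tilt_deg))

# Example: target center 1.0 m up, camera 0.5 m up, tilted 10 degrees.
d = distance_from_tilt(target_height_m=1.0, camera_height_m=0.5, tilt_deg=10.0)
```

The accuracy hinges on how precisely the pan-tilt mechanism reports its angle; small angle errors grow into large distance errors at long range, since tan is nearly flat near horizontal.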

Adam Y.
04-01-2009, 23:23
If the camera on your robot measures that the height of a marker is " p " pixels at a distance of " c ", and after you move the marker appears " p' " pixels tall, then the distance to the object would be

d = c + [(p - p') * sec(1.79)]
where 1.79 is in degrees, rounded (1 degree, 47 minutes, 23.68 seconds rounds to 1 degree, 47 minutes, 24 seconds)
right?
No. It's surprisingly more complicated than that. The main reason is that you have to account for the distortions caused by the camera. I actually performed a calibration for my optics project using a $200 camera. The distortion in some areas of the image was as high as 20 pixels.

geeknerd99
05-01-2009, 01:04
No. It's surprisingly more complicated than that. The main reason is that you have to account for the distortions caused by the camera. I actually performed a calibration for my optics project using a $200 camera. The distortion in some areas of the image was as high as 20 pixels.

Photographers and camera geeks make a huge deal about lens distortions.

Why not simply use trig and a pan-tilt servo deal so you can read the elevation to the target?

nitsua60
05-01-2009, 01:22
No. It's surprisingly more complicated than that. The main reason is that you have to account for the distortions caused by the camera. I actually performed a calibration for my optics project using a $200 camera. The distortion in some areas of the image was as high as 20 pixels.

There's the magic word. Given the uncertain inputs to the analytical solution (unless you want to take apart that camera and do some serious optics testing) why not just image the vision target at a range of distances and generate an image height vs. distance curve? If that goes by too quickly, do it repeatedly in different areas of the image to correct for the aberrations.
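The empirical approach suggested above can be sketched in a few lines: record (pixel height, measured distance) pairs, then fit a curve. Since a pinhole model predicts distance proportional to 1/height, fitting a line against 1/p is a reasonable starting form (the sample data below is hypothetical):

```python
# Hypothetical calibration data: (pixel height, measured distance in meters)
samples = [(200, 1.0), (100, 2.0), (50, 4.0), (40, 5.0)]

# Fit d ~ k * (1/p) + b by ordinary least squares on x = 1/p.
xs = [1.0 / p for p, _ in samples]
ds = [d for _, d in samples]
n = len(samples)
mean_x = sum(xs) / n
mean_d = sum(ds) / n
k = sum((x - mean_x) * (d - mean_d) for x, d in zip(xs, ds)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_d - k * mean_x

def distance(p_pixels):
    """Estimate distance from the fitted height-vs-distance curve."""
    return k / p_pixels + b
```

Repeating the measurements in different regions of the frame, as suggested, would let you build separate curves (or a correction table) to soak up the lens aberrations without ever modeling the optics directly.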

Adam Y.
05-01-2009, 07:24
There's the magic word. Given the uncertain inputs to the analytical solution (unless you want to take apart that camera and do some serious optics testing) why not just image the vision target at a range of distances and generate an image height vs. distance curve? If that goes by too quickly, do it repeatedly in different areas of the image to correct for the aberrations.
That is pretty much the idea. Unfortunately, I don't know which LabVIEW distribution you have, so it may or may not have the capability to do the calibration.
Here is a URL for some concepts about camera processing. (http://digital.ni.com/manuals.nsf/websearch/34548BDDD48DF68B86256F81005B94F8)

XXShadowXX
05-01-2009, 08:08
The vision target on the trailer is at a fixed height. Your camera will be mounted on your robot at a fixed height; make it lower than the vision target. If you measure the angle (above horizontal) that your camera is at to point at the target, you can use trig to figure out how far away you are from the target.

That sounds by far like the simplest solution. Of course, the object will still get smaller the further you are from it, so it will have some margin of error; it will still need some optic curve...

pgaston
05-01-2009, 08:26
And what about the mass of the image you've found? i.e., the number of pixels inside the image blob defined by your color parameters.

That *should* vary enough to give you some measure as to distance - though experimentation is obviously next. Perhaps this could be combined with the trig calculation?

XXShadowXX
05-01-2009, 08:28
See the first post; that's what I was doing. But as I understand it, the rate at which objects change apparent size varies with distance, meaning you have to use a curve, and not only that, but you also need to account for optical deformations in the lens.

elfinn
05-01-2009, 14:44
Size of the target is an easily obtained value and would be a more accurate measure of distance than the mass. The mass may be misleading if the particle returned has holes or is truncated by glare or bad parameters. But the height and width of the particle vary inversely with distance in a predictable way.
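The robustness argument above can be made concrete: height scales as 1/d while pixel mass (area) scales as 1/d², so any pixels lost to glare skew an area-based estimate roughly twice as hard as a height-based one. A sketch with hypothetical blob statistics:

```python
import math

def dist_from_height(ref_height_px, ref_dist, height_px):
    """Height scales as 1/d, so d = ref_dist * ref_height / height."""
    return ref_dist * ref_height_px / height_px

def dist_from_mass(ref_mass_px, ref_dist, mass_px):
    """Area (mass) scales as 1/d^2, so d = ref_dist * sqrt(ref_mass / mass)."""
    return ref_dist * math.sqrt(ref_mass_px / mass_px)

# Target calibrated at 2.0 m: 100 px tall, 5000 px of mass.
# Glare knocks out 20% of the mass but barely touches the bounding height.
h_est = dist_from_height(100, 2.0, 98)    # stays close to 2.0 m
m_est = dist_from_mass(5000, 2.0, 4000)   # overestimates noticeably
```

The height-based estimate drifts by a couple of percent while the mass-based one is off by over ten percent, which matches the point that holes and truncation hurt mass far more than bounding size.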