Distance to travel once targeted?

Okay, so we have the camera tracking the green light and centering the robot onto it. But can anyone drop some hints on figuring out how far forward to go?

I have read that some people go forward depending on the size of the bounding box returned by the tracking data, but what other options are there? How are you implementing this?

See this document drawn up by Kevin Watson… it may be very helpful.


As you move towards the target the tilt on the camera mount will go up.

You could always drive the robot to where you intend to score from, see what the tilt is, then set up your autonomous to drive forward until the camera mount gets to that tilt angle.

oh of course… I knew this! Thank you.

Is anyone nice enough to tell me the blob size from the edge of the home zone to the target, with the camera about 1 meter above the ground (or any other given height)?

You can calculate the distance using Kevin’s formula. Then using encoders or gear tooth sensors you can create functions to go the wanted distance.

No, that’s fine; I know how to do that, but I was wondering if anyone can tell me the light’s “box” size from that distance.

There is no way to compute the blob size in advance. It depends on how far the target light is from the edge of the home zone, at what angle the camera is seeing the target light, how well the camera focus is adjusted, the details of lens and sensor geometry on the camera itself, and the distribution of brightness along the white diffuser panel.

The easiest answer is to put a camera where you want and see what it tells you the blob size is. You’ll notice that the size changes based on exactly where the rack is and in which direction it is facing.

As I presumed, we’ll have to hang the light at the height and distance from where we position the robot.

Is the range in meters, or something else?

And is the angle in degrees or radians? I remember that the tan() function in math.h takes radians… so we have to convert from degrees to radians.

Wasn’t there a certain sensor that worked with radians too?

Yes, the tan() function in <math.h> takes its angle argument in radians, so be sure to perform the conversion, since Kevin’s software calculates the tilt angle in degrees.

The units for distance will be whatever units you specified the light height and camera height with. In the .pdf document above, the 116 is inches, so be sure you calculate h (the height of your camera off the floor) in inches, and then the result will be in inches.

Good luck,

Could anyone give me the equation/parameter to convert degrees to radians?

radians == (degrees*PI)/180

(multiply by 3.1415926… and then divide by 180)

Thank you! :smiley:

For one reason or another, the distance we were getting using the tangent was close (within a couple of feet) but not nearly accurate enough. This could be due to significant-figure loss in several of the variables (we tried multiple times to make it as accurate as possible).

Does anyone have any say on this?

For now we are using the tilt angle as the indicator of how close we are to the actual light.

One suggestion would be to mount the camera as low as possible. Also, have a look at this posting for a suggestion.


Thanks Kevin. This looks great. I’d like to try making similar adjustments to our camera. However, I’m not sure about the terms you are using and how they apply to controlling the pwm outputs. Is there a document that describes the function of pwm outputs (i.e., gain, pulse width, etc.) and how servo motors respond to them? I’d like to get a better picture of how the gain of 70 and the calculated pulse width of 2.389 combine to control the camera tilt to reach 90 degrees. Thanks again.

Here’s a quick introduction to servos. The center and gain values are in units of 100 ns, so a gain of 70 means the pulse width will increase (or decrease) by 7.0 us for each count above (or below) 127. The absolute pulse width at a command of 127 is defined by the center value, which is nominally 15000, or 1.5 ms.