target detection drops off with distance

Hello all,

In working with the ellipse/target detection code, we’ve discovered an interesting issue: if the camera is positioned within about 12 feet of the vision target, it gets a very stable lock. However, if the camera is moved back, target detection becomes very erratic. We’re currently using very loose parameters, and the code is detecting ellipses smaller than those on the target. It also sometimes detects the two concentric ellipses on the target but refuses to recognize them as a target (it just sees two circles). Does anyone have any idea why this could occur, or what parameters we could change?
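For reference, the kind of size filter and concentric-pairing check being described might look roughly like the sketch below. This is illustrative Python, not the actual tracking code; the Ellipse type, the minimum-size cutoff, and the pairing tolerances are all made-up placeholders.

from dataclasses import dataclass
import math

@dataclass
class Ellipse:
    cx: float     # center x (pixels)
    cy: float     # center y (pixels)
    major: float  # major-axis length (pixels)
    minor: float  # minor-axis length (pixels)

# Made-up tuning values, for illustration only.
MIN_MINOR_AXIS = 20.0     # reject detections smaller than the target should appear
CENTER_TOLERANCE = 5.0    # max center offset (pixels) to call two ellipses concentric
RING_RATIO = 0.75         # assumed inner/outer diameter ratio of the target rings
RATIO_TOLERANCE = 0.15

def find_target(ellipses):
    """Return an (outer, inner) pair of concentric ellipses, or None if no pair qualifies."""
    candidates = [e for e in ellipses if e.minor >= MIN_MINOR_AXIS]
    for outer in candidates:
        for inner in candidates:
            if inner is outer or inner.major >= outer.major:
                continue
            offset = math.hypot(outer.cx - inner.cx, outer.cy - inner.cy)
            ratio = inner.major / outer.major
            if offset <= CENTER_TOLERANCE and abs(ratio - RING_RATIO) <= RATIO_TOLERANCE:
                return outer, inner
    return None

Note that fixed pixel cutoffs like MIN_MINOR_AXIS above are exactly the kind of parameter that stops matching as the target shrinks with distance.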

How is your lighting? That is a common culprit in vision target operation.

The lighting is standard math-classroom fluorescent. How would unchanging lighting cause different results at different distances, though?

As you back off from the target, do any of the lights in the room come into the camera’s field of view?

Do you have auto-exposure and auto-white-balance enabled on the camera?

I uploaded two screenshots in a different thread showing the lighting differences. In one, we are far enough away that the classroom lights are in the field of view. In the other, we are much closer and the classroom lights are no longer in the field of view. When auto-exposure and auto-white-balance are enabled, you’ll see a big difference in “lighting” on the camera.
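To see why that matters for a threshold-based detector: the threshold is applied to pixel values, and auto-exposure shifts those values whenever bright lights enter or leave the frame. A toy illustration in Python, with invented numbers rather than real camera output:

# Same physical target pixel under two auto-exposure states. Values are invented.
THRESHOLD = 200            # fixed brightness cutoff used by the detector

target_pixel_lights_out_of_frame = 235   # exposure raised, target reads bright
target_pixel_lights_in_frame = 170       # exposure lowered to protect the highlights

print(target_pixel_lights_out_of_frame >= THRESHOLD)  # True: target pixels pass
print(target_pixel_lights_in_frame >= THRESHOLD)      # False: the same target now fails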

Our exposure is set to HOLD at a fairly high value (the idea being to cut down on noise). Should we adjust the white balance? I was under the impression that white balance shouldn’t matter, since the circle detection is based on a threshold.

At our practice field, the target was not under a light, and half of the lights are on there. The robot could see it close up, but when we moved farther away it could not. We moved the target under a light so the fluorescents shined on it, and then the robot could see it. In other instances, the target can get washed out. Last year there were a few events at which camera tracking was impossible because of lights shining on the field, or not enough light.

There is a white paper on ni.com/first that covers some issues that affect vision. The best way to get input on why the vision is acting the way it is would be to attach some images, describe your camera settings, and ask if anyone has suggestions. There are many, many things that can go wrong, but almost all of them can be fixed.

Greg McKaskle