We have our camera working. It tracks well, ignores the other lights, etc.
However, it seems to have a severe offset to one side. Even when close to the light (4-5 feet) it insists on pointing toward the left edge of the lightbox.
I modified the pixel-to-track value in Kevin's code from the default of 79 to a new value of 112, which centers the camera on the box. That's a pretty huge offset, though.
We have three working lights (with diffusers, etc.), and it does it on all three. We've turned off the overhead lights to rule out interference and still see the same behavior.
I don't get it; the camera tracking should be independent of everything else. We tried both our '07 and our '06 cameras.
Has anyone else seen this issue where the camera wants to look just to the left of the light?
We set up the light square and level with the robot and camera.
We then moved the light, watching the pan/tilt readings, until both angles read zero.
We then adjusted the servo horns to align the camera squarely.
We did find that once we did all of that it was off by only 1 degree.
Phalanx - I understand what you’re doing - mechanically aligning the camera to get your servo centers correct.
That is not our issue. The relative servo values have no effect on how the camera tracking software works. The camera, for whatever reason, is intent on focusing on one side of the light when tracking.
We don't currently have the knowledge to get the color tracking software working, but we have let the camera focus on the light and then grabbed a snapshot. In every case the light was over to one side, until we adjusted the offset.
This doesn't make sense. With the base camera code, the camera should be placing the light near the center of the camera lens / center pixel. Neither our new nor our old camera does so with the default camera code.
The optics on the camera assembly are obviously not completely precise. The image on the sensor isn’t necessarily centered when the lens seems to be pointing directly at the target.
Just pay attention to the values you’re getting from the camera, and find a way to relate the servo values to what you want the robot to do. Don’t worry about where the camera looks like it’s pointing.
Alan hit the nail on the head. This is a "problem" many teams addressed last year as well.
Simply put, the mechanical alignment of the optics to the CMOS sensor is not perfect. In addition, the optical components used have manufacturing tolerances that can add to the "misalignment" as well. Just follow the advice Alan gave and you will be in good shape. What you need to be confident in is the camera's ability to reproduce its positioning each time. If it does that, then you will be able to get accurate tracking information from it.
We had this problem too. It happened every time our robot was close enough that it thought it was seeing two lights (because the value we had set for two lights wasn't changed when we brought the light down for easier testing). I don't know why it did it, and am currently trying to figure it out for myself, but when I increased the maximum size for one light, the massive shifting to one corner magically stopped.
EDIT: My problem was because my fellow programmer made a change the build before, while I was gone (and forgot to tell me). He found that, due to the position of our camera on the robot, it would lose one light or the other when coming within scoring distance of a spider between two lights. His solution was to focus primarily on one light to avoid losing both (which happened one time in every ten), which is why in my case it only happened when the light size was greater than one. So in the end our identical problems were not so identical. He just told me now, on MSN.
Hooray for communication, and a night spent in confusion.
Yep, this is the case. My code has the ability to track on any pixel, not just the center one. Just allow the camera to find the light, then enter the "Interactive PWM Adjustment Menu" and step the servos until it is aligned. Now unplug your servos from the RC and enter 'p' and then 'x' so that camera data will start to stream to the terminal screen again. If you managed to do this without disturbing the camera, the error values on the screen tell you where the optically-centered pixel is located relative to the center pixel in the imager array. Use those offset values to calculate the new tilt and/or pan target pixels, then enter them into tracking.h or use the interactive tracking menu to set the new values.