team colors and misguided cameras

I heard someone joke about this in a recent team meeting when we were testing out the camera:

Since the camera is meant to track a fluorescent pink-and-green target, and since there are many teams out there whose team colors are very similar (and who also tend to paint their robots completely in that color), isn’t it likely that those hot-colored teams would have an advantage at the competition, since they might distract a camera that is trying to find that color?

Maybe we should all just paint our robots fluorescent pink this year, just to level the playing field (sorry, 233)… on the other hand, I don’t think that would match our robot. It might be better for us to paint ours fluorescent green.

We did a little test today (our team color is pink) and found that while our fleeces don’t attract our camera, our t-shirts do… :yikes:

I guess green and pink teams shouldn’t sit together

In my opinion those teams won’t really have an advantage at competition, because the camera should be looking for both the green and the pink together. It’s funny, though, because we were having this same discussion today, since our main team color is pink. We did a test with the camera and found that our team shirts would be key targets… hopefully people’s robots are programmed correctly, so if they have a shooter, they won’t shoot at those in brighter-colored shirts (or at their robots, haha) :slight_smile:

Katy…read the post above u…ha ha ha lol

we both WOULD respond to this:D

haha yes, Kate and I had a grand old time playing with the camera today and imagining our robot attacking our mentor Rachel because her pink shirt was too bright… just like our team shirts. oh deary :yikes:

The camera can be programmed (I believe, from reading the manual for the vision package a few years ago) to track not just a specific color, but a specific shape and pattern. In Aim High, the lower goals were scored by this software counting the number of ball shadows that passed under the camera image. :smiley:
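That shadow-counting idea can be sketched very simply: watch the brightness of one scan row of the image and count each dark pulse as one ball passing under the camera. This is just an illustrative assumption of how it might have worked, not the actual competition code; the function name and threshold are made up.

```python
def count_shadow_pulses(row_brightness_over_time, dark_threshold=50):
    """Count dark pulses in a sequence of average row-brightness samples.

    Each time the row goes dark and then bright again counts as one ball
    shadow passing under the camera. Threshold is an assumed example value.
    """
    count = 0
    in_shadow = False
    for brightness in row_brightness_over_time:
        if brightness < dark_threshold and not in_shadow:
            in_shadow = True      # a shadow just entered the scan row
            count += 1
        elif brightness >= dark_threshold and in_shadow:
            in_shadow = False     # shadow passed, row is bright again
    return count

# Example: bright, two separate shadow pulses, bright again
samples = [200, 190, 30, 25, 180, 200, 40, 35, 210]
print(count_shadow_pulses(samples))  # 2
```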

I’ll share my strategy in this case, because I think it will help the overall safety of the competition.
For the most part, aiming at the closest target is what you want, because it’s also the target that can most easily reach you. However, you also want to aim only at targets inside the arena. How can you tell where the edge of the arena is? The white flooring ends.
So, in your aiming VIs, you can disregard targets that are farther away than the edge of the floor. Obviously, people are a lot larger than the targets on top of the trailers, but you can tell where they are by where they touch the floor (or don’t, as the case may be).
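The filter described above might look something like this sketch, assuming image rows grow downward and some earlier vision step has already found the row where the white carpet ends. The target fields and helper name here are illustrative, not real NI Vision or WPILib calls.

```python
def targets_inside_arena(targets, floor_edge_row):
    """Keep only targets whose base touches the image below the floor edge.

    In image coordinates (rows grow downward), anything whose bottom edge
    sits above the floor-edge row never touches the carpet, so it is
    probably in the stands, not in the arena.
    """
    return [t for t in targets if t["bottom_row"] > floor_edge_row]

candidates = [
    {"name": "trailer", "bottom_row": 310},          # touches the carpet
    {"name": "pink shirt in stands", "bottom_row": 120},  # above floor edge
]
print(targets_inside_arena(candidates, floor_edge_row=200))
```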

You’re talking about producing a lot of vision-processing code if you’re trying to account for the floor as well… just know that the floor isn’t the only thing that appears white (people’s t-shirts will appear white too), so how does your camera know where the floor is? Even a toddler has difficulty with that.

To the camera (or the robot) it’s not so obvious… you might want to rethink your algorithm to account for the fact that, from a single focal point, perceived height depends on apparent size in the image rather than on actual distance or height. (Many people seated together in similarly colored t-shirts, far away from the camera, could look like a target.)
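A little pinhole-camera arithmetic shows the ambiguity: apparent pixel size only fixes the ratio of height to distance, so a bigger object farther away is indistinguishable by size alone. The focal length and heights below are made-up example numbers, not specs for the actual camera.

```python
def pixel_height(real_height_m, distance_m, focal_px=500.0):
    """Apparent height in pixels under a simple pinhole-camera model."""
    return focal_px * real_height_m / distance_m

# A ~0.84 m vision target 5 m away...
target = pixel_height(0.84, 5.0)
# ...vs. a 2 m tall block of pink shirts in the stands ~12 m away
crowd = pixel_height(2.0, 11.9)
print(round(target, 1), round(crowd, 1))  # both ~84 px tall
```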

I’m guessing that the floor will be out of the field of view for most people’s cameras (especially if you’re near the target, looking up), so this is probably not a good choice for a reference.

This thread was intended more as a joke than anything else (I guess also as a warning to teams that may try to track a single color). I can almost guarantee that we’ll have some rogue autonomous bots firing into the crowd at least once this competition. (and maybe at the judges, too, depending on what colors they’re wearing)

who knows, maybe a camera will even track the screen above the field (which will sometimes display a target pulled behind a robot).

luckily, the point of this year’s competition will not be shooting.

Judges wear fairly dark blue shirts, the refs wear striped shirts, and the field crew get white t-shirts. All tend to wear dark pants, so they should all be safe. MCs are a different story, but their fashions tend to run to bright yellow, electric blue, or vibrant red, all colors that should not be picked up by the camera. So while it is likely that balls will get shot at somebody, it probably won’t be due to the cameras locking on to the wrong thing. Other programming or aiming errors are much more likely.

We’re going to do a quick sanity check before locking on to anything, which is basically something like

Is there a green and a pink area?
Does the orientation match the opposing alliance?
Are both areas the same size?

This should be pretty resilient, unless there’s a team wearing all pink sitting directly above a team wearing all green :smiley:
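The three-question sanity check above might look something like this sketch. The blob fields, the "pink on top means opponent" convention, and the size tolerance are all illustrative assumptions; real code would get the color regions from whatever vision library the team is using.

```python
def plausible_target(green, pink, expect_pink_on_top, size_tolerance=0.3):
    """Return True only if the green/pink pair looks like a real target.

    green/pink are dicts like {"center_y": row, "area": px} (or None if
    that color wasn't found). Image rows grow downward.
    """
    # 1) Is there both a green and a pink area?
    if green is None or pink is None:
        return False
    # 2) Does the orientation match the opposing alliance?
    pink_on_top = pink["center_y"] < green["center_y"]
    if pink_on_top != expect_pink_on_top:
        return False
    # 3) Are both areas roughly the same size?
    ratio = green["area"] / pink["area"]
    return (1 - size_tolerance) <= ratio <= (1 + size_tolerance)

green = {"center_y": 100, "area": 400}
pink = {"center_y": 60, "area": 380}
print(plausible_target(green, pink, expect_pink_on_top=True))  # True
```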

A good safeguard would be to compute where you think the vision target is relative to your robot (since you know the height of the target, and can measure the pitch angle/y position in the frame of the camera). If your robot thinks that the vision target is more than 50 feet away, it’s probably a safe bet that you are looking at something that is not the vision target.
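That range check follows from simple trigonometry: knowing the camera’s mounting height and the target’s height, the measured pitch angle fixes the distance. The heights, angle, and cutoff below are example numbers under that assumption, not real robot measurements.

```python
import math

def estimated_distance_ft(camera_height_ft, target_height_ft, pitch_deg):
    """Distance to the target from its known height and the measured pitch.

    tan(pitch) = (target height - camera height) / distance, solved for
    distance. Assumes the camera is looking up at a taller target.
    """
    return (target_height_ft - camera_height_ft) / math.tan(math.radians(pitch_deg))

MAX_PLAUSIBLE_FT = 50.0  # beyond this, we're probably not seeing the target

d = estimated_distance_ft(camera_height_ft=3.0, target_height_ft=7.5,
                          pitch_deg=10.0)
print(round(d, 1), d <= MAX_PLAUSIBLE_FT)
```

A tiny pitch angle (a target that barely rises above the horizon) gives a huge distance estimate, which is exactly the "more than 50 feet away, probably not the real target" case.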

You call it an error. I call it a feature :smiley:

The cameras look for pink + green patterns. We were joking that team 233 and team Swamp-Thing on the same alliance would wreak havoc on the cameras’ tracking :p. But knowing these two powerhouses, the camera would be the least of your worries ;).