
View Full Version : Full Court Target Detection


Bpk9p4
14-01-2013, 23:25
Has anyone been able to detect the targets at full court? If so, how have you done it?

Greg McKaskle
15-01-2013, 06:48
To use the lighting to detect targets at full court, you'll need to do several things.

You need to make sure you run the camera at a high enough resolution that the 4" wide lines are several pixels wide. 320x240 should do, but it may require 640x480. Get images and measure.
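As a rough sanity check on "get images and measure," a pinhole-camera sketch can estimate how many pixels a 4" stripe covers at a given distance. The ~47 degree horizontal field of view below is an illustrative assumption, not a measured value for any particular camera:

```python
import math

def stripe_width_px(stripe_in, distance_ft, hfov_deg=47.0, image_width_px=320):
    """Approximate pixel width of a stripe at a given distance (pinhole model)."""
    # width of the camera's field of view, in inches, at the target distance
    fov_width_in = 2 * distance_ft * 12 * math.tan(math.radians(hfov_deg / 2))
    return stripe_in / fov_width_in * image_width_px

# at ~54 ft a 4" line spans only a couple of pixels at 320x240,
# and twice that at 640x480
print(round(stripe_width_px(4, 54), 2))                      # ~2.27
print(round(stripe_width_px(4, 54, image_width_px=640), 2))  # ~4.54
```

With these assumed numbers the stripe is only 2-3 pixels wide at 320x240, which is why 640x480 may be needed.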

You'll need to calibrate the camera's exposure to darken the scene more than usual. You may be able to do this just with the Brightness setting, but it may be useful to set the exposure to auto, expose the camera to a bright light, and then set it to hold. This will also help keep the colors from the LED ring saturated.

You'll need to use brighter LEDs or more of them. If this is the only place the targeting will be used, you can also use a flashlight or other lens system to narrow the beam and concentrate the light where you want it. The intensity of light drops off as distance squared, so the reflected light at full court will be 1/4 as bright as at mid-court.
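The inverse-square falloff is easy to quantify; a minimal sketch (the 27 ft and 54 ft distances are illustrative):

```python
def relative_brightness(d_ref, d):
    """Illumination relative to a reference distance, inverse-square falloff."""
    return (d_ref / d) ** 2

# doubling the distance (e.g. mid-court at 27 ft to full court at 54 ft)
# cuts the light reaching the target to one quarter
print(relative_brightness(27, 54))  # 0.25
```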

Finally, you may find that the initial settings for particle size are set too low for full court detection. The 0.5% threshold is used to ignore particles that are smaller than a certain size. As the robot backs up, particles shrink, and at some point you need to lower the threshold to keep recognizing the targets. 0.5% works out to about 400 pixels on a 320x240 image, so watch the particle area of the targets to see what range they are in.
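The area cutoff behind that percentage threshold can be computed directly:

```python
def min_particle_area(image_w, image_h, threshold_pct=0.5):
    """Smallest particle area (in pixels) kept by an area-percentage filter."""
    return image_w * image_h * threshold_pct / 100.0

print(min_particle_area(320, 240))  # 384.0 -- the "about 400 pixels" figure
print(min_particle_area(640, 480))  # 1536.0
```

If the measured particle areas of distant targets fall below this value, lower the percentage.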

Greg McKaskle

Bpk9p4
15-01-2013, 07:06
Thank you for your help. We ran tests last night and it seems like we need a much brighter light. Do you have any suggestions on lights? Also, have you tried colored LEDs vs. white light?

Greg McKaskle
15-01-2013, 07:16
I purchased an LED flashlight from Home Depot that has a red, blue, and white setting. It is certainly bright enough at full court because of its lens. It doesn't work well up close because the cone of light from the flashlight is only a few feet across and won't illuminate the same area the camera sees.

I've seen teams nest the LED rings one inside the other which will increase the brightness.

I've also used bicycle safety lights to see the targets full field. Again, they aren't dispersed enough to work well up close.

You most likely need to decide between the up-close and far-away lighting or combine several on the robot.

Greg McKaskle

dvanvoorst
15-01-2013, 07:30
If you are not testing with an LED ring around your camera lens, make sure the light source you are using is right next to the camera. The reflected light is extremely directional and even a few inches away from the camera can make a difference. We initially tested with a separate light about 10 inches to the side, then moved it right next to the camera and the difference was dramatic. Full field target recognition is crystal clear.

Bpk9p4
15-01-2013, 07:37
We are using a light ring but I do not think it is bright enough. Do you do your vision processing on the cRIO?

dvanvoorst
15-01-2013, 07:46
Yes, we are doing our vision processing on the cRIO. Are you using the NI Vision Assistant to test with? If not, you definitely want to. :)
If you stand directly behind your camera, while your LED ring is on, you should see the reflective tape light up quite a bit with your own eyes. If not, then try it with a flashlight for comparison. Maybe you need brighter LEDs.

Greg McKaskle
15-01-2013, 07:50
Good point about mounting near the camera. It is actually the angle between the source and sensor that matters. So for very distant targets, you can actually increase the distance between camera and light. The data sheet for the material will give a plot of reflected light intensity versus angle if you want to calculate how close they need to be.
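The angle Greg describes can be sketched directly: what matters is the angle between light source and camera as seen from the target, which shrinks as the target gets farther away. The offsets and distances below are illustrative:

```python
import math

def observation_angle_deg(offset_in, distance_ft):
    """Angle between light source and camera, as seen from the target."""
    return math.degrees(math.atan2(offset_in, distance_ft * 12))

# a 10" offset subtends ~4.8 degrees at 10 ft but under 1 degree at 54 ft,
# so a separated light hurts much less on distant targets
print(round(observation_angle_deg(10, 10), 2))  # 4.76
print(round(observation_angle_deg(10, 54), 2))  # 0.88
```

Compare the result against the material datasheet's reflected-intensity-versus-angle plot to see how much light you lose at a given offset.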

The location of the processing doesn't impact the ability to detect, but it may impact how long it takes to process the image.

Greg McKaskle

dvanvoorst
15-01-2013, 07:56
Interesting point Greg. Our initial "poor quality" image this year was with the target just 10 feet away, but the light source probably around 10" away from the camera. Based on what you said, we probably would have had a perfectly acceptable image if we were 30 feet away instead. In any event, an LED light ring seems like the best way to go.

Bpk9p4
15-01-2013, 08:13
Thank you for all your help. Do you know if there is a best color light? Also, last year we ran into a lot of problems with interference from other light sources. Any suggestions on how to prevent this?

Greg McKaskle
15-01-2013, 08:29
I think the best light is the one that is the most unique, bright, and that the sensor is sensitive to.

Camera sensors tend to be more sensitive to reds and greens and less sensitive to blues. This is done to mirror how human eyes perceive color. So with LEDs of the same intensity, blue may not be reported as high as the others.

LEDs are based on photon emission from various circuits. When I was a kid, all LEDs were red, no other colors available. Today, there are all colors, but the cost and brightness per cost still differ.

All together, I'd say that the color probably doesn't matter much.

If you use HSL or HSV color thresholding and a relatively narrow hue filter, you should not have that many issues with other colors. But that is why the examples also do shape and size filtering. To interfere, something has to be the right color and the right blob size and aspect ratio, and it has to be pretty square and hollow.
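That chain of filters can be sketched as a predicate over per-particle measurements. The hue window and cutoffs below are illustrative placeholders, not the examples' actual values:

```python
def looks_like_target(hue, area, width, height, fill_ratio,
                      hue_range=(55, 90), min_area=384,
                      aspect_range=(1.0, 3.0), max_fill=0.5):
    """Color, size, and shape filters chained; all must pass to interfere."""
    if not (hue_range[0] <= hue <= hue_range[1]):
        return False                      # wrong color
    if area < min_area:
        return False                      # blob too small
    aspect = width / height
    if not (aspect_range[0] <= aspect <= aspect_range[1]):
        return False                      # wrong aspect ratio
    return fill_ratio <= max_fill         # must be hollow, not a solid blob

print(looks_like_target(70, 900, 60, 30, 0.35))  # True  -- hollow green rectangle
print(looks_like_target(70, 900, 60, 30, 0.90))  # False -- solid blob
```

A stray light has to pass every stage to be mistaken for a target, which is why a narrow hue filter alone is rarely the whole story.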

Greg McKaskle

dvanvoorst
15-01-2013, 09:21
I think calibrating your camera, as Greg mentioned earlier, is very important too. Otherwise the colors can get washed out when you get in a brightly lit event.
We are experimenting with infrared lighting and an infrared filter on the camera. This is looking pretty promising so far as it excludes everything but infrared.

Bpk9p4
15-01-2013, 10:33
We were talking about using that yesterday. Has it shown improved results?

dvanvoorst
15-01-2013, 10:40
I have to admit, the results have been very good so far. I've attached an image shot from probably 40 feet away in a fluorescent-lit room.

Bpk9p4
15-01-2013, 10:43
That is very impressive. If you do not mind me asking, what camera and lights are you using? Can you use the same calibrations for the full range?

dvanvoorst
15-01-2013, 10:48
I didn't order the equipment, so I don't have the specifics handy. But, in general, it's just a regular infrared light designed to be used with a security camera. I think it cost around $30 or so. It's not a ring, so it is mounted next to the camera. Then we bought an infrared filter that normally would connect to a camera lens. Currently that's just taped over the Axis camera lens. So, nothing too special.

Bpk9p4
15-01-2013, 11:05
Thanks for the help. I think we will give this a try.

Bryscus
15-01-2013, 11:35
We found last year that color tracking worked wonderfully when:

1. We overexposed the image and then froze the setting

2. We also froze the white balance

The first one helps to keep the camera from going into saturation because of the brightness of the reflected light. This would cause the camera to perceive a color shift.

The second is absolutely essential when it comes to changing lighting conditions. By keeping the white balance fixed, we didn't once have to calibrate the camera at an event or even at practice. We played indoors in fluorescent lights, near sodium lights and even outdoors in daylight in a covered setting and the image processing picked out the targets every time.

We ended up not using tracking in the end only because our driver was amazing (when using our patented photon cannon) and we didn't have a turret system. Creating a controller to center the whole robot proved too daunting a task. A linear system would have been much easier to control.

- Bryce

P.S. That being said, last year we used the "Tools -> Create C Code..." option in Vision Assistant after following the Vision White Paper guidelines. We then stripped out all the extra generic code that the generator created. We ended up with nice streamlined C functions that work just like Vision Assistant but without all the gross overhead the code generator created.

Bpk9p4
15-01-2013, 11:43
We will have to try the white balance. Is there any rule about the lights you can put on your robot?

F22Rapture
15-01-2013, 11:54
We will have to try the white balance. Is there any rule about the lights you can put on your robot?

Only that they can't be a laser above Class 1, or so bright that they blind the other team and impede their gameplay.

cjlane1138
15-01-2013, 11:59
In response to good LED lighting, FIRST recommends using this LED ring that goes around the camera lens. I prefer using a bright color like green because it gives a clearer reflection.
LED Angel Eye Headlight Accent Lights (http://www.superbrightleds.com/cat/led-headlight-accent-lights/)

cafrava2016
16-01-2013, 21:51
Hello, we are having some difficulties with how to wire the LED ring for the camera into the cRIO, and how to mount it on the camera itself; any help would be greatly appreciated.

Alan Anderson
16-01-2013, 22:06
Hello, we are having some difficulties with how to wire the LED ring for the camera into the cRIO,

Why do you want to wire it to the cRIO?

You can power it directly from the battery (via a 20A circuit on the Power Distribution Board) if you want it to be on all the time. You can wire it to a Spike relay if you want to be able to turn it on and off under program control.

and how to mount it on the camera itself; any help would be greatly appreciated.

We used double-sided foam tape.

Arhowk
16-01-2013, 22:15
I have to admit, the results have been very good so far. I've attached an image shot from probably 40 feet away in a fluorescent-lit room.

That is very impressive. If you do not mind me asking, what camera and lights are you using? Can you use the same calibrations for the full range?

And to add onto that, what are your current BinaryImage filters and resolutions?

Bpk9p4
17-01-2013, 06:19
We are trying IR but are having some problems. How did you set your lights up?

protoserge
17-01-2013, 08:29
Look into using the Kinect for next year if you haven't already been exploring it. The sensor capability is pretty impressive. We have been testing the Kinect with a nano-ITX form factor computer.

In outdoor full-sun tests, we were able to get approximately 60' recognition using a flashlight beamed on the reflective tape. I do not recollect if this was using the IR or electro-optical sensor. We were using a circular polarizer to cut down reflected light as well.

Our mentor lead is out this week, but I can get more information when he returns.

Last year, we used a purple (red + blue) LED light source to make a custom color for our 3D vision system. In LogoMotion, we learned that green was not a good color choice because the field scoring projection screen had green squares behind the pegs/goal height. This green would actually cause the single-camera vision system to go off target.

Additionally, one of the major hurdles with vision processing is available field bandwidth. You may need to operate at very low framerates and resolution, so keep that in mind when designing your system. This is one of the issues that has led us to pursue on-robot vision targeting using the Kinect.

Bpk9p4
17-01-2013, 08:57
Thanks for your help. We have a system that seems to work well at short distance but are having lots of trouble at long distance. The only way we can get good reflectivity at long distance is with a bright LED flashlight. The only problem with this is we are not sure having such a bright light is legal. Does anyone know if a very bright flashlight is legal?

protoserge
17-01-2013, 09:19
Don't disrupt gameplay or make an unsafe condition.


R08 ROBOT parts shall not be made from hazardous materials, be unsafe, cause an unsafe condition, or interfere with the operation of other ROBOTS.


Examples of items that will violate R08 include (but are not limited to):

A. Shields, curtains, or any other devices or materials designed or used to obstruct or limit the vision of any drivers and/or coaches and/or interfere with their ability to safely control their ROBOT

B. Speakers, sirens, air horns, or other audio devices that generate sound at a level sufficient to be a distraction

C. Any devices or decorations specifically intended to jam or interfere with the remote sensing capabilities of another ROBOT, including vision systems, acoustic range finders, sonars, infrared proximity detectors, etc. (e.g. including imagery on your ROBOT that, to a reasonably astute observer, mimics the VISION TARGET)

D. Exposed lasers other than Class I.



I haven't had any experience testing color photography filters on the optics of the Axis to limit the wavelength of light they can pick up. I'm also not sure if the Axis has a built-in IR filter like most CCD/CMOS cameras.

I would suggest you look at making a custom LED emitter as well.

Bpk9p4
17-01-2013, 10:11
We have made our own LED emitter; the only problem is it is very bright and we are not sure if it is legal.

Daniel_LaFleur
17-01-2013, 10:29
We have made our own LED emitter; the only problem is it is very bright and we are not sure if it is legal.

Look at it from a distance of a few feet; if it bothers your vision, you are probably in violation of R08. A lot of light is not required, just enough to overcome a set threshold.

dvanvoorst
17-01-2013, 12:39
So far in testing our infrared, we only need to run it through an HSI Threshold filter and we get a nearly perfect result. If our target is near the side of the image, and we're fairly close, then we get a poorer image that results in some small particles - so we may add a filter for that, or just ignore those in code.
Our infrared light is currently mounted immediately next to the camera. We are still testing things though....

Bpk9p4
17-01-2013, 13:15
What type of camera are you using? Did you remove the IR filter from the camera if it has one?

dvanvoorst
17-01-2013, 14:28
We are using the Axis M1011 camera. For testing we just taped an infrared filter in front of the lens. The filter is one intended for use on a regular 35mm camera lens. We haven't done any special camera settings (yet).

Greg McKaskle
17-01-2013, 21:26
The Axis cameras do have IR filters, but at least for the 206 it was on the lens, not the sensor. I suspect that the M1011 is the same, but can't be certain.

If you choose to use IR, you may want to consider setting the camera to monochrome and if it returns an RGB image, simply take one of the planes such as red. This should reduce time needed to process.
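Taking a single color plane from an RGB frame, as suggested, is a cheap operation; a minimal pure-Python sketch of the idea (a real implementation would use the vision library's plane-extraction call on the camera image):

```python
def red_plane(rgb_rows):
    """Extract just the red channel from rows of (r, g, b) pixel tuples."""
    return [[px[0] for px in row] for row in rgb_rows]

frame = [[(200, 40, 10), (190, 35, 12)],
         [(10, 10, 10), (255, 60, 20)]]
# one 8-bit plane to threshold instead of three
print(red_plane(frame))  # [[200, 190], [10, 255]]
```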

Last year I took some movies of the field using a Kinect and there seemed to be lots of IR coming from the lights above and to each side of the field. This light would bounce off tape stripes, plastic, and of course other robots. I'm not saying whether IR is or isn't a good choice.

One of the issues is that you can't see IR. Just because you don't see it doesn't mean it isn't there.

Greg McKaskle

dvanvoorst
17-01-2013, 22:04
Hi Greg,
What do you mean by the Axis cameras already having an IR filter? Is that a physical filter? A software filter? I wasn't aware of any particular IR capabilities - but I have minimal experience so far.
We just put an IR filter in front of the camera - and that worked well, but if the camera has a way to do that itself, that would be convenient.
Dale

Alan Anderson
17-01-2013, 22:14
What do you mean by the Axis cameras already having an IR filter? Is that a physical filter?

It has a physical filter to block IR light.

Greg McKaskle
18-01-2013, 06:54
As Alan said, the IR filter is a physical coating or lens element on the interior portion of the 206 lens.

I learned this when I replaced the lens to change focal length. My color saturation was horrible after that. The new lens was for a monochrome sensor and had no filter.

The camera sensor, a CMOS sensor in this case, is sensitive to the visible spectrum and down into IR. If you do not block IR, the IR light will wash out the image and you will lose color saturation. So camera manufacturers add this filter to capture the portion of light you typically care about in photography. Alternately, you could block visible light and allow IR through and make it an IR camera. If you have both filters in place, you've made a camera that senses very little.

Greg McKaskle

Bpk9p4
18-01-2013, 08:55
You are saying the IR filter is in the removable lens for the Axis 206? Where did you get a replacement lens for the Axis 206 that did not have an IR filter?

Greg McKaskle
18-01-2013, 09:05
From here

http://www.edmundoptics.com/imaging/imaging-lenses/micro-video-lenses/infinite-conjugate-micro-video-imaging-lenses/2196?ref=related-products

The factory lens in the Axis 206 camera is a 4mm.

Greg McKaskle

Bpk9p4
18-01-2013, 09:10
Thank you for posting this. Would you recommend using this lens on an Axis 206?

Greg McKaskle
18-01-2013, 09:16
I'm not sure I understand the question. They are good quality lenses and I have used them on the 206. If you remove the lens you will find that the threads are pretty loose. I applied a bit of teflon tape to keep the lens from twisting due to vibration on the robot.

I'm not convinced that IR is a better solution than visible light, and I think it will be harder to debug if/when it doesn't work. If you try out IR, do it methodically, and perhaps have a backup plan -- a visible-light LED ring and lens to swap back to.

Greg McKaskle

Bpk9p4
18-01-2013, 09:25
I agree that IR might not be the best way to go. Our only problem with visible light is that we currently cannot get full court detection without a very strong light and running the risk of violating the rules.

Greg McKaskle
18-01-2013, 09:47
If you get the LEDs close to the lens I think you'll find that they won't be nearly as bright as the folks with the photon cannons. Please post images and I'll do what I can to help analyze them.

Greg McKaskle

Bpk9p4
18-01-2013, 09:53
Thanks a lot for your help. I will try to get some screenshots this weekend.

Bryscus
18-01-2013, 16:00
If you get the LEDs close to the lens I think you'll find that they won't be nearly as bright as the folks with the photon cannons. Please post images and I'll do what I can to help analyze them.

Greg McKaskle

The difference is that for camera tracking you typically want a somewhat wider-spread light, which is more likely to bother people (unless of course you're shooting from far enough away to allow you to limit the spread). Using a highly directional photon cannon pointed up shouldn't bother many opposing drivers. It's just the people up in the nosebleed seats that will get blasted.

- Bryce

Twistedtruth360
18-01-2013, 16:42
Do I have to use a network camera with the LED lighting, or can we use something else?

Greg McKaskle
18-01-2013, 21:24
The Axis camera is well integrated into WPILib, but nothing requires that you use any of it. Perhaps the question to ask yourself is: do you have a better idea? If so, investigate it, here or elsewhere.

Greg McKaskle

Greg McKaskle
18-01-2013, 21:40
The difference is that for camera tracking you typically want a somewhat wider-spread light, which is more likely to bother people (unless of course you're shooting from far enough away to allow you to limit the spread). Using a highly directional photon cannon pointed up shouldn't bother many opposing drivers. It's just the people up in the nosebleed seats that will get blasted.

The way I look at it, the spread of the light determines how many people it impacts. The brightness of the light determines how much it affects the people with the light in their eyes. The teams I'm familiar with that did this last year shined it on the floor, wall, ceiling, and everything in between. I'm not saying it was a bad strategy, but the lights were far, far brighter than the LEDs.

Here's another way of looking at this. The retroreflective material is graded by how much more light it reflects than white paint. I believe the material used is at least 300x brighter. That means it only takes 1/300th the intensity of light when used properly.

Greg McKaskle

Bpk9p4
19-01-2013, 17:37
Just a warning for other teams. The axis 206 camera has an IR filter. We attempted to remove the filter today and found it not to be possible.

twiggzee
07-02-2013, 09:14
I've seen teams nest the LED rings one inside the other which will increase the brightness.

I don't know if other teams are experiencing this as well, but it seems to me that the retro-reflective tape in this year's game doesn't reflect as brightly as last year's with the same green LED ring light. Has anybody tried adding another LED ring as Greg suggests here? Did it improve brightness / vision results?

faust1706
13-03-2013, 04:14
1706's program for computer vision is able to function at well over 80 feet. Without the illuminators, just using the built-in IR light on the Kinect, it reaches a little under 60. Last year we got to over 100 feet.

Johnny_5
13-03-2013, 06:55
1706's program for computer vision is able to function at well over 80 feet. Without the illuminators, just using the built-in IR light on the Kinect, it reaches a little under 60. Last year we got to over 100 feet.

How are you doing this? What are the lighting conditions in your testing environment?

Bryscus
13-03-2013, 08:01
SPAM is able to track the retro reflective goals using just two ring lights - we originally tried just the smaller one and it wasn't quite bright enough. We overexpose the image to darken it a bit so that in bright conditions the green light doesn't saturate. If this were to happen, a color shift would occur and make thresholding with HSL not work. We shine a bright light into the camera and hold the settings for both white balance and exposure. We are able to detect full court without any problems. Last year we did the same thing and we could work outside, in fluorescent lights, tungsten, at events, etc. and we never retuned. Hope this helps!

- Bryce

faust1706
13-03-2013, 13:49
We do it in every condition; it doesn't matter. To test, we tried every hallway at the high school and the gyms, and put up stage lights; it still works, unless the stage lights are pointed at the Kinect, and then it blinds us. That is why the program doesn't work well outside when sunny. I used OpenCV's libraries with a Microsoft Kinect. This year, my mentor suggested saturating the targets with more IR light to help; we tested it and it does make a huge difference. I hated using Axis cameras. They were cheap and the fps was garbage. The Kinect gets 30 fps if it only captures an RGB image; with my program, it runs a little under 27. I would invest in different hardware. 1706 will be attending the St. Louis and Terre Haute regionals. If you are there, I'd love to explain it for you. I wrote a paper about last year's vision program for the state symposium, and am going to ISWEEEP in Houston for a week to compete at internationals. I have a part of it posted as a doc on here. Feel free to take a look. (Can't give away all my secrets, sorry; the teacher that sponsors robotics wants me to patent it.)

apalrd
14-03-2013, 11:32
I don't know what Axis camera you are using, but I had no issues running at 30fps. We chose to run at 20 last year to avoid any buffered TCP issues, and it seemed to work fine.

The Axis camera is also very light and very easy to use.

faust1706
14-03-2013, 20:13
Please refer to 1706 at the St. Louis Regional. Solutions accurate to a quarter of an inch.

Greg McKaskle
15-03-2013, 07:30
One man's garbage is another man's treasure ...

But seriously, both the Kinect and Axis are quality products intended for rather different purposes. It sounds like you've converted your Kinect into a rather big IR camera and splotchy IR emitter.

I think using it on the robot is an awesome project, but there is no need to trumpet so loudly, especially if you aren't willing to teach others how it is done.

Greg McKaskle

Coach Norm
15-03-2013, 10:44
One man's garbage is another man's treasure ...

I think using it on the robot is an awesome project, but there is no need to trumpet so loudly, especially if you aren't willing to teach others how it is done.

Greg McKaskle

Well said Greg.

faust1706
15-03-2013, 21:51
I've posted a paper on here describing my methods from last year's program. I help teams in the pit all the time, answer questions they have, and give advice.

Greg McKaskle
16-03-2013, 12:23
I downloaded and reviewed the paper, and I applaud you for publishing it. I do have a number of questions it didn't answer.

I didn't see a description of which image feature was used to estimate the distance. The accuracy of the distance measurement will depend largely on the resolution of the camera and the pixel size of that feature. One quarter inch accuracy seems to be a tall order even at twenty feet. Interpolation of other image features can be used once you can make assumptions about a feature, but that wasn't mentioned.
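The resolution limit can be made concrete with a pinhole range estimate from a single feature's pixel size. The 47 degree field of view and the 62" feature width below are illustrative assumptions, not values from the paper:

```python
import math

def estimate_distance_ft(feature_in, feature_px, image_w_px=320, hfov_deg=47.0):
    """Pinhole-model range from the pixel size of one known-width feature."""
    focal_px = (image_w_px / 2) / math.tan(math.radians(hfov_deg / 2))
    return feature_in * focal_px / feature_px / 12

print(round(estimate_distance_ft(62, 100), 1))  # ~19.0 ft
# a +/-1 px measurement error shifts the estimate by several inches,
# which is why quarter-inch accuracy is a tall order from pixel size alone
print(round((estimate_distance_ft(62, 99) - estimate_distance_ft(62, 101)) * 12, 1))
```

Under these assumptions a single pixel of corner-localization noise moves the range estimate by roughly 4-5 inches at 19 feet, so sub-inch accuracy would need sub-pixel feature localization or interpolation across multiple features.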

Also, couldn't the approaches in the paper be applied to other cameras? It doesn't seem like IR is a requirement for the approach either?

Good job taking on the challenge and good luck in the competition.

Greg McKaskle

fb39ca4
16-03-2013, 16:00
All together, I'd say that the color probably doesn't matter much.


I'd say to go with green, because the goals are either surrounded with red or blue.

faust1706
18-03-2013, 00:31
I admit, I was scared publishing it and making all the other teams better, but my mentor and I agreed to make my paper public, not the code. I don't want people just using the code without understanding it. The final copy is on here now; I finally found that thumbdrive. I used object pose to track distance. I defined the field target in 3D coordinates, with the center of the top hoop as the origin and the outer 4 corners of the 4 targets as the points of interest. I called those my object points. It was a 16 by 3 array of x, y, z, where z is zero because the targets are on the same plane in the z dimension (not protruding into the field).

For my image points, I defined a 16 by 2 array of the same 4 outer corners, where x and y are the pixel coordinates, with the origin at the center of the screen.

Now, there exists a rotation and translation matrix between my image and object points, and linear algebra is used to find it. This website, though I didn't use it and only found it later, provides a good explanation of the math: http://www.fastgraph.com/makegames/3drotation/. That's the part I am not allowed to share; I do apologize.

And you are right about the color of the light. It doesn't matter as long as you have a method to threshold your target of interest. I use IR because I asked teams in the past how green or white LEDs work, and they say they are good but have trouble in different environments. The only things my camera doesn't like are being outside and stage lights.

And yes, other cameras can be used. We used the Microsoft Kinect because it was already available thanks to FIRST, and we had a decent amount of extra weight last year. It caught little kids' attention and got them interested when they saw us do presentations around town. To prove it works with other cameras, I began this year with a webcam: I removed the IR filter (which was a pain), got a filter that blocks everything except IR, and used an IR illuminator to see. The frame rate was poor, but I am willing to bet money that's because the webcam cost maybe 5 dollars.

Last year, I used a function I had adjusted in the OpenCV libraries that approximates a polygon around a contour. If you look at the final image, the squares drawn around the inside and outside of the target aren't perfect. That is really bad, because I used those corners as my image points! Now, I use the actual corners of the target.

faust1706
18-03-2013, 00:45
This paper isn't the best. It's come to the point that the more I read it, the more I dislike it, but that stands for all my English papers as well. And it has some typos... Anyway, like I've said, the finished paper is on here now; please take a look at it. It may answer future questions, and if it doesn't, don't hesitate to ask me. I love talking about this stuff.