Turrets and cameras
For those who have actually competed with a turret aimed by a camera, what has your experience been?
I'm hearing in the forums that the lighting is bad and the cameras aren't effective. Is this true?
Re: Turrets and cameras
Well, the code provided by FIRST detected the target fairly well in the pits at Buckeye. I'm not at all sure how it worked on the actual field, with incandescent bulbs instead of fluorescent/LED lights, and I didn't calibrate it at all. My drive team tells me it worked fairly well at a distance, but not very well close up. (That's due to MY programming, since the turret won't turn fast enough to keep the target in the field of view up close.)
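The "turret can't keep up close in" problem above is the classic trade-off between tracking gain and slew-rate limit: close targets move across the image faster than the turret can follow. A minimal proportional-control sketch of one tracking step (this is not the poster's code; the gain, field of view, and slew limit are made-up illustrative numbers):

```python
def turret_step(target_x, image_width=320, kp=0.5, max_step_deg=4.0):
    """One control step for a camera-aimed turret (illustrative sketch).

    target_x: x pixel of the target centroid in the camera frame.
    Returns the pan correction in degrees, clamped to the turret's
    maximum per-frame slew. All constants here are assumptions, not
    real Axis 206 or WPILib values.
    """
    # Normalized error: -1.0 (far left) .. +1.0 (far right).
    error = (target_x - image_width / 2) / (image_width / 2)
    # Proportional correction, assuming ~47 deg horizontal field of
    # view (an assumption -- check your camera's actual FOV).
    correction = kp * error * 47.0 / 2
    # Clamp to what the turret can physically slew between frames.
    # Too low a clamp is exactly the "can't keep up close in" symptom:
    # nearby targets need larger per-frame corrections than far ones.
    return max(-max_step_deg, min(max_step_deg, correction))
```

A centered target returns 0; a target at the image edge saturates at the slew limit, which is why a close, fast-crossing target escapes the field of view.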
Re: Turrets and cameras
From what I hear, the stadium lighting is totally messing up cameras, though I've only heard that from a couple of sources. Just be warned that you will very likely have to re-configure your camera settings for the on-field lighting conditions.
Re: Turrets and cameras
Are they allowing teams to go on the field during breaks (primarily lunch) to calibrate their camera under field conditions? I remember doing that at a number of regionals in 2006 and 2007; it was very helpful.
Re: Turrets and cameras
At the Midwest Regional, after the practice matches on Thursday during lunch we were allowed to go onto the field to check the camera software.
What we found was that the lighting really messed with the area percent on the targets, so we eventually gave up on using it. We found that turning down the brightness helps a bit.
Re: Turrets and cameras
Perhaps this isn't a good idea, but instead of using the provided area function, I suggest doing your own area calculation from the rectangle it finds. Also, how would one go about calibrating the camera? Do you just run the white balance VI that came with the example and make sure the set-white-balance function is nowhere in the normal main (so as not to overwrite the calibration)?
What were you using for brightness (and white balance)?
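The "do your own area calculation from the rectangle" idea can be sketched as a fill-ratio check: compare the reported particle area against the bounding rectangle's area. The field names are illustrative, not the actual NI particle-report keys:

```python
def rect_area_ratio(rect_width, rect_height, particle_area):
    """Compare the bounding-rectangle area to the reported particle area.

    A cleanly detected rectangular target should fill most of its
    bounding box, so the ratio sits near 1.0. Glare that cuts off one
    side of the target (as described above) shrinks the particle area
    relative to the box, driving the ratio down -- which makes this a
    usable sanity check on a detection. Illustrative sketch only.
    """
    box_area = rect_width * rect_height
    if box_area == 0:
        return 0.0
    return particle_area / box_area
```

You could then reject detections whose ratio falls below some tuned cutoff instead of trusting the raw area percent.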
Re: Turrets and cameras
In the example (two-color tracking) there was a brightness selection, set to 40 by default, which we turned down to 0.
I don't think we had time to check out white balance (we programmers were getting really hungry by that time). How exactly does the rectangle work? When we were using area, we noticed that it pretty much cut off one side of the target.
Re: Turrets and cameras
We took advantage of the calibration time during lunch on Thursday, and found that the images we were getting were basically just glowing orbs of (barely) green and pink. We couldn't find any brightness, WB, or exposure settings that made it work. Setting brightness low (like 10... sounds like 0 would have worked too?) helped a bit, but we could never find green. The camera worked GREAT in the pits and on the practice field.
We did find a target once or twice in autonomous with the 'fluorescent 1' and 'auto' settings for WB/exposure, but it was way too unreliable to be used during teleop. I'd really appreciate hearing from anyone who got the camera to work under those super-bright lights on the field. I know FIRST tested the camera "under FRC competition conditions," but I wonder if they tested with the white FRP floor? I'm thinking it may be reflecting a whole lot more than the classic carpet we're used to... and they probably selected the camera long before Lunacy was designed...!
Re: Turrets and cameras
Maybe we'll bring sunglasses to put on Toasty and see if it works?
(Toasty the Camera....)
Re: Turrets and cameras
It seemed like Simbotics had some sort of sunglasses-like feature on their camera; maybe someone could check on that.
Re: Turrets and cameras
Simbotics is using a fisheye lens.
Re: Turrets and cameras
We didn't use a camera this year, so I have no first-hand experience with its success or failure, but I do know that someone published camera threshold values in the pits on Friday morning; they were displayed next to the inspection station. Also, at about 7pm Thursday, once all of the practice matches were over, teams were let onto the field for about 10 minutes each to test camera settings. This is just what they did at Jersey; I don't know what will happen at your regional(s).
My suggestion to any team that needs to tune cameras is to ask the FTA or someone else in charge of the field whether there's any time you could go out on the field to gather values. I've found that if you ask politely, they'll usually try their best to help you out.
Re: Turrets and cameras
We didn't do the camera calibration on Thursday, and regretted it. Our radar screen would show ghosted images of targets, but never a full solution on a target. A ghosted image meant that a pink/green combination had appeared at that spot sometime in the last second, but the data wasn't continuous. In other words, the targets would pop in and out while on the field, making them difficult to track. It still proved useful when we were close, though.
We'll have to get it calibrated and practice with it in Florida.
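Targets "popping in and out" is the situation Greg's post further down warns about: allow for the occasional dropped frame rather than reacting to every miss. A hedged sketch of a hold-last-target filter (not any team's actual code; all names are illustrative):

```python
class TargetHold:
    """Hold the last good target bearing for a few frames.

    Instead of slewing away the instant a frame finds no target,
    keep aiming at the last confirmed bearing for up to max_miss
    consecutive missed frames, then give up. This smooths over the
    glare/occlusion glitches described in this thread.
    """

    def __init__(self, max_miss=5):
        self.max_miss = max_miss
        self.misses = 0
        self.bearing = None

    def update(self, bearing):
        """bearing is the detected target angle, or None on a miss."""
        if bearing is not None:
            self.bearing = bearing
            self.misses = 0
        elif self.bearing is not None:
            self.misses += 1
            if self.misses > self.max_miss:
                self.bearing = None  # target really is gone
        return self.bearing
```

With `max_miss` tuned to your frame rate, a half-second flicker no longer resets the turret, but a truly lost target does release the lock.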
Re: Turrets and cameras
Did anyone find a setting that worked? Thanks, everyone.
Re: Turrets and cameras
At the DC Regional, there was an opportunity during lunch break on Thursday to calibrate cameras on the field. Greg McKaskle from NI (many of you might recognize his name from Chief Delphi) was there to help, but I only saw two teams (including us) actually out on the field. The first thing Greg said to do was to log in to the camera (192.168.0.90) and lower the brightness level to 0. After that, we tweaked our HSL values a little bit, showed them to Greg, and he approved.
During the matches, our camera seemed to work fine. The only problem was that we had written our code to rely almost completely on the camera, and we had only tested it on stationary targets. When our robot and the target robots were moving, our shooter would miss the target. The camera by itself, though, worked better for us than most people here seem to describe.
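Missing moving targets after testing only on stationary ones is a lead problem: the ball takes time to fly, so you must aim where the trailer will be, not where it is. A minimal first-order sketch (not this team's code; units are arbitrary but must be consistent):

```python
def lead_aim(target_x, target_vx, ball_speed, distance):
    """First-order lead on a moving trailer (illustrative sketch).

    Aim ahead of the target by its lateral velocity times the ball's
    time of flight. This ignores drag, arc, and the shooting robot's
    own motion; a real lead estimator would also filter the velocity
    estimate, since per-frame camera deltas are noisy.
    """
    time_of_flight = distance / ball_speed
    return target_x + target_vx * time_of_flight
```

Even this crude correction changes the aim point noticeably for a trailer crossing at a couple of feet per second.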
Re: Turrets and cameras
Thanks for the info, we'll try to get onto the field to do that as soon as we can at our first regional.
We did some playing around with moving targets before ship, so we know it's a challenge... and this is with the target trailer moving pretty slowly. http://www.youtube.com/watch?v=cT_zdHa2BsQ
Re: Turrets and cameras
The values that were given to us by the folks from NI were fairly accurate. We created our own lighting settings in WindRiver specifically for the arena, and our camera was very effective at tracking.
We were given calibration time on Thursday night, and took advantage of it. I was glad we did.
Re: Turrets and cameras
The Java radar screen on our laptop lets the second driver follow targets around with the turret, with a cursor representing the turret bearing and shooter speed. There isn't much automation in it, because building automated tracking algorithms was beyond our capacity and timetable, so it is up to the second driver to lead the shots on moving targets. He was getting decent practice with it Saturday, but we didn't have enough time to tweak it.
Unfortunately, the judges didn't believe the screen was good enough for an award because of its lack of automation. 90% of their questions were geared toward the automation and accuracy of the target tracking system. Forget the fact that we did some pretty technical processing and re-wrapping of data between the camera, cRIO, DS, and laptop, or that we use a closed-loop turret tracking algorithm, or that we used a standardized DoD system-of-systems approach to the whole thing. We'll have to make an animation and presentation for all of that one of these days...
Re: Turrets and cameras
We had problems throughout the season with the green HSL values, where the green would flash in and out, but I'm now thinking that may have more to do with whether the algorithm properly located the pink square first. (E.g., does it look for green only after it finds pink?)
Re: Turrets and cameras
He also mentioned that toning down the brightness might help if your camera is angled high enough, because it will see the arena lighting, and that will definitely change the way the camera sees the colors. NJ's field lighting was also adjusted to keep the brightness even all over the field, to the best of their ability.
Re: Turrets and cameras
Maybe the robot needs "headlights" to illuminate a target. The stadium lighting shouldn't mess up the vision if your lights are brighter than theirs. Just keep in mind that they'll be shining into people's eyes, so don't make them too bright, and turn them off when they aren't needed.
I'm not surprised the stadium lighting is messing things up. We tried using the CMUcam with the passive vision targets in 2005 (the tetrahedron game), but it required a huge amount of constant lighting just to work in the shop, and we gave up on it before trying it on the field. I'm curious why, after a couple of years of good success with active (light-emitting) vision targets, the GDC would choose to go back to passive targets.
Jason
Re: Turrets and cameras
Probably concerns about getting power on the trailer, and the difficulty of 360-degree illumination, kept active lighting out of this year's vision targets. I'm glad they put the green/pink combo on all trailers to ensure fair color tracking between alliance colors. Imagine if the red trailers had only pink targets and the blue trailers only green!
Re: Turrets and cameras
It is unfortunate that people work so hard on the cameras only to find out that the lighting is an issue.
Re: Turrets and cameras
Does anyone have any sort of anecdotal evidence (or otherwise) to show whether or not a shroud that blocks light from above would be helpful? I wonder if the effects are similar to when you shield your eyes from the sun to look at something in the distance.
If not, then could someone try it at a Week 2 regional?
Re: Turrets and cameras
Would it be against the rules to mount a large colored light (not pink or green) to manually aim a turret? This would avoid camera problems and also help the driver aim the turret: you could see whether you are illuminating a trailer or not.
Re: Turrets and cameras
Just make sure it won't blind the drivers as your robot points at them.
Re: Turrets and cameras
I suggest you read the rules carefully, then figure out exactly what you want to do, then ask on the Q&A, being very specific and providing lots of information.
Re: Turrets and cameras
You could legally use the Robot Signal Light, as long as you satisfied all its rules. Otherwise I would say no. Generally, lights fall into the non-functional decorations category and are thus allowed to draw power from the robot, but this is a functional light, so it doesn't fit (read R19 and R49). Also, if it is a very bright light, it might violate R02-A by shining in people's eyes (lasers are specifically banned in R02-D).
Re: Turrets and cameras
Currently our plan is to use the calibration time on the field to set both the white balance and the brightness level. Will we need to set the brightness level in the camera itself to 0 as well? If we skip the camera-side setting and only use the VI, will there be any ill effects?
Re: Turrets and cameras
When you set the brightness level to 0 by logging into the camera, does it stay like that even if you are using LabView/C/C++ code after restarting the camera and not doing anything with the brightness in your code? Is there any way to control the brightness in C/C++ code?
Re: Turrets and cameras
1712 used the given calibration time during lunch on Thursday to work with the NI representative at DC to calibrate our camera. I don't know the specifics of the settings (I was working on other items at the time), but we managed to get our tracking code working successfully.
We would sometimes lose targets at longer range, but that may have been due to their movement and the presence of multiple targets in the camera's vision (when we lost a target, we would almost always lock onto another almost immediately). We did notice some difference between when the camera was aimed at the black curtain background and when it was aimed anywhere else. It wasn't a massive difference, but it may be enough at some venues to impact your results.
Re: Turrets and cameras
Yep, one of our software engineers, Paul Gehman, worked with Greg from NI at lunch to get the 1712 settings right. I'll see if I can get Paul to post our values here, and maybe some screenshots from the programming laptop, but I'd suggest on-field calibration on Thursday if/when it's allowed. Bob Bellini, our other engineer, helped refine our target code on Saturday, but since the lighting on the field was so bright and different, we really couldn't test much in the pits, and we never went to the practice field because of the difference. We just tried to learn through Thursday, and things worked out well for our autonomous routine using only the camera and target size, with no other sensors.
Re: Turrets and cameras
Attached are the camera values we used, from calibrating with Greg from NI at lunch on Thursday, to score in autonomous in DC. I'll see if I can get Paul Gehman to comment here, since he is the one who worked with Greg.
It's also important to note that slightly different values might be optimal for the red and blue alliances: we have always had a harder time identifying green than pink, and with the "other" color on top and our camera position unchanged, the same values may not work as well in both cases. Nonetheless, the lunchtime calibration on Thursday is what allowed us to consistently track and score on targets in auto.
Re: Turrets and cameras
Green: H min 55, max 125; S min 35, max 255; L min 92, max 255
Pink: H min 220, max 225; S min 30, max 255; L min 80, max 255
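As a sanity check, those posted thresholds can be applied directly to a pixel. This sketch assumes the NI Vision convention where H, S, and L all run 0-255 (consistent with the hue values above); it is an illustration, not the team's actual code:

```python
# Thresholds as posted above (assumed NI Vision HSL, channels 0-255).
GREEN = {"H": (55, 125), "S": (35, 255), "L": (92, 255)}
PINK = {"H": (220, 225), "S": (30, 255), "L": (80, 255)}

def in_range(pixel, thresh):
    """True if an (H, S, L) pixel falls inside all three ranges.

    This is the per-pixel test behind a color mask: pixels passing
    the green test and (nearby) pixels passing the pink test together
    indicate a target.
    """
    h, s, l = pixel
    h_lo, h_hi = thresh["H"]
    s_lo, s_hi = thresh["S"]
    l_lo, l_hi = thresh["L"]
    return h_lo <= h <= h_hi and s_lo <= s <= s_hi and l_lo <= l <= l_hi
```

Note how high the green L minimum (92) is: under the bright field lights, only well-lit green counts, which matches the reports above of green being harder to find than pink.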
Re: Turrets and cameras
As many have stated, the lighting at the events is indeed different from what is in your classroom or shop. I'll give a general description of how we arrived at values, attempt to answer some of the camera-related questions, and give a few hints on how to tune your camera. I won't be giving absolute values, as I fully expect that different events will vary in lighting brightness, arrangement, color, and backdrop. Because of this, knowing how to find the values is far more important than knowing the values for a single field. I highly encourage you to take advantage of the Thursday field time to tune your camera; if you like, you can then share the values with other teams. Be prepared for a BIG post.
First off, the brightness and other camera settings are persisted on the camera. The settings are modified via the HTTP CGI protocol and can be set interactively using the camera web pages, the C-based camera demo application, or the LV wrapper VIs. The Find Two Color LV code included a brightness parameter on the panel specifically for tuning. Other camera settings, such as exposure, are set and behave similarly.
Lighting on the practice field seems to work reasonably well with default settings. It is perhaps a bit brighter than the average classroom, but the tests I ran in DC worked. The lighting on the competition field is quite different. There are two trusses of lights running along the long sides of the field, approximately 12 ft outside the edges and suspended ~30 ft in the air. Those trusses are packed with lights, adjusted for even coverage and few shadows. With the default settings, the test went reasonably well on three sides of the field, but glare was present and caused some white streaks on the targets.
The fourth side of the field, the one with the black curtain, will unfortunately not work with the default settings. The reason is that the camera samples the pixels in a frame to decide how to adjust future exposures to capture enough light on the sensor. This typically works quite well, but since the camera has no idea what portion of the scene you are most interested in, it uses a statistical sampling, and when the background of the image is not returning any light, the camera lengthens the exposure, producing an image with lots of glare. The same effect may be seen if the house lights in the stands are set very low: if the overall light from the background is too low, the brightly lit targets will be blown out. The solution is to shorten the exposure.
With the Axis 206 camera, there are two ways to control exposure: leave auto exposure on and decrease brightness, or find a way to calibrate the auto exposure and set it to hold. Personally, I think it is more repeatable and more robust to use auto exposure. Of the fields seen thus far, most teams decided to set the brightness to 0; on one field, 10 was determined to be the better value. Brightness works similarly to the exposure compensation setting on most consumer cameras: it modifies the determined exposure to let a human adjust for backlighting or dark backgrounds. For glare, lower it. For backlit targets, raise it.
The other difficult aspect of the competition field is the light position. In some locations, especially as the camera gets near the target, it will be aimed upwards, and in some spots on the field it will then be staring into the lights. This further complicates matters by partially blinding the camera. Finally, in the worst locations on the field, the angles between a low-mounted camera, a target, and the light will be symmetric. When the angle from the target's plane to the light and the angle from that plane to the camera sensor are roughly equal, the glare from the lights goes way up, practically turning the target white. This doesn't occur in many places on the field, but it would be wise to allow for the occasional dropped frame in your tracking code, your lead estimator, etc.
Because the camera will sometimes be staring at the lights, you will find that the color saturation of the target drops compared to the shop. You may therefore find it useful to lower the minimum saturation values you are thresholding against. How much? As you can see, different teams determined different values. Taken too low, gray or pastel colors will start to be included in the mask. Not low enough, and some areas of the field will flicker or stripe.
To determine the value to use, it is good to interactively move the camera and target around the field while viewing an HSL display. To do this, the LV example was modified to display more information. This will be made available for FTAs and teams to use if they wish; an announcement and instructions will be released soon.
A sampling of test images and the threshold results can be found at http://flickr.com/photos/35780791@N07/
From the stands, I saw some cameras work very well; many others were not even turned on. As the weeks go on, I expect more successes. Feel free to post questions or results.
Greg McKaskle
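Greg's point that settings go over HTTP CGI and persist on the camera means you can script them instead of clicking through the web pages. A hedged sketch that just builds the request URL (the `param.cgi` path and `ImageSource.I0.Sensor.Brightness` parameter name are assumptions based on Axis's CGI conventions, not verified against the Axis 206's documentation; the WPILib/LabVIEW wrappers send equivalent requests for you):

```python
def brightness_url(camera_ip="192.168.0.90", brightness=0):
    """Build the HTTP CGI request that would set camera brightness.

    The IP 192.168.0.90 is the default camera address mentioned
    earlier in this thread. The parameter path below is an ASSUMED
    Axis-style param.cgi path -- check your camera's own CGI docs
    before relying on it. Because settings persist on the camera,
    a value set this way survives robot code restarts.
    """
    return ("http://%s/axis-cgi/admin/param.cgi"
            "?action=update&ImageSource.I0.Sensor.Brightness=%d"
            % (camera_ip, brightness))
```

Fetching such a URL (with the camera's admin credentials) from a laptop during calibration time would answer the earlier question about controlling brightness outside LabVIEW.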
Re: Turrets and cameras
Greg, awesome post, thanks! Only question: what was the 'correction' for the images @ flickr? Just a drop in brightness (probably to 10 or 0) and a drop in saturation threshold? Or something else?
We pretty much had images similar to your "Default West" (lots of glow / washed out) for the area we were trying to calibrate in, and couldn't find a setting to get an image as reasonable as your "Corrected West 3/4/5" (which seem excellent!). We're totally willing and prepared to run an on-field calibration ourselves, but it would help to know which values you changed to get the effects in the images.
Re: Turrets and cameras
The corrected images were taken after dropping brightness and lowering saturation.
Greg McKaskle
Re: Turrets and cameras
We have absolutely horrible lighting in our workshop - fluorescent tubes every three or four feet. Our camera always points upward and is mounted in a fixed orientation.
The best procedure we have come up with for camera tuning in LabVIEW goes like this:
1. Enable HSL debugging in the Find Two Colors VI.
2. Enable HSL coloring in the Vision VI.
3. Hover your mouse over different parts of the green target. Try to hover over a dark patch of green and a bright patch of green. Watch your initial HSL values in the HSL debugging window, and note the maximum and minimum for each value.
4. Return to the Vision window and input them on the front panel of the VI.
5. Turn off the "find pink first" checkbox and look at the mask. More than likely you'll see a bunch of noise and your two colors. One of the colors (usually the green) will be flickering.
6. Make sure you're happy with the framerate before going any further. We're happy if we're seeing 15-20 FPS. We go into the Vision VI and set our framerate maximum to below the lowest number we see (with our tracking code, we very much want consistent values).
7. Return to the Vision VI. Methodically adjust your green values up and down by 5 at a time. Return periodically to the HSL debugging window to verify that none of your values there have changed.
8. Once your green locks in, go back to pink and adjust those values a bit. At this point you should have a reasonably clean mask, with little noise and two solid shapes.
9. If you are still having problems, or you aren't getting solid shapes, repeat the process. If you still don't have success, we've several times resorted to modifying the area and particle settings to make them somewhat more forgiving.
10. Now turn the robot so the lighting changes, and repeat. If you're getting a lot of washout, or the color values you're getting between light and dark patches on the target are significantly different, you may need to raise or lower brightness and repeat.
It usually takes us 10 minutes or so to get nice solid values that work over most of the area (we've done this in three or four significantly different lighting conditions just to practice: bright sunlight, fluorescent tubes, and incandescent bulbs). This has worked pretty well for us in several very poor environments. I hope it helps you guys.
Team 1718
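The sampling steps of the procedure above (note min/max while hovering over dark and bright patches, then nudge by 5 at a time) can be sketched as turning a handful of hovered HSL readings into thresholds with a safety margin. This is an illustration of the idea, not Team 1718's code; all names are made up:

```python
def thresholds_from_samples(samples, margin=5):
    """Turn hovered HSL samples into (min, max) thresholds per channel.

    samples: list of (H, S, L) tuples read off the HSL debugging
    display while hovering over dark and bright patches of the target.
    The margin mirrors the "adjust by 5 at a time" step; clamping
    keeps everything inside the 0-255 channel range.
    """
    mins = [max(0, min(s[i] for s in samples) - margin) for i in range(3)]
    maxs = [min(255, max(s[i] for s in samples) + margin) for i in range(3)]
    return list(zip(mins, maxs))
```

Feeding in a dark-patch and a bright-patch sample spans the target's real variation, and the margin keeps the mask from flickering when lighting shifts slightly.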
Re: Turrets and cameras
Greg, this is great information. We'll definitely spend a lot of time in Florida figuring this stuff out, as our shooter is the last thing we have to fine tune.
Our setup: we mounted our camera as high as possible on our bot; the lens is roughly 55" off the floor. We do not tilt the gimbal; instead it pans to three set locations: left, center, and right. We'll add a glare shield to the upper portion of the protective lexan cover we made, and bring back some settings & results from Florida next week. Too bad we can't get the camera feed back to the laptop during the matches -- I'd love to FRAPS it so other teams can see what we tried that may or may not work, so they can further perfect the whole system.
Re: Turrets and cameras
In all of the other years, though, I don't know what benefit there would have been to doing this.
Re: Turrets and cameras
I have been able to get the camera working pretty well in two locations by adjusting the brightness.
At our playing field there are long fluorescent lights attached directly to the white ceiling. As you can imagine, this lighting is very bad: it causes whatever is in the foreground to bleed into the ceiling wherever a light is. However, I was able to get it working by setting the camera's brightness to 25. It was here that we shot our promotional video.
At our shop there isn't as much light, so a brightness of 30 works well. Toward the end of build season I would work all day in the shop, and around 3:00 PM the sunlight would shine through a window and blind the camera. A piece of cardboard over the window was enough to get it working again, but once the sun was low enough, the light was too low and I would have to take the cardboard off to make the camera work again! In other words, variable light is bad for the camera.
Good luck to all teams with their camera calibration!
Re: Turrets and cameras
Greg/NI was available during lunch on Thursday at the DC Regional to assist teams who wanted to calibrate their cameras and adjust parameters in their software. We jumped at the opportunity to field-test our camera settings and vision VIs (LabVIEW). For Team 1712, that meant varying key settings in our implementation of the "two-color camera servo" example code. Read Greg's reply carefully and take a good look at his posted images and masks.
The field lighting at DC was much brighter than the lighting back at Lower Merion High School, in the DC Regional pit areas, or on the practice field. For an extreme example of what the bright lights could do to a target as viewed by the camera and software, look at Greg's image labeled "Default West" - read: front glare. On a related point, the DC field's lighting consisted of two high-mounted banks of can lights aligned with the long sides of the arena, aka the Crater. This could create bright sides on the pink/green target - read: side glare - leaving a "shadow" down the center. For a mask somewhat similar to what we initially experienced in DC, albeit a more extreme example, look at Greg's image labeled "Default SW Corner mask."
Prior to lunch on Thursday, Team 1712's first autonomous run showed that the camera/two-color tracking software locked onto nothing, even when targets were directly in front of it. On analysis, green was never recognized; in essence, our hue settings were too high. With Greg's general guidance, and sample pictures similar to what was posted above, Team 1712's coding crew talked over our proposed adjustments, played with lowering the brightness value, lowered the green hue's upper and lower range values, adjusted the lower red saturation value, and adjusted the servo speed/ranges. All this activity took one busy lunch hour.
Part of Team 1712 then spent the rest of Thursday watching the Dawgma Team's robot "Alice" begin to home in over the next several matches, as we tested and refined our approach. Tom Line, also in this thread, lists the key tuning steps that Team 1712 likewise used to refine Alice's vision. I can definitely agree with Greg and Rich on a couple of interesting issues. When Alice's camera stared at the black background behind the big-screen TV, we believe the camera adjusted enough to void out one Thursday match. And what eventually worked for Alice in autonomous mode on Friday and Saturday might not apply elsewhere. Best advice: grab hold of any available time to test on the real field. Using Greg's/NI's snapshots of targets and masks for ideas, and understanding how the brighter lights, the darker backgrounds, and the software settings were interacting and affecting Alice in DC, we eventually developed better and better masks for Alice to use in tracking and scoring in autonomous mode.
Re: Turrets and cameras
I saw a handful of video posts on the general forum about autonomous scoring. The attached video will give you an idea of the glare and how much the camera is shifting the exposure to keep colors from over saturating. It also shows why it is good to not overreact when a frame doesn't contain the target. Glare, a hit, all sorts of things can cause small glitches.
http://www.youtube.com/watch?v=xJnq-...eature=channel
Greg McKaskle
Re: Turrets and cameras
Then, if a team in the Einstein division is using a camera, they will need to recalibrate yet again: Einstein has different track lighting than all the other fields. Hopefully the NI people and the other teams in ATL will figure something out.
Re: Turrets and cameras
What we did was we got on the field, took some screen shots and determined an acceptable range of the HSL values using Paint.
But our operator says he barely uses the camera tracking, so I can't really say how well it works. Good luck!
Copyright © Chief Delphi