They scrapped the camera :-(
For all the hype they put on the camera the past two years, I would have sworn they would have done more with it this year somehow. I was really looking forward to working with it again :( (though I would have liked to see something other than the green light again)
|
Re: They scrapped the camera :-(
I completely agree with you. I'm not too happy about them not having the CMUcam again.
|
Re: They scrapped the camera :-(
From my past experience with the CMUcam on my team, I found it to be something that got put on the set-aside list and done as an afterthought, more or less. I will say it's a letdown to have worked with it for three years now and not have it around anymore.
|
Re: They scrapped the camera :-D
Sorry to spoil the party, but I am glad the camera is gone (at least for a while.) We have never been able to get the camera to work consistently. We have a small team and have never had the time or resources to debug it. We had autonomous software that would have worked if the camera was not checking out the overhead lights or other points of interest, but we rarely got a consistent lock on anything. I believe that has been the experience of the majority of teams.
<RANT :mad: > From what I have seen, the camera has been a failure. It is balky and unreliable. In 2006 and 2007 very few teams at the Boilermaker were able to make it work consistently. (Even with the 'best' teams at IRI in 2007, autonomous ringers were few and far between.) Also, it has not made a big difference in the game. In 2006, most teams used dead reckoning, ignoring the camera. In 2007, very few teams even tried to score, let alone made a ringer, and the advantage for doing so was minimal. </RANT> I am excited about the new IR challenge. I hope that it will be more reliable than the camera. |
Re: They scrapped the camera :-D
idk, for all the teams who say it was tracking all the overhead lights and stuff: you just have to spend a couple hours and play around with saturation, brightness, and other values in LabVIEW. When we got ours calibrated, it was the key to our success in autonomous mode in '06.
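To illustrate why tuning those values matters: the CMUcam-style trackers classify a pixel as "tracked" when each color channel falls inside a min/max window. This is a hedged sketch, not real calibration data; every number below is made up for illustration, and the window values would have to be tuned against your own camera.

```python
def in_window(pixel, lo, hi):
    """True if every channel of an (r, g, b) pixel lies inside [lo, hi]."""
    return all(l <= p <= h for p, l, h in zip(pixel, lo, hi))

# A loose window: tracks the green target, but a white overhead light
# (bright in every channel) also slips through.
loose_lo, loose_hi = (0, 120, 0), (255, 255, 255)

# A tightened window: capping red and blue rejects bright white pixels
# while still admitting the saturated green of the target light.
tight_lo, tight_hi = (0, 150, 0), (100, 255, 100)

green_light = (40, 220, 60)       # hypothetical target pixel
overhead_light = (250, 250, 245)  # hypothetical stadium light pixel

print(in_window(green_light, loose_lo, loose_hi))     # True
print(in_window(overhead_light, loose_lo, loose_hi))  # True - false positive
print(in_window(green_light, tight_lo, tight_hi))     # True
print(in_window(overhead_light, tight_lo, tight_hi))  # False - rejected
```

The point of the sketch is just that a tighter window trades sensitivity for rejection of ambient light, which is exactly the trade-off those hours in LabVIEW are spent finding.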
|
Re: They scrapped the camera :-(
In spite of my programmers, I am a little sorry it isn't in this year. But of course it took us all of 2006 and 2007 to understand it. And yes, the field lights are not forgiving when you're trying to find that ol' green light. I think, in terms of programming, the hardware has to catch up with the software on this part.
|
Re: They scrapped the camera :-D
Quote:
I have to disagree. I think that the reason the CMUcam2 has been removed is because it was mastered. |
Re: They scrapped the camera :-(
We had the camera working pretty well - thanks to the programming team.
And I would have liked to see something like it again, since we always had trouble with the other sensors used for navigation. I had the idea to use the camera with the preprogrammed tracking code as a carrier for the IR sensor: give the RoboCoach a green light at his location, the camera locks on to the light no matter where the robot goes, and the IR communication can be used. What are your thoughts, especially about a big green light for the RoboCoach? And would the 2007 camera be allowed in 2008? Or any other ideas to use the IR sensor while the robot is driving and being pushed around? The CatAttack 451 - Jens |
Re: They scrapped the camera :-(
I don't know about you guys, but I'm happy I'll never have to mess with the CMUcam again. It was really unreliable and flaky. It was the cause of many late nights fighting to make it follow a green light. Eventually I got a routine down that found the light and hung a tube most of the time, but once we got onto the competition field, the robot had a hard time finding the green light amongst all the ambient lighting in the station.
Good-bye CMUCam (for this year), you won't be missed. :) |
Re: They scrapped the camera :-D
We had enough programming problems that for the most part we didn't even bother with the camera. I'm glad we can finally move onto a new challenge.
|
Re: They scrapped the camera :-(
I don't think they've scrapped it at all. The CMUcam is still a totally viable system; we just have to buy it ourselves now and include it in expenses. I think it could be used for things other than tracking green lights. Maybe with the correct filtering and whatnot you could track an alliance ball. Also, one of the two balls has white spots on it; maybe the camera could lock onto those.
Of course, this is pure conjecture and has not yet been tested, nor have we had much success with the camera in the past. I think FIRST is trying to get more teams involved in making their vehicles more robotic, and so they met us halfway with this hybrid mode. Anyways, good luck guys. |
Re: They scrapped the camera :-(
Quote:
2005 was tough, though we did manage to find the Vision Tetra on the field; lighting was the main problem. In 2006, we nailed the auto mode every time, and if someone hit us, we still tried to correct and get a second aim. We also used the camera in tele-mode to "auto aim": the driver pulled the trigger and let the robot take aim rather than the human taking the aim. It worked pretty well for us. The only trouble we noticed was the backup battery. We changed the backup battery after every match in 2005; then we got the rechargeable circuit for the battery in 2006 and that helped out A LOT...the camera was rock solid both years for us. In 2007 we were a lifter-only robot, so no camera for us; we had no grippers to handle rings. |
Re: They scrapped the camera :-(
Quote:
I worried that this might be considered sending more than 4 discrete pieces of information from the RoboCoach to the Robot, but frankly, I think the idea is so cool that it would be worth getting DSQd just to see it work! I promise that if you call me as a witness at your trial, I will testify that wearing hats with big green lights is all the rage among teenagers these days, and certainly can't be construed as trying to influence a robot.

I am not an authority on it, but it seems to me the Robot rules preclude use of the CMU camera as shipped in previous KOPs because it is a custom-made part for the FIRST competition. For example, here is a piece of <R36>: "The item must not be a part custom made for the FIRST competition and provided in the Kit Of Parts for a previous FIRST Robotics Competition (e.g. 2006 FRC transmissions, custom-made motor couplers, custom sensor strips, 2006 IFI CMUcam II modules, etc. are not permitted)"

And I agree with both sides of the comments here: it was really hard to get the CMUcam to stop looking at the Mountain Dew signs and stage lights, but now that we've figured that out, we wish it was integral to the competition again this year. On the other side, now that we've figured it out, it is time to remove it and put some other stress on our brains (but I really hope it comes back again in the future).

I think the right answer is to have something you could do if you can get the camera working, but also some useful things you can do even if you can't. For this year's game, I think a good twist would have been to allow the camera but not have the green light. That would present a whole new set of calibration issues (similar to the Tetra challenge) if you want to use it to figure out which ball to knock off, or whatever. I don't see any preclusion against using cameras this year; it seems you just aren't allowed to use the one all the veteran teams already have.
I can understand that as a way to prevent Rookie disadvantage, but hey, we Vet teams are broke too. :-( [EDIT: I read on some other threads that they think just the 2006 camera is explicitly disallowed, and the 2007 camera is OK because you can buy it COTS: http://www.ifirobotics.com/camera.shtml - - again: only imho] |
Re: They scrapped the camera :-(
I am quite disappointed as well about the disappearance of the camera this year. There was quite a bit of a learning curve, but teams that have the resources to do an autonomous mode should have been able to pick it up by at least the third year. And last year, letting us go on the field and calibrate the camera did make tracking a lot easier during competition.
|
Re: They scrapped the camera :-D
Quote:
In the "real world," vision systems are 90% about lighting. I use Cognex vision systems in an industrial setting. Excellent vision system, by the way; it's based on an Excel platform...awesome stuff...anyway... http://www.cognex.com

For 1501 in 2005, the problem I saw was that the tetra plaque was very hard to see, even from a vision standpoint. It was not illuminated; however, we still managed to find the vision tetra at Boilermaker in 2005. Video 2005: http://www.youtube.com/watch?v=i1t_WIyBBDE

1501 in 2006 had an AWESOME time with the camera. I am sure the GDC understood the "lighting" problems in 2005 and learned, so they gave us an illuminated target. That was the best thing they did to help the camera. When we fired up the CMU the first time with the green light, it was a night-and-day difference from 2005. The ability to see the target is, again, 90% of your vision application; with the lighting problem solved, we were able to use the CMUcam very effectively.

Here is a video from 2006. In this video you'll notice the opposing team hits us from the side (it was a redo match, 4 times), but they hit us and knocked us off course, and we stopped shooting. The camera found the light again, corrected the robot, locked on, and started shooting again. A lot of teams thought we did this with "dead reckoning". This was not the case for 1501. Video 2006: http://www.youtube.com/watch?v=o2SgDRhX9Kg

In a second video (same match; we had 4 REDO matches in Match 10), you'll see that the opposing team did a good job stopping us, and the camera knew it. It stopped shooting and held on to the balls until tele-mode. Video 2006 #2: http://www.youtube.com/watch?v=0W90F4V7EXQ

We also had, in tele-operated mode, the ability to pull the trigger and have the camera take over the drive system and align the shooter for the drive team automatically.
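A minimal sketch of how that "pull the trigger and let the camera aim" idea can work: steer the drivetrain proportionally to the error between the tracked blob's centroid and the image center. The resolution, gain, confidence threshold, and PWM conventions below are my assumptions for illustration, not 1501's actual code (their real source is linked below).

```python
IMAGE_WIDTH = 176          # CMUcam2 frame width (assumed default resolution)
CENTER_X = IMAGE_WIDTH // 2
KP = 0.9                   # proportional gain - tune on the real robot
PWM_NEUTRAL = 127          # typical FRC-era PWM neutral value

def auto_aim(centroid_x, confidence, min_confidence=30):
    """Return (left_pwm, right_pwm) that turns the robot toward the target.

    If the camera's confidence is too low, hold still rather than chase noise
    (this mirrors the behavior of stopping the shooter when blocked).
    """
    if confidence < min_confidence:
        return PWM_NEUTRAL, PWM_NEUTRAL
    error = centroid_x - CENTER_X              # >0 means target is to the right
    turn = max(-60, min(60, int(KP * error)))  # clamp the correction
    return PWM_NEUTRAL + turn, PWM_NEUTRAL - turn

print(auto_aim(88, 200))   # centered and confident -> (127, 127), hold course
print(auto_aim(120, 200))  # target to the right -> turn toward it
print(auto_aim(120, 5))    # low confidence -> (127, 127), don't chase noise
```

Run in a loop against the camera's streamed centroid, a controller like this keeps correcting after a hit, which is what made the "knocked off course, re-locked, kept shooting" behavior in the 2006 video possible.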
This was very effective because when you approached the goal from different angles, it was very hard for the drive team to figure out whether you were aligned with the hole or not, so pulling the trigger in tele-mode used the vision to align to the light. Video 2006, a great match showing auto and auto-aim "tm": http://www.youtube.com/watch?v=lYUOC3icGPk Here is the source code for that robot if you would like to see the software used: http://www.frcsoft.com/forums/index.php?download=11

1501 in 2007 did not use the camera because we did not have an arm. We were a lift bot. Video 2007: http://www.youtube.com/watch?v=cJuxH_nHp8w

1501 in 2008 is once again looking at the camera. Of course, we are back to the lighting PROBLEM from 2005 above. There is no illuminated target to make it easy to see. It's not the camera's fault; it's the application's fault. A challenge? Yes indeed, but more realistic in today's world. If I had a green light to look at in the real world, it would be too easy.

The point I'd like to make is that it's not the camera's fault. The camera works, and works well, when you can control certain elements. Sorry that it didn't work out for you, but feel free to contact me about the camera and I will try to help you out the best way I can. You can only eat an elephant one bite at a time. Look at the CMU cam in this way: it's a lot to chew on.

Here is part of an e-mail I sent to another team that gives the low-down on the camera. I also reference tracking the BLUE BALL in this year's game. (I reference EasyC, because that is what we use to teach new programmers.)

MY QUOTE: The Java Tool is the easiest tool to get set up to pick a color and get the "color value" you want to track. You take that color value and you put it in the Camera Table in EasyC. When you drag and drop the camera initialization function, you'll see a "table" that you need to configure. In there you input the color settings etc.
By default, EasyC has table #1 configured to track the GREEN LIGHT, so you don't have to change any of the numbers in the table to track the green light. However, since I assume you want to track a BLUE BALL, you're going to need to test the camera under the Java tool, pick the color from the screen grab, and use that to change the default table. EasyC supports 10 tables. I "assume" you'll be teaching TWO TABLES, one for the blue ball and one for the red ball. Then, when you know what color you want to track, you have to dump the camera configuration and load the correct settings "table" for whichever ball you want to track.

The Java tool was PRE-EASYC, released in 2005, so all the settings to populate the table ARE NOT SUPPORTED in the JAVA TOOL.... Here is a download to get the Java tool: http://www.cs.cmu.edu/~cmucam2/CMUcam2GUI.zip I dunno if the above is NEWER or the SAME as what you've got now..... The Java Tool GUI manual can be found here: http://www.cs.cmu.edu/~cmucam2/downloads.html PS: Or you can download all the camera documents and the Java tool here: http://www.frcsoft.com/forums/index.php?download=17

So if you want the best compatibility when teaching the camera the color tracking, you need to install National Instruments LabVIEW, then download this application: CMUCam Labview Application http://www.frcsoft.com/forums/index.php?download=32 Danny Diaz is a good guy to get LabVIEW help from (installing, running) if you need it: http://www.chiefdelphi.com/forums/member.php?u=11247

If you get the application loaded and running in LabVIEW, then you can track the color in LabVIEW (you're doing the same thing as in Java) and it actually spits out a CFG file. That is the same file you need for EasyC. You can have LabVIEW dump the CFG file, then go to EasyC and load the CFG file. Pretty nice when you get it working....

Step #1 - teach the camera the color values
(either the Java or LabVIEW application)
Step #2 - input the color values and settings into the EasyC table, using manual entry or by loading the CFG file from LabVIEW
Step #3 - load and initialize the camera in your EasyC software code
Step #4 - the camera should start tracking and streaming centroid data to you via TTL RS-232 (the red LED on the camera indicates that it sees the tracking configuration you loaded)
Step #5 - map over the variables from camera tracking and write a program, or download Adam's example here: http://www.chiefdelphi.com/forums/sh...highlight=code

My word of advice: spend a lot of time in LabVIEW with the camera, changing the gains and settings. Make people walk in front of the camera; throw different colored balls in front of it. Use photo filters or gel filters. Don't go to the robot software until you have mastered the object you are tracking. If you can't see the object "consistently" in LabVIEW, then you might as well stop, because your vision application is only as good as it gets right at the camera level. Don't cheap out on your camera settings and send garbage data to your robot controller. A camera is like your vision: you can't see in the dark, and the camera can't either. Give your camera 20/20 vision. Hope that helps. |
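The steps above end with the camera streaming centroid data over TTL RS-232. For reference, a CMUcam2 tracking packet ("T packet") is an ASCII line of the form `T mx my x1 y1 x2 y2 pixels confidence`. Here is a hedged parser sketch; the field order follows my reading of the CMUcam2 documentation, so verify it against your own firmware and manual before trusting it on a robot.

```python
def parse_t_packet(line):
    """Parse one CMUcam2-style T packet into a dict; None on anything else."""
    parts = line.strip().split()
    if len(parts) != 9 or parts[0] != "T":
        return None  # noise, partial packet, or a different packet type
    mx, my, x1, y1, x2, y2, pixels, confidence = map(int, parts[1:])
    return {
        "centroid": (mx, my),                  # middle of the tracked blob
        "bounding_box": (x1, y1, x2, y2),      # corners of the blob
        "pixels": pixels,                      # tracked pixels in the box
        "confidence": confidence,              # how solid the lock is
    }

pkt = parse_t_packet("T 80 60 70 50 90 70 120 200")
print(pkt["centroid"])            # (80, 60)
print(parse_t_packet("garbage"))  # None - rejected cleanly
```

Dropping malformed lines instead of crashing matters here: serial noise and partial packets are common, and a parser that returns None on junk lets the tracking loop simply skip that frame.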