Chief Delphi

Chief Delphi (http://www.chiefdelphi.com/forums/index.php)
-   Programming (http://www.chiefdelphi.com/forums/forumdisplay.php?f=51)
-   -   Camera Issues -- So close yet so far. (http://www.chiefdelphi.com/forums/showthread.php?t=37588)

team66t-money 24-04-2005 23:03

Camera Issues -- So close yet so far.
 
Well, my hat is off to our world champions. Congrats. But this year, in our journey, I found problems. As some of you know, our team used vision for auto, one of the few teams who actually got it working. But for some reason FIRST did not ban green from the game or anywhere around it, and I think that because they did not, they pretty much ripped away our chance of making it farther in the finals in our division and maybe further in the championship.

In the first match of our division finals, the announcer for Curie had green hair and pretty much an all-green shirt. Our camera was programmed to find all shades of the vision green, because the lights going on and off change the color. My team asked him politely to move until autonomous mode was over; he did move, but not far enough. The camera picked him up; the robot drove forward, turned toward him, and pretty much tried attacking him. We had so much power behind our robot that it slammed into the side, our manipulator finally flipped up over the side, and the steel cable for our arm snapped. Then the ref wanted to give us a penalty for slamming other robots, when they come after us first.

I know there is nothing that can be done now, but next year, if FIRST uses the camera again in the future, they need to ban that certain color from around the field, because it could really hurt another team or it might just get someone hurt. So please go to FIRST and let them know something must be done so this doesn't happen again. A lot was learned by using vision, but it came to be more of a disadvantage than anything. Besides the national award that we received, I think FIRST also needs to reward the more advanced robots with more points for all of the hard work; I don't think what is available is enough. Once again, I am sorry for not using a lot of punctuation; for some reason I don't like using it while I am typing. But thank you, great season, and see you guys next year.

Eugenia Gabrielov 24-04-2005 23:08

Re: So close yet so far.
 
I won't comment on your punctuation to any great extent, though the post was hard to follow.

I find it interesting that you mention the green. There has been quite a lot of discussion about it in the past, and it seems that people came to a general consensus that green was OK. I wish you weren't forced to experience this, but please try to look at it this way: now that you understand your robot's ability to put a good deal of power into its actions, and that you programmed it accurately, in the future you may need to deal with adjusting this kind of programming on the fly.

If that announcer hadn't been standing there, something else green would have been. I wish you luck in the future, and I hope you won't look at this as a reason to stop short of a certain level. There are many more factors in winning a competition than one match with an issue.

Sincerely,
Genia

PURPLE! 24-04-2005 23:30

Re: So close yet so far.
 
Our program was so close... then the lighting on the field came up. Wow... talk about inconsistent. By the time the robot found the green, it would go up and then cast a shadow on it, because the lighting changed the color. :rolleyes: Okay, after a few rounds of losing the chance to cap, we got the numbers, and then we found out that shadowed yellow also picks up the tan in the volunteers' shirts... by the time we figured all of this out, we had no real time to actually DO the capping...

Quatitos 24-04-2005 23:35

Re: So close yet so far.
 
Quote:

Originally Posted by team66t-money
In the first match of our division finals, the announcer for Curie had green hair and pretty much an all-green shirt. Our camera was programmed to find all shades of the vision green, because the lights going on and off change the color. My team asked him politely to move until autonomous mode was over; he did move, but not far enough. The camera picked him up; the robot drove forward, turned toward him, and pretty much tried attacking him. We had so much power behind our robot that it slammed into the side, our manipulator finally flipped up over the side, and the steel cable for our arm snapped. Then the ref wanted to give us a penalty for slamming other robots, when they come after us first.

I got to personally witness this event from about 10 feet away, and when I saw 66 slam into the wall I knew that something must have gone wrong. After the match I heard from one of our mentors what had happened, and I felt really bad that this had to happen to you guys during the finals. You had an awesome bot and a great autonomous. It's too bad it had to happen then, but the only thing you can do with the situation is to try and learn from it and get even better, which I am positive you guys will pull off.

Dave Flowerday 24-04-2005 23:42

Re: So close yet so far.
 
Quote:

Originally Posted by team66t-money
But for some reason FIRST did not ban green from the game or anywhere around it, and I think that because they did not, they pretty much ripped away our chance of making it farther in the finals in our division and maybe further in the championship.

I'm sorry to hear that the setup around the field caused you trouble, but the situation you encountered was an issue that could have been anticipated and solved by your team. Even if the announcer had not been wearing green, there could have easily been green being worn by people in the stands, or by other team members standing near the field, or even other robots painted green.

Phil 33 25-04-2005 00:48

Re: So close yet so far.
 
I was watching that match from the side of the field. It was very unfortunate. However, I have to ask you one question: when it was clear something was wrong with your autonomous mode, why didn't your human player step off the pad and disable your robot?

Quote:

Next year, if FIRST uses the camera again, they need to ban that certain color from around the field, because it could really hurt another team or it might get someone hurt. So please go to FIRST and let them know something must be done so this doesn't happen again.
I don't think it's fair to blame FIRST for this, nor do I think FIRST should have banned the color green around the field. They gave teams a very nice "out" this year by allowing them to disable their robots by having a human player step off the pad. Again, what happened in that match was very unfortunate, and I'm not trying to sound ungracious or unprofessional, but I don't think FIRST did anything wrong by not banning the color green around the field.

Wetzel 25-04-2005 01:08

Re: So close yet so far.
 
Quote:

Originally Posted by PURPLE!
Our program was so close... then the lighting on the field came up. Wow... talk about inconsistent. By the time the robot found the green, it would go up and then cast a shadow on it, because the lighting changed the color. :rolleyes: Okay, after a few rounds of losing the chance to cap, we got the numbers, and then we found out that shadowed yellow also picks up the tan in the volunteers' shirts... by the time we figured all of this out, we had no real time to actually DO the capping...

You guys came the closest when the only other team in Archimedes tried to cap the center goal and your vision tetras collided. That was heartbreaking!

I commend you for your efforts with the vision system.

Wetzel

ldeffenb 25-04-2005 08:47

Re: So close yet so far.
 
Quote:

Originally Posted by team66t-money
But for some reason FIRST did not ban green from the game or anywhere around it

Actually, had I known of your problem with green around the field, I'd have suggested our solution. We anticipated this based on something we read in the rules about distractions off of the field being our problem. We angled our camera so that it was staring at the floor about 3 feet in front of the robot. At that angle, the tetras and/or goal colors were still seen at the top (bottom in the T packets) of the camera's view. The color would have had to be inside the field wall or on the floor outside for us to see it. Hair would not have been a problem (unless your announcer, like Archimedes, did a head-stand on the field).
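For anyone wanting to try the same trick, here is a back-of-the-envelope sketch of the geometry. The mount height and field of view below are assumed numbers for illustration, not our actual measurements:

Code:

/* Rough geometry for aiming the camera down at the carpet a few feet ahead,
 * as described above. Mount height and field of view are assumptions. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double PI = 3.14159265358979;
    double mount_height_in = 24.0;   /* assumed camera height above the carpet */
    double look_ahead_in   = 36.0;   /* optical axis hits the floor ~3 ft out  */
    double vfov_deg        = 45.0;   /* assumed vertical field of view         */

    /* Tilt below horizontal so the center of the image lands on the aim point. */
    double tilt_deg = atan2(mount_height_in, look_ahead_in) * 180.0 / PI;

    /* Farthest floor point still visible at the top edge of the image. Every
     * ray in the frame hits the floor inside that distance, so anything taller
     * and farther away (announcers, the crowd) never enters the picture. */
    double top_edge_deg = tilt_deg - vfov_deg / 2.0;

    printf("tilt below horizontal: %.1f degrees\n", tilt_deg);
    if (top_edge_deg > 0.0)
        printf("floor visible out to about %.0f inches ahead\n",
               mount_height_in / tan(top_edge_deg * PI / 180.0));
    else
        printf("top of the frame reaches the horizon; tilt down further\n");
    return 0;
}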

Can you tell me your match numbers? I'd like to download SOAP 108's videos of them to see you all in action.

You can download our videos from my BitTorrent server at http://ldeffenb.dnsalias.net:30049. Look for the ones dated for the Championships. If you prefer SOAP's videos, we were in matches 5 (which they don't have right), 24, 42, 57, 71, 80, and 89. Match 80 is the one to watch, as we go head-to-head with 624, also trying for the vision cap. Their miss caused our miss. Such is life....

Lynn (D) - Team Voltage 386 Software and Coach

PS. BitTorrent clients can be found at www.BitTornado.com (Windoze) and www.BitTorrent.com (others).

PPS. The Tiki Trophy thread has a posting as to what caused our failure to cap. And we aren't blaming anyone on the field for any of it.

Kit Gerhart 25-04-2005 08:53

Re: So close yet so far.
 
Quote:

Originally Posted by Phil 33
I was watching that match from the side of the field. It was very unfortunate. However, I have to ask you one question: when it was clear something was wrong with your autonomous mode, why didn't your human player step off the pad and disable your robot?

My guess is that the HP didn't step off the pad because they didn't want a 30 point penalty.

Mike 25-04-2005 09:20

Re: So close yet so far.
 
Although I agree with you to an extent, it was just another obstacle that you had to overcome.

Biff 25-04-2005 10:41

Re: So close yet so far.
 
As a television engineer I have worked with cameras for over 25 years. Glib comments about "it's just another item you have to work around" don't help. Our attempts at getting anything useful out of our camera system were utter failures.

I watched 66 at GLR and I was impressed. From reading the forums here, they are one of only about 4 teams that had a camera system that worked at all. That's four teams out of about 1000. Other comments in the forums here lead me to believe that the numbers for calibrations were at best inconsistent. Looking at the playing fields with the skills I have learned shading cameras leads me to conclude that the only way to get a good set of calibration numbers was to put your robot out there and do it yourself. This was not accounted for.

The rules and the question-and-answer system mention programming for interference from colored objects. It was also stated that deliberate color schemes and clothing whose sole intent was to confuse the vision system were against the rules. So, bottom line: hats off to 66 and the other teams that went down the vision system road. In my opinion the match should have been replayed, as the announcer's clothing, although maybe not deliberate, clearly interfered with a vision system.

Like last year, FIRST clearly had high hopes for some kind of robot vision. The teams that had the best luck with vision last year clearly limited what they used the IR system for. Maybe in the future there can be time on the field set aside for teams to do the calibrations for future vision systems. My suggestion would be lunch on Thursday, first come first served (pun intended). After all, whining without offering a solution is not a GP thing to do.

Steve W 25-04-2005 10:53

Re: So close yet so far.
 
I was the announcer on Curie field. There was only one time that I was approached, and it was during the playoffs. When I came over to your team to see how we could remedy the situation, I was told that it was not me but a field reset crew member who had a green collar on his shirt. As the collar was the same color as the green tetra, I asked him to tuck it inside his T-shirt, which he did immediately. The fact was that you did not even come out to play that game. I was also told later by another member of your team that you were locking onto the volunteer passes that they were wearing.

It seems that there was a little confusion on your team as to why your vision was not working. I do laud you for giving it a go, but please don't attack others when you are not sure why you had problems. If you ask anyone there, they would tell you that I would do whatever necessary to help teams and not be a hindrance. I WANT to see teams succeed.

Dave Flowerday 25-04-2005 11:11

Re: So close yet so far.
 
Quote:

Originally Posted by Biff
Glib comments about "it's just another item you have to work around" don't help.

It's not a glib comment - other teams (mine included) anticipated this problem and successfully solved it. Perhaps I missed it, but I didn't see any questions on this forum about how to solve it, or I would have explained what we do. Part of engineering is anticipating possible problems and incorporating solutions into the design. One suggestion was already made (point the camera at the ground). Our solution was to have the camera look only at the exact locations where tetras should have been found, and to use the virtual window function of the camera to further restrict its view. We did not have a single problem with our vision system at the Championship. We successfully tracked the tetras and goals in every single match we played using the calibration values that FIRST provided. Since our software knew which spots the tetras were located in, we could intelligently decide which one to attempt to pick up, and in the cases where we knew that we couldn't pick up either of them (due to the robot's design and not having enough time), we instead would drive over to the autoloader so that it was ready to go when driver control started.
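For illustration, here is a minimal sketch of that "only look where tetras can legally be" logic. The spot count and helper functions are hypothetical stand-ins, stubbed so the sketch compiles and runs; this is not our actual code or the camera's real command set:

Code:

/* Scan only the legal tetra positions, then pick a target or fall back. */
#include <stdio.h>

#define NUM_SPOTS 2   /* vision tetra starting positions this robot can reach */

/* Hypothetical: aim the camera at a preset spot, confine tracking to a small
 * virtual window around it, and report whether enough target color is seen. */
static int spot_has_tetra(int spot)
{
    static const int simulated[NUM_SPOTS] = { 0, 1 };  /* pretend spot 1 is occupied */
    return simulated[spot];
}

static void drive_to_spot(int spot)   { printf("driving to tetra spot %d\n", spot); }
static void drive_to_autoloader(void) { printf("no reachable tetra: heading to the autoloader\n"); }

int main(void)
{
    int spot, target = -1;

    /* Green anywhere else is never examined, because the camera never looks there. */
    for (spot = 0; spot < NUM_SPOTS; spot++) {
        if (spot_has_tetra(spot)) { target = spot; break; }
    }

    if (target >= 0)
        drive_to_spot(target);     /* go after the tetra we actually saw */
    else
        drive_to_autoloader();     /* fallback described in the post above */
    return 0;
}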
Quote:

Other comments in the forums here lead me to believe that the numbers for calibrations were at best inconsistent. Looking at the playing fields with the skills I have learned shading cameras leads me to conclude that the only way to get a good set of calibration numbers was to put your robot out there and do it yourself.
We used the provided numbers at the regionals and Championship and they worked perfectly. I remember hearing after Week 1 regionals that some teams had trouble with the numbers, but I haven't really heard anything like that since. Did other teams have trouble with the numbers in Atlanta?
Quote:

It was also stated that deliberate color schemes and clothing whose sole intent was to confuse the vision system were against the rules.
... In my opinion the match should have been replayed, as the announcer's clothing, although maybe not deliberate, clearly interfered with a vision system.
Are you suggesting that the announcer was deliberately trying to confuse the camera with his clothing? I'm sure you're not, but FIRST made it clear that they would not go so far as to restrict people from wearing green or anything like that.

Jack Jones 25-04-2005 11:30

Re: So close yet so far.
 
Quote:

Originally Posted by team66t-money
… In the first match of our division finals, the announcer for Curie had green hair and pretty much an all-green shirt. Our camera was programmed to find all shades of the vision green, because the lights going on and off change the color. My team asked him politely to move until autonomous mode was over; he did move, but not far enough. The camera picked him up; the robot drove forward, turned toward him, and pretty much tried attacking him …

That's not the way I heard or saw it. I was the ref on the scorer's side, Blue end. After 66 slammed the wall late in the Qs, we removed anything green from the table behind me - that's where the robot aimed, right at me, not at Steve. It did no good. The robot came my way again. We then thought it was the green shirt on the auto-load attendant and asked him to vacate during autonomous next time. A short time later, the head ref told me that 66 had feedback showing that their video targeted my badge (maybe in concert with my hanging Thunder Chicken). That made perfect sense; it always works to blame the ref.

The odds of capping the center goal with a vision tetra were slim to none, as evidenced by the fact that nobody pulled it off. If we re-played every match that went without success, we'd still be trying to finish up the regionals.

the_short1 25-04-2005 12:17

Re: So close yet so far.
 
I honestly think team 66 could have been a STRONG contender in the finals. You guys seriously are one of my top 5 teams! Team 1596 felt your pain when your cable broke, as it happened to us at GLR (our third match), and recabling is not a fast, easy thing (am I right or what!).

Also, wasn't the yellow/black caution tape giving you guys problems? I think FIRST did not do enough this year about those colors. In autonomous, I think everyone wearing those colors should clear away from the field!

GO TEAM 66! Hope to see you guys again at Great Lakes next year, as you guys are AWESOME! And hopefully we can add to that 66:67 alliance!

JoeXIII'007 25-04-2005 22:50

Re: So close yet so far.
 
My initial reaction to what was happening on the field at Nationals was a bit angry, I must admit. I didn't like how things were going, and just how all of a sudden things went downhill.

But, being a humble FIRST participant, I rethought, reconsidered, and said to myself, "Hmmm, we could've gotten around this." Unfortunately, experience at past regionals told me that such measures were unneeded, and thus I and the team were sort of blinded to the possibility that we would have to do such a thing.

Between these 2 thoughts, I can't really say how I feel about what happened, but I am definitely sure about the following:
  • We tried, and I'm happy to be able to say that.
  • We got close, and were even successful 3 times in practice and once officially at GLR.
  • We gave something for everyone to see, and added to the suspense of the game. With that in mind, I've had a recurring thought about what would have happened if we had actually capped center or side consistently. Maybe that suspense would have been lost, the excitement diminished, and the curiosity of 'when will they get it right' would have gone extinct. It would be an expected, normal event. I even saw a referee give a very animated reaction when we got close at nationals. So, it was REALLY fun to see it work like this.
  • Very few others were doing this, so it was a real treat to see at least one robot go after something green.
  • Due to all of this, we received the Delphi "Driving Tomorrow's Technology" award, which I am eternally grateful to FIRST for giving. THANK YOU VERY MUCH.
Even though green objects outside the field led the robot to terminal damage, things happen, and it isn't the end of the world, thank God. I hope all who saw us enjoyed our little stunt of sorts, and I end my 3 cents there.
-Joe

dradius 25-04-2005 23:25

Re: So close yet so far.
 
Quote:

Originally Posted by Biff
Other comments in the forums here lead me to believe that the numbers for calibrations were at best inconsistent.
Looking at the playing fields with the skills I have learned shading cameras leads me to conclude that the only way to get a good set of calibration numbers was to put your robot out there and do it yourself. This was not accounted for.
The rules and the question-and-answer system mention programming for interference from colored objects. It was also stated that deliberate color schemes and clothing whose sole intent was to confuse the vision system were against the rules.

What actually makes it interesting is that it was more like 4 out of 1650 teams total, and 360-ish at nats.

Calibration values were utterly dismaying. Every field was wrong. The lighting was horrible: not only inconsistent, but also one-sided rather than omnidirectional. That made ugly shadows on the field, which messed with the color recognition, and it could even cast a shadow over the lens from the robot itself. In order to accommodate for the bad values, we simply demanded a calibration on the Archimedes field; there was nothing else to do.

I brought up a few things about distractions at the drivers' meeting on Thursday (I stayed until about 8:15, way after the meeting, to talk to the judges). Little was accomplished, and the judges explained that little could be done about the yellow caution tape on the field and the bad lighting. Something DID get their attention, though. I mentioned that in order to get the camera to recognize the yellow on the field, shadows in the hues and saturations had to be accounted for. Interestingly enough, the TAN shirts of the field volunteers were then a perfect target if the robot so chose. It's sad that it took that much calibration to get it to work correctly, but the Archimedes head referee allowed for one less distraction on the field: the volunteers were required to stand behind the drivers' boxes and away from the field during autonomous (this is why I talked to the head ref before every match, if a few of you were wondering; I had to remind him), and then they could resume play.
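To make that shadow trade-off concrete, here is a rough sketch. All RGB values and window bounds below are made-up numbers for illustration, not anyone's actual calibration; the real camera tracks a min/max color window in a similar spirit:

Code:

/* Widening the yellow window to keep shadowed tetras also lets tan shirts in. */
#include <stdio.h>

typedef struct { int r, g, b; } Rgb;
typedef struct { Rgb lo, hi; } ColorWindow;

static int in_window(Rgb c, ColorWindow w)
{
    return c.r >= w.lo.r && c.r <= w.hi.r &&
           c.g >= w.lo.g && c.g <= w.hi.g &&
           c.b >= w.lo.b && c.b <= w.hi.b;
}

int main(void)
{
    /* Hypothetical samples: yellow in direct light, yellow in shadow, tan shirt. */
    Rgb lit_yellow    = { 230, 200,  60 };
    Rgb shadow_yellow = { 150, 125,  55 };
    Rgb tan_shirt     = { 180, 150,  90 };

    /* Narrow window calibrated only on lit yellow. */
    ColorWindow narrow = { { 210, 180, 40 }, { 255, 220,  90 } };
    /* Widened window that also has to accept shadowed yellow. */
    ColorWindow wide   = { { 140, 115, 40 }, { 255, 220, 110 } };

    printf("narrow: lit=%d shadowed=%d tan shirt=%d\n",
           in_window(lit_yellow, narrow), in_window(shadow_yellow, narrow),
           in_window(tan_shirt, narrow));
    printf("wide  : lit=%d shadowed=%d tan shirt=%d\n",
           in_window(lit_yellow, wide), in_window(shadow_yellow, wide),
           in_window(tan_shirt, wide));
    return 0;
}

Run it and the widened window accepts both the shadowed tetra and the tan shirt, which is exactly the effect described above.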

The green was bad in some cases, and it's too bad that not only did the hex values for the colors they gave us for the vision tetra differ from what was used in the competition, but there was also little support at the competition for those whose sole purpose was to cap in autonomous. Considering the new hurdles thrown at us at nats, one might go so far as to think that FIRST had completely forgotten about, or even worse, stopped caring about, us select teams. Slightly discouraging. Slightly. 'Tis the case.

Vision is the next big thing on robots; maybe not color vision, but some sort. This is the 101 for those interested, a starting platform so to speak. Next year will use the cameras again, I'm sure of it. I just hope they realize the completely thoughtless mistakes they made this year regarding the use of it.

Will Hanashiro 26-04-2005 11:58

Re: So close yet so far.
 
First off, I would like to commend team 66 for their fantastic autonomous mode. In my eyes, they had the best vision seeking auto mode in the nation. Wait, let me correct that.... best vision seeking auto mode in the world.

What I do not understand is why some people are blaming team 66 for this incident. Why is it their fault that the camera locked onto the button of a person on the sidelines??? Had FIRST kept their lighting the same from regional to regional, day to day, even hour to hour, then team 66 would not have had to program it so that the camera picked up all shades of green. Had the lighting been consistent, I am confident that this never would have happened, and I believe that they would have CAPPED THE CENTER GOAL ON A REGULAR BASIS. That's how good they were.

66 - I feel your pain, and all I can say is that things just aren't fair sometimes. You guys did a fantastic job this year, and I'm glad that I was able to see your autonomous mode cap a goal in actual competition.

Dave Scheck 26-04-2005 12:54

Re: So close yet so far.
 
Let me preface this by saying -
To team 66 (and to others that attempted vision): Congratulations on your successes and near successes. Using the vision system this year was a huge leap of faith and to get it to work is commendable.

Quote:

Originally Posted by Will Hanashiro
What I do not understand is why some people are blaming team 66 for this incident.

I don't think that anyone is blaming them. I think that people are just pointing out that the environment was what it was and that it should have been taken into account.
Quote:

Why is it their fault that the camera locked onto the button of a person on the sidelines???
In all honesty, there was no reason that the camera should have been looking 5 feet off the ground for a vision tetra.
As has been mentioned above, there are ways that this could've been avoided.
Quote:

Had FIRST kept their lighting the same from regional to regional, day to day, even hour to hour, then team 66 would not have had to program it so that the camera picked up all shades of green.
That's a really absurd statement. Did you happen to notice that the entire roof of the Georgia Dome was blacked out? That probably cost them a pretty penny so that they could at least give teams that were using the vision system a fighting chance. That seems like a lot of money so that a few could show their stuff.

Granted the lighting was very different between the practice field and the dome, which was different from the regionals, but FIRST at least tried to help by giving us calibration values.
Quote:

all I can say is that things just aren't fair sometimes.
Really? I thought that they were perfectly fair. Each team had the same opportunity to get the vision system to work. I didn't see one team having an advantage over another at all.

Al Skierkiewicz 26-04-2005 13:28

Re: So close yet so far.
 
Quote:

Originally Posted by Biff
As a television engineer I have worked with cameras for over 25 years. Glib comments about "it's just another item you have to work around" don't help. Our attempts at getting anything useful out of our camera system were utter failures.

Biff,
I think that you have to agree that for all that was tried in vision systems, many teams didn't understand the variable nature of field lighting and white balance. At three regionals and the nationals, lighting was variable and different from either side of the field. I saw the vision tetra go black at one regional when viewed from the player station. You understand the intricacies of color matching with different light sources and know that the camera supplied just didn't have enough tweaks to get 100%, but the majority of teams did not. I would be surprised if FIRST paid for the blackout drape just to help out vision seekers based on regional results. It could have been more easily (and cheaply) handled with lighting at the player station, which would have made it easier to control the variations from field to field.
There were enough tools to compensate for most problems, but they take memory and code to implement. Vision was not as easy as it appeared in the CMU project in a small room. I think it was a good exercise for auto mode, but I would have liked to see the camera in the hands of teams in September. If we are going to use it next year, we should be told now, so some learning and testing can be performed. We don't need to know the game to do that, just that the camera will be included and what the color(s) will be. I would like to see the same colors used next year so teams can build on what they already know.
For those who didn't try but would like to in the future: green for you isn't the same green for a camera. Your brain gets in the way and can tell a lot about what is out in front of your eyes. A camera cannot make those decisions, and the same color in slightly different light looks like a completely different color to a camera. Green can be black or white or even blue under the right lighting conditions.

Steve W 26-04-2005 13:29

Re: So close yet so far.
 
Quote:

Originally Posted by Will Hanashiro
Had FIRST kept their lighting the same from regional to regional, day to day, even hour to hour, then team 66 would not have had to program it so that the camera picked up all shades of green. Had the lighting been consistent, I am confident that this never would have happened, and I believe that they would have CAPPED THE CENTER GOAL ON A REGULAR BASIS.

Let's be honest, you really don't know much about lighting. It is impossible to provide exactly the same lighting at the same event hour by hour. Lights run at different temperatures, which produce different colors. Different venues, different light sources, and different bulbs all lead to a not-so-perfect world. FIRST had certain set lighting points that had to be met at a minimum. As far as I know, those were met at every regional (they were at the 5 I attended).

Should everything be perfect? It would be nice. However, every floor at every venue was different. Did this stop people from running? No. Are football fields perfectly level, hockey rink ice the same, baseball diamonds the same dimensions? No, they are not, but they are still used and the game goes on. Let's not put FIRST down over things that they cannot control. The lighting in the dome changed from last year because of an event before we arrived. The lighting people had to come up with lighting that would meet FIRST's minimum requirements. They did a good job. I deal with lighting at our church and know through training seminars and hands-on experience that lighting is not an exact science. What works today may not be the same tomorrow. Let's try to improve things, not always tear them down, especially when they are things that are tough to control anyway.

JoeXIII'007 26-04-2005 16:07

Re: So close yet so far.
 
Since lighting is inconsistent by nature, and the calibration numbers given never really worked, I have a little suggestion. If the camera system is used next year, I hate to say this, but let's try to go analog on some of the components used in the camera. Perhaps some knobs to configure and calibrate RGB or YGR (whatever cam config FIRST decides on) intensity, so that when the robot is placed on the field, calibration can be done match by match. How? Turn the knob, and when the camera gets a good reading on that particular color, in its opinion, it will signal with a green light; otherwise it would stay red and request calibration. Do that for each color.

In programming, it would just return a 1 for something or a 0 for nothing, and the calibration wouldn't have to be handled by the computer. (How many times did our team have to reload the program due to just a simple change in calibration?) Quite frankly, the Java application provided was too slow, and not very dependable in my opinion. The way I suggested, you turn the knobs and you're on your way; that's it. I'm not saying that I want the easy way out, I just want a more logical and practical way to handle calibration.
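Just to sketch what that might look like in code (the knob read, the camera check, and the LED calls below are invented stand-ins, stubbed so it runs; they are not any real FIRST or camera API):

Code:

/* Knob-and-light calibration idea: turn a pot per color until the light goes green. */
#include <stdio.h>

#define NUM_COLORS 2   /* e.g. vision green and goal yellow */

static const char *color_name[NUM_COLORS] = { "green", "yellow" };

/* Hypothetical hardware hooks. */
static int read_knob(int color) { return (color == 0) ? 500 : 300; }  /* 0..1023 pot */
static int camera_pixels(int color, int center)
{
    /* Pretend the camera reports more tracked pixels as the knob setting
     * nears the "right" value for that color. */
    int best = (color == 0) ? 520 : 700;
    int diff = (center > best) ? center - best : best - center;
    return (diff < 60) ? 200 : 10;
}
static void set_led(int color, int good)
{
    printf("%s: %s\n", color_name[color],
           good ? "GREEN light, reading is good" : "RED light, keep turning");
}

int main(void)
{
    int color;

    /* One calibration pass per color; on the robot this would loop until
     * the operator is happy with both lights. */
    for (color = 0; color < NUM_COLORS; color++) {
        int center = read_knob(color);              /* knob sets the color center */
        int good   = camera_pixels(color, center) > 100;
        set_led(color, good);
    }
    return 0;
}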

-Joe

Lil' Lavery 26-04-2005 16:15

Re: So close yet so far.
 
Although I'm not sure if anyone used them, there were also yellow triangles under the goals and the colored loading stations that the cameras could look for, so if you banned green, you'd have to ban blue, red, and yellow as well to be fair. That would make it so pretty much the only colors you could wear are black and white. Even the cream of the volunteers' shirts appeared as green to the cameras, so because of that, the volunteers stood farther away from the field during autonomous.
Basically what I'm saying is, if they banned green, they would be forced to ban more. And if all the team shirts, MCs, announcers, volunteers, and robots were nothing but black, white, and metal, it would be a far blander game. The idea was for your camera to be able to pick out the right green object, and if that meant starting your camera tracking once the autonomous mode started, so that the announcer would be farther from your bot by then, so be it.

Dave Flowerday 26-04-2005 16:17

Re: So close yet so far.
 
Quote:

Originally Posted by JoeXIII'007
and the calibration numbers given never really worked

People keep saying this and I don't understand why. I'm surprised that other teams' experience was so different than ours. We used the calibration values provided at 2 different regionals and the Championship event and the camera tracked the specified colors perfectly every time, from both sides of the field, all day long.

Can you explain how you are certain that the calibration numbers didn't work? Are you basing this assumption simply off the fact that your robot didn't go where you wanted it to? If so then you may be jumping to conclusions, which will not help this situation for next year. Instead, this situation needs to be examined carefully to really determine the root cause of your problem or else when FIRST does somehow manage to provide perfect lighting next year and precise calibration values your robot will still drive into a wall.

JoeXIII'007 26-04-2005 16:27

Re: So close yet so far.
 
Quote:

Originally Posted by Dave Flowerday
Can you explain how you are certain that the calibration numbers didn't work? Are you basing this assumption simply off the fact that your robot didn't go where you wanted it to? If so then you may be jumping to conclusions, which will not help this situation for next year. Instead, this situation needs to be examined carefully to really determine the root cause of your problem or else when FIRST does somehow manage to provide perfect lighting next year and precise calibration values your robot will still drive into a wall.

We are VERY certain that some of the provided numbers didn't work. #1. We didn't use some of the provided numbers. We experimented around. #2. If we did use the provided numbers, and they didn't work, the robot would halt in the middle of autonomous, signifying that it didn't recognize or find the color. Sure, it didn't go where we wanted it to sometimes, but our team was testing that camera pretty much after every match, at every regional, and at nationals. There was no assuming as far as I'm concerned.

Quote:

Did you change any of the numbers in the code other than the exposure values?
No.

Dave Flowerday 26-04-2005 16:29

Re: So close yet so far.
 
Quote:

Originally Posted by JoeXIII'007
#1. We didn't use some of the provided numbers. We experimented around.

Did you change any of the numbers in the code other than the exposure values?

Biff 26-04-2005 22:49

Re: So close yet so far.
 
Quote:

Originally Posted by Dave Flowerday
It's not a glib comment - other teams (mine included) anticipated this problem and successfully solved it. Perhaps I missed it, but I didn't see any questions on this forum about how to solve it, or I would have explained what we do. Part of engineering is anticipating possible problems and incorporating solutions into the design. One suggestion was already made (point the camera at the ground). Our solution was to have the camera look only at the exact locations where tetras should have been found, and to use the virtual window function of the camera to further restrict its view. We did not have a single problem with our vision system at the Championship. We successfully tracked the tetras and goals in every single match we played using the calibration values that FIRST provided. Since our software knew which spots the tetras were located in, we could intelligently decide which one to attempt to pick up, and in the cases where we knew that we couldn't pick up either of them (due to the robot's design and not having enough time), we instead would drive over to the autoloader so that it was ready to go when driver control started. We used the provided numbers at the regionals and Championship and they worked perfectly. I remember hearing after Week 1 regionals that some teams had trouble with the numbers, but I haven't really heard anything like that since. Did other teams have trouble with the numbers in Atlanta?
Are you suggesting that the announcer was deliberately trying to confuse the camera with his clothing? I'm sure you're not, but FIRST made it clear that they would not go so far as to restrict people from wearing green or anything like that.

Dave, you hit the nail on the head with your description of how to use the vision system to get reliable results. By limiting where you are looking first, and then tracking to a color in only those locations, it makes sense to me why the calibration numbers worked for your system. If your team would be gracious enough to share the code or a white paper, more teams could come to an understanding of how to do it correctly. I was not suggesting the announcer was deliberately trying to cause interference; I was only pointing out that FIRST did provide a rule to discourage intentional vision system confusion. Further reading in this thread indicates 66's problems would have been solved if they had coded the way that your team did. Thanks for the insight. Tom Cooper

Goldeye 27-04-2005 02:33

Re: Camera Issues -- So close yet so far.
 
To everyone who has made a comment about how teams should have simply 'focused' their view on given areas, please realize how difficult a task this really is.

Even under ideal conditions, configuring the camera system was an uncanny feat. In many cases, testing conditions were extremely far from the conditions on a FIRST field, and often extremely detrimental to the camera's function. The java utility could not find any suitable exposure values in my school, due to poor lighting. Still, we used substitute colors that the camera managed to see, and attempted to continue working with those.

There were many problems occurring with many parts of the camera routines. In addition to the troublesome values, controlling the robot at a speed appropriate to the camera readings was a difficult, robot-dependent task. Endless tweaks were needed for that. As problems came up, we solved them; for example, by requiring a minimum size for any object the camera found. Still, by the end of the 6 weeks, we couldn't get the bot happily tracking anything.
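As an aside, the minimum-size check mentioned above can be as simple as the sketch below. The packet fields and thresholds are illustrative assumptions, loosely modeled on a color-tracking report, not anyone's actual code:

Code:

/* Ignore any tracked blob whose bounding box or pixel count is too small
 * to plausibly be a tetra, so stray matching pixels don't steer the robot. */
#include <stdio.h>

typedef struct { int x1, y1, x2, y2; int pixels; } TrackPacket;

#define MIN_PIXELS 30   /* assumed noise threshold; tune on the real field */
#define MIN_WIDTH   8
#define MIN_HEIGHT  8

static int blob_is_real(const TrackPacket *p)
{
    int w = p->x2 - p->x1;
    int h = p->y2 - p->y1;
    return p->pixels >= MIN_PIXELS && w >= MIN_WIDTH && h >= MIN_HEIGHT;
}

int main(void)
{
    TrackPacket noise = { 10, 10, 13, 12,   6 };  /* a few stray matching pixels */
    TrackPacket tetra = { 40, 60, 90, 110, 900 }; /* plausible tetra-sized blob  */

    printf("noise blob accepted: %d\n", blob_is_real(&noise));
    printf("tetra blob accepted: %d\n", blob_is_real(&tetra));
    return 0;
}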

Between all these problems and the need to get the robot tracking under near-ideal conditions, there's almost no time to create solutions for changing environmental conditions off of the field. I congratulate teams such as 111 for having the ability and time to create such thorough, reliable code, but for those making rash comments that make it seem like an easy thing to do... If FIRST plans to provide such a tool, they should take every reasonable step to avoid conflicting with it. Last year with the IR, and now with the vision, they're not doing their part to allow the teams to focus on the game challenge rather than the world around it.

Bcahn836 27-04-2005 06:43

Re: Camera Issues -- So close yet so far.
 
We got the camera to work in the shop, but we didn't really use it in competition. We picked up the vision tetra and almost put it on the left side center goal but that was an accident. lol. The tetra was placed right in front of us and our auto mode was to go straight while lifting the arm, turn left and drive to the human player loading station. It was awesome. In the shop we were having so many inconsistencies, so we stuck with our other 8 auto modes. I think the camera was mad at us for teasing it with green mountain dew bottles. Here robot, robot, robot, oops can't have the dew. :D

Mike 27-04-2005 09:57

Re: Camera Issues -- So close yet so far.
 
For everyone who was saying that the camera saw the caution tape, volunteer shirts, volunteer badges, etc.: I was the one who got the official calibration numbers for the Galileo field, and I tried getting numbers on the shirts and so on to see if these would cause problems. It wouldn't retrieve the numbers, though, so that disproves that they were distracting the camera.

Dave Scheck 27-04-2005 10:30

Re: Camera Issues -- So close yet so far.
 
Quote:

Originally Posted by Goldeye
To everyone who has made a comment about how teams should have simply 'focused' their view on given areas, please realize how difficult a task this really is.

I made such a comment and I realize how difficult it is. We spent a lot of time getting our camera interface working reliably. As was mentioned above, our robot scanned the 8 tetra positions to determine what to do. The scanning was really nothing more than pointing the camera at a desired location and looking at a tiny window to determine if the tetra was there. To make it even harder, we had to scan with our lift tilted, which complicated the positioning of the camera. Once the object was determined, the camera was manually pointed at it and then told to scan. Using this method, the camera was already locked onto the tetra and had no reason to be looking elsewhere.
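A minimal sketch of that scanning loop is below. The servo presets, the tilt offset, and the window check are invented for illustration and stubbed so it compiles; they are not the actual values or code used:

Code:

/* Preset pan/tilt aim points for each legal tetra spot, corrected for the
 * tilted lift, each checked with a tiny window before committing to one. */
#include <stdio.h>

#define NUM_SPOTS 8

typedef struct { int pan; int tilt; } ServoPos;

/* Hypothetical preset aim points, one per tetra position on our side. */
static const ServoPos spot_aim[NUM_SPOTS] = {
    {  60, 120 }, {  80, 118 }, { 100, 116 }, { 120, 114 },
    { 140, 114 }, { 160, 116 }, { 180, 118 }, { 200, 120 }
};

/* Hypothetical: nonzero if a small window at this aim point holds the color. */
static int window_sees_tetra(ServoPos aim) { return aim.pan == 120; }

int main(void)
{
    int lift_tilt_offset = -6;   /* assumed correction while the lift is tilted */
    int i;

    for (i = 0; i < NUM_SPOTS; i++) {
        ServoPos aim = spot_aim[i];
        aim.tilt += lift_tilt_offset;      /* compensate for the tilted lift */
        if (window_sees_tetra(aim)) {
            printf("tetra at spot %d: lock the camera here and go\n", i);
            return 0;
        }
    }
    printf("no tetra seen in any legal spot\n");
    return 0;
}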

I made my original comment about focusing as an argument for why a team needs to consider all of the elements that may possibly affect their robot's performance.

I don't know the exact situation of 66's mishap, but is it possible that they were the blue team and that their camera picked up the blue triangle by the autoloader? The camera had a really hard time distinguishing between blue and green....just a thought...

Quote:

Even under ideal conditions, configuring the camera system was an uncanny feat. In many cases, testing conditions were extremely far from the conditions on a FIRST field, and often extremely detrimental to the camera's function.
I 100% agree with you on this. The lighting between the practice field and the stadium was completely different. We had to download new numbers depending on where we were...but as much as that was a pain, it was something that we had to deal with.

Quote:

The java utility could not find any suitable exposure values in my school, due to poor lighting.
A solution that we found for this was to set up halogen work lights around our practice area. You may want to try this and see if it helps.

Dave Flowerday 27-04-2005 12:37

Re: So close yet so far.
 
Quote:

Originally Posted by Biff
If your team would be gracious enough to share the code or a white paper more teams could come to an understanding of how to do it correctly.

I agree, and I'd like to put a whitepaper together along with some of the code we used. However, I can't promise anything just yet. As you might imagine, our interest in working on anything FIRST related diminishes rapidly after Championships as we catch up on the rest of our lives. I was already starting to work on something the other night before you even mentioned this though so maybe it'll get done.. ;)
Quote:

Originally Posted by Goldeye
but for those making rash comments making it seem like an easy thing to do

I don't think anyone was (intentionally) suggesting that any of this was easy to do. The fact that not a single team was able to cap the center goal in autonomous is proof-positive that this is really hard stuff.

Greg Marra 27-04-2005 15:38

Re: Camera Issues -- So close yet so far.
 
On 177 we managed to code our camera robustly enough to be able to change calibration on the fly during the pre-auton time, as well as to use a non-auto-servo-mode tracking method that was able to locate the tetra about 75% of the time (from my estimates).

The only problem was, the rest of our autonomous was being a bit sketchy, so we never did anything except drive towards the vision tetra a little bit...

I would love to see FIRST use the camera again next year. Just hopefully for a slightly easier task...

BillP 28-04-2005 09:36

Re: Camera Issues -- So close yet so far.
 
Quote:

Originally Posted by MikeWasHere05
For everyone that was saying that the camera saw the caution tape, volunteer shirts, volunteer badges, etc: I was the one who got the official Calibration numbers for Galileo field and I tried getting numbers on the shirts, etc to see if these made problems. It wouldn't retrieve the numbers though, so that disproves that they were distracting the camera.

You have to be very careful when making general statements like this. Please keep in mind that all teams using the camera had different mounting systems and different software. We were able to get on the Archimedes field to take some calibrations and discovered that the numbers we got from admin were close, but not exact. However, when we cast shadows on the green and yellow, the numbers were WAY off. Through a little trial and error, we discovered that if the program was calibrated to green and yellow WITH shadows, it still recognized the colors without them but the opposite was not true. In other words, if you calibrated the program without shadows on the colors, it lost them when a shadow fell on the tetra or goal marker.

The fact that you could not get readings off the "distractions" only proves that on that field, at that time, you were unable to get readings off the distractions. I KNOW that our camera locked onto a volunteer's shirt for the following reasons: 1) If the camera does not recognize the color it's looking for, the drive-train shuts down (safety feature) 2) In the match in question, the robot had to turn approximately 30 degrees to line up with the volunteer with the tan shirt and then 3) moved directly toward him. If the camera had not recognized the tan shirt as yellow (this was after we already picked up the tetra) the robot would not have moved toward him.
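For anyone implementing a similar interlock, here is a tiny sketch of the idea. The function names, gains, and thresholds are illustrative assumptions, not the actual code:

Code:

/* If the camera is not confidently tracking its color, zero the drive
 * outputs instead of chasing a stale heading. */
#include <stdio.h>

#define CONFIDENCE_MIN 40   /* assumed minimum tracked-pixel count to trust */

typedef struct { int left; int right; } Drive;

static Drive steer_toward_target(int error)
{
    Drive d;
    d.left  = 60 + error / 4;   /* crude proportional steer toward the target */
    d.right = 60 - error / 4;
    return d;
}

static Drive vision_drive(int confidence, int error)
{
    Drive stop = { 0, 0 };
    if (confidence < CONFIDENCE_MIN)
        return stop;            /* color lost: shut the drivetrain down */
    return steer_toward_target(error);
}

int main(void)
{
    Drive lost   = vision_drive(5, 12);    /* camera not locked on the color */
    Drive locked = vision_drive(120, 12);  /* camera confidently tracking    */

    printf("color lost  : left=%d right=%d\n", lost.left, lost.right);
    printf("color locked: left=%d right=%d\n", locked.left, locked.right);
    return 0;
}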

I could get on my soapbox and spell out everything that FIRST could have done differently, but I would rather just congratulate all the teams that tried to work with the less than optimal parameters at Nationals. In retrospect, the real world is seldom how we would like it to be, and this was an excellent example of how you have to adapt in order to function.

dradius 04-05-2005 23:17

Re: Camera Issues -- So close yet so far.
 
Quote:

Originally Posted by MikeWasHere05
It wouldn't retrieve the numbers though, so that disproves that they were distracting the camera.

One thing, though: the calibration values varied from field to field, and from what I've heard there were different results in each arena as well. Adding to that, Archimedes specifically didn't have the right values. I say this because we realized that though they were right theoretically, they didn't account for the shadows cast on the field. Had there been better and more omnidirectional or overhead lighting, they probably would have worked.
Here's the deal: the shirts are only picked up under ONE circumstance, which is that you account for shadows on the field and calibrate the yellow you're supposed to track with that in mind. Interestingly enough, picking up the shirts required the exact situation that was necessary to calibrate effective numbers on our field.

That's what we found, and we took precautions not only for our own ability but for their safety.

