Re: So close yet so far.
My initial reaction to what was happening on the field at Nationals was, I must admit, a bit of anger. I didn't like how things were going, or how suddenly everything went downhill.
But, being a humble FIRST participant, I rethought, reconsidered, and said to myself, "Hmmm, we could've gotten around this." Unfortunately, experience at past regionals told me that such measures were unnecessary, and so the team and I were somewhat blind to the possibility that we would have to do such a thing. Between these two thoughts, I can't really say how I feel about what happened, but I am definitely sure about the following:
-Joe
Re: So close yet so far.
The calibration values were utterly dismaying. Every field was wrong. The lighting was horrible: not only inconsistent, but one-sided rather than omnidirectional. That cast ugly shadows on the field, which interfered with the color recognition; the robot could even cast a shadow over its own lens. To accommodate the bad values, we simply demanded a calibration on the Archimedes field; there was nothing else to do. I also brought up a few things about distractions at the drivers' meeting on Thursday (I stayed until about 8:15, well after the meeting, to talk to the judges). Little was accomplished, and the judges explained that little could be done about the yellow caution tape on the field or the bad lighting. Something DID get their attention, though. I mentioned that in order to get the camera to recognize the yellow on the field, shadows in the hues and saturations had to be accounted for, and interestingly enough, that made the TAN shirts of the field volunteers a perfect target if the robot so chose. It's sad that it took that much calibration to get it to work correctly, but the Archimedes head referee allowed for one less distraction on the field: the volunteers were required to stand behind the drivers' boxes, away from the field, during autonomous (this is why I talked to the head ref before every match, if a few of you are wondering; I had to remind him), and then they could resume play. The green was bad in some cases, and it's too bad not only that the hex values FIRST gave us for the vision tetra differed from what was used in competition, but also that there was little support at the competition for teams whose sole purpose was to cap in autonomous. Considering the new hurdles thrown at us at Nationals, one might go so far as to think that FIRST had completely forgotten about, or even worse, stopped caring about, us select teams. Slightly discouraging. Slightly. 'Tis the case.
Vision is the next big thing on robots; maybe not color vision, but some sort. This year was the 101 for those interested, a starting platform, so to speak. Next year will use the cameras again; I'm sure of it. I just hope they realize the thoughtless mistakes they made this year regarding its use.
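A rough sketch of the tradeoff described above. This is not team code and the numbers are invented; it just illustrates the CMUcam-style per-channel window test, and how widening the windows to tolerate shadowed yellow also lets a near-miss color like a tan shirt slip through:

```python
def matches(pixel, lo, hi):
    """True if every channel of an (r, g, b) pixel lies inside its window."""
    return all(l <= p <= h for p, l, h in zip(pixel, lo, hi))

# Tight calibration for field yellow: only bright, saturated yellow passes.
tight_lo, tight_hi = (180, 160, 0), (255, 255, 80)
# Widened to tolerate shadows: darker, less saturated values also pass.
wide_lo, wide_hi = (120, 100, 0), (255, 255, 160)

shadowed_yellow = (150, 130, 40)   # field yellow in shadow (invented value)
tan_shirt = (210, 180, 140)        # volunteer shirt (invented value)

print(matches(shadowed_yellow, tight_lo, tight_hi))  # False: the shadow defeats tight windows
print(matches(shadowed_yellow, wide_lo, wide_hi))    # True: widened windows recover it
print(matches(tan_shirt, wide_lo, wide_hi))          # True: but now the tan shirt matches too
```

The wider the acceptance windows, the more non-targets fall inside them, which is exactly why the volunteers had to move away from the field.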
Re: So close yet so far.
First off, I would like to commend team 66 for their fantastic autonomous mode. In my eyes, they had the best vision-seeking auto mode in the nation. Wait, let me correct that: the best vision-seeking auto mode in the world.
What I do not understand is why some people are blaming team 66 for this incident. Why is it their fault that the camera locked onto the button of a person on the sidelines? Had FIRST kept their lighting the same from regional to regional, day to day, even hour to hour, then team 66 would not have had to program the camera to pick up all shades of green. Had the lighting been consistent, I am confident this never would have happened, and I believe they would have CAPPED THE CENTER GOAL ON A REGULAR BASIS. That's how good they were. 66, I feel your pain, and all I can say is that things just aren't fair sometimes. You guys did a fantastic job this year, and I'm glad I was able to see your autonomous mode cap a goal in actual competition.
Re: So close yet so far.
Let me preface this by saying:
To team 66 (and to the other teams that attempted vision): congratulations on your successes and near successes. Using the vision system this year was a huge leap of faith, and getting it to work is commendable.
As has been mentioned above, there are ways this could've been avoided. Granted, the lighting was very different between the practice field and the dome, which in turn differed from the regionals, but FIRST at least tried to help by giving us calibration values.
Re: So close yet so far.
I think you have to agree that, for all that was tried with vision systems, many teams didn't understand the variable nature of field lighting and white balance. At three regionals and the Nationals, lighting was variable and different on either side of the field. I saw the vision tetra go black at one regional when viewed from the player station. You understand the intricacies of color matching under different light sources and know that the supplied camera just didn't have enough tweaks to get to 100%, but the majority of teams did not.

I would be surprised if FIRST paid for the blackout drape just to help out vision seekers based on regional results. It could have been more easily (and cheaply) handled with lighting at the player station, which would also have made the variations from field to field easier to control. There were enough tools to compensate for most problems, but they take memory and code to implement. Vision was not as easy as it appeared in the CMU project in a small room.

I think it was a good exercise for auto mode, but I would have liked to see the camera in the hands of teams in September. If we are going to use it next year, we should be told now, so some learning and testing can be performed. We don't need to know the game to do that, just that the camera will be included and what the color(s) will be. I would like to see the same colors used next year so teams can build on what they already know.

For those who didn't try but would like to in the future: green for you isn't the same green for a camera. Your brain gets in the way and can tell a lot about what is in front of your eyes. A camera cannot make those decisions, and the same color in slightly different light looks like a completely different color to a camera. Green can be black or white or even blue under the right lighting conditions.
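A toy illustration of that last point, with entirely invented numbers: to a first approximation, the RGB a camera records is the surface's reflectance scaled channel by channel by the light falling on it, so the same green surface can come out green, near-black, or blue-shifted under different lighting.

```python
def camera_rgb(reflectance, illuminant):
    """Per-channel product of surface reflectance (0-1) and light intensity (0-255)."""
    return tuple(round(r * i) for r, i in zip(reflectance, illuminant))

green_surface = (0.25, 0.75, 0.25)   # reflects mostly green (invented)

stage_light = (255, 255, 255)        # neutral white light
dim_warm    = (120, 80, 60)          # dim, warm light
cool_blue   = (60, 120, 255)         # blue-heavy wash

print(camera_rgb(green_surface, stage_light))  # (64, 191, 64): reads as green
print(camera_rgb(green_surface, dim_warm))     # (30, 60, 15): reads as near-black
print(camera_rgb(green_surface, cool_blue))    # (15, 90, 64): blue-shifted
```

A human observer discounts the illuminant automatically (color constancy); a fixed set of channel thresholds cannot, which is why one calibration per field was never going to be enough.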
Re: So close yet so far.
Should everything be perfect? It would be nice. However, every floor at every venue was different. Did this stop people from running? No. Are football fields perfectly level, hockey rinks iced the same, baseball diamonds the same dimensions? No, they are not, but they are still used and the game goes on. Let's not put FIRST down for things they cannot control. The lighting in the dome changed from last year because of an event held before we arrived; the lighting people had to come up with lighting that would meet FIRST's minimum requirements, and they did a good job. I deal with lighting at our church and know, through training seminars and hands-on experience, that lighting is not an exact science. What works today may not work the same tomorrow. Let's try to improve things, not always tear them down, especially when they are things that are tough to control anyway.
Re: So close yet so far.
Since lighting is inconsistent by nature, and the calibration numbers we were given never really worked, I have a little suggestion. If the camera system is used next year, I hate to say this, but let's try to go analog on some of the components used in the camera. Perhaps some knobs to configure and calibrate the RGB or YGR intensity (whatever camera configuration FIRST decides on), so that when the robot is placed on the field, calibration can be done match by match. How? Turn the knob, and when the camera gets a good reading on that particular color, it signals with a green light; otherwise it stays red and requests calibration. Do that for each color.
In the program, it would just return a 1 for a match or a 0 for nothing, and the calibration wouldn't have to be handled by the computer. (How many times did our team have to reload the program over a simple change in calibration?) Quite frankly, the Java application provided was too slow, and not very dependable in my opinion. With the way I suggested, you turn the knobs and you're on your way; that's it. I'm not saying I want the easy way out; I just want a more logical and practical way to handle calibration. -Joe
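One way the knob idea above could work, sketched in Python with every name, range, and threshold invented: each pot sets the center of one channel's acceptance window, and the go/no-go light is just a 1/0 answer to "is the camera's current reading inside every window?"

```python
WINDOW = 25  # half-width of each channel's acceptance window (invented)

def knob_to_center(knob_value):
    """Map a 0-1023 pot reading (a typical 10-bit ADC) to a 0-255 channel center."""
    return knob_value * 255 // 1023

def calibrated(reading, knob_values):
    """Return 1 (green light) if every channel of the camera reading falls
    within WINDOW of its knob's center, else 0 (red light: keep turning)."""
    centers = [knob_to_center(k) for k in knob_values]
    return int(all(abs(r - c) <= WINDOW for r, c in zip(reading, centers)))

# The operator dials the knobs until the light goes green on the field's actual color.
field_green = (40, 200, 60)
print(calibrated(field_green, (160, 800, 240)))  # knobs dialed near the target -> 1
print(calibrated(field_green, (512, 512, 512)))  # knobs left at midscale -> 0
```

The appeal is exactly what the post says: the thresholds live in hardware, so a lighting change means turning knobs at the field instead of reloading the program.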
Re: So close yet so far.
Although I'm not sure if anyone used them, there were also the yellow triangles under the goals and the colored loading stations that the cameras could look for, so if you banned green, you'd have to ban blue, red, and yellow as well to be fair. That would make it so that pretty much the only colors you could wear are black and white. Even the cream of the volunteers' shirts appeared as green to the cameras, which is why the volunteers stood farther away from the field during autonomous.
Basically, what I'm saying is: if they banned green, they would be forced to ban more. And if all the team shirts, MCs, announcers, volunteers, and robots were nothing but black, white, and metal, it would be a far blander game. The idea was for your camera to be able to pick out the right green object, and if that meant starting your camera tracking once the autonomous mode began, so that the announcer would be farther from your bot by then, so be it.
Re: So close yet so far.
Can you explain how you are certain that the calibration numbers didn't work? Are you basing this assumption simply on the fact that your robot didn't go where you wanted it to? If so, you may be jumping to conclusions, which will not help this situation for next year. Instead, this situation needs to be examined carefully to determine the root cause of your problem, or else, when FIRST does somehow manage to provide perfect lighting and precise calibration values next year, your robot will still drive into a wall.
Re: Camera Issues -- So close yet so far.
To everyone who has commented that teams should have simply 'focused' their view on given areas: please realize how difficult a task this really is.
Even under ideal conditions, configuring the camera system was no small feat. In many cases, testing conditions were extremely far from the conditions on a FIRST field, and often extremely detrimental to the camera's function. The Java utility could not find any suitable exposure values in my school, due to poor lighting. Still, we used substitute colors that the camera managed to see and attempted to keep working with those. There were problems in many parts of the camera routines. On top of the troublesome values, driving the robot at a speed appropriate to the camera readings was a difficult, robot-dependent task; endless tweaks were needed for that. As problems came up, we solved them, for example by requiring a minimum size for any object the camera found. Still, by the end of the six weeks, we couldn't get the bot happily tracking anything. Between all these problems and the need to get the robot tracking under near-ideal conditions, there was almost no time left to create solutions for changing environmental conditions off the field. I congratulate teams such as 111 on having the ability and time to create such thorough, reliable code, but for those making rash comments making it seem like an easy thing to do... If FIRST plans to provide such a tool, they should take every reasonable step to avoid conflicting with it. Last year with the IR, and now with vision, they're not doing their part to let teams focus on the game challenge rather than the world around it.
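The minimum-size fix mentioned above can be sketched in a few lines (the thresholds and the bounding-box format are invented for illustration, not taken from any team's code): reject any tracked blob whose bounding box is too small, so a speck of matching color in the background can't hijack the tracker.

```python
MIN_WIDTH, MIN_HEIGHT = 12, 12  # pixels; would be tuned on the real field

def plausible_target(blob):
    """blob is (x1, y1, x2, y2), the corners of the camera's tracking box."""
    x1, y1, x2, y2 = blob
    return (x2 - x1) >= MIN_WIDTH and (y2 - y1) >= MIN_HEIGHT

detections = [
    (10, 10, 14, 13),   # 4x3 speck: reject
    (30, 20, 70, 55),   # 40x35 box: plausible tetra-sized target
]
targets = [b for b in detections if plausible_target(b)]
print(targets)  # [(30, 20, 70, 55)]
```

It's a cheap filter, but it only handles small distractions; a shirt or badge close to the lens can still produce a large, plausible-looking blob, which is the failure mode the thread is about.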
Re: Camera Issues -- So close yet so far.
We got the camera to work in the shop, but we didn't really use it in competition. We picked up the vision tetra and almost put it on the left side center goal but that was an accident. lol. The tetra was placed right in front of us and our auto mode was to go straight while lifting the arm, turn left and drive to the human player loading station. It was awesome. In the shop we were having so many inconsistencies, so we stuck with our other 8 auto modes. I think the camera was mad at us for teasing it with green mountain dew bottles. Here robot, robot, robot, oops can't have the dew. :D
Re: Camera Issues -- So close yet so far.
For everyone saying that the camera saw the caution tape, volunteer shirts, volunteer badges, etc.: I was the one who got the official calibration numbers for the Galileo field, and I tried getting numbers on the shirts and so on to see whether they were causing problems. The utility wouldn't retrieve numbers for them, though, which suggests they weren't what was distracting the camera.
Copyright © Chief Delphi