  #16
25-04-2005, 22:50
JoeXIII'007
Pragmatic Strategy, I try...
AKA: Joeseph Smith
FRC #0066
Team Role: Alumni

Join Date: Feb 2004
Rookie Year: 2001
Location: Ypsilanti, MI (Ann Arbor's shadow)
Posts: 753
Re: So close yet so far.

My initial reaction to what was happening on the field at Nationals was, I must admit, a bit angry. I didn't like how things were going, or how suddenly everything went downhill.

But, being a humble FIRST participant, I rethought, reconsidered, and said to myself, "Hmmm, we could've gotten around this." Unfortunately, experience at past regionals told me that such measures were unneeded, and so the team and I were somewhat blind to the possibility that we would have to do such a thing.

Between these 2 thoughts, I can't really say how I feel about what happened, but I am definitely sure about the following:
  • We tried, and I'm happy to be able to say that.
  • We got close, and were even successful 3 times in practice and once officially at GLR.
  • We gave something for everyone to see, and added to the suspense of the game. With that in mind, I've had a recurring thought about what would happen if we had actually capped the center or side consistently. Maybe that suspense would have been lost, the excitement diminished, and the curiosity of "when will they get it right?" would have gone extinct. It would have been an expected, normal event. I even saw a referee give a very animated reaction when we got close at Nationals. So, it was REALLY fun to see it work like this.
  • Very few others were doing this, so it was a real treat to see at least one robot go after something green.
  • Due to all of this, we received the Delphi "Driving Tomorrow's Technology" award, which I am eternally grateful to FIRST for giving. THANK YOU VERY MUCH.
Even though green objects outside the field led the robot to terminal damage, things happen, and it isn't the end of the world, thank God. I hope all who saw us enjoyed our little stunt of sorts, and I end my 3 cents there.
-Joe
__________________
Joeseph P. Smith
jpthesmithe.com
University of Michigan - Informatics (B. Sci. 2012)
General Purpose Programmer - Cooperative Institute for Limnology and Ecosystems Research (CILER) at NOAA-GLERL
  #17
25-04-2005, 23:25
dradius
breakdancaz
AKA: ryan
FRC #0624 (CRyptonite)
Team Role: Engineer

Join Date: Jan 2004
Rookie Year: 2001
Location: Katy, TX
Posts: 37
Re: So close yet so far.

Quote:
Originally Posted by Biff
Other comments in the forums here lead me to believe that the numbers for calibration were at best inconsistent.
Looking at the playing fields, the skills I have learned shading cameras lead me to conclude that the only way to get a good set of calibration numbers was to put your robot out there and do it yourself. This was not accounted for.
The rules and the question-and-answer system mention programming for interference from colored objects. It was also stated that deliberate color schemes and clothing whose sole intent was to confuse the vision system were against the rules.
What actually makes it interesting is that it was more like 4 out of 1650 teams total, and 360-ish at Nationals.

Calibration values were utterly dismaying. Every field was wrong. The lighting was horrible--not only inconsistent, but also one-sided rather than omnidirectional. That made ugly shadows on the field which messed with the color recognition, and the robot could even cast a shadow over its own lens. To accommodate the bad values, we simply demanded a calibration on the Archimedes field--there was nothing else to do.

I also brought up a few things about distractions at the drivers' meeting on Thursday (I stayed until about 8:15, way after the meeting, to talk to the judges). Little was accomplished, and the judges explained that little could be done about the yellow caution tape on the field and the bad lighting. Something DID get their attention, though. I mentioned that in order to get the camera to recognize the yellow on the field, shadows had to be accounted for in the hues and saturations. Interestingly enough, the TAN shirts of the field volunteers were a perfect target if the robot so chose. It's sad that it took that much calibration to get it to work correctly, but the Archimedes head referee allowed for one less distraction on the field: the volunteers were required to stand behind the driver boxes and away from the field during autonomous (this is why I talked to the head ref before every match, if a few of you are wondering; I had to remind him), and then they could return for the rest of the match.
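To make the shadow problem concrete, here is a toy illustration of what happens once you widen the green window enough to survive shadows. Every number here is invented for the example; these are not the actual field calibration values.
Code:
/* Toy illustration: when the green bounds are widened to catch shadowed
 * tetras, other colors (like a tan shirt) can land inside them too.
 * All numbers are made up for the example. */
#include <stdio.h>

typedef struct { int r, g, b; } rgb_t;

/* Widened "green" window: low enough to still match a shadowed tetra. */
static const rgb_t lo = {  40,  80,  40 };
static const rgb_t hi = { 160, 220, 160 };

static int in_window(rgb_t c)
{
    return c.r >= lo.r && c.r <= hi.r &&
           c.g >= lo.g && c.g <= hi.g &&
           c.b >= lo.b && c.b <= hi.b;
}

int main(void)
{
    rgb_t shadowed_tetra = {  60, 120,  70 };   /* what we wanted to match */
    rgb_t tan_shirt      = { 150, 140, 110 };   /* what also matches       */

    printf("shadowed tetra matches: %d\n", in_window(shadowed_tetra)); /* prints 1 */
    printf("tan shirt matches:      %d\n", in_window(tan_shirt));      /* prints 1 */
    return 0;
}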

The green was bad in some cases, and it's too bad that not only did the hex values we were given for the vision tetra colors differ from what was actually used at the competition, but there was also little support AT the competition for those whose sole purpose there was to cap in autonomous. Considering the new hurdles thrown at us at Nationals, one might go so far as to think that FIRST had completely forgotten about, or even worse, stopped caring about, the select teams attempting this. Slightly discouraging. Slightly. 'Tis the case.

Vision is the next big thing on robots--maybe not color vision, but some sort. This year was the 101 course for those interested, a starting platform, so to speak. Next year will use the cameras again; I'm sure of it. I just hope they realize the thoughtless mistakes they made this year regarding its use.
__________________
pessimist: glass is half empty
optimist: glass is half full
engineer: glass is twice the size it needs to be

--bringing green spikes and fishnets back in to style, baby--
  #18
26-04-2005, 11:58
Will Hanashiro
GM/Kettering Mentor
FRC #0322 (F.I.R.E)
Team Role: Mentor

Join Date: Apr 2003
Rookie Year: 1999
Location: Flint, Michigan
Posts: 151
Re: So close yet so far.

First off, I would like to commend team 66 for their fantastic autonomous mode. In my eyes, they had the best vision-seeking auto mode in the nation. Wait, let me correct that... the best vision-seeking auto mode in the world.

What I do not understand is why some people are blaming team 66 for this incident. Why is it their fault that the camera locked onto the button of a person on the sidelines??? Had FIRST kept their lighting the same from regional to regional, day to day, even hour to hour, then team 66 would not have had to program it so that the camera picked up all shades of green. Had the lighting been consistent, I am confident that this never would have happened, and I believe that they would have CAPPED THE CENTER GOAL ON A REGULAR BASIS. That's how good they were.

66 - I feel your pain, and all I can say is that things just aren't fair sometimes. You guys did a fantastic job this year, and I'm glad that I was able to see your autonomous mode cap a goal in actual competition.
__________________

West Michigan Regional #1 Seed
2006 GLR Motorola Quality Award
2006 WMR GM Industrial Design Award

Kettering University ME Class of 2009
  #19
26-04-2005, 12:54
Dave Scheck
Registered User
FRC #0111 (WildStang)
Team Role: Engineer

Join Date: Feb 2003
Rookie Year: 2002
Location: Arlington Heights, IL
Posts: 574
Re: So close yet so far.

Let me preface this by saying -
To team 66 (and to others that attempted vision): Congratulations on your successes and near successes. Using the vision system this year was a huge leap of faith and to get it to work is commendable.

Quote:
Originally Posted by Will Hanashiro
What I do not understand is why some people are blaming team 66 for this incident.
I don't think that anyone is blaming them. I think that people are just pointing out that the environment was what it was and that it should have been taken into account.
Quote:
Why is it their fault that the camera locked onto the button of a person on the sidelines???
In all honesty, there was no reason that the camera should have been looking 5 feet off the ground for a vision tetra.
As has been mentioned above, there are ways that this could've been avoided.
Quote:
Had FIRST kept their lighting the same from regional to regional, day to day, even hour to hour, then team 66 would not have had to program it so that the camera picked up all shades of green.
That's a really absurd statement. Did you happen to notice that the entire roof of the Georgia Dome was blacked out? That probably cost them a pretty penny so that they could at least give teams that were using the vision system a fighting chance. That seems like a lot of money so that a few could show their stuff.

Granted the lighting was very different between the practice field and the dome, which was different from the regionals, but FIRST at least tried to help by giving us calibration values.
Quote:
all I can say is that things just aren't fair sometimes.
Really? I thought that they were perfectly fair. Each team had the same opportunity to get the vision system to work. I didn't see one team having an advantage over another at all.
  #20
26-04-2005, 13:28
Unsung FIRST Hero
Al Skierkiewicz
Broadcast Eng/Chief Robot Inspector
AKA: Big Al WFFA 2005
FRC #0111 (WildStang)
Team Role: Engineer

Join Date: Jun 2001
Rookie Year: 1996
Location: Wheeling, IL
Posts: 10,766
Re: So close yet so far.

Quote:
Originally Posted by Biff
As a television engineer I have worked with cameras for over 25 years. Glib comments about "it's just another item you have to work around" don't help. Our attempts at getting anything useful out of our camera system were utter failures.
Biff,
I think you have to agree that, for all that was tried in vision systems, many teams didn't understand the variable nature of field lighting and white balance. At three regionals and the Nationals, lighting was variable and different from either side of the field. I saw the vision tetra go black at one regional when viewed from the player station. You understand the intricacies of color matching with different light sources and know that the camera supplied just didn't have enough tweaks to get to 100%, but the majority of teams did not. I would be surprised if FIRST paid for the blackout drape just to help out vision seekers, based on regional results. It could have been handled more easily (and cheaply) with lighting at the player station, which would also have made it easier to control the variations from field to field.
There were enough tools to compensate for most problems, but they take memory and code to implement. Vision was not as easy as it appeared in the CMU project's small-room demos. I think it was a good exercise for auto mode, but I would have liked to see the camera in the hands of teams in September. If we are going to use it next year, we should be told now, so some learning and testing can be performed. We don't need to know the game to do that, just that the camera will be included and what the color(s) will be. I would like to see the same colors used next year so teams can build on what they already know.
For those who didn't try but would like to in the future: green for you isn't the same green for a camera. Your brain gets in the way and can tell a lot about what is out in front of your eyes. A camera cannot make those decisions, and the same color in slightly different light looks like a completely different color to a camera. Green can be black or white or even blue under the right lighting conditions.
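A rough numerical illustration of that last point. The reflectance and lamp numbers below are invented, but the effect is the same one you see on a real field when the color temperature changes between venues:
Code:
/* Rough illustration of why "green" is not a fixed camera value: the same
 * surface reflectance, lit by lamps of different color temperature, produces
 * very different raw RGB.  All numbers are invented for the example. */
#include <stdio.h>

int main(void)
{
    /* Per-channel reflectance of a green tetra face (0..1). */
    double refl[3] = { 0.20, 0.60, 0.25 };

    /* Relative channel energy of two light sources (invented values). */
    double warm_spot[3]  = { 1.00, 0.80, 0.55 };  /* tungsten-ish spotlight */
    double cool_flood[3] = { 0.70, 0.95, 1.00 };  /* daylight-ish floodlight */

    for (int ch = 0; ch < 3; ch++) {
        double under_warm = 255.0 * refl[ch] * warm_spot[ch];
        double under_cool = 255.0 * refl[ch] * cool_flood[ch];
        printf("channel %d: warm lamp = %5.1f   cool lamp = %5.1f\n",
               ch, under_warm, under_cool);
    }
    /* The raw values differ enough that a fixed RGB window tuned under one
     * lamp can miss (or mis-match) the same object under the other unless
     * the camera's white balance or the window is re-calibrated. */
    return 0;
}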
__________________
Good Luck All. Learn something new, everyday!
Al
WB9UVJ
www.wildstang.org
________________________
Storming the Tower since 1996.
  #21
26-04-2005, 13:29
Steve W
Grow Up? Why?
no team

Join Date: Feb 2003
Rookie Year: 2002
Location: Toronto, Ontario, Canada
Posts: 2,523
Re: So close yet so far.

Quote:
Originally Posted by Will Hanashiro
Had FIRST kept their lighting the same from regional to regional, day to day, even hour to hour, then team 66 would not have had to program it so that the camera picked up all shades of green. Had the lighting been consistent, I am confident that this never would have happened, and I believe that they would have CAPPED THE CENTER GOAL ON A REGULAR BASIS.
Let's be honest: you really don't know anything about lighting. It is impossible to provide identical lighting at the same event hour by hour. Lights run at different temperatures, which produce different colors. Different venues, different light sources, and different bulbs all add up to a not-so-perfect world. FIRST had certain minimum lighting levels that had to be met, and as far as I know those were met at every regional (they were at the 5 I attended).

Should everything be perfect? It would be nice. However, every floor at every venue was different. Did this stop people from running? No. Are football fields perfectly level, hockey rinks iced the same, baseball diamonds the same dimensions? No, they are not, but they are still used and the game goes on. Let's not put FIRST down over things that they cannot control. The lighting in the dome changed from last year because of an event before we arrived; the lighting people had to come up with lighting that would meet FIRST's minimum requirements, and they did a good job. I deal with lighting at our church and know, through training seminars and hands-on experience, that lighting is not an exact science. What works today may not work the same tomorrow. Let's try to improve things, not always tear them down, especially when they are things that are tough to control anyway.
__________________
We do not stop playing because we grow old;
we grow old because we stop playing.
  #22
26-04-2005, 16:07
JoeXIII'007
Pragmatic Strategy, I try...
AKA: Joeseph Smith
FRC #0066
Team Role: Alumni

Join Date: Feb 2004
Rookie Year: 2001
Location: Ypsilanti, MI (Ann Arbor's shadow)
Posts: 753
Re: So close yet so far.

Since lighting is inconsistent by nature, and the calibration numbers given never really worked, I have a little suggestion. If the camera system is used next year, I hate to say this, but let's try going analog on some of the components used in the camera: perhaps some knobs to configure and calibrate RGB or YGR intensity (whatever camera configuration FIRST decides on), so that when the robot is placed on the field, calibration can be done match by match. How? Turn the knob, and when the camera gets a good reading on that particular color, it signals with a green light; otherwise it stays red and requests calibration. Do that for each color.

In programming, it would just return a 1 for something or a 0 for nothing, and the calibration wouldn't have to be handled by the computer. (How many times did our team have to reload the program due to a simple change in calibration?) Quite frankly, the Java application provided was too slow, and not very dependable in my opinion. The way I'm suggesting, you turn the knobs and you're on your way; that's it. I'm not saying I want the easy way out, I just want a more logical and practical way to handle calibration. A rough sketch of the idea is below.
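Something like this, in very rough form. The I/O helpers here are placeholders rather than any real controller API, and the confidence threshold is invented:
Code:
/* Sketch of the knob-per-color calibration idea: the operator turns a pot
 * until the camera reports a solid lock, and an indicator LED goes green.
 * The I/O helpers below are placeholders -- wire them to whatever your
 * controller actually provides. */
#include <stdio.h>

#define LOCK_CONFIDENCE 180   /* 0..255, "good enough" threshold (invented) */

/* --- placeholder I/O: replace with real controller calls --- */
static unsigned read_calibration_knob(void)      { return 128; }            /* 0..255 pot */
static unsigned camera_confidence(unsigned gain) { return gain > 100 ? 200 : 40; }
static void     set_indicator(int locked)        { printf(locked ? "GREEN\n" : "RED\n"); }

int main(void)
{
    /* One pass of the calibrate-by-knob loop; on the robot this would run
     * every control cycle while the robot sits on the field. */
    unsigned gain = read_calibration_knob();   /* operator-set value       */
    unsigned conf = camera_confidence(gain);   /* how well the color locks */
    set_indicator(conf >= LOCK_CONFIDENCE);    /* green light = calibrated */
    return 0;
}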

-Joe
__________________
Joeseph P. Smith
jpthesmithe.com
University of Michigan - Informatics (B. Sci. 2012)
General Purpose Programmer - Cooperative Institute for Limnology and Ecosystems Research (CILER) at NOAA-GLERL
  #23
26-04-2005, 16:15
Lil' Lavery
TSIMFD
AKA: Sean Lavery
FRC #1712 (DAWGMA)
Team Role: Mentor

Join Date: Nov 2003
Rookie Year: 2003
Location: Philadelphia, PA
Posts: 6,589
Re: So close yet so far.

Although I'm not sure if anyone used them, there were also yellow triangles under the goals, and the colored loading stations, that the cameras could look for. So if you banned green, you'd have to ban blue, red, and yellow as well to be fair. That would make it so that pretty much the only colors you could wear are black and white. Even the cream of the volunteers' shirts appeared as green to the cameras, so because of that, the volunteers stood farther away from the field during autonomous.
Basically, what I'm saying is that if they banned green, they would be forced to ban more. And if all the team shirts, MCs, announcers, volunteers, and robots were nothing but black, white, and metal, it would be a far blander game. The idea was for your camera to be able to pick out the right green object, and if that meant starting your camera tracking only once the autonomous mode actually started, so that the announcer would be farther from your bot by then, so be it.
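For what it's worth, the "wait until autonomous starts" idea is only a few lines of code. Here is a sketch; the settle time and the helper functions are placeholders, not a real API:
Code:
/* Sketch of "don't start tracking until autonomous actually begins": gate
 * the camera search on the autonomous flag plus a short delay so people
 * near the field have moved away.  Helpers are placeholders. */
#include <stdbool.h>
#include <stdio.h>

#define TRACK_DELAY_MS 500   /* invented settle time */

/* --- placeholders: substitute your controller's real calls --- */
static bool          in_autonomous(void)        { return true; }
static unsigned long auton_elapsed_ms(void)     { return 600; }
static void          start_color_tracking(void) { puts("tracking started"); }

int main(void)
{
    static bool tracking = false;

    /* Only lock onto green once autonomous is running and the delay passed. */
    if (in_autonomous() && !tracking && auton_elapsed_ms() >= TRACK_DELAY_MS) {
        start_color_tracking();
        tracking = true;
    }
    return 0;
}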
__________________
Being correct doesn't mean you don't have to explain yourself.
  #24
26-04-2005, 16:17
Dave Flowerday
Software Engineer
VRC #0111 (Wildstang)
Team Role: Engineer

Join Date: Feb 2002
Rookie Year: 1995
Location: North Barrington, IL
Posts: 1,366
Re: So close yet so far.

Quote:
Originally Posted by JoeXIII'007
and the calibration numbers given never really worked
People keep saying this and I don't understand why. I'm surprised that other teams' experience was so different from ours. We used the calibration values provided at 2 different regionals and the Championship event, and the camera tracked the specified colors perfectly every time, from both sides of the field, all day long.

Can you explain how you are certain that the calibration numbers didn't work? Are you basing this assumption simply on the fact that your robot didn't go where you wanted it to? If so, then you may be jumping to conclusions, which will not help this situation for next year. Instead, this situation needs to be examined carefully to determine the real root cause of your problem; otherwise, even if FIRST somehow manages to provide perfect lighting and precise calibration values next year, your robot will still drive into a wall.
  #25
26-04-2005, 16:27
JoeXIII'007
Pragmatic Strategy, I try...
AKA: Joeseph Smith
FRC #0066
Team Role: Alumni

Join Date: Feb 2004
Rookie Year: 2001
Location: Ypsilanti, MI (Ann Arbor's shadow)
Posts: 753
Re: So close yet so far.

Quote:
Originally Posted by Dave Flowerday
Can you explain how you are certain that the calibration numbers didn't work? Are you basing this assumption simply on the fact that your robot didn't go where you wanted it to? If so, then you may be jumping to conclusions, which will not help this situation for next year. Instead, this situation needs to be examined carefully to determine the real root cause of your problem; otherwise, even if FIRST somehow manages to provide perfect lighting and precise calibration values next year, your robot will still drive into a wall.
We are VERY certain that some of the provided numbers didn't work. #1. We didn't use some of the provided numbers. We experimented around. #2. If we did use the provided numbers and they didn't work, the robot would halt in the middle of autonomous, signifying that it didn't recognize or find the color. Sure, it didn't always go where we wanted it to, but our team was testing that camera pretty much after every match, at every regional, and at Nationals. There was no assuming, as far as I'm concerned.

Quote:
Did you change any of the numbers in the code other than the exposure values?
No.
__________________
Joeseph P. Smith
jpthesmithe.com
University of Michigan - Informatics (B. Sci. 2012)
General Purpose Programmer - Cooperative Institute for Limnology and Ecosystems Research (CILER) at NOAA-GLERL

Last edited by JoeXIII'007 : 26-04-2005 at 16:34.
  #26
26-04-2005, 16:29
Dave Flowerday
Software Engineer
VRC #0111 (Wildstang)
Team Role: Engineer

Join Date: Feb 2002
Rookie Year: 1995
Location: North Barrington, IL
Posts: 1,366
Re: So close yet so far.

Quote:
Originally Posted by JoeXIII'007
#1. We didn't use some of the provided numbers. We experimented around.
Did you change any of the numbers in the code other than the exposure values?
  #27
26-04-2005, 22:49
Biff
Registered User
AKA: Tom Cooper
#1227 (Techno Gremlins)
Team Role: Mentor

Join Date: Jan 2004
Location: Grand Rapids MI
Posts: 214
Re: So close yet so far.

Quote:
Originally Posted by Dave Flowerday
It's not a glib comment - other teams (mine included) anticipated this problem and successfully solved it. Perhaps I missed it, but I didn't see any questions on this forum about how to solve it, or I would have explained what we do. Part of engineering is anticipating possible problems and incorporating solutions into the design. One suggestion was already made (point the camera at the ground). Our solution was to have the camera look only at the exact locations where tetras should have been found and use the virtual window function of the camera to further restrict its view. We did not have a single problem with our vision system at the Championship. We successfully tracked the tetras and goals in every single match we played using the calibration values that FIRST provided. Since our software knew which spots the tetras were located in, we could intelligently decide which one to attempt to pick up, and in the cases where we knew that we couldn't pick up either of them (due to the robot's design and not having enough time) we instead would drive over to the autoloader so that it was ready to go when driver control started.
We used the provided numbers at the regionals and Championship and they worked perfectly. I remember hearing after Week 1 regionals that some teams had trouble with the numbers, but I haven't really heard anything like that since. Did other teams have trouble with the numbers in Atlanta?
Are you suggesting that the announcer was deliberately trying to confuse the camera with his clothing? I'm sure you're not, but FIRST made it clear that they would not go so far as to restrict people from wearing green or anything like that.
Dave, you hit the nail on the head with your description of how to use the vision system to get reliable results. By limiting where you are looking first, and then tracking to a color only in those locations, it makes sense to me why the calibration numbers worked for your system. If your team would be gracious enough to share the code or a white paper, more teams could come to an understanding of how to do it correctly. I was not suggesting the announcer was deliberately trying to cause interference; I was only pointing out that FIRST did provide a rule to discourage intentional vision-system confusion. Further reading in this thread indicates that 66's problems would have been solved if they had coded the way that your team did. Thanks for the insight.
Tom Cooper
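For anyone who wants the flavor of that windowing approach before a white paper appears, here is a bare-bones sketch as I understand the CMUcam2 command set. The window coordinates and color bounds are made up, and serial_send() is a stand-in for a real serial routine, so treat this as an illustration rather than working robot code:
Code:
/* Sketch of the windowing idea: restrict the CMUcam2 to a virtual window
 * around the spot where a vision tetra should sit before asking it to track,
 * so a shirt in the stands is never even in frame.  Coordinates and color
 * bounds are invented; serial_send() is a placeholder. */
#include <stdio.h>

static void serial_send(const char *cmd)
{
    /* placeholder: on the robot this would write to the camera's serial port */
    printf("-> %s\n", cmd);
}

int main(void)
{
    char cmd[64];

    /* Look only at the patch of image where the near tetra should be
     * (CMUcam2 "VW" sets a virtual window: x1 y1 x2 y2). */
    snprintf(cmd, sizeof cmd, "VW %d %d %d %d\r", 30, 80, 130, 143);
    serial_send(cmd);

    /* Then track the calibrated green bounds inside that window
     * ("TC" = track color: Rmin Rmax Gmin Gmax Bmin Bmax). */
    snprintf(cmd, sizeof cmd, "TC %d %d %d %d %d %d\r", 40, 90, 120, 200, 40, 90);
    serial_send(cmd);

    return 0;
}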
  #28
27-04-2005, 02:33
Goldeye
Registered User
AKA: Josh Hecht
FRC #0694 (Stuypulse)
Team Role: College Student

Join Date: Jan 2005
Rookie Year: 2005
Location: New York
Posts: 145
Re: Camera Issues -- So close yet so far.

To everyone who has made a comment about how teams should have simply 'focused' their view on given areas, please realize how difficult a task this really is.

Even under ideal conditions, configuring the camera system was no small feat. In many cases, testing conditions were extremely far from the conditions on a FIRST field, and often extremely detrimental to the camera's function. The Java utility could not find any suitable exposure values in my school, due to poor lighting. Still, we used substitute colors that the camera managed to see, and attempted to continue working with those.

There were many problems occurring with many parts of the camera routines. In addition to the troublesome values, controlling the robot at a speed appropriate to the camera readings was a similarly difficult, robot-dependent task. Endless tweaks were needed for that. As problems came up, we solved them; for example, by requiring a minimum size for any object the camera found. Still, by the end of the six weeks, we couldn't get the bot happily tracking anything.
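To show how simple, and yet how fiddly, those individual tweaks are, here is what the minimum-size filter plus a crude proportional steer looks like in sketch form. The packet fields mirror the camera's tracking packet in spirit, but every number and the drive call are placeholders:
Code:
/* Sketch of two of the tweaks mentioned above: ignore tracked blobs below a
 * minimum size, and scale the turn command by how far the blob centroid sits
 * from image center.  All constants and set_turn() are placeholders. */
#include <stdio.h>

#define MIN_PIXELS   15    /* invented: smaller blobs are treated as noise */
#define IMAGE_CENTER 88    /* invented: horizontal center of the image     */

typedef struct { int mx, my, pixels, confidence; } track_packet_t;

static void set_turn(int turn) { printf("turn command: %d\n", turn); }

static void steer_from_packet(const track_packet_t *p)
{
    if (p->pixels < MIN_PIXELS) {
        set_turn(0);                     /* too small: probably not a tetra */
        return;
    }
    int error = p->mx - IMAGE_CENTER;    /* how far off-center the blob is  */
    int turn  = error / 2;               /* crude proportional steering     */
    set_turn(turn);
}

int main(void)
{
    track_packet_t noise = {  40, 60,  4,  30 };  /* gets rejected            */
    track_packet_t tetra = { 120, 70, 60, 220 };  /* steers right, toward it  */
    steer_from_packet(&noise);
    steer_from_packet(&tetra);
    return 0;
}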

Between all these problems and the need to get the robot tracking under near-ideal conditions, there's almost no time left to create solutions for changing environmental conditions beyond the field itself. I congratulate teams such as 111 for having the ability and time to create such thorough, reliable code, but to those making rash comments that make it seem like an easy thing to do... If FIRST plans to provide such a tool, they should take every reasonable step to avoid conflicting with it. Last year with the IR, and now with the vision system, they're not doing their part to allow teams to focus on the game challenge rather than the world around it.
__________________
Team 694

2005 Championship - Galileo Semifinalist
2005 New York - Regional Chairmans Award
2005 New York - Semifinalist (Thanks 1257,1340)
  #29
27-04-2005, 06:43
Bcahn836
Iraq is fun.
AKA: Brad Cahn
no team (Robobees 836)
Team Role: Alumni

Join Date: Dec 2003
Rookie Year: 2003
Location: Camp Taji, Iraq
Posts: 1,774
Re: Camera Issues -- So close yet so far.

We got the camera to work in the shop, but we didn't really use it in competition. We picked up the vision tetra and almost put it on the left side center goal, but that was an accident, lol. The tetra was placed right in front of us, and our auto mode was to go straight while lifting the arm, then turn left and drive to the human player loading station. It was awesome. In the shop we were having so many inconsistencies that we stuck with our other 8 auto modes. I think the camera was mad at us for teasing it with green Mountain Dew bottles. Here robot, robot, robot... oops, can't have the Dew.
  #30
27-04-2005, 09:57
Mike
has common ground with Matt Krass
AKA: Mike Sorrenti
FRC #0237 (Sie-H2O-Bots (See-Hoe-Bots) [T.R.I.B.E.])
Team Role: Programmer

Join Date: Dec 2004
Rookie Year: 2004
Location: Watertown, CT
Posts: 1,003
Re: Camera Issues -- So close yet so far.

For everyone who was saying that the camera saw the caution tape, volunteer shirts, volunteer badges, etc.: I was the one who got the official calibration numbers for the Galileo field, and I tried getting numbers on the shirts and so on to see if they caused problems. It wouldn't retrieve numbers for them, though, so that disproves the idea that they were distracting the camera.
__________________
http://www.mikesorrenti.com/