#31
Re: 2012 Lessons Learned: The Negative
One of my negatives is the FIRST Q and A system. It's horrible. Teams should not EVER have to wait until their first competition to find out that their robot is illegal when they have tried to clarify it in the Q and A. Furthermore, teams should have a guarantee that their question will be answered. The Q and A's purpose is to help teams design robots within the rules; if teams are not getting helpful answers after waiting who knows how long, that is not acceptable and the Q and A is not doing its job. When I compare the FIRST Q and A to the VEX Q and A, I am embarrassed for FIRST.
Regards, Bryan
#32
Re: 2012 Lessons Learned: The Negative
Quote:
#33
Re: 2012 Lessons Learned: The Negative
Game piece consistency was a huge issue this year. Wear and tear were somewhat predictable by the time the later regionals came around, but balls from different batches were not. From my experience, there were at least 3 or 4 different types of balls: those bought direct from the manufacturer, those bought from AM, those given in the KOP, and those used at the Championship.
#34
Re: 2012 Lessons Learned: The Negative
Quote:
Also, look at all of the teams that won #1 seeds this year. They are consistently really good teams who deserved to be there, and the number of exceptions doesn't seem much different to me than it has been in past years. This indicates to me that it was either not that common or not that easy for teams to collude and hurt the best teams' rankings.
#35
Re: 2012 Lessons Learned: The Negative
Quote:
The ones at St. Louis, though, had to be completely different. We got so many of them stuck in our loader mechanism that it bordered on the ridiculous. It definitely wasn't just the fact that they were newer balls. As you said, they were definitely a different batch.
#36
Re: 2012 Lessons Learned: The Negative
Quote:
#37
Re: 2012 Lessons Learned: The Negative
1. Practice field at CMP. Almost all teams who use the practice field are looking to test specific things; in this game those are:
- Running their autonomous programs - this requires a basket and key, and possibly a bridge, all at the correct distances
- Shooting balls - this requires a basket and a marked key distance
- Driving and shooting - this requires more open space than the above, but the general key and basket area is usually sufficient to simulate a driver lining up
- Triple balance before elims
None of these require a field border or radios - a 50' tether is sufficient. At MSC, there was a large carpet area with 2 baskets, 3 bridges, and two movable bumps. Teams would tell the practice field queuer what they needed to test and what their setup requirements were, and he would organize where each team could be. There were no practice field radios; all teams ran on tethers. Immediately after alliance selections, the three bridges were reserved for each alliance in the order they had to play (e.g. QF1 teams got to practice before QF4 teams), allowing each alliance to practice their triple balance. The practice field seemed adequate for the volume of teams - a 64-team event with 2 baskets and 3 bridges, allowing multiple teams to use a set of baskets at the same time.
At CMP, there were two full practice fields which required several hours of advance notice to sign up for and use. The radios also caused mass confusion due to the mis-coloring of the red and blue radios (at least on the Galileo practice field). There was a single set of baskets behind the practice field, and two more in the annex on the way to the dome. For an event of this size, there should have been at least 3 bridges and 4 baskets per division (a total of 12 bridges and 16 baskets; there were only 6 bridges and 7 baskets at the whole event this year).
2. I have to say it: the quality of teams at CMP this year was disappointing. The league is too large to hope to allow each team to go to CMP every few years, so even trying seems pointless.
3. The vision system this year was fairly good, but a different geometric shape should be chosen (how about a circle?). There are too many bright rectangular objects but relatively few bright circular objects - I know our vision system was often confused by large white display screens directly behind fields. (A rough sketch of the idea follows this post.)
4. I've already talked to a few NI people about this, but (at least in LabVIEW) most of the CPU load on the processor is overhead from LabVIEW and library inefficiency, not actual team code.
5. On a related note, the fact that teams are able to hit 90% CPU utilization without running vision on the robot amazes me. The processor is definitely powerful enough (the old IFI PIC and Vex Cortex systems ran much smaller processors, and almost everything being done now could be done then), but the inefficiency is SO HUGE.
6. Ball consistency on bridges. At all previous events, the field reset people were placing the balls on the two holes in the center of the bridge. At CMP (Galileo division), we found that they were consistently placing them on the outside holes, or in various other symmetric but not centered positions, during our matches, but only on the center bridge. Our scouting team investigated the issue, and the balls were being placed off-center only during our matches and in matches where teams were attempting bridge autonomous modes. When we asked the field reset crew and the head ref on our field, they claimed only that the balls had to be symmetric, nothing else, and started placing the balls on the alliance bridges differently as well.
We brought up a Q&A question which stated that the balls would be centered on the bridge, and they ignored it. We talked to Aiden Brown about the issue, and the ball placement stopped immediately after.
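To illustrate point 3: one cheap way to tell a bright circular target from a bright rectangle (like a display screen) is the fill ratio of the blob against its bounding box - a solid disc covers about pi/4 (~0.79) of the box, while a solid rectangle covers close to 1.0. The sketch below is only illustrative, in plain Java rather than the LabVIEW/NI vision code the poster refers to, and the brightness threshold and cutoffs are assumed values for a single solid bright blob in the frame.

Code:
import java.awt.image.BufferedImage;

public class ShapeCheck {
    // Assumed brightness cutoff; depends on camera exposure settings.
    static final int BRIGHTNESS_THRESHOLD = 200;

    // Returns true if the bright blob's fill ratio looks like a disc
    // (~pi/4 of its bounding box) rather than a filled rectangle (~1.0).
    public static boolean looksCircular(BufferedImage img) {
        int minX = Integer.MAX_VALUE, minY = Integer.MAX_VALUE;
        int maxX = -1, maxY = -1;
        int bright = 0;
        for (int y = 0; y < img.getHeight(); y++) {
            for (int x = 0; x < img.getWidth(); x++) {
                int rgb = img.getRGB(x, y);
                int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
                if ((r + g + b) / 3 > BRIGHTNESS_THRESHOLD) {
                    bright++;
                    if (x < minX) minX = x;
                    if (x > maxX) maxX = x;
                    if (y < minY) minY = y;
                    if (y > maxY) maxY = y;
                }
            }
        }
        if (bright == 0) return false;
        double boxArea = (double) (maxX - minX + 1) * (maxY - minY + 1);
        double fill = bright / boxArea;
        // Accept ratios near pi/4; reject the ~1.0 ratio of a bright rectangle.
        return fill > 0.65 && fill < 0.90;
    }
}

A real target made of retroreflective tape is an outline rather than a solid blob, so an actual implementation would work on contours, but the ambiguity argument is the same: a circle is much harder to confuse with screens, banners, and other rectangular light sources around a field.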
#38
Re: 2012 Lessons Learned: The Negative
The practice field situation at the championships was pitiful. 400 teams, 2 full fields, and 3 wooden ones? Come on. It would be nice if they left an area for teams to set up their own goals.
#39
Re: 2012 Lessons Learned: The Negative
Quote:
The highlighted part is where you yourself can see why our alliance did not co-op balance. The co-op bridge is for mutual benefit, and our alliance, as you stated, had no potential benefit from doing so. We all had 1 or 2 matches each left in qualifications, and we knew our standings would not improve. The co-op bridge goal is to mutually boost rankings; since we did not have a legitimate chance at the top 8 seeds, we chose not to co-op and instead chose to display the strong suits of each of our robots. We were not being ungracious or unprofessional about the situation. We were playing our game as we saw fit. In your best interest, we had let you know of the strategy beforehand so that there would be no surprise. Also, none of the teams involved were pressured into agreeing on the strategy; 254, 415, and 3929 were pretty unanimous about it. PM me if you would like to continue the discussion further.
Last edited by Akash Rastogi : 02-05-2012 at 12:57.
#40
Re: 2012 Lessons Learned: The Negative
The Coopertition Award: yes, the coopertition bridge was successful in mixing up the rankings, but it made the award depend way too much on luck. Last year's Coopertition Award system was the best ever. To win it you had to: (a) make a minibot that other teams wanted to use because it was a good performer, (b) approach and "sell" other teams on why they should use your minibot, and (c) sometimes assist that team in adapting their deployment system to accept your minibot, or build (or help them build) a system to deploy it. For us it created some strong bonds, which continue, with the teams to which we loaned minibots, and the minibots were used by other teams to promote their teams and expand their programs, since in many cases we sent them home with those teams after the season was over.
The Kinect: it essentially made the GDC look as if they were for sale. On a related note, the Innovation in Control award, which at least at the events I attended was based on being the only team using the Kinect. The Q&A system, which was even harder to navigate and find what you were looking for than before. Never mind the "no comment" answers, as in previous years.
#41
Re: 2012 Lessons Learned: The Negative
1) Follow and enforce the rules that are in place. If teams are expected to follow the rules, then so should field personnel, inspectors, and volunteers.
2) Don't ignore the students. Multiple times our students questioned a situation that did not agree with the published rules and procedures, and they were either ridiculed or ignored.
#42
Re: 2012 Lessons Learned: The Negative
Quote:
1. The fields were too close to the spectator seating. You could not see the closest alliance bridge well from most of the seats. Moving them back 20-40 feet would help a lot.
2. I would like to see a move to more competition-based entrance into the championship. I have no problem with Rookie All Star and HOF teams being there, but when 50% or more of the teams in every division have very limited robots and lack experience with the game, it makes the seeding matches very much a luck-of-the-draw thing. Although there are many more very good teams in a championship division than at a district or regional event, the less experienced teams brought the seeding matches down to district-event-level play, in my opinion. It didn't look like a championship to me until after alliance selection.
3. There should be a team entrance on the side of the dome where the trailers and parking were. I am not sure about others, but our team likes to tailgate at these events, and it is a pain to walk all the way around the dome to get back in.
Just some of my opinions.
#43
Re: 2012 Lessons Learned: The Negative
Quote:
That brings up something that's kind of irked me for a couple of years. Judges couldn't care less if the programmers on a team built a custom display app out of Java or C++ that displays data to the drivers. They either dismiss it as "oh, anyone can build a web page" or "the driver's display means your stuff isn't automated". What a load of crap. Building a Java app with a custom layout tailored for the drivers is the closest thing a team will get to a full software development cycle in FRC (including Systems Engineering, Coding, and I&T phases). It also requires some technical prowess to debug things in order to ensure the system runs smoothly while on the field. It's also more realistic, since there isn't a single automated dynamic interactive system in the world that doesn't have some sort of human-in-the-loop control. Eventually our second driver will move solely to the app on a touchscreen panel, once open-architecture tablets become reasonably priced. Maybe then the judges will think something of it.
Last edited by JesseK : 02-05-2012 at 13:34.
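For anyone curious what such a driver display involves, here is a minimal sketch of the idea in Java. It is not the poster's actual app; the UDP port (5800) and the plain-text message format are assumptions made for the example, and a real team would likely use whatever telemetry transport their robot code already provides.

Code:
import java.awt.Font;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingConstants;
import javax.swing.SwingUtilities;

public class DriverDisplay {
    public static void main(String[] args) throws Exception {
        JLabel status = new JLabel("waiting for robot...", SwingConstants.CENTER);
        status.setFont(new Font("SansSerif", Font.BOLD, 48));

        JFrame frame = new JFrame("Driver Display");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.add(status);
        frame.setSize(800, 300);
        frame.setVisible(true);

        // Assumption for this sketch: the robot sends small text packets
        // (e.g. "shooterRPM=2450") to an arbitrary team-chosen port.
        try (DatagramSocket socket = new DatagramSocket(5800)) {
            byte[] buf = new byte[256];
            while (true) {
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                socket.receive(packet);
                String msg = new String(packet.getData(), 0, packet.getLength());
                // Swing components must be updated on the event dispatch thread.
                SwingUtilities.invokeLater(() -> status.setText(msg));
            }
        }
    }
}

Even in this trivial form you run into the real issues the post mentions: threading (network reads versus the Swing event dispatch thread), a layout the drivers can read at a glance, and keeping the display sane when packets stop arriving on the field.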
#44
Re: 2012 Lessons Learned: The Negative
Quote:
Quote:
#45
The Seating for Einstein was indeed frustrating
I think it would be waaay better if the awards and the Einstein field were positioned on the widest side of the Dome (where Newton and Archimedes were set up this year); that way more people would get a chance to have a better view of the game.