28-04-2013, 22:14
cadandcookies
Director of Programs, GOFIRST
AKA: Nick Aarestad
FTC #9205 (The Iron Maidens)
Team Role: College Student
 
Join Date: Jan 2012
Rookie Year: 2009
Location: Minnesnowta
Posts: 1,543
Re: 2013 Lessons Learned: The Negative

Quote:
Originally Posted by bardd
And while we're at it, I think FIRST should test their games by actually playing matches. I don't know if they do or not, but I think if they did it would prevent this sort of thing from happening. It'll be terrible logistically for the GDC, though.
Not to derail too much, but from talking with Mr. Merrick about this at the Northern Lights/Lake Superior Regionals this year, he explained a bit about their testing process for games -- basically they have their challenge, and HQ builds some preliminary designs (he was very clear that these were not something any team would want on their robot) in order to see how "hard" a given challenge is. I was going to ask him how something like a 30-point climb would be tested, but unfortunately our time ran a bit short.

So no, from talking with Mr. Merrick, they don't test actual match play, but rather individual mechanisms. I agree that some "internal" matches, or even some better analysis (for example, of the "blizzard"), might be in order, especially for a game as complex and awesome as Ultimate Ascent.