Where’s the fun in that? 
I agree that it would be helpful to have access to information that FMS already tracks, such as RTS like you point out, but asking FIRST to “redesign FMS to track robots, and game pieces, and field elements, and …” sounds to me like saying “do it for me, I don’t want to.” For me, most of the fun in programming is the challenge of figuring out how to make things work myself, using systems that I’ve designed. Forgive me for being egocentric.
From another point of view, one that’s often taken, FIRST is to some degree supposed to introduce real-life situations (granted, in a controlled environment) that robots might have to adapt to. Take unmanned search and rescue, for instance, a hot topic in robotics right now. There’s no “God” system that automatically tells the robots, “oh hey, there’s a person hidden under 20 ft. of rubble over there, go get him.” Finding that out is the robot’s job.
If tracking the positions of other robots is something you want to do, figure out sensors that could accomplish that. FIRST didn’t give us a powerful new control system for nothing. I’m personally very interested in exploring what vision systems are capable of; we’ve only scratched the surface of what they can do. In the off-season last year, using a co-processor, I came up with a system that could very robustly identify the positions of the track balls when they were on the overpass. Yes, it took a lot of work, but it is possible. And that’s just one sensor package; you get real power once you start combining multiple sensor readings.
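To make the color-tracking idea concrete, here’s a minimal sketch in Python. Everything in it is invented for illustration: the 8×8 grid of RGB tuples stands in for a camera frame, and the threshold in `is_ball_color` is a made-up stand-in for values you’d tune against your actual game piece.

```python
# Toy color-threshold detector: flag pixels whose color looks "ball-like",
# then report the centroid of the flagged region. A real system would run
# this on camera frames with thresholds tuned to the actual game piece.

def is_ball_color(r, g, b):
    # Hypothetical threshold for a bright purple/pink game piece.
    return r > 150 and b > 150 and g < 100

def find_ball_centroid(frame):
    """frame: 2D list of (r, g, b) tuples. Returns (row, col) centroid or None."""
    hits = [(y, x)
            for y, row in enumerate(frame)
            for x, (r, g, b) in enumerate(row)
            if is_ball_color(r, g, b)]
    if not hits:
        return None
    return (sum(y for y, _ in hits) / len(hits),
            sum(x for _, x in hits) / len(hits))

# Tiny synthetic frame: mostly gray, with a 2x2 "ball" near the center.
GRAY, BALL = (128, 128, 128), (200, 40, 200)
frame = [[GRAY] * 8 for _ in range(8)]
for y in (3, 4):
    for x in (4, 5):
        frame[y][x] = BALL

print(find_ball_centroid(frame))  # -> (3.5, 4.5)
```

On a co-processor you’d do the same thing with a vision library’s color-range and blob functions instead of per-pixel Python loops, but the logic is the same: threshold on color, then locate the blob.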
To get you started, think about this year’s game. They gave you brightly colored targets to track robots with, and the orbit balls stand out against the white playing field. But even without the targets on the robots, you could still do a pretty good job of tracking them. On the field there are basically two classes of fast-moving objects: robots (I’m including trailers with robots) and orbit balls. So by tracking movement in the scene, you can isolate those parts. There will be complications because your robot is moving as well, but there are algorithms for tracking the ground plane, so you could use those to isolate the movement of everything else. Then, because robots are much bigger than orbit balls, you can separate the fast-moving objects into the two classes. Finally, FIRST always requires an identifier in a relatively predefined location on each robot to show which alliance it’s on (trailer bumper color this year, flags the previous couple of years), so you can even track that if you want to. Just my initial thoughts.
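The motion-then-size idea above can be sketched in a few lines of Python. This is a toy under stated assumptions: two hand-built 8×8 brightness grids stand in for camera frames (pretend ground-plane compensation has already been applied), and the change threshold and “robot-sized” area cutoff are made-up numbers.

```python
# Toy motion segmentation: difference two frames, group changed pixels into
# connected blobs, then classify each blob by area (robots are much bigger
# than balls). Thresholds and sizes are invented for illustration.

def moving_mask(prev, curr, thresh=30):
    """Binary mask of pixels whose brightness changed by more than thresh."""
    return [[abs(a - b) > thresh for a, b in zip(pr, cr)]
            for pr, cr in zip(prev, curr)]

def blobs(mask):
    """4-connected components of True cells, each a list of (y, x)."""
    seen, out = set(), []
    for y0, row in enumerate(mask):
        for x0, on in enumerate(row):
            if not on or (y0, x0) in seen:
                continue
            stack, blob = [(y0, x0)], []
            seen.add((y0, x0))
            while stack:
                y, x = stack.pop()
                blob.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < len(mask) and 0 <= nx < len(mask[0])
                            and mask[ny][nx] and (ny, nx) not in seen):
                        seen.add((ny, nx))
                        stack.append((ny, nx))
            out.append(blob)
    return out

def classify(blob, robot_min_area=6):
    # Area alone separates the two classes in this toy setup.
    return "robot" if len(blob) >= robot_min_area else "ball"

# Two synthetic 8x8 brightness frames: a big "robot" appeared at top-left,
# a small "ball" appeared at bottom-right.
prev = [[0] * 8 for _ in range(8)]
curr = [[0] * 8 for _ in range(8)]
for y in range(3):
    for x in range(3):
        curr[y][x] = 255          # 9-pixel robot-sized change
curr[6][6] = 255                  # 1-pixel ball-sized change

labels = sorted(classify(b) for b in blobs(moving_mask(prev, curr)))
print(labels)  # -> ['ball', 'robot']
```

A real implementation would use a vision library’s background-subtraction and connected-components routines on actual frames, and you’d then check the blob for the alliance identifier color as a final step, but the pipeline structure is the same.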
And you still have essentially the whole off-season ahead of you, so there’s plenty of time to experiment with these kinds of ideas. If you need ideas on how to get started sensing other things in the environment, talk to your mentors; that’s what they’re there for. Or, if you want, feel free to start a discussion on CD. There are plenty of people with years of experience in FIRST and in industry who would be glad to help you out.
Good luck,
–Ryan