#1
Making autonomous accessible to all teams
Okay, here's another brainstorm thread.
How can we as a community make autonomous accessible and achievable for the majority of the FIRST community?
#2
Re: Making autonomous accessible to all teams
Create resources:
Host workshops:
#3
Re: Making autonomous accessible to all teams
What new methods do you think are needed? Or which of the methods already supplied need to be changed? A decent autonomous mode can be created with what WPI supplies, I think.
#4
Re: Making autonomous accessible to all teams
It's already dead simple to make an effective autonomous program using the Autonomous Independent framework in LabVIEW. Every team that asked me to help them was able to do it themselves only a few minutes after I showed them how it was intended to be used.
#5
Re: Making autonomous accessible to all teams
For teams without software mentors, the current FRC programming environments are just too difficult. While our team has never had an issue (we have four years of computer science at our school), you only need to look at the number of robots on the field that just sit there during autonomous to understand that there's a problem.

There really needs to be something at the level of RobotC or even NXT-G (written in LabVIEW, by the way) for teams in this situation. LabVIEW is too much for some teams; forget about C++ or Java. A nice, simple development environment and an easy-to-learn language is what it will take.

Last edited by Dale : 15-05-2010 at 17:36.
#6
Re: Making autonomous accessible to all teams
There was a toy I had as a kid called the Big Trak: a toy tank with a keypad you used to give it a program to perform autonomously. Some sort of handheld, easy-to-use computer with specific canned capabilities that plugs into the cRIO and programs it, with no other interfaces, is the only way I can see EVERY team being able to do autonomous.

It may seem cold to underestimate teams like this, but not every team can get an engineering or technical mentor, and not every team has students interested enough in computers to learn what they need to know for autonomous. That's a fact of life: not everyone has the same resources and interests. I think FIRST is getting better at making autonomous modes that are worthy enough to pursue, so that the challenge is there but not so game-breaking that only robots with autonomous modes can win. Keeping that balance is about all that can be expected.
#7
Re: Making autonomous accessible to all teams
Quote:
The lines consisted of the following in the C++ autonomous periodic loop:

```cpp
RobotDriveTrainObject->Drive(-1, 0);
Wait(1.0);
RobotDriveTrainObject->Drive(1, 0);
Wait(1.0);
```

This isn't hard to figure out how to use. The problem isn't that the code is obfuscated for simple controls (for other parts, yes, it is, hands down), but that the knowledge of how to do something like this isn't readily available. The WPI guides are pretty obscure in explaining how this works to teams that don't have students already proficient in C++/Java.
#8
Re: Making autonomous accessible to all teams
I think there is no problem with LabVIEW autonomous. I used Autonomous Independent, ran my own loops within it, and see no reason not to. I had a system like NXT-G that gave me high-level controls to do feedback on speed while driving straight, finish by distance, etc., plus blocks to set data for the other modules to pick up (kick distance, shift state, chassis mode, kick, ball-O-fier). It worked really well, and I am already planning for next year.

I wrote some code for a fairly new team at Troy. We were playing against 469 and wanted to try a sacrificial robot; they had mecanums and volunteered. So in about 10 minutes (using their Classmate) I wrote a simple time-based routine that used Mecanum-Cartesian and Delay And Feed in a flat sequence structure, and it worked perfectly. They made it into the tunnel, and 469 did not. 469 got in during the last 20 seconds and won the match, and that was enough to push us from the #1 seed.

I also helped a team we mentored last year with some autonomous work before MSC. I told their programmer to use Autonomous Independent, string together Tank Drives and Delay And Feeds, and connect their errors. Since data flows over the error line, LabVIEW executes the VIs sequentially, and that's all you have to do. He was impressed, as this was much easier than the Auto Iterative he had at Detroit, which didn't work.

There is one giant flaw in the system that causes autonomous development problems, especially in LabVIEW: every time you build code, it has to rebuild the entire WPI library, and then it re-downloads the whole WPI library. This is painfully slow, and for minor autonomous fixes between matches it is often a giant problem. Example: while sitting next to the field in elims, I had a minor kick-distance change to make. During auto, I typed in the new number and began the build. It did not finish until after the robot came back to the "pits" (this was in Atlanta), the tether cable had been connected, and the Classmate was booting. Then it finished downloading in only about 2 minutes. It would be nice if it were easier to partition the WPI lib so it doesn't have to rebuild, or to separate out the autonomous code.
#9
Re: Making autonomous accessible to all teams
Quote:
It just takes some time to learn the language and read the documentation. There were some problems with the camera and the tracking for us, so we decided to keep it simple.
#10
Re: Making autonomous accessible to all teams
Okay, so we mostly agree that a sequential, time-based autonomous is extremely easy. But that doesn't require any sensors. Why are sensors useful?

- Sensors increase the repeatability of an action as other factors change (e.g. battery voltage drops or a mechanism gets jammed).
- Sensors allow the robot to respond to changes on the field, meaning the robot can operate based on intent rather than actuating by rote.

Higher levels of control are useful for connecting actuators to sensors in common and easily configurable ways. For example, NXT-G lets you tell the robot to go forward for a time, for a distance (degrees), or until told otherwise. It even allows you to ramp the speed from the current value to the desired value. Likewise, the "wait" function is configurable for a time, or until a sensor is greater/less than a given value. Such high-level coding can save time and reduce errors.

As has been pointed out, all robots are different. Such high-level control needs to be extremely configurable to allow for differences in sensors, strategies, decision making, actuator control, and wiring configuration. In other words, it needs to be modular and extendable. I like the idea of separating it into Perception, Planning, and Control. (Linked are Chief Delphi threads about each one.)
#11
Re: Making autonomous accessible to all teams
Quote:
I've seen people on this forum complain that a team needs no programming skills because everything is handed to them in the WPI Library. This might be true if you want a simple tank drive, arcade drive, holonomic drive, or PID: that's all pre-programmed. I have to say thank you to the WPI Library, because without it I would have had a much, much harder time programming in LabVIEW.

However, I do think that in this advanced, high-school-level robotics competition with professional mentors, we should be using real programming languages and real programming environments, not something like RobotC or NXT-G that we're never going to see again in our lives. Besides, we're learning about these more advanced languages in school, and if not, prior knowledge of a language like C++, Java, or LabVIEW will vastly help with college courses and eventually careers in computing. Remember, this is a learning experience and preparation for college and careers in engineering, not just a robotics competition.

Perhaps the real reason close to a majority of robots do not move in autonomous is that the teams did not have enough time to program or test their autonomous modes. Or maybe they couldn't find the room or manpower to make a practice field. I can imagine many teams at the end of week 6 were just thinking about getting their robot together, making weight, getting their kicker to work, adding a ball-possession mechanism, or doing anything else the team considered more important than getting an autonomous working. I think any team with at least one dedicated programmer from week one can figure out how to do an autonomous, but whether there is time to debug and test it at the end of the season is a different story.
#12
Re: Making autonomous accessible to all teams
Quote:
#13
Re: Making autonomous accessible to all teams
I think it would be helpful if FIRST provided a code library with an interface and feature set similar to Tekkotsu's. Personally, I find that using state machines to model robot behavior is much more intuitive than typical C/Java code. The Tekkotsu vision library also runs circles around what FIRST provides.
#14
Re: Making autonomous accessible to all teams
Quote:
For example, with this framework I made, any action can be started or stopped with any of the following conditions:
However, this is just one method of abstracting autonomous control, and surely not the only one. I think the sorts of control people want are similar enough that they could all be part of a generic framework, and then programmers could start transitioning from preplanned actions to dynamic action planning.
#15
Re: Making autonomous accessible to all teams
Quote:
I think programming time tends to affect rookie teams the most, and isn't usually a big deal once programmers are familiar with the language. Our region holds pre-season workshops for exactly this purpose, though many rookie teams are pulled together at the last minute. Releasing the WPI libraries before kickoff could be a big help as well.

But lack of time to test is something every team runs into. What about encouraging modular control systems that can be removed from the robot intact and used on a test setup while the robot undergoes mechanical changes? Educating on practices for testing algorithms on the PC? Modular code implementation? I have a software development guide which might help with this.