Thread created automatically to discuss a document in CD-Media.
1153 Autonomous Recording Project (“Echo”)
Records motions during the Teleoperated period, saves them, and allows them to be played back in Autonomous. For LabVIEW.
“Echo” was designed to enable rookies and other teams that would otherwise sit still in autonomous to at least run a simple routine.
It keeps track of joystick movements during the teleoperated period and saves them, along with the corresponding timestamps, to a .xml file on the cRIO. Multiple files can be played back in a row in order to construct a full autonomous routine (for example: drive forward, turn left, hang a tube). Since we are still working on it, we would appreciate any feedback on how to improve it.
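The actual project is a set of LabVIEW VIs, which can't be quoted here, but the record-and-replay idea can be sketched in Python. Everything below is an illustrative assumption (the element names, the `read_joystick`/`drive` stand-ins, the sample period), not Echo's actual file format or implementation:

```python
# Sketch of the "record joystick + timestamps to XML, replay in sequence" idea.
# NOTE: this is an illustration, not Echo's real LabVIEW code or XML schema.
import time
import xml.etree.ElementTree as ET


def record_samples(read_joystick, duration_s, period_s=0.02):
    """Sample joystick (x, y) values at a fixed rate, tagging each with
    its time offset from the start of the recording."""
    samples = []
    start = time.monotonic()
    while (elapsed := time.monotonic() - start) < duration_s:
        samples.append((elapsed, read_joystick()))
        time.sleep(period_s)
    return samples


def save_xml(samples, path):
    """Write the (time, (x, y)) samples out as a flat XML file."""
    root = ET.Element("recording")
    for t, (x, y) in samples:
        ET.SubElement(root, "sample", time=f"{t:.3f}", x=f"{x:.3f}", y=f"{y:.3f}")
    ET.ElementTree(root).write(path)


def load_xml(path):
    """Read a recording back into the same (time, (x, y)) form."""
    root = ET.parse(path).getroot()
    return [(float(s.get("time")), (float(s.get("x")), float(s.get("y"))))
            for s in root.findall("sample")]


def play_back(files, drive):
    """Replay several recordings back-to-back (e.g. drive forward, turn,
    hang a tube), sending each sample to the drivetrain at its timestamp."""
    for path in files:
        start = time.monotonic()
        for t, values in load_xml(path):
            while time.monotonic() - start < t:  # wait for this sample's time
                time.sleep(0.001)
            drive(*values)
```

The key design point this illustrates is that playback is open-loop: it reproduces the recorded stick values on the recorded schedule, which is why results are repeatable but can drift from the originally driven path.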
Instructions (pdf form).pdf (40.4 KB)
Echo Autonomous Independent.vi (12 KB)
Echo Begin.vi (11.5 KB)
Echo Disabled.vi (15.2 KB)
Echo Global Data.vi (6.75 KB)
Echo Periodic Tasks.vi (10.3 KB)
Echo Record.vi (16.2 KB)
Echo Record Values.vi (12.2 KB)
This project, which was written by Derek Caneja and me from Team 1153, is meant to help teams that do not have much time to program still create an autonomous mode. Based on some of our new programmers trying out the program with minimal help, we estimate that it takes approximately 30-45 minutes to set up and about 10-15 minutes to record each autonomous mode (though this could be shortened by reusing actions from old autonomous routines). We would welcome any feedback on how to improve it. Some ideas for what we might incorporate next are:
-Running multiple actions (recorded separately) simultaneously (for example, turn and raise an arm)
-Using sensors to control actions (either instead of or in addition to time)
-Making the playback code more efficient
-Creating a better way to select Autonomous modes while at competition
Additionally, we will improve our set of instructions as people comment on what is and is not clear. (Currently the instructions refer to unzipping a folder and dragging that folder into the project; instead, you should add each of the individual .vi files to the project. That will be fixed soon.) We also hope to make a video explaining how to use the program.
I’ve always wanted to try doing something like this. I look forward to testing this on a practice bot while Mechanical and CAD are doing their thing this season.
If you have any questions on how to set it up, send me a PM, or better yet, post them here, since someone else might have the same question.
So far we have found that the program typically gives good precision (playbacks are generally within about 3 inches of each other) but less-than-ideal accuracy (though the playbacks are close to each other, they center on a point that may be up to about 8 inches from the spot the driver drove the robot to). Essentially, it may take several tries to get a recording that accurately reflects the setpoint; once a recording is produced, however, it is usually consistent with itself (barring mechanical problems).
(This is another thing we want to work on; we haven't yet experimented with the recording and playback rates enough to determine the ideal rates and number of datapoints. The sensor feedback we will likely add during the season, assuming the Kinect doesn't replace autonomous, should also help with accuracy.)
We’ve wanted to try this since we saw it in The New Cool.
What language is it programmed in?
Hey John! First of all, great-looking project! Quick question, though: is there an actual project file for download, with a main.vi? The instructions seem to imply that there is a project.