Quote:
Originally posted by Abwehr
I have to ask this...did the team's students ever actually have a part in any of this? No offense is meant, I'm just curious...this is the sort of system that NASA would be proud of.
I can expand on the student involvement in the RC because that's what I worked on. After the team decided to create StangPS, I drew a diagram of the overall architecture of the RC code: inputs & outputs of each subroutine, calling order, etc. I then started with a blank whiteboard and stepped through the design process with the students, asking them for input where they could help and explaining what I was doing when they didn't understand.

Next I divided up the subroutines among the students, and over the next couple of evenings they each worked on their part while the engineers floated around, helping whenever anyone got stuck. Then we all sat down, merged the subroutines into the code, and held an informal code review to make sure it all looked good.

Finally, we tested the code using RoboEmu (thanks Rob, the tool let us run some important unit tests and gave us confidence that our code worked) along with a unit test driver that a student wrote. Then it was on to testing and debugging with the real robot for the next couple of months. So I guess we took the students from the initial design phase through coding & unit testing, all the way to system integration and deployment. An entire software cycle in 3 months.
All this time, another group of engineers & students was debugging the communication between the RC & CC, while a third group was experimenting with the gyro & wheel encoder.
To expand on what Dave said about theta correction... Because of differences in the turning rates of the crab modules on our prototype, during our initial testing up the ramp the bot always rotated when it made the first turn. By the time it reached the bins it was usually 45-90 degrees off. Steve (engineer) and Matt (student) played with the code for a few days and came up with a routine that rotates the robot to seek a specified orientation. We played around with setting waypoints at 45 and 90 degrees off our starting position just to watch it rotate as it travelled across the field. We even went so far as to shove the robot while it was running, purposefully knocking it off orientation & course, and it recovered almost perfectly.
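The heading-seeking behavior described above can be sketched as a simple proportional controller on gyro heading error. This is purely illustrative: the function and parameter names are made up, the gain is an assumed value, and the real StangPS code ran on the robot controller of that era, not in Python.

```python
def wrap_degrees(angle):
    """Wrap an angle into the range [-180, 180) degrees."""
    return (angle + 180.0) % 360.0 - 180.0

def theta_correction(target_heading, current_heading, kp=0.02, max_turn=1.0):
    """Return a turn command that rotates the robot toward target_heading.

    target_heading, current_heading: headings in degrees (e.g. from a gyro).
    kp: proportional gain (assumed value, tuned on the real robot).
    Returns a command clamped to [-max_turn, max_turn].
    """
    # Wrapping the error means the robot always takes the shorter way
    # around, which is what lets it recover after being shoved off course.
    error = wrap_degrees(target_heading - current_heading)
    turn = kp * error
    return max(-max_turn, min(max_turn, turn))
```

Called every control loop, a routine like this continuously nulls out the rotation the mismatched crab modules introduced, which is why the prototype could hold (or seek) a commanded orientation while driving waypoints across the field.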
Here are some videos of our theta correction progress:
- no theta correction
- with theta correction
- theta correction test
I really wish I had taken video earlier on so everyone could see our many failures and milestones & see how far we came.