Re: An interesting EduBot...
Well, I think the way I spaced out the sensors gets around the "erratic" movement. Yes, I have to walk slower than usual; it would work best when walking around a room, though. I have 3 banner sensors, one centered and the other 2 on either side of the center. The programming at this point is simple. I'm actually quite surprised it worked on the first try! Basically it checks what the sensors are picking up. For example, say the left and center sensors are getting feedback. That means it's not far off course, so it goes forward with a slight left turn to re-center itself. If only the center is lit, it goes full forward. If only the left is lit, it goes forward with a harder left turn. It works really well, and I'm implementing a buffer of sorts to compare the current sensor inputs with the data gathered over the last second or so. That way, if the left sensor was getting feedback and then lost it, the robot knows the object is probably far off to the left.
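For anyone curious, the decision logic above can be sketched roughly like this. This is just an illustrative sketch, not my actual program: the function names, drive commands, and the 20-sample history size are all assumptions I made up for the example.

```python
from collections import deque

def steer(left, center, right):
    """Map the three banner-sensor readings (True = sensor sees the
    retro-reflective tape) to a drive command. Command strings are
    illustrative placeholders for real motor outputs."""
    if center and not left and not right:
        return "full forward"
    if left and center:
        return "forward, slight left"    # slightly off course, re-center
    if right and center:
        return "forward, slight right"
    if left:
        return "forward, hard left"      # only the left sensor is lit
    if right:
        return "forward, hard right"
    return "stop"                        # nothing in view

# The "buffer of sorts": keep roughly the last second of readings
# (assumed here to be 20 samples) so that when all sensors go dark,
# we can guess which side the target drifted off to.
history = deque(maxlen=20)

def steer_with_memory(left, center, right):
    history.append((left, center, right))
    cmd = steer(left, center, right)
    if cmd == "stop":
        # Lost the target: check recent history for a last-seen side.
        if any(l for l, _, _ in history):
            cmd = "search left"          # left saw it recently
        elif any(r for _, _, r in history):
            cmd = "search right"
    return cmd
```

Calling `steer_with_memory()` once per sensor-poll cycle would give the follow behavior described, with the history buffer covering the lost-target case.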
It's not the autonomous course correcting for the Spirit, but it's still pretty neat.
A video clip is probably feasible. Check back for one after kickoff, or I may post stills from the clip. Either way, I hope to have media up after kickoff.
Ways it could be improved:
Well, there are many. Most of them I just didn't have the time to implement. The biggest help at this point would probably be a true strip of retro-reflective tape that goes all the way around the backs of my shoes. More sensors were planned (ultrasonics, the works), but I didn't have the time to mess with them, and it works well enough for showing off at kickoff.
I guess I'll sort of be using this as my business card to rookies and other teams in need of programming/electronics assistance. I've already hooked up with a few in the area this past fall, and I think I'll be hearing some "WOW"s from them at kickoff...
__________________
-=Sachiel7=-

There's no such thing as being too simple!
Look for Team #1132, RAPTAR Robotics at the VCU Regional this year!