pic: Before the days of VEX...

(attached photo: the dual-RCX robotic arm robot)

Back in the days before VEX, I built a few Lego robots. Of course I still do, but that's while mentoring FIRST Lego League teams. Here is one of my last creations: a dual-RCX, line-following, robotic-arm robot. It took a few days to build and a few hours to program in RIS 2.0. Basically, the lower RCX was the "master" and the upper one was the "servant," used only for its additional motor and sensor ports.
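
RIS 2.0 is all drag-and-drop, so there's no real code to post, but to show the idea in text form, here is a rough sketch of the master/servant split in NQC (the C-like text language a lot of people use on the RCX). The message numbers and the servant's motor port are made up for illustration; they're not how my actual program was set up.

// Program for the lower ("master") RCX: sends commands over IR.
task main()
{
    SendMessage(1);   // ask the servant to start its arm motor
    Wait(200);        // 200 ticks = 2 seconds
    SendMessage(2);   // ask the servant to stop
}

// Program for the upper ("servant") RCX: just listens for IR messages
// and drives its own motor/sensor ports on request.
task main()
{
    while (true)
    {
        until (Message() != 0);           // wait for a command from the master
        if (Message() == 1) OnFwd(OUT_A);
        if (Message() == 2) Off(OUT_A);
        ClearMessage();
    }
}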

The robot used four motors, three light sensors, two limit switches, and a rotation sensor. The three light sensors were used as follows: one was mounted underneath the robot, attached to a pivoting front wheel, and was used to follow a line on the floor. The second light sensor is visible on the side of the robot; it was used to find lines perpendicular to the one the robot was following, which marked where to pick up/put down stuff. The last light sensor was on the claw of the arm, and helped sense when the arm was over something to pick up.

The two touch sensors were used on the robotic arm as limit switches, to tell when it was at the bottom and top of its reach. The rotation sensor was attached to the base of the rotating turret, to tell where the arm was pointing. The four motors were used as follows: one to steer the robot, one to power the rear differential axle, one to power the turret, and one to operate the arm raising/lowering and the claw opening/closing (at the same time).
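
As an example of how the limit switches and rotation sensor get used, here is a hedged NQC-style sketch of raising the arm and swinging the turret. The port assignments and the 90-degree count are assumptions for illustration (they depend on how the rotation sensor is geared), not a dump of the real program.

// Raise the arm until the top limit switch trips, then swing the turret
// roughly 90 degrees. Ports and counts are illustrative only.
task main()
{
    SetSensor(SENSOR_1, SENSOR_TOUCH);     // top-of-travel limit switch
    SetSensor(SENSOR_3, SENSOR_ROTATION);  // turret sensor, 16 counts per rev
    ClearSensor(SENSOR_3);

    OnFwd(OUT_A);               // arm motor, raising
    until (SENSOR_1 == 1);      // stop when the switch closes
    Off(OUT_A);

    OnFwd(OUT_B);               // turret motor
    until (SENSOR_3 >= 4);      // ~90 degrees if the sensor turns 1:1 with the turret
    Off(OUT_B);
}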

Basically, what the robot did was begin following a line. When a line perpendicular to the one being followed was detected, the robot would stop, rotate the arm about 90 degrees until it saw something on the floor to pick up (Lego beams), lower the arm, open the claw, close the claw around the beams, raise the arm, rotate the arm back over the robot, and then continue driving until it saw another perpendicular line (which marked a designated loading zone).
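
In outline form, the whole cycle looks something like the sketch below. The subroutine names are hypothetical stand-ins for groups of blocks in the RIS 2.0 program; I'm only showing the order things happen in.

// High-level pick-up cycle (NQC-style outline; subroutine names invented).
sub follow_line_to_marker()  { /* steer on the front light sensor, stop at a perpendicular line */ }
sub swing_arm_out()          { /* rotate the turret until the claw sensor sees a beam */ }
sub lower_arm_and_grab()     { /* run the arm motor down to the bottom switch; claw closes */ }
sub raise_arm()              { /* run the arm motor up to the top switch */ }
sub swing_arm_back()         { /* rotate the turret back over the robot */ }

task main()
{
    while (true)
    {
        follow_line_to_marker();   // drive until a pick-up marker is found
        swing_arm_out();
        lower_arm_and_grab();
        raise_arm();
        swing_arm_back();
        follow_line_to_marker();   // drive on to the loading-zone marker
        // drop-off mirrors the pick-up steps
    }
}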

Also, with the line-following software, the robot was able to detect if it was seeing too much white (i.e., the robot had lost the line and was going in circles), and it would then retrace its steps until it found the line again. The robot almost never lost the line, and if it did, the software enabled it to find it again. (I'm not much of a software guru, but when stuff is drag-and-drop programming (like EasyC or Lego RIS), it comes real easy to me.)
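
For the curious, here is roughly what that logic looks like written out as NQC-style text instead of RIS blocks. The light threshold, ports, and the "lost for too long" count are invented numbers, and the real program steered more gracefully than this bang-bang version, but the recovery idea is the same: if the sensor reads white for too long, back up along the path just driven until the line reappears.

// Bang-bang line follower with "too much white" recovery (illustrative values).
#define DARK 40            // light reading on the black line (assumed)

int lost_count;            // loops spent off the line

task main()
{
    SetSensor(SENSOR_1, SENSOR_LIGHT);  // front-wheel line sensor
    lost_count = 0;
    OnFwd(OUT_B);                       // rear drive motor, forward

    while (true)
    {
        if (SENSOR_1 < DARK)
        {
            OnFwd(OUT_A);               // on the line: steer one way
            lost_count = 0;
        }
        else
        {
            OnRev(OUT_A);               // off the line: steer back toward it
            lost_count += 1;
        }

        if (lost_count > 100)           // seen only white for ~5 seconds
        {
            OnRev(OUT_B);               // retrace our steps...
            until (SENSOR_1 < DARK);    // ...until the line is found again
            OnFwd(OUT_B);
            lost_count = 0;
        }

        Wait(5);                        // 5 ticks = 50 ms per loop
    }
}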

I am not trying to take credit for the robotic arm itself, which is one of the "stock" designs that comes with the RIS 2.0 software. Everything else, including the software, I designed and built.

P.S. To anyone else involved in FLL or just interested in Lego robots: I am working on writing a white paper on building an "Ultimate FLL Robot Drivetrain". It probably won't be ready for a few weeks, as I am very busy at the moment. The idea was used very successfully last year by one of the Connecticut FLL teams I helped mentor. Basically, it takes the two worst variables for an FLL robot out of the picture: drift and battery life. It involves both hardware AND software. If you want a hint, think "Gompei" in 2004. :wink:

awesome! when are you going to put up a video?

Okay, okay! :wink: I had to dig through a pile of old DV tapes to find this one. The video is from a FIRST demonstration our team did at a local mall in December 2004. We brought our 2004 FRC robot, an EDUbot, and the Lego robot I built. Here is the link to the video (hosted on Putfile):

WMV - 3.9 MB - http://media.putfile.com/roboticarmrobot-demo

Wicked cool!

BTW, did you know that if you have two RCXs you can use them for IR rangefinding?