#10   08-03-2005, 19:55
Venkatesh is offline
Registered User
FRC #0030
 
Join Date: Jan 2003
Rookie Year: 2002
Location: USA
Posts: 260
Re: Let's have Linux Robots Next Year!

Let's ignore the rules/rulings concerning intelligent dashboard-connected devices for a second.

We can use the dashboard port as the output from the IFI stack to the laptop, and the digital inputs on the Operator Interface as the inputs from the laptop back to the IFI stack. A laptop's parallel port would be well-suited to driving those inputs.
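To make the laptop side concrete, here's roughly what reading that stream could look like under Linux. I'm assuming the dashboard port shows up as a plain serial device (/dev/ttyS0 here) at 19200 8N1; double-check the IFI dashboard docs for the real rate and packet framing before trusting any of this.

/* Sketch: read the IFI dashboard stream on a Linux laptop.
 * Assumes the dashboard port appears as /dev/ttyS0 at 19200 8N1;
 * the actual rate and framing must be verified against the IFI docs. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <termios.h>

int main(void)
{
    int fd = open("/dev/ttyS0", O_RDONLY | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                    /* raw bytes, no line discipline */
    tio.c_cflag |= (CLOCAL | CREAD);    /* enable receiver, ignore modem lines */
    cfsetispeed(&tio, B19200);          /* assumed dashboard rate */
    cfsetospeed(&tio, B19200);
    tcsetattr(fd, TCSANOW, &tio);

    unsigned char buf[64];
    for (;;) {
        ssize_t n = read(fd, buf, sizeof buf);
        if (n <= 0) break;
        for (ssize_t i = 0; i < n; i++)
            printf("%02x ", buf[i]);    /* just dump raw bytes for now */
        printf("\n");
    }
    close(fd);
    return 0;
}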

Now what do you gain? A (high-latency) link into all robot operations, plus the flexibility to do whatever you want with that data using the full power of a PC.

For many common tasks on the robot, this approach is unnecessary and adds nothing. Suppose, for example, that certain sensor inputs directly drive some motors. Sending the sensor data all the way up to a PC and back down would be pointless when a local loop on the controller could handle the task.
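For instance, a local loop in the IFI default code needs only a few lines. Get_Analog_Value, rc_ana_in01, and pwm01 are real names from the default project, but the setpoint and gain below are made up for illustration:

/* Sketch of a local sensor-to-motor loop, called from
 * Process_Data_From_Master_uP() in the IFI default code.
 * Runs every controller loop -- no PC round trip needed. */
void simple_local_loop(void)
{
    unsigned int sensor = Get_Analog_Value(rc_ana_in01); /* 10-bit: 0..1023 */
    int error = 512 - (int)sensor;   /* hypothetical setpoint at mid-scale */
    int drive = 127 + error / 4;     /* crude proportional term, gain 1/4 */

    if (drive < 0)   drive = 0;      /* clamp to the PWM range */
    if (drive > 254) drive = 254;

    pwm01 = (unsigned char)drive;    /* 127 = neutral on the IFI PWM outputs */
}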

Before this year, I would have said there was very little incentive to stack a PC onto this system. The only task the PIC can't do natively is floating-point math, and you can purchase a PAK-II or PAK-III coprocessor to handle that far more easily than you can have a PC do it (on the off chance that you absolutely need very accurate floating-point math).
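And to be fair, a lot of the time you don't even need real floating point; scaled integers get you surprisingly far. A quick sketch of the idea, using an 8.8 fixed-point convention of my own invention (nothing that ships with the PIC tools or the PAK chips):

/* Fixed-point illustration: 8.8 format (low 8 bits are the fraction).
 * This is a generic technique, not any IFI or PAK library. */
typedef int fix8;                       /* stores value * 256 */

#define TO_FIX(x)    ((fix8)((x) * 256))
#define FIX_MUL(a,b) ((int)(((long)(a) * (b)) >> 8))

/* Example: scale a 10-bit sensor reading by 0.75 with no floats at runtime. */
int scale_sensor(int raw)
{
    fix8 gain = TO_FIX(0.75);           /* 192/256, folded at compile time */
    return FIX_MUL(raw, gain);          /* raw * 0.75, integer math only */
}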

The CMUcam changes this. If we could read the raw data coming from the camera (as opposed to its processed output), we could feed it to a computer for analysis, processing, and whatnot. It is very difficult to outfit a PIC (even an 18F) to analyze a video stream in near-realtime. If a PC could be used, magical possibilities suddenly open up. We could have robots analyze the entire field in autonomous mode and react to different field conditions; the variable-position tetras this year would be just a small step. Imagine autonomous mode with variable starting locations and robot interaction. Beyond the CMUcam, new and powerful sensors could appear: robots might sport miniature radar/sonar/ranging tools to build a view of the field.
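To give a flavor of what "analysis and processing" could mean on the PC side, here's a naive color-blob centroid over one raw frame. The frame size and interleaved-RGB layout are pure assumptions on my part; the actual raw CMUcam dump format would have to be checked against its manual:

/* Sketch: naive color-blob centroid over one raw RGB frame on the PC side.
 * WIDTH/HEIGHT and the frame[y][x][c] layout (c = 0:R 1:G 2:B) are
 * assumptions, not the real CMUcam raw format. */
#define WIDTH  160
#define HEIGHT 120

static long find_green_centroid(const unsigned char frame[HEIGHT][WIDTH][3],
                                int *cx, int *cy)
{
    long count = 0, sx = 0, sy = 0;
    for (int y = 0; y < HEIGHT; y++) {
        for (int x = 0; x < WIDTH; x++) {
            const unsigned char *p = frame[y][x];
            /* "green enough": G is bright and dominates R and B by a margin */
            if (p[1] > 100 && p[1] > p[0] + 30 && p[1] > p[2] + 30) {
                sx += x; sy += y; count++;
            }
        }
    }
    if (count) { *cx = (int)(sx / count); *cy = (int)(sy / count); }
    return count;   /* blob size in pixels; 0 means nothing found */
}

In autonomous mode, that centroid could be scaled into a heading correction and sent back down to the IFI stack through the OI digital inputs described above.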

If you are very interested and have experience with esoteric electronics, there is an option. Get a book called "Troubleshooting IBM PCs". It is an old blue book from the early 1980s, and it includes the complete circuit schematics of the IBM PC, XT, and PCjr, the CGA and MDA video boards, and the printer control circuitry. You can try playing with an 8086 or 8088 CPU and linking it to the IFI controller, then adding the 8087 coprocessor. Such a setup would let you use 16-bit PC C compilers and linkers to generate code. But a warning: it will be exceptionally difficult and yield little more than some cool demos.

Things like this will be seen in the future. Right now, however, they would be too hard to implement, for many reasons: some technical, some logistical.

The idea of using Linux to assist robot control is definitely not a bad one. Gaining access to the full firepower of an x86 platform would give the electronics and programming teams a field day and an acute migraine at the same time.

And good luck!
__________________
-- vs, me@acm.jhu.edu
Mentor, Team 1719, 2007
Team 30, 2002-2005