Programming College Courses

Ok, so I do not know if this is the correct place for this thread, so work with me. I am already really good at mechanical and electrical things, but seeing as how I want to do a lot of my own robotics projects and be self-sufficient at them, I want to get into programming more. I want to sign up for a programming course at college, but I do not know how much I will get out of it. Understand that I know nothing, and I mean NOTHING, about programming. Well, maybe a little… it only took me until now to get the timer working on my Atmel AVR. So if I take a first-level or beginner's programming course in college, do you think I will get enough from just that one semester's worth of learning to do simple programs? I want to be able to do things like timed actions, basic driving PWM, and reading and interpreting sensors and data: basically what we do for our robots in competition.

I basically had the same skills as you: a lot of mechanical knowledge but no programming knowledge. I am currently taking an entry-level programming course (in Java), and really all we learned are the basics (arrays, loops, ifs, etc.). Right now? I wouldn't really know how to program a robot. What we learned wasn't based on that. You seem to have specific things you want to be able to do.

My suggestion is that even though you want to be self-sufficient, you should start out working with someone else, perhaps someone on your team. They should have knowledge that may be more specific to your needs. That way, you'll learn what you want to learn and won't be stuck in a class that isn't relevant to what you want to accomplish with code.

I have degrees in both Aeronautical and Mechanical engineering, but I've done mostly software (long story). One thing you need to remember is that programming a robot falls into the category of "embedded" software, which is quite a bit different from the common garden-variety programming taught in most beginning programming classes.

However, having said that, you have to start somewhere. Most of the programming constructs are the same. So my advice would be to take a C or C++ class to start. Don’t bother with toy languages like Java or C# that are in vogue now.

When you've learned the basic things like using variables, loops, etc., start taking special care to learn how to "fiddle bits," as that is generally very important in embedded programming. Then you can graduate to understanding and using interrupts. From there, learn about memory and processor architectures, because the kinds of bugs you'll run into are sometimes esoteric. You have to understand much more about the type of hardware your program is running on.
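
I haven't used AVRs myself, but just so you can see the shape of it, here's a rough sketch of a timer overflow interrupt with avr-gcc and avr-libc. The register and vector names are what I'd expect from the ATmega16 datasheet, so verify them against yours:

```c
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t tick_count = 0;  /* shared with the ISR, so volatile */

/* Runs every time Timer1 overflows. Keep ISRs short! */
ISR(TIMER1_OVF_vect)
{
    tick_count++;
}

int main(void)
{
    TCCR1B = (1 << CS11);   /* start Timer1 with a clk/8 prescaler */
    TIMSK  = (1 << TOIE1);  /* enable the Timer1 overflow interrupt */
    sei();                  /* globally enable interrupts */

    for (;;) {
        /* main loop keeps running; the ISR bumps tick_count
           in the background */
    }
}
```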

The biggest thing to remember is that there are very hard time constraints in most embedded programming. In a regular desktop program, if you have a long calculation to do, you just put up the hourglass and tell the user you'll get back to them when you're through crunching the numbers. In an embedded system, you need to keep monitoring what's going on. If you go away to calculate pi to several million digits, your robot is going to crash into a wall, or IFI is going to shut it down for not tripping the watchdog timer soon enough. You've probably heard from the programmers on your team about the infamous 26.2 millisecond main loop.
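
In practice that means structuring your main loop so nothing ever blocks for long. Here's a sketch of the idea in plain C; every function name in it is hypothetical, just a stand-in for your real hardware code:

```c
/* Hypothetical hardware hooks; stand-ins for your real code. */
static void read_sensors(void)   { /* quick register reads */ }
static void update_outputs(void) { /* set PWM duty cycles  */ }
static void kick_watchdog(void)  { /* pet the watchdog     */ }
static void do_one_chunk_of_slow_calculation(void) { /* small slice */ }

int main(void)
{
    for (;;) {
        read_sensors();      /* never waits */
        update_outputs();
        kick_watchdog();     /* miss this and the controller resets */

        /* A long job is split into small slices so no single
           pass of the loop blows the timing budget. */
        do_one_chunk_of_slow_calculation();
    }
}
```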

That is basically what I was thinking. Right now I am working on an embedded system. The basic programming, such as variables, loops, and all that, is easy to figure out and learn about. One of my biggest problems is bit fiddling. Curse those datasheets. It is an ATmega16, and about 95% of the coding I have done, or gotten from people to learn from, has been defining bits to configure it to do what you want.
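
For example, just getting PWM out of Timer1 on this chip means lines like these (the bit names are my reading of the ATmega16 datasheet, so I may have details wrong):

```c
#include <avr/io.h>

int main(void)
{
    /* OC1A is pin PD5 on the ATmega16; make it an output. */
    DDRD |= (1 << PD5);

    /* Timer1: 8-bit fast PWM, non-inverting output on OC1A,
       clock divided by 8. Every one of these is a bit name
       straight out of the datasheet's register descriptions. */
    TCCR1A = (1 << COM1A1) | (1 << WGM10);
    TCCR1B = (1 << WGM12) | (1 << CS11);

    OCR1A = 128;  /* ~50% duty cycle out of 255 */

    for (;;) {
        /* PWM runs in hardware; nothing to do here */
    }
}
```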

Also, to Tim: I have approached people on my team asking for their help, but I have had no luck getting anyone to help me.

Last time I checked, Mike from 237 was pretty knowledgeable with AVRs. I'm sure he wouldn't mind helping. Actually, I'm positive he'll lend you a hand. He owes some favors after the past two build seasons.

A little history on me and Mike: Mike was a freshman last year, and needed help with interfacing the camera to the bot, and controlling outputs. I taught him all I knew. My camera got ripped off at the first regional. He capped the vision tetra in NJ. Go figure…

If you have specific questions, feel free to contact me directly. I haven’t worked with AVRs but I’ve poked around at the processor level on PCs of all ages and, of course, the PIC robot controllers. It can’t be that obtuse.

If you're looking at C/C++, pay particular attention to the difference between the logical operators (&&, ||, and !) and the bitwise operators (~, &, |, and ^). Look up the descriptions of "masking" for isolating and manipulating individual bits.
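
Here's a quick example in plain C of both masking and the logical-versus-bitwise distinction:

```c
#include <stdio.h>

int main(void)
{
    unsigned char reg = 0x5A;              /* pretend register: 0101 1010 */

    unsigned char bit3 = (reg >> 3) & 1;   /* isolate bit 3   */
    reg |=  (1 << 0);                      /* set bit 0       */
    reg &= ~(1 << 6);                      /* clear bit 6     */
    reg ^=  (1 << 1);                      /* toggle bit 1    */

    /* Logical vs. bitwise: 0x40 && 0x02 is 1 (both nonzero),
       but 0x40 & 0x02 is 0 (no bits in common). */
    printf("bit3=%u reg=0x%02X\n", (unsigned)bit3, (unsigned)reg);
    printf("&& gives %d, & gives %d\n", 0x40 && 0x02, 0x40 & 0x02);
    return 0;
}
```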

In a somewhat related vein, it's helpful to know whether your system represents multibyte variables in little-endian or big-endian form. (The origin of those terms is kind of interesting.)
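
If you want to see which one your machine uses, the usual quick check in C is:

```c
#include <stdio.h>

int main(void)
{
    unsigned int x = 1;
    unsigned char *p = (unsigned char *)&x;

    /* If the first (lowest-addressed) byte holds the 1,
       the least significant byte comes first: little endian. */
    printf("%s endian\n", (*p == 1) ? "little" : "big");
    return 0;
}
```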

Another architectural detail is how I/O is addressed by the CPU. Is there a separate I/O address space, or is it mapped into the memory space of the processor? Desktop applications generally don't care about this, but with a lot of smaller embedded systems, you're on your own.
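
With memory-mapped I/O, a device register is just a magic address, so the C idiom looks something like this (the address is invented purely for illustration; the real map comes from your part's datasheet):

```c
#include <stdint.h>

/* Hypothetical memory-mapped output port at address 0x4000.
   volatile tells the compiler every read/write really happens
   and can't be optimized away or reordered. */
#define PORT_OUT (*(volatile uint8_t *)0x4000u)

void set_led(int on)
{
    if (on)
        PORT_OUT |=  0x01;   /* set bit 0   */
    else
        PORT_OUT &= ~0x01;   /* clear bit 0 */
}
```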

I don't know about other embedded systems, but with AVRs and what I am doing now, the code is pretty much 90% bit definitions. So right now it is just hard for me to know what I have to define for which options, and to find all the bits I need in the datasheet.
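
For example, just to read a sensor on the ADC, the datasheet digging gets me something like this (double-check my bit names before trusting them):

```c
#include <avr/io.h>

/* Read one sample from ADC channel 0..7 on an ATmega16. */
uint16_t read_adc(uint8_t channel)
{
    ADMUX = (1 << REFS0) | (channel & 0x07); /* AVcc reference, pick channel */
    ADCSRA = (1 << ADEN)                     /* enable the ADC               */
           | (1 << ADSC)                     /* start a conversion           */
           | (1 << ADPS2) | (1 << ADPS1);    /* clk/64 ADC prescaler         */

    while (ADCSRA & (1 << ADSC))             /* wait for conversion to end   */
        ;
    return ADC;                              /* 10-bit result                */
}
```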

As for taking the college course, I was also wondering because I hear that a lot of people work with Motorola something-or-others, and I wanted to ask if that would be better for robotics applications, or if something like an AVR would be fine for robotics. I do not know what kind of bit definitions need to be done for Motorolas or even PICs.