Quote:
|
Originally Posted by Anthony Kesich
Thanks IrisLab. That's exactly what I thought it was, but I wanted to confirm. Seeing as 17 ms is just faster than 1/59th of a second, it seems to me that interrupts are overkill and you could make much better use of your programming space (800 lines), but I'll still keep them in mind if they ever become a necessity.
|
Actually, that 17 ms control loop figure is just off the top of my head. I believe it's close, but it's only approximate, and more importantly, the loop is not guaranteed to run every 17 ms. If your code grows too long, a single pass will easily stretch beyond that 17 ms cycle. The 17 ms is NOT guaranteed.
With interrupts, you are essentially guaranteed to get things done with much greater timing precision.
A common technique is to set up a timer interrupt, like a high-precision alarm clock. You could, for example, configure the timer to fire every 6 ms. The interrupt will occur at that regular interval regardless of how long your main-loop code runs; you are guaranteed that the handler runs every 6 ms.
The key benefits of interrupts are GUARANTEED execution and precise TIMING. I wouldn't casually write those capabilities off, especially for a good autonomous mode.