#15
31-03-2004, 02:00
TedP
Re: counting in seconds for the autonomous mode??

Quote:
Originally Posted by KenWittlief
Hmmmm... I don't understand TedP's aversion to counting SW loops.

Someone has tested the timing of the statusflag.NEW_SPI_DATA event: it's controlled by a timer in the communications loop, and testing has shown it's dead accurate every time, unless you have so much code that it takes more than 26 ms to execute.

statusflag.NEW_SPI_DATA will be true 38 times a second - you can even round it off to 40 to make it simpler - I can't imagine anyone needing finer resolution than 1/38th of a second while setting up the sequence of events in auton mode?
For one, there's no reason your autonomous mode code has to run at the slow rate. Call Getdata() only when that flag goes high, and otherwise loop around as fast as you can. You can call Generate_Pwms() on every pass as well, because those are the fast PWMs and they are handled specially. Putdata() can likewise be called as often as you like: IFI's documentation specifically says that no harm is done by calling Putdata() too fast, though nothing is necessarily gained either.

Having your timing driven by an interrupt from the terminal count of a hardware timer, set to fire every 1/100th of a second (or even finer), seems much simpler than counting loop passes to do the same job. You can also reuse that counter later in your competition mode code.

It just seems much cleaner to have an interrupt manage all your timing for you. You get fairly fine time resolution, and you don't have to worry about incrementing a counter on time: run your code as slowly as you like, and the interrupt will jump in to increment your "clock" whenever it needs to.

If you do things the more naive way and round 38 loops per second off to 40, your clock drifts by about a second every twenty, and you also lose the ability to tell apart events that happen between loop passes. That may not be a huge deal, but it could definitely use improvement, and simple interrupt-driven timing provides it. That's all I'm saying.