Quote:
Originally Posted by brianafischer
What is the processor overhead for an analog input signal compared to a digital input signal with a low-priority interrupt? Discussing the connection of the MaxSonar-EZ1 sonar sensor brought up this topic.
I know this may depend on what code is used, but I would like to stir up a discussion regarding this topic for all variations of code (Kevin's modified ADC code, the default code, ...)
Also, can anyone recommend some good documents to read on this issue? The "Interrupts for Dummies" paper did not cover the topic of analog interrupts...
Thanks!
The analog input isn't really an interrupt; you just call a read function and it returns the current value of the analog pin. Kevin Watson's code uses timer and ADC-complete interrupts to take multiple samples and average them in the background, but to get a value you still just call Get_ADC_Result(X). The digital interrupt for the sonar will probably take more processing power, because you are trying to decode a pulse-width signal.
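To make the difference concrete, here is a minimal sketch of what a polled analog read looks like in the user code. It assumes the standard IFI default routine Get_Analog_Value and Kevin Watson's Get_ADC_Result; the header names, the channel numbers, and the function name Read_Sonar_Analog are placeholders, so adjust them to your project. (In a real program you would use one read method or the other, not both.)

/* Polled analog reads -- a sketch, not copied from any particular project. */
#include "ifi_aliases.h"   /* assumed: defines rc_ana_in01, etc. */
#include "ifi_default.h"
#include "adc.h"           /* assumed: Kevin Watson's ADC library */

void Read_Sonar_Analog(void)
{
    unsigned int sonar_raw;

    /* IFI default code: one blocking conversion, no interrupts involved. */
    sonar_raw = Get_Analog_Value(rc_ana_in01);

    /* Kevin's ADC code: sampling and averaging run in the background off
     * timer and ADC-complete interrupts; this call just returns the latest
     * averaged result for the channel. */
    sonar_raw = Get_ADC_Result(1);
}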
If you are not using Kevin's ADC code, then it will be faster to just use the regular IFI analog read. If you want increased analog accuracy, use his wonderful ADC code, though digital I/O might still be faster overall. You will need to play with a timer to decode the pulse width correctly...
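Here is a rough sketch of the timer-based pulse-width decoding I mean, assuming the MaxSonar-EZ1's PW output is wired to an external-interrupt digital input and that Timer 1 is free. The names Sonar_Init, Sonar_Edge_Handler, Sonar_Range_Inches, and TICKS_PER_INCH are hypothetical; OpenTimer1/WriteTimer1/ReadTimer1 come from the Microchip C18 timers library. You would call the handler from your external-interrupt ISR, passing the pin's edge direction.

#include <timers.h>   /* Microchip C18 timer routines */

/* Placeholder: ticks per inch depends on your clock and prescale.  The EZ1
 * quotes roughly 147 us of pulse width per inch; at 0.8 us per tick
 * (10 MHz instruction clock, 1:8 prescale) that is about 184 ticks. */
#define TICKS_PER_INCH 184

static volatile unsigned int sonar_pulse_ticks;  /* last measured pulse width */

/* Call once at init: Timer 1 free-running from the instruction clock,
 * 1:8 prescale (placeholder settings). */
void Sonar_Init(void)
{
    OpenTimer1(TIMER_INT_OFF & T1_16BIT_RW & T1_SOURCE_INT &
               T1_PS_1_8 & T1_OSC1EN_OFF & T1_SYNC_EXT_OFF);
}

/* Hypothetical: call this from the external-interrupt handler for the pin
 * the sonar's PW line is on; rising_edge is 1 on a rising edge, 0 on falling. */
void Sonar_Edge_Handler(unsigned char rising_edge)
{
    if (rising_edge)
    {
        WriteTimer1(0);                    /* pulse started: restart the timer */
    }
    else
    {
        sonar_pulse_ticks = ReadTimer1();  /* pulse ended: ticks = pulse width */
    }
}

/* Read from the slow loop: convert the latest pulse width to inches. */
unsigned int Sonar_Range_Inches(void)
{
    return sonar_pulse_ticks / TICKS_PER_INCH;
}

One caveat: reading sonar_pulse_ticks in the slow loop isn't atomic on an 8-bit PIC, so if you see occasional glitches you may want to briefly disable that interrupt around the read.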
Sorry, I don't know of any docs on this specific subject.