Analog vs Digital Interrupts and Processor Overhead

What is the processor overhead for an analog input signal compared to a digital input signal with a low-priority interrupt? Discussing the connection of the MaxSonar-EZ1 sonar sensor brought up this topic.

I know this may depend on what code is used, but I would like to stir up a discussion regarding this topic for all variations of code (Kevin’s modified ADC code, the default code, …)

Also, can anyone recommend some good documents to read on this issue? The “Interrupts for Dummies” paper did not cover the topic of analog interrupts…

Thanks!

The analog input isn’t really an interrupt; rather, you just call it up and it will get the current value of the analog pin. Kevin Watson’s code uses timer and ADC-finished interrupts to take multiple samples and average them, but to get a value you still just call up Get_ADC_Result(X). The digital interrupt for the sonar will probably take more processing power because you are trying to decode a PWM signal.
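
For example, with Kevin’s code, reading the averaged value is just a call (function names here are from his ADC library - verify against the copy you have):

  // Initialize_ADC() and the interrupt hookups are done once at startup
  unsigned int range;

  range = Get_ADC_Result(1);   // averaged result for analog channel 1

No waiting around - the sampling and averaging already happened in the background.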

If you are not using Kevin’s ADC code, then it will be faster to just use the regular IFI analog routines. If you want increased analog accuracy, then use his wonderful ADC code, though digital I/O might be faster. You will need to play with a timer to decode the PWM right…
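
To give an idea of the timer work involved if you used the PW output instead: a rough sketch, assuming the PW pin is wired to an external-interrupt pin like INT1 and Timer1 is free-running in 16-bit mode (the pin choice and hookup are illustrative only - the EZ1’s PW output is about 147us per inch):

  // Measure the sonar PW pulse width with INT1 + Timer1.
  volatile unsigned int pulse_width;   // width in Timer1 counts

  void Sonar_PW_Handler(void)          // call from your low-priority ISR dispatcher
  {
      unsigned char lo;

      if (INTCON2bits.INTEDG1)         // we were armed for the rising edge
      {
          TMR1H = 0;                   // write high byte first;
          TMR1L = 0;                   // low-byte write loads both
          INTCON2bits.INTEDG1 = 0;     // now catch the falling edge
      }
      else
      {
          lo = TMR1L;                  // reading TMR1L latches TMR1H
          pulse_width = ((unsigned int)TMR1H << 8) | lo;
          INTCON2bits.INTEDG1 = 1;     // re-arm for the next pulse
      }
      INTCON3bits.INT1IF = 0;          // clear the INT1 flag
  }

  // inches = pulse_width * (Timer1 tick in us) / 147

That edge-flipping and timer bookkeeping on every pulse is the extra overhead versus just reading a held analog voltage.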

Sorry, I don’t know of any docs on this specific subject.

I’m making a couple of assumptions about how you’ve set up the MaxSonar sensor:

  • holding RX high, so it’s in continuous operation
  • using the analog output instead of the RS-232 serial or PWM output

In this case the sonar goes through a 49ms calibration cycle and then outputs the first range reading 49ms after that (so ~100ms until the first output). The analog output voltage is held until the next sample is ready ~49ms later, then changed to the new value, and so on. So the analog voltage of ~10mV/inch is output and held continuously, updated at ~20Hz (every ~50ms) after an initial setup time of ~100ms.
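
As a sanity check on the scaling: the MaxSonar data sheet gives the analog output as Vcc/512 per inch (~9.8mV/inch on a 5V supply, hence the 10mV/inch figure). A 10-bit ADC referenced to that same 5V supply resolves Vcc/1024 per count, so the conversion works out to almost exactly 2 counts per inch (assuming the sensor and the ADC share the reference):

  //   sonar:  Vcc/512  volts per inch
  //   ADC:    Vcc/1024 volts per count
  //   =>  (Vcc/512) / (Vcc/1024) = 2 counts per inch
  range_inches = adc_counts >> 1;   // halve the 10-bit reading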

This means you can just read the analog voltage whenever you need a range, anytime after the 100ms setup, as long as the sensor is in continuous operation mode. Using the standard C18 get-analog library calls, the analog conversion cycle is done in-line, not via interrupts. The analog conversion cycle takes some setup - called a pre-charge time - before the conversion can actually start. A conversion is started by setting the ADC “GO” bit. When the conversion is complete, the GO bit is cleared and the ADC IF (interrupt flag) bit is set. If the ADC IE (interrupt enable) bit is set then an interrupt occurs, but the interrupt flag is always set regardless.

Using standard C18 library calls, it is something like:

OpenADC(…);   // done once to set up the ADC
:
.

SetChanADC(channel);
// make sure we wait at least 2.4us here before starting conversion…
ConvertADC();
while (BusyADC() != 0);
range = ReadADC();
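
For reference, at the register level those library calls boil down to something like this (a sketch based on the 18F8722 data sheet - it assumes ADCON1/ADCON2 were already configured for right-justified results, which is what OpenADC does):

#include <delays.h>

ADCON0 = 0x01 | (channel << 2);   // ADON = 1, select the input channel
Delay10TCYx(3);                   // 30 instruction cycles = 3us pre-charge at 10 MIPS
ADCON0bits.GO = 1;                // start the conversion
while (ADCON0bits.GO);            // GO is cleared by hardware when done
PIR1bits.ADIF = 0;                // the IF bit was set whether or not IE is enabled
range = ((unsigned int)ADRESH << 8) | ADRESL;   // 10-bit result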

In EasyC:

range = GetAnalogInput( channel );

If you are doing this a lot, then the waiting around for the ADC operation to complete is just a bunch of wasted time. So instead, using Kevin’s or similar code, many folks move the ADC conversion to being interrupt-driven. That way all the analog conversions needed are done in the background. A table of values is constantly refreshed with the latest conversion results, and users can just read this table when needed to get the last available information. Note that the interrupts themselves are still digital; in this case the interrupt routine is driven off the ADC IF (interrupt flag), which is set when the ADC operation completes.

If you are only using the MaxSonar analog input and few or no others, then just calling a routine to do the conversion when you need it is pretty simple - especially since the analog range output will only change 20 times a second. Putting the read in the 26.2ms “slow” loop means you’d be sampling at about 38Hz - more than fast enough.
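
With the IFI default code, that would look something like the following (Get_Analog_Value() and rc_ana_in01 are the default code’s names - check your ifi_aliases.h - and I’m assuming the sonar is on analog input 1):

  /* in Process_Data_From_Master_uP(), which runs every 26.2ms */
  unsigned int sonar_counts;

  sonar_counts = Get_Analog_Value(rc_ana_in01);   // in-line ADC conversion
  // range in inches is roughly sonar_counts / 2 (see the scaling note above)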

One way of making the conversion go faster is to hide the required analog input pre-charge time. Once a conversion operation is started, the analog input section is disconnected, leaving it free to start the next conversion’s pre-charge cycle. If you are only doing one analog conversion, then the pre-charge is automatic since you’ve already set the input channel into the ADC. If you are doing a bunch of analog conversions, then within a couple of instruction cycles of setting the GO bit for the current conversion you can select the next channel you want converted, so that its pre-charge time occurs while the current analog-to-digital conversion is taking place, and so on.
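
Done in-line, that overlap looks something like this two-channel sketch (the register usage mirrors the interrupt excerpt below; a sketch, not tested):

  // channel 0 was selected earlier, so its pre-charge is already satisfied
  ADCON0bits.GO = 1;               // start converting channel 0
  ADCON0 = 0x03 | (1 << 2);        // immediately select channel 1 so its
                                   //  pre-charge runs during the conversion
  while (ADCON0bits.GO);           // wait for channel 0 to finish
  ch0 = ((unsigned int)ADRESH << 8) | ADRESL;

  ADCON0bits.GO = 1;               // channel 1 needs no pre-charge wait now
  while (ADCON0bits.GO);
  ch1 = ((unsigned int)ADRESH << 8) | ADRESL;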

Important information can be gleaned by reading the 18F8722 data sheet available from Microchip - particularly Section 21, “10-bit Analog-to-Digital Converter”. This file is also easily available at Kevin’s website. Reading Kevin’s ADC code is highly recommended.

The following is excerpted from ADC interrupt code done a couple of years ago. There are two interrupts that need to be serviced: the ADC-complete interrupt and a timer interrupt. The timer is used to start an ADC operation at specific intervals. The interval chosen depends on how many analog channels you want converted and at what sample frequency. You don’t want the sample frequency too high or you can swamp the processor with too many interrupts.
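
As a cross-check on the interval: the reload value used below (TMR3 = 0xFAEA = 64234) leaves 65536 - 64234 = 1302 counts until overflow, and with T3CON = 0x81 (16-bit mode, 1:1 prescale, internal clock) each count is one 100ns instruction cycle at the RC’s 10 MIPS. So 1302 x 100ns ≈ 130us, which is where the 130us figure in the comment comes from.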

  // INT_adcclock();   adcclock currently set to 130us resolution 
  // ------------------------------------------------------------
  // timer3 (adcclock) overflow bit interrupt
  if (INTTMR3_flag && INTTMR3_enable)       
  {
            INTTMR3_flag = 0;

            // NOTE: We set up the currently selected channel pre-charge
            //        after we kicked off the last conversion request.  Pre-charge
            //        time has been satisfied so just start the next conversion...
            ADCON0bits.GO = 1;

            // need minimum 2 instruction cycles before new channel select
            // need to figure out the next channel which takes at least that long
            if (adc_channel) ndx = adc_channel-1;
            else             ndx = MAX_ADC_CHANNEL;

            // select the next channel for pre-charge setup so when current
            // operation is complete, next channel is ready to start...
            ADCON0 = 0x03 | (ndx<<2);

            // restart ADC interval timer.
            TMR3H = 0xFA;
            TMR3L = 0xEA;                   	
            T3CON = 0x81;
  }

  // INT_adc() 
  // ----------------------------------------------------------------
  if (INTADC_flag && INTADC_enable)
  {
            INTADC_flag = 0;

            // Save the results of the conversion
            adc_data[adc_channel].hi_byte = ADRESH;
            adc_data[adc_channel].lo_byte = ADRESL;

            // Setup for next channel index, we're scanning down MAX-0
            if (adc_channel) --adc_channel;
            else               adc_channel = MAX_ADC_CHANNEL;
  }

Just a snippet to think about.
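
One caution if you read a table like this from your main loop: the hi/lo bytes are updated inside the interrupt routine, so briefly hold off the ADC interrupt while you grab the pair or you can catch a half-updated value. A sketch using the names from the excerpt (PIE1bits.ADIE is the 18F8722’s ADC interrupt enable bit):

  unsigned int get_range(unsigned char channel)
  {
      unsigned int result;

      PIE1bits.ADIE = 0;     // hold off the ADC-complete interrupt
      result = ((unsigned int)adc_data[channel].hi_byte << 8) |
               adc_data[channel].lo_byte;
      PIE1bits.ADIE = 1;     // re-enable; a pending ADIF is serviced now
      return result;
  }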