# Analogs.....

I probably know the answer to this question, but I’m not that good with technical terms. What is an analog input? What’s the difference between an analog input and, say, a digital one?

In the FIRST controller there are analog and digital inputs/outputs. The analog ones accept a voltage from 0 - 5v, and give a value of 0 - 1023 in code. The digital ones read a 0 [zero] when there is contact [between GND and the signal pin], and a 1 [one] when the pins are left open.

Jake,
Basically, a digital signal is an analog signal that follows certain rules.

An analog signal is basically ANY signal. It has no restrictions. Digital signals are (in the case of the 5v CMOS levels used on the RC) signals whose voltages ideally only equal 5V or 0V.

Those signals, because they can have only 2 values, can be encoded as a 0 or 1 (a single bit) in the microcontroller. The analog signals (bounded to 0V to 5V for this microcontroller) can be anything in between. The values are therefore encoded as a number between 0 and 1023 (if using the ADC in 10-bit mode) or 0 to 255 (8-bit mode). The values are evenly distributed across the range, so Voltage = Code * 5V / 2^bitwidth, where bitwidth is the setting of the ADC (8 or 10).

Cheers!
-Joe

As many have said, an analog input is a range of values. If you have a 10 bit ADC (Analog to Digital Converter) it will be a 10 bit number (0 to 1023), an 8 bit ADC is 0 to 255, etc. Our controller has a 10 bit ADC.

Analog devices work by taking a voltage in, applying a resistance that varies with some mechanical part, and then outputting the modified voltage. That modified voltage is connected to the input pin of an ADC. The ADC is also connected to what is called a reference voltage; the input voltage is compared against this reference. It’s kinda hard to explain, examples show it better:

Reference voltage: 5v
Input voltage: 5v
Output given: 1023

Reference voltage: 5v
Input voltage: 5v
Output given: 255 [with an 8-bit ADC; a 10-bit ADC gives 1023 as above]

Reference voltage: 5v
Input voltage: 0v
Output given: 0