An analog-to-digital converter (ADC) acts as an interface between the continuous quantities of the analog world and the digital values used in a computing system. An ADC accepts an unknown analog signal (typically a voltage) and converts it into an n-bit digital word that represents the ratio between the input voltage and the full-scale range of the converter.
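As an illustration of this ratio, the following sketch computes the code produced by an ideal n-bit unipolar ADC. The function name, the full-scale parameter v_fs, and the rounding convention (simple truncation, no half-step offset) are assumptions made for the example, not details taken from the original text.

```python
def ideal_adc_code(v_in: float, v_fs: float, n_bits: int) -> int:
    """Return the output code of an ideal n-bit unipolar ADC.

    The code is the truncated ratio between the input voltage and the
    full scale, scaled to the 2**n_bits available levels.
    """
    if v_in <= 0.0:
        return 0                      # inputs at or below zero clip to the lowest code
    levels = 2 ** n_bits
    code = int(levels * v_in / v_fs)  # ratio of input to full scale
    return min(code, levels - 1)      # inputs at or above full scale clip to the top code
```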
Generally, ADCs include, or must be preceded by, a sample-and-hold circuit to prevent the input voltage from changing during the conversion. The input-output relationship of an ideal three-bit unipolar converter is shown in the following diagram:
The output assumes that the encoded values are ...
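Although the diagram is not reproduced here, the ideal three-bit unipolar characteristic it refers to can be sketched numerically. The 8 V full scale below is only an assumed value for the example, and the characteristic uses plain truncation rather than the customary half-step offset.

```python
# Print the nominal input range associated with each code of an ideal
# three-bit unipolar converter (assumed full scale: 8.0 V, i.e. 1 V per step).
N_BITS = 3
V_FS = 8.0                       # assumed full-scale voltage for the example
STEP = V_FS / 2 ** N_BITS        # width of one quantization step (1 LSB)

for code in range(2 ** N_BITS):
    low = code * STEP
    high = low + STEP
    print(f"{code:03b}: {low:.3f} V <= Vin < {high:.3f} V")
```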