This is the simplest form of ADC and the fastest, as the name implies.
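The "fastest, as the name implies" remark suggests this is the flash type of converter, which compares the input against every quantization threshold at once. Below is a minimal Python sketch of an ideal 3-bit flash converter; the function name and the ideal-comparator behavior are my own illustration, not from the original text:

```python
def flash_adc(v_in, v_ref=5.0, bits=3):
    """Simulate an ideal flash ADC: one comparator per quantization threshold."""
    levels = 2 ** bits
    step = v_ref / levels                 # volts per code step
    # Comparator bank: every tap checks the input at the same instant.
    thermometer = [v_in >= step * (i + 1) for i in range(levels - 1)]
    # A priority encoder would turn this thermometer code into binary;
    # counting the tripped comparators does the same job here.
    return sum(thermometer)

print(flash_adc(2.5))   # mid-scale input on a 3-bit, 5 V converter -> code 4
```

Because all the comparators work in parallel, the conversion completes in a single step, which is why the flash topology trades component count (2^n − 1 comparators) for speed.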
Registers can only accept logic levels as inputs, so the results would not be useful if you were to connect an analog signal directly to a logic input; something has to act as an interface between the analog input voltage and the digital logic. Here are some essential characteristics of ADCs; we will see how they work as we go through them.

No ADC is absolute, of course, so the voltage mapped to the maximum binary value is referred to as the reference voltage. For example, in a 10-bit converter with 5V as the reference voltage, 1111111111 (all bits one, the highest possible 10-bit binary number) corresponds to 5V, and 0000000000 (the lowest number) corresponds to 0V. Since 10 bits allow 1024 possible values, each binary step up is around 4.9mV. This measure of 'volts per bit' is called the resolution of the ADC. What if the change in voltage is below 4.9mV per step? That change falls in a dead zone, so there is always a slight error in the conversion outcome; using an ADC with a higher resolution reduces this error.

The number of analog-to-digital conversions that the converter performs per second is called the sample rate. A very good ADC, for example, can have a sample rate of 300Ms/s, read as mega-samples per second: 300 million samples per second. ADCs of up to 24 bits are available, but their conversion rates are low, on the order of a few hertz. The achievable sample rate depends entirely on the type of converter and the precision required. The general rule of thumb is that speed and accuracy are more or less inversely proportional, so the ADC must be chosen to suit the application: if a very precise reading is required, the ADC usually spends more time looking at the input signal (typically through a sample-and-hold or integrating input stage), and if accuracy is not a concern, the reading can be quick and dirty.
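The step-size arithmetic above can be checked numerically. This is a minimal sketch assuming a 10-bit converter with a 5V reference; the function names are illustrative, not from any particular library:

```python
def lsb_size(v_ref, bits):
    """Volts per binary step: the converter's resolution."""
    return v_ref / 2 ** bits

def quantize(v_in, v_ref=5.0, bits=10):
    """Map an input voltage to the digital code an ideal ADC would report."""
    code = int(v_in / lsb_size(v_ref, bits))
    return min(code, 2 ** bits - 1)      # clamp at full scale (all bits one)

step = lsb_size(5.0, 10)
print(round(step * 1000, 2))    # 4.88 -> roughly the 4.9 mV per step quoted above
print(quantize(5.0))            # 1023: full scale reads as all bits one
print(quantize(0.002))          # 0: a 2 mV input sits below one step, in the dead zone
```

The last call shows the dead zone in action: any input change smaller than one step produces no change in the output code.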
For example, a temperature sensor such as the LM35 outputs a temperature-dependent voltage; in the case of that particular part, it rises 10mV per degree of temperature increase. If we linked this output directly to a digital input, it would register as either high or low depending on the input thresholds, which is completely useless. What we need is something that can convert a voltage into a series of logic levels that can be stored, for example, in a register. So instead of a plain digital input, we use an ADC to convert the analog voltage into a set of bits that can be placed directly on the microprocessor's data bus and used for computation. Imagining the ADC as a mathematical scaler is a good way to look at the process: scaling is simply mapping values from one set to another, and an ADC maps a voltage value to a binary number.
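To make the LM35 example concrete, here is a hypothetical sketch of the scaling a program would perform on a raw reading, assuming a 10-bit converter referenced to 5V (with all bits one mapping to the reference voltage) and the LM35's 10mV-per-degree slope; the function names are my own:

```python
def adc_code_to_volts(code, v_ref=5.0, bits=10):
    """Invert the ADC mapping: digital code back to the input voltage."""
    return code * v_ref / (2 ** bits - 1)

def lm35_temperature(code, v_ref=5.0, bits=10):
    """The LM35 output rises 10 mV (0.010 V) per degree Celsius."""
    return adc_code_to_volts(code, v_ref, bits) / 0.010

# A raw 10-bit reading of 51 corresponds to about 0.25 V, i.e. roughly 25 degrees C.
print(round(lm35_temperature(51), 1))   # prints 24.9
```

This is exactly the "mathematical scaler" view: the ADC maps volts to a code, and software maps the code back to an engineering unit.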
Definition of ADC (Analog to Digital Converter): an analog-to-digital converter is a circuit that converts a continuous voltage value (analog) into a binary value (digital) that can be interpreted, and then used for computation, by a digital computer. These ADC circuits can be found on their own as individual ADC ICs or integrated into a microcontroller. Today's electronics are solely digital; the good old days of analog computers are gone. Unfortunately for digital systems, the world in which we live is still analog, full of color rather than just black and white.