
I've just started learning about quantization and I'm a little confused. My current understanding is that the step size is given by the formula (Vmax - Vmin)/2^n, where n is the number of bits used.

However, using the example of Vmax = 5 V, Vmin = 0 V and n = 3, I get 625 mV as the step size, and this is where I'm confused. If 000 = 0 V, then 001 would be 1 × 625 mV = 625 mV, and so on. But 111 would be 7 × 625 mV, which is only 4.375 V. Shouldn't 111 equal approximately 5 V in this case?

If I were to use the formula (Vmax - Vmin)/(2^n - 1) instead, it feels like it makes more sense, but the information I find online is conflicting on this point.
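
To make the comparison concrete, here is a quick Python sketch (my own, just to illustrate the question) that tabulates the code levels under both denominators:

```python
# 3-bit converter spanning 0 V to 5 V
V_MAX, V_MIN, N = 5.0, 0.0, 3

step_a = (V_MAX - V_MIN) / 2**N        # 0.625 V  (divide by 2^n)
step_b = (V_MAX - V_MIN) / (2**N - 1)  # ~0.714 V (divide by 2^n - 1)

for code in range(2**N):
    print(f"{code:03b}: {code * step_a:.3f} V  vs  {code * step_b:.3f} V")
```

With 2^n the top code 111 lands at 4.375 V; with 2^n - 1 it lands exactly at 5 V, which is the discrepancy I'm asking about.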


1 Answer


Your example is correct. For a 3-bit DAC the maximum output voltage would be 4.375 V. For a simple ADC the output value changes from 110 to 111 when the input voltage exceeds 4.375 V.
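
As a minimal sketch (assuming an ideal truncating converter, with hypothetical function and parameter names), the transition is easy to see in code:

```python
def adc_code(vin, vmin=0.0, vmax=5.0, n=3):
    """Ideal n-bit ADC: step = full scale / 2^n; top code clips at 2^n - 1."""
    step = (vmax - vmin) / 2**n           # 0.625 V in this example
    code = int((vin - vmin) / step)       # truncate to the code below vin
    return min(max(code, 0), 2**n - 1)    # clamp to the valid code range

print(f"{adc_code(4.374):03b}")  # 110: input still below 4.375 V
print(f"{adc_code(4.376):03b}")  # 111: input has exceeded 4.375 V
```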

This is just the way these converters work. For large values of \$N\$ (the number of bits) the difference between \$2^N\$ and \$2^N - 1\$ does not limit the use of the converters in practice. Here is the formula provided in the datasheet for the Atmel ATmega328, where \$N = 10\$:

\$\$\text{ADC} = \frac{V_{IN} \cdot 1024}{V_{REF}}\$\$
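
As a quick numeric check (my own example values, not from the datasheet), a mid-scale input gives a mid-scale output code:

```python
# ATmega328 10-bit ADC formula: ADC = Vin * 1024 / Vref (result truncated)
V_REF = 5.0   # assumed reference voltage
V_IN = 2.5    # assumed input voltage

adc = int(V_IN * 1024 / V_REF)
print(adc)  # 512 -> half of full scale; the maximum code is 1023, not 1024
```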
