Can somebody please let me know how to solve this problem:

A 12-bit A/D converter has an input range of ±10 V and is connected to an input amplifier with a programmable gain of 1, 10, 100, or 500. The connected transducer has a maximum output of 7.5 mV. Select the appropriate gain to minimize the quantization error, and compute the quantization error as a percentage of the maximum input voltage.
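
For reference, a minimal sketch of the starting quantities, assuming the usual textbook definitions (one LSB = full-scale span / 2^N, quantization error = ±LSB/2; the variable names are my own):

```python
# Ideal 12-bit ADC over a +/-10 V span, using the usual textbook
# definitions: one LSB = span / 2^N, quantization error = +/- LSB/2.
SPAN_V = 20.0                 # -10 V to +10 V
N_BITS = 12

lsb_v = SPAN_V / 2**N_BITS    # ~4.883 mV per code
q_error_v = lsb_v / 2         # +/- ~2.441 mV

print(f"LSB = {lsb_v * 1e3:.3f} mV, quantization error = +/-{q_error_v * 1e3:.3f} mV")
```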

    Does this help? 'Minimise the quantisation error' means the same as 'maximise the signal at the ADC without overloading it'. – Neil_UK Aug 12 '22 at 05:27
  • Another approach, as there are only four gains, is for each gain value, calculate the signal at the ADC, and the ADC quantisation error as a percentage of that signal. First things first though, are you aware of how to compute the quantisation error of an ideal 12-bit ADC? – Neil_UK Aug 12 '22 at 05:49
  • Thanks @Neil_UK ... what I can see is the resolution of the ADC = 20/2^12 = 4.88 mV. So now, how do I correlate the gain, the output of the transducer, and the ADC resolution? – LearnerABC Aug 12 '22 at 13:27
  • With a gain of 1, what is the transducer's full-scale range at the ADC? What is the quantisation error as a fraction of that? With a gain of 10, ...? With a gain of 100, ...? With a gain of 500, ...? – Neil_UK Aug 12 '22 at 20:24 (a sketch of this enumeration follows below)
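
Following the enumeration Neil_UK suggests, here is a minimal sketch over the four gains (my own variable names; it assumes the error is taken as ± half an LSB and is expressed relative to the amplified full-scale signal at the ADC):

```python
FULL_SCALE_V = 20.0           # ADC span: -10 V to +10 V
N_BITS = 12
Q_ERROR_V = FULL_SCALE_V / 2**N_BITS / 2   # +/- half an LSB, ~2.441 mV

V_TRANSDUCER = 7.5e-3         # maximum transducer output, 7.5 mV

for gain in (1, 10, 100, 500):
    v_adc = gain * V_TRANSDUCER            # full-scale signal seen by the ADC
    if v_adc > FULL_SCALE_V / 2:           # would exceed the +/-10 V range
        print(f"gain {gain}: overloads the ADC")
        continue
    pct = 100 * Q_ERROR_V / v_adc
    print(f"gain {gain}: {v_adc * 1e3:.1f} mV at ADC -> error {pct:.3f} % of signal")
```

With these assumptions, none of the four gains overloads the ADC (500 × 7.5 mV = 3.75 V), so the largest gain, 500, minimizes the relative quantization error at roughly 0.065 %.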

0 Answers