
I want to sample an AC signal in the range of 5 kHz - 20 kHz with the STM32L432KC microcontroller. This microcontroller has a hardware oversampling feature, i.e. it can take the average of several ADC samples without CPU overhead, increasing the effective number of bits.

The ADC runs at full speed (5.33 MSPS) when in oversampling mode. With an oversampling ratio of x64 I should get 15 bits of resolution at 83 kS/s, although the ADC is only 12 bits.

My questions about designing the ADC driver are:

  • Should I design the ADC driver so that it can drive the ADC at 5.33 MS/s although my signal is only 20 kHz? I ask because the ADC is a SAR type and the internal sampling capacitor needs to be charged quickly enough.

  • Should I design the ADC driver for 15 bits although the ADC only has 12 bits?

EDIT:

I attached a snippet from the datasheet plus a schematic of an ADC driver. My main question is how fast the ADC driver needs to be, given that the signal changes much more slowly than the ADC samples.

[Figure: ADC input impedance (datasheet snippet)]

[Figure: ADC driver schematic]

sled
  • Your driver needs to be able to drive the load the ADC represents at 5.33 MS/s. But the analog bandwidth doesn't have to be designed for 5.33 MS/s. – Arsenal Aug 22 '17 at 10:59
  • At 5.33 MSPS, the sampling time is ~31 ns. This means the driver needs to be able to charge the internal capacitor from 0 V to Vref within 31 ns (worst case)? – sled Aug 22 '17 at 11:05
  • Look at table 62 in the datasheet. Input impedance should be less than 50 kohm, input capacitance is typically 5 pF. Just make sure your driver can drive 5 pF without significant signal loss and you're done. – Bimpelrekkie Aug 22 '17 at 11:06
  • What internal capacitor ? Why does the input need to charge it ? Going from 0 V to Vref in one sample is a nonsense requirement. Why make up such requirements when the datasheet is clear what the input impedance is ? – Bimpelrekkie Aug 22 '17 at 11:10
  • How would you design the driver for 15 instead of 12 bits ? The only thing I can think of is noise and perhaps bandwidth (for some anti-aliasing). – Bimpelrekkie Aug 22 '17 at 11:11
  • @Bimpelrekkie You write "input capacitance is typically 5 pF" and then you scold the user for asking about the internal capacitor. – pipe Aug 22 '17 at 11:12
  • @Bimpelrekkie that 5 pF **is** indeed the sampling capacitor. At least the table description says: \$C_{ADC}\$ Internal sample and hold capacitor. – Arsenal Aug 22 '17 at 11:21
  • Yep noticed that now, I was expecting they'd include a buffer there. Still charging and discharging it in one sample is not needed. Just follow the R_Ain values from the table. – Bimpelrekkie Aug 22 '17 at 11:22
  • @Bimpelrekkie There's also the issue of kick-back when the switch to the capacitor is closed. Settling time of the driver affects the bits. My question is just how fast the driver needs to react, since the signal changes much slower than the ADC is sampling. – sled Aug 22 '17 at 11:26
  • Right, but I think that is why the datasheet mentions what maximum R_Ain you should use. I'd make the buffer with an opamp as a voltage follower and then feed the signal to the ADC input. Then the output impedance of the buffer should be less than the worst case R_Ain (which is 100 ohms). I would not choose a fast buffer based on the kick-back etc. but base it on the bandwidth of the input signal. In your case that is audio so 100 kHz BW is more than enough. Note that the **charge** on the sampling cap. matters, not if there's kick-back or not. – Bimpelrekkie Aug 22 '17 at 11:45
  • If this is a true average, it will have a nasty frequency response, so expect higher-frequency components of the signal to be distorted. – Simon Richter Aug 22 '17 at 11:55
  • I agree with @SimonRichter if it just averages then the upper frequency response may be affected by a dB or so and this pales all other arguments into insignificance. – Andy aka Aug 22 '17 at 12:14
  • For reliable results the capacitor must be charged faster than those minimum calculations suggest. IMO you should consider whether you really need the 15 bits of resolution and whether your input circuit is precise enough for it. In most designs I see that people care about the resolution, but not about the quality of the signal provided. And eventually they end up with a practical (real) 5-8 bits of resolution. It is the same as a very high resolution digital camera with low quality lenses. – 0___________ Aug 22 '17 at 13:04
  • Using averaging techniques does not really yield a great improvement in resolution in practice: See https://www.microsemi.com/document-portal/doc_view/131569-improving-adc-results-white-paper – Peter Smith Aug 23 '17 at 09:53
  • @PeterSmith Thanks for the link, the article calls it decimation, the STM32L4 sums up multiple conversions and then truncates by right shifting. – sled Aug 23 '17 at 12:47

1 Answer


You design the buffer amplifier to fit your signal, not the sampling rate of the ADC. But the buffer's output impedance has to meet the ADC's input impedance requirement at the sampling frequency.

As a side note: oversampling will reduce noise (of which the STM32 ADCs have a lot) but not the non-linearities of the ADC. You will still get just 12-bit resolution, but with less noise.

Attila Kinali
  • "You will still get just 12-bit resolution" no, you get the 15 bits. Even if the output is only accurate to 12 or fewer bits due to ADC nonlinearities, it will have a *resolution* of 15 bits. – jms Dec 05 '17 at 09:43
  • You've said to design the amplifier just for the signal, but I think the op-amp's settling time can also contribute to ADC performance. – mohammadsdtmnd Sep 16 '21 at 07:04