2

When a data acquisition system (DAQ) digitizer or analog-to-digital converter (ADC) samples a continuous signal, how can one estimate the true measurement duration of the sampling action? By true measurement duration I mean the window of time during which information is being collected from the continuous signal under test.

Given a sampling rate R and small signal bandwidth SSB, I can imagine two extremes:

  1. 1/R, the total time between samples
  2. 1/SSB, the Fourier transform-limited duration for a delta-function-like measurement

My research suggests that ADCs use a sample-and-hold circuit. Presumably the 'hold' time is what I would like, but I don't see this typically reported in datasheets.

For an ADC with a user-chosen sampling rate up to some maximum, is the 'hold' duration typically a constant, or a fixed fraction of 1/R (duty-cycle-like)? What are typical values of the 'hold' duration for an ADC, given a specified sampling rate?

As a practical case, the NI DAQ USB-6361 has a variable sampling rate from "no minimum" to 2 MS/s and a small signal bandwidth of 1.7 MHz. If I set the sampling rate to be R = 100 kS/s, then 1/R = 10 microseconds, but the duration corresponding to the analog input channel bandwidth is 1/1.7 MHz = 0.59 microseconds. Would each sampling action be effectively integrating over 10 microseconds or 0.59 microseconds of the analog signal being measured?
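For reference, here is that arithmetic as a small Python sketch (the numbers are just the USB-6361 figures quoted above):

```python
# Candidate "measurement durations" for the NI USB-6361 example above.
R = 100e3        # chosen sampling rate, S/s
SSB = 1.7e6      # small signal (analog input) bandwidth, Hz

sample_period = 1 / R      # 1e-5 s  = 10 us   (time between samples)
bandwidth_limit = 1 / SSB  # ~5.9e-7 s ~ 0.59 us (bandwidth-limited duration)

print(f"1/R   = {sample_period * 1e6:.2f} us")
print(f"1/SSB = {bandwidth_limit * 1e6:.2f} us")
```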

JRE
  • 67,678
  • 8
  • 104
  • 179
higgy
  • I can certainly be wrong, but your question feels like it's an intermediate step -- a side road that you've run down on the way to answering your *real* question. If this is the case, you might get a better answer by telling us what the real question is. – Scott Seidman Jan 23 '23 at 18:58
  • FWIW, the Data Conversion Handbook from Analog Devices is a great reference for this sort of stuff: https://www.analog.com/en/education/education-library/data-conversion-handbook.html – Scott Seidman Jan 23 '23 at 19:04
  • This is typically described in the ADC datasheet, although maybe not for some cheap DAQ product where you don't know the actual ADC. You could probably estimate it as roughly 0.35 times 1/(3 dB bandwidth). – user1850479 Jan 23 '23 at 19:11
  • @ScottSeidman Sure, though I'm also interested on a fundamental level. The application that spawned this involves a pulsed signal-of-interest on a continuous noisy background, where integrating more background than necessary needs to be minimized. – higgy Jan 23 '23 at 19:17
  • Using a multichannel DAQ, it seems that the most important parameter in that case is the "settling time for Multichannel Measurements", which can range from 1.5 us (±1 LSB for Full-Scale Step, 15 ppm) to 8 us (range dependent). – Antonio51 Jan 23 '23 at 19:56

2 Answers

4

By true measurement duration I mean the window of time during which information is being collected from the continuous signal under test.

Since there is only one value per sample, the extremes are:

  1. The average over the entire measurement window.
  2. An instantaneous value at the end of the window.

Most data acquisition systems fall somewhere between those two, unless they were designed to prevent aliasing across the entire sampling frequency range.

There's also a question of whether the measurement window is equal to the sampling period or shorter than it.
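A minimal sketch of the difference between those two extremes, assuming a made-up DC level buried in wideband noise (the numbers are illustrative only):

```python
import numpy as np

# Crude models of the two extremes for a DC level plus wideband noise.
fs_fine = 10e6                  # fine "continuous" simulation grid, 10 MS/s
R = 100e3                       # output sample rate, 100 kS/s
n = int(fs_fine / R)            # fine points per output sample (100 here)

rng = np.random.default_rng(0)
x = 0.5 + 0.1 * rng.standard_normal(n * 1000)   # DC level + wideband noise

frames = x.reshape(-1, n)
averaged = frames.mean(axis=1)      # extreme 1: average over the whole window
instantaneous = frames[:, -1]       # extreme 2: value at the end of the window

print(np.std(averaged), np.std(instantaneous))
# ~0.01 vs ~0.1: averaging over the window suppresses wideband noise by ~sqrt(n)
```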

To prevent aliasing independently of the chosen output rate, you either need tunable front-end filters, e.g. switched-capacitor filters, or the filtering has to be done after the converter, in the digital domain. The latter is the cheaper and better-performing approach for general-purpose data acquisition. The input signal is heavily oversampled, e.g. with a sigma-delta converter, then digitally filtered and decimated to the desired output data rate. Since the output samples are synthetic anyway, in the sense that filters and decimators are involved, the output data rate doesn't matter much; it's just a matter of setting the filter coefficients and update rate correctly.
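To illustrate the idea, here is a generic filter-and-decimate sketch with made-up rates (not the actual signal chain of any particular DAQ):

```python
import numpy as np
from scipy import signal

# The converter always runs fast; the user-visible rate is produced by
# lowpass filtering and decimating in the digital domain.
fs_adc = 2e6             # converter rate, 2 MS/s (assumed)
R = 100e3                # requested output rate, 100 kS/s
q = int(fs_adc / R)      # decimation factor (20 here)

t = np.arange(0, 10e-3, 1 / fs_adc)
x = np.sin(2 * np.pi * 5e3 * t) + 0.05 * np.random.randn(t.size)

# Lowpass (anti-alias) filter to below R/2, then keep every q-th sample.
y = signal.decimate(x, q, ftype="fir", zero_phase=True)

# Changing R just changes q and the filter; the converter rate stays fixed.
print(x.size, y.size)
```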

Be mindful, though, that the output sample rate is not intrinsically tied to the actual rate the ADC operates at. In advanced data acquisition systems, the ADC runs at its maximum sampling frequency, synchronized to a quartz oscillator, and the output rate is synthesized numerically on an FPGA, MCU, or APU. It can be referenced to an external clock source that drives the internal numerical timebase, with performance, in terms of phase noise, mostly limited by the internal clock source.

Unfortunately, it seems that $2k DAQ devices are not great in that respect, and to get that sort of performance the price tag goes up significantly - unless you're designing your own product, where you can get an awful lot of performance for $500 worth of A/D and signal chain.

As a practical case, the NI DAQ USB-6361 has a variable sampling rate from "no minimum" to 2 MS/s and a small signal bandwidth of 1.7 MHz

Unless they explicitly state that the sampling is alias-free, you must assume that it's the typical "low effort" option where the ADC rate is the same as the output sampling rate. This comes with serious performance compromises if you run the ADC at any rate that's not the maximum - and even then, the small signal bandwidth is too large for the sampling rate.

It's one of those dubious designs that are really hard to use to their full potential. They are almost universally pretty noisy and EMI-sensitive: if you are sampling at, say, 100 Hz, the whole analog bandwidth's worth of junk is aliased into the 50 Hz output bandwidth. For that particular device, the noise specs are for full bandwidth, and the overall performance is hardly impressive at all.
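A quick way to see that folding (generic numbers, not a model of this particular device): an interferer far above the output Nyquist frequency shows up at a low apparent frequency when the stream is simply sampled slowly with no matching analog filter.

```python
import numpy as np

# An out-of-band interferer aliases to a low frequency when the ADC rate is
# slowed down without changing the analog bandwidth.
fs_out = 100.0                 # output sampling rate, 100 S/s
f_emi = 50_030.0               # interferer well above fs_out/2 (e.g. EMI pickup)

n = np.arange(2_000)
samples = np.sin(2 * np.pi * f_emi * n / fs_out)

spectrum = np.abs(np.fft.rfft(samples))
f_apparent = np.fft.rfftfreq(n.size, 1 / fs_out)[np.argmax(spectrum)]
print(f_apparent)   # ~30 Hz: the 50.03 kHz interferer lands inside the 0-50 Hz band
```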

Also, the analog inputs look like unbuffered MUX inputs. Driving those inputs is not trivial, since they inject short, high-bandwidth charge pulses onto the signal lines.

  • Accepting because most complete, but look to answer by SteveSh for the technical term for 'hold time' useful for interpreting spec sheets. – higgy Jan 23 '23 at 21:52
  • I don't understand the part about anti-aliasing filtering -- which can never be done post-sampling. – Scott Seidman Feb 01 '23 at 18:29
3

I think what you want is the aperture time of the ADC. This is the amount of time during which the sampling window is actually open. It is only loosely related to the sampling, or conversion, rate.

Many ADCs use a sample-and-hold technique, in which the input voltage is applied to a holding capacitor. See the picture below, from the Analog Devices document MT-007, "Aperture Time, Aperture Jitter, Aperture Delay Time".

[Figure: sample-and-hold circuit, from Analog Devices MT-007]

The digital value that the ADC outputs is the digitized value of the voltage that's on the holding capacitor when the sampling switch opens. The voltage on Chold is, to a first-order approximation, the average of the input voltage over the sampling window.
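As a rough first-order model (component values assumed for illustration, not taken from MT-007): during the track phase the switch on-resistance and Chold form an RC low-pass, so the held value is effectively a weighted average of the input over the window, with the last few RC time constants weighted most heavily.

```python
import numpy as np

# First-order model of the acquisition (track) phase of a sample-and-hold:
# the switch on-resistance and the hold capacitor form an RC low-pass.
R_on = 500.0                   # switch on-resistance, ohms (assumed)
C_hold = 10e-12                # hold capacitance, farads (assumed)
tau = R_on * C_hold            # 5 ns time constant

t_track = 100e-9               # time the sampling switch is closed (assumed)
dt = 0.1e-9
t = np.arange(0.0, t_track, dt)
v_in = t / t_track             # input ramping from 0 V to 1 V during the track time

v_c = 0.0
for v in v_in:                 # Euler integration of dVc/dt = (Vin - Vc) / tau
    v_c += (v - v_c) * dt / tau

print(v_c, v_in[-1], v_in.mean())
# The held value ends up near the final input value (lagging by about tau * slope),
# i.e. the hold capacitor weights the most recent part of the window most heavily.
```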

SteveSh