3

I have researched and I understand why there are no 32-bit analog-to-digital converters: basically, once you go beyond about 24 bits, the converter's resolution drops below the thermal noise generated by the random vibration of the atoms themselves. With that in mind, I predict that the same or similar would be true for software defined radios.

Question: There are 16 bit ADCs for software defined radios. Why aren't there 24 bit ones?

-- Thanks in Advance--

  • 3
    Each bit gives you 6.02 dB of dynamic range. 24 * 6.02 = 144dB. – Kaz Apr 18 '13 at 22:46
  • There are. However they are narrowband, using audio converters to sample a 0 or low IF. Such 24 bit converters are not entirely linear to the LSB, but then neither are the hundred megasample per second 16 bit variety. – Chris Stratton Apr 19 '13 at 04:05
  • The inherent noise floor of FM is considered to be -80dB in mono and -63dB in stereo. That's 14 and 11 bits respectively. – user207421 Apr 19 '13 at 23:50
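The per-bit arithmetic in these comments can be sanity-checked with a short sketch (assuming the usual rule that each bit of an ideal ADC is worth about 6.02 dB of dynamic range):

```python
# Sketch of the arithmetic in the comments above: each ADC bit is worth
# about 6.02 dB of dynamic range, so dB figures convert to bit counts.
import math

DB_PER_BIT = 20 * math.log10(2)  # ~6.02 dB per bit

def bits_needed(dynamic_range_db):
    """Smallest whole number of bits covering the given dynamic range."""
    return math.ceil(dynamic_range_db / DB_PER_BIT)

print(bits_needed(80))         # FM mono noise floor of -80 dB -> 14 bits
print(bits_needed(63))         # FM stereo, -63 dB -> 11 bits
print(round(24 * DB_PER_BIT))  # 24 bits -> ~144 dB
```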

4 Answers

5

Take a look at the equation for minimum detectable signal here:

http://en.wikipedia.org/wiki/Minimum_detectable_signal

Calculating the noise in a 20 MHz bandwidth shows that it will be around 73 dB above the 1 Hz thermal noise density, meaning that the noise floor in 20 MHz will be around -101 dBm (closer to -96 dBm once a practical noise figure is included). Thus, the wider the bandwidth, the lower the usable dynamic range. 24 bits is only useful in bandwidths around 1 kHz.
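The kTB arithmetic behind those numbers can be sketched as follows (assuming room temperature, i.e. the usual -174 dBm/Hz thermal noise density from the linked MDS page):

```python
# Thermal noise floor vs. sampled bandwidth, per the MDS formula above.
# Assumes ~290 K, giving the standard -174 dBm/Hz noise density.
import math

NOISE_DENSITY_DBM_HZ = -174.0  # kT at room temperature

def noise_floor_dbm(bandwidth_hz):
    """Integrated thermal noise power in the given bandwidth."""
    return NOISE_DENSITY_DBM_HZ + 10 * math.log10(bandwidth_hz)

print(noise_floor_dbm(20e6))  # 20 MHz -> about -101 dBm
print(noise_floor_dbm(1e3))   # 1 kHz  -> about -144 dBm (24-bit territory)
```

Note that the 1 kHz figure, about -144 dBm, is what makes 24 bits (roughly 144 dB of range) plausible only in narrow bandwidths.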

  • 1
    I would think a major limiting factor would be the presence of other radio-frequency sources. If one were near Chicago IL trying to pick up 1020 kHz (KDKA Pittsburgh PA), I would think the much stronger 1000 kHz signal from WMVP would represent so much noise that I'm not sure one would get the full benefit from even a 16-bit ADC. – supercat Apr 19 '13 at 15:40
  • 3
    @supercat - on the contrary, the presence of strong interfering signals in the ADC input passband is why you would want a high-resolution ADC - you need a very large instantaneous dynamic range so that the signal of interest (which has only the noise integrated over its narrow bandwidth) can be recovered after the strong interfering adjacent signals are removed by digital filtering. However, for a narrowband result there is some comparability between a slow high resolution ADC or a faster moderate resolution one. – Chris Stratton Apr 19 '13 at 17:31
  • So higher frequency = more noise? –  Apr 20 '13 at 00:05
  • The larger the sampled bandwidth, the more noise it will contain. This figure is set by temperature, so cooling your equipment is the only way to increase the dynamic range. At 40 MSps or more, the design should use 16 bits or less. Below 20 MSps, at least some of the extra dynamic range provided by 24 bits could be used, but we are still only talking a theoretical -111 dBm in 2 MHz (4 MSps). – RadioGuy999 Apr 20 '13 at 12:39
  • The "should use 16 bit or less" makes the unjustified assumption that the noise is going to be integrated over the entire ADC bandwidth - quite often that is not the case, since the channel of interest occupies a narrower bandwidth digitally selected after the ADC. However, such selection can also effectively extend the bit width some - so while a higher resolution ADC is not useless, it may not be necessary either. – Chris Stratton Apr 28 '13 at 16:10
1

The figure of merit is not number of bits, but effective number of bits times bandwidth.

20-ish bits is about as good as it gets in a 20 kHz bandwidth (there are no real 24-bit converters in that bandwidth), and by the time you are looking at a 60 MHz bandwidth you might, on a good day, manage 13 bits or so.

But there is a trick to all this: if the ADC is correctly dithered (and the RF ones usually have more than sufficient uncorrelated noise at the input to do this), then every time you halve the bandwidth in your digital filters and decimators you lower the total noise power in the remaining bandwidth by 3 dB, so dropping the bandwidth by a factor of four gains you an extra bit.

Consider a 14-ENOB ADC at 125 Ms/s: by the time you have decimated down to ~8 kHz you have gained 7 bits, giving you 21 effective bits into your post-processing in a reasonable SSB bandwidth, or about 20.5 bits in a 16 kHz bandwidth.
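The decimation arithmetic above can be sketched as a half-bit-per-halving rule (assuming a properly dithered ADC, as stated):

```python
# Processing gain from decimation: each halving of bandwidth buys 3 dB,
# i.e. half a bit, so bits gained = 0.5 * log2(decimation ratio).
import math

def effective_bits(enob, fs, bw):
    """ENOB after decimating a dithered ADC from rate fs down to bandwidth bw."""
    decimation = fs / bw
    return enob + 0.5 * math.log2(decimation)

print(effective_bits(14, 125e6, 8e3))   # ~21 bits in an SSB bandwidth
print(effective_bits(14, 125e6, 16e3))  # ~20.5 bits in a 16 kHz bandwidth
```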

FM does of course suffer from rather large channel bandwidths, nature of the beast.

Dan Mills
0

I think that the minimum detectable signal formula should be applied to the demodulator block of an SDR, not to the whole sampled bandwidth.

I have seen tests where an 8-bit SDR can see signals down to -130 dBm, but with the MDS formula the limit would be -110 dBm, even without counting the noise figure. See https://www.youtube.com/watch?v=sDLl7Mpa08I

If you raise the FFT precision, you actually lower the noise floor. For example, with an rtl-sdr sampling at 2.4 MSPS, an FFT of size 512 gives you a 4.7 kHz resolution bandwidth; with an FFT size of 8192 you have a resolution bandwidth of 293 Hz. With a better resolution bandwidth your signal pops out from the noise floor. This means that the MDS formula cannot be applied over the full bandwidth of the SDR unless you actually have a signal occupying that full bandwidth.
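Those resolution-bandwidth figures follow directly from RBW = sample rate / FFT size, and the per-bin noise floor drops by ~3 dB each time the RBW is halved:

```python
# FFT bin width (resolution bandwidth) for the rtl-sdr figures above.
import math

def rbw_hz(sample_rate, fft_size):
    """Width of one FFT bin in Hz."""
    return sample_rate / fft_size

print(rbw_hz(2.4e6, 512))   # 4687.5 Hz, i.e. ~4.7 kHz
print(rbw_hz(2.4e6, 8192))  # ~293 Hz

# Per-bin noise-floor improvement going from 512 to 8192 bins:
print(10 * math.log10(8192 / 512))  # ~12 dB
```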

In the end, I think a 24-bit ADC would be useful (even more so on VHF and UHF, where man-made noise is lower), but the linearity of actual audio ADCs can be a problem.

Luigi
-1

Well, for one, because there aren't any 24-bit ADCs that digitize fast enough. And what advantage would 24 bits of data provide? Keep in mind that it bumps up your filtering/signal-processing overhead by 50%. 16 bits gives you ~96 dB of dynamic range, which is already significantly more than an analog front end will be able to provide. (Thanks Kaz for the quick calculation.)
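A quick sanity check on those dynamic-range numbers, using the 6.02 dB/bit rule from the comments plus the standard 1.76 dB quantization term for a full-scale sine:

```python
# Ideal quantization SNR for an N-bit ADC driven by a full-scale sine:
# SNR = 6.02 * N + 1.76 dB.
def ideal_snr_db(bits):
    return 6.02 * bits + 1.76

print(round(ideal_snr_db(16)))  # ~98 dB for 16 bits
print(round(ideal_snr_db(24)))  # ~146 dB for 24 bits
```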

rfdave