
I know that, in a telecommunication system, the AWGN power after the receiver filter is $$ P_z = KTB $$ where $B$ is the bandwidth of the filter, $T$ is the operating temperature of the receiver, and $K$ is Boltzmann's constant.
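For concreteness, here is a quick numeric sketch of that formula (my own illustration; the bandwidths and the common 290 K reference temperature are assumed values, not from the question):

```python
import math

# Sketch: thermal noise power P_z = K*T*B for a few filter bandwidths.
# T = 290 K is just the usual reference temperature (an assumption here).
K = 1.380649e-23  # Boltzmann's constant, J/K
T = 290.0         # receiver operating temperature, K

for B in (1e3, 1e6, 20e6):               # 1 kHz, 1 MHz, 20 MHz
    P = K * T * B                        # noise power, W
    P_dBm = 10 * math.log10(P / 1e-3)    # same power in dBm
    print(f"B = {B:>10.0f} Hz  ->  P = {P:.3e} W  ({P_dBm:6.1f} dBm)")
```

The noise power grows in direct proportion to the bandwidth; at 1 MHz it comes out to roughly −114 dBm.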

One of my professors has used this fact to show that a bigger bandwidth implies a worse SNR in the received signal, but I'm still confused about what implications this has. As far as I know, in a digital communication system, when a matched filter is used at the receiver, under AWGN conditions the received symbol distribution will be

$$ b \mid A = a_i \sim N\!\left(a_i, \frac{KT}{2}\right) $$

where $b$ is the received symbol, $A$ is the random variable for the sent symbols, and $a_i$ is a particular realization of $A$. Looking at this distribution, it is the variance of the received symbol, $\frac{KT}{2}$, that affects the probability of symbol error, and it is independent of the bandwidth of the receiver's matched filter. For that reason, I would like to ask: is there any reason why a bigger bandwidth could affect the BER in a digital communication system?

rmac
  • Question, is the SNR your prof is talking about measured through a matched filter? Are oranges and apples being compared? – Neil_UK Nov 18 '20 at 18:07
  • At first, I thought he was talking about the SNR of the analog signal before it goes through the matched filter, but he was talking about it as if a bigger bandwidth were worse for the error performance (which would mean the SNR after the matched filter gets worse when the bandwidth is increased) – rmac Nov 18 '20 at 18:16
  • The SNR that *matters* is that between the desired signal, and the noise that cannot be distinguished from the desired signal. In a narrowband receiver, that would be the noise that gets through the filter. In a receiver looking for some more complex coding scheme, it would be the noise that is sufficiently signal-like to leak through that detection. Arguably depending on how a receiver is constructed, you could even have an SNR applicable to signal (preamble) _discovery_ and a distinct SNR applicable to signal _reception_. – Chris Stratton Nov 18 '20 at 19:49

2 Answers

1

Does the bandwidth at the receiver really affect the noise in digital modulations?

Yes; the noise is white, i.e. it has constant power spectral density, and cutting out a smaller part of that spectrum means less power (by the definition of density!).

What you forget to mention in context is that the error-probability formula is normalized to the symbol rate, and thus to the bandwidth, of your system; that means that while increasing the bandwidth does increase the observed noise power, the amount of noise energy per symbol is unaffected.
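A quick Monte-Carlo sketch of this point (my own illustration, not from the answer; the rectangular pulse, BPSK, and all parameter values are assumptions): widening the simulated bandwidth raises the noise power seen before the filter, but the matched-filter decision SNR, and hence the BER, stays put.

```python
import numpy as np

# BPSK over AWGN with fixed one-sided noise PSD N0. The number of
# samples per symbol N plays the role of the simulated bandwidth.
rng = np.random.default_rng(0)
Es, N0 = 1.0, 0.5          # symbol energy and noise PSD (assumed values)
n_sym = 20000

for N in (4, 8, 16):                            # bandwidth ~ N
    dt = 1.0 / N
    bits = rng.integers(0, 2, n_sym)
    sym = 2.0 * bits - 1.0                      # +-1 symbols
    pulse = np.sqrt(Es) * np.ones(N)            # rectangular pulse
    tx = np.repeat(sym, N) * np.sqrt(Es)
    noise = rng.normal(0.0, np.sqrt(N0 / (2 * dt)), tx.size)
    rx = tx + noise
    # matched filter: integrate each symbol interval against the pulse
    y = (rx.reshape(n_sym, N) * pulse).sum(axis=1) * dt
    noise_power = noise.var()                   # grows with N ...
    ber = np.mean((y > 0) != (bits == 1))       # ... BER does not
    print(f"N={N:2d}  pre-filter noise power={noise_power:6.2f}  BER={ber:.4f}")
```

The pre-filter noise power scales linearly with N, while the BER stays near the theoretical value for this Es/N0 in every row.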

Marcus Müller
  • Thank you! So, as the energy per symbol is what really matters for the probability of symbol error, increasing the bandwidth will not affect the probability of symbol error right? – rmac Nov 18 '20 at 18:29
  • Increasing the bandwidth lets more noise in @raul but doesn't do anything to improve the symbol energy. – Andy aka Nov 18 '20 at 19:04
0

Yes, absolutely. "Optimal receiver design" theory calls for a filter that matches the signal's bandwidth, with flat group delay (i.e. linear phase shift) across the occupied spectrum.

In many cases the transmit spectrum is pre-conditioned in the time or frequency domain to compensate for situations where the noise amplitude increases with frequency rather than being flat (AWGN), or where the noise is pink (−3 dB/octave) or brown (−6 dB/octave). So the optimal bandwidth shape depends on the noise spectrum.

Such methods are used in HDD magnetic recording, radio pre-emphasis curves, and RIAA equalization of audio recordings on vinyl. Phase correction is also done to compensate for the medium's phase distortion in wideband and orthogonal modulation.

But generally for modems, be they 56k or giga-bps wired types, the spectrum is decimated into many small sub-bands so that the group delay and channel slope within each band are relatively flat and narrow. DSPs then use optimal filters, trained with a known set of signals and feedback messages, to maximize the SNR and flatten the amplitude and group delay (GD) responses, tweaking the matched bandwidth for flatness and minimal phase distortion. In radio designs the passive filters are chosen for the same property, but sometimes the bandwidth must be extended as a compromise to flatten the GD in the signal and avoid distortion.
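The training idea above can be sketched in a few lines (my own toy illustration; the 3-tap channel, step size, and equalizer length are all assumptions): an LMS adaptive FIR equalizer learns from a known training sequence to undo a mildly dispersive channel, which is the "flattening" step described.

```python
import numpy as np

# Toy LMS training loop: adapt an FIR equalizer against a known
# training sequence sent through an assumed dispersive channel.
rng = np.random.default_rng(1)
channel = np.array([1.0, 0.4, 0.2])          # assumed channel impulse response
train = rng.choice([-1.0, 1.0], 5000)        # known training symbols
rx = np.convolve(train, channel)[:train.size]

n_taps, mu, delay = 11, 0.01, 5              # equalizer length, step, lag
w = np.zeros(n_taps)
sq_err = []
for k in range(n_taps, train.size):
    x = rx[k - n_taps:k][::-1]               # regressor, newest sample first
    e = train[k - delay] - w @ x             # error vs delayed training symbol
    w += mu * e * x                          # LMS update
    sq_err.append(e * e)

print(f"MSE over first 200 updates: {np.mean(sq_err[:200]):.3f}")
print(f"MSE over last  200 updates: {np.mean(sq_err[-200:]):.5f}")
```

The squared error drops by orders of magnitude once the taps converge, i.e. the cascade of channel and equalizer is now close to a flat, delayed identity.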

In binary data communication, the phase shift and jitter determine the BER per symbol pattern. To optimize this, the best filter is often one with a raised-cosine amplitude response, such that the bandwidth is limited but the ringing is gentle (damped) and crosses zero at the data sampling instants. This is the ideal case, unlike Chebyshev, Butterworth, or Cauer filters, or even 1st-order RC filters. The shape of the filter can be used to predict the BER or energy-per-symbol performance, yet the spectrum of each symbol changes with bit patterns like 001100 vs 0110110110 vs 010101.
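The zero-crossing property of the raised cosine can be checked numerically (my own sketch; the roll-off factor and sampling values are assumptions):

```python
import numpy as np

# Raised-cosine pulse: band-limited, the ringing is damped, and the
# pulse is exactly zero at every nonzero symbol instant, so adjacent
# symbols do not interfere (Nyquist ISI criterion).
beta, sps, span = 0.35, 8, 8            # roll-off, samples/symbol, half-span
t = np.arange(-span * sps, span * sps + 1) / sps   # time in symbol periods

def raised_cosine(t, beta):
    """h(t) = sinc(t) cos(pi beta t) / (1 - (2 beta t)^2), with the
    removable singularity at |t| = 1/(2 beta) filled by its limit."""
    h = np.empty_like(t)
    sing = np.isclose(np.abs(t), 1.0 / (2.0 * beta))
    h[sing] = (np.pi / 4.0) * np.sinc(1.0 / (2.0 * beta))
    ts = t[~sing]
    h[~sing] = (np.sinc(ts) * np.cos(np.pi * beta * ts)
                / (1.0 - (2.0 * beta * ts) ** 2))
    return h

h = raised_cosine(t, beta)
at_symbols = h[::sps]            # samples at integer symbol offsets
print(np.round(at_symbols, 6))   # 1 at t = 0, ~0 at all other symbol times
```

By contrast, a Butterworth or RC filter rings through the neighboring symbol instants with nonzero amplitude, which is exactly the ISI the answer is warning about.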

Tony Stewart EE75