I know that, in a telecommunication system, the AWGN power after going through the receiver filter is $$ P_z = KTB $$ where B is the bandwidth of the filter, T is the operating temperature of the receiver, and K is Boltzmann's constant.
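To put numbers on this, here is a quick sanity check of $P_z = KTB$ (the room-temperature value T = 290 K is my own assumption, just for illustration):

```python
# Thermal noise power P_z = K*T*B after the receiver filter.
K = 1.380649e-23  # Boltzmann's constant, J/K
T = 290.0         # assumed operating temperature, K (room temperature)

for B in (1e3, 1e6, 1e9):  # filter bandwidths in Hz
    P_z = K * T * B
    print(f"B = {B:.0e} Hz -> P_z = {P_z:.3e} W")
```

The noise power scales linearly with B, which is why a wider filter admits more noise.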
One of my professors used this fact to show that a bigger bandwidth implies a worse SNR in the received signal, but I'm still confused about its implications. As far as I know, in a digital communication system, when a matched filter is used at the receiver, under AWGN conditions the received symbol distribution will be
$$ b \mid A = a_i \sim \mathcal{N}\!\left(a_i, \frac{KT}{2}\right) $$
where $b$ is the received symbol, $A$ is the random variable for the transmitted symbols, and $a_i$ is a particular realization of $A$. Looking at this distribution, it is the variance of the received symbol, $\frac{KT}{2}$, that determines the probability of symbol error, and that variance is independent of the bandwidth of the receiver's matched filter. For that reason, I would like to ask: is there any reason why a bigger bandwidth could affect the BER in a digital communication system?
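To convince myself of this, I ran a small BPSK sketch where the matched-filter output noise has variance $N_0/2$ (with $N_0$ playing the role of $KT$ above); the simulated BER matches $Q\!\left(\sqrt{2E_b/N_0}\right)$ with no bandwidth term appearing anywhere (the amplitude, $N_0$, symbol count, and seed are arbitrary values I picked):

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
A = 1.0        # symbol amplitude, so a_i = +A or -A
N0 = 0.5       # noise PSD; matched-filter output noise variance is N0/2
n_sym = 200_000

bits = rng.integers(0, 2, n_sym)
symbols = A * (2 * bits - 1)                   # BPSK mapping: 0 -> -A, 1 -> +A
noise = rng.normal(0.0, sqrt(N0 / 2), n_sym)   # b | A=a_i ~ N(a_i, N0/2)
b = symbols + noise

ber_sim = np.mean((b > 0).astype(int) != bits)
# Q(sqrt(2*Eb/N0)) with Eb = A^2, written via erfc: Q(x) = 0.5*erfc(x/sqrt(2))
ber_theory = 0.5 * erfc(A / sqrt(N0))
print(f"simulated BER = {ber_sim:.4f}, theoretical BER = {ber_theory:.4f}")
```

The two numbers agree closely, which is exactly what makes me wonder where the bandwidth dependence from $P_z = KTB$ enters the picture.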