What is the proper procedure to measure RMS noise with an oscilloscope?
I have two standard resistors, 1k and 100k, and a 1.5 GHz BW, 5 GSa/s Keysight DSO. I have a probe which offers 1:1 (1 MOhm // 100 pF) and 10:1 (10 MOhm // 15 pF) ratios.
I use AC coupling, the 1:1 probe, and the "AC RMS - Full Scale" measurement, with 1 mV/div and 100 ms/div.
I measure 340 uVrms for 1k and 940 uVrms for 100k (just clipping the resistor between the probe tip and its ground lead).
However, I cannot reproduce these results using equations. Three attempts:
Attempt 1: The resistor is in parallel with the probe, which is 1 MOhm // 100 pF at 1:1. Hence the bandwidth is set by R' = R // 1 MOhm and C = 100 pF, i.e. B ~ 1/(2*pi*R'*C). The total integrated noise power is then 4*k*T*R'*B = 4*k*T*R'/(2*pi*R'*C) = 2*k*T/(pi*C), so R' cancels and the result, sqrt(2*k*T/(pi*C)) = 5.1363 uVrms, should be independent of the resistor. This is far off from the numbers above, and even worse, the two measured values differ from each other.
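For reference, a quick numeric check of this attempt (a minimal sketch in Python; T = 300 K is my assumption):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # assumed room temperature, K
C = 100e-12        # 1:1 probe capacitance, F

# Attempt 1: 4kTR' noise density integrated over the -3 dB bandwidth
# 1/(2*pi*R'*C); R' cancels, leaving sqrt(2kT/(pi*C)).
v_rms = math.sqrt(2 * k * T / (math.pi * C))
print(f"{v_rms * 1e6:.4f} uVrms")  # ~5.14 uVrms, regardless of R
```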
Attempt 2: I assume the bandwidth limitation comes from the probe itself, which is specified as 6 MHz at 1:1. The result would be sqrt(4*k*T*100e3*6e6) = 100 uVrms for 100k and 10 uVrms for 1k. Again, both are far off.
Attempt 3: I assume the bandwidth is limited by the scope's specified 1.5 GHz bandwidth. This gives sqrt(4*k*T*R*1.5e9), i.e. 157.68 uVrms for 1k and 1.5768 mVrms for 100k.
Again, not consistent with what I measure.
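For completeness, the same kind of check for attempts 2 and 3 (again a sketch; T = 300 K assumed):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0          # assumed room temperature, K

def thermal_noise_vrms(R, B):
    """RMS Johnson noise of resistance R (ohm) over bandwidth B (Hz)."""
    return math.sqrt(4 * k * T * R * B)

for R in (1e3, 100e3):
    for B, label in ((6e6, "probe BW 6 MHz"), (1.5e9, "scope BW 1.5 GHz")):
        v = thermal_noise_vrms(R, B)
        print(f"R = {R:>8.0f} ohm, {label}: {v * 1e6:8.2f} uVrms")
```

None of these come close to the 340 uVrms and 940 uVrms I actually measure.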
How do I measure noise with an oscilloscope?