
I have seen many spectrum analyzers with a warning on the RF input port: “!!! Avoid electrostatic discharge and DC voltage !!!”.

As a precaution I have always connected my signal source through a DC block, but now I find the DC block filters my signal too much.

The signal comes from a line driver, so there should generally be no DC voltage, but I can’t completely guarantee that.

Since I have seen this warning on multiple analyzers, I assume it is something generic.

I understand that excessive ESD might be bad ... but DC voltage?

Does anyone know the reason for it, and how seriously the DC voltage warning should be taken?

I mean, some amount of DC voltage (microvolts) may always be present.

Am I being overcautious? Is it safe to have a low DC voltage, say ±1 V, on the signal?

divB
  • I'm voting to close this question as off-topic because the answer is in the user manual or possibly printed on the instrument. – Vladimir Cravero Jan 08 '18 at 08:36
  • I think it's an interesting question. Why is it so picky, and why can't it block DC internally like any old scope? But it would _really_ be good to know the exact model. – pipe Jan 08 '18 at 08:47
  • Since I have seen this multiple times, I assume it is something generic; hence my question. I have updated the question to reflect that. I will also check my particular model, just in case. – divB Jan 08 '18 at 21:05
  • +1 You are definitely not being overcautious; DC voltages (even very small ones) are very harmful to spectrum analyzers if they don't include an internal DC block. The very sensitive input stages/filters of the analyzer are not designed to have DC voltages on them. I think it is an interesting question, though, and I would love for someone with more knowledge of this to explain in detail what typically happens when you put DC on the input of a spectrum analyzer. – Vinzent Jan 09 '18 at 00:37
  • Spectrum analyzers are actually so sensitive to DC that you can destroy them even with a DC block in place: if the DC block's capacitance is big enough and the signal has a high enough dv/dt, a step can couple through and look like a small momentary DC voltage on the input despite the block. That happened to one of my previous colleagues; he burned out two spectrum analyzers this way before he realized that his choice of a very large DC block meant that a DC voltage being switched on coupled through the block to the input. – Vinzent Jan 09 '18 at 00:46
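
To illustrate the coupling effect Vinzent describes in the last comment: a DC block is a series capacitor, so a switched DC step initially passes through at full amplitude and then decays with the time constant τ = RC of the capacitor and the analyzer's 50 Ω input. A minimal Python sketch (the capacitor and step values are made up for illustration, not taken from the thread):

```python
import math

R = 50.0    # analyzer input impedance, ohms
V0 = 1.0    # switched DC step, volts (illustrative)

# Ideal first-order high-pass step response: v_out(t) = V0 * exp(-t / (R*C)).
for C in (1e-9, 1e-6):              # a small vs. a very large DC-block capacitance
    tau = R * C
    for t in (0.0, tau, 5.0 * tau):
        v_out = V0 * math.exp(-t / tau)
        print(f"C = {C:.0e} F, t = {t:.1e} s, v_out = {v_out:.3f} V")
```

With a 1 nF block the transient is gone within a few hundred nanoseconds, but with a 1 µF block it lasts for tens of microseconds, which is exactly the kind of quasi-DC pulse that can stress the front end.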

1 Answer


In your example, the maximum input power level is +10 dBm. A DC signal can be considered an RF signal at 0 Hz. Now, +10 dBm into 50 Ω is 0.707 V RMS, so 1 VDC is above the +10 dBm maximum: 1 VDC is +13 dBm into 50 Ω.

This huge signal at 0 Hz can cause measurement accuracy problems for signals at other frequencies, for example compression due to front-end overload. To apply DC and still maintain measurement accuracy, 125 mV is probably the absolute maximum I would ever put on the front end, and I would stay under 70.7 mV to make a good measurement. But you should just use a DC block unless there is a good reason not to. The rating is 0 VDC for a good reason, so don't put DC on it. This is not generic advice.
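
As a quick sanity check of these numbers, here is a minimal Python sketch of the voltage-to-dBm conversion (assuming the usual 50 Ω input impedance; the function name is just illustrative):

```python
import math

def v_to_dbm(v_rms, r_ohms=50.0):
    # Power delivered to a resistive load, in dBm (dB relative to 1 mW).
    # For a DC voltage, the RMS value is simply the DC value.
    p_watts = v_rms ** 2 / r_ohms
    return 10.0 * math.log10(p_watts / 1e-3)

# Values from the answer: the 0.707 V RMS limit, 1 VDC, and the two DC margins.
for v in (0.707, 1.0, 0.125, 0.0707):
    print(f"{v * 1000:6.1f} mV -> {v_to_dbm(v):+5.1f} dBm")
```

This prints roughly +10.0, +13.0, -5.1, and -10.0 dBm, confirming that the conservative 125 mV and 70.7 mV figures sit about 15 to 20 dB below the +10 dBm maximum.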

I don't speak for Keysight Technologies.

Tom Anderson