
My question relates to the accumulated AC voltage that will appear on the screen of a length of coax:

[schematic: a length of coax fed with a signal at one end and terminated at the other]

If the cable is properly terminated at one end and fed with data (or any signal) at the other end, what will be the voltage on the screen relative to ground? Will it be ground potential(?) or will it be some other potential(?), and if so, will it be potentially acting as a "radiator" of the data?

Andy aka

2 Answers


Plant leakage is a well-known effect. The system of cables and connectors in a distribution system is known as the "plant" for some reason, and leakage causes signal to escape the cable and be picked up by nearby antennas.

In an ideal coax, the wave travels between the conductors and imposes local currents and voltages along the length of the transmission line, most notably on the inner surface of the outer conductor and the outer surface of the inner conductor (the skin effect). Again ideally, signals travelling on the outside of the outer conductor cannot interact with the signals on its inner surface.
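The "inner surface / outer surface" separation follows from the skin depth, which at RF is tiny compared with shield thickness. A quick sketch of the standard skin-depth formula for copper (constants are standard; frequencies are just illustrative):

```python
import math

# Skin depth: delta = sqrt(2 * rho / (omega * mu))
rho = 1.68e-8             # copper resistivity, ohm-metres
mu0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

def skin_depth(f_hz):
    """Depth at which current density falls to 1/e, in metres."""
    omega = 2 * math.pi * f_hz
    return math.sqrt(2 * rho / (omega * mu0))

for f in (1e6, 100e6, 1e9):
    print(f"{f/1e6:8.0f} MHz: {skin_depth(f)*1e6:6.2f} um")
# at 1 MHz the depth is roughly 65 um, shrinking with 1/sqrt(f)
```

So above a few MHz, currents on the two faces of a solid shield are well isolated from each other, which is what makes the ideal picture a good approximation.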

Plant leakage arises from departures from this ideal: wound or wrapped (rather than solid) coax, poor connector termination, and mismatched termination loads.

In your drawing above, assuming the termination is also properly shielded, the return signal should travel along the inner surface of the outer conductor if the system is ideal, so you shouldn't actually see any radiation. Evidence of this is how hollow, non-coaxial waveguides (TE/TM modes) work - everything flows along the inner surface and there is very low leakage.

If this is beyond a hypothetical situation and you are seeing leakage, it can arise from the quality of the coax - whether it is solid foil or braided - and that also depends on the frequency you're driving it at (wavelength vs. mesh size). Connector shield continuity and termination matching are also potential radiation points. In addition, the attached equipment on the receiving end may inject a common-mode return current into the grounded shield, causing the whole outer surface to act as an antenna; if that current is derived from the feed signal (amplifier power-rail bounce, as one example), it might appear that the coax itself is leaking.

In general, cable-television plant leakage arises from the less expensive cables used (with many, many miles/km installed, cost really matters), the difficulty of maintaining a connector over many years, and damage such as nicks and kinks.

There are some systems that use gas dielectric and solid conductors to transport tens of MW to antennas with almost no leakage, so ideal performance is approachable.

placeholder

If the load end of a coaxial cable is not grounded, then yes, there will be a voltage drop along the length of the cable created by the signal current interacting with the nonzero resistance of the shield (screen). The voltage will be very low, since it's created via a voltage divider effect between the load impedance and the shield impedance, and the latter should be orders of magnitude lower than the former.
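To put rough numbers on that divider (the shield resistance and signal level here are assumptions for illustration, not measurements):

```python
# Divider between the termination and the shield's series resistance.
z_load = 50.0      # termination impedance, ohms
r_shield = 0.01    # assumed braid resistance for a few metres of coax, ohms
v_signal = 1.0     # 1 V signal delivered to the load

# The shield "sees" the fraction of the signal dropped across its own
# resistance in series with the load.
v_shield = v_signal * r_shield / (z_load + r_shield)
print(f"voltage along the shield ~ {v_shield*1e3:.2f} mV")  # ~0.20 mV
```

With realistic braid resistances the shield voltage ends up tens of dB below the signal, consistent with "very low" above.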

This also means that the cable can radiate, although the radiation will be very weak, because there is no associated magnetic field outside the shield — it is nearly all cancelled out by the current in the center conductor. The only imbalance in the current is created by the current required to charge and discharge the free-space capacitance of the far end of the cable. Again, this will be very, very tiny with respect to the signal current going to the load.

Dave Tweed
  • On a non-perfect, lossy cable, I can envisage a scenario where the p-p amplitude across the terminator is (say) 50% of the sending end; how fair is it to say that the "loss" of 50% is shared equally between centre conductor and screen? Will the screen "drop" 25% and the centre conductor 25%? Is the voltage "dropped" on the screen purely due to its resistance? That voltage loss has to appear somewhere, I would have thought? – Andy aka May 16 '13 at 13:50
  • @Andyaka: The loss in lossy cable is mostly due to dielectric absorption -- the energy gets turned into heat in the dielectric between the center conductor and the shield. Some cables (oscilloscope probes) are made deliberately lossy, but the resistance is all in the center conductor, and the shield is still built with as little resistance as possible. None of this contributes to external radiation. – Dave Tweed May 16 '13 at 14:38