
I don't work much with high-frequency design; I only know that ringing is caused by reflections due to impedance mismatch in transmission lines.

I recently saw some non-engineer researchers in a lab measuring a 3 MHz signal through a 1 meter, 50 Ohm coax cable connected directly to the scope input.

For this kind of electronic instrumentation, I am wondering when a 50 Ohm termination would be required. Is there a practical way to roughly tell whether a 50 Ohm termination is necessary? (Assume the source impedance is negligible, a 1 meter 50 Ohm coax cable, a 3 MHz signal, and a scope input impedance of 100 MΩ.)

pnatk
  • Are you sure the scope's input impedance is 100 MΩ? That's quite high. – Hearth Mar 22 '21 at 23:12
  • Basic transmission line theory. –  Mar 22 '21 at 23:17
  • Did they have the scope set to 50Ω mode? – Aaron Mar 22 '21 at 23:21
  • 3 MHz square wave? The rise and fall times are the important parameters. Are you motivated to learn? Get this book: High Speed Digital Design: A Handbook of Black Magic by Howard Johnson & Martin Graham – Mattman944 Mar 22 '21 at 23:24
  • It is much more complicated than you realize. It is really not a signal integrity question. It is a system design question. If the system is designed to feed a 50 Ohm load, then you have to probe it with a 50 Ohm cable and termination. If the system is NOT designed to drive a 50 Ohm load then you must not probe it with a 50 Ohm termination, and adding a 1 meter cable may dramatically distort the signal, so keep that in mind, too. Finally, if it is designed to drive one 50 Ohm load, adding a second one just to probe it will distort the signal also. – user57037 Mar 22 '21 at 23:41
  • Unfortunately, it is not simple and you have to know what you are doing and what you are trying to accomplish and how the system was designed. – user57037 Mar 22 '21 at 23:42

2 Answers


For this kind of electronic instrumentation, I am wondering when a 50 Ohm termination would be required. Is there a practical way to roughly tell whether a 50 Ohm termination is necessary?

If you are transmitting a clock signal, i.e. a square wave, to a high-impedance load at the end of a cable (aka transmission line), AND all you are interested in is receiving a "decent enough" waveform at the end of the cable, then we have a rule of thumb:

A termination is not required if the highest frequency of interest (the highest frequency we need in order to keep a decent-looking waveform) has a wavelength that is at least 10x longer than the electrical length of the cable. If the wavelength is shorter than that, you should terminate.

1 meter 50 Ohm coax cable, and a 3 MHz signal

So, with a 3 MHz square wave, we might take the 7th harmonic of 3 MHz (21 MHz) as the highest frequency we need to preserve. That has a free-space wavelength of about 14.3 metres, which is more than 10x the electrical length of the cable. Under these circumstances (clock transmission) we would be OK without a termination.

However, we can look at it from the perspective of the 1 metre cable and say that, without a proper termination, we should not try to transmit a signal whose highest frequency of interest has a wavelength shorter than about 10 metres. That corresponds to a frequency of roughly 30 MHz.
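
If it helps to see the arithmetic, here is a minimal Python sketch of that rule of thumb. It is only an illustration under stated assumptions (the 7th harmonic taken as the highest frequency of interest, the free-space wavelength compared against 10x the physical cable length, velocity factor ignored); the function name and defaults are made up for this example:

```python
C0 = 299_792_458  # speed of light in vacuum, m/s

def termination_probably_needed(f_fundamental_hz, cable_length_m,
                                highest_harmonic=7, margin=10):
    """Rule of thumb from above: no termination needed if the wavelength of the
    highest frequency of interest is at least `margin` times the cable length."""
    f_max = f_fundamental_hz * highest_harmonic   # e.g. 7th harmonic of a square wave
    wavelength = C0 / f_max                       # free-space wavelength, metres
    return wavelength < margin * cable_length_m

print(termination_probably_needed(3e6, 1.0))    # 21 MHz -> ~14.3 m wavelength -> False (OK unterminated)
print(termination_probably_needed(10e6, 1.0))   # 70 MHz -> ~4.3 m wavelength  -> True (terminate)
```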

Of course, different applications have different requirements and this rule of thumb won't perfectly suit every application.

Andy aka

The scope input (standard BNC type) is either 50 ohms or 1 Mohm. The user selects the type as part of the input setup, or the scope may auto-detect the kind of probe that is attached.

For scopes, 'practically' then:

  • Passive probes use 1 Mohm
  • Active probes use 50 ohm
  • Direct RF cabling uses 50 ohm

For the active probe and direct cases, the scope is terminated in the cable's characteristic impedance to absorb any reflections and maintain good signal fidelity (integrity) at the input. Without that termination, the waveform will be severely distorted and render the measurement unusable.
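
To put a number on "absorb any reflections": the fraction of the incident wave reflected where the cable meets the scope input is the usual reflection coefficient \$\Gamma = \dfrac{Z_{load} - Z_0}{Z_{load} + Z_0}\$. A minimal Python sketch, with the 1 Mohm value standing in for the scope's high-impedance setting:

```python
def reflection_coefficient(z_load, z0=50.0):
    """Voltage reflection coefficient where a line of impedance z0 meets z_load."""
    return (z_load - z0) / (z_load + z0)

print(reflection_coefficient(50.0))   # matched 50 ohm termination -> 0.0, no reflection
print(reflection_coefficient(1e6))    # 1 Mohm scope input -> ~0.9999, nearly full reflection
```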

If 'some non-engineer researchers' (easy there, don't be an elitist) were using a 50 ohm source (like a signal generator), and were measuring the waveform with the 50 ohm termination switch on, it would have worked just fine. In fact if they were not using the high-impedance probe it's the right way to do it. The downside would be that the signal would be loaded down so the voltage swing would be lower (for a 50 ohm source, by 50%).
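
That 50% figure is just the resistive divider formed by the source's 50 ohm output impedance and the 50 ohm termination; a quick sketch of the loading:

```python
def loaded_fraction(z_source, z_termination):
    """Fraction of the source's open-circuit voltage that appears across the termination."""
    return z_termination / (z_source + z_termination)

print(loaded_fraction(50, 50))    # 0.5 -> half the open-circuit swing with a 50 ohm termination
print(loaded_fraction(50, 1e6))   # ~1.0 -> a 1 Mohm input barely loads a 50 ohm source
```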


So you'll hear talk about 'transmission lines' and 'characteristic impedance'. What this means is that a given cable has a certain inductance and capacitance per unit length. The cable's characteristic impedance is, roughly speaking:

  • \$Z_0 = \sqrt{\dfrac{L}{C}}\$

where L and C are the inductance and capacitance per unit length, respectively.
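
For a sense of scale, here is a short Python check using round-number per-metre values in the ballpark of RG58-style coax (roughly 250 nH/m and 100 pF/m; these are assumed illustrative figures, not datasheet values):

```python
import math

L_per_m = 250e-9   # inductance per metre, henries (assumed ballpark value)
C_per_m = 100e-12  # capacitance per metre, farads (assumed ballpark value)

z0 = math.sqrt(L_per_m / C_per_m)
print(f"Z0 ~ {z0:.0f} ohms")   # ~50 ohms
```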

What determines these L and C values? The inductance per unit length depends mostly on the conductor geometry (how thick the centre conductor is relative to the shield), and the capacitance per unit length depends on the dielectric thickness and its material type, which in turn sets the permittivity.

More here: How is xΩ impedance cable defined?

Why 50 ohm? This cable impedance was settled upon as a compromise in the early days (1930s) of cable development for radio transmitters. More here: https://www.microwaves101.com/encyclopedias/why-fifty-ohms

Today, 50 ohm is the impedance of common cables like RG58a/u and remains the main choice for microwave and high-speed signal link work like PCI Express, SATA, USB3.0 and other fast serial interfaces.

75 ohm is the other widely-adopted choice, used mainly for UHF/VHF and cable television. It is preferred for that use owing to its lower capacitance and reduced signal loss. More here: http://cablesondemandblog.com/wordpress1/2014/03/06/whats-the-difference-between-50-ohm-and-75-ohm-coaxial-cable/

To use a 75 ohm cable directly on a 50 ohm scope input, an impedance matching network is used.

hacktastical