13

I'm speaking only of equipment intended for voltage measurements.

Multimeters, oscilloscopes and conventional lock-in amplifiers seem to mostly have input impedances of 10 MΩ. I understand the need for a high input impedance to avoid drawing current and to avoid the voltage divider effect. Why don't manufacturers opt for higher input impedances in their designs?

Wouldn't a 100 MΩ input impedance multimeter be of more value for the customer?
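For a sense of scale, here is a minimal Python sketch of the loading (voltage divider) error in question. The source resistances and the 100 MΩ figure are purely illustrative, not taken from any particular instrument:

```python
# Loading error of a voltmeter modelled as a simple resistive divider:
# the meter reads V_source * R_in / (R_source + R_in).
def loading_error_percent(r_source, r_in):
    """Percentage by which the reading falls below the true open-circuit voltage."""
    return 100.0 * r_source / (r_source + r_in)

for r_source in (1e3, 100e3, 1e6):      # example source resistances: 1 k, 100 k, 1 M
    for r_in in (10e6, 100e6):          # 10 MOhm vs a hypothetical 100 MOhm input
        err = loading_error_percent(r_source, r_in)
        print(f"R_source = {r_source:>9.0f} ohm, R_in = {r_in:.0e} ohm -> error = {err:.4f} %")
```

For kilohm-level sources the difference between 10 MΩ and 100 MΩ is a few hundredths of a percent, which hints at why the extra input resistance rarely pays off for typical use.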

JRE
BlackPenguin
    100 MOhm won't be of much advantage vs 10 MOhm. And for yet higher values, bipolar input amps become unfeasible. If people *need* really high input impedance of TOhm it anyway requires special cabling and such – tobalt Oct 31 '21 at 09:08
  • 5
The model 640 electrometer from Keithley uses guards and sapphire insulation for an input resistance greater than 10^16 ohms shunted by less than 2 pF. Just FYI. (I've achieved similar using dice and manual wire-bonding.) – jonk Oct 31 '21 at 09:09
  • 1
    @tobalt So, do you mean that 10 MOhm is just a conveniently good number? – BlackPenguin Oct 31 '21 at 09:26
  • 3
    @BlackPenguin yes exactly. It is low enough that it can be set by a simple resistor and will not depend on variations of the actual input amp. – tobalt Oct 31 '21 at 10:10
  • @jonk aren't those 2 pF input capacitance an absolute showstopper? If you indeed wanted to measure on e.g. a 1e15 Ohm specimen, you are looking at hours of settling time. – tobalt Oct 31 '21 at 10:14
  • @tobalt look at the datasheet. It is all there. (Sorry I did not include a reference to it.) – jonk Oct 31 '21 at 10:17
  • 12
    Most scopes are actually 1 Megohm, not 10. Only when used with a 10:1 probe do they give 10 Megohm input resistance. Not all scopes do that either. – Kevin White Oct 31 '21 at 11:10
  • 100Megohm would cost more and not be any significant extra value for most uses. I rarely have any issue with 10Megohm being too low. – Kevin White Oct 31 '21 at 11:12
  • 1
    @KevinWhite Thank you, I just started to wonder why my all meters and scopes are 1MOhm and are they in any way substandard. (Well, most of them are substandard but not in regard to the input impedance) – fraxinus Nov 01 '21 at 12:50
I am not sure, are we talking about scopes or DMMs? A fairly common (and old) DMM such as the Keithley 2000 has >10GΩ impedance for the 10/1/0.1 V ranges. – Vladimir Cravero Nov 02 '21 at 13:27
Is nobody gonna talk about how a multimeter’s input impedance can be used as a current shunt to measure small currents? That’s the real reason why the input impedance is always a “nice” value like 10M or 1M: A DMM in voltage mode is a nanoammeter – Navin Nov 10 '21 at 22:03

4 Answers

21

Having worked with test equipment with a 100 MΩ input impedance, I can say that it is not all upside. It requires much more careful handling than normal voltmeters or scopes. For example, touching a lead with your bare hand will charge it, causing voltage offsets which take tens of seconds or even minutes to fully dissipate. In many cases, errors caused by such effects would outweigh the increased precision bought by having a higher input impedance.

As Neil_UK and Vladimir Cravero have pointed out, the input impedance is not the cause (or at least not the only cause) of such behaviour. As I now realize, another reason for this effect is that I used a high-gain amplifier to measure very small voltage differences. In this situation, touching a lead can drive the amplifier into saturation, which then takes a long time to recover.
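A rough sketch of the RC settling time alone, assuming purely illustrative values for input resistance and stray/cable capacitance (as Vladimir Cravero's comment below notes, the capacitance and any bias current matter as much as the resistance itself):

```python
import math

# Time for a charged input to decay to within a given fraction of its initial
# offset, modelling the input as a simple RC: tau = R_in * C_stray.
def settling_time(r_in, c_stray, settle_to=0.001):
    """Seconds until the residual offset falls below `settle_to` of its initial value."""
    tau = r_in * c_stray
    return tau * math.log(1.0 / settle_to)

for r_in in (10e6, 100e6, 10e9):        # 10 M, 100 M, 10 G ohm inputs
    for c_stray in (100e-12, 10e-9):    # 100 pF (roughly a metre of coax) vs 10 nF (very large)
        t = settling_time(r_in, c_stray)
        print(f"R_in = {r_in:.0e} ohm, C = {c_stray:.0e} F -> settles to 0.1% in {t:.3g} s")
```

With realistic cable capacitances a 100 MΩ input settles in well under a second, which supports the point that the minutes-long recovery described above is dominated by the saturated amplifier rather than by the input RC alone.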

Roman
    To add onto this, I often take a voltage measurement between two points which turn out to be floating with respect to one another. With some input conductance, this condition is easy to recognize: the voltage starts out at some nonzero value and then decays towards zero. If the input resistance were much higher, then I would instead see an arbitrary constant voltage, and that would make it much harder to figure out what's going on. – Cassie Swett Nov 01 '21 at 01:35
  • 1
So with too high an input impedance, you would have too high an RC time for a convenient measurement? – BlackPenguin Nov 01 '21 at 03:51
  • Taking 10s of seconds for readings to stabilise is a function of your source impedance, not the meter's high input impedance. Without the meter's high input impedance, you wouldn't even be able to read your source. – Neil_UK Nov 01 '21 at 14:31
  • 1
    As an even bigger issue, if a scope drew zero DC current from the probe, but the probe and cable had some parasitic capacitance (inevitable), an intermittent probe connection could behave like an unpredictable sample-and-hold. – supercat Nov 01 '21 at 22:51
  • 1
    This answer must be inaccurate, in the sense that some information is missing. I don't know what cabling you are using, but 100 MΩ x 10 nF = 1 s and 10 nF is a HUGE stray capacitance. I work with >10GΩ instruments, and while I can certainly charge the inputs, it takes a few seconds for them to discharge. On top of that, I expect the instrument to also have some DC bias current flowing in/out of the terminals, which probably contributes to the charge/discharge in a not negligible way. – Vladimir Cravero Nov 02 '21 at 13:33
  • Vladimir Cravero, you are partially right. Now that I think about it, the main reason for such long timescales is that the measurement device I was talking about contains a high-gain amplifier. It runs into saturation and then requires a long time to operate normally again. – Roman Nov 02 '21 at 20:26
20
• Remember that the actual ADC in the meter doesn't have infinite input impedance. It will source or sink some current into the voltage divider circuit. That means that raising the voltage divider resistor values will increase the bias-current offsets, which will affect accuracy (see the sketch after this list).
• The potential divider resistors will be more difficult to manage, as leakage across the PCB, switching contacts, autorange selectors, etc., becomes more significant.
  • Standardisation of the input impedance. Users expect 10 MΩ now.
  • It's high enough for most applications.
  • I don't know if it's a factor but they have to work on AC as well as DC.
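A minimal sketch of the first point, using a made-up buffer/ADC bias current and divider values (not taken from any real meter). Seen from the buffer, the divider looks like its Thevenin (parallel) resistance, and the bias current flowing through it appears as an offset:

```python
# Offset error caused by the input stage's bias current flowing through
# the meter's input divider (Thevenin resistance seen at the tap).
def bias_offset_volts(r_top, r_bottom, i_bias):
    r_thevenin = (r_top * r_bottom) / (r_top + r_bottom)
    return i_bias * r_thevenin

i_bias = 50e-12                                      # assume a 50 pA bias current
for r_top, r_bottom in ((9e6, 1e6), (90e6, 10e6)):   # 10 M total vs 100 M total, 10:1 tap
    v_err = bias_offset_volts(r_top, r_bottom, i_bias)
    print(f"{(r_top + r_bottom)/1e6:.0f} M divider -> offset = {v_err*1e6:.1f} uV at the tap")
```

Scaling the divider up by a factor of ten scales this offset (and its temperature drift) by the same factor.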

Wouldn't a 100 MΩ input impedance multimeter be of more value for the customer?

For some, perhaps. The increased sensitivity to stray fields could be a problem for others.

Transistor
  • 1
    The question is how much is infinite. I have an 8846A multimeter here and this MM draws less than 10nA @ 10V, so this device presumably has more than 10 GOhms input resistance and it is digital and certainly has an analog-to-digital converter. How would you explain that? – arnisz Nov 01 '21 at 21:29
  • 4
    @arnisz, like this: [Fluke - 8846A - Starting from €1931.10 - rs-online.com](https://ie.rs-online.com/web/p/multimeters/6159978/). – Transistor Nov 01 '21 at 21:45
14

Infinite input impedance would be ideal. 'High enough for most people' turns out to be commercially more practical.

It's relatively straightforward and cheap to make practical amplifiers with 1 MΩ and 10 MΩ input resistances with a reasonable bandwidth, and these satisfy a huge segment of the market.

Where a user needs a higher input impedance, it's more sensible for those few users to use a custom input amplifier, dedicated to their particular application. For instance, if you want to measure input currents of fA, then charge storage on insulating surfaces and cosmic-ray ionisation of air spaces become significant. You don't want to start engineering tolerance of those effects into every $10 multimeter.
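To put a number on the charge-storage point, a back-of-the-envelope comparison (the values are chosen only for illustration):

```python
# Compare the charge delivered by a femtoamp-level current over a one-second
# reading with the charge stored on a tiny stray capacitance at a small voltage.
i_signal = 1e-15          # 1 fA measurement current
t_reading = 1.0           # 1 s integration time
q_signal = i_signal * t_reading

c_stray = 1e-12           # 1 pF of insulator / cable capacitance
v_offset = 1e-3           # charged to just 1 mV (e.g. by handling or triboelectric effects)
q_stray = c_stray * v_offset

electron = 1.602e-19
print(f"signal charge: {q_signal:.1e} C (~{q_signal/electron:.0f} electrons)")
print(f"stray charge:  {q_stray:.1e} C on 1 pF at 1 mV")
```

A single picofarad charged to a millivolt holds as much charge as the signal delivers in a whole second, which is why guarding, special insulators and careful handling become mandatory at these levels.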

Neil_UK
  • 8
    *"Infinite input impedance would be ideal."* No, it would not. This is one of those half-truths we really need to put to rest. Infinite input impedance buys you *one thing* and one thing only - that your meter has zero influence on the circuit under test. Everything else that comes with infinite input impedance is a nightmare that makes measurements difficult and error-prone. If I could actually make a meter with infinite input impedance, I'd be willing to bet that the first thing you did to it when you actually started using it would be to add a small load. For your sanity. – J... Nov 01 '21 at 13:17
  • @J... I think that *other things being equal*, if a 10 Mohm input DMM didn't draw the input current it did, it would be 'better'. In fact, my ANENG AN8002 relatively cheap DMM unloads its input on the 200 mV range and looks as near infinite as I can measure. Presumably it has a series 10 Mohm input resistor, with switchable shunts to create the input voltage divider, and with no division, I'm just seeing the buffer amplifier input. With a low source impedance, it's just wonderful, even autoranges down to 20 mV full scale with noise-free resolution to 10 uV. – Neil_UK Nov 01 '21 at 14:27
  • 4
    I think *"other things being equal"* is another one of those qualifiers that puts us *out of the realm of reality*. I mean, here's me frequently reaching for the "LoZ" mode of my Fluke because a meter impedance of 3k-ohm will drown all of the spurious signals I *don't want to measure*. And when we're talking about automated machines it's common to see ADCs with 10k, 5k, or even 1.5k input impedances. There's no such thing as "ideal" - every application has different challenges and different goals and absolute accuracy is only one of those goals. – J... Nov 01 '21 at 14:37
  • @J... If I don't want to measure something, then I'll put an explicit external filter or shunt in the way of my ideal meter. That's why I gave the example of my very 'realm of reality' AN8002, £12 on BangBad, with shipping! With a high source impedance DUT, it's all over the place. – Neil_UK Nov 01 '21 at 16:11
11

It's a happy middle ground for most users: it gives satisfactory accuracy while still tolerating sloppiness in use.

Too low and your signal source gets loaded and distorted more than you can tolerate. Too high and your signal currents get reduced so much that they start to approach the magnitudes of the leakage currents in the insulating materials everything is made of (and contaminants on them). Signal-to-noise ratio in a sense. The neat, well-defined signal currents in your circuit traces start to blend in with the leakage currents flowing into and out of those traces from the surroundings.
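As a rough illustration of that blending (the leakage figure is a generic assumption, not a measured value):

```python
# Compare the signal current drawn by the meter's input with an assumed
# board-level leakage current, for different input resistances.
v_signal = 1.0                  # 1 V being measured
i_leakage = 1e-9                # assume ~1 nA of leakage across flux residue / insulators

for r_in in (1e6, 10e6, 100e6, 1e9):
    i_signal = v_signal / r_in
    ratio = i_signal / i_leakage
    print(f"R_in = {r_in:.0e} ohm: signal current = {i_signal*1e9:.1f} nA, "
          f"{ratio:.0f}x the assumed leakage")
```

At 10 MΩ the wanted current is still two orders of magnitude above the assumed leakage; at 1 GΩ the two are comparable, and the reading starts to reflect the board as much as the circuit under test.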

DKNguyen