5

Ohmmeters don't appear to make LEDs conduct, which leads to overestimates of the measured resistance (tested with an R1 // (R2 + LED) parallel circuit). So what voltage do they apply to the circuit, assuming it is a conventional one? I assume it depends on the range selected (I seriously doubt it applies 1 V to a milliohm branch), but to what extent?
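The effect can be modeled roughly: below the LED's forward voltage the diode branch carries essentially no current, so the meter sees only R1 instead of R1 in parallel with (R2 + the LED's on-resistance). A minimal sketch, with made-up component values (not from the question):

```python
# Sketch: what an ohmmeter "sees" in an R1 // (R2 + LED) network,
# depending on whether its test voltage exceeds the LED forward drop.
# All values are illustrative assumptions.
R1 = 1000.0   # ohms
R2 = 220.0    # ohms
V_F = 1.8     # LED forward voltage, volts (typical red LED)

def measured_resistance(v_test, r_led_on=10.0):
    """Crude model: the LED branch is open below V_F and behaves
    like a small resistance r_led_on once it conducts."""
    if v_test < V_F:
        return R1                       # LED branch open -> overestimate
    r_branch = R2 + r_led_on
    return R1 * r_branch / (R1 + r_branch)

print(measured_resistance(0.4))   # low test voltage: reads R1 = 1000.0
print(measured_resistance(2.5))   # above V_F: closer to R1 // (R2 + r)
```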

Mister Mystère
  • I am not asking for a solution, as I am already measuring those resistances with a DC power supply that shows the current. – Mister Mystère Nov 27 '14 at 18:21
  • 2
    Impossible to say, it depends on the DVM. – Leon Heller Nov 27 '14 at 18:22
  • Your question implies you are trying to measure the _resistance_ of an LED. This is unlikely to be useful as an LED is a _diode_, not a _resistor_, and treating it as a resistor will only lead you astray. – John U Nov 27 '14 at 18:40
  • Do you mean what is the test voltage of an *ohmmeter*? Voltmeters measure voltage, so why would they have a "test voltage"? – Phil Frost Nov 27 '14 at 19:16
  • @PhilFrost: good catch, that's a "typo". Corrected. JohnU: actually diodes have a resistance, it only varies nonlinearly with the voltage applied. In the present case, I want it to be close to zero so that the resistor in series with the diode is also taken into account, and that happens only when the voltage is higher than the LED voltage drop. – Mister Mystère Nov 27 '14 at 19:51
  • An alternate method to measure resistance: apply a constant current through your device and measure the voltage across it, then apply Ohm's law. – Optionparty Nov 27 '14 at 21:32
  • @Optionparty: exact, that's what I'm already doing (see my first comment). – Mister Mystère Nov 28 '14 at 00:50
  • Sorry for the misunderstanding, I should have said "constant current source". As to the LED's resistance, the diode has a voltage barrier to overcome. Increase the voltage to the LED until it conducts, and note the voltage and current. Increase the voltage slightly, and note the voltage and current again. The LED's dynamic resistance will be the difference in voltage divided by the difference in current. Plot that slope from zero current, offset by the barrier voltage. I hope this is what you are looking for. – Optionparty Nov 28 '14 at 15:14

3 Answers

4

The obvious answer is to measure it with another meter.

Other than that, this can vary by meter but is usually around a volt or two. By default, most meters put out enough voltage to turn on ordinary silicon diodes. Some have a special low voltage mode meant to specifically avoid turning on silicon diodes, but the accuracy is lower.

Many hand-held meters simply apply the battery voltage. One meter I have takes a separate D cell just to power the resistance test. It takes 4 AA cells to power the amplifier and the rest of the meter, and the single D cell provides the voltage for the resistance sense. The one-cell voltage is about right in that it will turn on ordinary diodes, but is very unlikely to cause any harm, even if something is high enough impedance to not drag down the voltage.

Olin Lathrop
  • nice configuration!! Is it constructed by yourself? – GR Tech Nov 27 '14 at 19:19
  • So the voltage does not change with the range of the meter? That's interesting. – Mister Mystère Nov 27 '14 at 19:52
  • 1
    @Mist: In some types of meters. The one I mentioned in particular works on the voltage divider principle. Different scales switch in different fixed resistors for the bottom of the divider. The result is amplified, then applied to an analog meter movement. Full scale is always 0 to infinite ohms, but the range with good resolution in the center varies by scale. Other meters inject a fixed current. There are many ways to measure resistance. – Olin Lathrop Nov 28 '14 at 13:56
  • Thanks for your insight on what's inside. If I could accept several answers I'd do it. – Mister Mystère Nov 28 '14 at 14:32
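The divider principle Olin describes can be sketched numerically: the unknown resistor forms a divider with a known range resistor across a fixed cell voltage, and the resistance is recovered from the measured node voltage. A hypothetical sketch with assumed values:

```python
# Voltage-divider ohmmeter sketch: a fixed cell voltage drives the
# unknown Rx in series with a known range resistor; the meter reads
# the divider midpoint and back-computes Rx. Values are assumed.
V_CELL = 1.5       # single D cell, volts
R_RANGE = 1000.0   # range resistor for this scale, ohms

def divider_voltage(r_x):
    """Voltage across the range resistor with unknown r_x on top."""
    return V_CELL * R_RANGE / (R_RANGE + r_x)

def resistance_from_voltage(v):
    """Invert the divider to recover the unknown resistance."""
    return R_RANGE * (V_CELL - v) / v

v = divider_voltage(470.0)
print(round(resistance_from_voltage(v), 6))   # recovers 470.0
```

Switching scales amounts to swapping R_RANGE, which moves the region of good resolution without changing the 0-to-infinity full scale.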
3
  1. It depends on the multimeter.

  2. It doesn't have to be a constant voltage.

You have to measure it, as others suggested, or search for some documentation about your multimeter.

Kamil
2

If it's a DVM I'd think along these lines: -

The smallest resolution in ohms is probably 0.1 ohms, and the smallest resolution in volts might be 1 mV. This leads to the conclusion that the current used on the lowest ohm range is probably: -

\$\dfrac{1mV}{0.1\Omega}\$ = 10mA.

Given that the lowest ohm range will probably go over-range at 200 ohms, the maximum voltage it is likely to produce is 2 volts across 200 ohms.
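That back-of-envelope calculation can be written out, including the 0.1 mV-resolution case Spehro raises in the comments below (the meter parameters are assumptions, not specifications):

```python
# Back-of-envelope estimate: test current = (voltage resolution) /
# (ohms resolution); max test voltage = current * full-scale ohms.
def range_estimate(v_res, r_res, full_scale):
    i_test = v_res / r_res
    v_max = i_test * full_scale
    return i_test, v_max

# 1 mV / 0.1 ohm meter on a 200 ohm range
i_test, v_max = range_estimate(1e-3, 0.1, 200.0)
print(i_test, v_max)   # 10 mA, 2 V at full scale

# A 3-1/2 digit meter with 0.1 mV resolution on the same range
i_test, v_max = range_estimate(1e-4, 0.1, 200.0)
print(i_test, v_max)   # 1 mA, 0.2 V at full scale
```

At 0.2 V full scale the test voltage stays below a silicon diode's forward drop, consistent with Spehro's observation that such meters don't make a diode conduct on the 199.9 ohm range.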

Andy aka
  • 1
    +1 because I agree with the approach, but most of the meters I have are 0.1mV resolution (3-1/2 digits) so it would be 1mA and 0.4V. The couple I checked didn't make a diode conduct on the 199.9 ohm range. – Spehro Pefhany Nov 27 '14 at 19:44
  • Good approach, thanks. So that means the voltage changes with the range used? Because Olin Lathrop seems to say it doesn't (at least for some meters). – Mister Mystère Nov 27 '14 at 19:53
  • 1
    @MisterMystère I can't say it applies to all meters, but DVMs inject a definable constant current and measure the volt drop (hence my answer). For the 200 ohm range I'll use Spehro's ratio and say they inject 1mA, therefore for each 0.1 ohm there is a voltage produced equal to 0.1mV. For the 2kohm range the injection current will be 0.1mA, yielding the same voltage range, but each mV measured will equate to 1.0 ohm of resistance, etc. – Andy aka Nov 27 '14 at 23:28
  • @Andyaka Constant current, calculated for yielding the same voltage over each ohm setting, might explain why the lower ohm settings on my multimeter taste stronger than higher ohms. (Because getting voltage drop N over 1 ohm requires more current than voltage drop N over 1000 ohms.) – Erhannis Jan 29 '19 at 19:11
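Andy's scaling rule, where the injected current drops a decade per range so the full-scale voltage drop stays constant, could be sketched as (range values and base current are assumptions following his 1 mA figure):

```python
# Constant-current ohmmeter ranges: the injected current scales down
# a decade per range, so the full-scale voltage drop is the same on
# every range (0.2 V here, from 1 mA on the 200 ohm range).
ranges = [200.0, 2e3, 20e3, 200e3]   # full-scale ohms per range
i_base = 1e-3                        # 1 mA on the 200 ohm range (assumed)

for n, full_scale in enumerate(ranges):
    i_test = i_base / (10 ** n)
    v_full_scale = i_test * full_scale
    print(f"{full_scale:>9.0f} ohm range: {i_test * 1e3:.3f} mA, "
          f"full-scale drop {v_full_scale:.1f} V")
```

This also matches Erhannis's observation: lower ohm ranges push more current through the device under test to develop the same voltage drop.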