
Multimeter datasheets contain accuracy specifications.
One parameter is usually the accuracy over a period of 1 year. I understand this to mean the multimeter can be off by the specified value.

For example, the Keithley 2000 on its 100 mV range:
1-year accuracy = 0.005% (of reading) + 0.0035% (of range)

or the Siglent SDM2055 on its 200 mV range:
1-year accuracy = 0.015% (of reading) + 0.004% (of range)
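
To make the arithmetic concrete, here is a minimal sketch (Python; the 50 mV reading and the function name are my own illustration, not from either datasheet) of how such a spec turns into an error bound:

```python
# Worst-case error from a "% of reading + % of range" specification.
def error_bound(reading, rng, pct_of_reading, pct_of_range):
    """Worst-case error, in the same units as `reading`."""
    return reading * pct_of_reading / 100 + rng * pct_of_range / 100

# Keithley 2000, 100 mV range: 0.005% of reading + 0.0035% of range
err = error_bound(reading=0.050, rng=0.100, pct_of_reading=0.005, pct_of_range=0.0035)
print(f"+/- {err * 1e6:.1f} uV on a 50 mV reading")  # +/- 6.0 uV
```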

But the question is: what accuracy do I have to assume over a period of 10 years?
Do I have to multiply the "1 year accuracy" by 10? Or is it not nearly that easy (and not that bad)?

No hobbyist is going to calibrate his equipment every year. It would be useful to know how the accuracy shifts over a longer period of time.

Chupacabras
  • The manufacturer does not want to specify accuracy for a 10-year period; the necessary tests would take too much time. – Uwe Feb 23 '18 at 18:04
  • Please improve the question by stating WHY exactly this question is useful. I was about to point out SE's general policy of asking questions about issues which you're actually experiencing, when I found your comment on an answer below which shows why you consider it a practical and not-hypothetical question. – Beanluc Feb 23 '18 at 22:58
  • @Beanluc I have added the reason why I asked that question. – Chupacabras Feb 24 '18 at 06:28
  • "No hobbyist is going to do calibration of his equipment every year". If you've bought expensive equipment then follow the manual. Otherwise all you earn is bragging rights. You can't pay once and have something sorted out forever, nothing in life works this way. – Agent_L Feb 24 '18 at 09:28

4 Answers


Generally that figure is defined because you are supposed to calibrate your equipment annually.

If you don't... all bets are off.

You cannot extrapolate from one to the other, and aging will not be linear.

Trevor_G
  • I know it's generally recommended to calibrate every year. But that was not the question. – Chupacabras Feb 23 '18 at 15:39
  • @Chupacabras But you can't tell beyond a year... it could be anything... that's the point. – Trevor_G Feb 23 '18 at 15:46
  • Well, that is the key here: what does the non-linearity of aging look like? – Chupacabras Feb 23 '18 at 15:55
  • @Chupacabras Hard to say if the manufacturer does not publish that information, which they likely don't, since they expect people to calibrate. You have to remember those numbers are also worst-case drift; the 0.005% they quote could be for a 10-year-old meter... who knows. Worse, some meters may start out drifting one way, then after a few years flatten out, then later in life go the other way. – Trevor_G Feb 23 '18 at 15:58
  • For nonlinearity of aging, consider the "bathtub curve". Some components like electrolytic capacitors change value quite fast once they hit the end of their useful life. Others may drift linearly. You're looking for a simple answer: Trevor's is about it. –  Feb 23 '18 at 16:51
  • Doesn't a DMM use a dual-slope technique? An integrator converts the ratio of an unknown voltage to a reference voltage into a ratio of time periods. This avoids errors from comparator offset, capacitor tolerances and integrator non-linearity, at the cost of slow conversions, and the input can't vary during the reading time. You're really asking about the quality of the voltage reference and how it ages. – D Duck Feb 23 '18 at 22:32
  • Better quality meters like Fluke use EEPROM look-up tables to account for temperature drift and aging, but even that has finite limits. Most parts do not 'age', but electrolytic capacitors do. –  Feb 24 '18 at 03:15
  • @Chupacabras It's not recommended to calibrate, it's **required**. Otherwise, the manufacturer takes no responsibility for the accuracy whatsoever. In programming we call it "undefined behavior". The very point of "undefined behavior" is that it cannot be defined nor predicted. – Agent_L Feb 24 '18 at 09:23

Do I have to multiply "1 year accuracy" by 10?

Well, if you could use it without needing calibration, it's not strictly a case of multiplying by ten, because it compounds like the interest a bank might charge.

So if it drifts +1% per year, over ten years you get \$(1.01)^{10} - 1 \approx 10.46\%\$.

That doesn't sound too bad, and for tighter tolerances you can certainly approximate it by multiplying by ten.
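
A minimal sketch of that compounding (Python; the +1% per year rate is purely illustrative, far coarser than the real specs quoted above, and real drift is not guaranteed to be constant):

```python
# Compounding a hypothetical constant drift rate vs. simple multiplication.
rate = 0.01   # +1% per year (illustrative only)
years = 10

compounded = (1 + rate) ** years - 1   # ~0.1046 -> 10.46%
linear = rate * years                  # 0.10    -> 10%
print(f"compounded: {compounded:.2%}, linear: {linear:.2%}")
```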

But you do need regular calibrations for this type of equipment, else what is the point of using it?

Andy aka
  • Well, a hobbyist is not going to calibrate every year. Possibly never. That is the point of my question. Your calculation means you expect that aging is linear. – Chupacabras Feb 23 '18 at 15:59
  • My answer only explains that the theoretical method is compounding rather than multiplying. – Andy aka Feb 23 '18 at 16:04

The simple legal answer is that they owe you this accuracy for a year. If the meter fails to meet it within a year, you have a warranty claim. After a year (absent another specification) you are on your own.

The extreme engineering approach would be for the manufacturer to require drift specs from every vendor and do an error analysis that supports the claim. You can guess as well as I can whether they have done that. After a year they have made no promise. Maybe there is a drift proportional to \$t^2\$ or a higher power, so things go to pot shortly after one year. In that extreme theory, even frequent calibration will not solve the problem.

Practically, shorting the leads together will detect offset errors, but it won't help with gain errors. We might measure 1.456 volts at one point and 1.358 at another. Sometimes what we care about is just that the first is higher than the second. In practice, any time I got that from a meter I would count on the ordering, but I wouldn't count on the difference being 0.098 volts. Usually the ordering is the important fact, not the exact difference. Relative values are much easier than absolute ones. If you need absolute values, you need to calibrate often and do careful error analysis. Otherwise you need to develop the skill of understanding what you know and what you don't. In practice a 10-year-old meter is very useful, but you can't justify that from the specs.
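
To illustrate why the shorted-leads check is only a partial test, here is a toy meter model (Python; the gain and offset numbers are invented): a reading is roughly true × (1 + gain error) + offset, and shorting the leads sets the true value to zero, so only the offset shows up:

```python
# Toy meter model: shorted leads expose offset error but not gain error.
def meter(true_volts, gain_err=0.002, offset_volts=0.0001):
    return true_volts * (1 + gain_err) + offset_volts

print(meter(0.000))                 # 0.0001    -> shorted leads reveal the offset
print(meter(1.456))                 # ~1.459012 -> gain error stays hidden
print(meter(1.456) > meter(1.358))  # True      -> the ordering is still trustworthy
print(meter(1.456) - meter(1.358))  # ~0.098196 -> but not the exact difference
```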

Ross Millikan

Assume you bought the meter 10 years ago, or calibrated it 10 years ago. Then you got a year of measurements within spec, or the meter was broken.

After 1 year and 1 day? The manufacturer makes no claim. If you want to claim a spec and can support it, go ahead. But it's on you.

IF you measure the same thing with the same meter for 10 years, without re-calibrating, and then re-calibrate and measure again, then you've got a one-point study of long term drift. Don't forget to include the long term drift of whatever you're measuring. You could look at the data in the uncalibrated interval and draw conclusions about it. But that's your calibration, in the interval, not the manufacturer's calibration.

Re-calibrate the meter after 10 years and measurements will be within spec for one year, again. Drift over the 10-year period isn't an issue. If you measured with a calibrated meter 10 years ago and measure with a calibrated meter today, each measurement will be within spec, and it's LIKELY that any difference between the measurements is less than the maximum allowed errors in opposite directions. But the maximum allowed errors, in opposite directions, are the worst possible case.
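
Put as arithmetic (a sketch; the ±6 µV bound is an invented example): if each calibrated measurement is individually within ±e of the truth, the errors can land in opposite directions, so the difference between the two measurements can be off by as much as 2e:

```python
# Two in-spec readings can err in opposite directions, so the worst-case
# error of their *difference* is twice the per-reading bound.
e = 6.0e-6                       # per-reading worst-case error in volts (invented)
worst_case_diff_error = 2 * e    # 1.2e-5 V
print(f"{worst_case_diff_error * 1e6:.1f} uV")  # 12.0 uV
```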

IF you use a meter that was calibrated once, for 10 years, without re-calibrating, measuring various values, then you've got 9 years of data from an uncalibrated meter. It may be better than random numbers. To know how much better, you need to measure references to establish accuracy now, or re-calibrate and repeat prior measurements to characterize repeatability, allowing for source drift. Either way, accuracy in the uncalibrated interval is on your shoulders.

The specifications quoted are really good. If you expect to realize that performance, you have to maintain calibration. If you want a hobbyist quick check, short the two inputs together. That had better be 0.0000 volts, 0.0000 amps and 0.0000 ohms. Beyond that, you need a voltage reference, a current reference and a resistance reference. A low-drift resistor is not an unreasonable thing for a lab, but why not just get the meter calibrated, or learn to calibrate it yourself, at that point, before you start shopping for voltage and current standards that are 2-10 times better than the meter spec? They aren't cheap, and they have calibration requirements themselves!

Bill IV
  • Many (I would assume most or all) multimeters would display the probe resistance if you short the probes together, though, so a 0.0000 ohm reading isn't to be expected. – exscape Feb 27 '18 at 19:54
  • My experience is that they read 0.00, 0.000 or 0.0000, although the OP mentions instruments much nicer than I typically use. So I have to question "most or all". "Probe resistance" is real, but small, as is the microvolt EMF from dissimilar metals. But the $10 multimeter I keep in my desk drawer reads 00.0 on its 200 mV range, 000 on the 2 V range, 0.00 on the 20 V range, etc. I encourage you to test your assumption on actual hardware and report your results. – Bill IV Feb 28 '18 at 21:27
  • I was referring to resistance, not voltage. My multimeter shows 0.13-0.14 ohms unless you use the relative/null function, depending on how you hold the probe tips. – exscape Mar 01 '18 at 06:45
  • Well, I took my own good advice and discovered I was overgeneralizing too. My desk-drawer meter shows about 1.0 Ohm, and it bounces around a lot depending on how hard one presses the probe tips together. My $35 all singing and dancing meter is more stable and shows lower resistance, but not nothing. – Bill IV Mar 05 '18 at 12:49