
A recent question on here asking about how to calculate the accuracy of a circuit got me thinking about calibration in general.

In particular, as EEs we routinely use volts and amperes as units, and yet both are rather vague and hard things to quantify.

It used to be that a volt was defined by a "standard cell" kept locked up in a vault somewhere. That changed to the "Josephson voltage standard": a complex system that uses a superconducting integrated circuit chip, operating at 70–96 GHz, to generate stable voltages that depend only on an applied frequency and fundamental constants.

The latter is not exactly something one could throw together in a basement, or even in the test engineering department of most companies.
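
For the curious, though, the underlying physics is simple even if the engineering isn't: each junction on such a chip, driven at a microwave frequency f, produces quantized voltage steps V = n·f/K_J, where K_J = 2e/h is the Josephson constant. Here is a minimal back-of-envelope sketch, using the values of e and h that the redefined SI fixes exactly (the drive frequency and junction count are illustrative guesses, not a real chip design):

```python
# Back-of-envelope: quantized voltage steps from a Josephson junction.
e = 1.602176634e-19    # elementary charge, C (exact in the redefined SI)
h = 6.62607015e-34     # Planck constant, J*s (exact in the redefined SI)

K_J = 2 * e / h        # Josephson constant, Hz per volt

f = 75e9               # drive frequency, Hz (inside the 70-96 GHz band)
n = 1                  # step number

v_step = n * f / K_J   # voltage of a single junction on step n
print(f"K_J = {K_J / 1e9:.4f} GHz/V")                        # ~483597.8 GHz/V
print(f"one junction at 75 GHz: {v_step * 1e6:.2f} uV")      # ~155 uV
print(f"65000 junctions in series: {65000 * v_step:.3f} V")  # ~10 V
```

Real standards chain large series arrays of junctions (tens of thousands or more) to reach the 1 V or 10 V levels that calibration labs actually use.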

The Ampere is worse. It is defined in SI as "That constant current which, if maintained in two straight parallel conductors of infinite length, of negligible circular cross-section, and placed one metre apart in vacuum, would produce between these conductors a force equal to 2×10⁻⁷ newtons per metre of length."

I have NO IDEA how anyone could measure that.
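
In fairness, the arithmetic in the definition does check out on paper: it effectively fixes the magnetic constant μ₀ at exactly 4π×10⁻⁷ N/A², and the quoted force follows from Ampère's force law, F/L = μ₀I₁I₂/(2πd). A quick sanity check of where the 2×10⁻⁷ comes from:

```python
import math

# Sanity check of the force-based SI definition of the ampere: the force
# per metre between two (idealized) infinite parallel wires carrying
# current I, spaced d apart, is F/L = mu0 * I^2 / (2 * pi * d).
mu0 = 4 * math.pi * 1e-7   # N/A^2, exact by definition under this ampere

I = 1.0                    # current in each wire, amperes
d = 1.0                    # spacing between the wires, metres

f_per_m = mu0 * I**2 / (2 * math.pi * d)
print(f_per_m)             # ~2e-07 N/m, the figure quoted in the definition
```

It's realizing those idealized infinite wires with buildable hardware that makes the definition so impractical to measure directly.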

The ohm used to be defined by a column of mercury of specified length and mass, but that was abandoned in favor of making it a unit derived from the volt and the ampere.
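
As an aside, national labs nowadays also realize the ohm directly, via the quantum Hall effect: its resistance plateaus depend only on the von Klitzing constant R_K = h/e², much as the Josephson volt depends only on frequency and 2e/h. A quick check of that value, again using the exactly fixed constants:

```python
# Quantum Hall resistance plateaus occur at R = R_K / i for integer i,
# where R_K = h / e^2 is the von Klitzing constant.
h = 6.62607015e-34     # Planck constant, J*s (exact in the redefined SI)
e = 1.602176634e-19    # elementary charge, C (exact in the redefined SI)

R_K = h / e**2
print(f"R_K = {R_K:.3f} ohm")   # ~25812.807 ohm
```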

This all brings me to wonder how much of what we use is calibrated to someone else's meter. And how many of those meters are calibrated to yet someone else's... and on and on. Seems like one big house of cards.

Is there some sort of intermediate standard of measures or equipment you can buy to use as calibrated references for 1V, 1A, and 1R? (Obviously you only need any two of those.)

Bonus question: Is there some sort of certification sticker one should look for when buying a meter or other equipment that indicates it is indeed tested to the actual SI values vs tested against, say, a Fluke?

Trevor_G

  • The Ampere definition is not vague at all, quite the opposite. But yes, it is impossible to measure in practice. All in all an interesting question. Metrology in general has me scratching my head more often than not. – Dampmaskin Apr 03 '17 at 14:32
  • I think the Ampere definition you quote is actually in the process of being replaced with simply the current that is so-and-so-many elementary charges per second (i.e. counting electrons). Not that this makes things a lot easier... – Marcus Müller Apr 03 '17 at 14:33
  • @MarcusMüller you are right, there is a proposal for that. Though it really just adds the "OK, so how do we do that instead?" factor, which is probably why it keeps getting shelved. – Trevor_G Apr 03 '17 at 14:35
  • @Trevor hm, for very small currents, one can actually count the individual electrons as "events", so that's not all that impossible to do in a lab, but might be a bit hard to scale up to be used as an actual reference (your "you only need 2 of 3" breaks down if your references are orders of magnitudes away from what you want to calibrate, and there's no linear way of extrapolating over that range). – Marcus Müller Apr 03 '17 at 14:37
  • @MarcusMüller, true enough, but then that makes the current definition of 1R a moving target... Jenga, anyone? LOL – Trevor_G Apr 03 '17 at 14:39
  • If you get a (set of) 100mΩ or 1Ω reference, that would actually be something that can be stacked very linearly to get integer multiples :) I'm really mostly worried about multiplying currents, because I've never seen a transistor with a current gain of *exactly* x (and my guess is these things are going to be sensitive and don't want to be put in parallel). – Marcus Müller Apr 03 '17 at 14:41
  • @MarcusMüller, did you happen to see the number for 1A... "6.2415093×10^18 charges"? I'm guessing that weird number is to bring it into line with what we currently call 1A (see the sketch just after these comments). – Trevor_G Apr 03 '17 at 14:42
  • So, back to your question: I feel like I'm not completely sure what you're asking for. Examples of voltage, resistance or current references that one can acquire? Info on what factory calibration of calibration equipment looks like? – Marcus Müller Apr 03 '17 at 14:47
  • @MarcusMüller, can you buy a box that has a sticker on it that says this device is certified under SI#XXXX to produce 1V ±N.Ne-N%? And is there a certification level to look for when buying regular equipment? – Trevor_G Apr 03 '17 at 14:52
  • Direct implementation of "The Definition" is probably too difficult and excessive for most applications, so various "good enough" implementations of reference devices are available, e.g. http://nvlpubs.nist.gov/nistpubs/Legacy/TN/nbstechnicalnote1239.pdf. Also, if you pay more money you can shorten the "chain of calibration": instead of using a Fluke, you use what Fluke uses. – user3528438 Apr 03 '17 at 15:09
  • @user3528438, unfortunately that document (1987) became outdated a year later (1988) when the Josephson model was adopted. Some good points all the same though. – Trevor_G Apr 03 '17 at 15:14
  • @MarcusMüller: Interesting page at NIST on measuring the newly defined ampere: https://www.nist.gov/news-events/news/2016/08/counting-down-new-ampere – Peter Smith Apr 03 '17 at 15:24
  • @MarcusMüller, nice find. Fascinating stuff huh. – Trevor_G Apr 03 '17 at 15:36
  • @Dampmaskin, anything with the words "infinite" and "negligible" in it is vague in my book ;) – Trevor_G Apr 03 '17 at 19:12
  • Related: http://electronicdesign.com/test-amp-measurement/whats-all-femtoampere-stuff-anyhow ; a smart engineer can build amazing things from first principles. – pjc50 Apr 03 '17 at 19:46
  • Just remember the old axiom: If it works it's a Fluke. – Hot Licks Apr 03 '17 at 21:28
  • *snort* @HotLicks – Trevor_G Apr 03 '17 at 21:29
  • You should also remember that calibration is always a thing between two points/measurements in time. For applications like frequency, it's possible to get primary standards that are good and cheap enough for most labs (e.g. rubidium clocks). We can measure frequency really well, which is why it forms part of the basis for many other units, like the volt, which will become reachable at lower cost once semiconductor-based Josephson junctions are perfected. For the rest it just gets harder and harder, so nobody really uses primary or even secondary references, due to their cost. – PlasmaHH Apr 04 '17 at 09:25
  • One foot of #10 AWG = 1 milliOhm. Comes in handy for making shunts to read currents of 1 - 100 amps. Assuming the wire was made correctly, and you can chop off one foot accurately, and bond to it with a low resistance connection... In practice, there are so many confounding factors that "good enough is good enough." I once heard a joke that said: if you have one clock you know what time it is, but if you have two or more you are never sure. –  Jun 05 '17 at 16:34
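
To put numbers on the electron-counting definition discussed in the comments above: once the elementary charge is fixed exactly (as the SI redefinition adopted in 2019 does), one ampere is simply 1/e elementary charges per second, which is where the "weird number" quoted in the comments comes from:

```python
# The redefined SI fixes the elementary charge exactly; the ampere then
# becomes "this many electrons per second". The odd-looking number is just
# 1/e, chosen so the new ampere matches the old force-based one closely.
e = 1.602176634e-19              # C, exact by definition since 2019

electrons_per_second = 1.0 / e   # elementary charges per second in 1 A
print(f"{electrons_per_second:.7e} charges/s")   # ~6.2415091e+18
```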

3 Answers


This all brings me to wonder how much of what we use is calibrated to someone else's meter. And how many of those meters are calibrated to yet someone else's... and on and on. Seems like one big house of cards.

and...

Is there some sort of certification sticker one should look for when buying a meter or other equipment that indicates it is indeed tested to the actual SI values vs tested against, say, a Fluke?

You have described precisely what happens. You don't need an exotic, expensive "gold standard" in your company's in-house lab, as long as you have certified calibration traceability (if you need it) back to an accredited lab that actually has one.

And yes, they actually put a sticker on your instrument, issued by the accredited lab, with the expiry date of the calibration printed on it. I've seen it myself.

In-house, you'll find yourself in one of these situations:

Traceability chain of calibrations

I recall that when I was working in the aerospace industry, we were required to have all measurement instruments calibrated, each with its sticker, traceable calibration certificate, and associated documentation. The test procedures ("work standards"), the instruments to be used, and their calibration traceability all had to be exhaustively documented and submitted for customer approval well before any actual delivery testing was performed on the product.

Of course, every industry has its own requirements and quality levels. I don't think anyone can reasonably expect that a Chinese manufacturer of cheap multimeters has such a calibration program in place, because it wouldn't make sense to them or to their customers.

Back to the traceability chain: the NMIs in the figure above are the National Metrology Institutes. The NMI for the USA is NIST. A closer look at the metrology "food chain" is illustrated in the figure below, in case you're wondering:

Structure of traceability of calibrations

Source of images: Calibration and traceability in measuring technology

Enric Blanco
  • This begs another question: when examining a component that includes an internal voltage or current reference, or really ANY of its electrical specifications, should one expect said specification to be NIST certified? – Trevor_G Apr 03 '17 at 15:40
  • One should expect that a **legitimate** (non-counterfeit) semiconductor manufacturer has a decent calibration program in place for their instrument pool (probably involving NIST certification and good in-house testing procedures / work standards), and that **as a result of that** they can stand behind the specifications of their products (voltage accuracy, etc.). However, that doesn't mean they can claim that their product specifications are NIST-certified! – Enric Blanco Apr 03 '17 at 15:49
  • @Trevor, if you buy your instruments from Keysight or Tek or other high quality brands, they will offer a NMI-traceable calibration certificate as a cost-added option. If you have your instruments calibrated at a good cal lab, they will offer a similar certificate. – The Photon Apr 03 '17 at 15:57
  • @ThePhoton Indeed. Furthermore, and depending on the country, Keysight and the like are accredited labs themselves that can offer a full calibration program for ALL your in-house instruments regardless of the brand. However, they're not the cheapest lab around :) although their calibration work is truly excellent. – Enric Blanco Apr 03 '17 at 16:05
  • Now, in an exercise in insanity, check out the Wikipedia page on the definition of the [kilogram](https://en.wikipedia.org/wiki/Kilogram), which is the ultimate in examples of why we rely on chains of certifications. The actual kilogram, the IPK stored in France, is rarely brought out because it is too important. It also seems to be drifting from its copies by a few micrograms... – Cort Ammon Apr 03 '17 at 16:51
  • @CortAmmon this is why the US system is better: a pint of water is one pound, and an inch is exactly 3 barleycorns long. (It is an *integer*, you can't do better than that!) If only all of our standards were that sane. –  Jun 05 '17 at 16:37
  • @nocomprende I would love to see a discussion of the International Kilogram Prototype right next to a discussion of the US standard length barleycorns kept in a safe in Ft. Knox (all 3 of them!) – Cort Ammon Jun 05 '17 at 16:50
  • @CortAmmon but that is the great thing about it: *any* 3 barleycorns will do. That is part of the standard. Why choose a standard where the requirements are so hard to meet and which have a single point of failure? It is like how the Interstate Highway System was designed to have at least one straight mile of road every 5 miles, to function as a runway in wartime. It would be completely impossible to destroy all the available runway space in that scenario. Standards should be set to be impossible to break, like the speed of light: "*It's not just a good idea, it's the law!*" –  Jun 05 '17 at 16:55
  • @nocomprende Such a great plan. In case the enemy attacks us, we ensure they have a free runway every 5 miles! Fortunately, the enemy is using km, so that should confuse them! – Cort Ammon Jun 05 '17 at 18:49
  • @CortAmmon yeah, they would get lost on the highways, not be able to figure out the "inner" and "outer" beltways and confused by the gas prices. If that didn't stop them, they would probably get lost in Wal-Mart. But US people have no problem with those things, because we are just *better*. –  Jun 05 '17 at 19:14

Is there some sort of certification sticker one should look for when buying a meter or other equipment that indicates it is indeed tested to the actual SI values

If you buy your instruments from Keysight or Tek or other high-quality brands, they will offer an NMI-traceable calibration certificate as a cost-added option. If you have your periodic calibrations done at a high-quality lab, they will offer a similar certificate.

If you don't order this certificate, your instrument will likely still be tested with the same procedures, but the vendor won't take on the liability of keeping records of that test and its results.

I know that in my industry, our customers do regularly audit our manufacturing site and check that all instruments used on our test line have traceable calibration certificates.

vs tested against, say, a Fluke?

This could still be a traceable calibration, if the Fluke has a valid certificate. Of course each step away from the primary standard increases the possible errors propagated from measurement to measurement, reducing the accuracy you'd be able to claim for the instrument being tested.
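
As a rough illustration of that error growth: assuming each calibration step contributes an independent relative uncertainty, and that these combine in quadrature (root-sum-square), a traceability chain might look like the sketch below. Every number in it is made up purely for illustration:

```python
import math

# Illustrative only: how 1-sigma relative uncertainties might accumulate
# down a traceability chain, assuming independent errors combined in
# quadrature (RSS). The labels and values are invented for the sketch.
chain = [
    ("Josephson array (primary)",   1e-9),
    ("NMI working standard",        5e-8),
    ("Accredited cal-lab standard", 2e-6),
    ("Bench calibrator (a Fluke)",  1e-5),
    ("Your DMM",                    5e-5),
]

total = 0.0
for name, u in chain:
    total = math.hypot(total, u)   # RSS accumulation down the chain
    print(f"{name:30s} step u = {u:.0e}   cumulative u = {total:.1e}")
```

Note how the last links dominate: a bench meter's own error swamps everything above it in the chain, which is why short chains mostly matter when your required accuracy approaches that of your instrument.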

This all brings me to wonder how much of what we use is calibrated to someone else's meter. And how many of those meters are calibrated to yet someone else's... and on and on. Seems like one big house of cards.

At the top of the metrology heap are the reference standards owned by the national metrology institutes. And these are indeed mainly tested only by comparison with each other. For example, NIST verifies the accuracy of its atomic clocks by comparing them with the atomic clocks maintained by the UK's National Physical Laboratory, France's LNE-SYRTE, China's National Institute of Metrology, and so on. This is because when you're making the most accurate instrument in the world for some measurement, the only thing there is to compare it with is other people's attempts to produce the same thing.

The Photon

The label you want is "traceable to NIST" (in the USA, at least).

The National Institute of Standards and Technology maintains primary standards, and all other standards (at calibration labs, etc.) are periodically checked against them, either directly or indirectly. If you care about the absolute accuracy of your instruments, you will have documentation that describes all of the steps by which your calibrations can be traced all the way back to NIST. This would include what transfer standards or instruments were used, along with how long ago each one was checked against the next higher standard up the chain.

Dave Tweed