25

Something that has always puzzled me is how we know the accuracy of voltmeters. From googling around, it seems that today's digital voltmeters use an ADC, which works by comparing the voltage to be measured against a highly precise known voltage reference. But how is the voltage of that highly precise reference itself known? It seems like a chicken-and-egg problem: knowing the accuracy of the voltmeter's ADC reference voltage relies on the accuracy of another voltmeter using another reference voltage, which relies on yet another voltmeter and reference, and so on, infinitely.

From looking at the Wikipedia page for a voltmeter, it seems that something such as a Weston cell, which uses a reproducible and stable chemical reaction, is used as the ultimate reference voltage for calibration. But that still falls prey to the chicken-and-egg dilemma: how do we know the voltage of the Weston cell without again using a voltmeter?

Peter Cordes
John Allen
  • Ultimately there's a whole field devoted to deciding which physical units we consider fundamental and which derived, e.g. SI, MKS, CGS, etc. It's mostly not about metal bars in guarded rooms anymore... – Chris Stratton Dec 03 '20 at 01:52
  • A multimeter should be calibrated against a voltage standard that can be traced to NIST. – S.s. Dec 03 '20 at 02:14
  • This question applies to ALL measurements we make, not just voltmeters: length, temperature, mass, time, brightness. Do some googling into metrology. And I'm just sitting here trying to figure out how to make sure my square is square and my straightedge is straight without needing a squarer square or a straighter straightedge. – DKNguyen Dec 03 '20 at 05:15
  • No need for a voltmeter. There must be some mathematical equation to calculate the EMF produced by the chemical reaction, as in a voltaic cell. Trust the maths. But ultimately you have to trust something and use it as a reference. – Mitu Raj Dec 03 '20 at 07:56
  • @DKNguyen yes. How do you make the first flat reference surface? The answer is triple lapping, I guess. Once you can make something straight, you can also easily make something square using the 3-4-5 triangle. – user57037 Dec 03 '20 at 08:24
  • @S.s. ideally, yes, but in many (even most) cases that's excessive – Chris H Dec 03 '20 at 09:58
  • A Weston standard cell has the advantage of being affordable, and you could make one yourself if keen enough. You can buy (usually) very olde Eppley versions on eBay for US$50 or so - or much more if you wish. [Here](https://www.eevblog.com/forum/metrology/weston-standard-cell/) is a page with some excellent related links. || Note that a saturated cell maintains its voltage with time, while an unsaturated cell loses voltage at a relatively consistent rate BUT has a usefully lower temperature coefficient. – Russell McMahon Dec 03 '20 at 10:56
  • Of course you don't know the voltage of the reference - you _define_ it. The numbers are ultimately arbitrary - even the sign is just a convention. The only important thing is to make sure everyone uses the same reference and scale. The number itself doesn't have any meaning. – Luaan Dec 03 '20 at 18:05
  • Something I read somewhere: If you have one multimeter, you know what the voltage is. If you have more than one multimeter, you are never quite sure. – Andrew Morton Dec 03 '20 at 21:42
  • You do realize that nearly all units are fundamentally arbitrary and subjective in nature? – Eric Brown - Cal Dec 03 '20 at 22:03
  • Welcome to the rabbit hole of instrument calibration. No affiliation, but [Fluke](https://us.flukecal.com/literature/about-calibration) has a good page on it. – J... Dec 03 '20 at 23:51

7 Answers

56

These days, you build a primary voltage standard from a bunch of Josephson junctions and a microwave source. That generates a voltage that depends only on the defining constants of the International System of Units (SI).
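To make that concrete, here's a minimal sketch (in Python, with hypothetical array parameters) of the relation such a standard exploits: each junction driven at microwave frequency \$f\$ develops voltage steps of exactly \$f/K_J\$, where the Josephson constant \$K_J = 2e/h\$ is fixed because \$e\$ and \$h\$ are exact in the 2019 SI:

```python
# Sketch: output of a Josephson-junction array voltage standard.
# Since the 2019 SI redefinition, e and h are exact constants, so the
# Josephson constant K_J = 2e/h is exact too; the only measured input
# is the microwave frequency, which traces back to the cesium second.
e = 1.602176634e-19      # elementary charge, C (exact)
h = 6.62607015e-34       # Planck constant, J*s (exact)
K_J = 2 * e / h          # ~4.8360e14 Hz/V

def array_voltage(n_junctions, f_hz, step=1):
    """Total voltage of n identical junctions, each on quantum step 'step'."""
    return n_junctions * step * f_hz / K_J

# Hypothetical numbers: ~242,000 junctions driven at 20 GHz give ~10 V.
print(array_voltage(241_800, 20e9))  # -> about 10.000 V
```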

As a more economical alternative, you send your voltmeter to a lab that compares it against a voltage standard that's traceable back to a primary voltage standard. In the US, that primary voltage standard is probably at NIST.

Basically, every physical quantity can be mapped back to a physical constant that is defined, rather than measured. Seven of 'em (read the Wikipedia article) are base units; the rest are derived. The volt, in particular, is defined as the amount of electromotive force necessary to impart exactly one joule of energy to one coulomb of charge. In SI base units, \$\mathrm{1\,V = \frac{kg\cdot m^2}{A \cdot s^3}}\$ (since \$\mathrm{1\,J = kg\cdot m^2/s^2}\$ and \$\mathrm{1\,C = A\cdot s}\$). So just build any old dingus that lets you generate a volt as long as you know what those four quantities are, and you're done!

As of May 20, 2019, all of these base units can, in theory, be reconstructed from first principles (i.e., the second is defined by a fixed number of oscillations of the radiation from the cesium-133 hyperfine transition, the meter is defined from the second and the speed of light, etc.). Ultimately all you need is a one-page reference guide, an astonishingly deep understanding of physics and metrology, and a staggeringly large gift certificate for a whole lot of lab time.

TimWescott
  • "Astonishingly deep understanding of physics", yup, that's the one I'm tellin' my kidz :D – mishan Dec 04 '20 at 10:30
26

> From looking at the Wikipedia page for a voltmeter, it seems that something such as a Weston cell, which uses a reproducible and stable chemical reaction, is used as the ultimate reference voltage for calibration. But that still falls prey to the chicken-and-egg dilemma: how do we know the voltage of the Weston cell without again using a voltmeter?

Back in the day when a Weston cell was used as a primary reference, we didn't need to know what the voltage was; we *defined* the voltage of a Weston cell, under certain physical conditions (temperature, and whether it is saturated or not), as exactly 1.018638 V plus corrections. This was the case from when the definition was adopted in 1911 until it was superseded by the Josephson junction standard in 1990.
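For flavor, the temperature correction that went with that definition is a simple polynomial in \$(t - 20\,^{\circ}C)\$; a small sketch, using the correction coefficients as commonly quoted for the saturated cell (treat the exact figures as illustrative):

```python
# Sketch: EMF of a saturated Weston cell vs. temperature, using the
# classic international correction polynomial (coefficients as commonly
# quoted in the literature; treat the exact figures as illustrative).
def weston_emf(t_c, e20=1.018638):
    """EMF in volts at t_c degrees Celsius, relative to the 20 degC value."""
    dt = t_c - 20.0
    return e20 - 4.06e-5 * dt - 9.5e-7 * dt**2 + 1e-8 * dt**3

print(weston_emf(20.0))  # 1.018638 V, the defined value
print(weston_emf(25.0))  # ~1.018412 V: roughly the -40 uV/K tempco at work
```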

To guard against one primary standard breaking or misbehaving, each major international laboratory maintains a whole bunch of these things (an ensemble), comparing them with one another and taking the average as the true reading. If any particular cell starts reading much higher or lower than the rest, it is removed from the ensemble. When a new cell is brought online, it's not added to the ensemble until it has demonstrated a long period of good behaviour. From time to time, a travelling standard is taken from country to country so national laboratories can compare their standards against each other.
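The bookkeeping behind such an ensemble is straightforward; here's a toy sketch (readings and the rejection threshold are made up) of the basic idea: average the bank, flag any member that strays too far from the mean, and recompute without it.

```python
# Sketch of ensemble maintenance: average a bank of standard cells and
# drop any member that deviates too far from the group mean.
# Readings and the rejection threshold are hypothetical.
def ensemble_value(readings, reject_uv=5.0):
    mean = sum(readings) / len(readings)
    kept = [r for r in readings if abs(r - mean) * 1e6 <= reject_uv]
    return sum(kept) / len(kept), [r for r in readings if r not in kept]

cells = [1.0186381, 1.0186379, 1.0186384, 1.0186519]  # last cell is drifting
value, rejected = ensemble_value(cells)
print(f"ensemble: {value:.7f} V, rejected: {rejected}")
```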

Commercial calibration labs check their standards against international ones. Manufacturers check their internal standards against commercial calibration labs. Manufacturers measure their products before they arrive with you to make sure they're within their specification. So your humble DMM is several steps down the chain of accuracy. But there is a defined chain.

Neil_UK
    "we didn't need to know what the voltage was, we **define** the voltage of a Weston cell" -- this reminds me of Veritasium's video on the kilogram standard. They periodically reweigh the "second degree" standards (not sure what to call them) against the original standard and they no longer weigh the same. I remember Derek asking something like which one weighs different now -- but that's the exact problem they explained. The kilogram standard ***is the kilogram.*** In theory if you cut part of it would still "weigh the same" (obviously they would correct for it, but that's the idea). – Captain Man Dec 04 '20 at 21:47
  • @CaptainMan - What you state about the kilogram was true until they changed the definition last year. – Justin Dec 04 '20 at 21:53
  • @Justin yes -- that's true. This video was made before that and they've done videos on the new standard as well. Original one I mentioned for the curious: https://youtu.be/SmSJXC6_qQ8?t=409 – Captain Man Dec 07 '20 at 19:06
12

Specific to a digital multimeter (DMM): the type of analog-to-digital converter (ADC) used inside a typical DMM is called a dual-slope integrating ADC. This technology has been around since the 1970s; take a look at the Intersil ICL7106. I previously wrote about how this device works here.
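In brief, a dual-slope converter integrates the unknown input for a fixed number of clock counts, then integrates the reference (of opposite polarity) and counts how long it takes to ramp back to zero; the count ratio equals \$V_{in}/V_{ref}\$, so the integrator's RC value and the clock frequency cancel out. A minimal idealized simulation (values hypothetical):

```python
# Sketch: idealized dual-slope integrating ADC conversion.
# Phase 1: integrate v_in for a fixed n_fixed clock counts.
# Phase 2: de-integrate with v_ref and count clocks back to zero.
# The RC value and clock frequency appear in both phases and cancel.
def dual_slope(v_in, v_ref=0.2, n_fixed=1000, f_clk=1e5, rc=1e-2):
    t1 = n_fixed / f_clk              # fixed integration time, s
    v_peak = v_in * t1 / rc           # integrator output after phase 1
    t2 = v_peak * rc / v_ref          # time to de-integrate back to zero
    counts = round(t2 * f_clk)        # what the counter registers
    return counts * v_ref / n_fixed   # displayed reading, volts

print(dual_slope(0.1234))  # -> 0.1234, independent of rc and f_clk
```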

But about your question, which is basically how we can trust that the numbers reported by the DMM are accurate...

An instrument manufacturer such as Fluke will publish a user manual which describes how to use the instrument, and defines how accurate the instrument can be (when it is properly calibrated). Separately, they also publish a service manual for use by third-party calibration service providers, detailing exactly what instruments and calibration standards are required, and exactly what procedures must be used, to achieve the designed performance of the Unit Under Test.

I can't find the URL at the moment, but here's an excerpt from a service manual that I had handy, just to show an example of the type of information provided to a company that would perform the calibration service:

Excerpt of the Fluke 724/725/726 calibration manual

It goes on like this for quite a while, with step-by-step instructions on what plugs in where and which buttons to press. There are also specifications for letting the equipment 'soak' at a specified temperature range prior to calibration (to avoid temperature-dependent inaccuracy).

Note that in this example, even when the instrument is given a best-possible input reference of 30.000 V, it is only expected to display a number in the range of 29.992 V to 30.008 V; any reported value in that range is considered close enough.
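Translating that band into a relative spec is simple arithmetic; a quick sketch using the numbers from the excerpt above:

```python
# Sketch: the 30 V test point from the excerpt above as a pass/fail
# check, and the width of that band expressed in ppm of nominal.
def in_tolerance(reading, lo=29.992, hi=30.008):
    return lo <= reading <= hi

half_band_ppm = (30.008 - 30.000) / 30.000 * 1e6
print(in_tolerance(29.995), f"(+/- {half_band_ppm:.0f} ppm of nominal)")
# -> True (+/- 267 ppm of nominal)
```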

Each part of the instrument is calibrated in a certain order: first the basic 2 V measurement offset/gain/linearity, then the 200 mV and 20 V ranges, which depend on the 2 V measurement, and only then current measurement, which depends on voltage measurement across a known resistor. The procedure can be done manually if you have all the right gear, and if all of that gear has itself been recently calibrated so that it too is trustworthy.

The analog semiconductor company that I work for periodically sends our lab equipment out to a third-party calibration vendor who has all this certified calibrated-standard gear and runs through all the procedures for us. It only costs money... But my own personal DMMs, which are 'for indication only' and 'not calibrated', I don't bother sending out; I just accept whatever level of uncertainty the user manual says they're good for. So if my 3.3 V supply measures 3.29 V or 3.32 V, I don't worry about it; it's within the reported tolerance and probably right on.

There's an important principle in Statistical Process Control: trying to make small adjustments that are less than the system's standard deviation will actually make it less accurate than leaving it alone... this is why target shooters and archers always first try to get a tight cluster before adjusting their aim. The same goes for instrument calibration. Making a small adjustment to favor the 30.000 V test point will affect everything else, so the calibrator can only adjust within a certain range before it negatively impacts the overall accuracy of the system.
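That principle is easy to demonstrate numerically; the sketch below (a crude version of Deming's "funnel experiment", all numbers made up) shows that when the error is pure noise, "fixing" the full apparent error after every reading inflates the spread by about \$\sqrt{2}\$ compared with leaving the adjustment alone:

```python
import random

# Sketch: over-adjustment ("tampering"). If the process is pure noise
# around the target, correcting each reading by its full observed error
# makes the spread ~sqrt(2) worse than not adjusting at all.
random.seed(1)

def run(n=100_000, sigma=1.0, tamper=False):
    offset, outputs = 0.0, []
    for _ in range(n):
        y = offset + random.gauss(0.0, sigma)  # reading = setting + noise
        outputs.append(y)
        if tamper:
            offset -= y  # chase the full apparent error after every reading
    mean = sum(outputs) / n
    return (sum((v - mean) ** 2 for v in outputs) / n) ** 0.5

print(run(tamper=False))  # ~1.00
print(run(tamper=True))   # ~1.41 -- worse, despite all the "correcting"
```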

MarkU
6

There are two issues here, and even though multiple people have discussed them, I'll try to summarise them succinctly.

The first is that something like a voltmeter will have an internal reference voltage generated by a well-understood physical process, which by definition results in a known voltage with well-understood temperature (etc.) characteristics.

The second is that if you're doing any sort of precise work (quality control etc.) you will have your voltmeter calibrated on a regular basis. That doesn't necessarily mean that somebody gets inside it and tweaks anything, but it does mean that an external calibration voltage is applied to it and you're given a certificate saying what your voltmeter displayed.

And the voltage source used for external calibration is typically a tertiary reference, i.e. it has itself been calibrated against a secondary reference, which was itself calibrated directly against your jurisdiction's primary reference at the NPL, NIST or whatever lab holds it.
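Each hand-off in that chain contributes its own uncertainty; assuming the contributions are independent, they combine in quadrature (root-sum-square). A toy illustration with made-up numbers:

```python
# Sketch: uncertainty accumulated along a traceability chain, combined
# in quadrature under an independence assumption (all numbers made up).
chain_ppm = {
    "primary standard (NPL/NIST)": 0.01,
    "secondary reference": 1.0,
    "tertiary calibration source": 10.0,
    "bench voltmeter": 50.0,
}

total = sum(u ** 2 for u in chain_ppm.values()) ** 0.5
print(f"combined standard uncertainty: ~{total:.0f} ppm")
# -> ~51 ppm, dominated by the last (least accurate) link in the chain
```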

1

The general approach with all standards is to use some "reference source" that produces approximately the same voltage whenever it is built following the documentation. It may be expensive to build and may not last long, but this does not matter, as a number of more practical devices can be calibrated against it.

In other words, a voltmeter used for everyday tasks may be calibrated against a high-precision laboratory voltmeter, an expensive device used mostly for calibrating other devices. That one was probably calibrated using some chemical or other reference voltage source. There may have been more "intermediate generations" between these two voltmeters.

h22
0

The voltage of a battery is predictable from knowledge of chemistry; certain batteries, like the Weston cell (thanks @PeterMortensen, I'd forgotten the name of it!), are exceptionally good at maintaining a steady voltage, and these are used as voltage references.

This was, in fact, how the volt was defined prior to 1990. Nowadays, the volt is defined using an array of Josephson junctions, which I don't understand well enough to explain here.

From there, you can calibrate other physical voltage references that are more convenient, such as a Brokaw bandgap reference or a zener diode, which you can then use to make voltage regulators, voltage references, power supplies, and the like.
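For a flavor of how a bandgap reference lands on its value: a Brokaw cell sums a base-emitter voltage (which falls roughly 2 mV/K) with a gained-up PTAT difference \$\Delta V_{BE} = (kT/q)\ln N\$, choosing the gain so the two temperature slopes cancel; the sum comes out near the silicon bandgap, about 1.2 V. A first-order sketch (idealized textbook numbers; a real cell sets the gain with a resistor ratio):

```python
import math

# Sketch: first-order bandgap-reference arithmetic (idealized textbook
# numbers; vbe and its slope vary with process and bias in practice).
k, q = 1.380649e-23, 1.602176634e-19   # Boltzmann constant, elementary charge

def bandgap_out(t_kelvin=300.0, n=8, vbe=0.60, vbe_slope=-2e-3):
    vt = k * t_kelvin / q               # thermal voltage, ~25.85 mV at 300 K
    dvbe_slope = (k / q) * math.log(n)  # PTAT slope of delta-Vbe, V/K
    gain = -vbe_slope / dvbe_slope      # pick gain so the two slopes cancel
    return vbe + gain * vt * math.log(n)

print(bandgap_out())  # ~1.2 V, near the silicon bandgap voltage
```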

Hearth
  • Re *"certain batteries are exceptionally good at maintaining a steady voltage"*: What level of accuracy? In % or ppm. – Peter Mortensen Dec 04 '20 at 19:05
  • The [Weston cell](https://en.wikipedia.org/wiki/Weston_cell#Characteristics) seems to have an astonishing accuracy on the order of 1 ppm (though a 40 ppm / °C temperature coefficient) – Peter Mortensen Dec 04 '20 at 19:20
  • @PeterMortensen Thank you, I'd forgotten the name of the Weston cell! – Hearth Dec 04 '20 at 23:44
0

> how do we know the voltage of the Weston cell without again using a voltmeter?

You don't have to. You just write down what it is supposed to be, everyone agrees, and that's that. Totally arbitrary, but the idea is that everyone is informed of the agreed-upon value. Of course the Weston cell is obsolete now, but the reason Weston cells were used is that anyone who could build one well would automatically get a fairly accurate volt standard. They were used to define what the volt was, and were usable as such a definition because they were reproducible with nothing more than good lab technique: you didn't need a volt reference to build a Weston cell.

You might also have asked: how do we know that the ADC is linear, i.e. that a reading of 1/2 of the reference voltage isn't somehow off? This is solved by using reference voltage dividers, also known as Kelvin-Varley dividers (KVDs). They are, in a nutshell, a potentiometer, but with the wiper replaced by switches. The resistive elements in such dividers only need to maintain ratiometric tracking, i.e. their ratios must be maintained, while their absolute values can be off by a couple of percent with no undue effects. Note that we don't need to know what a ratio is, only whether it's 1:1 or not; if two elements are off, you tweak them until they are back to a 1:1 ratio.

To verify that, all you need is a null meter and a Wheatstone bridge: with those you can very accurately compare resistor values. Once you've got a bunch of resistors that you know all have the same value (no matter what that value exactly is, as long as it's the same), they can be used as the building blocks of a KVD and thus turned into a digitally adjustable voltage divider. Such a divider's output can then be used with a null meter to verify the ADC's performance: set the divider to 0.50000 with its input across the ADC's reference, feed its output to the ADC input, and see how close the ADC reading is to the mid-range value.
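As a toy model of that final midpoint check (all values hypothetical; a real check uses a physical KVD and a null meter rather than ideal arithmetic):

```python
# Sketch: checking ADC midscale with an ideal 0.50000 ratio divider
# fed from the ADC's own reference (all values hypothetical).
def adc_code(v_in, v_ref=5.0, bits=16):
    """Idealized ADC transfer function."""
    return round(v_in / v_ref * (2 ** bits - 1))

v_ref = 5.0
v_mid = v_ref * 0.50000             # divider set to exactly half ratio
error = adc_code(v_mid, v_ref) - (2 ** 16 - 1) / 2
print(f"midscale error: {error:+.1f} counts")
# +/-0.5 count quantization for an ideal ADC; a real converter's
# deviation at this point reveals its midscale nonlinearity.
```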