4

Possible Duplicate:
Why are deep cycle batteries rated in amp hours instead of watt hours?

I don't understand why it is so common for battery capacity specifications to be given in mA·h, when that alone gives you no clue how much energy the battery holds, how long it can power a given device, or how to compare batteries with different voltages (and try doing that when the voltage is conveniently omitted from the spec altogether).

Why doesn't everyone give capacity specs in W·h?


Similarly, although seemingly not nearly as common as with batteries, some devices likewise list their consumption only in mA, where again you have no clue how much power they would be drawing unless you happen to know what voltage they operate at.

Why isn't consumption always given in watts instead of milliamperes?


Isn't it error-prone to give capacity and consumption specs in units which cannot be directly compared without (a) knowing the voltage of both the battery and the to-be-powered device, and (b) having an expensive calculator or a piece of paper?

Why aren't watts universally used here, and what sense does it make to give mA or mA·h ratings without ever mentioning the voltage? (Apparently this happens all the time when you look at various specs online.)

cnst
  • It would behoove you to get some in-depth understanding of batteries. I can recommend [Battery University](http://batteryuniversity.com) web site. Doing some experimentation would behoove you as well. That could help stave off analysis paralysis. Finally, I don't see why you are complaining about the cost of paper and a calculator. After all, you seem to own a computer. – Nick Alexeev Jan 23 '13 at 21:09
  • -1 for attitude. You don't know why it's being done the way it is, but are sure it's "useless"!? Rants are off topic here and need to be closed. Asking why is fine, but passing judgement before understanding why is immature at best. Grow up. – Olin Lathrop Jan 23 '13 at 21:29
  • @OlinLathrop, I've edited the post to change the wording to make it more open-minded. Thanks for your comment. – cnst Jan 23 '13 at 21:53
  • No, the title is still quite judgemental. Do you want to rant, or truly ask for information? Pick one. Hint: rants are off topic and will be closed. Only two more close votes to go. Time is quickly running out as you are trying to play games instead of asking properly. – Olin Lathrop Jan 23 '13 at 22:50

3 Answers

2

If you assume that a battery or power supply is a perfect voltage source, then the two are equivalent: multiply capacity in milliamp-hours by the battery voltage to get capacity in milliwatt-hours, or multiply current by input voltage to get power in watts.
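
As a minimal sketch of that arithmetic (the capacity, voltage, and current figures below are made up purely for illustration):

```python
# Hypothetical figures, just to illustrate the unit conversions.
capacity_mah = 2000      # battery capacity in mA·h
battery_voltage = 3.7    # nominal cell voltage, volts

capacity_mwh = capacity_mah * battery_voltage   # 7400 mW·h
capacity_wh = capacity_mwh / 1000               # 7.4 W·h

current_ma = 150                                # device draw, mA
power_mw = current_ma * battery_voltage         # 555 mW

print(f"{capacity_wh:.1f} W·h, device draws {power_mw:.0f} mW")
```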

For many devices, the current (mA) will be the same regardless of input voltage. This is true of anything with a linear voltage regulator; current remains constant, but as input voltage goes up, power (W) increases because it is the product of current and voltage. The linear regulator simply converts excess voltage into heat.
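
A rough sketch of the resulting heat budget, assuming an ideal linear regulator and made-up voltages and load current:

```python
# Ideal linear regulator: the load current passes straight through,
# and the voltage difference is dropped as heat. Numbers are hypothetical.
v_in = 13.5     # supply voltage (e.g. a running car), volts
v_out = 10.0    # regulated output voltage, volts
i_load = 0.030  # load current, amperes (constant regardless of v_in)

p_total = v_in * i_load           # power drawn from the supply
p_load = v_out * i_load           # power delivered to the circuit
p_heat = (v_in - v_out) * i_load  # power dissipated in the regulator

print(f"drawn {p_total*1000:.0f} mW, used {p_load*1000:.0f} mW, "
      f"wasted {p_heat*1000:.0f} mW as heat")
```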

For batteries I don't have a good excuse. We could also give capacity in watt-seconds or joules (same thing, a unit of energy), but that would introduce other problems. I suppose a battery is a complex device, and the available energy depends on the load (high current? low current?), temperature, and many other factors. In the absence of a simple and accurate model that covers all cases, the convention has favored units that make common calculations convenient.

Phil Frost
  • So if the voltage is different, you can still do the calculations with amperes alone, without any regard to the difference in voltage? – cnst Jan 23 '13 at 20:20
  • @cnst which calculations? – Phil Frost Jan 23 '13 at 20:21
  • the calculations to get the unit of hours – cnst Jan 23 '13 at 20:22
  • @cnst I'm not sure what you mean. A specific example would help. – Phil Frost Jan 23 '13 at 20:24
  • Some device with the model INS-TK103B has the following specs: <>. If I connect it to the always-on circuitry on an average car, how many hours can I leave the car undriven for? (Apparently, this question now became http://electronics.stackexchange.com/questions/55925/specing-an-always-on-alarm-tracker-for-a-car-battery/55926#55926.) – cnst Jan 23 '13 at 20:35
  • @cnst Unless the manufacturer specifies exactly how current requirements change with input voltage, you will have to make some assumptions. In all cases, we aren't going to predict the runtime within the second. We are working with rough estimates here, and if you want to do better, you will have to beat up the manufacturer until they give you more detailed specifications. – Phil Frost Jan 23 '13 at 20:42
  • So, are you basically saying that one should do the calculations with amperes alone here, and it would be the most accurate estimate that is possible to obtain without further info? – cnst Jan 23 '13 at 20:46
  • @cnst yes, exactly. – Phil Frost Jan 23 '13 at 20:49
  • Well, but thinking further about this, car battery voltage is usually way above the actual 12V, and here they say that the device will even work with 10V; doesn't this imply, with the car voltage being 13 or 14V, that we will have nearly one-third of provided energy converted to heat, or, in other words, nearly one-half of energy consumed by the actual silicon converted to heat? This would sound like a lot of heat and wasted energy! Shouldn't some kind of 14V-to-10V ratio be used to do such calculation here? – cnst Jan 23 '13 at 20:57
  • @cnst How much wasted energy is it really? Dropping 30mA from 13 to 10V is \$ 3V \cdot 30mA = 90mW \$. Not that much, really. Also, your specifications say the device uses _less than_ 30mA. And whatever energy is wasted as heat, it still drains your battery. – Phil Frost Jan 23 '13 at 21:03
  • One could measure the quantity of energy held by a charged battery, rather than the charge, but the amount of energy that can be *harnessed* from a charged battery before it is depleted may vary considerably depending upon temperature, load conditions, etc. Knowing that a battery holds 10 joules internally isn't very helpful if one doesn't know how much of that one can actually get. By comparison, the amount of *charge* that may be harnessed is much more constant. If a battery holds exactly 1AH, one can probably get 3600+/-1% coulombs out of it under a very wide range of conditions. – supercat Jan 23 '13 at 21:34
  • @supercat The amount of charge we can get out of a battery also varies considerably in practical situations. Are we willing to run the voltage to zero, where our circuit won't work, and the battery may be damaged? Is the temperature high enough for the necessary chemical reactions to occur in the lifespan of a human? It's rare that actual discharge tests can extract the rated charge from a battery for these reasons. – Phil Frost Jan 23 '13 at 21:45
  • @PhilFrost: True, a battery won't supply current nicely and happily until it's depleted and then suddenly stop, so the +/-1% was a little overly cute, but if drawing one amp from a warm battery for an hour would discharge it by 10%, drawing one amp from a cold battery for an hour would probably also discharge it by about 10%, even if the latter operation would harness a lot less useful energy. – supercat Jan 23 '13 at 22:27
1

The ampere-hour unit actually expresses charge. An ampere is one coulomb of charge flowing per second, so when current is multiplied by time, you get coulombs. Ampere-hours express how much charge (how many electrons) is displaced through the battery.

Why these units are used may be partly tradition, and partly because certain formulas related to batteries, such as Peukert's Law, are based on them. Peukert's Law is expressed in terms of current because that is convenient: the variable that actually varies is the discharge rate, measured in amperes.
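
A minimal sketch of a Peukert-style runtime estimate; the capacity, rated discharge time, and exponent below are assumed, typical-looking values, not data for any particular battery:

```python
def peukert_runtime_hours(capacity_ah, rated_hours, current_a, k):
    """Estimated runtime t = H * (C / (I * H))**k (Peukert's Law).

    capacity_ah: rated capacity C in A·h, measured at the rated discharge time
    rated_hours: the discharge time H that the capacity was rated at
    current_a:   actual discharge current I, in amperes
    k:           Peukert exponent (1.0 would be an ideal battery)
    """
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

# Hypothetical 100 A·h battery rated at the 20-hour rate, k = 1.2,
# discharged at 10 A: roughly 8.7 hours, not the naive 10 hours.
print(peukert_runtime_hours(100, 20, 10, 1.2))
```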

Ampere-hours are more convenient for electronics calculations. We often know how many milliamps a circuit draws and so if we have an ideal battery of such and such mAh capacity, we can almost instantly tell how many hours of life we can expect, without any conversion back and forth to energy units.
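
That convenience looks like this in practice (again with made-up numbers):

```python
capacity_mah = 2400   # hypothetical battery capacity, mA·h
draw_ma = 30          # hypothetical average circuit draw, mA

runtime_hours = capacity_mah / draw_ma   # 80 hours, no voltage or energy units needed
print(runtime_hours)
```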

Kaz
0

In the olden days, people used a galvanometer to measure current and voltage. To measure power, one had to read a needle on a dial for two measurements and multiply them together on paper or with a slide rule. To measure the total energy delivered from a battery, one had to do that repeatedly and numerically integrate over time by hand.

Being able to take just one reading was a big shortcut, especially for people not trained in engineering, or for whom a rough estimate was good enough. Digital hand calculators, let alone multimeters with a built-in microcontroller to do the multiplication, did not exist for most of the history of electrical engineering. Considering that lead-acid or NiCd battery voltage varies only some 20-30% from full to empty, that may have been an acceptable measurement error in exchange for letting technicians take usable measurements without having to do math.

Even battery manufacturers faced the same trade-off in delivering a W-hr rating. A constant-resistance load for runtime measurement requires one component: connect a resistor across a battery and measure how long it takes for the battery to drop below the "dead" voltage. For extra credit, plot the voltage over time on a graph. In the 1970s a consumer would have approximated this test by putting new batteries in a flashlight or AM radio and leaving it on to see how long until it stopped working.

A constant-current load requires a few active parts, maybe one op-amp and one power transistor, plus a few resistors and capacitors. Measure the time to dead and you have, exactly to the extent that conditions are repeatable, a mA-hr rating. Again, measure and plot the voltage over time (ideally at constant temperature) for a more complete picture of battery performance. Such data is sufficient to estimate battery runtime to within maybe 20% under what used to be ordinary conditions: runtimes of 5 to 100 hours at room temperature. Just measure the current drawn from the battery and divide mA-hr by mA to get hours. Good enough for choosing between D cells and AAs, quality alkalines versus cheap junk, whether your NiCds are worn out, and so on.

A constant-power load requires multiplication in the load itself (voltage x current in the feedback path), complicating the design and driving up the cost of battery test equipment. Measuring energy delivered requires logging voltage and current, multiplying at each data point, and integrating over time. Possible in a spreadsheet and perhaps trivial with one of today's microcontrollers in the equipment, but a tremendous amount of work 30+ years ago.
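
Today that bookkeeping is a few lines; here is a sketch of the spreadsheet-style calculation, with the logged samples invented purely for illustration:

```python
# Logged (time_s, voltage_V, current_A) samples from a discharge test.
# These values are hypothetical, purely to show the arithmetic.
samples = [
    (0,    1.50, 0.30),
    (1800, 1.35, 0.28),
    (3600, 1.20, 0.27),
    (5400, 0.95, 0.25),
]

energy_j = 0.0
for (t0, v0, i0), (t1, v1, i1) in zip(samples, samples[1:]):
    p0, p1 = v0 * i0, v1 * i1                # instantaneous power, watts
    energy_j += (p0 + p1) / 2 * (t1 - t0)    # trapezoidal integration over time

print(f"{energy_j:.0f} J = {energy_j / 3600:.2f} W·h delivered")
```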

In practice, the amount of energy a battery can deliver to a load is highly dependent on the discharge rate anyway, so there would never be one single W-hr rating. If you do enough testing you'll get a chart with power on the x axis and energy on the y axis, where you'd see a moderate to severe downward slope. Additional factors affecting energy storage are temperature, age and usage history of the battery, etc. Further complicating the picture is that modern devices may present an infrequent but heavy pulsed load, which may not "average out" or integrate to the same response from a chemical battery as it would from an ideal voltage source.

Which is not to say that more data from battery manufacturers wouldn't be useful; certainly it would be, to those who can make sense of it. Rather, more accurate information would look like a 3D graph of energy versus power and temperature, and that's just for the first charge cycle. The graph could look very different at 100, 300, or 1000 charge cycles, even under lab conditions where variables are held constant. As most novice engineers don't understand that batteries are not ideal voltage sources, such a tide of data might serve to confuse rather than enlighten most users. In real life, the load varies depending on what the multifunction device is doing, temperature varies by tens of degrees, usage varies from minutes to hours per day, and on and on. Building a reasonably accurate model of battery behavior (energy storage and state of charge) under such varied circumstances is complex and time-consuming, and thus expensive: the domain of university researchers and of companies whose business is designing battery fuel-gauge chips.

In summary, a device does not generally present a constant-power load to the battery irrespective of voltage, and a battery does not deliver a constant amount of energy irrespective of load. Measuring power is (or was) harder than measuring current, and measuring total energy stored or delivered is much harder than measuring the rate of energy flow. As a rough estimate, current can tell most people what they need to know.

Matt B.