I have an Arduino, a BeagleBone Black, and an Arndale Octa board, and they all run on 5 V DC (albeit at different current ratings). Most of my electrical sensors also seem to operate around the 5 V range.

Why is this the case? How is it I can power a Linux computer with 10 watts, yet my light bulbs need 60?

Sean Nall
  • I cannot add an answer because it's been closed already, but ... I have a sneaking suspicion that when TTL was developed, high current 6.3VAC supplies were still very common. Multiply by 1.4, subtract 1.2V for a bridge rectifier, factor in a really large capacitor to keep ripple to a tolerable level, say 5% for supply tolerance, and a small amount for a linear regulator (NOT the 2.5V required for a 7805!) and what's left over is about ... 5V. –  May 24 '14 at 20:09
  • The light bulb might need 60 W, but it also shines _much_ brighter than the 10 W Linux computer. – Andreas Bombe May 24 '14 at 23:32
  • Because you have inefficient lightbulbs? – user253751 Jan 03 '20 at 16:09

4 Answers

In the (very) old days things were slow and designers of logic circuits used discrete transistors and strange plate voltages like -15V to better confuse the vacuum tube guys (also because germanium PNP transistors were better for a period).

Then came vast (for the time) military demand for computers for navigation, and RTL logic was developed.

[schematic: RTL gate]

It was slow and power hungry, but it could be crammed into flat-pack packages with a NOR gate or a flip-flop per package. It used 3.6V (my memory, verified by the RTL Logic Cookbook, which says up to 4.5V but 3.6 nominal) or 4V (schematics of the Apollo guidance computer power supply). Some other early documents indicate a 3V nominal supply.

RTL begat DTL, which was the first, I believe, to use a single 5.0V nominal supply, in the form of the 930 series.

[schematic: 930-series DTL gate]

Millions of parts were being made at this point, mostly for military and such applications, so lots of ceramic packages (NASA Apollo documents indicate more than 700,000 pieces of the flat pack (1.27mm lead pitch) as they were phasing out TO-5 10-lead packages). Then came TTL, which was backward compatible with DTL, and rapidly replaced DTL, starting in the early-to-mid 1960s. For many years afterward, one would write "TTL/DTL compatible".

As an aside, around the same time, ECL logic with a -5.2 volt supply became mass-produced, but it was never used as widely as TTL.

Subsequent bipolar families (some more popular than others) such as 74L, 74H, 74S, 74F and 74LS all adopted the 5V supply.

Early MOS circuits used high voltages. To allow them to be used with TTL, they would have three supplies (+12, +5, and -5V). When early CMOS circuits were developed, a single 5V supply was a major selling point (early CMOS circuits would work at 5V, but not very well). Eventually, it became possible to make CMOS equivalents of the 74xx series that worked as well as, or better than, the originals.

Thus began a long period of relative stability during which 5V was the supply of choice for digital circuits (excepting ECL). This was fine from about 1965 to 2000 or so, but gradually it became less and less optimal as devices shrank and power consumption requirements became more important commercially. We again have a Balkanized supply situation, with 1.8, 2.5, 3.3, and 5V supplies all common. Choosing 5V for USB (and thence as a charger/power supply standard) means we'll continue to see 5V for many more years, if not decades.

So, it would seem to boil down to the design decision of the folks making a logic family aimed at missiles and such like that they would best use another volt or so beyond what RTL commonly used, and the tremendous pull of backward compatibility with legacy requirements. Why the extra volt or volt and a half, you ask? DTL inputs had effectively three series Si diode drops (see schematic above), and you'd want some reasonable voltage above that for the pull-up resistors to work, even with a supply 10% low and -55C temperature (military) so 4V was a bit too low, 6V unnecessarily high, and 5V about right.
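That voltage budget can be sketched numerically. The diode drop, headroom, and tolerance figures below are rough illustrative assumptions (not datasheet values), just to show how three junction drops plus margin land near 4V, making 5V the comfortable round choice:

```python
# Rough DTL supply-voltage budget, as described above.
# All figures are illustrative assumptions, not datasheet values.
SI_DIODE_DROP = 0.7      # volts per forward-biased silicon junction
INPUT_DROPS = 3          # series junctions on a DTL input
HEADROOM = 1.5           # volts across the pull-up for reasonable noise margin
SUPPLY_TOLERANCE = 0.10  # supply may run 10% low (military worst case)

# Worst-case minimum voltage the logic must still work at:
min_at_input = INPUT_DROPS * SI_DIODE_DROP + HEADROOM   # 3.6 V

# Nominal supply needed so that a 10%-low supply still meets that minimum:
nominal = min_at_input / (1 - SUPPLY_TOLERANCE)         # 4.0 V

print(f"worst-case minimum: {min_at_input:.1f} V")
print(f"nominal needed: {nominal:.1f} V -> 4 V is marginal, 6 V wasteful, 5 V about right")
```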

It's not quite in the same class as the claim that space shuttle booster dimensions were determined by the width of two Roman horse butts, but it's already spanning two human generations.

A 60W light bulb is 1900s technology (very inefficient), emitting only a few watts of visible light along with a whole bunch of IR, per Planck's law. An equivalent (in visible light) LED bulb uses little more than 10W; eventually we may get down to somewhat less, perhaps 5 or 6W. A high-end PC CPU might use around the same amount of power as an incandescent bulb (as tightly regulated DC power), so it's pretty hard to cool. The power cost per FLOP of computing tends to drop every year, as does the power cost per lumen of light, but the technology changes. Incandescent lamps are a mature technology (the most recent major improvement was halogen).
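A back-of-envelope check of that comparison, using typical assumed luminous-efficacy figures (roughly 14 lm/W for a tungsten bulb, 90 lm/W for a modern LED bulb):

```python
# Illustrative lamp comparison; efficacy figures are typical assumed values.
INCANDESCENT_LM_PER_W = 14   # lumens per watt, tungsten filament
LED_LM_PER_W = 90            # lumens per watt, modern LED bulb

lumens = 60 * INCANDESCENT_LM_PER_W    # light output of a "60 W" bulb: 840 lm
led_watts = lumens / LED_LM_PER_W      # LED power for the same light: ~9.3 W

print(f"{lumens} lm needs only {led_watts:.1f} W from an LED")
```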

Spehro Pefhany
When the original integrated circuit logic families were developed, TTL (transistor-transistor logic) became the most popular. It was designed to use 5 volts because that provided the best combination of noise immunity, power consumption and speed with the existing technology. Naturally, connecting circuits such as sensors and other devices tried to use the same voltage to avoid the need for extra power supplies.

Through the years, newer technology has allowed (in fact, required) lower operating voltages, both to lower power consumption and to keep voltage gradients inside the integrated circuits at safe levels as circuit dimensions have shrunk considerably. The legacy of the 5 volt decision still lives on, although it is diminishing as newer technologies take hold.

As for the light bulb, you are comparing apples and oranges. The computer tries to use the least power needed to perform its functions. The light bulb is trying to produce a useful level of light with the available technology of tungsten filaments. Newer fluorescent and LED technologies produce the same amount of light as the tungsten bulb with much less power. In fact, the best LED bulbs produce as much light as a 60 watt tungsten bulb while consuming only about 10 watts.

Barry
There is still a lot of old stuff out there left over from when 5 V was the standard logic voltage. This was the case for probably about 30 years (roughly 1970-2000). Nowadays 3.3 V and lower voltages are more common. However, 5 V is still used, particularly in industrial environments where the extra noise immunity is worth a little more power usage.

I don't know why exactly 5 V was chosen for TTL logic, but it was probably high enough to solidly turn transistors on easily, have decent noise immunity, but not take too much power.

Nowadays, logic is implemented with totally different technology than the old TTL chips. What voltage to run a modern processor on is a careful tradeoff to minimize power. Too high, and each voltage transition has to move more charge onto or off the inevitable parasitic capacitance of the node being switched. This requires current proportional to the voltage at the same switching frequency, which makes overall power dissipation proportional to the square of the voltage. On the other end, too low a voltage doesn't allow enough difference between the on and off characteristics of the MOSFETs, so you get more leakage current when off. Each transistor only leaks a tiny amount, but a few million here, a few million there, and pretty soon you get some real power dissipation.
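That square-law tradeoff follows from the standard dynamic-power relation P = C·V²·f. The capacitance and frequency below are arbitrary assumed values, purely to show the scaling:

```python
# Dynamic switching power P = C * V^2 * f (standard CMOS relation).
# C and f below are arbitrary assumed values for illustration only.
def dynamic_power(c_farads, v_supply, freq_hz):
    """Power spent charging/discharging node capacitance C at frequency f."""
    return c_farads * v_supply**2 * freq_hz

C = 1e-9     # 1 nF of total switched capacitance (assumed)
F = 100e6    # 100 MHz switching rate (assumed)

p_5v = dynamic_power(C, 5.0, F)   # at the old 5 V standard
p_1v = dynamic_power(C, 1.0, F)   # at a modern ~1 V core voltage

print(f"5 V: {p_5v:.2f} W   1 V: {p_1v:.2f} W   ratio: {p_5v / p_1v:.0f}x")
```

Dropping the supply from 5 V to 1 V cuts switching power 25-fold at the same frequency, which is exactly why core voltages kept falling once power mattered commercially.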

Computers and Lightbulbs

Computers and lightbulbs are two completely different things, so the real question is why would anyone assume the power requirements of one had anything to do with the other.

A lightbulb is a power converter. Its job is to convert electrical power into light power. Since it has to put out light power, it will obviously require at least that much power as input. In reality, old LEBs (light emitting bulbs) based on the principle of black-body radiation were horribly inefficient. Well under 10% of the power you put in actually came out as light. An old "60W" LEB therefore put out well under 6 W of light; the rest went into heat at the bulb and infrared radiation we can't see.

Modern LED bulbs are still inefficient on an absolute scale, but much better than the old LEBs. Such a bulb might only need 12 W, for example, to produce the same light.

Computers are totally different things from lightbulbs. Their defined job has nothing to do with emitting power in some form, so in theory they inherently require almost no power. All of our ways of performing digital logic require some power to operate, but newer technology keeps making that less and less for the same computational capability. Back in the 1970s, a small minicomputer with less capability than a medium microcontroller of today took a couple of racks of electronics and its own room with its own air conditioner.

Today your small single-board computer might take 10 W. In a few years, the same 10 W will run something 100 times more powerful, or you can run the equivalent of today's computer for a week from a couple of AA batteries. The inherent physical limit on the amount of power required for computing is very very small.

Olin Lathrop
The specification for the 5V power rails of early logic families was quite tight (basically ±5%), and the voltage regulators required needed to stay in specification as things warmed up. A regulator therefore needed a fairly stable voltage reference, and I suspect this was provided by zener diodes. I'm not saying this is the definitive answer, but this might be worth a read. Basically, it demonstrates that, across a range of zener diodes, those rated near 5V come closest to having zero temperature drift:

[graph: zener diode temperature coefficient versus zener voltage, crossing zero near 5 V]

I guess the implication of this is that 5V voltage regulators were likely to be more stable than regulators at other voltages (back in the good old days).

Regarding the light bulb question: the energy conversion efficiency of a tungsten filament lamp means that the light output power is a tiny fraction of the power going in. For LEDs, the power is probably comparable with your Linux computer's, so I'll bounce this back, cheekily add "so what", and point out that this last question is probably off-topic for this site (but not your first question)!

Andy aka