Why does the USA use 110V and UK use 230-240V? What are the advantages? Explain me with calculation. Why do they use different frequencies like 50Hz, 60Hz? What is the reason?
-
USA uses 230-240 VAC, too. The only difference is that we ground it in the center, creating "split" phases, reducing the peak voltage relative to ground and making it easier to interface low-power loads. But high-power loads (stoves, water heaters, clothes dryers, etc.) operate across the full voltage, reducing the current required. – Dave Tweed Jun 13 '14 at 12:06
-
@Raggles Car hits a pole and there may be a 7.2kV wire on the street. Tree falls on a line and here we go - little disaster. That's kind of dumb to me. Here where I live we have 15 or 30kV lines too, but in my town there are just a few lines like this, and they are far from roads, trees, etc. – Kamil Jun 13 '14 at 18:44
-
Can mention that not only the UK, but most (all?) countries in Europe use 220-240V/50-60Hz. – Baard Kopperud Jun 14 '14 at 10:52
-
230 V is used in the whole of Europe, but only 50 Hz and not 60 Hz. – Uwe Mar 28 '17 at 20:24
-
@DaveTweed Interesting. My kitchen hobs use two 220V phases (I guess that means 440V?). Does that mean that there's no equivalent in the US market? – WhyNotHugo Aug 29 '23 at 16:08
-
@WhyNotHugo: Since the phases are shifted 120°, it's actually 380V or 400V. Bringing in all three phases seems to be relatively common for heavy loads: https://en.wikipedia.org/wiki/IEC_60309 – 9000 Aug 29 '23 at 18:36
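(To spell out the arithmetic in that comment: two phases of a three-phase system are 120° apart, so the voltage between them is $$V_{LL} = 2 \cdot 230 \cdot \sin(60°) = 230\sqrt{3} \approx 398\ \text{V},$$ which is why the nominal figure is 400 V rather than 460 V; with 220 V phases the same calculation gives 220√3 ≈ 381 V, hence the older 380 V figure.)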
7 Answers
You should not be surprised that they use different voltages and frequencies; you should be surprised that there are only two big voltage/frequency standards.
When electricity was first introduced, each producer provided a different voltage and frequency (or even DC instead of AC). Gradually producers merged, governments set standards, and market pressure demanded that appliances could be used everywhere. This led to the current situation, where the pressure for a world standard is counterbalanced by the vested interests.
For the same amount of power, 110V requires more current, hence thicker wires. 230V requires better insulation. In some rare situations 230V might be more dangerous to touch.
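As a quick worked example (assuming a purely resistive 2300 W load, a number picked only for easy arithmetic): the current drawn is $$I = \frac{P}{V},$$ so the load draws 10 A at 230 V but 20 A at 115 V. The loss in the supply wiring is $$P_{loss} = I^2 R,$$ so for the same wire the lower voltage quadruples the wiring loss; to bring that loss back down you need roughly four times the copper cross-section, hence the thicker wires.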
I don't think 50 or 60 Hz makes any significant difference. (An iron core for a transformer might be a little smaller at 60 Hz. But iron cores are soooo last century...)

-
About safety: I once did a little math and it seemed that 110V versus 230V is the difference between life and death. – Vladimir Cravero Jun 13 '14 at 07:31
-
In some circumstances it might. In other circumstances it might mean the difference between death and death. – Wouter van Ooijen Jun 13 '14 at 07:33
-
Yeah, of course if you bite a 50V wire while your feet are deep in the water you have no chance, but if I recall it right, making some "real life, common" assumptions led to what I said. – Vladimir Cravero Jun 13 '14 at 07:35
-
Whaat? Iron cores are last century? High-power transformers in the power grid are not iron-core now? – Kamil Jun 13 '14 at 10:13
-
The choice between 50 and 60 Hz is not nearly as simple as folks think. http://www.sos.siena.edu/~aweatherwax/electronics/60-Hz.pdf goes into more detail on the history than you might think possible. The various decisions which produced the current standards dealt with all sorts of considerations. – WhatRoughBeast Jun 13 '14 at 16:04
-
50Hz is just above the limit at which the eye will detect flicker, an important consideration with old-style light-bulbs and such. Can mention that for a while in some industrial towns (at least in Norway), the frequency was decided by what was best for the industry - e.g. melting aluminium. So in some places they used 25Hz, and you could see the lights flicker - and the required size of the transformers... (lower freq., more iron). – Baard Kopperud Jun 14 '14 at 10:50
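(The "lower freq., more iron" remark follows from the standard transformer EMF equation: $$E_{rms} = 4.44\, f\, N\, A_c\, B_{max}.$$ For a given voltage, number of turns, and peak flux density, the required core cross-section $A_c$ scales as 1/f, so a 25 Hz transformer needs roughly twice the core iron of an otherwise identical 50 Hz one.)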
-
The old ODDA hydro station (very large) was 25 Hz, using polewheel generators. – Decapod Mar 28 '17 at 18:49
-
@BaardKopperud - "an important consideration with old-style light-bulbs and such" - Exactly backwards, unless you consider fluorescents "old-style". A more reasonable view of "old-style" (at least as far as line frequency selection is concerned) is incandescents, and those puppies have enough thermal inertia in the filaments that there is no perceptible flicker at 50 Hz, or even at 30 Hz. – WhatRoughBeast Feb 19 '19 at 23:35
-
@VladimirCravero In some situations 48V vs 110V might be the difference between life and death - if that was all it boiled down to, then we should be using 48V. – user253751 Feb 20 '19 at 00:28
-
@VladimirCravero High voltage doesn't generally kill. High amperage does. – Philip Whitehouse Aug 29 '23 at 22:06
The same reason we still pave roads that go around buildings that were knocked down half a century ago.
Historically someone, or some group, picked a number in each country, others followed suit, and it became "a standard". Now we are stuck with them.
Each has its advantages and disadvantages. You can argue them forever.

-
There is an even more wonderful example of how such things stick and then it's too expensive to change. Eastern Japan uses a 50 Hz grid (historically built by German companies), while western Japan uses 60 Hz (American). There's no way to consolidate them now. https://www.npr.org/2011/03/24/134828205/a-country-divided-japans-electric-bottleneck – 9000 Aug 29 '23 at 18:45
It's hard to be definitive. But before AC power distribution was widely adopted, there was a bitter battle in the United States between Edison and Westinghouse about DC versus AC power distribution.
Edison's DC system used +110V, 0V and -110V. There was a campaign by Edison to portray AC as dangerous, even going so far as to introduce an electric chair powered by AC electricity as an execution device, thereby demonstrating the "danger of AC". Once AC was widely accepted as being superior to DC for power distribution, 110V became the standard for AC distribution, presumably because it used the "safer" voltage level of the DC system.
After metal filament lamps became feasible, 220V became common in Europe because of the lower distribution costs.
As for 50Hz versus 60Hz... well, that's just the metric system.

-
50 vs. 60Hz is actually not a metric system issue, but a choice to be able to easily multiply/divide the mains frequency into or out of clocks. 60Hz divides much more easily mechanically, as it has many divisors. – user36129 Jun 13 '14 at 10:06
-
Could be one reason but quoting from http://www.school-for-champions.com/science/ac_world_volt_freq.htm#.U5rc3BDYPxw "With the backing of the Westinghouse Company, Tesla's AC system became the standard in the United States. Meanwhile, the German company AEG started generating electricity and became a virtual monopoly in Europe. They decided to use 50Hz instead of 60Hz to better fit their metric standards" – akellyirl Jun 13 '14 at 11:16
-
Actually, the original Edison incandescent lamps were nominally 100 volts. 110 was chosen as the distribution voltage to allow for line drops (a major problem with early DC power systems such as Edison built.) – WhatRoughBeast Jun 13 '14 at 12:44
-
Amazing how the metric system affects every little thing, from the mains frequency to the [name of hamburgers](https://www.youtube.com/watch?v=uYSt8K8VP6k). – dim May 26 '16 at 09:28
-
@dim Nope, UK here we use Quarter and Half Pounder; however, we don't get a pig and a cow confused – Martin Barker Sep 25 '19 at 23:13
-
Your link says the same thing that you do, but what's metric about 50 Hz? I mean, it's half of 100 Hz, but does that make it "more metric"? (It seems kind of irrelevant unless one performs frequency doubling anyway. Does that happen?) – LSpice Aug 29 '23 at 10:36
-
1Side note: “AC was [...] superior to DC for power distribution” *at the time*, when power stations served smaller areas, copper was cheap, and you didn’t have high-power electronics that could convert DC to AC and back whenever you needed to step down the voltage. (An AC transformer can sit in a cabin or on a utility pole for years with little to no maintenance; a DC motor-generator cannot.) These days, [HVDC](https://en.wikipedia.org/wiki/High-voltage_direct_current) lines are in fact deployed in some places. – Alex Shpilkin Aug 29 '23 at 10:47
The 110V issue is simply that once Tesla and Westinghouse proved long distance AC transmission feasible, the #1 issue driving the proliferation of electrification was lighting in houses, replacing the gas and oil lighting that was a major fire hazard of the day. Edison's lamps were 100V, but a lamp does not care if it gets AC or DC. So our AC distribution system, AT THE RESIDENTIAL LEVEL, was designed to take advantage of the existing installed base and inventory availability of Edison's lamps. Then as personal appliances began proliferating, they were designed to take advantage of the 110VAC lighting circuits already used in houses and the concept cemented itself into our culture to where there was no going back.
The 50/60Hz issue is different and is not "metric" at all (what's metric about the number 50?). Despite Westinghouse/Tesla having championed it, AC only really took off here once Edison gave in to the inevitability. Edison, despite having invested in AEG as Europe began electrifying, was reluctant to allow a system in which Europeans could enter our market by selling electrical products here. So after also experimenting with different frequencies (40Hz was the first major industrial installation, at the Folsom Power House in California), Edison and Steinmetz settled on 60Hz, partly because of the flicker issue, then also because it would make European equipment incompatible.

He wanted it all to himself... which is the same motivation behind his initial push to discredit AC distribution in the first place. He wanted DC because he owned the US patent rights to his DC dynamo (even though he actually bought his first one, for proof of concept, from Werner von Siemens. Yes, THAT Siemens... Siemens had not patented it in the US). So if DC had won, we would have had Edison DC dynamos every 5 miles or so. Only the rich would have been able to afford it, and they would all have been paying Edison for the privilege. Tesla's egalitarianism ruined his vision.

This was extensively covered in The Simpsons:
You know, Europe's no place for a six-year-old. He can handle 110 volt, but 220 would kill him.
Jokes aside, once you pick a value and produce a substantial amount of compatible devices, the price of switching to a different value becomes prohibitively high.

In the UK, wiring was available nationwide in the late 1950s. The rest of Europe followed shortly after the US. Because it took the UK a bit of time to catch up, they had time to learn an important thing from the previous experience with household electricity: wiring houses was expensive! They had to use a lot of wire, and by doubling the voltage they halved the current, thus reducing the wire gauge needed.
The AC frequency is a somewhat better-known story... Up until 1890 there was no standard for the mains frequency (obviously). AEG, who had a monopoly on electricity production in Europe, set the standard at 40Hz; however, a bit later they noticed lamps flicker at that frequency, so they increased it to 50Hz, which was fine. Westinghouse learned that this was becoming a standard, but figured the lights still flickered a bit, so they increased the frequency to 60Hz. In the following years they started wiring the entire US, and motors and other devices were designed for that frequency, so it was no longer easy to change.

-
This is all pretty vague and unsupported. I've read that [two-thirds of UK rural dwellings were connected by 1938](http://events.history.ac.uk/event/show/6939). Some of it doesn't even make sense. For example, if the rest of Europe 'followed shortly after the US' why did they go for 220V? – user207421 Jun 14 '14 at 10:39
-
@user207421 Can’t say anything about Western Europe, but the Russian Empire originally used 110V in the 1880s; the Soviet Union variously used 120 or 127V ( = 220V / √3) and ultimately switched to 220V ( = 380V / √3) over several decades starting in the 1930s (presumably for efficiency reasons). That was further tweaked to 230V ( = 400V / √3) in Russia in the 2000s, although given line loss in the apartment wiring in practice you get anywhere from 225V normally to 215V on a particularly bad day. – Alex Shpilkin Aug 29 '23 at 11:23
We learnt that it was about resources. Europe had abundant iron and was short of copper, hence 50Hz; whereas (mistakenly) America, and particularly Pennsylvania, had a surplus of copper, hence 60Hz. 110V, 60Hz has 4 times the distribution loss compared with 220V (power loss = I²R). That is why American wire is so much thicker, and why the overhead (not underground) distribution puts on such a spectacular display when a pole-mounted transformer goes up. Also, lights flicker at twice the AC frequency (one flash on the positive half cycle and one flash on the negative half cycle), not at the supply frequency.
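(Checking the "4 times" figure: for the same delivered power P, the current is I = P/V, so halving the voltage doubles the current, and through the same conductor resistance R the loss ratio is $$\frac{P_{loss,110}}{P_{loss,220}} = \left(\frac{220}{110}\right)^2 = 4.$$)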

- 21
- 1
-
I understand the correlation between current and losses. But the link between raw material resources and frequency choice needs more explanation. – dim Apr 03 '18 at 11:53
-
What does it mean to mistakenly have a surplus of copper? Do you mean that it was mistakenly *believed* that there was a surplus of copper? – LSpice Aug 29 '23 at 10:35
-
Whatever voltage the home user needs, 110 or 220, shouldn't really impact the design of transmission lines beyond what actually connects the final transformer to the consumer. – Nick T Aug 29 '23 at 15:23