At our homes and utilities, the supply frequency is 50/60 Hz. Why not use some other value, such as 500 Hz or 200 Hz, or a smaller value like 40 Hz or 30 Hz? I have previously searched Stack Exchange for this, but I want to know the technical and logical reasons.
-
1 Lower frequencies like 16.7 Hz and 25 Hz are used for railway trains; see https://en.wikipedia.org/wiki/Railway_electrification_system#Low-frequency_alternating_current – Uwe Jun 10 '17 at 14:10
-
1 A higher frequency of 400 Hz is used in aircraft to save weight through smaller transformers. – Uwe Jun 10 '17 at 14:16
-
It seems like the purpose of this question was to promote the asker's website. – davidmneedham Jun 12 '17 at 16:57
2 Answers
Wikipedia explains it fairly well in their article Utility Frequency:
The induction motor was found to work well on frequencies around 50 to 60 Hz, but with the materials available in the 1890s would not work well at a frequency of, say, 133 Hz. There is a fixed relationship between the number of magnetic poles in the induction motor field, the frequency of the alternating current, and the rotation speed; so, a given standard speed limits the choice of frequency (and the reverse). Once AC electric motors became common, it was important to standardize frequency for compatibility with the customer's equipment.
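To make that fixed relationship concrete: the synchronous speed of an AC machine is n = 120·f/P rpm, where f is the supply frequency and P the number of poles. A quick sketch (standard formula; the example pole count is my own):

```python
# Synchronous speed of an AC machine: n_sync [rpm] = 120 * f / P
# f = supply frequency in Hz, P = number of magnetic poles.
def synchronous_speed_rpm(frequency_hz: float, poles: int) -> float:
    return 120.0 * frequency_hz / poles

# A 4-pole induction motor (example pole count) runs just below these speeds:
print(synchronous_speed_rpm(50, 4))    # 1500 rpm on a 50 Hz supply
print(synchronous_speed_rpm(60, 4))    # 1800 rpm on a 60 Hz supply
print(synchronous_speed_rpm(133, 4))   # 3990 rpm -- impractically fast for 1890s machines
```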
Generators operated by slow-speed reciprocating engines will produce lower frequencies, for a given number of poles, than those operated by, for example, a high-speed steam turbine. For very slow prime mover speeds, it would be costly to build a generator with enough poles to provide a high AC frequency. As well, synchronizing two generators to the same speed was found to be easier at lower speeds.
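The same relationship, inverted, shows the generator-side problem: P = 120·f/n, so a slow shaft needs a lot of poles to reach a high frequency. The shaft speeds below are assumptions for illustration, not figures from the quote:

```python
# Poles needed to generate a target frequency at a given shaft speed:
# P = 120 * f / n, with f in Hz and n in rpm.
def poles_required(frequency_hz: float, shaft_rpm: float) -> float:
    return 120.0 * frequency_hz / shaft_rpm

print(poles_required(60, 3600))   # 2 poles  -- high-speed steam turbine
print(poles_required(60, 100))    # 72 poles -- slow reciprocating engine: a big, costly machine
print(poles_required(25, 100))    # 30 poles -- one reason early systems chose 25 Hz
```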
Generally, lower frequencies require more iron. You may like to think of this as a magnetic store of energy per cycle: the shorter each cycle, the less iron is needed. Since weight is a bigger consideration in aircraft design than cost, they standardised on 400 Hz AC systems.
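One way to put numbers on "lower frequencies require more iron" is the standard transformer EMF equation, V ≈ 4.44·f·N·A·B_max: for a fixed voltage and a peak flux density limited by the core material, the turns × core-area product must scale as 1/f. The voltage and flux-density figures below are only illustrative:

```python
# Transformer EMF equation: V_rms = 4.44 * f * N * A * B_max.
# For a fixed voltage and a peak flux density limited by the core material,
# the turns * core-area product N*A must scale as 1/f.
def turns_area_product(v_rms: float, freq_hz: float, b_max_tesla: float) -> float:
    return v_rms / (4.44 * freq_hz * b_max_tesla)   # turns * m^2

V_RMS, B_MAX = 230.0, 1.5   # assumed: 230 V winding, 1.5 T peak flux density
print(turns_area_product(V_RMS, 50,   B_MAX))   # ~0.69 turns*m^2
print(turns_area_product(V_RMS, 400,  B_MAX))   # ~0.086 turns*m^2 -- 8x less, hence 400 Hz in aircraft
print(turns_area_product(V_RMS, 16.7, B_MAX))   # ~2.07 turns*m^2 -- why 16.7 Hz railway gear is bulky
```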
Lower frequencies are better for long-distance power transmission because power losses are reduced. DC, with a frequency of zero, is best, although stepping up to and down from high voltage is more complex.
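One concrete part of that advantage is the line's series reactance, X = 2πfL, which adds voltage drop and limits power transfer over long distances, and which vanishes entirely at DC. The per-kilometre inductance below is an assumed round figure for a typical overhead line:

```python
import math

# Series reactance of an AC line: X = 2 * pi * f * L.
# Assumed series inductance of roughly 1 mH per km for a typical overhead line;
# shunt capacitance adds further frequency-dependent charging current,
# and both effects disappear at DC.
L_PER_KM = 1e-3  # H/km (assumed round figure)

def line_reactance_ohms(freq_hz: float, length_km: float) -> float:
    return 2 * math.pi * freq_hz * L_PER_KM * length_km

print(line_reactance_ohms(50, 500))     # ~157 ohms over 500 km at 50 Hz
print(line_reactance_ohms(16.7, 500))   # ~52 ohms at 16.7 Hz
print(line_reactance_ohms(0, 500))      # 0 ohms -- the HVDC case
```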
All engineering is about finding the "least worst" answer to the problem; 50/60 Hz is a good trade-off between the factors above.

-
Some older systems have used (and a couple still use) frequencies such as 16 2/3 Hz or 25 Hz. For example, https://en.wikipedia.org/wiki/Amtrak%27s_25_Hz_traction_power_system – Kevin White Jun 10 '17 at 17:51
There is no specific frequency that is better than others in the range around 50/60 Hz, but there are problems when you go much lower or much higher in frequency.
AC distribution relies on transformers, and there are two notable problems with transformers:
- Lower frequencies mean higher core saturation (but lower eddy current losses in the laminations)
- Higher frequencies mean higher eddy current losses (but lower core losses)
So it is found that, at around 50 or 60 Hz, the best compromise is met. At lower frequencies you need more turns (or more iron) to avoid excessive H fields in the cores of transformers (or induction motors) that could cause saturation. More turns also means more copper losses.
At higher frequencies you need to make the core laminations thinner to reduce eddy current losses, and this becomes somewhat impractical for power applications at frequencies of hundreds of Hz. Also, at higher frequencies leakage inductance becomes a problem: too much leakage impedance means too much voltage drop under load.
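To see why the laminations must get thinner, the classical eddy-current loss in a thin lamination scales as f²·t²: P ≈ π²·f²·B²·t²/(6ρ) per unit volume. The material figures below are typical silicon-steel values, assumed for illustration:

```python
import math

# Classical eddy-current loss per unit volume in a thin lamination:
#   P_eddy ~ pi^2 * f^2 * B_pk^2 * t^2 / (6 * rho)   [W/m^3]
# f: frequency, B_pk: peak flux density, t: lamination thickness, rho: resistivity.
def eddy_loss_w_per_m3(freq_hz, b_pk_tesla, thickness_m, resistivity_ohm_m=5e-7):
    return (math.pi**2 * freq_hz**2 * b_pk_tesla**2 * thickness_m**2) / (6 * resistivity_ohm_m)

# Assumed figures: 1.5 T peak flux, 0.35 mm laminations, silicon steel ~5e-7 ohm*m
print(eddy_loss_w_per_m3(50, 1.5, 0.35e-3))    # ~2.3 kW/m^3 at 50 Hz
print(eddy_loss_w_per_m3(400, 1.5, 0.35e-3))   # ~145 kW/m^3 at 400 Hz -- 64x worse
print(eddy_loss_w_per_m3(400, 1.5, 0.1e-3))    # ~12 kW/m^3 -- thinner laminations claw most of it back
```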
For the actual wire/cable transmission of current, higher frequencies mean progressively more skin effect and therefore increased losses.
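Skin depth shrinks with the square root of frequency, δ = √(ρ/(π·f·μ)), so at higher frequencies the current crowds into a thin surface layer and the effective resistance of a thick conductor rises. A rough check for copper:

```python
import math

# Skin depth: delta = sqrt(rho / (pi * f * mu)); current crowds into a surface
# layer roughly this thick, raising the effective AC resistance of large conductors.
RHO_CU = 1.68e-8            # ohm*m, copper resistivity
MU_0 = 4 * math.pi * 1e-7   # H/m; copper is essentially non-magnetic

def skin_depth_mm(freq_hz: float) -> float:
    return math.sqrt(RHO_CU / (math.pi * freq_hz * MU_0)) * 1e3

print(skin_depth_mm(50))    # ~9.2 mm
print(skin_depth_mm(60))    # ~8.4 mm
print(skin_depth_mm(400))   # ~3.3 mm -- thick conductors waste copper at high frequency
```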
It's a compromise between several competing factors.

-
1 The other elephant in the room was lighting usage: go much lower than 50 Hz and you start to get objectionable flicker. – Dan Mills Jun 24 '19 at 14:34