7

I have a doubt about why transmission lines are run at very high voltages. I went through quite a few sources which said that it is to decrease the losses in the lines, but I am not quite satisfied with those explanations. Can somebody please explain this to me in detail? Your help is much appreciated.

IamDp
  • 1,865
  • 4
  • 26
  • 40

4 Answers

15

The goal of high voltage transmission is to minimize power losses.

What causes power loss?

The main enemy is Joule's first law, i.e. \$I^2R\$ losses. The resistance of our transmission line dissipates power in proportion to the square of the current.

How do we fix it?

Say we want to transmit some amount of power, we'll call it \$P\$.

We know \$P=VI\$. This means that 1 volt at 1 amp carries the same amount of power as 1000 volts at 1 milliamp.

Now, to minimize our \$I^2R\$ losses, we could reduce either \$R\$ or \$I\$. \$R\$ is hard to change: we'd have to replace thousands of kilometres of wire. \$I\$ is also the better target, since the loss scales linearly with \$R\$ but with the square of \$I\$. What's easy to change in an AC power system is \$I\$.

We do this with a transformer. An ideal transformer allows you to put in some amount of power at a given voltage, and receive the same amount of power out at a different voltage.
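As a sketch, an ideal transformer conserves power while scaling voltage by the turns ratio (the 10:1 ratio and the 10 kV / 100 A input below are illustrative values, not figures from a real grid):

```python
def ideal_transformer(v_in, i_in, turns_ratio):
    """Ideal transformer model: power out equals power in.

    turns_ratio = N_secondary / N_primary; voltage scales up by the
    ratio, current scales down by the same ratio.
    """
    v_out = v_in * turns_ratio
    i_out = i_in / turns_ratio
    return v_out, i_out

# Step 10 kV at 100 A (1 MW) up by a factor of 10:
v, i = ideal_transformer(10_000, 100, 10)
print(v, i)  # 100000 V at 10.0 A -- still 1 MW
```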

Now, because \$P=VI\$, that also changes our current: for a fixed power, \$V\$ and \$I\$ are inversely related, so if \$V\$ goes up, \$I\$ goes down.

Thus, we increase \$V\$ to decrease \$I\$. For a given power we want to transmit (\$P = VI\$), a higher voltage on the power line means a lower current, which in turn minimizes our \$I^2R\$ losses.
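To put numbers on this, here is a quick sketch; the 100 MW load and 10 Ω total line resistance are made-up illustrative figures:

```python
def line_loss(power_w, voltage_v, resistance_ohm):
    """I^2 R loss for a line delivering power_w at voltage_v."""
    current = power_w / voltage_v          # I = P / V
    return current ** 2 * resistance_ohm   # loss = I^2 R

P = 100e6  # 100 MW to transmit (illustrative)
R = 10.0   # assumed total line resistance, ohms

print(line_loss(P, 10e3, R))   # 1e9 W  -- more than we're even sending!
print(line_loss(P, 500e3, R))  # 4e5 W  -- only 0.4% of P is lost
```

Raising the voltage by a factor of 50 cuts the loss by a factor of 2500, exactly the square relationship described above.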

Why don't we jack the voltage way up?

So we increase the voltage and decrease the loss, so why not crank the voltage as high as we can? It turns out that too high actually increases our losses, due to a different effect. Eventually, we reach a point where the air surrounding our wire starts to ionize.

Ionization causes the air to stop acting like an insulator and allows current to flow. This process is known as corona discharge, and it means we start losing power: the high voltage creates a new path to ground or to another transmission line. It also slowly degrades our wire, as atoms are pulled off its surface.

If we went even further, we'd eventually hit electrical breakdown, which means we now have arcs flying through the air. Arcing is extremely bad, and will both degrade our wire rapidly, as well as give what is for all intents and purposes a short, causing rapid power loss.

Bugasu
  • 528
  • 3
  • 11
  • +1 - didn't know about the coronal discharge part, nice bit of info. – Polynomial Jun 20 '13 at 19:56
  • Arcing *is* bad, but also *awesome*. http://www.youtube.com/watch?v=g4ph-h7l_aM – Phil Frost Jun 20 '13 at 21:45
  • 1
    Ionisation depends on both the voltage and the radius; hence those large globes on HV generators. For a single wire, ionisation limits the transmission voltage to about 250kv (AC). I believe the reason that higher voltage lines (440kv) have 4 spaced conductors, is to imitate a single larger diameter conductor. –  Jun 21 '13 at 08:08
  • 1
    Note: At voltages as low as 2,000 volts AC there are small losses. But the complete breakdown voltage varies depending on the gap distance and the medium between the conductors. – Scientist Smith YT Jan 02 '19 at 19:10
1

Transmission lines use very high voltages because they are more efficient.

We know that \$P = VI = V^2/R = I^2R\$.

The resistance depends on the length, the cross-section and the material. We use copper because it's the best compromise between cost and resistivity. The length can't be changed, so the only parameter left is the cross-section. We can reduce the resistance by increasing the cross-section, but copper costs money, so we can't increase the section indefinitely (and a heavier wire needs more robust pylons).

The result is that we can't really change the resistance, so we have to work with the current and the voltage. A higher current generates more heat, and that heat is lost energy; this is called the Joule effect. From the formulas above you can see that, for a fixed power, voltage and current are inversely proportional: 1 V at 10 A carries the same power as 10 V at 1 A. In order to reduce the losses we need to reduce the current, and we can reduce the current by increasing the voltage.
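A rough sketch of the resistance trade-off; the resistivity value is copper's, while the length and cross-section are made-up illustrative numbers:

```python
RHO_COPPER = 1.68e-8  # resistivity of copper, ohm*m

def wire_resistance(length_m, area_m2, rho=RHO_COPPER):
    """R = rho * L / A: halving the cross-section doubles R,
    while doubling it roughly doubles the copper cost and weight."""
    return rho * length_m / area_m2

# 100 km of wire with a 500 mm^2 cross-section:
print(wire_resistance(100e3, 500e-6))  # about 3.36 ohms
```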

Now you are thinking: "Why can't we use an even higher voltage than we currently do?". The answer is simple: beyond a certain limit, the cost of the transformers and insulators outweighs the savings on copper.

Oceanic815
  • 1,179
  • 1
  • 11
  • 19
0

Any conductor has some amount of resistance, which is proportional to the conductor's length. When you need to transfer electrical power over long distances, you obviously need very long conductors, which means the total resistance goes up.

For the same power, a higher voltage means a lower current, and a lower current means lower \$I^2R\$ losses.

Angelo Stavrow
  • 376
  • 1
  • 3
0

The simplest reason is that power lines have to supply a vast amount of electric power.

Power can be computed as the product of current and voltage (i.e. \$P=IV\$), which means that to provide a lot of power, you either need a very high current, a very high voltage, or a pretty high level of both.

For example, if you want to support a load of 1MW (i.e. 1,000,000W), you might balance current and voltage in the following ways:

  • 100kV at 10A
  • 10kA at 100V
  • 1000A at 1000V
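The \$I^2R\$ loss for each of these options can be compared directly; the 1 Ω line resistance below is an assumed illustrative figure:

```python
R = 1.0  # assumed line resistance in ohms (illustrative)

# (voltage in V, current in A), each combination delivering 1 MW:
options = [(100_000, 10), (100, 10_000), (1000, 1000)]

for v, i in options:
    assert v * i == 1_000_000        # same power in every case
    print(f"{v} V at {i} A -> {i**2 * R:.0f} W lost")
# 100000 V at 10 A -> 100 W lost
# 100 V at 10000 A -> 100000000 W lost (far more than we deliver!)
# 1000 V at 1000 A -> 1000000 W lost
```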

Due to an effect called resistive heating (also known as Joule heating), the amount of thermal energy emitted by a wire due to the wire's electric resistance is proportional to the square of the current:

$$Q \propto I^2 \cdot R$$

As such, every time you double the amount of current, you multiply the wasted thermal energy by a factor of four. The more current you need, the higher the amount of thermal energy. If you wanted to run your country's power infrastructure on 120V instead of 120kV, you'd need 1000 times the current, and would therefore produce 1,000,000 times the thermal energy. With the infrastructure we have today, that would be enough to literally melt the power lines. Attempting to compensate for this just doesn't make any sense.
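The 1000× current / 1,000,000× heat figures fall straight out of the square law (again assuming an illustrative 1 Ω line):

```python
P = 1e6   # 1 MW load
R = 1.0   # assumed line resistance in ohms (illustrative)

i_low = P / 120     # current needed at 120 V
i_high = P / 120e3  # current needed at 120 kV: 1000x less

ratio = (i_low ** 2 * R) / (i_high ** 2 * R)
print(ratio)  # ~1e6: a million times the thermal energy
```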

High voltage doesn't have this problem. Increasing the voltage allows you to decrease the current to a relatively low level whilst maintaining the same amount of power (remember that \$P=IV\$), which reduces your thermal losses drastically. As such, the most efficient way to provide power is to use a very high voltage and keep the current very low.

Polynomial
  • 10,562
  • 5
  • 47
  • 88