
This question is related to my previous question, to which Olin provided a great answer.

Note that this question is about transmission and distribution systems, not electronics. I believe the claims I make in the question are known to most power system engineers and I have therefore not included citations and external sources.

The following quote regarding the 2003 blackout is taken from the report "Preventing Voltage Collapse with Protection Systems that Incorporate Optimal Reactive Power Control", PSERC Publication 08-20:

By performing the optimal reactive power control after contingencies, the system voltage profile, the voltage stability margin at load buses, and the relay margins were improved to insure that system operating criteria were met after any of the contingencies.

Reactive power is used to control the voltage in transmission systems, and is said to increase the voltage stability.

Why and how does the reactive power help improve the voltage stability?

The previous question was about why reactive power affects the voltage. This question is related, but it is, in my opinion, a completely different question, and answers to the previous one will not apply here. It is therefore not a duplicate.

Stewie Griffin
  • This is an incredibly specific question, though. For instance, where I live this absolutely doesn't apply anymore. All grid power is regulated in a digital/active fashion, with quite a few new installations even using AC-DC-AC converter stages instead of linear transformers. They don't need passive control with reactive loads anymore. And even in linear systems, control is usually not done by switching reactive loads in and out anymore. So keep in mind that your question is quite esoteric and relates to relatively 'outdated' tech. – user36129 Jul 09 '14 at 09:20
    @user36129: You must live in a very nice country. As far as I know, Australia's electricity grid doesn't have any of the fancy stuff you describe... (Basslink HVDC excepted.) – Li-aung Yip Jul 09 '14 at 12:22

1 Answer


First of all, I think your question shouldn't just be 'why do reactive loads improve grid voltage stability'; it should be about grid stability as a whole, not just voltage or current or power. Everything is improved.

Let's go back to the 1970s: the power grid is entirely AC and entirely linear (i.e. AC power is generated at a power plant and multiple linear transformer stages are used to deliver it to the end customer). No DC power lines in between, no inverters, no PFC. The power line frequency is kept fairly precisely on time to allow for the use of timing motors (synchronous motors in e.g. train station clocks), timers, DTMF encoding on power lines, and so on.

Most devices that regular households use and that draw an appreciable amount of power have a good power factor; they are almost perfectly resistive loads: clothing irons, lightbulbs, ovens. Households also use a pretty small amount of power overall (historically between 10 and 15% of electrical power). Now, a large industrial facility turns on its giant motors. Motors are very inductive machines, i.e. they have a low power factor. One big sewage-plant pump system can use as much power as an entire city block, so this has a large effect on the grid.

A perfect grid is very 'rigid': its power lines have no voltage drop, no self-inductance and no propagation delay. In reality, of course, power lines do have some elasticity, and especially the use of large motors and other devices whose power factor differs strongly from the mean can destabilize the local grid. Consider the voltage and current waveforms; in an ideal world these are synchronous sine waves. However, if 50% of the grid is almost perfectly resistive and the other 50% has a power factor of, say, 0.5, the total current waveform is no longer in phase with the voltage: current is drawn not only at the peak of the voltage waveform but also between the peak and the zero crossing, and the grid must carry considerably more current for the same real power. This extra out-of-phase current, combined with the self-inductance of the grid, causes voltage excursions whenever large reactive loads switch in or out.
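To make the 'more current for the same real power' point concrete, here is a minimal phasor sketch (my own illustration, not from the answer; the voltage and load sizes are assumed) combining a resistive load with a 0.5-power-factor inductive load:

```python
# Hypothetical phasor sketch (all values assumed): combine a resistive
# load with an inductive load of power factor 0.5 and look at the net
# current the grid must deliver for the same real power.
import cmath
import math

V = 230.0  # assumed RMS line voltage, volts

def load_current(p_watts, power_factor, lagging=True):
    """RMS current phasor drawn by a load of given real power and PF."""
    s = p_watts / power_factor     # apparent power, VA
    phi = math.acos(power_factor)  # phase angle between voltage and current
    if lagging:
        phi = -phi                 # inductive load: current lags voltage
    return (s / V) * cmath.exp(1j * phi)

i_resistive = load_current(10e3, 1.0)  # 10 kW resistive block, PF 1.0
i_inductive = load_current(10e3, 0.5)  # 10 kW of motors at PF 0.5
i_total = i_resistive + i_inductive

pf_total = math.cos(cmath.phase(i_total))
print(f"net RMS current: {abs(i_total):.1f} A, net PF: {pf_total:.2f}")
# Prints roughly 115 A at PF 0.76: noticeably more than the ~87 A that
# 20 kW at unity power factor would need, purely to circulate VARs.
```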

Not only that: circuit breakers, for instance, have traditionally relied on the current zero crossing being available for a reasonable amount of time in order to switch. You cannot simply switch 100 kA off; even if you literally cut the wire with an axe, it would still arc and current would keep flowing for far too long to be safe. Thyristors and other solid-state circuit breakers likewise keep conducting until the current crosses zero, even with their gates turned off. The increased edge speeds caused by distortion on the grid can therefore cause big problems for circuit breakers.

So, the old-fashioned way to fix this is to put large capacitor banks as close to the motors as possible. The motors are a reactive 'load', the capacitors are a reactive 'generator', and combined they appear to the grid as a well-behaved, almost resistive load.

This is very simple and effective, and even though I call it old-fashioned, its low complexity makes it incredibly reliable. It does have disadvantages, though: capacitor banks large enough to compensate for a largish (hundreds of kW to MW range) motor are expensive and bulky. Also, they are only optimal for one specific motor load (which depends on the type of electrical machine), so you have to switch capacitors in and out to fine-tune the compensation. Lastly, there is still some energy loss in this system.
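For a feel for the numbers, here is a minimal sizing sketch (my own, with assumed values, not taken from the answer) using the standard relation Qc = P·(tan φ1 − tan φ2) for the reactive power such a bank must supply:

```python
# Hypothetical sizing sketch (motor size, voltage and frequency are
# assumptions): how big a capacitor bank is needed to bring a motor
# from PF 0.70 up to PF 0.95.
import math

P = 500e3         # motor real power, watts (assumed 500 kW)
pf_before = 0.70  # uncorrected power factor
pf_after = 0.95   # target power factor
V = 11e3          # assumed line-to-line feeder voltage, volts
f = 50.0          # grid frequency, Hz

phi1 = math.acos(pf_before)
phi2 = math.acos(pf_after)
Qc = P * (math.tan(phi1) - math.tan(phi2))  # VAR the bank must supply

# Capacitance for a delta-connected three-phase bank:
# Qc = 3 * V_LL^2 * (2*pi*f) * C
C = Qc / (3 * (2 * math.pi * f) * V**2)

print(f"required compensation: {Qc/1e3:.0f} kVAR")   # ~346 kVAR
print(f"per-phase capacitance: {C*1e6:.1f} uF")      # ~3.0 uF at 11 kV
```

Note how the required kVAR scales directly with the motor's real power, which is why banks for MW-range machines get so large, and why a fixed bank is only correct for one operating point.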

Note that this is not just done near or on specific machines; sometimes power companies use large capacitor banks on entire branches of their power grid to equalize the effective power factors of different domains.

A more modern approach is to use a frequency controller or inverter to control the motor. Mains power is rectified to a DC voltage and then chopped (inverted) again to feed the AC motor. PFC on the rectifier makes sure the drive presents a good power factor to the grid. This approach is much more space- and cost-efficient, gives better control over torque and speed, and is easier on the grid.
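As a toy illustration of that rectify-then-chop stage, here is a minimal sine-PWM sketch (my own, all values assumed) showing how the inverter synthesizes an output sine of freely chosen frequency from the DC bus; the free choice of f_out is what gives the speed and torque control:

```python
# Minimal sine-PWM sketch (all values are assumptions for illustration):
# an inverter approximates a sine of arbitrary frequency by switching
# the DC bus with a sinusoidally varying duty cycle.
import math

V_DC = 565.0      # assumed DC bus, roughly rectified 400 V three-phase mains
f_out = 35.0      # desired motor frequency, Hz (freely selectable)
f_pwm = 4000.0    # switching frequency, Hz
modulation = 0.9  # output amplitude as a fraction of the bus voltage

steps = int(f_pwm / f_out)               # PWM periods per output cycle
for k in range(0, steps, steps // 8):    # print a handful of samples
    t = k / f_pwm
    duty = 0.5 + 0.5 * modulation * math.sin(2 * math.pi * f_out * t)
    # Filtered output of a half-bridge switching between +/- V_DC/2:
    v_avg = (2 * duty - 1) * V_DC / 2
    print(f"t={t*1e3:5.1f} ms  duty={duty:4.2f}  v_avg={v_avg:+7.1f} V")
```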

user36129
  • Thank you for an excellent answer! I agree that VAR compensation is old-fashioned. However, it's still common in power systems, as upgrades are very expensive, and as long as it works in a satisfactory manner, other upgrades are preferred. – Stewie Griffin Jul 09 '14 at 10:01
  • Of course; cost-effectiveness and sometimes simply complexity is a big reason for passive compensation. It's not bad by any means. But new installations never go this route, if only for the ability to use much smaller and cheaper machines with frequency drives. Copper is very expensive! – user36129 Jul 09 '14 at 10:07
  • "This increased current 'in between' the peaks combined with the self-inductance of the grid causes voltage spikes." Are these 60 Hz spikes affecting the RMS voltage? Or something else? – LShaver Feb 26 '23 at 03:59