Looking around at a couple of power supplies that I own, such as a guitar pedal adapter that says it puts out 9 V and 1700 mA max: how does a power supply like this work so that it doesn't put out too much current and blow up what's attached? There isn't a switch that I physically have to flip, so how does the adapter "know" how much current to put out?
- Yes, this seems to be a duplicate. The device using the power only consumes as much as it 'needs'. So as long as the *voltage* of the power supply and the device match, the device won't be blown up by a power supply with a bigger current capacity. An under-powered power supply could be damaged by a device that wants more current than it could supply, though typically power supplies are designed to protect themselves. – gbulmer Sep 19 '14 at 03:30
- Oh, what I was really asking was how the device works, not how the current output is determined. But thank you for trying to point me to the answer (: – BowesAndArrows Sep 19 '14 at 03:43
1 Answer
The adapter is fixed voltage. The load resistance (the guitar pedal in this case) sets the current level. The stated output current is the maximum the adapter can deliver, if the load demands it.
Remember Ohm's law:
$$ I = \dfrac{V}{R} $$
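For instance (the load value here is purely illustrative): a pedal that looks like a 90 Ω load on a 9 V supply draws \$9/90 = 0.1\$ A, i.e. 100 mA, well under the adapter's 1700 mA rating. The adapter never "pushes" the full 1700 mA; the load simply doesn't take it.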
For a fixed voltage \$V\$, the load \$R\$ sets the value of \$I\$. A more sophisticated adapter or bench power supply can implement a current limiter, which senses the output current and reduces the output voltage to bring the current back down.
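As a rough sketch of that limiting behaviour (the voltage and current values are illustrative, not any particular adapter's design), here is a tiny Python model of a constant-voltage supply with a simple current limit driving a resistive load:

```python
# Minimal model of a constant-voltage supply with a current limit.
# V_SET and I_LIMIT are illustrative values, not a real adapter's specs.
V_SET = 9.0       # regulated output voltage (volts)
I_LIMIT = 1.7     # maximum output current (amps)

def supply_output(load_resistance):
    """Return (voltage, current) delivered into a resistive load."""
    current = V_SET / load_resistance      # Ohm's law at the set voltage
    if current <= I_LIMIT:
        return V_SET, current              # normal operation: load sets current
    # Limit engaged: hold current at the limit and let the voltage sag.
    return I_LIMIT * load_resistance, I_LIMIT

for r in (90.0, 10.0, 1.0):                # light, heavy, and near-short loads
    v, i = supply_output(r)
    print(f"R = {r:5.1f} ohm -> V = {v:4.2f} V, I = {i * 1000:6.0f} mA")
```

With the 90 Ω and 10 Ω loads the output stays at 9 V and the load sets the current; with the 1 Ω near-short the model folds the voltage back to 1.7 V so the current never exceeds the 1700 mA limit.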

Martin Petrei