Possible Duplicate:
Choosing power supply, how to get the voltage and current ratings?

I have a 5V / 1A regulated supply. I'm thinking of using it to power a PCB which asks for 5V and 0.25A.

In this case, I'm worried the power supply could be too powerful and damage the board. My best guess is to put a resistor in parallel with the load, so that 3/4 of the current goes through the resistor and the rest through the board.

Any ideas?

Sprite
  • In some cases, a resistor (often an NTC) is placed in series upstream of a power supply, to provide some protection against spikes. This is not one of those cases. – Kevin Vermeer Jul 06 '11 at 16:43
  • This misunderstanding of power supply current ratings has come up many times in different questions. Is there a way we can combine them or make one more prominent? – endolith Jul 06 '11 at 16:46
  • i'm with @endolith on this. how many different questions do we have to maintain, to provide the answer "supplies do not mandate current" – JustJeff Aug 21 '11 at 13:13
  • Sprite, think about this - if you turn that supply on with *nothing connected* to it, what happens to the current then? – JustJeff Aug 21 '11 at 13:41

4 Answers

The power supply will definitely not damage your PCB. The PCB will draw only as much current as it needs. You could even use a 5V/10A regulated supply and it will give only as much current as the PCB asks for (in your case 0.25A). So there's no need for resistors.
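To see why, you can model the board as a simple resistive load. This is a rough sketch, and the 20 Ω figure is hypothetical, chosen only so that the board draws 0.25 A at 5 V:

```python
# The load, not the supply's rating, determines the current (Ohm's law).
V_supply = 5.0    # regulated output, volts
R_load = 20.0     # hypothetical effective load resistance, ohms

I_drawn = V_supply / R_load  # 0.25 A, whether the supply is rated 1 A or 10 A
```

A higher-rated supply just has more headroom; the current drawn stays the same.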

m.Alin
The current rating of a power supply is the maximum it can deliver if the load demands it. A power supply cannot dictate both the voltage and the current. In this case the supply will keep the voltage at 5V and the load will draw whatever it needs. The 1A rating means the load can draw up to 1A before the supply might no longer be able to keep its output at 5V.

That said, some supplies are designed to require a minimum current to function, usually around 10% of the maximum rated current. In this example, that would mean the supply only promises to keep the output at 5V if you draw between 100mA and 1A. What happens above or below that range depends on the supply. The only way to know whether your supply has a minimum current requirement is to check its datasheet.
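As an illustrative check, assuming the 10% minimum-load figure mentioned above (your supply's datasheet is the authoritative source):

```python
# Is the board's draw inside the supply's guaranteed regulation window?
I_max = 1.0           # rated maximum output current, amps
I_min = 0.10 * I_max  # hypothetical 10% minimum-load requirement, amps
I_load = 0.25         # the board's draw, amps

in_regulation = I_min <= I_load <= I_max  # True: the 5 V output is guaranteed
```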

If your board draws 250mA, the supply will almost certainly be able to maintain the specified voltage, but check the datasheet anyway.

In no case can the supply somehow force 1A through your board at 5V if the board only wants to draw 250mA.

Olin Lathrop
Well, talking about ideas, this is a bad one. :-) You'll lose 3.75W for nothing.
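For reference, that 3.75W figure follows directly from the numbers in the question; the resistor value is just what the parallel-resistor idea would require:

```python
# Power wasted by the proposed parallel resistor (values from the question).
V = 5.0            # supply voltage, volts
I_resistor = 0.75  # "3/4 of the current" the resistor would carry, amps

R = V / I_resistor         # ~6.67 ohm resistor needed
P_wasted = V * I_resistor  # 3.75 W dissipated for nothing
```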

The regulator won't supply 1A if the load only requires 0.25A; it's the load that determines this. If the load needs 250mA, it doesn't matter whether the power supply can deliver a thousand amperes: only 250mA will flow.

stevenvh
  • true enough, my point though is that it misses the point of the question. – JustJeff Aug 21 '11 at 13:46
  • here's an ideal place for that .. http://electronics.stackexchange.com/questions/17593/limiting-dc-motors-stall-current-draw-from-power-source – JustJeff Aug 21 '11 at 14:52
Use a fuse if you are worried about an overcurrent condition damaging your board. Under normal operating conditions you will only pull 0.25A from your power supply.

vicatcu