12

I found a "0-32V 0-5A Portable Adjustable DC Power Supply 110V/220V" (link available if I'm allowed to write it here) and I wonder why we should have to adjust current output (this device can also adjust current output) if the current load is not going to exceed the maximum of 5A? In other words, why not let the power supply provide as much current as is needed (up to 5A)? What's the point of adjusting it to lower ratings, say 3A or 2A or 1A etc.? This is something I don't understand. Is it just the "ripple" thing I read somewhere else?

By the way, I'm not an electrical engineer or anything, just an enthusiast who is unaware of some of the more advanced aspects of the matter.

Thanks in advance for any explanations that clarify things in a rather lay manner, as much as possible.

Aggelos
  • Many thanks to all of you for your very informative answers. I understand now why it makes sense to limit maximum current output under certain circumstances. Thank you. – Aggelos Sep 11 '14 at 13:28
  • 4
    Don't forget to accept the answer that best answered your question. – Adam Head Sep 12 '14 at 16:42

8 Answers

28

Imagine the situation where you have a new card you have designed, one that has never been powered before...

You plug it into a 28V supply and set the current limit to 5A because "well it will only draw what it needs to work".

In the ideal case it should only draw 0.1A, but you have now hooked it up to a 140W supply. Suppose there is an unknown problem in your design, be it a:

  1. Circuit design issue
  2. Schematic capture issue
  3. Layout issue
  4. Layout manufacture issue
  5. Card stuffing (assembly) issue
  6. Test issue

A low-impedance path is created somewhere, so a circuit that would draw 0.1A now draws 5A => damage.
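
To put some numbers on that worst case (the 28V and 5A figures are from above; the 0.2A limit is only an illustrative choice):

$$P_\text{available} = V \times I = 28\ \mathrm{V} \times 5\ \mathrm{A} = 140\ \mathrm{W}$$

$$P_\text{limited} \le 28\ \mathrm{V} \times 0.2\ \mathrm{A} = 5.6\ \mathrm{W}$$

and in practice the limited case dissipates even less, because the supply drops its output voltage once the limit engages.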

Your test setup should be designed for what could happen, not just for what should.

5

If you are testing a circuit that should not take more than 0.1 amps, then setting the current limit to 0.2 amps prevents 5 amps from flowing should you short something out with your oscilloscope probe or meter test leads. This might save circuit board tracks from burning.

Andy aka
  • A simple analogy might be to ask why nearly all electrified homes have many 15A or 20A circuit breakers or fuses, since the electric company will only supply as much current as the home demands, and in many cases would have no trouble supplying 200A if required to do so. Not all applications require the same amount of current, and having an adjustable current limiter is easier than having a separate non-adjustable device for each different amount of current. – supercat Sep 11 '14 at 16:07
  • Oh well, the reason why a house has a circuit breaker is not to protect the equipment, but to prevent fires and harm to people. Also, AC works differently from DC in many ways. – Vlasec Mar 04 '16 at 09:25
4

As well as for precautionary and safety reasons, you may wish to use the CC (constant current) capability specifically for circuit operation purposes, or even CV and CC together.

The following examples are not 'contrived' - I regularly use a bench power supply in the manner and applications described below.

A PV (photovoltaic) panel, aka solar panel, operated in a typical illumination range has an output that approximates a constant current source up to a certain voltage; the current capability then falls off rapidly as voltage increases (or, equivalently, the voltage drops as load current increases). This characteristic can be simulated for test purposes by a power supply set to a voltage of panel Voc and a current limit of Isc. The actual panel Vout will droop slightly across its operating range, and this can be simulated by a series resistor. The end result allows reasonably good simulation for many test purposes.
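
As a rough sketch of that setup (the panel figures Voc = 21 V, Isc = 5 A and the series resistor value are made up for illustration; the supply is modelled as an ideal CV/CC source):

    # Hypothetical numbers for illustration, not from a real panel datasheet.
    V_SET = 21.0    # supply CV setting = panel Voc, volts
    I_LIMIT = 5.0   # supply CC setting = panel Isc, amps
    R_SERIES = 0.8  # series resistor mimicking the slight Vout droop, ohms

    def pv_sim_output(r_load):
        """Operating point of the CV/CC supply + series resistor into r_load."""
        total_r = R_SERIES + r_load
        if V_SET / total_r >= I_LIMIT:   # heavy load: supply hits its current limit
            i = I_LIMIT                  # behaves like an illuminated panel (~Isc)
        else:                            # light load: supply stays in CV mode
            i = V_SET / total_r          # load voltage approaches Voc
        return i * r_load, i             # (V across load, current)

    for r in (0.5, 2.0, 4.0, 10.0):
        v, i = pv_sim_output(r)
        print(f"R_load = {r:4.1f} ohm -> V = {v:5.2f} V, I = {i:4.2f} A")

At low load resistance the output sits at the current limit, just as an illuminated panel would; at high resistance it approaches Voc.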

LEDs should ideally be driven by a constant current source. When testing LED-based equipment, a power supply can be used with CV set slightly higher than the expected maximum LED Vf and the CC limit set to the desired LED current.

Most supplies allow the voltage to be set with ease. Those with adjustable current limits are not usually calibrated to allow the CC limit to be accurately set before use. An easy adjustment method is to short-circuit the output and adjust the variable current limit until the desired CC is achieved. In some cases the supplied CC at operating voltage may be somewhat different from the CC at short circuit. (You didn't buy an Agilent supply, did you?) To achieve a Vout closer to the desired one when setting CC, a resistor load may be used such that R < Voperating/CC_desired, and CC can then be adjusted.
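
As a worked example of that last method (the 12 V / 2 A figures are just for illustration): to set CC to 2 A for a circuit that will operate at about Voperating = 12 V, pick

$$R < \frac{V_\text{operating}}{CC_\text{desired}} = \frac{12\ \mathrm{V}}{2\ \mathrm{A}} = 6\ \Omega$$

say a 5 Ω resistor, load the output with it, and trim the current knob until 2 A flows. Note that the resistor then dissipates $I^2 R = 2^2 \times 5 = 20\ \mathrm{W}$, so a power resistor is needed.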

Russell McMahon
    +1 for the LED test especially (a common application for an enthusiast). Most of the units I've used allow the CC setting to be adjusted to an accuracy on a par with the display precision (i.e. low but OK) before enabling the output; if you want more precise control it's probably worth choosing your R to be close to R_load - they're not always that constant in use over a wide range of currents especially given lead/contact resistances. – Chris H Sep 11 '14 at 14:30
3

The amount of current supplied by the power supply is determined by the LOAD you connect to it, together with the voltage setting.

For example, if the output voltage is set to 10V, then connecting a load resistance of 2 ohms would allow a current of 5 amps (I = V/R, Ohm's law). With the output set to 20V, the same load would try to draw 10 amps, but as the supply is limited to 5 amps the output voltage would fall back to 10V.
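
Stated as a general rule (the formula is just Ohm's law applied to the two settings), the crossover between CV and CC operation for a resistive load sits at

$$R_c = \frac{V_\text{set}}{I_\text{limit}} = \frac{10\ \mathrm{V}}{5\ \mathrm{A}} = 2\ \Omega$$

using the settings above: loads larger than $R_c$ see the set voltage, while smaller loads push the supply into current limiting and the voltage falls.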

As Andy correctly points out, having just the full current capability available is not always a good idea when you are trying out a new circuit. The ability to limit the maximum output current to a lower value is very useful in preventing damage. It is also useful in the case of a fault developing in a circuit which may short the power supply and draw too much current through the wires/circuit (fire hazard).

JIm Dearden
3

The voltage and current controls on a bench power supply both function as limits, not settings.

If you want to supply a specific voltage to the load, adjust the voltage control to that voltage. The power supply will deliver that voltage unless the output current goes above the setting of the current limit control, in which case it will reduce the output voltage to keep the current at the limiting value.

If you want to supply a specific current through the load, adjust the current control to that current. The power supply will deliver whatever voltage is needed to drive that current through the load, limited to the setting of the voltage control.

In normal use as a constant-voltage supply you would set the voltage control as you wished and the current control to maximum, but as other replies note it is sometimes sensible to set a lower current limit to limit the damage that might be caused by a circuit or connection fault.
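
As a sketch, that dual-limit behaviour can be modelled for a purely resistive load in a few lines of Python (an idealised supply; the function name and test values are made up):

    def bench_supply(v_set, i_limit, r_load):
        """Idealised CV/CC bench supply driving a resistive load (r_load > 0).

        Both knobs act as limits: Ohm's law decides which one the load
        actually reaches, and the other output quantity follows from it.
        """
        i = min(i_limit, v_set / r_load)   # current never exceeds the limit
        v = min(v_set, i_limit * r_load)   # voltage never exceeds the setting
        mode = "CV" if v_set / r_load <= i_limit else "CC"
        return v, i, mode

    print(bench_supply(10.0, 5.0, 2.0))    # (10.0, 5.0, 'CV') - right at the crossover
    print(bench_supply(20.0, 5.0, 2.0))    # (10.0, 5.0, 'CC') - current-limited
    print(bench_supply(10.0, 5.0, 100.0))  # (10.0, 0.1, 'CV') - voltage-limited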

nekomatic
0

You (or someone else) might want to use the power supply to drive current through some coils to make a magnetic field (or another current-source application, such as driving LEDs). The B-field is proportional to the current, so a current source is called for. You set the current and then let the power supply set the voltage. (The voltage can change if it's a lot of current and the coils heat up.)
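
For a long solenoid, for instance ($N$ turns wound over length $L$), the field follows the set current directly:

$$B = \mu_0 \frac{N}{L} I$$

so a supply in CC mode effectively becomes a field knob.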

George Herold
0

In two words: less smoke.

When something goes wrong, you are rarely in a situation where the available power does no additional damage. I had an unprotected power supply where the 2N3055 used as the voltage regulator would just take the whole current the transformer was able to throw at it, basically stating "I'm not the first to give up". It cost me a few slow-blow fuses to figure out where the short circuit was. A 2N3055 on a proper heatsink will dissipate 150W of power on a continuous basis, IIRC. If your short circuit is not low-resistance enough to have essentially 0V across it, it will convert a lot of power into heat, and it will likely be unhappier about it than a 2N3055.

0

I'd like to post an answer to this, even though it has already been answered, as I recently experienced a need to limit current. I designed a stepper motor driver circuit, and the motor was rated at 4A. I made an error in my design, and one of the ICs in the circuit could only handle 2A. As soon as I plugged it in, it blew up. I then replaced the part, used the current limiting feature, set it to 1.5 amps, and it all worked fine. Of course the motor had less than half its torque, but nothing blew up!

JohnChris