
When I am designing circuits, I usually look at the current and voltage specs of my source to determine whether it is sufficient for my circuit. For example, if I have a load that requires 500mA of current but my source can only supply 250mA, then I would deem this source insufficient for my needs.

I was recently told that I should instead be using power to determine if my sources are sufficient for my circuit. What I mean by this is that, for example, if I have a 12V source rated for 250mA, then my source can supply 3W. If my load is 1V at 500mA, then the power consumption of my load is 0.5W. If we use power to analyze this, then we see that 0.5W < 3W, so does that mean that this source is sufficient for our design? Even though the current draw of my load is larger than what my source can supply, can I use this source because the available input power is larger than my output power?

Should I be using power to see if my sources and loads are compatible or should I be looking at the current and voltage requirements individually?

Thanks!

maxonezhou
    You've demonstrated that power alone is not sufficient to specify the supply so I'd suggest you ask whoever 'told' you to clarify their intent. The only interpretation I can think of is when choosing a standard power supply, you determine what the power requirements are to lead you to a suitable series of power supplies and then select the model that has the required voltage and current specs. – Kartman May 24 '22 at 22:05
  • You need to match ALL of voltage, current and power. If any one of those fails to meet requirements the circuit may not operate. But part of the circuit may adapt one requirement to match another: e.g. a 120V 1A (AC) to 12V 10A transformer, or a 12V 100mA to 1V 1A buck converter. Then you can use the low current source ... using the buck as an adapter. –  May 25 '22 at 10:53

4 Answers


Your informant is correct, but with caveats.

The statement is that a power supply able to provide 3W of power can supply any load that demands 3W or less. This is absolutely true.

The caveat is that this power source has conditions attached. One of those conditions is that the voltage used to deliver the power is 12V. The other condition is that you can draw up to, but not exceeding, 250mA.

The question, then, isn't whether the power supply can provide the power (it can), it's whether the load can use that power given those conditions.

Take, for example, a load that is a 3W, 6V incandescent lamp, which (at 6V) would draw a current of:

$$I=\frac{P}{V}=\frac{3W}{6V}=0.5A$$

The lamp requires 3W, so at first glance it seems that the power supply is suitable. Indeed, the supply can provide all the power necessary to operate this lamp, as your informant suggests. However, the question is not whether the supply can provide the power, it's whether the load can use that power at 12V. The answer is no, it can't. For starters, 0.5A is more current than the supply can provide. On top of that, given 12V, that lamp would try to draw way above 0.5A!
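As a rough sanity check, here is a short Python sketch; it treats the lamp as a fixed resistance, which a real incandescent filament is not, so the numbers only illustrate the direction of the problem:

```python
# 3 W, 6 V lamp on a 12 V, 250 mA supply (fixed-resistance approximation).
P_RATED = 3.0         # W, lamp rating
V_RATED = 6.0         # V, lamp rating
V_SUPPLY = 12.0       # V, supply voltage
I_SUPPLY_MAX = 0.250  # A, supply current limit

i_rated = P_RATED / V_RATED     # 0.5 A at the lamp's rated voltage
r_lamp = V_RATED / i_rated      # 12 ohms, ignoring the filament's temperature dependence
i_at_12v = V_SUPPLY / r_lamp    # 1.0 A if connected directly to 12 V
p_at_12v = V_SUPPLY * i_at_12v  # 12 W, four times the lamp's rating

print(f"Rated current: {i_rated:.2f} A (supply limit: {I_SUPPLY_MAX:.3f} A)")
print(f"Directly across 12 V: {i_at_12v:.2f} A, {p_at_12v:.0f} W")
```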

You can overcome this with a DC-to-DC converter. An ideal DC-DC converter is a device that can take any voltage as an input and produce a different output voltage, with 100% power efficiency. In other words, a DC-DC converter can take power from a supply with a voltage condition attached (a fixed 12V output, in this case) and provide a different voltage source. However, we must not forget that the converter is supposed to be 100% efficient, so it must transform not only the source voltage but also the current.
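To make the power-conservation condition concrete, here is a minimal sketch (the function is purely illustrative):

```python
def ideal_converter_input_current(v_in, v_out, i_out):
    """Input current drawn by a lossless DC-DC converter.

    With 100% efficiency, input power equals output power:
    v_in * i_in = v_out * i_out, so i_in = v_out * i_out / v_in.
    """
    return v_out * i_out / v_in

# 12 V source feeding the 6 V, 0.5 A lamp through an ideal converter:
i_in = ideal_converter_input_current(v_in=12.0, v_out=6.0, i_out=0.5)
print(f"Input current: {i_in * 1000:.0f} mA")  # 250 mA, within the supply's limit
```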

In our example of a 12V power supply, limited to 250mA (for a maximum output power of 3W) and a lamp requiring 6V at 500mA (also 3W), you could use a "buck converter", otherwise known as a "step-down DC-DC converter", in the following configuration:

[Schematic (created using CircuitLab): the 12V, 250mA-limited supply feeds an ideal buck converter, which delivers 6V at 500mA to the lamp.]

Here you can see that the power supply is providing 250mA, which it is quite capable of, but the lamp is drawing twice that, 500mA! Crucially, the power being provided by the power supply and the power being delivered to the lamp are both 3W:

$$
\begin{aligned}
P_{SOURCE} &= V_{SOURCE} \times I_{SOURCE} = 12V \times 250mA = 3W \\
P_{LAMP} &= V_{LAMP} \times I_{LAMP} = 6V \times 500mA = 3W
\end{aligned}
$$

We can conclude that this power supply is able to operate the lamp, but we will have to use an intermediary to bridge the different conditions of supply and demand.

If you are wondering how it's possible to get more current than 250mA, from a source limited to 250mA, I remind you that the only physical law that needs to be obeyed here is the law of conservation of energy. There's no law called "conservation of current". The means of achieving this current-boost and voltage-buck behaviour are complex, and I encourage you to study "DC-DC converters" and their counterparts "linear regulators", to understand this better.

In this example I have assumed a 100% efficient buck converter, which is of course impossible. Nothing is 100% efficient in real life, but the principles here should be clearer now.
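If you want to see how losses eat into the margin, the same arithmetic extends easily; the 90% figure below is an assumed ballpark, not a number from any datasheet:

```python
def required_input_current(v_in, v_out, i_out, efficiency):
    """Input current a real converter needs: p_in = p_out / efficiency."""
    p_out = v_out * i_out
    p_in = p_out / efficiency
    return p_in / v_in

# Same 12 V -> 6 V, 0.5 A example, but with an assumed 90% efficient converter:
i_in = required_input_current(v_in=12.0, v_out=6.0, i_out=0.5, efficiency=0.90)
print(f"Required input current: {i_in * 1000:.0f} mA")  # ~278 mA, over the 250 mA limit
```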

Edit, for clarification: When I say there's no law "conservation of current", immediately I am reminded of Kirchhoff's Current Law (KCL), which embodies that very principle I deny the existence of. However, the operation of a DC-DC converter depends on either magnetic coupling or switching of a capacitor's effective location, which both involve discontinuities in the current path. So, while KCL is true, it's not appropriate to apply it to currents in and out of a DC-DC converter or charge pump.

Simon Fitch

If the load is a linear regulator, use current rating of the supply.

If the load is a buck or boost DC-DC converter, use power.

But in reality, most loads specify voltage and current, so you just use the voltage and current the load needs.
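If it helps, here is a rough sketch of that decision rule in Python (the function shape and the 90% converter efficiency are illustrative assumptions, not a standard recipe):

```python
def source_is_sufficient(load_type, v_src, i_src_max, v_load, i_load, efficiency=0.9):
    """Rough sufficiency check mirroring the rule above (illustrative only)."""
    if load_type == "linear":
        # A linear regulator passes roughly the same current it delivers,
        # so the source's current rating is the limit that matters.
        return i_load <= i_src_max
    if load_type == "switcher":
        # A buck/boost converter conserves power (minus losses),
        # so the source's power capability is the limit that matters.
        return v_load * i_load / efficiency <= v_src * i_src_max
    raise ValueError("load_type must be 'linear' or 'switcher'")

# The asker's example: 12 V, 250 mA source; 1 V, 500 mA load.
print(source_is_sufficient("linear", 12.0, 0.250, 1.0, 0.500))    # False
print(source_is_sufficient("switcher", 12.0, 0.250, 1.0, 0.500))  # True
```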

user57037

I don't think there is a one-size-fits-all type of approach. Typically, I would look at the voltage requirements and then the current. That means, if I had a device that requires 2.7V to 5.5V and needs 500mA, I would probably find a regulator that will output a voltage in that range and also make sure that it can supply 500mA plus some headroom (e.g. 30%).

When you think of it in terms of power only, all you can say is whether or not the source is capable. If you're stuck with a specific source, that is what you do, but it may still require some conversion stage to set the correct voltage. That's why you're probably better off choosing the right component(s) in terms of voltage and current, to avoid having additional stages.
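If it helps, here is a minimal sketch of that check, using the 30% headroom from the example (the margin and numbers are illustrative, not a universal rule):

```python
# Device accepts 2.7 V to 5.5 V and needs 500 mA; require 30% current headroom.
V_MIN, V_MAX = 2.7, 5.5   # V, acceptable input range of the device
I_LOAD = 0.500            # A, current the device needs
HEADROOM = 0.30           # extra current margin

required_current = I_LOAD * (1 + HEADROOM)   # 0.65 A

def regulator_ok(v_out, i_out_max):
    """True if a candidate regulator fits the voltage window and current headroom."""
    return V_MIN <= v_out <= V_MAX and i_out_max >= required_current

print(regulator_ok(v_out=3.3, i_out_max=1.0))   # True: in range, enough current
print(regulator_ok(v_out=3.3, i_out_max=0.5))   # False: no headroom over 500 mA
```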

Big6
  • Thanks for the comment! So are you saying that my source in the example in my question is capable of supplying the load even though it can only provide 250mA of current vs the 500mA the load wants just because the input power is greater than the output power? – maxonezhou May 24 '22 at 22:49
  • @maxonezhou: If the voltage of power source is incorrect, then you have to consider how you will convert the voltage to the level required by your load. For linear voltage regulators (78xx, LM317, LDO), the input current for the regulator will be slightly greater than the output to your load - if the source is not capable of supplying that current, then it is not adequate, regardless of its power rating. If you use a switching regulator, its power input is a little greater than its power output, so your original source would be adequate to power your load. – Peter Bennett May 24 '22 at 23:05

I suspect that either you're not giving us all the information, or your source of information is a little confused.

The power-based advice is fine if your device has an internal DC-DC converter.

Let's say you have a 5-volt power input, with an internal 5V-to-12V converter, which provides 250 mA to your load circuitry.

A quick way to size your 5-volt supply is to observe that 250 mA at 12 volts is 3 watts, and 3 watts at 5 volts is 600 mA, plus a margin for power losses in the converter.
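In a short sketch, that sizing arithmetic looks like this; the 15% loss allowance is an assumed margin, not a measured converter figure:

```python
# 5 V input feeding an internal 5 V -> 12 V converter that supplies 250 mA.
V_IN = 5.0          # V, supply being sized
V_OUT = 12.0        # V, internal converter output
I_OUT = 0.250       # A, load current at 12 V
LOSS_MARGIN = 0.15  # assumed allowance for converter losses

p_out = V_OUT * I_OUT                         # 3 W delivered at 12 V
i_in_ideal = p_out / V_IN                     # 0.6 A if the converter were lossless
i_in_sized = i_in_ideal * (1 + LOSS_MARGIN)   # ~0.69 A with the margin

print(f"Ideal input current: {i_in_ideal * 1000:.0f} mA")
print(f"With loss margin:    {i_in_sized * 1000:.0f} mA")
```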

On the other hand, your example doesn't state that, so the answer may be different:

For example if I have a 12V source with 250mA current then my source can supply 3W. If my load is 1V 500mA then the power consumption of my source is 0.5W. If we use power to analyze this then we see that 0.5W < 3W so does that mean that this source is sufficient for our design?

Well, we can't tell. HOW is your circuit driving your load? Let's say you are using a linear regulator to drop your input to the load voltage. Then the current into the circuit must equal the current out to the load. Power has nothing to do with it. Electrons in, electrons out. In this case, a 12 volt, 250 mA source will not work, even though its maximum power, 3 watts, is greater than the load power, 0.5 watts. The reason is that the linear regulator will need to drop 11 volts at 500 mA, and will dissipate 5.5 watts, for a total dissipation of 6 watts.

If your circuit to drive the load is a 12-volt to 1-volt DC-DC buck converter with 50% efficiency, then it will dissipate 0.5 watts while providing 0.5 watts, for a total of 1 watt, which will require an input current at 12 volts of 83 mA.
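A short sketch of both cases with the numbers above (the 50% efficiency is the assumed figure from the example):

```python
# 12 V, 250 mA source driving a 1 V, 500 mA load two different ways.
V_SRC, I_SRC_MAX = 12.0, 0.250
V_LOAD, I_LOAD = 1.0, 0.500
P_LOAD = V_LOAD * I_LOAD                  # 0.5 W delivered to the load

# Case 1: linear regulator. Input current equals load current.
i_in_linear = I_LOAD                      # 0.5 A -> exceeds the 250 mA source limit
p_regulator = (V_SRC - V_LOAD) * I_LOAD   # 5.5 W dropped in the regulator
p_total_linear = V_SRC * i_in_linear      # 6 W drawn from the source

# Case 2: buck converter at an assumed 50% efficiency. Input power is twice the load power.
p_in_buck = P_LOAD / 0.50                 # 1 W
i_in_buck = p_in_buck / V_SRC             # ~83 mA -> well within the 250 mA limit

print(f"Linear: {i_in_linear * 1000:.0f} mA in, {p_total_linear:.1f} W total")
print(f"Buck:   {i_in_buck * 1000:.0f} mA in, {p_in_buck:.1f} W total")
```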

Your question doesn't have a simple answer, because it depends on your circuitry. Current may be the limiting factor, or it may be power. There is no way to tell without more information from you.

WhatRoughBeast