
I have seen many resistors with different wattage ratings, i.e. 1/4 W, 1/2 W, 1 W, 2 W, 3 W, etc.

If I have a 3 W load, which in my case is a large 3 W LED, what resistor should I use?

The LED I am using needs 3.5 V DC at 0.8 A. I have a battery which outputs 8 V. How can I calculate the value of the resistor that will drop the 8 V down to what the LED needs?

Naeem Ul Wahhab

5 Answers


This is a basic electronics calculation; do it a hundred times before you move on.

It's Ohm's Law:

\$ V = I \times R \$

or, put differently:

\$ R = \dfrac{V}{I} \$

The voltage across the resistor is what remains after the 3.5V drop of the LED, so that's 8V - 3.5V = 4.5V. The current seems to be 800mA (though I also see 350mA here and there).

\$ R = \dfrac{4.5V}{0.8A} = 5.625\Omega \approx 5.6\Omega \$

Don't just pick a common 1/4W resistor. You should always check what power it will dissipate, especially with high currents like this.

\$ P = V \times I = 4.5V \times 0.8A = 3.6W \$

So the answer is a 5.6\$\Omega\$/5W resistor.
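As a cross-check, the whole sizing step can be reproduced in a few lines (a Python sketch using only the numbers from the question; the variable names are my own):

```python
# Series-resistor sizing for an LED, using the values from the question.
V_SUPPLY = 8.0   # battery voltage (V)
V_LED = 3.5      # LED forward voltage (V)
I_LED = 0.8      # LED current (A)

v_drop = V_SUPPLY - V_LED   # voltage across the resistor: 4.5 V
r = v_drop / I_LED          # Ohm's law: R = V / I  ->  5.625 ohm
p = v_drop * I_LED          # power in the resistor: P = V * I  ->  3.6 W

print(f"R = {r:.3f} ohm, P = {p:.1f} W")  # R = 5.625 ohm, P = 3.6 W
```

The resistor's power rating then has to be the next standard size above the computed 3.6 W, which is where the 5 W figure comes from.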

That's rather a waste, however. Both the LED and the resistor see the same current, so their power ratio is the same as their voltage ratio, and the efficiency is 3.5V/8V = 44%, excluding the LED's own efficiency.

A linear voltage regulator to bring the 8V down is no solution; it will dissipate the 3.6W just the same as the resistor. A switching regulator would help, but you'd have to keep its output pretty close to the LED's 3.5V for maximum efficiency. There are switchers which output a constant current instead of a voltage, however, and they're made for this job. The LT3474 needs only a couple of external components, can drive 1A, and can handle input voltages up to 36V. Efficiency for one LED at 800mA is slightly above 80% (for two LEDs it reaches nearly 90%).
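To see what that means for the battery, compare the total power drawn in both cases (a back-of-the-envelope sketch; the 80% figure is the LT3474 efficiency quoted above):

```python
V_SUPPLY = 8.0
V_LED = 3.5
I_LED = 0.8
EFF_SWITCHER = 0.80  # LT3474 efficiency for one LED at 800 mA

p_led = V_LED * I_LED                      # useful power in the LED: 2.8 W
p_resistor = V_SUPPLY * I_LED              # resistor circuit: 6.4 W from the battery
p_switcher = p_led / EFF_SWITCHER          # switcher circuit: 3.5 W from the battery

print(f"resistor: {p_resistor:.1f} W, switcher: {p_switcher:.1f} W")
```

The series resistor draws nearly twice the power from the battery for the same light output, which is the whole argument for the constant-current switcher.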

stevenvh

The power dissipated in a resistor is the current flowing through it times the voltage dropped across it. The variable missing from your question was the current the LED needs to operate at (which you have added now). It depends on your LED, and there is an acceptable range: more current for brighter, less for dimmer. Resistance = (power supply voltage − LED voltage drop) / LED current. Once you know those values you can see how many watts your resistor will dissipate. Check out the Wikipedia article on LEDs.

R = (8 V − 3.5 V) / 0.8 A = 5.625 Ω

P = I × V = 0.8 A × 4.5 V = 3.6 W

It is possible I am misunderstanding your specs, or that they are wrong. But if they are correct, you need an LED driver instead.

Matt

First thing: 350 mA at 3.5 V gives about 1.2 W, so are you sure about your specs?

EDIT: OK, now we have the right current.

Second: in my opinion, driving a 3.5 V LED from an 8 V source with only a resistor is a waste of power, because you will dissipate more power in the resistor than in the LED; you will need at least a 4.5 V × 0.8 A = 3.6 W rated resistor.

One way could be to use two LEDs in series, or to use a voltage regulator; but if you are sure that you want this configuration, I think you need at least a 4 W resistor with a value of about 4.5/0.8 ≈ 5.6 Ω (to stay within the E12 series of standard values).
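Snapping a computed value to the E12 series can be done mechanically (a sketch; the list below is the standard 10%-tolerance decade, and the helper function is my own):

```python
import math

# One decade of the E12 standard resistor series (10 % tolerance).
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def nearest_e12(r):
    """Return the E12 standard value closest to resistance r (in ohms)."""
    decade = 10 ** math.floor(math.log10(r))
    # Include the first value of the next decade so e.g. 9.5 maps to 10.
    candidates = [v * decade for v in E12] + [E12[0] * decade * 10]
    return min(candidates, key=lambda v: abs(v - r))

print(nearest_e12(5.625))  # 5.6
```

For the 5.625 Ω computed above, the nearest standard value is 5.6 Ω, matching the figure in the other answers.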

Maybe a better solution would be PWM regulation with a capacitor, but you would need a waveform generator...

clabacchio
  • The problem is that standard regulators (like the 78L05) also dissipate power as heat, much like the simple resistor. A switching regulator IC would probably be needed to save battery energy. – Al Kepp Jan 18 '12 at 17:11
  • Yes, in fact I was thinking of a more sophisticated regulator, like a buck converter, but it's getting a bit complicated... – clabacchio Jan 18 '12 at 18:05
  • Take a look at the datasheet of the LM3407, for example. It needs very few external components, and it also supports analog and digital (PWM) dimming. – Marki555 Jan 19 '12 at 12:49
  • In a word...Awesome! – clabacchio Jan 19 '12 at 13:16

Typical LEDs need about 20 mA of current. Ohm's law says a 1 kΩ resistor is needed for that current draw on 8 V to run your LED... you can adjust that resistance slightly for brightness. Your comment about a 3 W load confuses me; a single LED doesn't pull 3 watts. More info would help. Anyway, if it's just 8 V with one LED pulling 20 mA, then you need a resistor rated for 1/8 W or better.

Chef Flambe

The current through an LED also depends on its temperature. When the LED lights up, its temperature rises. Unfortunately, as the temperature rises, the LED draws more current at the same voltage, which means even higher temperature, higher current, and eventually malfunction. This is why high-power LEDs should always be used with a constant-current source and not just a series resistor. You can find many LED driver ICs which need only a few external components.

Another reason against a series resistor in your design is that the heat dissipated in the resistor would deplete the battery much more quickly.

Marki555
  • The current change is never so high that thermal runaway will occur; the resistor takes care of that. Literally billions of LEDs are resistor-controlled without problems. – Federico Russo May 24 '12 at 08:13
  • "heat dissipated in the resistor would deplete the battery much quicker." Than what? The LED driver's constant current source? What do you think it does with the voltage difference? Indeed, dissipate as heat. – Federico Russo May 24 '12 at 16:33
  • Look at the LM3407 for example. Its efficiency is between 86-96%. What is the efficiency of a resistor-based source with an input of 8 V and output of 3.5 V? Of course these LED drivers also dissipate heat, but much less than voltage difference × LED current. – Marki555 May 25 '12 at 07:50
  • Billions of normal LEDs are controlled by resistors without problems, but not high-power LEDs (with current > 500 mA). A resistor doesn't take care of current change; it makes a constant-voltage source, not the constant-current source which is almost a must for high-power LEDs. – Marki555 May 25 '12 at 07:52
  • Then *say* it's a switcher. You don't say anything about it in your answer. The difference is important; just saying "constant-current source" doesn't automatically imply it's a switcher, on the contrary. – Federico Russo May 25 '12 at 07:54
  • A resistor is neither a constant current, nor a constant voltage source, but it will give you a fairly fixed current if the 8V is properly regulated. – Federico Russo May 25 '12 at 07:57