29

When I was young and learning about electricity, a fabulous tool for understanding voltage/current/resistance was an incandescent lightbulb (in my case it was a small 3V bulb). When you doubled the voltage by putting two batteries in series, it glowed 4x as bright, but heated up more and was more prone to burning out. When you put two lightbulbs in series, they glowed 1/4 as bright. When you put them in parallel, they glowed normally, but drained the battery twice as fast. Etc.

These days, however, incandescent bulbs are on the way out, and LEDs are replacing them for good reasons (like not burning out every few months). But LEDs are different and follow different rules, which I don't understand very well myself.

I was wondering: can LEDs be used in the same way? I know that for an LED to be usable in a similar way to a classical lightbulb, you need to put it in series with a resistor; otherwise it draws too much current and burns out. I think you can even buy LEDs with built-in resistors. But would they work in the same fashion? Would changes in voltage be accompanied by corresponding changes in brightness?

Vilx-
  • LEDs are current driven devices. There are some graphs around that provide you with current vs. light output information. – PlasmaHH Sep 06 '16 at 10:18
  • It's the forward current that determines the brightness of an LED, not the voltage. In the case of an LED current vs luminous intensity is pretty linear, that is to say 2x the current 2x the brightness. – Sep 06 '16 at 10:19
  • If you want to control the brightness of a LED, you need to control the current flowing through it. The voltage only needs to be slightly above the forward voltage rating of the LED. A common way to control brightness is through pulse width modulation (PWM). Instead of giving a LED less voltage, to dim it, you give it the full voltage, but in repeating bursts. The duty cycle determines brightness. – Drunken Code Monkey Sep 06 '16 at 16:58
  • I'm almost 99.99% sure this is a dupe, as I recall the "how to control diode brightness with voltage" matter being answered already (and also being in the network "hot questions" too, AFAIR)... somebody cares enough to find the original one? – Sep 06 '16 at 18:04
  • They don't actually. It's not 4 times as bright, and it's not 1/4 as bright. Incandescent bulbs are non-linear resistors: the resistance increases as they get hotter. When (V squared) is 4 times as big, R is also bigger. When (I squared) is 1/4, R is also smaller. Also, incandescent bulbs change color when they get hotter or cooler: they are redder (more yellow) when cool, bluer (more white) when hot. – david Sep 07 '16 at 02:52
  • Argh! First someone objects that it's not 2 times, it's 4 times; now it's not even 4 times! :D Well, whatever the ratio, you get the idea! :) – Vilx- Sep 07 '16 at 06:25
  • You should be able to get a good approximation to what you want from series combinations of LEDs and resistors, each with most of the voltage drop across the resistor. That will require higher battery voltages, though. – user6030 Sep 09 '16 at 03:48
  • @user6030 - True, but from what I've read here, I've come to the conclusion that it will be much simpler (and cheaper) to just use incandescent bulbs. At the small sizes they're still plenty available. – Vilx- Sep 09 '16 at 06:26
  • Strictly speaking, brightness of an incandescent lamp is determined by power flowing through it - current x voltage. It just so happens that incandescent filaments are pretty much resistive loads so voltage and current are proportional. LEDs are non-linear, but brightness will still be a function of power flowing through them (also current x voltage); it's just that voltage and current don't vary in proportion. – Anthony X Jun 24 '18 at 14:18

6 Answers

35

LEDs are a very different beast compared to incandescent light bulbs. They belong to a class of devices known as non-linear devices, which don't follow Ohm's Law in the classic sense (although Ohm's Law is still used in conjunction with them).

An LED is (obviously) a form of diode. It has a forward voltage, which is the voltage at which the diode starts to conduct. As the voltage increases, so does the current through the diode, but in a highly non-linear fashion.

              (Figure: current vs. voltage curves for several diodes, showing the current rising steeply once the forward voltage is exceeded; the red curve is referenced below.)

With an LED it's the amount of current flowing through it that determines how bright it is. Increasing the voltage does increase the current, but the region where that happens without the current becoming excessive is very small. In the red curve above it may be that tiny little bit around 1.5V; by the time you get to 2V the current is off the scale and the LED burns out.

Putting LEDs in series does sum the forward voltages, so you have to provide a higher voltage for conduction to start, but the controllable region is still just as tiny.

So we control the current instead of the voltage, and take the forward voltage as a fixed value. By either including a resistor in the circuit to fill the gap between the supply voltage and the forward voltage, limiting the current in the process, or by using a constant current supply, we can set the current that we want to flow through the LED and thus set the brightness. By increasing the current, but not increasing the voltage (or only a negligible amount, and purely incidentally), we increase the brightness.

The formula for calculating the resistance to use for a specific current is:

$$ R = \frac{V_S - V_F}{I_F} $$

Where \$V_S\$ is the supply voltage, \$V_F\$ is the LED forward voltage, and \$I_F\$ is the desired LED forward current.
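As a tiny Python sketch of that formula (the 5 V supply, 2 V forward voltage and 20 mA below are only typical example values, not from any particular part):

```python
# A minimal sketch of the formula above: pick the series resistor that sets
# a desired LED current. The example values (5 V supply, 2 V red LED, 20 mA)
# are just typical illustrative numbers, not from any particular datasheet.
def led_resistor(v_supply, v_forward, i_forward):
    """R = (Vs - Vf) / If"""
    if v_supply <= v_forward:
        raise ValueError("supply voltage must exceed the LED forward voltage")
    return (v_supply - v_forward) / i_forward

r = led_resistor(v_supply=5.0, v_forward=2.0, i_forward=0.020)
print(f"R = {r:.0f} ohms")   # -> 150 ohms
```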

Majenko
  • OK, but if I used a LED-resistor pair as a single "lightbulb" unit, then it would work, right? – Vilx- Sep 06 '16 at 10:29
  • In a way. You would end up doubling both your resistance and the forward voltage, which would mean you would need to increase the voltage to compensate. For instance, one LED at 2V and 20mA from a 5V supply would have a 150Ω resistor. Double everything, so 4V forward voltage and 300Ω resistance, and to get the same current flowing you would need a 10V supply. But that is not how you think about LEDs. You don't think "For these LEDs I need a supply of X volts", but "I need to supply Y mA. How can I best achieve that?" And in the example I just gave it would be better to reduce the resistance not... – Majenko Sep 06 '16 at 10:34
  • ...increase the voltage since that reduces the power lost through the resistor (drop it to a 50Ω resistor and keep the voltage at 5V). \$P=I^2R\$, so at 300Ω you have \$0.02^2 \times 300 = 120mW\$ but at 50Ω you have \$0.02^2 \times 50 = 20mW\$. – Majenko Sep 06 '16 at 10:36
  • No, no, you misunderstood me. Suppose you have the 2V/20mA LED with a 150Ω resistor that you mentioned above. If you put this to a 5V power source, it glows with brightness X. If you now replace the power source with a 10V power source, but keep everything else the same, would it now glow at 2X brightness (or close to it)? Or would it go up in smoke? – Vilx- Sep 06 '16 at 10:43
  • Ah, right. Well, do the math. 10V - 2V = 8V. 8V over a 150Ω resistor is 53.3mA (0.0533A). More than double the current, and more than likely too much for the LED to handle. Also note that brightness is *non-linear*. There's very little difference in brightness (to the human eye) between 15mA and 20mA on a typical LED, so 53mA on a 20mA LED won't be that much brighter either. Certainly not for long anyway ;) – Majenko Sep 06 '16 at 10:47
  • Ok, do the math again. 4V forward voltage total, 300Ω resistance total, 5V supply. 5V - 4V is 1V across the resistor(s). Current through it all is 1/300 = 3.3mA. A tiny fraction of before (a long way off half the current). And so a tiny fraction of the brightness from each LED. You see the non-linearity of the relationship now? – Majenko Sep 06 '16 at 10:48
  • OK, I get the gist, but if you don't mind, I'd like to work through the math myself, purely to understand it. One thing that confuses me is that you're subtracting voltage values. That... doesn't follow any of the (few) electrical laws that I know of. Those being Ohm's law and the laws of how current/voltage divides when things are connected in series or in parallel. – Vilx- Sep 06 '16 at 10:55
  • You subtract the voltage of the LED from the supply voltage because that forward voltage is a fixed property of the LED. Like I said in my answer, LEDs do not abide by Ohm's Law. They are non-linear devices. Ohm's law only applies to linear devices, like resistors. – Majenko Sep 06 '16 at 11:00
  • OK... that still doesn't make sense to me, but I'll accept it. I sense that a full proper explanation would be too lengthy/complicated to be handled here in comments. – Vilx- Sep 06 '16 at 11:05
  • As an aside note, and just to be pedantic :-), light bulbs are non-linear devices too. – Sredni Vashtar Sep 06 '16 at 14:59
  • @SredniVashtar Yeah, I was thinking about mentioning that, but decided it would just muddy the waters even more. – Majenko Sep 06 '16 at 14:59
  • You could change the initial paragraph along the lines of "Although both light bulbs and LEDs are non-linear devices, the non-linearity shown in the V-I characteristic of LEDs is much more prominent and cannot be ignored". You'd just have to rewrite this in understandable English :-] – Sredni Vashtar Sep 06 '16 at 15:14
  • @Majenko I can quite easily tell the difference between 15 and 20mA for piranha LEDs! – chrylis -cautiouslyoptimistic- Sep 06 '16 at 16:10
  • @chrylis Well you're just special then aren't you? – Majenko Sep 06 '16 at 16:30
  • Just out of curiosity - how difficult would it be to make some sort of circuit that would make an LED's brightness linearly dependent on the applied voltage? Over some useful range, at least. – Vilx- Sep 07 '16 at 06:28
  • I find this answer, while being correct, slightly misleading for this very basic question - especially the graph. While certainly not being incorrect, it suggests, on first glance, that it is thinkable to drive the LED directly with a simple power source (like a battery), as long as you somehow hit the correct voltage. I think the answer would benefit from making it much clearer that the LED basically does not care about voltage (within certain limits) as long as you keep the current constant. I would expect the first graph in this answer to be the simple battery + resistor + LED picture. – AnoE Sep 07 '16 at 10:57
  • Does it mean I can connect LED to 240V AC power with 10 resistor? At least to state that LED shines (blinks) and is not burned? I know that AC power source changes polarity 50 times/second, so LED would not shine for half of time (it should survive reversed current), but for half time it would work, if maximal current doesn't exceed its capacity. – Pointer Null Dec 09 '18 at 11:03
  • Sure, you can run an LED directly from mains with a suitable resistor - and many panel indicators do exactly that. However it is more common to use a capacitor to limit the current for AC, and add a more reliable diode to block the reverse current, since LEDs are not good at blocking reverse current - or use a full bridge rectifier to convert it to DC. – Majenko Dec 09 '18 at 11:06
  • Upvote 7 years on :-). Still here? – Russell McMahon Aug 03 '23 at 22:57
11

No, an LED by itself (no resistors or other electronics) behaves quite differently from a light bulb.

Have a look at this datasheet of a random LED.

Scroll down to the page with many graphs. The third graph shows the relative intensity (light) versus current through the LED:

Intensity vs. Current (334-15/T1C1-4WYA datasheet)

(Source: 334-15/T1C1-4WYA datasheet)

You'll notice that this curve is somewhat linear, meaning twice the current would give you roughly twice as much light.

What have we learned: an LED's brightness is somewhat proportional to the current flowing through it.

But what current do you get for a certain voltage?

Look at graph 2:

Current vs. Voltage (334-15/T1C1-4WYA datasheet)

(Source: 334-15/T1C1-4WYA datasheet)

This is forward current vs. forward voltage; notice how the current increases rapidly for voltages above 3 V. Only 0.5 V more gives 4x the current! This curve also varies from LED to LED and with temperature.

That is why it is better to feed LEDs with a current instead of a voltage. If you feed an LED with a voltage, the current is not very predictable, so neither is the brightness. The power fed to the LED (power = voltage x current) will then vary as well.

It is better to keep an LED at a constant current, which is why series resistors are needed: they limit the current to the intended value. Not exactly, but close enough for most purposes.

With the series resistor in place, an LED (+ resistor) behaves somewhat more like a lightbulb, in the sense that the change in brightness is more nearly proportional to the voltage you apply.
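To get a feel for how steep that current-vs-voltage curve is, here is a toy Python sketch using a simple exponential diode model; the constants are invented for illustration and are not taken from the datasheet above:

```python
# A toy sketch of why the I-V curve is so steep, using the exponential diode
# model I = Is * exp(Vf / (n*Vt)). The constants below are invented for
# illustration only; they are NOT taken from the 334-15/T1C1-4WYA datasheet.
import math

N_VT = 0.10                             # assumed n * thermal voltage (V)
I_S = 0.020 / math.exp(3.0 / N_VT)      # calibrated so the model gives 20 mA at 3.0 V

def led_current(vf):
    """Current through the model LED at forward voltage vf (volts)."""
    return I_S * math.exp(vf / N_VT)

for vf in (2.8, 2.9, 3.0, 3.1, 3.2):
    print(f"Vf = {vf:.1f} V -> I = {led_current(vf) * 1000:8.1f} mA")
# Every extra 0.1 V multiplies the current by ~2.7 in this model, which is
# why setting the current (not the voltage) is the sane way to drive an LED.
```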

Bimpelrekkie
  • The last sentence is the answer to the OP's question. If you have an LED + resistor combination designed for a voltage reasonably higher than the LED voltage (say at least 12V for rated current), then the brightness will be closely related to the applied voltage (though not perfectly, due to the fairly fixed LED forward voltage) from, say, 5V to 15V, and the experiments would just need their voltage range adjusted to see the results. The behaviour of an LED and resistor is close to that of a light bulb, but it does not teach as much about LED behaviour if you do not consider the parts separately. – KalleMP Sep 06 '16 at 11:26
3

LED & incandescent bulbs are almost opposite in characteristics.

  • LEDs drop in resistance with rising voltage. A bulb's resistance rises by about 10 times when turned on, due to the large exponential thermal PTC (+) of a tungsten filament; LEDs are just the opposite, with a small linear NTC (-) value.
  • LEDs cannot handle negative voltages; all are rated at -5V absolute max. Bulbs easily go both ways, AC or DC.
  • LEDs use micron-thin ultrasonic Au wire bonds, because soldering would kill them. Bulbs operate at around 2500°C.
  • LEDs need ESD protection. Bulbs absorb ESD without any problem.
  • LEDs come in all colours of the rainbow and beyond. Bulbs are all the same, in shades of white.
  • LEDs can detect light, with a small output current, like photodiodes. Bulbs can't detect light.
  • LEDs are single-sided, even with a transparent substrate. Bulbs are omnidirectional.

So when you add it all up, you have to understand the differences in order to make them work in the same power environment. Or else rely on an engineered solution to make them simple to use.

Tony Stewart EE75
1

If you bought LEDs with built-in resistors, they would work (nearly) exactly that way.

The light output of LEDs is nearly proportional to the current over a broad range.

For relatively high voltages \$(V_b \gg V_f)\$ the current is calculated as follows:

\$V_b\$: operating voltage

\$V_f\$: forward voltage of LED

\$R_i\$: built-in series resistor

\$I=(V_b-V_f)/R_i\$ (single LED), which can be roughly approximated (to within about 10%) as \$I=V_b/R_i\$.

For two of them it reads \$I=(V_b-2V_f)/(2R_i)\$, which can be roughly approximated as:

\$I=V_b/(2R_i)\$

Thus, when putting 2 LEDs with built-in series resistors in series, the current drops to roughly half the initial current.
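A small Python sketch of those two formulas, using hypothetical component values, makes the "roughly half" claim concrete:

```python
# A quick check of the formulas above with hypothetical resistorized LEDs:
# Vf = 2 V and a built-in 1.1 kOhm resistor each, run from a 24 V supply
# (all three values are assumptions chosen for illustration).
V_B = 24.0    # operating voltage
V_F = 2.0     # forward voltage per LED
R_I = 1100.0  # built-in series resistor per LED

i_one_exact = (V_B - V_F) / R_I            # one LED
i_one_approx = V_B / R_I
i_two_exact = (V_B - 2 * V_F) / (2 * R_I)  # two in series
i_two_approx = V_B / (2 * R_I)

print(f"one LED : {i_one_exact*1000:5.1f} mA (approx. {i_one_approx*1000:5.1f} mA)")
print(f"two LEDs: {i_two_exact*1000:5.1f} mA (approx. {i_two_approx*1000:5.1f} mA)")
# The series pair draws roughly half the current, and the approximation is
# decent for one LED but noticeably worse for two, because 2*Vf is no longer
# small compared with Vb.
```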

Ariser
  • This is true, but relies on Vb > 2Vf, i.e. most of the power being spent in the resistors in the normal use case. – pjc50 Sep 06 '16 at 10:38
  • This is incorrect, until you change "reduced" to "approximated". – Scott Seidman Sep 06 '16 at 10:38
  • @pjc50 That's why I wrote \$V_b >> V_f\$. I didn't want to extend it to n LEDs then :) – Ariser Sep 06 '16 at 10:46
  • @ScottSeidman: thanks for the correction. Better now? – Ariser Sep 06 '16 at 10:47
  • Your answer contains a bit too much generalisation in the text if one is working with combined voltages close to the LED forward voltage. a 3V LED + resistor combination will not work well at 1.5V or 6V but a 15V LED + resistor combination will work from 5V to 15V in the manner you hope. I gave you an upvote for "(nearly)". – KalleMP Sep 06 '16 at 11:29
  • Yes - but readers should know that this is a *rough* approximation. \$V_b\$ is generally not much bigger than \$V_f\$ -- more often, it's bigger by a factor of 2-5, as \$V_f\$ is about 2V, depending on color. If you get an LED with a resistor built in, designed to be used with 5V, you may not get near enough to \$V_f\$ to light two in series. – Scott Seidman Sep 06 '16 at 12:32
0

The brightness of an LED depends primarily on the current flowing through it.

A conventional incandescent bulb is effectively a resistor: it follows Ohm's law, V = I * R. If you double the voltage, the current will double and the power used will go up by a factor of 4 (not quite true, since there are some temperature-related effects, but close enough for now).

An LED, on the other hand, is a diode. Like most diodes it has a relatively fixed forward bias voltage: below that voltage no current flows; above it, current flow is essentially unlimited, with the LED dropping only its bias voltage. (This is a massive simplification, but it is good enough for most rough calculations.)

What this voltage is depends on the materials used, and so it is colour dependent: typically ~1.8-2V for red, yellow or green, and ~3V for blue, white or "true green". The voltage drop does increase with current, but only by 0.1-0.2V, so you can normally ignore this effect.

As you indicated in your question LEDs are typically connected with a resistor in series to limit the current. Why?

Think of the LED as a fixed voltage drop: it will use up a fixed amount of voltage no matter the current. So if you connect a 2V LED directly to a 3V source, there is 1V left to be dropped over the rest of the circuit. The rest of the circuit in this case is just the internal resistance of the power supply and the wires. These resistances are typically very low (so low you normally ignore them), so a large current will flow.

Assuming the resistances are in the region of 0.1 ohms, this would give a current of I = V/R = (3 - 2) / 0.1 = 10 amps.

The power dissipated in the LED would be P = I * V = 10 * 2 = 20 watts.

This would very rapidly heat the LED to the point where it is destroyed. The real world is a little more complex, since the LED isn't the perfect zero-resistance fixed voltage drop assumed here, but the end result is the same either way.

If we add a series resistor of 100 ohms in addition to the internal resistances then the current is reduced to 10mA and the LED glows nicely.
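The same arithmetic as a short Python sketch, using the same assumed numbers as above:

```python
# Reworking the numbers above in Python: an assumed 2 V LED on a 3 V supply,
# first with only ~0.1 ohm of supply/wiring resistance, then with a 100 ohm
# series resistor added.
V_SUPPLY = 3.0
V_LED = 2.0        # treat the LED as a fixed 2 V drop (the simplification above)
R_INTERNAL = 0.1   # assumed internal resistance of battery + wires (ohms)
R_SERIES = 100.0   # added series resistor (ohms)

def current_and_led_power(r_total):
    i = (V_SUPPLY - V_LED) / r_total   # Ohm's law applied to the non-LED part
    return i, i * V_LED                # power dissipated in the LED itself

for label, r_total in (("no resistor", R_INTERNAL),
                       ("100 ohm resistor", R_INTERNAL + R_SERIES)):
    i, p = current_and_led_power(r_total)
    print(f"{label:>17}: {i*1000:7.1f} mA, {p:6.2f} W in the LED")
# ~10 A and 20 W destroys the LED almost instantly; ~10 mA and 0.02 W is a
# comfortable operating point.
```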

Changing the resistor value will change the brightness. Most small LEDs are limited to about 20mA maximum and aren't very visible much below 1mA; going much over 10mA makes hardly any noticeable difference (this is more due to the way eyes work than the way LEDs work). You can also change the brightness by switching the LED on and off very quickly. This is simpler for digital systems to do and is generally more efficient for a given perceived brightness (again more due to eyes than LEDs), and it lets you change the brightness while only having a single fixed resistor in the hardware. If you are planning on using a variable resistor to set the brightness, it's good practice to also include a small fixed resistor so that with the variable resistor at 0 the current is still limited to 20mA.
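As a rough illustration of that PWM approach (the on-current and frequency below are assumptions, not requirements):

```python
# A sketch of the PWM dimming idea above: the resistor fixes one "on" current,
# and brightness is set by what fraction of each period the LED is on.
# The 10 mA on-current and 1 kHz frequency are illustrative assumptions.
I_ON = 0.010          # LED current while switched on (A)
PWM_FREQUENCY = 1000  # Hz, fast enough that the eye sees steady light

for duty_cycle in (1.0, 0.5, 0.25, 0.1):
    i_avg = I_ON * duty_cycle                        # average current
    on_time_us = duty_cycle * 1_000_000 / PWM_FREQUENCY
    print(f"duty {duty_cycle:4.0%}: on for {on_time_us:5.0f} us per cycle, "
          f"average current {i_avg*1000:4.1f} mA")
# Perceived brightness roughly tracks the duty cycle, while the hardware only
# ever switches between fully off and the single resistor-defined current.
```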

So what if we add two LEDs in series?

Each LED needs 2V to turn on. Two LEDs means 4V. With a 3V source we don't have sufficient voltage to forward bias the diodes and so they will block all current flow. The LEDs will be off. If you increase the voltage and set the current limiting resistor correctly then they will both turn on. Since brightness depends upon the current through the LED and they will both have the same current they will be the same brightness (for the same type of LED).

What if we add two LEDs in parallel?

If we add two in parallel each with their own resistor then they are effectively separate circuits. Assuming the power supply is sufficient each will act as if it's the only one.

If they share the resistor then things get more interesting. In theory this would work fine: you'd need to halve the resistor value to give the same per-LED current, but other than that you'd expect it to work. Unfortunately, no two LEDs are identical; they all have very slightly different bias voltages, which means that more current will flow through one than the other (it would be all of the current through one, if it weren't for the small increase in voltage with current that we normally ignore).

This means that two LEDs in parallel with a single resistor will almost never be the same brightness.
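To put a rough number on that imbalance, here is a small Python sketch of two slightly mismatched LEDs sharing one resistor; the exponential model and every constant in it are illustrative assumptions, not measurements:

```python
# A sketch of the mismatch problem above: two nominally identical "2 V" LEDs
# share one resistor, but one has a 50 mV lower forward voltage. The simple
# exponential model and all constants here are invented for illustration.
import math

V_SUPPLY = 5.0      # supply voltage (V)
R_SHARED = 75.0     # shared resistor, sized for ~40 mA total (ohms)
N_VT = 0.052        # assumed n * thermal voltage (V)

def led_current(v, i_sat):
    """Exponential diode model: current at forward voltage v."""
    return i_sat * math.exp(v / N_VT)

# Saturation currents chosen so LED A is "2.00 V @ 20 mA" and LED B "1.95 V @ 20 mA".
i_sat_a = 0.020 / math.exp(2.00 / N_VT)
i_sat_b = 0.020 / math.exp(1.95 / N_VT)

# Bisection for the node voltage where the resistor current equals the sum
# of the two LED currents.
lo, hi = 0.0, V_SUPPLY
for _ in range(60):
    v = (lo + hi) / 2
    if led_current(v, i_sat_a) + led_current(v, i_sat_b) > (V_SUPPLY - v) / R_SHARED:
        hi = v
    else:
        lo = v

print(f"shared node voltage: {v:.3f} V")
print(f"LED A: {led_current(v, i_sat_a)*1000:5.1f} mA")
print(f"LED B: {led_current(v, i_sat_b)*1000:5.1f} mA")  # B hogs most of the current
```

In this made-up example a 50 mV difference is enough for one LED to carry nearly three times the current of the other, which is exactly why each LED normally gets its own resistor.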

Generally anything which needs to drive a group of LEDs (e.g. a backlight) will use a long series chain of LEDs and boost the voltage up as high as needed (within reason) so that they are all the same brightness.

Andrew
0

While an LED is nothing like an incandescent bulb, the answer is still YES.

The only difference in the Ohm's law calculation is that you subtract the LED forward voltage from the power supply voltage.

The change in the LED's forward voltage with forward current is insignificant.

I measured the voltage of a string of 16 red LEDs at 200, 350, and 500mA. The voltages were 30.07V, 31.20V and 31.43V - only about a 4.5% change from 200mA to 500mA.
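For anyone who wants to check those figures, a quick Python re-calculation using only the measured values quoted above:

```python
# Redoing the arithmetic on the measurements above (numbers copied from the
# answer: a string of 16 red LEDs, string voltage at two of the currents).
v_at_200ma, v_at_500ma = 30.07, 31.43   # volts across the whole string

change = (v_at_500ma - v_at_200ma) / v_at_200ma
print(f"voltage change 200 mA -> 500 mA: {change:.1%}")          # ~4.5%
print(f"per-LED forward voltage: {v_at_200ma/16:.2f} V to {v_at_500ma/16:.2f} V")

# Effective dynamic resistance of the string over that range:
r_dyn = (v_at_500ma - v_at_200ma) / (0.500 - 0.200)
print(f"dynamic resistance: {r_dyn:.1f} ohm total, ~{r_dyn/16:.2f} ohm per LED")
```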

Misunderstood