
Obviously a basic question, but it's been unbelievably difficult to get an answer.

I understand that LEDs are diodes and that if you don't limit the voltage to an LED it will blow, because it draws as much current as it can from what is available. However, whenever I see lamps in a series circuit, there is no resistor associated with them, which I suppose means that lamps only draw what they need. So my basic questions are as follows:

a. Does this then mean you can do ridiculous things like power a 2V lamp with a 220V power source and nothing will blow?

b. If I am right that lamps draw what they want and LEDs don't, how can I predict this for other components, i.e. how are these components classified?

c. Slightly tangential, but if I connect LEDs in series (let's say there are three and they are all 2V) to a 12V supply and then introduce one resistor that limits the available voltage for all the LEDs to 6V, then surely the first two LEDs in the series will blow, because I'm giving them 6V and 4V respectively when they only need 2V? Or perhaps I'm misunderstanding the idea of the voltage available versus the voltage used.

Please help. I would really like to understand these simple concepts.

user91185
  • You make a lot of assumptions which come from a lack of knowledge on the subject. Why an LED needs a resistor has been answered before. Have a look at the videos of this guy, https://www.youtube.com/watch?v=Ju1CQF39DH8 he explains many times how 220V LED lamps work. Halfway through the video he draws a schematic of it. – Bimpelrekkie Nov 08 '15 at 12:32
  • The short answer is that the lamp IS the resistor (albeit not a linear one). You are using Joule heating to heat that resistor to the point it emits light (it has to be in an inert atmosphere to avoid burning up). The LED, OTOH, uses a different mechanism to produce light and, since it has a very tiny internal resistance, you have to supply an external resistor to keep it working where it has to. – Sredni Vashtar Nov 08 '15 at 13:12
  • 1st approximation: incandescent lamps are resistors. 2nd approximation: incandescent lamps are resistors that vary with temperature (more on that [here](http://electronics.stackexchange.com/a/30251/7036)). – Nick Alexeev Nov 09 '15 at 02:55
  • Lots of similar questions here: http://electronics.stackexchange.com/questions/95874/when-people-talk-about-a-device-drawing-current-what-do-they-mean-why-do-dev etc. – Fizz Nov 09 '15 at 14:55
  • You may want to look at https://www.youtube.com/watch?v=DLaH06IOZGQ – Fizz Nov 09 '15 at 15:01

2 Answers


You will find your answer in the characteristic curves of the devices in question.

(figure: example current–voltage characteristic curves for resistive and diode loads)

Consider the characteristic curve of a resistive load: current increases linearly with voltage. This means that for any applied voltage, a proportional current will flow through the device. Lamps are approximately resistive loads, so they behave this way. The important thing to know is that an increase in voltage produces a proportional increase in current.

So, to your question (a): if you increase the voltage to a resistive load by a factor of \$110\$, the current will also increase by a factor of \$110\$. Since \$P=IV\$, the power will increase by a factor of \$110^2=12100\$. This will more than likely destroy your 2V lamp. Resistive loads don't "draw what they want"; they draw a current proportional to the applied voltage.
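
To put rough numbers on it, here is a sketch assuming a hypothetical \$2V\$, \$100mA\$ lamp (and ignoring the fact that a filament's resistance rises as it heats up):

$$ R \approx \frac{2V}{0.1A} = 20\ \Omega \qquad I \approx \frac{220V}{20\ \Omega} = 11A \qquad P \approx 220V \times 11A = 2420W $$

That's \$12100\$ times the lamp's rated \$0.2W\$, exactly the \$110^2\$ factor above.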

Now consider the curve for the diode. The current through the diode is an exponential function of the voltage across it, so an increase in the voltage across the diode produces an exponentially larger increase in current. In practice, the exponential is so steep that any voltage over \$\approx0.6V\$ (for silicon) will produce a huge current.
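
For reference, this behaviour is described by the Shockley diode equation, with \$I_S\$ the tiny saturation current, \$n\$ the ideality factor, and \$V_T \approx 26mV\$ at room temperature:

$$ I = I_S\left(e^{V/(nV_T)} - 1\right) $$

Because \$V_T\$ is so small, every extra \$\approx 60mV\$ of forward voltage (for \$n=1\$) multiplies the current by roughly ten.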

In theory you could run an LED without any type of resistor ballasting. You'd need to control the applied voltage very precisely, though. Even a small deviation in the applied voltage will create large changes in current due to the steepness of the exponential curve.

For example, the 1N4001 diode begins to conduct at \$\approx0.6V\$. At \$1V\$, current is around \$2A\$. At \$1.4V\$ current is over \$10A\$. And this is in a device that has a maximum current of only \$1A\$!

So, unlike with a resistive load, controlling the current through a diode or LED via the applied voltage is not a very good idea. A better approach is to control the current through the LED or diode directly, and a very simple way to do this is with a series resistor.
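
As a sketch of that idea (using my own notation: \$V_{supply}\$ for the supply voltage, \$V_f\$ for the LED's forward drop, \$I\$ for the current you want):

$$ R = \frac{V_{supply} - V_f}{I} $$

The resistor absorbs whatever voltage the LED doesn't, and in doing so it fixes the current.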

Regarding your part (c), connecting three \$2V\$ LEDs in series to a \$12V\$ supply: the purpose of the resistor in that circuit is to limit the current, not the voltage. Consider the following circuit, which I think is what you had in mind:

(schematic: a \$12V\$ supply driving three \$2V\$ LEDs in series with a current-limiting resistor; created with CircuitLab)

Notice that after each LED, the voltage drops by exactly one diode drop (in this case \$2V\$). Since there are 3 LEDs, this leaves \$6V\$ across the resistor. Then you need only choose the value of the resistor so that the current through it is what you want. The resistor doesn't lower the voltage to the LEDs; its resistance limits the current.
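
For example, assuming a typical indicator-LED current of \$20mA\$ (the schematic doesn't fix a value, so this is only an illustration):

$$ R = \frac{12V - 3 \times 2V}{0.02A} = \frac{6V}{0.02A} = 300\ \Omega $$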

Robert Stiffler
  • Won't the first LED blow because there is 12 volts being applied to it? Shouldn't the resistor be placed before? – masfenix Sep 02 '16 at 22:22
  • 12V is applied across the whole series of three LEDs and resistor. If you notice the voltages at each node in the circuit, there is only 2V across each LED. – Robert Stiffler Sep 02 '16 at 23:11

If, by "lamp", you're referring to an incandescent lamp, they're designed so that with the rated voltage across them their resistance will limit the current through the lamp.

LEDs, on the other hand, aren't rated by voltage, they're rated by current, and what happens when you put the rated current through them is that they'll drop a certain voltage.

So, let's say you have an LED which is rated for 20 milliamperes and, with that current through it, it'll drop 2 volts. Then, if you want to drive it from a 6 volt supply, you'll have to connect it in series with a resistor which will drop the extra 4 volts while carrying the 20 milliamperes.

From Ohm's law, the value of that resistance will be:

$$ R = \frac{E}{I} = \frac {6V - 2V}{0.02A} = 200\ \Omega $$
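
As a quick sanity check on that resistor (my own addition): with \$20mA\$ through \$200\ \Omega\$ it dissipates

$$ P = I^2R = (0.02A)^2 \times 200\ \Omega = 80mW $$

so a common \$\tfrac{1}{4}W\$ part is more than adequate.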

EM Fields