
I'm fairly familiar with Ohm's law, and I've done some calculations that I'd like experienced engineers to confirm, or to point out where I made a mistake.

Here's the idea:

  • I want to power a self-made strip of 20 LEDs, preferably in series (assuming Vf = 2.5 V at 20 mA)
  • I have a 4.5 W, 5 V, 0.9 A DC power supply
  • I calculated the required resistor for the circuit like this:
P = V * I
P = (20 * 2.5 V) * 0.02 A
P = 1 W
---
V = I * R
50 V = 0.02 A * R
R = 2500 ohms
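The same arithmetic as a quick Python sketch (variable names are mine; as the answers below explain, the resistor value rests on a misreading of where the 50 V goes):

```python
# Reproducing the question's arithmetic: 20 LEDs in series, a nominal
# 2.5 V forward drop each, 20 mA of current. (As the answers explain,
# the resistor value below is mistaken: the 50 V is the LEDs' own drop,
# not a voltage the resistor has to absorb.)
n_leds = 20
vf = 2.5          # V, nominal forward voltage per LED
current = 0.02    # A

p_total = (n_leds * vf) * current   # P = V * I = 1 W
r = (n_leds * vf) / current         # the question's R = V / I = 2500 ohms
print(p_total, r)
```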

From that I assumed I could power 20 LEDs in series from that power supply with a 2.5 kohm resistor. I also learned that in a series circuit the voltage divides across the components while the current is the same everywhere, whereas in a parallel circuit the voltage is the same across each branch while the current divides. Is that correct, or did I miss something?

Again, I'm an absolute beginner in electrical engineering. I graduated in CS but never got into the physics of circuits.

EDIT

From the answers I gathered that the resistor is there to drop whatever voltage isn't dropped by the LEDs, so in this case my 5 V power supply could drive at most one LED at full brightness, or two dimmer ones.

Could I use a step-up converter to raise the voltage and power those 20 LEDs in series?

EDIT 2

All the answers helped me understand a lot more about electronics and how circuits work, including why you should always design with margin rather than for nominal values. I also see my mistake in insisting on LEDs in series instead of parallel: it hadn't occurred to me that a parallel layout needs only two wires running the length of the strip, not two per LED. I know, dumb mistake.

Thank you all for the help, tips and information.

schematic (created using CircuitLab; image not shown)

  • What voltage is your power supply? – Hearth Jul 30 '21 at 01:34
  • White LEDs are usually a UV/blue die with a phosphor, so don't expect less than about a 3.2 V drop at nominal current. – user263983 Jul 30 '21 at 01:35
  • @Hearth it's 5V – Vinícius Negrão Jul 30 '21 at 01:50
  • Does this answer your question? [Why we use 330 ohm resistor to connect a LED?](https://electronics.stackexchange.com/questions/27561/why-we-use-330-ohm-resistor-to-connect-a-led) – brhans Jul 30 '21 at 01:55
  • Why do you expect to be able to power 20 series LEDs on 5 V? – Hearth Jul 30 '21 at 01:57
  • @Hearth that's what I'd like to understand. I can power several LEDs in parallel on 5 V as long as there is enough current, correct? Since power is the product of current and voltage, it should be possible to power them either in series or in parallel just the same. If not, why is that? – Vinícius Negrão Jul 30 '21 at 02:04
  • @ViníciusNegrão One site to go read might be [this one at Digikey](https://www.digikey.com/es/articles/supplying-high-voltage-for-led-strings). – jonk Jul 30 '21 at 03:56
  • You can't change your question to a completely unrelated one, as that makes the answers to the original question invalid. If you have a new question, start a new question. – Justme Jul 30 '21 at 05:25
  • @jonk thanks for the link, I'll go read it – Vinícius Negrão Jul 30 '21 at 20:18

3 Answers


Your resistance calculation is incorrect. You have 20 LEDs with a nominal (note nominal: it may be smaller or larger) forward voltage of 2.5 V, so you need at least 50 V from your power supply. Let's say the power supply is 60 V, to cope with variation in the nominal Vf. The series resistor calculation would then be R = (60 − 50)/0.02 = 500 Ohms, and P = I²R = 0.2 W wasted in the resistor. So you'd need a 500 Ohm resistor with a rating in excess of 0.2 W.
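The numbers above, sketched in Python (the 60 V supply is a hypothetical value chosen for headroom):

```python
# Sizing the series resistor for 20 LEDs on a hypothetical 60 V supply.
v_supply = 60.0
v_string = 20 * 2.5              # total nominal LED drop: 50 V
current = 0.02                   # A, target LED current

r = (v_supply - v_string) / current   # 500 ohms
p_resistor = current ** 2 * r         # I^2 * R = 0.2 W wasted as heat
print(r, p_resistor)
```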

Kartman
  • Ok, noted that I should consider variations. My question now is, can I power 20 LEDs in series with a 4.5 W, 5 V, 0.9 A power supply? If it's possible, what would that circuit look like, considering the use of a step-up regulator? – Vinícius Negrão Jul 30 '21 at 02:23
  • Using the search term 'led boost converter' should give plenty of examples. I'd suggest getting Ohm's law down pat; without the basics you'll go in circles. Just to be clear, resistors waste energy as heat; they don't create voltage. Put another way, you need 1 V to drive 1 A through a 1 Ohm resistor. – Kartman Jul 30 '21 at 06:37
  • thanks for the reply, it's great info but I decided to go for a parallel design, it's nice to finally understand the advantages and drawbacks of each design. Thank you all for the info! – Vinícius Negrão Jul 30 '21 at 20:23

You can use the power supply with your LEDs ... if you connect each LED in series with its own resistor, and 20 of these in parallel.

As you have 0.9 A available and 20 LEDs at 20 mA each, the LEDs will draw 400 mA total, so you have plenty of current to spare.

You don't mention the type of LEDs, but blue/white/violet will take about Vf = 3V each, red/green/yellow about Vf = 2V each (for more exact values, read their datasheets).

So the resistor should be (5 V − Vf) / 20 mA = (5 V − 3 V) / 0.02 A = 100 ohms for white, or 150 ohms for red; increase these values for a safety margin (120 or 180 ohms), and it's perfectly safe to increase them further if the LEDs are too bright.

There's more wiring involved, but it may still be simpler than a boost converter.
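A small Python sketch of the per-branch calculation above (the Vf values are the typical ones quoted, not datasheet figures):

```python
# Each LED gets its own resistor across the 5 V supply; 20 such
# branches in parallel. Vf values are typical, so check the datasheet.
v_supply = 5.0
current = 0.02                      # A per LED

def branch_resistor(vf):
    """Resistor that drops the leftover voltage at the target current."""
    return (v_supply - vf) / current

r_white = branch_resistor(3.0)      # 100 ohms (round up to 120 for margin)
r_red = branch_resistor(2.0)        # 150 ohms (round up to 180 for margin)
i_total = 20 * current              # 0.4 A total, under the supply's 0.9 A
print(r_white, r_red, i_total)
```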

  • Ok good, thanks for all the info. I thought parallel wouldn't be simple because I'd be wiring negative to positive from LED to LED and wasting wire. But I'm just now realizing it'd be simpler to run 2 wires for the entire length and connect each LED in parallel; that would work just as well. Thanks again! – Vinícius Negrão Jul 30 '21 at 20:03
  • Each LED plus its own resistor. You might ask, where's the learning if I do it this way? There's a deeper message about where the advantages of series and parallel circuits lie. Both have legitimate uses, and the need for 2 full-length wires, or 20 resistors, are drawbacks. But you can tune the brightness of individual LEDs (by changing resistors). – Jul 30 '21 at 20:06
  • thanks for the reply. In my case it'll be better to do it in parallel, but I've learned why series wouldn't work here, and that it's better and cheaper to use more resistors and wire than to buy a step-up booster. So much valuable information here; again, thank you all for the help – Vinícius Negrão Jul 30 '21 at 20:22
  • (For simplicity of wiring, you CAN also find LEDs with built-in resistors. It's a tradeoff though; they are harder to find!) – Jul 30 '21 at 20:25

You need the resistor to drop the voltage that isn't dropped by the diodes. Your diodes actually drop all 50 V (assuming that's the voltage of the power supply), so if the diodes are exactly as you describe (hint: they won't be) you don't need or want a resistor. That would be a cruddy design, though. Better to use a supply with some headroom so you can take up the difference with a resistor, or to use a smaller supply and split your diodes into two parallel strings.
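A sketch of the "split into parallel strings" option: two strings of 10 LEDs, each string with its own resistor. The 30 V supply here is purely an assumption for illustration:

```python
# Two parallel strings of 10 LEDs each; the 30 V supply is a
# hypothetical value chosen to leave headroom above the strings' 25 V.
v_supply = 30.0
vf = 2.5                  # V, nominal forward drop per LED
leds_per_string = 10
current = 0.02            # A per string

v_string = leds_per_string * vf            # 25 V per string
r = (v_supply - v_string) / current        # 250 ohms in each string
i_total = 2 * current                      # 40 mA drawn from the supply
print(r, i_total)
```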

Scott Seidman
  • Ok, I got it! So, if I wanted to power a string of LEDs without any parallel strings, I would actually need a step-up converter instead of a resistor. I guess my mistake was thinking that a resistor would raise the voltage, since V = I x R, but that's a misconception. – Vinícius Negrão Jul 30 '21 at 02:12
  • Not a misconception, but all the voltage drops around a loop need to sum to zero (including sign conventions). Look up examples of Kirchhoff's Voltage Law. Your equations actually work for a resistor in parallel with the diodes, but then the resistor would have no impact on the current through the diodes. – Scott Seidman Jul 30 '21 at 12:55
  • Thank you for the reply, I'll definitely give it a read to understand more about the basics of electronics – Vinícius Negrão Jul 30 '21 at 20:25