
I'm making an LED light for video shoots. It's an array of twenty 5 W LEDs in parallel. The problem I'm encountering is that the current drops when I connect them in parallel. My power supply should be adequate: it's a 12 V, 10 A supply. When I connect a single LED directly to the 12 V supply, I measure 0.8 A, so 0.8 A × 12 V ≈ 10 W, which is higher than the LED's rated output. But when I connect 20 of them in parallel, the current flowing is only 4.12 A, so 4.12 A × 12 V ≈ 50 W, when logically the current should be at least 8.2 A. What is happening here? Kindly help.
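For reference, the arithmetic above can be checked with a short script. This is a minimal sketch using only the figures from the question; it treats each LED as if it always drew its single-LED current, which is exactly the assumption the parallel measurement contradicts:

```python
# Sanity check of the figures in the question (idealized; real LEDs
# are nonlinear, as the comments below explain).
SUPPLY_V = 12.0      # supply voltage (V)
SUPPLY_I_MAX = 10.0  # supply current rating (A)
SINGLE_LED_I = 0.8   # measured current for one LED (A)
N_LEDS = 20

single_led_power = SUPPLY_V * SINGLE_LED_I  # ~9.6 W per LED
expected_total_i = N_LEDS * SINGLE_LED_I    # 16 A if every LED drew 0.8 A
print(f"One LED: {single_led_power:.1f} W")
print(f"Expected parallel draw: {expected_total_i:.1f} A "
      f"(supply limit: {SUPPLY_I_MAX:.0f} A)")
print("Supply adequate?", expected_total_i <= SUPPLY_I_MAX)  # False
```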

  • You have 20 LEDs. Please measure and post the current taken by each LED when powered singly. – Neil_UK Nov 01 '15 at 23:15
  • It's 0.8 A when I drive an LED individually with the same 12 V source. – Nakul Chauhan Nov 01 '15 at 23:19
  • Possible duplicate of [6 LEDs in parallel with a single resistor to simplify soldering](http://electronics.stackexchange.com/questions/13613/6-leds-in-parallel-with-a-single-resistor-to-simplify-soldering) – leftaroundabout Nov 01 '15 at 23:38
  • [LEDs are _not_ supposed to be connected to a constant-voltage supply, ever!](https://en.wikipedia.org/wiki/Light-emitting_diode#Power_sources) An LED has a highly nonlinear characteristic. It's impractical to try to hit its exact working voltage: if you supply a bit too little, the LED will practically go out, whereas an only slightly too-high voltage can cause damage. Use a _fixed-current supply_ with a higher voltage rating, and drive the LEDs in series! Such supplies are readily available as LED drivers (duh!) nowadays; a series-string sizing sketch follows these comments. – leftaroundabout Nov 01 '15 at 23:39
  • That means I should go for a constant-current source; only then can I get them to work at their rated power. – Nakul Chauhan Nov 01 '15 at 23:41
  • @NakulChauhan - NO! Connecting them in parallel is the problem, since the same voltage will be applied to all the LEDs. Whichever has the lowest voltage drop will tend to hog current, and may well self-destruct. – WhatRoughBeast Nov 02 '15 at 02:07
  • \$0.8 \times 20 = 16 > 10\$. Why do you think "my power supply is adequate"? – The Photon Nov 02 '15 at 04:30
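Following leftaroundabout's suggestion above, here is a minimal series-string sizing sketch. The forward voltage, string current, and driver voltage ceiling are illustrative assumptions, not values confirmed in the thread:

```python
# Series-string sizing for a constant-current LED driver.
# All ratings below are assumed for illustration.
LED_VF = 12.0        # nominal forward voltage of one LED module (V), assumed
LED_I = 0.42         # string current for ~5 W at 12 V (A), assumed
DRIVER_V_MAX = 48.0  # maximum driver output voltage (V), assumed
N_LEDS = 20

per_string = int(DRIVER_V_MAX // LED_VF)  # LEDs per series string: 4
n_strings = -(-N_LEDS // per_string)      # parallel strings needed: 5
print(f"{per_string} LEDs per string x {n_strings} strings, "
      f"driver set to {LED_I * 1000:.0f} mA per string")
```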

1 Answer


It is hard to figure out just what you're doing. Is it possible that your LEDs have built-in current-limiting resistors and are intended for 12 volt operation? That would explain why connecting one to 12 volts doesn't kill the LED. Plus, if the LED takes 5 watts, that would leave about 5 watts to be dissipated in the resistor, which seems about right. However, this would make the LED get very (very!) hot, and it's hard to see how this could happen. But let's say that this is the case. Then, when 20 LEDs are connected in parallel, the total current draw is 20 × 0.8, or 16 amps, and this would cause your power supply to current-limit. Are you sure this isn't happening? When you get 4.12 amps, what is the power supply output voltage?
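The internal-resistor hypothesis above can be put in numbers. A minimal sketch, assuming each "12 V" module behaves like a fixed ~15 Ω resistance (a crude simplification; real LEDs stay nonlinear even with a ballast resistor):

```python
# Rough resistor model of the hypothesis in the answer above.
SUPPLY_V = 12.0
SUPPLY_I_MAX = 10.0
R_LED = SUPPLY_V / 0.8   # ~15 ohm apparent resistance per module
N_LEDS = 20

r_total = R_LED / N_LEDS            # 0.75 ohm for 20 modules in parallel
i_unlimited = SUPPLY_V / r_total    # 16 A demanded at a full 12 V
# If the supply current-limits at 10 A, its output voltage must sag:
v_limited = SUPPLY_I_MAX * r_total  # ~7.5 V at the LED terminals
print(f"Demanded: {i_unlimited:.1f} A, "
      f"voltage under limiting: {v_limited:.2f} V")
```

Under this model the bank demands 16 A, and a supply hard-limited at 10 A would sag to roughly 7.5 V at the LED terminals, which is why the answer asks for the measured output voltage.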

WhatRoughBeast
  • They are actually 12 V, 5 W LEDs. I'm not sure whether they have a current-limiting resistor inside. The power supply is rated at 10 A, so if it were limiting, the current should have been cut off at 10 A, but it cuts off at 4.12! The reason I'm using them in parallel is that series won't be possible: the maximum I can go up to is 48 VDC, and I don't want to use anything greater than that. Even then it would be 5 parallel strings of 4 series LEDs. My main concern is to get the output as close as possible to 100 W. What do you suggest? – Nakul Chauhan Nov 02 '15 at 02:29
  • Your supply obviously has a variable voltage. Does it also have a variable current limit? – WhatRoughBeast Nov 02 '15 at 04:16
  • I think I caught the culprit. It's the ammeter itself!! – Nakul Chauhan Nov 02 '15 at 05:09
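The asker's conclusion is plausible: an ammeter's shunt and test leads insert series resistance, and because an LED's I-V curve is very steep near its operating point, even a few tenths of a volt dropped in the meter can cut the current sharply. A minimal sketch; the 0.1 Ω shunt-plus-leads figure is an assumed typical value for a handheld meter's 10 A range, not a measurement:

```python
# Estimate the voltage lost across the ammeter (burden voltage).
R_METER = 0.1        # ohm, assumed shunt + lead resistance
I_MEASURED = 4.12    # A, from the question

burden_v = I_MEASURED * R_METER
print(f"Voltage lost across the meter: {burden_v:.2f} V")  # ~0.41 V
# ~0.4 V less at the LEDs; near the knee of an LED's I-V curve,
# even this small drop can slash the current dramatically.
```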