
I need to power many 100 W, 32–36 V LEDs... and building loads of 32 V PSUs is costly and time-consuming.

This is a bit of a crazy idea, and I'm aware that it's probably not the safest way to do this, but this circuit has just popped into my head:

(Image: the proposed circuit, a bridge rectifier across the mains with the LEDs wired in series on the DC side.)

The tiny detail that I'm not quite sure about is the DC voltage after the rectifier:

  • Should I calculate using the 240 V RMS value, so 7 LEDs connected in series across 240 V would give about 34.3 V per LED?
  • Or should I calculate using the peak (non-RMS) voltage of 340 V, so 10 LEDs connected in series across 340 V would give 34 V per LED?

N.B. I'm aware the picture shows 8 LEDs, but it's just to visualise what I'm talking about.
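For reference, the two alternatives work out as follows (a quick sketch of the arithmetic, assuming 240 V RMS mains):

```python
import math

V_RMS = 240.0                  # UK mains RMS voltage
V_PEAK = V_RMS * math.sqrt(2)  # peak value, ~339.4 V

# Option 1: divide the RMS voltage across 7 LEDs in series
print(f"{V_RMS / 7:.1f} V per LED")    # ~34.3 V

# Option 2: divide the peak voltage across 10 LEDs in series
print(f"{V_PEAK / 10:.1f} V per LED")  # ~33.9 V
```

Note that with a smoothing capacitor after the rectifier, the DC bus will sit near the peak rather than the RMS value.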

If anybody has any other quick, easy, safer ways to do this then please let me know!

BG100
  • You need to regulate current not voltage. The circuit above will not do. – DamienD Sep 02 '15 at 10:05
  • Is there any way to do that using minimal parts? – BG100 Sep 02 '15 at 10:11
  • 1
    Sorry, this kind of power is out of my league. You could have a look here as a starter: http://www.digikey.com/en/articles/techzone/2011/nov/the-challenges-of-choosing-offline-led-driver-topologies – DamienD Sep 02 '15 at 10:33
  • Please give some more specifications about the LED. If you choose your circuit layout carefully, it may be possible to put some of them in parallel. – Ariser Sep 02 '15 at 13:43
  • LED specs: cool white, 100 W, input voltage: 32–36 V, forward current: 3.5 A, 6000–7000 lumen – BG100 Sep 04 '15 at 14:40

1 Answer


The circuit you have drawn will not do what you want. If I were going about things this way, this is what I would do:

(CircuitLab schematic: a bridge rectifier with a 100 µF smoothing capacitor feeding the LED string through a constant-current sink.)

How you implement the current sink is up to you. You could use a simple current-sense resistor with feedback driving a linear pass transistor (if you are happy to dissipate lots of heat). If the sink is passing 3 A (roughly what those LEDs pull) and has to drop 10 V of excess voltage, that's 30 W of heat you need to dump into a heatsink. Not impossible, but it requires thought. You can see the basic design of a linear current sink here.
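As a sketch of the sums involved (the 0.5 V sense reference here is an illustrative assumption, not a value from the answer; the 3 A and 10 V figures are from above):

```python
I_LED = 3.0        # A, approximate LED string current
V_HEADROOM = 10.0  # V, excess voltage dropped by the pass transistor
V_REF = 0.5        # V, assumed feedback reference across the sense resistor

R_SENSE = V_REF / I_LED      # sense resistor that gives V_REF at the set current
P_SENSE = V_REF * I_LED      # power dissipated in the sense resistor
P_PASS = V_HEADROOM * I_LED  # power dissipated in the pass transistor

print(f"R_sense = {R_SENSE * 1000:.0f} mOhm, dissipating {P_SENSE:.1f} W")
print(f"Pass transistor dissipates {P_PASS:.0f} W")
```

A lower reference voltage reduces the sense-resistor losses but makes the feedback more sensitive to offsets and noise.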

You could also implement a current-controlled buck converter for the LEDs - although rating this for the full rectified mains voltage will require consideration. The link in @Damien's comment above is a good place to start looking, specifically using something like the Fairchild FL7701. This would be much more efficient, but requires a more sophisticated design.

stefandz
  • 1
    You would also want a rather bigger cap on the bridge. 3 amps @ 100 uF gives a discharge rate of 30,000 v/sec or 30 v/msec. Granted, you're working at 120 Hz, not 60, for a cycle time of ~8 msec. – WhatRoughBeast Sep 02 '15 at 14:33
  • 1
    Considering that each of these 100 W LED is going to produce 80 W of heat, maybe it isn't such a bad idea to go for a linear current regulator indeed. Esp. considering the OP's requirement of a low cost system... he could just use the same kind of heatsink for the MOSFET as for the LEDs. – DamienD Sep 02 '15 at 18:49
  • This is great information, thanks... I don't really understand what's going on though. What is the reason that I need to limit the current? – BG100 Sep 04 '15 at 08:18
  • Because of the way LEDs work: in the absence of a current regulator, small fluctuations in input voltage or temperature would lead to large fluctuations in current which could burn the LEDs. – DamienD Sep 04 '15 at 08:49
  • [This graph](http://goo.gl/RNUiat) shows the current-voltage relationship of a typical LED. Notice that around the forward voltage (3.8V in your case) as @Damien said, small changes in voltage = large changes in current, which affects brightness and can overdrive the LEDs. You should also take a look at [this](http://goo.gl/GOwbd) SE question which explains this more in detail. – stefandz Sep 04 '15 at 09:20
  • Thank you for the information, really helpful. I do struggle to understand the maths though. I know this is a bit unconventional, but what if I put a filament bulb in series with the LEDs? Then the more current that flows, the hotter/more resistive the filament becomes, limiting the current...? Just trying to think of an easy/cheap way to do this... – BG100 Sep 04 '15 at 14:09
  • You would be relying on the current-resistance curve of the filament bulb, and I don't know of any bulbs that are specified that way (I might be wrong). I wouldn't trust this much power to a feedback mechanism like that - not least because the response time of the bulb is unlikely to be quick enough to prevent the destruction of the LEDs. There are good reasons that engineers use constant-current sources to drive LEDs - reliability, predictability and ease of design. – stefandz Sep 04 '15 at 14:13
  • Ok, I'm running out of daft ideas... I guess I'll have to try to work out how to build a linear 3A current sink that works at this voltage as per your suggestion. Dissipating the excess heat shouldn't be a problem. Thanks. – BG100 Sep 04 '15 at 14:44
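The capacitor sizing raised in the comments can be sketched numerically (assuming 50 Hz mains, so a 10 ms ripple period after full-wave rectification; the comment above used 60 Hz figures):

```python
I_LOAD = 3.0           # A, approximate LED string current
C = 100e-6             # F, the 100 uF suggested originally
T_HALF = 1 / (2 * 50)  # s, ripple period for full-wave rectified 50 Hz mains

# Droop across one ripple period, from dV = I * dt / C
dV = I_LOAD * T_HALF / C
print(f"Droop with 100 uF: {dV:.0f} V")  # ~300 V: far too much

# Capacitance needed to hold droop to, say, 10 V
C_needed = I_LOAD * T_HALF / 10
print(f"C for 10 V ripple: {C_needed * 1e6:.0f} uF")  # ~3000 uF
```

This confirms the point made in the comments: a much larger reservoir capacitor is needed if the LED string is to stay lit across each half-cycle.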