
I have read in books that the light intensity from an LED does not increase beyond a certain value of current.

The amount of light emitted depends on the recombination of holes and electrons. If so, then as the electron flow in the circuit increases, the rate of recombination should also increase, resulting in higher intensity.

But generally, why doesn't this happen in an LED beyond a particular value of current?

Andrew Flemming

3 Answers


Not all recombinations result in the emission of a visible-light photon. Only those that occur within the P-N junction of the LED itself have the energy for that, and this volume can become "saturated" at high current levels. When this happens, some of the electrons and holes pass all the way through the junction before recombining in the bulk material on either side, where they do so with reduced energy, resulting in the release of longer-wavelength (heat) photons.
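
For a feel of the diminishing returns this implies, here is a toy numerical sketch. The exponential soft-saturation form and the `i_sat` value are assumptions chosen only to illustrate the shape of the curve, not measured LED behaviour:

```python
import math

# Toy model: radiative recombination inside the junction saturates at high
# current, so extra carriers recombine non-radiatively (as heat) elsewhere.
# The saturation level and the soft-saturation curve are illustrative
# assumptions, not values from any datasheet.

def relative_light_output(i_amps, i_sat=0.35):
    """Relative light output vs. drive current, saturating above ~i_sat."""
    return i_sat * (1.0 - math.exp(-i_amps / i_sat))

for i_ma in (50, 100, 200, 350, 700, 1000):
    out = relative_light_output(i_ma / 1000.0)
    print(f"{i_ma:5d} mA -> relative light output {out:.3f}")
```

Doubling the current from 350 mA to 700 mA in this toy curve adds far less light than doubling it from 100 mA to 200 mA, which is the qualitative behaviour being described.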

Dave Tweed

For what it's worth, Maxim claims a somewhat different mechanism (thermal) than that cited by Dave Tweed:

[Figure: excerpt from a Maxim application note]

As LED drive currents increase for multiplexing, internal temperatures within the LED also increase. There is a point at which the temperature increase causes a drop in photon conversion efficiency, which, in turn, negates the effect of the increased current density through the junction. At this point, increasing drive currents can result in a small increase, no change, or even decrease in light outputs from the LED chip.

The difference may be important if very brief pulses of current are being fed to the LED.
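
A rough numerical sketch of that thermal argument follows. The thermal resistance, forward voltage and temperature coefficient of efficiency are made-up illustrative values, not figures from the Maxim note:

```python
# Rough sketch of the thermal mechanism: as current rises, the junction heats
# up, efficiency drops, and beyond some current the light output stops
# increasing. Every parameter below is an assumed, illustrative value.

T_AMBIENT = 25.0    # °C
R_THERMAL = 40.0    # °C/W, junction-to-ambient thermal resistance (assumed)
V_FORWARD = 3.2     # V, treated as roughly constant (simplification)
K_DROOP   = 0.004   # fractional efficiency loss per °C above 25 °C (assumed)

def relative_light(i):
    """Relative light output at drive current i (A), in steady state."""
    power = i * V_FORWARD                       # electrical power dissipated
    t_junction = T_AMBIENT + R_THERMAL * power  # steady-state junction temp.
    efficiency = max(0.0, 1.0 - K_DROOP * (t_junction - 25.0))
    return i * efficiency                       # light ~ current x efficiency

prev = 0.0
for i_ma in range(100, 2001, 100):
    out = relative_light(i_ma / 1000.0)
    trend = "up" if out > prev else "DOWN"
    print(f"{i_ma:5d} mA -> {out:.3f} ({trend})")
    prev = out
```

With these made-up numbers the output peaks around 1 A and then falls. Because it is a steady-state model, it also hints at why brief pulses behave differently: during a short pulse the junction never reaches this equilibrium temperature.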

Spehro Pefhany
  • +1 for the figure. But I cannot understand what you mean by "Temperature negates the effect of the increased current density through the junction". – Andrew Flemming Jul 19 '15 at 05:32
  • 1
    @RelevationsSajith: It's the first part that's important - `temperature increase causes a drop in photon conversion efficiency`. As the current increases, the LED gets hotter; the heat decreases the efficiency. Beyond a certain point, the decrease in efficiency from being hotter can be more than the increase from the extra current. – psmears Jul 19 '15 at 06:13
  • Thermal inertia of the LED semiconductor should allow higher intensity if the current pulse is short. – cuddlyable3 Jul 19 '15 at 19:05

As the existing answers by Spehro and Dave state, the limiting factor is the heat generated by the current.

As the current increases, the light output increases, but at high currents the junction of the LED becomes hot. The hotter the junction, the less efficient the LED becomes. Thus you reach a point where increasing the current actually decreases the light output, simply because the LED becomes less efficient at turning electricity into light.

It is common practice to increase the efficiency of an LED by cooling it with a heatsink. (These are also referred to by some as "heat plates", since some popular LEDs come pre-mounted on copper-laden PCBs.)
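
As a quick back-of-the-envelope illustration of why the heatsinking matters, here is a minimal sketch; the dissipated power and the thermal resistances are assumed round numbers, not values from any datasheet:

```python
# Illustrative junction-temperature comparison with and without a heatsink.
# Power and thermal resistances are assumed round numbers for demonstration.

def junction_temp(power_w, r_th_total, t_ambient=25.0):
    """Steady-state junction temperature (°C) for a given dissipated power."""
    return t_ambient + power_w * r_th_total

power = 3.0  # W dissipated in the LED (assumed)

# Without heatsink: junction-to-ambient via the bare PCB (assumed 45 °C/W).
# With heatsink: junction-to-case (5 °C/W) plus heatsink-to-ambient (8 °C/W).
print("bare PCB :", junction_temp(power, 45.0), "°C")
print("heatsink :", junction_temp(power, 5.0 + 8.0), "°C")
```

A cooler junction means less efficiency droop, so the same current produces more light.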

To get the best light-output-to-current ratio out of an LED set-up, the general practice is to use more than one LED and to under-drive them, as the sketch below illustrates. Using less current per LED rewards you with higher efficiency, at the cost of using more LEDs in any given design.
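
A small sketch of that trade-off, using the same kind of made-up thermal-droop model as in the earlier answer (all numbers are illustrative only):

```python
# Sketch: one LED at full current vs. two LEDs each at half current,
# using a made-up thermal-droop model. Numbers are illustrative only.

R_THERMAL = 40.0   # °C/W (assumed)
V_FORWARD = 3.2    # V (assumed)
K_DROOP   = 0.004  # efficiency loss per °C above ambient (assumed)

def light(i):
    """Relative light output of a single LED at current i (A)."""
    t_rise = R_THERMAL * i * V_FORWARD
    return i * max(0.0, 1.0 - K_DROOP * t_rise)

i_total = 1.0  # total drive-current budget (A)
one_led  = light(i_total)
two_leds = 2 * light(i_total / 2)

print(f"one LED  at {i_total:.1f} A : {one_led:.3f}")
print(f"two LEDs at {i_total/2:.1f} A each: {two_leds:.3f}")
```

In this toy model the two under-driven LEDs produce noticeably more total light from the same total current, which is the point of the practice.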

LEDs can also tolerate higher current when it is pulsed than when it is applied continuously. This is used to great effect in some stage-lighting equipment, as well as in other products that use high-intensity strobing effects, such as this Rescue Beacon.
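
A minimal sketch of the duty-cycle arithmetic behind pulsed drive; the peak current, duty cycle and forward voltage are assumed values:

```python
# Sketch of pulsed drive: a high peak current at low duty cycle keeps the
# average current (and hence average heating) modest. Values are illustrative.

peak_current_a = 1.0    # peak pulse current (assumed)
duty_cycle     = 0.10   # 10 % on-time (assumed)
v_forward      = 3.2    # V (assumed)

avg_current = peak_current_a * duty_cycle
avg_power   = avg_current * v_forward

print(f"average current: {avg_current*1000:.0f} mA")
print(f"average power  : {avg_power:.2f} W")
```

The LED briefly runs at a peak current that would overheat it in continuous operation, while the average heating stays low; the thermal inertia of the die smooths out the temperature during each pulse.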

Overall, an LED's intensity is limited by the amount of heat it generates.

Peter Mortensen