
I am currently looking at the datasheet of an LED and have snipped out the passage of interest below.

[Datasheet excerpt: current derating curve, relative intensity vs. forward current curve, and relative intensity vs. temperature curve]

The question is: how do I find out the intensity I can reach at a specific temperature/current using these curves?

Assuming I have a high ambient temperature of, let's say, 80 °C, the current derating curve (current versus temperature) tells me that I can't drive the LED with more than approximately 8 mA.

Looking at the intensity versus current curve, I can see that at 8 mA I can expect about 0.4 relative intensity. Fine so far.

Now what about the relative intensity versus temperature curve? Do I have to consider the effect described by this curve separately from the derating done so far? That would mean that at 80 °C I only have 0.5 relative intensity, and since I had 0.4 before, combining the two would lead to 0.4 × 0.5 = 0.2 relative intensity at 80 °C and 8 mA.

That seems really low. Is the interpretation I have described correct, or did I count some effect(s) twice?

  • Compound the two, so 0.2. – winny Sep 13 '17 at 05:38
  • Concur, efficiency is down to 50% at 80 degrees, and you are limited to 8 mA instead of 20 mA, so you will get 20% of the light you would get at 20 mA and 20 degrees C. That is not as bad as you might think, as the human eye is very much nonlinear. – Dan Mills Sep 13 '17 at 12:10

1 Answer


Since the comments agree, the derating described is correct. That means: when driving the LED safely at 80 °C (with about 8 mA of forward current), the relative intensity is down to about 0.4 × 0.5 = 0.2 (with respect to the intensity at a forward current of 20 mA).
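If you want to automate the lookup, here is a minimal Python sketch of the read-and-compound procedure. The curve sample points below are illustrative placeholders (chosen to roughly match the numbers discussed above), not values taken from the actual datasheet; substitute the points you read off your own part's curves:

```python
import numpy as np

# Illustrative sample points only -- replace with values read off the real
# datasheet curves. Each x-array must be increasing for np.interp.

# Current derating curve: (ambient temperature in degC, max safe forward current in mA)
derate_t = [25.0, 50.0, 80.0, 100.0]
derate_i = [20.0, 15.0, 8.0, 0.0]

# Relative intensity vs. forward current: (forward current in mA, relative intensity)
cur_i   = [0.0, 5.0, 8.0, 10.0, 20.0]
cur_rel = [0.0, 0.25, 0.40, 0.50, 1.00]

# Relative intensity vs. temperature: (temperature in degC, relative intensity)
temp_t   = [25.0, 50.0, 80.0, 100.0]
temp_rel = [1.00, 0.75, 0.50, 0.35]

def relative_intensity(t_amb_c: float) -> float:
    """Compound the two independent intensity effects at a given ambient temperature."""
    i_max = np.interp(t_amb_c, derate_t, derate_i)  # max safe drive current at this temperature
    f_i = np.interp(i_max, cur_i, cur_rel)          # intensity factor from the reduced drive current
    f_t = np.interp(t_amb_c, temp_t, temp_rel)      # intensity factor from the hot junction
    return f_i * f_t                                # the two factors multiply

print(relative_intensity(80.0))  # 0.4 * 0.5 = 0.2, matching the hand calculation
```

The key design point is the last line of the function: because the two curves describe independent effects (less drive current allowed, and less light per milliamp when hot), their factors multiply rather than add.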
