
OK, maybe not all of them, but by far most of the 5 mm LEDs I've ever handled are rated with I(max) = 20 mA and I've been using them for maybe over 15 years.

Why is this? Is it caused by the size of the chip that 20 mA is the max? Is it for historical reasons? Convenience in remembering specs? Does a modern 20 mA chip not fit in a 3 mm package? Is it a dissipated-power thing? (I guess not, as blue LEDs dissipate almost double the power of a red LED.)
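For reference, the "almost double" estimate follows directly from the forward voltages at the same 20 mA. A rough comparison, assuming typical forward voltages of about 2.0 V for red and 3.2 V for blue (assumed figures, not from any particular datasheet):

```python
# Rough dissipated-power comparison at the common 20 mA rating.
# Forward voltages are typical assumed values, not datasheet figures.
I_F = 0.020       # forward current, A
VF_RED = 2.0      # assumed red LED forward voltage, V
VF_BLUE = 3.2     # assumed blue LED forward voltage, V

p_red = VF_RED * I_F      # 40 mW
p_blue = VF_BLUE * I_F    # 64 mW

print(f"red:  {p_red * 1e3:.0f} mW")
print(f"blue: {p_blue * 1e3:.0f} mW ({p_blue / p_red:.1f}x the red figure)")
```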

stevenvh
jippie
  • Possibly because the first historical use was for "current loop" communication lines; those were standardized on 20 mA. Possibly exactly because the first LEDs were specced at 20 mA. – Jun 27 '12 at 23:48
  • @Rocket - shouldn't they be rated at 16 mA then, i.e. 20 mA minus the 4 mA reference current? Having a 500 ohm resistor parallel to the LED will do away with the 4 mA; otherwise it may also light faintly. – stevenvh Jun 28 '12 at 08:31
  • @Rocket: The common 4-20 mA communication loop was around before LEDs of any sort were popular. This was also an analog signalling method, so I don't see LEDs being very relevant. – Olin Lathrop Jun 28 '12 at 11:46
  • @stevenvh, although I do not support the hypothesis, 4 mA is probably low enough to only generate a negligible amount of light and 20 mA is full on. The 4-20 mA system uses the 4 mA to know the cable is still connected; a light glow on the other side could probably be used to detect there is still some bias present. This is probably not the reason, but it entertains me. – Kortuk Jun 28 '12 at 13:12
  • I just breadboarded a 4 mA current source and pushed in one of my maybe >20-year-old red LEDs. @kortuk I can't call it a light glow, it is a very well visible red indicator light. With a 470 ohm resistor in parallel, all that is left is a slight glow. When I set the current source to 20 mA, the 470 ohm resistor barely has any effect (a rough calculation of this current split follows these comments). Myth busted? ;o) – jippie Jun 28 '12 at 15:50
  • @jippie what is the difference in brightness, can you easily make it out using a photodiode, LDR, or phototransistor? You have busted it at Mythbusters level, but let's do this for science, not for Mythbusters. I have what I need at work to test this next week if needed. – Kortuk Jun 28 '12 at 15:51
  • @jippie, I have purchased books on EE history and have enjoyed one in particular greatly. I find it interesting, but this discussion is probably better suited to historical research, would you disagree? – Kortuk Jun 28 '12 at 15:59
  • I agree. As mentioned before, I'd hoped for a simple physical explanation. – jippie Jun 28 '12 at 16:07
  • let us [continue this discussion in chat](http://chat.stackexchange.com/rooms/3916/discussion-between-jippie-and-kortuk) – jippie Jun 28 '12 at 16:46
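To put numbers on the parallel-resistor test above, here is a minimal sketch of the current split, assuming a roughly constant 1.8 V forward voltage for an old red LED (an assumed figure; the real value depends on the part and varies with current):

```python
# Split of a fixed source current between an LED and a parallel resistor.
# V_F is an assumed, roughly constant forward voltage for an old red LED;
# in reality it varies with current, so this is only a first-order estimate.
V_F = 1.8        # assumed LED forward voltage, V
R_PAR = 470.0    # parallel resistor, ohm

def led_current(i_total):
    """Current left for the LED after the parallel resistor takes its share."""
    i_resistor = V_F / R_PAR          # about 3.8 mA at 1.8 V
    return max(i_total - i_resistor, 0.0)

for i_total in (0.004, 0.020):        # the 4 mA and 20 mA cases from the comments
    print(f"{i_total * 1e3:.0f} mA source -> LED gets ~{led_current(i_total) * 1e3:.1f} mA")
```

At 4 mA nearly all of the source current is taken by the resistor, leaving only a slight glow; at 20 mA the resistor diverts the same ~3.8 mA and barely matters, which matches the observation above.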

1 Answer


I believe the reason is the thermal junction temperature for the standard form factors. Some vendors correctly derate the maximum current for ambient temperatures above a certain point.

Epoxy makes a great thermal insulator, and a 50 µm gold wire is bonded (welded) from a top pad on the LED chip to the anode lead so as not to block much light. The chips now use transparent substrates, so almost 50% of the light comes via the metallic reflector cup. As I have tested and verified, this cup also happens to be a significant heat conductor to the chip. However, manufacturers cannot dictate that users connect it to a large ground-plane heat sink in order to run at higher current, because other reliability risks may occur, so the industry standard of 20 mA for 5 mm LEDs stays constant.

This cathode-cup connection is consistent on almost all 5 mm LEDs, but not quite all. I might add that when hand soldering it is critical not to exceed 3 seconds on the cathode, as it is THE primary heat path to the chip, but most vendors will not admit to this and most people do not hand solder these parts. Vendors define a keep-out zone 5 mm below the base of the LED as a no-solder zone to allow a time buffer for the heat to flow, but I will spare you the details. Also, most users do not have a ground plane for every LED, especially on single-sided boards or with LEDs in series.

The 3 mm LEDs that are spec'd at the same 20 mA may have a smaller chip and a higher current density, but they also have a thinner epoxy insulator to ambient, so the junction temperature is not much different.

5 mm IR LEDs are designed to pump as much IR as possible out of those TV remote controls, for range and extended battery life. They also run at a lower forward voltage, so they are often spec'd at 50 to 75 mA continuous, or pulsed at 100 mA or more.
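As a rough illustration of why pulsing permits the higher peak current, here is a sketch of the average dissipation, assuming a 1.4 V forward voltage and a 25% duty cycle (both are illustrative assumptions, not figures for any specific remote control):

```python
# Average dissipation of a pulsed IR LED versus running it continuously.
# Forward voltage, duty cycle and currents are assumed illustrative values.
VF_IR = 1.4       # assumed IR LED forward voltage, V
I_PULSE = 0.100   # peak pulsed current, A
DUTY = 0.25       # assumed duty cycle of the modulated bursts
I_CONT = 0.050    # continuous rating used for comparison, A

p_pulsed_avg = VF_IR * I_PULSE * DUTY   # about 35 mW average
p_continuous = VF_IR * I_CONT           # about 70 mW

print(f"pulsed 100 mA at 25% duty: {p_pulsed_avg * 1e3:.0f} mW average")
print(f"continuous 50 mA:          {p_continuous * 1e3:.0f} mW")
```

The average junction heating during pulsed operation can stay below the continuous case even though the peak current is doubled.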

BTW, you can improve the junction temperature by using large copper pads for the cathode or by tying it to the ground plane.

Most LEDs are rated at 20 mA because of the current density in the chip and the resulting junction temperature rise, not because of the size of the epoxy; the epoxy has a large thermal resistance. With a 3.2 V drop, devices are derated for ambients above room temperature, depending on the assumptions for the thermal resistance Rja. Since the package has no thermal conduction path except through its leads, the 20 mA limit is set by the junction temperature rise. The anode's gold bond wire is so thin (<50 µm) that it also has a high thermal resistance. That leaves the cathode, with its metallic reflector cup, as the lowest thermal resistance path.
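To make the junction-temperature argument concrete, a back-of-the-envelope sketch, assuming Rja ≈ 300 °C/W and a maximum junction temperature of 85 °C for an epoxy package (assumed typical values; the actual datasheet figures govern):

```python
# Back-of-the-envelope junction temperature rise of a 5 mm LED at 20 mA.
# R_THETA_JA and T_J_MAX are assumed typical values, not datasheet figures.
R_THETA_JA = 300.0   # assumed junction-to-ambient thermal resistance, degC/W
T_J_MAX = 85.0       # assumed maximum junction temperature for epoxy, degC
V_F = 3.2            # forward voltage used above, V
I_F = 0.020          # the common 20 mA rating, A

p_d = V_F * I_F                 # about 64 mW dissipated in the junction
delta_t = p_d * R_THETA_JA      # about 19 degC rise above ambient
t_a_limit = T_J_MAX - delta_t   # ambient above which derating is needed

print(f"dissipation:      {p_d * 1e3:.0f} mW")
print(f"junction rise:    {delta_t:.0f} degC")
print(f"derate above T_A: {t_a_limit:.0f} degC")
```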

Beware that all LEDs are spec'd at rated current at 25 °C; when you operate above that, you need to reduce your current at some point below the maximum ambient spec. For industry consistency the 20 mA spec does not change, but various ODMs may improve their package reliability so that they can allow a slightly different If vs. Ta profile. So rather than change the 25 °C spec, they change this derating curve.
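A minimal sketch of such a derating curve, assuming a linear reduction of the allowed forward current with ambient temperature and the same assumed thermal figures as in the sketch above:

```python
# Sketch of a linear I_F-vs-T_A derating curve of the kind found on
# 5 mm LED datasheets. Thermal figures are the same assumed values as above.
R_THETA_JA = 300.0   # assumed junction-to-ambient thermal resistance, degC/W
T_J_MAX = 85.0       # assumed maximum junction temperature, degC
V_F = 3.2            # forward voltage, V
I_RATED = 0.020      # the 20 mA rating at 25 degC, A

def i_f_max(t_ambient):
    """Allowed forward current so the junction stays at or below T_J_MAX."""
    i_thermal = (T_J_MAX - t_ambient) / (R_THETA_JA * V_F)
    return max(0.0, min(I_RATED, i_thermal))

for t_a in (25, 50, 70, 85):
    print(f"T_A = {t_a:2d} degC -> I_F,max ~ {i_f_max(t_a) * 1e3:4.1f} mA")
```

A vendor that improves the package (lower Rja or a higher allowed junction temperature) shifts the knee of this curve to a higher ambient while leaving the 25 °C rating at 20 mA.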

Tony Stewart EE75