When an LED is included in a circuit that applies a reverse voltage exceeding its reverse breakdown voltage, a reverse current can flow and the LED may be destroyed. But what is it that actually destroys the LED: is it the reverse voltage itself, the reverse current that is made to flow, or simply the overall power dissipation (reverse current times reverse voltage) exceeding the device rating? Or something else?
So, for instance, suppose I connect a 12 volt source in reverse, via a resistor, to an LED that breaks down at 5 volts. The reverse current causes a voltage drop across the resistor, which clamps the voltage across the device at its reverse breakdown value and thereby defines the current that flows - rather similar to what happens in the forward direction. Would this in itself destroy the LED, or is it safe as long as the total power stays within the LED's rating?
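To make the scenario concrete, here is a small sketch of the arithmetic, assuming the 5 V breakdown figure above and a hypothetical 4.7 kΩ series resistor (the resistor value is my own illustrative choice, not from any datasheet):

```python
# Worked example: 12 V supply reversed across an LED via a series resistor.
# Assumptions: the LED clamps at its reverse breakdown voltage (5 V here),
# and the resistor drops the remainder, setting the reverse current.
V_SUPPLY = 12.0   # reverse supply voltage (V)
V_BR = 5.0        # assumed LED reverse breakdown voltage (V)
R = 4700.0        # hypothetical series resistor (ohms)

i_rev = (V_SUPPLY - V_BR) / R        # reverse current once breakdown occurs
p_led = V_BR * i_rev                 # power dissipated in the LED
p_res = (V_SUPPLY - V_BR) * i_rev    # power dissipated in the resistor

print(f"Reverse current:     {i_rev * 1e3:.2f} mA")
print(f"LED dissipation:     {p_led * 1e3:.2f} mW")
print(f"Resistor dissipation:{p_res * 1e3:.2f} mW")
```

With these assumed numbers the LED dissipates only a few milliwatts in reverse, which is the crux of the question: whether staying within such a power budget is sufficient for safety, or whether reverse breakdown is damaging regardless.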
Normally, of course, one would place a regular diode in anti-parallel with the LED to limit the LED's reverse voltage to 0.7 volts or so, but there may be situations where this is not possible or economical. I am just trying to understand how much circuit design flexibility I have to meet different requirements.
And if it is possible to expose an LED to a reverse voltage safely, what precautions should be taken to avoid damage, and which datasheet parameters are relevant?