I'm fairly new to this and I was wondering if it makes a difference whether I put the current-limiting resistor for an LED on the cathode side or the anode side. Any "best practice", and why?
Thanks!
Electrically it makes no difference, though there are a few practical reasons to put it on the high side.
However, there are exceptions, depending on the circuit design. Suppose, for example, you used a BJT as a switch to turn the LED on and off. Depending on the design, it may be prudent to put the resistor on the high side, as usual, or perhaps on the low side; it depends on the biasing voltage available, and so on. My point is that there are other considerations. The explanations others and I have given depend heavily on the assumption of a simple circuit.
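Wherever the resistor sits, its value comes from Ohm's law across whatever voltage is left after the LED (and any switch) drops its share. A quick sketch, assuming hypothetical values of a 5 V supply, a 2 V red LED, a low-side NPN switch with about 0.2 V of saturation drop, and a 10 mA target current:

```python
def led_resistor(vcc, v_led, i_led, v_switch=0.0):
    """Current-limiting resistor via Ohm's law: R = (Vcc - Vf - Vswitch) / I.

    v_switch is the drop across any series switch (e.g. a BJT's Vce(sat));
    pass 0.0 for a plain supply -> R -> LED -> GND chain.
    """
    return (vcc - v_led - v_switch) / i_led

# Plain chain: (5 - 2) / 0.010 = 300 ohms
print(led_resistor(5.0, 2.0, 0.010))

# With a saturated NPN switch in series: (5 - 2 - 0.2) / 0.010 ~= 280 ohms
print(led_resistor(5.0, 2.0, 0.010, v_switch=0.2))
```

In practice you'd round up to the nearest standard value (330 Ω, say) to stay safely under the LED's rated current.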
Either is fine, though if the LED is remote it should go on the high side (Vcc, or whatever you are powering the LED from).
I would put the resistor in the positive supply, so:
Vcc -> R -> LED anode -> LED cathode -> GND
This has no electrical benefit to the LED's operation, but it has a few practical advantages:
If the LED is panel-mounted or otherwise remote, accidental shorting of its wires to GND or earth won't short out the supply.
When fault-finding, the LED terminals are measured with respect to GND, so you can clip the black DMM probe to GND and whip round your circuit terminals with the red probe.
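That fault-finding point has a simple numeric side: with the resistor on the high side, a working LED's terminals sit at small, predictable voltages above GND. A sketch using the same hypothetical 5 V / 2 V / 300 Ω figures as above:

```python
# With Vcc -> R -> LED -> GND, the node voltages measured against GND are:
vcc, v_led, r = 5.0, 2.0, 300.0
i = (vcc - v_led) / r          # LED current, 10 mA

anode = vcc - i * r            # resistor output / LED anode: 2.0 V
cathode = 0.0                  # LED cathode is tied straight to GND

print(anode, cathode)
# A probe reading of ~Vcc at the anode would instead suggest an open
# (failed) LED, since no current then flows through the resistor.
```

Seeing roughly the LED's forward voltage at the anode and 0 V at the cathode confirms the LED is conducting; anything else points at the fault.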