
What is the current through a diode in series with a resistor? Suppose the source is 12 V with a maximum output of 800 mA, and a forward-biased diode with a 0.7 V drop is connected in series with a 10 ohm resistor. From the equation we know that:

$$I_D = \frac {V_\text{cc}-V_D}{R}$$

Solving with the above parameters gives \$I_D = \frac{12 - 0.7}{10} = 1.13\text{ A}\$. I am confused. How did the current increase to such an amount?
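For concreteness, here is a minimal sketch of that same calculation (the variable names are mine, not from the question):

```python
# Constant-drop diode model: I_D = (Vcc - V_D) / R
V_CC = 12.0    # source voltage, volts
V_D = 0.7      # assumed diode forward drop, volts
R = 10.0       # series resistance, ohms
I_MAX = 0.8    # supply's rated maximum current, amps

I_D = (V_CC - V_D) / R
print(f"I_D = {I_D:.2f} A")                      # -> I_D = 1.13 A
print(f"Exceeds 800 mA rating: {I_D > I_MAX}")   # -> True
```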

JRE
  • Increased? From what? `800 mA` is the *maximum* rating, so if your calculated current (which is correct, BTW, if you parenthesize it properly) is more than that, then your power supply is not adequate to power this circuit. – Eugene Sh. Jul 07 '22 at 21:48
  • The current is the same through everything in a series circuit, so you just have to figure it for the resistor. – dandavis Jul 07 '22 at 21:51
  • @dandavis The OP calculated the current correctly. – Eugene Sh. Jul 07 '22 at 21:52
  • The 800 mA current-limiting protection (if there is any) on the power supply will force its output voltage to drop from 12 V to about 8.7 V, so that the current through the resistor is (8.7 − 0.7)/10 = 800 mA (see the worked equation after these comments). –  Jul 07 '22 at 22:03
  • Tip: in English we usually say "current **through** a component" and "voltage **across** a component". – Transistor Jul 07 '22 at 22:09
  • See [this](https://electronics.stackexchange.com/a/592785/38098) for a closed-form solution. – jonk Jul 07 '22 at 22:29
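If the supply really does limit at 800 mA, as the comment above describes, the new operating point follows from KVL around the loop (a worked restatement of that comment, not part of the original thread):

$$V_\text{out} = I_\text{limit}R + V_D = 0.8 \times 10 + 0.7 = 8.7\text{ V}$$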

1 Answer


The 10 ohm resistor and series diode would like to draw 1.13 A from your 12 V source, but if the source can only deliver 800 mA, then either the source will reduce its voltage until the resistor and diode draw only 800 mA, or the source may be damaged as it attempts to deliver more current than it is designed for.
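As a rough illustration of that behavior (not from the answer itself), here is a sketch assuming an idealized supply that holds 12 V until it reaches its 800 mA limit and then behaves as a constant-current source; the function name and model are mine:

```python
def operating_point(v_src, i_limit, v_diode, r):
    """Idealized supply: constant voltage up to the current limit,
    then constant current (output voltage folds back as needed)."""
    i_demand = (v_src - v_diode) / r   # current the load would like to draw
    if i_demand <= i_limit:
        return v_src, i_demand          # supply holds its rated voltage
    # Current limiting: voltage drops to the value where the loop carries i_limit
    v_out = i_limit * r + v_diode
    return v_out, i_limit

v_out, i = operating_point(12.0, 0.8, 0.7, 10.0)
print(f"Output: {v_out:.1f} V, {i * 1000:.0f} mA")  # -> Output: 8.7 V, 800 mA
```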

Peter Bennett