No site seems to explain this well. For example, if a transistor amplifies current, shouldn't there be a larger voltage drop after the current is, so to speak, amplified by 100? But the voltage drop seems to be very small.
My question arises from this: what do we really mean when we say a transistor amplifies current?
So I am looking for a clear explanation. Thanks so much!
When we say that a transistor amplifies current, we mean that a small base current allows a much larger current to flow through the collector; it doesn't magnify the current already flowing in a circuit. As described, the circuit doesn't make sense on its own: there has to be another power source supplying the larger current that the transistor controls, since a transistor is essentially acting as a switch (or valve) for that supply.
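To make this concrete, here is a minimal sketch in Python, assuming a generic NPN transistor with a current gain (beta) of 100 driving an illustrative 5 V supply and 1 kΩ load; all of these numbers are hypothetical:

```python
# A minimal sketch of the idea, assuming a generic NPN transistor with a
# current gain (beta) of 100. The supply and load values are illustrative.

def collector_current(base_current_a, beta=100, supply_v=5.0, load_ohms=1000):
    """Return the collector current in amps.

    The transistor 'amplifies' the base current by beta, but the collector
    current can never exceed what the external supply and load allow
    (saturation): I_C(max) ~= V_supply / R_load.
    """
    amplified = beta * base_current_a   # what the transistor 'asks for'
    limit = supply_v / load_ohms        # what the circuit can actually deliver
    return min(amplified, limit)

# 30 microamps into the base -> 3 mA through the collector, all of it
# drawn from the separate supply, not from the base signal itself.
print(collector_current(30e-6))  # 0.003 (3 mA, active region)
print(collector_current(1e-3))   # 0.005 (saturated: limited by the supply)
```

Notice that the amplified current comes entirely from the separate supply; the base current only opens the valve, and the supply/load limit (saturation) caps how much can flow.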
There are equations for calculating the voltage drop across the transistor itself, but it is a tiny amount, and almost any circuit will still work despite it, precisely because it is so small.
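As a rough back-of-the-envelope check, assuming a typical saturation drop of about 0.2 V for a small-signal BJT (the exact figure varies by part; check the datasheet):

```python
# A rough check of how little the transistor's drop matters, assuming a
# typical V_CE(sat) of about 0.2 V for a small-signal BJT.
supply_v = 5.0
v_ce_sat = 0.2                 # assumed typical saturation drop
load_v = supply_v - v_ce_sat   # voltage left across the load
print(load_v)                  # 4.8 -- the load barely notices the drop
```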
This is all thanks to The Photon clearing this up, so thank you The Photon!