I feel like I am going a little crazy here, because the more I read about these things the less I understand them.
So I understand that an ideal voltage source has zero output impedance, i.e. no extra resistance in series between itself and the load, ensuring that nothing drops any voltage along the way, so the load sees exactly the rated voltage.
An ideal current source has infinite output impedance, i.e. infinite resistance in parallel with the load, ensuring that the load gets every bit of the rated current (none is diverted through a parallel path).
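To make my mental model concrete, here is a small numeric sketch of how I currently picture a real (non-ideal) voltage source: an ideal source with some internal series resistance, so the load voltage sags as the load draws more current. The 10 V, 1 Ω, and load values are just numbers I made up:

```python
# A "real" voltage source: ideal 10 V source in series with internal resistance Rs.
Vs = 10.0   # ideal source voltage (made-up value)
Rs = 1.0    # internal series resistance (made-up value)

for R_load in [1000.0, 100.0, 10.0, 1.0]:
    I = Vs / (Rs + R_load)   # Ohm's law around the series loop
    V_load = I * R_load      # voltage actually seen by the load
    print(f"R_load={R_load:7.1f} ohm -> I={I:.4f} A, V_load={V_load:.4f} V")
```

If I run this, the load voltage is close to 10 V only while the load resistance is much bigger than the internal resistance, which I think is what people mean by the source being "stiff" for light loads.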
I understand that these "ideal" sources don't exist in practice; in reality there is always some resistance. What I don't understand is why we are then able to switch between them, or how we can talk about insensitivity to load changes, when Ohm's Law says V = IR. If I have a voltage source, isn't V fixed? And if I then add a bigger resistor, doesn't that mean less current must flow? Likewise for a current source: if I change the resistor, am I not changing the voltage? Since changing the load changes the other variable either way, why are we able to swap between them? (By swap/switch I mean replacing voltage sources with current sources and vice versa.)
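For what it's worth, here is the kind of "swap" I mean, checked numerically under my assumption that this is the Thevenin/Norton equivalence from my textbook. All the values (10 V, 2 Ω, the load sweep) are made up; I can verify that both forms give the same load voltage and current, I just don't understand why:

```python
# Thevenin form: Vth = 10 V in series with R = 2 ohm (made-up values).
# Norton form: In = Vth / R = 5 A in parallel with the same 2 ohm.
Vth, R = 10.0, 2.0
In = Vth / R

for R_load in [0.5, 2.0, 8.0, 50.0]:
    # Thevenin: one series loop, so Ohm's law gives the load current directly.
    I_thev = Vth / (R + R_load)
    V_thev = I_thev * R_load
    # Norton: the source current flows into R in parallel with R_load.
    V_nort = In * (R * R_load / (R + R_load))
    I_nort = V_nort / R_load
    assert abs(V_thev - V_nort) < 1e-9 and abs(I_thev - I_nort) < 1e-9
    print(f"R_load={R_load:5.1f}: V_load={V_thev:.4f} V, I_load={I_thev:.4f} A")
```

The assertions pass for every load I try, so the two forms really do look identical from the load's point of view, but that is exactly the part I'd like someone to explain.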
None of it makes any sense to me, because I rarely see examples with actual numbers to illustrate what's going on and how I am supposed to think about these things. Can anyone please give a few worked examples showing these "ideal" forms, why they are unrealistic, why we are able to switch between them, and what a "real" source looks like?