As I'm looking at some resistive-load applications, I'm wondering whether there are any rules that put an upper limit on supply-matching efficiency. If a load isn't matched to its power supply in terms of voltage, current, and waveform (typically AC vs. DC), then we have to add all sorts of components, and depending on how mismatched the supply and the load are, we may end up burning off a lot of waste power to bring them into alignment. To what extent is this just for design simplicity, and to what extent is it unavoidable?
Is there some minimum waste-power cost involved in stepping voltage up or down? My understanding is that in the AC realm the losses are negligible even in practice with transformers, as they are in the DC realm with multipliers. But are those low losses limited to cases where the output is a fixed multiple of the input? Does it get more expensive, in terms of waste power, to hit a specific voltage that isn't an integer (AC) or binary (DC) multiple of the input?
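To make the voltage-stepping part of the question concrete, here's the arithmetic I have in mind (the numbers are made up, and the "ideal" converter is a theoretical lossless one, not any real part):

```python
# Hypothetical case: stepping a 12 V supply down to a 5 V load drawing 1 A.
V_IN, V_OUT, I_LOAD = 12.0, 5.0, 1.0

# A linear (dissipative) regulator drops the excess voltage across a pass
# element at the full load current, so the waste is set by the mismatch itself:
p_load = V_OUT * I_LOAD                   # 5 W delivered to the load
p_waste_linear = (V_IN - V_OUT) * I_LOAD  # 7 W burned as heat
eff_linear = p_load / (p_load + p_waste_linear)

# An ideal switching (buck) converter trades voltage for current instead, so
# in the lossless limit input power equals output power and it simply draws
# less current from the supply:
i_in_ideal = p_load / V_IN  # ~0.417 A from the 12 V rail
eff_ideal = 1.0             # no dissipation required in principle

print(f"linear regulator efficiency: {eff_linear:.0%}")  # ~42%
print(f"ideal buck efficiency:       {eff_ideal:.0%}")   # 100%
```

So the question is whether anything forces a real converter toward the first number rather than the second once the ratio isn't a "nice" multiple.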
Current seems to be an efficiency killer. That is, once the current in a circuit becomes significant, there seems to be no way to control it without generating a lot of waste heat. What theoretical constraints are there on current-throttling efficiency, and on what variables do they depend?
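As an example of the kind of current throttling I mean (again with made-up numbers), consider the simplest dissipative approach, a series resistance:

```python
# Hypothetical case: a 6-ohm resistive load on a 12 V rail would draw 2 A,
# but we want to throttle it to 1 A by adding series resistance.
V_SUPPLY = 12.0
R_LOAD = 6.0    # ohms; alone it would draw V_SUPPLY / R_LOAD = 2 A
I_TARGET = 1.0  # amps we want to allow through

# Series resistance needed so the total resistance yields the target current:
r_series = V_SUPPLY / I_TARGET - R_LOAD  # 6 ohms

# The same current flows through both, so dissipation splits by resistance:
p_load = I_TARGET**2 * R_LOAD      # 6 W doing useful work
p_waste = I_TARGET**2 * r_series   # 6 W lost in the throttling element
eff = p_load / (p_load + p_waste)  # 50% here

print(f"series-resistor throttling efficiency: {eff:.0%}")  # 50%
```

A switching approach (chopping the supply with PWM and filtering) avoids that dissipation in the ideal limit, which is what makes me wonder whether the waste is fundamental or just an artifact of the dissipative technique.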