In my current electrical engineering course, the concept of power factor has been a major topic. It is defined as the ratio of real power to apparent power, and it is presented as a measure of how much of the power drawn by a circuit is consumed resistively. The other kind of "loss" present in the circuit is the reactive or "parasitic" power.
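For reference, these are the definitions as I understand them from my course, for sinusoidal steady state with $\varphi$ the phase angle between voltage and current:

$$
S = V_\text{rms} I_\text{rms}, \qquad P = S\cos\varphi, \qquad Q = S\sin\varphi, \qquad \text{PF} = \frac{P}{S} = \cos\varphi
$$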
In my course material it is presented as a quantity that should always be maximized. What I don't understand is why resistive power consumption is always a good measure of the efficient use of power. There are plenty of components where drawing reactive power is desirable, like transformers. If I connect a power source to an unloaded transformer, the power it draws is mainly reactive, and resistive power loss (in the winding resistance) would actually be undesirable. Yet if we use power factor as a measure of circuit performance, it would be poor in this case, since the power drawn is mainly "parasitic".
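To make my confusion concrete, here is a toy calculation in which I idealize the unloaded transformer as a purely inductive magnetizing branch (my simplification, not from the course material):

$$
\varphi = 90^\circ \;\Rightarrow\; \text{PF} = \cos 90^\circ = 0, \qquad P = 0, \qquad Q = V_\text{rms} I_\text{rms}
$$

So the transformer is doing exactly what it is designed to do, yet its power factor is the worst possible.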
Now that I think of it, I can't really come up with many applications where purely resistive dissipation is the goal (a resistive heater being one). So what do we really consider as consuming "real power" in this sense? What counts as consuming "real" power versus "parasitic"/reactive power when the circuit receiving power is more complicated than a simple textbook circuit consisting of just a resistor, a capacitor, and an inductor?
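For a general linear load I can at least do the bookkeeping numerically from phasors; here is a small Python sketch of how I understand the calculation (the impedance values are made up for illustration):

```python
# Phasor power bookkeeping for a single-frequency, linear load.
# Convention: S = V * conj(I), so P = Re(S) and Q = Im(S).

def load_power(v_rms: complex, z_load: complex):
    """Return (P, Q, |S|, PF) for a load impedance driven by an RMS voltage phasor."""
    i = v_rms / z_load          # current phasor (RMS)
    s = v_rms * i.conjugate()   # complex power, VA
    p = s.real                  # real power, W
    q = s.imag                  # reactive power, var
    s_mag = abs(s)              # apparent power, VA
    pf = p / s_mag              # power factor = cos(phase angle)
    return p, q, s_mag, pf

# Made-up example: 230 V RMS source, load = 10 ohm resistor in series
# with an inductive reactance of 30 ohm at the supply frequency.
p, q, s_mag, pf = load_power(230 + 0j, complex(10, 30))
print(f"P = {p:.1f} W, Q = {q:.1f} var, S = {s_mag:.1f} VA, PF = {pf:.3f}")
# -> P = 529.0 W, Q = 1587.0 var, S = 1672.8 VA, PF = 0.316
```

But even with this bookkeeping in hand, my question stands: it tells me *how* to split the power into real and reactive parts, not *why* the real part is the one we should call useful.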