I've always been intrigued by common-mode noise, since it seems to be such a pervasive issue that op-amps, chokes, etc. are routinely used as solutions to remove it.
I understand why it is a bad thing from a digital-signalling point of view (as explained very well here): on a differential pair, it means the differential voltage (V+ − V−) taken at the receiver will bounce and fluctuate as the common-mode voltage shifts, when in reality you would want it to be stable.
What I still haven't quite understood is why it is a bad thing on power lines (mainly low-voltage ones; I'm not really talking about mains here, though I suppose it's the same deal). For example, consider a power supply (maybe even an isolated one) with a 24 V output. From the designs I have seen, it is fairly common to use a common-mode choke on the output:
What exactly would this achieve? I could be wrong, but no matter what the common-mode noise is, the device being powered shouldn't care, since the only voltage that matters is the differential voltage between the two rails. I.e., at some instant, say the common-mode noise is rapidly bouncing between −10 V and +3.5 V. The rails (Vcc, GND) will be bouncing between (14 V, −10 V) and (27.5 V, 3.5 V), which still leaves the differential at 24 V at either extreme: 14 − (−10) = 24 and 27.5 − 3.5 = 24 (duh).
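To make sure I'm not fooling myself with the arithmetic, here is a toy sketch of that reasoning: apply the same common-mode offset to both rails of an ideal 24 V supply and check that the rail-to-rail (differential) voltage never changes. (The numbers and the `rails` helper are just my own illustration, not anything from a real design.)

```python
# Idealized model: common-mode noise adds the SAME voltage to both rails.
V_DIFF = 24.0  # nominal supply output, volts

def rails(v_cm: float) -> tuple[float, float]:
    """Return (Vcc, GND) with a common-mode offset v_cm applied to both rails."""
    return (V_DIFF + v_cm, v_cm)

for v_cm in (-10.0, 0.0, 3.5):
    vcc, gnd = rails(v_cm)
    print(f"Vcm = {v_cm:+5.1f} V  ->  rails = ({vcc:5.1f} V, {gnd:5.1f} V), "
          f"differential = {vcc - gnd:.1f} V")
# The differential comes out as 24.0 V in every case, which is exactly
# why I don't see what the choke is protecting the load from.
```

Of course, this assumes the load only ever responds to the differential voltage, which may be exactly the assumption that breaks down in practice.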
So, why is the choke helpful at all?
Digi-Key has a whole category for them ("Common Mode Chokes > Power Line"), so surely there is some reason they are used.