This was actually quite commonly done back in the day.
In fact, some soundcards had a little auto-detect feature: if you plugged in a mono 3.5 mm plug they would output S/PDIF, while if you plugged in a stereo one they output analog. The switching was likely implemented in driver software, but it was fairly transparent to the user - somewhere I still have the little 3.5 mm mono-to-RCA adapter that shipped with one of these cards.
As for distortions due to impedance discontinuity: this does happen, but while you can just barely see it on a scope, normal equipment is not bothered. Early in my career, though, I spent a while trying to debug a product which seemed hypersensitive to various distortions - a product manager was amazed to watch as I sat there with a Y adapter at one end of a two-meter cable, swapping various terminating resistors I had soldered to RCA plugs and alternately making the product work and not work. Ultimately I went so far as to calculate an S/PDIF frame with various distortions and load it into an arbitrary waveform generator to demonstrate some of the specific susceptibilities. It turned out, however, that the DSP vendor had given us a bad binary blob for their proprietary chip which mis-operated the clock-recovery PLL. When they finally sent one that used the hardware correctly, the system stopped being sensitive to the various minor imperfections we'd been discovering in a long list of computer and DVD player outputs.
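For anyone curious what "calculating an S/PDIF frame" looks like in practice, here's a rough Python sketch of the general idea (not the actual test frames from that project): biphase-mark encode one subframe into a sequence of half-bit-cell levels, which you could then distort and resample into an arbitrary waveform generator pattern. The preamble patterns, LSB-first bit ordering, and even-parity handling reflect the consumer IEC 60958 framing as I recall it; treat the details as illustrative rather than authoritative.

```python
def bmc_encode(bits, level):
    """Biphase-mark code a bit sequence.

    Every bit cell starts with a transition; a '1' adds a second transition
    in the middle of the cell, a '0' does not.  Returns (half_cells, level).
    """
    out = []
    for b in bits:
        level ^= 1            # transition at the start of every bit cell
        out.append(level)
        if b:
            level ^= 1        # mid-cell transition only for a '1'
        out.append(level)
    return out, level


def spdif_subframe(sample_24bit, preamble="X", level=0):
    """Build one 32-slot subframe: 8 preamble half-cells + 28 BMC-coded bits."""
    # Preamble patterns (8 half-cells), written assuming the previous level
    # was low; inverted if the running level is high.
    preambles = {"X": [1, 1, 1, 0, 0, 0, 1, 0],   # channel A (a.k.a. M)
                 "Y": [1, 1, 1, 0, 0, 1, 0, 0],   # channel B (W)
                 "Z": [1, 1, 1, 0, 1, 0, 0, 0]}   # block start (B)
    pre = preambles[preamble]
    if level == 1:
        pre = [p ^ 1 for p in pre]
    level = pre[-1]

    # Time slots 4..31: 24 audio bits (LSB first), then validity, user,
    # channel status, and even parity over the preceding 27 bits.
    audio = [(sample_24bit >> i) & 1 for i in range(24)]
    v, u, c = 0, 0, 0
    parity = sum(audio + [v, u, c]) & 1
    data, level = bmc_encode(audio + [v, u, c, parity], level)
    return pre + data, level


# Example: one subframe carrying a mid-scale sample, printed as half-cell levels.
halves, _ = spdif_subframe(0x400000, preamble="Z")
print("".join(str(h) for h in halves))
```

From a level sequence like that, introducing the "distortions" is just a matter of how you map half-cells to AWG samples - stretching edges, rounding corners, adding reflections - which is what made it useful for probing exactly which imperfections the receiver choked on.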
So in conclusion: it was done, and with equipment that isn't fundamentally broken it works fine.