I am trying to build a circuit that produces crossover (clipping) distortion for musical purposes, to sound like a crappy old transistor radio. The problem I am running into (one among a few, but to limit the scope) is that the circuit draws a ton of current driving the BJTs and the load before the clipping becomes observable. My intention is to run it in a guitar pedal off a 9 V battery (boosted to 18 V for op-amp headroom), but that seems implausible with the current design.
Is there an efficient way to emulate crossover distortion?
Circuit
The current circuit is a simplistic Class-B amplifier. As R4 increases, the bias improves, moving the stage towards Class-AB and reducing the distortion. R4 will be implemented as a potentiometer so that musicians can dial in the amount.
[Schematic: push-pull BJT output stage; bias network R2, R3, R4; coupling capacitors C1 and C2; load R1. Created in CircuitLab.]
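To make the mechanism concrete, here is a minimal Python sketch of the static dead-zone nonlinearity a push-pull stage like this imposes. It is not a circuit-accurate model (it ignores the load and the resistor ratios); the 0.6 V turn-on voltage and the bias parameter are assumptions, not values taken from the schematic:

```python
import numpy as np

VBE = 0.6  # assumed silicon B-E turn-on voltage, volts

def crossover(x, v_bias=0.0):
    """Static dead-zone model of the push-pull stage.

    Each half conducts only once the drive exceeds VBE - v_bias,
    so any signal inside the dead band is simply lost.
    """
    dead = max(VBE - v_bias, 0.0)  # half-width of the dead zone, volts
    return np.sign(x) * np.maximum(np.abs(x) - dead, 0.0)

# One cycle of a 1 V-peak sine through the model
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
x = np.sin(2.0 * np.pi * t)
y_class_b  = crossover(x)               # unbiased: full crossover notch
y_class_ab = crossover(x, v_bias=0.4)   # partly biased: milder notch
```

Raising `v_bias` plays the same role as increasing R4: it shrinks the dead band until the stage behaves like Class-AB.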
Design constraints
The three design constraints I am dealing with are:
Keeping resistances relatively high, so that the high-pass filters formed with C1 and C2 keep a low cutoff frequency and wide-band signals pass through. Ideally the cutoffs should be below 20 Hz (a quick corner-frequency check follows this list).
R3 and R2 can be left equal for simplicity.
The ratio between R3/R2 and R4 is one factor that determines the amount of crossover clipping.
The second clipping factor is the load, R1: as it decreases, the distortion increases.
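As a sanity check on the first constraint, each coupling cap forms a first-order high-pass with the resistance it sees, with corner frequency f_c = 1/(2πRC). A quick sketch, using hypothetical part values since the schematic values are not given in the text:

```python
import math

def highpass_cutoff(r_ohms, c_farads):
    """Corner frequency of a first-order RC high-pass, in Hz."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# Hypothetical values: a 100 nF coupling cap needs to see roughly
# 80 kOhm or more for the corner to stay below 20 Hz.
print(highpass_cutoff(100e3, 100e-9))  # ~15.9 Hz -- meets the <20 Hz target
print(highpass_cutoff(10e3, 100e-9))   # ~159 Hz  -- too high, bass is lost
```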
Problem
Overall the circuit draws nearly 10 mA. For comparison, the two Sallen-Key op-amp filters before it each draw less than 1 mA (ideal LTspice simulation, mostly quiescent). Do I just have to suck it up and suggest that users keep a drawer full of batteries on hand? (Or, more likely, a wall wart.)
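For scale, a rough runtime estimate, assuming a typical ~500 mAh alkaline 9 V cell and a charge-pump doubler for the 18 V rail (both figures are assumptions, and the doubler model is deliberately crude):

```python
def battery_hours(capacity_mah, load_ma, boost_ratio=2.0, efficiency=0.9):
    """Rough runtime for a 9 V battery behind a charge-pump doubler.

    The doubler draws roughly boost_ratio * load / efficiency from the
    battery; capacity is a typical alkaline 9 V figure, not a datasheet value.
    """
    battery_ma = boost_ratio * load_ma / efficiency
    return capacity_mah / battery_ma

print(battery_hours(500, 10))  # ~22.5 h for the 10 mA Class-B stage alone
print(battery_hours(500, 2))   # ~112 h for the two ~1 mA op-amp filters
```

Under those assumptions the output stage alone cuts battery life by roughly a factor of five relative to the filter stages.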