I'm copying my answer to an older question here, because the title of this question makes it much more likely to be found in the future. That answer was originally meant to address part of this question.
> Is analog signal division possible (as FPU multiplication often takes one CPU cycle anyway)?
If you have an analog multiplier, an analog divider is "easy" to make:

Assuming X1 and X2 are positive, this solves Y = X1 / X2.
Analog multipliers do exist, so this circuit is possible in principle.
Unfortunately most analog multipliers have a fairly limited range of
allowed input values.
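
To make the feedback idea concrete, here is a minimal numerical sketch in Python (an assumed toy model, not the actual schematic): an integrator accumulates the error X1 - Y*X2, so Y settles at X1 / X2 for positive X2. The function name, gain, and step values are purely illustrative.

```python
def analog_divide(x1, x2, k=1000.0, dt=1e-4, steps=20000):
    """Toy Euler model of an integrator driving a multiplier in feedback."""
    y = 0.0
    for _ in range(steps):
        error = x1 - y * x2   # difference seen at the loop's summing node
        y += k * error * dt   # integrator slews Y until Y*X2 matches X1
    return y

print(analog_divide(6.0, 2.0))   # ~3.0
print(analog_divide(1.0, 4.0))   # ~0.25
```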
Another approach would be to first use log amplifiers to get the
logarithm of X1 and X2, subtract, and then exponentiate.
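
The log-amp route rests on nothing more than the identity exp(ln X1 - ln X2) = X1 / X2, which again only works for positive inputs. A trivial sketch of that identity (real log amps operate on currents or voltages and add their own error terms, which this ignores):

```python
import math

def log_divide(x1, x2):
    # exp(ln(x1) - ln(x2)) == x1 / x2, valid only for x1, x2 > 0
    return math.exp(math.log(x1) - math.log(x2))

print(log_divide(6.0, 2.0))   # ~3.0
print(log_divide(0.5, 8.0))   # ~0.0625
```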
> Would it be theoretically possible to speed up modern processors if one would use analog signal arithmetic (at the cost of precision) instead of digital FPUs (CPU -> ADC -> analog FPU -> DAC -> CPU)?
At heart it's a question of technology: so much has been invested in
R&D to make digital operations faster that analog technology would
have a long way to go to catch up at this point. But there's no way to
say it's absolutely impossible.
On the other hand, I wouldn't expect my crude divider circuit above to
work above maybe 10 MHz without some very careful design work, and
perhaps deep research, to get it to go faster.
Also, you say we should neglect precision, but a circuit like the one
I drew is probably only accurate to 1% or so without tuning, and
probably no better than 0.1% without inventing new technology. And the
dynamic range of inputs that can usefully be calculated on is
similarly limited. So not only is it probably 100 to 1000 times slower
than available digital circuits, its dynamic range is probably about
10^300 times worse as well (compared to IEEE 64-bit floating point).
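
For a rough sense of where a figure like 10^300 comes from, here is a back-of-envelope comparison with assumed numbers, treating the analog circuit as usable over about 3 decades of input:

```python
import math, sys

analog_decades = 3                               # assumption: ~3 usable decades
double_decades = math.log10(sys.float_info.max)  # about 308 decades for IEEE doubles
print(double_decades - analog_decades)           # ~305, i.e. roughly 10^300 worse
```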