Intro: I am designing a brushless motor driver. The relevant specs here are: it operates from 6 V to 60 V. To control the motor, the circuit's microcontroller needs to sample the signal coming from the motor phases through its ADC, whose input cannot exceed 3.3 V.
The phase peak voltage is ideally equal to the supply voltage, and for the three wires it looks like the waveform shown below:
So, if a simple resistive divider is used to measure each phase:
- when the supply is 60 V, the phase value at the ADC input must stay below 3.3 V;
- when the supply is 6 V (10x lower than 60 V), the peak at the ADC input will be 3.3 V divided by 10, i.e., 0.33 V.
The Problem: Notice that with plain resistors, 10x of measurement precision is lost (more than 3 bits!). This can cause instability and worse control at lower supply voltages, compared to running from a 60 V supply.
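To make the resolution loss concrete, here is a small sketch of the arithmetic. It assumes a 12-bit ADC (the post does not state the resolution; the 3-bit figure holds for any resolution) and a divider sized so the 60 V peak lands exactly at 3.3 V:

```python
import math

V_ADC_MAX = 3.3      # ADC full-scale input (from the post)
V_SUPPLY_MAX = 60.0  # highest supply voltage
V_SUPPLY_MIN = 6.0   # lowest supply voltage
ADC_BITS = 12        # assumed ADC resolution, not stated in the post

# Divider ratio so that a 60 V phase peak maps to 3.3 V at the ADC pin
ratio = V_ADC_MAX / V_SUPPLY_MAX  # = 0.055

counts_full = 2**ADC_BITS - 1
# Usable ADC codes when the supply (and hence the phase peak) is only 6 V
counts_at_6v = counts_full * (V_SUPPLY_MIN * ratio) / V_ADC_MAX

bits_lost = math.log2(counts_full / counts_at_6v)
print(f"codes at 60 V supply: {counts_full}")
print(f"codes at  6 V supply: {counts_at_6v:.0f}")
print(f"effective bits lost:  {bits_lost:.2f}")
```

With a fixed divider the usable code range shrinks in proportion to the supply, so at 6 V only a tenth of the codes are used: log2(10) ≈ 3.32 bits of effective resolution are lost, matching the "more than 3 bits" above.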
Solutions? I was thinking about using an AGC to scale the signal at the ADC input based on the supply voltage. No matter what the supply voltage is, the ADC should see a 0-to-3.3 V signal linearly proportional to the phase signal. What AGC chip would you guys recommend? What else could be a simpler solution?