I'm trying to reconcile the physical steps that go into the operation of an AC induction motor. Looking online, I either find broad, general explanations that I already know and understand, or very specific, math-heavy explanations that get overly detailed.
I understand DC motor operation and describe it in my head like this: "From standstill, voltage is applied to the rotor. This creates a current in the rotor, which generates a magnetic field that starts to turn the rotor due to attraction to the stator's magnets. At the start, the rotor has little movement relative to the stator's field, so little back-emf is created, allowing maximum current to flow, and so the machine has maximum torque at the start. As it gets up to speed, the rotor's quick movement cuts through more of the stator's flux and generates a stronger back-emf, which fights the input voltage, limiting the current into the rotor, and the torque drops off until some balance of speed and torque is reached with the load."
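To put that picture in rough symbols (k and R here are just my own shorthand for a motor constant and the rotor resistance, not notation from any particular reference):

$$ V_{in} = I\,R + E_{back}, \qquad E_{back} = k\,\omega, \qquad T = k\,I $$

So at standstill ($\omega = 0$) there is no back-emf, the current is at its largest ($V_{in}/R$), and so is the torque; as the speed rises, $E_{back}$ eats into $V_{in}$ and both the current and the torque fall off.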
But when I try to build the same "story" for an induction motor, I must have some fundamental misunderstanding (probably about back-emf), because I run into this: "At standstill, an AC voltage is applied to the stator windings, quickly setting up a rotating magnetic field. This induces a voltage in the rotor coils, creating a current that produces its own magnetic field, such that the induced voltage/field opposes what created it. The relative motion between the rotating stator field and the rotor coils would be at its maximum at startup, creating the maximum possible back-emf..." And that's where my thinking fails. If the back-emf is biggest at the start, it fights the input voltage the most at the start, which means the least current and the least torque at the start... and the thing would never get up to speed.
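For what it's worth, the treatments I've found write this part in terms of the slip s, and as far as I can tell that agrees with my "maximum at startup" step (the notation below is just what I've picked up, so correct me if it's off):

$$ s = \frac{N_{sync} - N_{rotor}}{N_{sync}}, \qquad E_{rotor} \propto s, \qquad f_{rotor} = s\,f_{supply} $$

i.e. at standstill $s = 1$ and the voltage induced in the rotor is as large as it ever gets.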
Obviously that's wrong, but I don't see where. In a DC motor it makes sense easily, because the input voltage, the back-emf, and the rotation are all in the rotor. When the rotor speed is low, it cuts less flux and creates a low back-emf; when the rotor speed is high, it cuts more flux and creates a higher back-emf. A nice, self-stabilizing feedback loop.
But the way I'm thinking about it, the induction case seems reversed. At a low rotor speed, the rotor cuts through more flux, because the stator field is sweeping past it quickly. That would create a higher back-emf in the rotor, fighting the input voltage, limiting the input current, reducing the torque, and slowing the machine down or keeping it from moving at all.
Then at a high rotor speed, the rotor cuts through less flux, because it is catching up to the rotational speed of the stator's field. That would generate less back-emf in the rotor, so the input voltage isn't fought as much, increasing the current, increasing the torque, and speeding the machine up.
So my thinking ends up with the opposite of a self-stabilizing device: if it slowed down, it would want to keep slowing to a standstill, and if it sped up, it would want to keep speeding up...
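Just to check that I'm not misreading my own logic, here is a tiny numerical sketch of that naive model (every number in it is made up, and the model is the one I described above, not anything I'm claiming is real induction-motor physics). It does indeed run away from the balance point in both directions:

```python
# A tiny numerical sketch of the (apparently wrong) mental model described above.
# Every constant is made up; the model is my naive story, not real induction-motor physics:
#   - back-emf proportional to the speed of the stator field RELATIVE to the rotor,
#   - rotor current = (applied voltage - back-emf) / rotor resistance,
#   - torque proportional to that current, working against a constant load torque.

V_IN    = 100.0   # applied voltage, volts (made up)
W_SYNC  = 100.0   # stator field speed, rad/s (made up)
R_ROTOR = 1.0     # rotor resistance, ohms (made up)
K_EMF   = 0.8     # back-emf per rad/s of relative speed (made up)
K_TRQ   = 0.5     # torque per amp of current (made up)
T_LOAD  = 20.0    # constant load torque, N*m (made up)
INERTIA = 0.1     # rotor inertia, kg*m^2 (made up)
DT      = 0.001   # integration time step, s

def naive_torque(w):
    """Torque my naive story predicts at rotor speed w."""
    back_emf = K_EMF * (W_SYNC - w)          # biggest at standstill (my assumption)
    current  = (V_IN - back_emf) / R_ROTOR   # so current is smallest at standstill
    return K_TRQ * current

def final_speed(w0, seconds=2.0):
    """Euler-integrate the rotor speed from w0 and return where it ends up."""
    w = w0
    for _ in range(int(seconds / DT)):
        accel = (naive_torque(w) - T_LOAD) / INERTIA
        w = min(max(w + accel * DT, 0.0), W_SYNC)   # clamp to [0, field speed]
    return w

# With these numbers the naive torque exactly balances the load at 25 rad/s.
# Nudge the speed slightly to either side of that balance point:
for w0 in (24.0, 26.0):
    print(f"start at {w0:5.1f} rad/s -> after 2 s: {final_speed(w0):6.1f} rad/s")

# The lower start collapses to 0 and the higher start runs all the way up to the
# field speed, which is exactly the "opposite of self-stabilizing" behaviour I described.
```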
None of that is right, so I must be messing something up about how back-emf works in an induction motor. I think the inverted relationship, where the rotor speeding up means less flux gets cut, is also tripping me up...
Thank you so much for any help with a layman's quest for understanding!