3

Sorry, I don't know the appropriate terms for this subject!

I have an induction motor with 3 different speeds (windings) at 50 Hz:

350 rpm / 700 rpm / 970 rpm

I want to drive it with a VFD; the speed I want is around 1000 rpm.

When I tested it, I found these results (more or less):

350 rpm winding ==> 1000 rpm @ 140 Hz, ~4 amps

700 rpm winding ==> 1000 rpm @ 71 Hz, ~4 amps

970 rpm winding ==> 1000 rpm @ 51 Hz, ~7 amps or more

In all cases the motor runs under its normal load without problems.

My questions are:

1. Which one is best for the motor, the VFD, and power consumption?

2. Doesn't the higher frequency affect the motor?

[motor nameplate image]

Shaydzmi

3 Answers

2

At or below base speed, a motor operates as a constant-torque device. When you use a VFD to run a motor above base speed, it operates as a "constant power" device, because the torque it can produce depends on the ratio of voltage to frequency (V/Hz), and the VFD cannot create more voltage than its supply provides. So as frequency rises above base, the V/Hz ratio drops, and so does the torque. With the 970 RPM winding forced to run at 1000 RPM by raising only the frequency to 51.5 Hz, the torque the motor can deliver drops to about 97% of rated, which is basically still within the "normal" tolerance range (assuming ±10%).

But if you use the 700 RPM connection and raise the frequency to 71.4 Hz to get 1000 RPM, the shaft torque drops to 70% of rated, and if you use the 350 RPM connection at 142.9 Hz, the motor has only 35% of rated torque at best. In reality, the usable torque in both of these scenarios will be even LESS, because losses in the motor increase and cause it to overload faster than normal. To make matters worse, the PEAK torque, called the "breakdown torque," which the motor relies on to re-accelerate after a change in loading, falls with the SQUARE of the change in the V/Hz ratio. So the likelihood of stalling the motor when a load is applied rises sharply.
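The derating described above can be sketched numerically. This is a minimal Python sketch; the only inputs assumed are the 50 Hz base frequency and the run frequencies discussed in this answer:

```python
# Above base frequency a VFD holds output voltage constant, so the V/Hz
# ratio falls as base_hz / run_hz. Continuous torque falls with that
# ratio; breakdown (peak) torque falls with its square.
def torque_derating(base_hz, run_hz):
    ratio = base_hz / run_hz
    return ratio, ratio ** 2  # (continuous fraction, breakdown fraction)

for run_hz in (51.5, 71.4, 142.9):
    cont, peak = torque_derating(50.0, run_hz)
    print(f"{run_hz:5.1f} Hz: torque {cont:.0%} of rated, breakdown {peak:.0%}")
```

This reproduces the 97% / 70% / 35% continuous-torque figures, and also shows the breakdown torque collapsing to roughly 94% / 49% / 12%.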

Use the highest speed.

JRaef
1

According to the nameplate, the 700 RPM connection has the most torque per amp. Your results for the 970 RPM connection seem like there may have been a VFD setup error. I suspect that the 700 RPM connection is best. The lowest operating current will be best for the motor. A frequency that is not so much above 50 Hz and a lower current should be best for the VFD.

I neglected to consider that the VFD likely cannot maintain output voltage proportional to frequency above 50 Hz. That will result in increased slip and motor current when operating at 71 Hz. The question doesn't indicate whether the 4 amp test current is the motor current or the VFD input current. If it is the motor current, I see no reason not to operate the motor at 71 Hz under the load conditions that you have. If you did not determine the motor current, you should do that. If the VFD provides that information, that is probably the best way to check it.

The 970 RPM connection should provide the most torque from the motor. However, if the 4 amp test current is the motor current, the 700 RPM connection seems to result in the lowest percentage of rated winding current.

1

To obtain the 3 speeds, the motor would be configured as a 16 pole, 8 pole or 6 pole motor with synchronous speeds of 375, 750 or 1000 RPM.
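Those pole counts follow from the standard synchronous-speed formula n = 120·f/p. A quick Python check, using the pole counts from the paragraph above:

```python
# Synchronous speed of an AC machine: n_sync = 120 * f / poles (RPM).
def sync_rpm(freq_hz, poles):
    return 120 * freq_hz / poles

for poles in (16, 8, 6):
    print(f"{poles:2d} poles @ 50 Hz -> {sync_rpm(50, poles):.0f} RPM")
# -> 375, 750 and 1000 RPM
```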

Run direct on mains, the torque would be 1.5 kgm at 375 RPM, 2.3 kgm at 750 RPM and 2.4 kgm at 1000 RPM.

When run through a VFD at 1000 RPM, the low-speed configuration would generate a torque of 0.6 kgm, the mid-speed one 1.8 kgm and the high-speed one 2.4 kgm.
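The VFD figures can be approximated from the direct-on-line torques. This is a sketch under the assumption of constant-power (field-weakening) operation above base speed, where torque scales roughly as base_rpm / target_rpm; the kgm values are the ones quoted above:

```python
# Above base speed (constant-power region) torque scales roughly as
# base_rpm / target_rpm; at or below base speed it is unchanged.
dol_torque_kgm = {375: 1.5, 750: 2.3, 1000: 2.4}  # direct-on-line torques quoted above
target_rpm = 1000

for base_rpm, torque in dol_torque_kgm.items():
    scaled = torque * min(1.0, base_rpm / target_rpm)
    print(f"{base_rpm:4d} RPM winding at {target_rpm} RPM: ~{scaled:.1f} kgm")
```

The results agree with the figures above to within the rounding of the quoted nameplate values.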

Which configuration to choose would be decided by the torque required for the new application.

If a speed of exactly 1000 RPM is not critical, running direct on mains at 970 RPM would be the better option.

vu2nan