Most engineers recognize that gearheads can be used to create a better inertia match between the object being moved and the motor driving it. Put simply, the match is dictated by the gearhead ratio: the object's inertia reflected to the motor is reduced by a factor of 1/ratio².
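As a minimal sketch of that relationship (the function name and example numbers are illustrative, not from the article):

```python
def reflected_inertia(load_inertia, ratio):
    """Load inertia as seen by the motor through a gearhead of the given ratio.

    The reflected inertia scales by 1/ratio**2, so a 10:1 gearhead
    reduces the inertia the motor sees by a factor of 100.
    """
    return load_inertia / ratio ** 2

# Illustrative numbers: a load of 200 units through a 10:1 gearhead
# looks like 2 units of inertia at the motor shaft.
print(reflected_inertia(200.0, 10))
```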
So what should the correct inertia be?
Typically, the rule of thumb for a motion control application says the object inertia should be no greater than five times the motor inertia (a 5:1 ratio). This delivers very stiff and controllable motion.
However, reaching this ratio can be difficult, mainly because a higher gearhead ratio demands more speed from the motor, which may not be available or may exceed the input speed limit of the gearhead. A higher gearhead ratio is also typically costlier because of the extra gears that must be incorporated into it.
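The trade-off above can be sketched numerically: the smallest ratio that meets a target inertia match follows from load_inertia / N² ≤ target × motor_inertia, and the resulting motor speed is the load speed times that ratio. All names and numbers here are hypothetical, assuming consistent inertia units:

```python
import math

def ratio_for_inertia_match(load_inertia, motor_inertia, target=5.0):
    """Smallest gearhead ratio N satisfying load_inertia / N**2 <= target * motor_inertia."""
    return math.sqrt(load_inertia / (target * motor_inertia))

def required_motor_speed(load_speed_rpm, ratio):
    """Motor (gearhead input) speed needed to turn the load at load_speed_rpm."""
    return load_speed_rpm * ratio

# Hypothetical example: a 500-unit load on a 1-unit motor needs a 10:1
# gearhead to hit a 5:1 reflected inertia ratio; driving the load at
# 300 rpm then requires 3000 rpm at the motor, which must be checked
# against the motor's speed range and the gearhead's input speed limit.
n = ratio_for_inertia_match(load_inertia=500.0, motor_inertia=1.0, target=5.0)
rpm = required_motor_speed(load_speed_rpm=300.0, ratio=n)
print(n, rpm)
```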
So here are some useful insights from our application engineering experts on what you need to know about gearhead and servo motors:
In order to optimize motor control, the design engineering goal is to create a solution where the object inertia to motor rotor inertia ratio is as close to 1:1 as reasonable. This rule is not absolute and several factors impact it, including load distribution, acceleration/deceleration rates, settling time requirements, friction content, and motor coupling technology.
These general guidelines assume the motor can provide adequate torque for the load and move requirements. If the motor sizing is marginal, you may experience stability and tuning issues regardless of the inertia ratio. Problems may also surface later as the mechanics wear and the loading characteristics slowly change over time.
Special thanks to the Parker Hannifin Electromechanical and Drive application engineering team for sharing their established guidelines for inertia matching.
Article contributed by Jeff Nazzaro, gearhead and motor product manager, Electromechanical and Drives Division, Parker Hannifin Corporation.