On the other hand, when the motor inertia is larger than the load inertia, the motor needs more power than the application actually requires. This raises costs in two ways: you pay more for a motor that is larger than necessary, and the increased power consumption drives up operating costs. The solution is to use a gearhead to match the inertia of the motor to the inertia of the load.
Recall that inertia is a measure of an object’s resistance to change in its motion and is a function of the object’s mass and shape. The greater an object’s inertia, the more torque is needed to accelerate or decelerate it. This means that when the load inertia is much larger than the motor inertia, it can cause excessive overshoot or increase settling times. Both conditions decrease production line throughput.
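The torque-to-inertia relationship above can be sketched with the basic rotational form of Newton’s second law, T = J·α. The numbers below are hypothetical, chosen only for illustration:

```python
import math

def required_torque(inertia_kgm2, delta_rpm, accel_time_s):
    """Torque (Nm) needed to change speed by delta_rpm in accel_time_s.

    T = J * alpha: the larger the inertia J, the more torque is needed
    for the same acceleration.
    """
    alpha = (delta_rpm * 2 * math.pi / 60) / accel_time_s  # rad/s^2
    return inertia_kgm2 * alpha

# Hypothetical load: 0.01 kg*m^2 accelerated to 600 rpm in 0.5 s
print(round(required_torque(0.01, 600, 0.5), 3))  # 1.257 Nm
```

Doubling either the inertia or the required acceleration doubles the torque demand, which is why a large load inertia drives up motor size.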
Inertia Matching: Today’s servo motors produce more torque relative to frame size, thanks to dense copper windings, lightweight materials, and high-energy magnets. This creates greater inertial mismatches between servo motors and the loads they are trying to move. Using a gearhead to better match the inertia of the motor to the inertia of the load allows the use of a smaller motor and results in a more responsive system that is easier to tune. Again, this is accomplished through the gearhead’s ratio, where the reflected inertia of the load at the motor is reduced by 1/ratio^2.
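The 1/ratio^2 relationship can be shown in a few lines. The load inertia value here is an assumed figure, not one from the article:

```python
def reflected_inertia(load_inertia, ratio):
    """Load inertia as seen by the motor through a gearhead.

    The gearhead reduces the reflected inertia by a factor of ratio^2.
    """
    return load_inertia / ratio**2

# Hypothetical 0.02 kg*m^2 load through a 10:1 gearhead:
print(reflected_inertia(0.02, 10))  # 0.0002 kg*m^2 -- a 100x reduction
```

A 10:1 ratio thus cuts the reflected inertia a hundredfold, which is what lets a much smaller motor achieve a good inertia match.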
As servo technology has evolved, with manufacturers creating smaller yet more powerful motors, gearheads have become increasingly essential partners in motion control. Finding the optimal pairing requires taking many engineering considerations into account.
So how does a gearhead go about providing the power required by today’s more demanding applications? It all goes back to the fundamentals of gears and their ability to change the magnitude or direction of an applied force.
The gears and the number of teeth on each gear create a ratio. If a motor can generate 20 in-lbs. of torque, and a 10:1 ratio gearhead is attached to its output, the resulting torque will be close to 200 in-lbs. With the ongoing emphasis on developing smaller footprints for motors and the equipment they drive, the ability to pair a smaller motor with a gearhead to achieve the desired torque output is invaluable.
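The torque multiplication above is ratio times motor torque, less gearing losses, which is why the article says "close to" 200 in-lbs. The 95% efficiency below is an assumed figure for illustration, not a value from the article:

```python
def output_torque(motor_torque, ratio, efficiency=0.95):
    """Approximate gearhead output torque.

    Ideal output is motor_torque * ratio; real gearheads lose a few
    percent to friction, so we scale by an assumed efficiency.
    """
    return motor_torque * ratio * efficiency

# The article's example: 20 in-lbs. through a 10:1 gearhead
print(output_torque(20, 10))  # 190.0 in-lbs., close to the ideal 200
```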
A motor could be rated at 2,000 rpm, but your application may only require 50 rpm. Trying to run the motor at 50 rpm may not be optimal for the following reasons:
1. If you are running at a very low speed, such as 50 rpm, and your motor feedback resolution is not high enough, the update rate of the digital drive may cause a velocity ripple in the application. For instance, with a motor feedback resolution of 1,000 counts/rev you have a measurable count every 0.36 degrees of shaft rotation. If the digital drive controlling the motor has a velocity loop of 0.125 milliseconds, then at 50 rpm (300 deg/sec) the shaft rotates only 0.0375 degrees per update, so the drive will frequently look for a measurable count and not find one. When it does not see that count, it speeds up the motor rotation to find it. By the time it finds the next measurable count, the rpm has become too fast for the application, so the drive slows the motor back down to 50 rpm and the whole process starts over again. This continuous increase and decrease in rpm is what causes velocity ripple in an application.
2. A servo motor running at low rpm operates inefficiently. Eddy currents are loops of electrical current that are induced within the motor during operation. These eddy currents produce a drag force within the motor and have a greater negative effect on motor performance at lower rpm.
3. An off-the-shelf motor’s parameters may not be ideally suited to running at low rpm. When an application runs such a motor at 50 rpm, it essentially is not using most of its available rpm. Because the voltage constant (V/Krpm) of the motor is set for a higher rpm, the torque constant (Nm/amp), which is directly related to it, is lower than it needs to be. Consequently, the application requires more current to drive the motor than it would with a motor specifically designed for 50 rpm.
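The arithmetic behind the velocity-ripple example in item 1 can be checked in a few lines, using the article's figures (1,000 counts/rev feedback, a 0.125 ms velocity loop, and a 50 rpm target):

```python
# Sketch of the velocity-ripple arithmetic from item 1 above.
counts_per_rev = 1000
loop_time_s = 0.125e-3   # 0.125 ms velocity-loop update
rpm = 50

deg_per_count = 360 / counts_per_rev        # 0.36 deg between measurable counts
deg_per_sec = rpm * 360 / 60                # 300 deg/sec at 50 rpm
deg_per_update = deg_per_sec * loop_time_s  # 0.0375 deg per loop update

# The shaft moves much less than one count per update, so most updates
# see no new count and the drive hunts, producing velocity ripple.
updates_per_count = deg_per_count / deg_per_update
print(updates_per_count)  # 9.6 updates between measurable counts
```

Roughly ten velocity-loop updates pass between feedback counts, which is the mismatch that drives the speed-up/slow-down hunting described above.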
A gearhead’s ratio reduces the motor rpm, which is why gearheads are sometimes called gear reducers. Using a gearhead with a 40:1 ratio, the rpm at the input of the gearhead will be 2,000 rpm and the rpm at the output will be 50 rpm. Running the motor at the higher rpm lets you avoid the issues described in bullets 1 and 2. As for bullet 3, the mechanical advantage of the gearhead allows the design to draw less torque and current from the motor.
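The two effects of the ratio, dividing speed and multiplying torque, can be sketched together. The 20 in-lbs. motor torque below is an assumed figure carried over from the earlier example; losses are ignored:

```python
def through_gearhead(motor_rpm, motor_torque, ratio):
    """Speed and torque at the output of an ideal (lossless) gearhead.

    Speed is divided by the ratio; torque is multiplied by it.
    """
    return motor_rpm / ratio, motor_torque * ratio

# The article's 40:1 example: 2,000 rpm in, 50 rpm out.
out_rpm, out_torque = through_gearhead(2000, 20, 40)
print(out_rpm, out_torque)  # 50.0 rpm, 800 in-lbs.
```

The motor keeps running near its efficient rated speed while the load sees the low rpm and high torque the application needs.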