Bearings are a vital component in rotating industrial equipment, and they often represent a significant capital investment. To maintain that investment and avoid costly downtime, it is important to preserve the conditions that help keep a bearing performing properly. One condition that affects bearing performance is temperature—excessive heat generation is often a leading cause of bearing performance problems. However, proper bearing selection for defined applications and appropriate lubrication can minimize excessive heat generation to help keep industrial equipment running and reduce potential downtime.
Most bearing manufacturers aim to engineer a lower operating temperature into the geometric design of a bearing by controlling the form and tolerance of the interacting friction surfaces. However, a manufacturer’s efforts to lower operating temperatures through innovative design may go unnoticed if installers do not begin by choosing a bearing suitable to the application and its surrounding environment, or if the lubrication practices employed after installation are not aligned with the bearing system.
With so many factors to consider, it is important for engineers to understand why certain bearings are suitable for heavy load applications, what type of performance can be expected from a bearing, and the impact of temperature on a bearing’s performance. When this list of criteria is met and the correct bearing is chosen, the result can be an improvement in efficiency, which minimizes energy loss and lowers total operating cost.
Suitable Bearings
To accomplish reliable equipment operation and associated bearing performance, it is essential to select the proper bearing for an application and its environment. Among the most challenging bearing applications subject to heat generation are power generation, wind turbines, gear drives, and rolling mills. These applications demand a bearing with engineered characteristics that can accommodate dynamic misalignment and slide-roll relationships, all while maintaining a lower operating temperature (see Figure 1).
Spherical roller bearings (SRBs) are often the most suitable bearings for these applications, as they are engineered to have a maximum load capacity and to support combinations of radial and axial loading, even under significant dynamic misalignment conditions. However, many factors influence how well a bearing will perform in any given application, which is why the proper bearing selection process should be followed.
Bearing selection is the process of evaluating the suitability of bearings for specific industrial applications. While many bearing types exist to meet gear application performance needs, just as many mechanical factors exist to influence the best choice of bearing for specific power transmission systems. Rolling element bearing types are defined by the shape of the rolling element, ranging from needle and ball to tapered and spherical.
Characteristics such as stiffness or allowable deflection, lubrication requirements, static and dynamic misalignment, speed capability, load capability, and desired service life or durability all factor into choosing the best bearing for the application.
In any given system, rolling element bearings inherently manage broad ranges of speed and many combinations of radial and axial loads. However, surrounding environmental conditions such as low and/or high temperature, dust and dirt, moisture, and unusual mounting conditions can adversely affect a bearing’s performance. Therefore, it is important to consider both the power transmission system mechanical and environmental factors that may significantly impact the bearing choice and its performance.
The first step in bearing selection is identifying the proper roller element type, whether it is a ball, needle, cylindrical, spherical, or tapered roller bearing. Each bearing type has advantages and disadvantages that are design-specific and will affect such things as the load and speed the bearing can tolerate in the application, as well as the predicted durability and service life.
The size constraints of the bearing envelope or available space must then be addressed in the second step of the bearing selection process. This is done by considering the minimum shaft diameter, maximum housing bore, and available width in the bearing application. After the bearing envelope is defined, the selection options should be limited to bearings with bores, outer diameters, and widths that will fit in the bearing envelope. There may be several bearings with different load-carrying capacities available that fit in the envelope. Once the suitable bearing choices are defined according to the parameters required for installation, the third step is to determine which of these bearings is best able to produce the desired predicted-service life in the application by performing a bearing-system-life analysis (see Figure 2 & 3).
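Steps two and three can be sketched in code. The candidate catalog entries, dimensions, and load below are hypothetical, and a full bearing-system-life analysis involves far more than this; the sketch uses only the standard basic rating-life formula L10 = (C/P)^p × 10^6 revolutions (with p = 10/3 for roller bearings, per ISO 281) as a stand-in for the life comparison.

```python
# Sketch of bearing selection steps 2 and 3: filter hypothetical catalog
# entries by the available envelope, then rank by basic L10 rating life.
# All catalog numbers below are invented for illustration.

CANDIDATES = [
    # (name, bore mm, OD mm, width mm, dynamic rating C in kN)
    ("SRB-A", 100, 180, 46.0, 450.0),
    ("SRB-B", 100, 180, 60.3, 560.0),
    ("SRB-C", 110, 200, 69.8, 710.0),
]

def l10_revolutions(c_kn, p_kn, exponent=10.0 / 3.0):
    """Basic rating life in revolutions: L10 = (C/P)^p * 1e6 (roller bearings)."""
    return (c_kn / p_kn) ** exponent * 1e6

def select(min_bore, max_od, max_width, equiv_load_kn):
    # Step 2: keep only bearings that fit the defined envelope.
    fits = [b for b in CANDIDATES
            if b[1] >= min_bore and b[2] <= max_od and b[3] <= max_width]
    # Step 3: of those, choose the bearing with the longest predicted life.
    return max(fits, key=lambda b: l10_revolutions(b[4], equiv_load_kn))

best = select(min_bore=100, max_od=200, max_width=65, equiv_load_kn=120.0)
print(best[0], f"{l10_revolutions(best[4], 120.0):.3e} revolutions")
```

In practice the ranking step would be replaced by a full system-life analysis accounting for misalignment, lubrication, and load spectrum, as the article describes.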
The key contributors to high performance and bearing life in industrial applications include ability to carry combination radial and axial loads, high load capacity, speed, and misalignment capability; SRBs are specifically engineered to handle such conditions.
Finally, the bearing selection is completed once the bearing design options are defined and satisfied. Bearing design options include cage type, bearing flange configurations (cylindrical and tapered roller bearings), radial internal clearance (RIC) or setting, and precision level and lubrication. These options are selected based on the application’s speed, temperature, mounting and loading conditions, and will help achieve optimum bearing performance and life.
Spherical Roller Bearing in a Gear Application
Gear applications often place great demands on a bearing system. However, spherical roller bearings are best suited to applications that carry significant radial or axial load, such as spur, single-helical, and double-helical geared systems that generate combined loads. They also offer speed capability comparable to that of other bearing types, such as tapered roller bearings.
Typically there are two cage types to consider when defining the bearing design options for an SRB in a gear drive (see Figure 4 & 5). The pressed steel cage offers a general-purpose design with high-strength characteristics that are often suitable for use in small to medium horsepower gear systems. For applications that are more aggressive from a marginal lubrication or inertial perspective, a machined brass cage provides geometry better suited to roller interaction. Such applications include large horsepower gear systems, or systems that operate in extreme conditions. In addition, the design aspects of the brass cage and its positive impact on lubrication may make the bearing more favorable in high temperature environments and in applications where the potential for contamination is high.
Bearing RIC Selection
SRBs selected for a gear-drive system require specific features that must be considered in the installation and mounting arrangement. After determining a suitable cage, a second bearing design option or specific feature that must be determined is the optimum radial internal clearance. Defining the RIC is particularly important in an SRB system because an SRB is a unitized design for which the outer ring is easily misaligned relative to the inner ring, cage, and roller assembly. Only one component (either inner ring or outer ring) can be interference-fit with either the shaft or housing to prevent relative motion, fretting wear particles, or additional heat in operation.
Typically, the bearing component experiencing relative rotation or a rotating load zone is interference fitted to its supporting surface. The majority of applications have a rotating shaft with a rotating inner-ring load zone, so the inner ring is interference fitted with the shaft OD. Other applications, particularly planetary gear systems, may have either rotating outer rings or rotating inertial forces that dictate an interference fit on the outer ring. The effect of this interference fit on the inner or outer ring is to reduce the mounted bearing RIC relative to the unmounted RIC of the purchased assembly.
When the correct component has been identified for an interference fit, the other should be installed with a transition or positive loose fit. This facilitates installation and provides axial float in applications that employ multiple bearings on a shaft in a fixed-float arrangement. Sufficient provision for displacement or float must be made; otherwise, parasitic axial loading of one bearing against the other may occur.
If the float bearing cannot adjust and move axially to relieve this condition, the operating temperatures can increase and continue to amplify the condition, possibly producing premature bearing damage or a thermal runaway situation.
The unmounted RIC selected for an SRB is dependent on the mounting (interference fits), the operating loads, operating speeds, and the lubrication employed. These combine to directly affect the system heat generation. Generally, the greater the predicted or confirmed heat generation, the larger the selected unmounted RIC.
The heat generated and associated thermal gradients present in most applications produce inner-ring operating temperatures which are greater than outer-ring operating temperatures. This differential is usually significant enough that the net impact is a reduction in bearing operating RIC versus the installed or mounted RIC.
SRBs do not achieve thermally stable conditions if operating with a negative RIC or radial preload, unless a very controlled lubrication system is utilized to remove all excess heat generated. Consequently, the normal SRB unmounted RIC will be selected so that it is greater than the sum of RIC lost due to interference fits and thermal gradients for all combined operating conditions. This will help ensure that the operating RIC will have a value which is greater than or equal to zero in the thermally stable condition, producing the best balance of rolling element load zone for stiffness and load capability in a thermally stable environment.
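The RIC bookkeeping described above amounts to simple arithmetic, sketched below under common first-order assumptions: roughly 80% of the shaft interference expands the inner-ring raceway (solid steel shaft), and the inner/outer ring temperature differential removes clearance by approximately α·ΔT·d_m. The numeric inputs are hypothetical, not from any catalog.

```python
# First-order sketch of operating RIC for an SRB, per the discussion above.
# Approximations (illustrative, not a design method): ~80% of the shaft
# interference expands the inner-ring raceway, and the inner/outer ring
# temperature differential removes clearance by roughly
# alpha * delta_T * mean bearing diameter.

ALPHA_STEEL = 12e-6  # /degC, linear expansion coefficient of bearing steel

def operating_ric(unmounted_ric_mm, shaft_interference_mm,
                  ring_delta_t_degc, mean_diameter_mm,
                  fit_expansion_factor=0.8):
    fit_loss = fit_expansion_factor * shaft_interference_mm
    thermal_loss = ALPHA_STEEL * ring_delta_t_degc * mean_diameter_mm
    return unmounted_ric_mm - fit_loss - thermal_loss

ric = operating_ric(unmounted_ric_mm=0.130, shaft_interference_mm=0.050,
                    ring_delta_t_degc=10.0, mean_diameter_mm=140.0)
print(f"operating RIC ~ {ric:.4f} mm")  # must remain >= 0 per the text
```

Selecting the unmounted RIC so this result stays at or above zero across all combined operating conditions is precisely the criterion stated above.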
Bearing Environment and the Influences of Temperature
As previously stated, simply choosing the right bearing is not enough. It is equally important to maintain a bearing’s performance with proper lubrication and monitoring. In any power transmission system where motion is involved, the friction generated translates into heat flux, producing a change in temperature or a thermal gradient.
The gear contacts and bearings generate heat, which affects not only the mechanical elements of the systems, but also interacts with the lubrication scheme and the surrounding environment to create a stable, thermally balanced system. The resultant stabilized operating temperatures for the gear and bearing positions determine the effectiveness of the lubrication and the impact on bearing and gear function, whether positive or negative (see Figure 6).
Successful bearing and gear performance is dependent upon generation of sufficient lubricant film thickness to separate the rolling/sliding contact surfaces. The film thickness that can develop in these contacts is an exponential function of the lubricant viscosity at the contact operating temperature. As the operating temperature increases, the viscosity decreases, having a negative impact on the lubrication film thickness—making it thinner.
Operating temperatures that generate lubrication films less than the minimum required will have a negative impact on performance by allowing surface asperity contact or interaction. This condition can result in an operating system that is unstable and may run at elevated temperatures. In this situation, performance falls below predictions.
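The temperature → viscosity → film-thickness chain can be illustrated numerically. This is a minimal sketch assuming a simple Reynolds-type exponential viscosity-temperature model and the elastohydrodynamic proportionality h ∝ η^0.7; the coefficient β and the reference values are illustrative, not taken from any lubricant datasheet.

```python
import math

# Sketch of the temperature -> viscosity -> film-thickness chain described
# above. Assumptions: Reynolds-type exponential viscosity-temperature model,
# and EHL film thickness scaling as eta**0.7. Parameter values are
# illustrative only.

def viscosity(temp_c, eta_40c=0.10, beta=0.03):
    """Dynamic viscosity in Pa*s; falls exponentially as temperature rises."""
    return eta_40c * math.exp(-beta * (temp_c - 40.0))

def relative_film_thickness(temp_c, exponent=0.7):
    """Film thickness relative to its value at 40 degC (h ~ eta**exponent)."""
    return (viscosity(temp_c) / viscosity(40.0)) ** exponent

for t in (40, 70, 100):
    print(f"{t:3d} degC: relative film thickness {relative_film_thickness(t):.2f}")
```

Even with these rough assumptions, the trend the article describes is clear: each step up in operating temperature thins the film, and once it drops below the minimum required, asperity contact begins.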
The choice of lubricant for a bearing system greatly impacts the operating temperature. A significant proportion of bearing heat generation during operation is produced by viscous interaction of the cage and rolling elements with the lubricant. The magnitude of this component of heat generation is dependent upon the lubricant viscosity and the operating speed, which combine to have an exponential impact on heat generation. Care should be taken to ensure the lubricant viscosity is sufficient to produce the desired minimum lubricant film thickness at the bearing operating temperature, but is not excessive. In general, more viscosity is good, but too much viscosity will produce substantial increases in heat generation and result in increased operating temperature, which cancels any positive benefits. If a bearing temperature issue is identified, it is important to review the original equipment manufacturer's lubrication specifications to confirm they are being followed and are appropriate for the current equipment operating conditions.
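The viscosity-speed interaction can be illustrated with Palmgren's classical load-independent friction torque estimate, M0 = 10⁻⁷ · f0 · (νn)^(2/3) · dm³ (valid for νn ≥ 2000, with ν in mm²/s, n in rpm, dm in mm, M0 in N·mm). The factor f0 depends on bearing type and lubrication method; the value and inputs below are illustrative.

```python
import math

# Illustration of the viscosity/speed interaction in viscous heat generation,
# using Palmgren's load-independent friction torque estimate:
#   M0 = 1e-7 * f0 * (nu * n)**(2/3) * dm**3   (valid for nu*n >= 2000)
# nu in mm^2/s, n in rpm, dm in mm, M0 in N*mm. The factor f0 below is
# illustrative; it varies with bearing type and lubrication method.

def viscous_torque_nmm(nu_mm2s, n_rpm, dm_mm, f0=4.0):
    assert nu_mm2s * n_rpm >= 2000, "formula valid only for nu*n >= 2000"
    return 1e-7 * f0 * (nu_mm2s * n_rpm) ** (2.0 / 3.0) * dm_mm ** 3

def heat_watts(torque_nmm, n_rpm):
    # Power = torque * angular speed; convert N*mm and rpm to W.
    return torque_nmm * 1e-3 * n_rpm * 2.0 * math.pi / 60.0

# Doubling the oil viscosity raises the viscous loss by 2**(2/3), ~59%.
for nu in (32.0, 64.0):
    m0 = viscous_torque_nmm(nu, n_rpm=1500, dm_mm=140)
    print(f"nu={nu:5.1f} mm^2/s: M0={m0:7.1f} N*mm, "
          f"heat={heat_watts(m0, 1500):6.1f} W")
```

The (νn)^(2/3) term makes the trade-off concrete: viscosity must be high enough to build the required film, but every further increase is paid for directly in churning heat.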
The environmental temperature around the power transmission system is equally important since it establishes the stabilized base temperature for the system. As the environmental temperature increases, the system operating temperatures may have to increase to maintain a thermal balance. As previously noted, increases in operating temperature have negative impacts on lubricant viscosity and film thickness.
Conversely, extreme reductions in ambient environmental temperatures may produce dramatic increases in lubricant viscosity and associated heat generation from excessive viscous interactions. This can lead to excessive lubricant film thickness, increased sliding, and potentially excessive operating temperatures.
Each bearing type has unique geometric and manufactured features that influence the amount of friction occurring during application operation. This, combined with prior discussions about application influences on heat generation, underscores the fact that a power transmission system is a complex assembly that has an interactive relationship with its surrounding environment. The system seeks to establish thermal equilibrium with stabilized component operating temperatures. The greater the system complexity, the more important it is to assess the operating application characteristics, select the proper bearing type and features, and optimize the lubricant type, selection, and system.
When a bearing system is selected, installed, and maintained to run at lower operating temperatures, the results should be lower operating costs, improved reliability, increased uptime, and less energy loss. The time and resources spent assessing application needs, selecting the correct bearing types and modifications, and determining the impact of surrounding environments can lead to enhanced system performance and total cost savings.