A number of testing methods can be used to determine the performance characteristics of a lubricating oil. Two common tests that have endured over time are flash point and volatility. Although the methods, technology and practices have changed through the years, both of these tests are still utilized today and provide viable ways to assess new and used oils.
Flash Point
The flash point test dates back to the mid-19th century as one of the earliest identifiers of an oil’s physical properties. It was originally used to determine the fire hazards of fuels and oils being stored and transported.
The flash point test measures the tendency of an oil to form a flammable mixture with air. As the oil sample is heated, a small flame is passed over the headspace, and ignition is the determining point. The lowest temperature at which the oil’s vapors ignite or flash is recorded as its flash point. If heating is continued beyond this point, the oil’s fire point can also be obtained. The fire point is the temperature at which ignition is sustained for at least five seconds.
Several methods can be employed to determine an oil’s flash point, with the choice depending largely on the fluid’s viscosity. Among the available ASTM tests is ASTM D56, Flash Point by Tag Closed Cup Tester. It is used for fluids with viscosities below 5.5 centistokes (cSt) at 40 degrees C (104 degrees F) or below 9.5 cSt at 25 degrees C (77 degrees F), and with flash points below 93 degrees C (200 degrees F).
ASTM D93, Flash Point by Pensky-Martens Closed Cup Tester, is used for petroleum products with a temperature range of 40 to 360 degrees C (104 to 680 degrees F) and biodiesel with a temperature range of 60 to 190 degrees C (140 to 374 degrees F).
ASTM D92, Flash and Fire Points by Cleveland Open Cup, is another option for obtaining an oil’s flash point. Although the technology has evolved, the open and closed cup tests of today closely resemble the practices of more than 100 years ago. While often seen as a precursory test of new oil, flash point can also be utilized in used oil analysis to detect fuel dilution, base oil cracking and contamination.
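As a rough illustration, the applicability limits quoted above can be expressed as a simple selection rule. The Python sketch below is only illustrative; the function name and structure are hypothetical, and real method selection should follow the full scope statements in the ASTM standards.

```python
def suggest_flash_point_method(visc_40c_cst=None, visc_25c_cst=None,
                               expected_flash_c=None):
    """Illustrative selector based on the applicability limits quoted above."""
    # ASTM D56 (Tag Closed Cup): low-viscosity fluids with flash points below 93 C.
    if expected_flash_c is not None and expected_flash_c < 93:
        if (visc_40c_cst is not None and visc_40c_cst < 5.5) or \
           (visc_25c_cst is not None and visc_25c_cst < 9.5):
            return "ASTM D56 (Tag Closed Cup)"

    # ASTM D93 (Pensky-Martens Closed Cup): petroleum products from 40 to 360 C.
    if expected_flash_c is not None and 40 <= expected_flash_c <= 360:
        return "ASTM D93 (Pensky-Martens Closed Cup)"

    # ASTM D92 (Cleveland Open Cup) is a common choice for lubricating oils.
    return "ASTM D92 (Cleveland Open Cup)"


# Hypothetical example: a thin fluid expected to flash around 60 degrees C.
print(suggest_flash_point_method(visc_40c_cst=4.0, expected_flash_c=60))
# -> ASTM D56 (Tag Closed Cup)
```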
Volatility
The Noack volatility test was developed by Dr. Kurt Noack in the 1930s and first used in Europe. It was introduced as a way to measure the evaporation loss of lubricating oils. In 1984, Al Amatuzio began using the test in the United States to assess the performance of motor oils. Volatility testing became standard in North America in 1992 with the creation of the American Petroleum Institute (API) SH category and the International Lubricant Standardization and Approval Committee (ILSAC) GF-1 oils, which set the bar for the current standards in environmental emissions and fuel economy.
Volatilization is a term used to describe the “boiling off” of lighter molecules in fluids. It is closely related to oil consumption in automobile engines. The Noack test simulates the oil’s exposure to the high internal temperatures found at piston rings and cylinder walls.
Known as ASTM D5800, the Noack volatility test measures the evaporation loss of lighter oil molecules and additives at high temperatures. Depending on the procedure, a measured sample quantity is placed in a crucible or reaction flask and heated to 250 degrees C (482 degrees F) while a constant air flow is drawn through it for 60 minutes. Comparing the sample’s weight before and after the test determines the evaporation loss, which is reported as a percentage of the original mass.
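The loss itself is a simple before-and-after mass comparison. The short Python sketch below shows that arithmetic; the sample masses are made-up values used only to illustrate how an evaporative loss percentage is calculated.

```python
def noack_evaporative_loss_percent(mass_before_g, mass_after_g):
    """Evaporative loss expressed as a percentage of the original sample mass."""
    return (mass_before_g - mass_after_g) / mass_before_g * 100.0

# Hypothetical example: a 65.0 g sample weighs 55.9 g after 60 minutes at 250 C.
loss = noack_evaporative_loss_percent(65.0, 55.9)
print(f"Noack evaporative loss: {loss:.1f}%")  # -> 14.0%
```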
There are three different procedures for ASTM D5800: Procedure A, which uses the Noack evaporative testing equipment; Procedure B, which employs the automated non-Wood’s metal Noack evaporative apparatus; and Procedure C, which utilizes the Selby-Noack volatility test equipment.
Procedure A was first introduced in the 1930s and uses a toxic alloy known as Wood’s metal for sample heating. Wood’s metal, also called Lipowitz’s alloy, contains bismuth, lead, tin and cadmium. The toxicity comes from the lead and cadmium.
The Selby-Noack test was developed in the mid-1990s by Theodore Selby and his colleagues using a noble metal heater. It eliminates the need for Wood’s metal and collects the evaporated material for later analysis. This is particularly useful for identifying elements such as phosphorus, which is known to lead to premature failure of catalyst systems.
Volatility testing plays an important role in engine lubrication, where high temperatures are common. Evaporation losses show up as increased oil consumption and the need for more frequent top-ups. They can also change the oil’s properties, as additives may evaporate during the volatilization process.
As lighter molecules “burn off” or evaporate, heavier molecules remain, causing a shift in the fluid’s viscosity. The heavier or “thicker” oil left behind can reduce fuel economy through added viscous drag and can contribute to poor oil circulation throughout the engine, greater oil consumption, higher wear rates and increased emissions.
Tests That Serve a Purpose
Flash point and volatility tests each serve a purpose, just under different conditions. They are also related. After all, for an oil to reach its flash point, it must first volatilize. While the flash point tells you very little about an oil’s volatility, an oil’s volatility can tell you a lot about its flash point. Volatility testing has shown that better base oils bring improved emissions and fuel economy.
Keep in mind that synthetic lubricants generally have higher flash points and do not begin to evaporate until a much higher temperature is reached. On the other hand, mineral oils may start to vaporize much earlier than their flash points. If you are dealing with hazardous conditions, a flash point test is a staple that simply must be conducted.
84% of lubrication professionals do not conduct flash point or volatility testing to assess new or used oil, based on a recent poll at MachineryLubrication.com