Calex normally recommends using a short-wavelength infrared temperature sensor to measure the temperature of metals, instead of a general-purpose type, because it is more accurate. But why is it more accurate? Here are the reasons.
It has been common practice for decades to measure high-temperature objects with short-wavelength infrared pyrometers. However, these sensors have traditionally been limited to high temperatures only. Every sensor has a lower limit to its temperature range, and on short-wavelength sensors this lower limit has, until recently, been too high for measuring reflective metals in common industrial applications, such as iron and steel rollers.
All surfaces emit infrared radiation. Infrared temperature sensors work by measuring the emitted infrared radiation, and converting this measurement into a meaningful temperature reading.
The amount of emitted energy depends on the temperature and emissivity of the surface. Non-metals typically have a high emissivity, and reflective metals such as iron and steel have a low emissivity.
At low temperatures, very little infrared radiation is emitted. For a given temperature, low-emissivity materials emit less infrared radiation than high-emissivity surfaces. The smaller the amount of detected radiation, the more difficult it is to achieve an accurate and stable temperature measurement.
Because the amount of measurable radiation is so small, it has traditionally been difficult to use an infrared temperature sensor to measure the temperature of reflective metals such as iron and steel at low temperatures.
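Both effects can be illustrated with Planck's law for grey-body radiation. The following is a minimal sketch using standard physics, not Calex's internal calculation; the example emissivity values (0.2 for reflective steel, 0.95 for a non-metal) and the 10 µm comparison wavelength are illustrative assumptions:

```python
import math

# Physical constants (CODATA values)
H  = 6.62607015e-34   # Planck constant, J*s
C  = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23     # Boltzmann constant, J/K

def spectral_radiance(wavelength_m, temp_k, emissivity=1.0):
    """Grey-body spectral radiance (Planck's law scaled by emissivity),
    in W per m^2 per steradian per metre of wavelength."""
    return (emissivity * 2.0 * H * C**2 / wavelength_m**5
            / (math.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0))

t = 45 + 273.15  # 45 degC, a typical low-end process temperature

# At this temperature, the signal at 2.2 um is a tiny fraction of the
# signal available to a long-wavelength (e.g. 10 um) sensor:
short = spectral_radiance(2.2e-6, t)
long_ = spectral_radiance(10.0e-6, t)

# And a low-emissivity steel surface emits far less than a
# high-emissivity non-metal at the same temperature and wavelength:
steel     = spectral_radiance(2.2e-6, t, emissivity=0.2)
non_metal = spectral_radiance(2.2e-6, t, emissivity=0.95)
```

The two effects compound: a cool, reflective metal produces very little signal at short wavelengths, which is why the lower limit of short-wavelength sensors has historically been so high.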
In general, the shortest possible measurement wavelength should be used for the most accurate measurement. In the past, it was not possible to measure low temperatures with short-wavelength sensors, so users were forced to measure low-temperature metals relatively inaccurately with long-wavelength, general-purpose sensors.
Measuring Low Temperatures at Short Wavelengths
Calex has developed technology in the PyroUSB that allows temperatures as low as 45°C to be measured at short wavelengths (2.0 to 2.6 µm) with improved accuracy compared to general-purpose long-wavelength sensors.
This sensor is used with great success to measure the temperature in pipe welding applications, and roller surfaces in the fabric, laminating, paper, corrugated board, plastic and tyre manufacturing industries, among many others.
There are two reasons for the improved accuracy at 2.2 µm:
The emissivity of reflective metals is usually higher at 2.2 µm.
No surface is perfectly smooth. Under a microscope, the surface will appear rougher as magnification increases. Even a polished metal surface that appears reflective to the naked eye will look relatively rough under a microscope.
If the wavelength of the IR radiation is larger than the peaks and troughs in the roughness of the surface, it will be less easily absorbed by the surface, and more easily reflected.
If the wavelength of the radiation is small enough to fit into the microscopic valleys in the surface of the material, it is more easily absorbed by the surface (so the surface is less reflective at that wavelength).
Infrared temperature sensors measure the emitted radiation, not the absorbed radiation. If the temperature of the target surface is stable (it is in thermodynamic equilibrium), then it is emitting the same amount of IR energy as it is absorbing. Therefore an effective absorber of IR radiation is also an effective emitter (it has a high emissivity).
A reflective metal surface will generally have a higher emissivity at short wavelengths than at long wavelengths.
The measurement error is smaller at 2.2 µm when there is an error in the emissivity setting.
In general, the shortest possible measurement wavelength should be used for the best accuracy. The shortest possible wavelength is limited by the lowest temperature that must be measured.
The measurement wavelength is included in the sensor’s calculation of the measured temperature. At short wavelengths, the emitted radiance changes very steeply with temperature, so an error in the emissivity setting (or a change in the target emissivity) has a smaller effect on the calculated temperature.
For example, the graph below shows the measurement error versus the measured temperature when using 2.2 µm and 8-14 µm sensors with a 10% error in their emissivity setting. There is a smaller measurement error when using a 2.2 µm sensor.
A “10% error” means, for example, a target emissivity of 0.30 and an emissivity setting of 0.33.
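This effect can be sketched with Wien’s approximation to Planck’s law: solving for the apparent temperature when the emissivity setting is wrong shows the error shrinking as the wavelength gets shorter. This is a generic single-wavelength pyrometer model, not Calex’s actual calculation; the 200 °C example temperature is an assumption, and 10 µm stands in for the 8-14 µm band:

```python
import math

C2 = 1.4388e-2  # second radiation constant c2, m*K

def apparent_temp_k(true_temp_k, wavelength_m, true_eps, set_eps):
    """Temperature indicated by a single-wavelength pyrometer (Wien
    approximation) when the emissivity setting differs from the true
    surface emissivity."""
    inv_t = (1.0 / true_temp_k
             + (wavelength_m / C2) * math.log(set_eps / true_eps))
    return 1.0 / inv_t

t = 200 + 273.15  # true surface temperature, K (example value)

# 10% emissivity error: true emissivity 0.30, setting 0.33
err_2um  = apparent_temp_k(t, 2.2e-6, 0.30, 0.33) - t
err_10um = apparent_temp_k(t, 10.0e-6, 0.30, 0.33) - t
# The short-wavelength error is several times smaller in magnitude.
```

Because the emissivity term enters the calculation scaled by the wavelength, the same percentage error in the emissivity setting produces a much smaller temperature error at 2.2 µm than at 8-14 µm.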
The lowest measurable temperature depends on how reflective the surface is. For more information, please see https://www.calex.co.uk/product/pyrousb-new-version
The post Why Use a Short Wavelength Sensor to Measure Low Temperature Metals? appeared first on Calex.