With energy costs in an upward Death Spiral, everybody's getting serious about monitoring their energy use. For many plants, a big part of this is surveying their ovens and furnaces to identify areas for improvement.
One of the most basic of these tasks is collecting data on the shell temperatures of heating equipment. In addition to putting some hard numbers on heat losses, it may also tip you off to the deterioration or breakdown of insulation before it causes any serious damage.
In the (not so) Old Days, this was done exclusively with a contact thermocouple. You pressed the tip of the thermocouple against the oven or furnace skin to get a temperature reading. Collect temperatures from several points, average them, and you could figure your skin losses using one of the tables or graphs available from a number of sources.
There are two problems with this method -- the first is, it's slow-w-w-w-w. Thermocouples take a little while to reach thermal equilibrium, and until the temperature reading levels out, you have to stand there, pressing that probe against the skin. Shift a little, rock the thermocouple a bit, and the temperature might fluctuate, forcing you to stand there even longer. The second problem is that some locations on the skin might be inaccessible or just plain dangerous to get near, so you might not be able to sample all the locations you'd like.
Then, along came the infrared pyrometers. Just aim, pull the trigger, and you've got your temperature. No waiting and no problems with hard-to-access locations. There are two things to take into account, though. The first is the sighting angle. It's a cone, so the farther you are from the target, the larger the area the pyrometer reads, giving you an average temperature. That's really not an issue with an oven skin survey -- you're going to be averaging your readings anyway. More significant is the emissivity issue, which all radiation-based temperature measurements have to take into account.
Emissivity is a number that describes the efficiency with which a surface can give off radiant energy. At the high end, it's 1.0, which describes a black body, a surface that radiates 100 percent of the energy theoretically possible for a given temperature. The bottom end of the scale is 0, the characteristic of a white body, a surface that can't radiate at all, no matter how hot it gets. In real life, nothing is at either of these extremes. They're all somewhere in between, and they vary a lot. For example, a surface that would radiate 10,000 BTU/hr-ft² as a black body will radiate only 4,000 BTU/hr-ft² if its emissivity is 0.4.
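That scaling drops straight out of the Stefan-Boltzmann law. Here's a minimal Python sketch, using the constant in the article's BTU units; the 300 °F example temperature is hypothetical:

```python
# Radiant flux from a gray surface: flux = emissivity * sigma * T^4
SIGMA = 0.1714e-8  # Stefan-Boltzmann constant, BTU/hr-ft^2-R^4

def radiant_flux(temp_f, emissivity):
    """Radiant flux in BTU/hr-ft^2 from a surface at temp_f (deg F)."""
    temp_r = temp_f + 459.67  # convert to absolute temperature (Rankine)
    return emissivity * SIGMA * temp_r ** 4

# A 0.4-emissivity surface radiates exactly 40 percent of what a
# black body at the same temperature would.
black = radiant_flux(300.0, 1.0)
gray = radiant_flux(300.0, 0.4)
print(gray / black)  # ratio is the emissivity, 0.4
```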
There are two related properties -- absorptivity and reflectivity. Absorptivity describes the fraction of incoming radiation that will be absorbed by a surface. Numerically, it is equal to the emissivity, so if a surface has an absorptivity of 0.4, it will absorb 40 percent of all incoming radiation. The flip side of absorptivity is reflectivity -- this number describes the amount of incoming radiation reflected back by the surface. Absorptivity and reflectivity must total 1.0, so a surface with an absorptivity of 0.4 will have a reflectivity of 0.6.
Optical pyrometers of all types are calibrated against a radiation source that simulates a black body, so the instrument manufacturers have to provide some way to correct their readings for less than black body emissivity. Some high-end units measure radiation at two different wavelengths. Others allow you to select the emissivity for the surface. And what is it? There's the rub -- we're probably guessing, so the accuracy of our readings will depend on how good an estimate we made.
Less expensive units deal with the situation by programming the circuitry to correct for an assumed emissivity -- 0.9, 0.8, or whatever. For example, if the correction is based on an assumed emissivity of 0.8, it will multiply the radiation signal it receives by 1 ÷ 0.8, or 1.25, before it calculates and reports the surface temperature. If the emissivity of the surface is different, which is probably true in the majority of cases, we'll get an error that grows with the difference between the actual and assumed emissivities. If the surface emissivity is lower, the pyrometer will read high. This may seem illogical -- after all, if a surface with low emissivity is putting out radiation at a certain intensity, doesn't that mean it's actually hotter than reported by the instrument, which has given it credit for a higher emissivity? This makes sense until you realize the total radiant energy coming from a surface is not just the radiation it emits directly -- it also includes radiant energy from other sources, reflected by that surface.
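The fixed-emissivity correction amounts to dividing the measured flux by the assumed emissivity (the 1.25 multiplier above) and then inverting the fourth-power law. A rough Python sketch of that back-calculation, under the simplifying assumption that the instrument responds to total flux:

```python
SIGMA = 0.1714e-8  # Stefan-Boltzmann constant, BTU/hr-ft^2-R^4

def indicated_temp_f(measured_flux, assumed_emissivity=0.8):
    """Back-calculate a surface temperature from total radiant flux,
    as a pyrometer hard-wired for one emissivity would."""
    # flux = e * sigma * T^4  ->  T = (flux / (e * sigma)) ** 0.25
    temp_r = (measured_flux / (assumed_emissivity * SIGMA)) ** 0.25
    return temp_r - 459.67  # back to deg F

# Round trip: a 300 deg F surface that really does have emissivity 0.8
# reads (essentially) 300 deg F on an instrument assuming 0.8.
flux = 0.8 * SIGMA * (300.0 + 459.67) ** 4
print(indicated_temp_f(flux))
```

If the surface's actual emissivity differs from the 0.8 the circuitry assumes, this round trip no longer closes, and the error is what the rest of the article walks through.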
Radiant energy behaves like light -- it travels in straight lines, can be reflected, and casts shadows, so perhaps a light-based analogy will make it easier to understand.
A mirror has a high reflectivity and low absorptivity for visible light. If it were a perfect reflector, we would have a hard time recognizing it as a mirror, but it isn't -- close up, we can see small flaws in the reflective coating and distortions in the glass. These are equivalent to the direct radiant emissions -- they originate with the surface of the mirror, not the image it reflects. What we see, then, is a combination of our reflection and those little flaws and defects in the mirror.
In radiant heat transfer, all surfaces within sight of each other are exchanging radiant energy. The radiation transfer equations published in texts show the net movement of radiation is from the hotter to the cooler surface, and the amount of transfer is proportional to the difference of the fourth powers of the surfaces' absolute temperatures, corrected for their emissivities (or absorptivities).
The implication is that radiation flows in only one direction. That's wrong. The cooler surface is radiating to the hotter one, and the hotter one is absorbing and reflecting some of that radiation. However, the hotter surface radiates more to the cooler one and wins the shoving match.
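The shoving match can be sketched numerically. This is a simplified gray-body view that assumes the cooler surroundings fully enclose the hot surface; the temperatures are hypothetical:

```python
SIGMA = 0.1714e-8  # Stefan-Boltzmann constant, BTU/hr-ft^2-R^4

def net_exchange(t_hot_f, t_cool_f, emissivity):
    """Net radiant flux, BTU/hr-ft^2, from the hot surface to the cool one.
    Both surfaces radiate; the net is the difference."""
    t_hot = t_hot_f + 459.67
    t_cool = t_cool_f + 459.67
    outgoing = emissivity * SIGMA * t_hot ** 4   # hot surface emits
    incoming = emissivity * SIGMA * t_cool ** 4  # absorbed from the cool surface
    return outgoing - incoming  # positive: net flow toward the cooler surface

# A 400 deg F shell in an 80 deg F room: both radiate, the shell wins.
print(net_exchange(400.0, 80.0, 0.9))
```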
This is where the measurement problem comes in. Say our infrared pyrometer, programmed for 0.9 emissivity, reads the energy flux from a surface with an emissivity of 0.2. It assumes 90 percent of the radiant energy it measured is due to the temperature of the surface, but it's not.
It consists of the direct radiation -- only 20 percent of black-body emission at the surface's true temperature -- plus the 80 percent of incoming radiation from the surface's surroundings that gets reflected back. The pyrometer collects all of it and calculates a temperature on the assumption the combined direct and reflected radiation equals 90 percent of black-body emission at the surface's temperature. This gives the surface credit for being hotter than it really is. If you cross-check your readings with a contact thermocouple, you'll find they're lower.
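Putting numbers to that example: the sketch below combines direct emission and reflected background, then back-calculates a temperature the way the instrument does. It assumes the surroundings are radiating at roughly the same black-body level as the surface (nearby hot equipment, say); the 200 °F figures are hypothetical, and real pyrometers work over specific wavelength bands rather than total flux, but the gray-body arithmetic shows the effect:

```python
SIGMA = 0.1714e-8  # Stefan-Boltzmann constant, BTU/hr-ft^2-R^4

def apparent_temp_f(surface_f, surroundings_f, actual_e=0.2, assumed_e=0.9):
    """Temperature a fixed-emissivity pyrometer reports for a low-emissivity
    surface, including the background radiation it reflects."""
    ts = surface_f + 459.67
    tb = surroundings_f + 459.67
    direct = actual_e * SIGMA * ts ** 4            # emitted by the surface itself
    reflected = (1 - actual_e) * SIGMA * tb ** 4   # background bounced off the surface
    total = direct + reflected
    # Instrument assumes: total = assumed_e * sigma * T^4
    return (total / (assumed_e * SIGMA)) ** 0.25 - 459.67

# Surface and surroundings both at 200 deg F: the reading comes out high,
# because the instrument treats direct-plus-reflected as 0.9 of black body.
print(apparent_temp_f(200.0, 200.0))
```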
How do you deal with this conflict between measurement methods? Use both. Continue to take your readings with the infrared pyrometer, but carefully spot check a few locations with the thermocouple. Select locations that give you a range of temperatures. Then plot a simple curve of temperature differences (figure 1) and use that to correct your pyrometer readings. This will give you the best combination of speed and accuracy.
Remember, though -- that curve you plot may be valid for only one oven or furnace. If your equipment has a variety of paint jobs, or if some equipment is older and grimier, plot a fresh curve for each unit.
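The spot-check-and-correct procedure can be sketched as a simple least-squares line through the thermocouple checks. The readings below are made-up numbers for illustration only; your own spot checks replace them:

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept, no external libraries needed."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

pyro = [150.0, 220.0, 310.0, 400.0]  # infrared pyrometer readings, deg F (hypothetical)
tc   = [138.0, 201.0, 282.0, 362.0]  # contact thermocouple spot checks (hypothetical)

slope, intercept = fit_line(pyro, tc)

def corrected(reading):
    """Apply the fitted correction to any pyrometer reading from this unit."""
    return slope * reading + intercept
```

As the article says, the fit belongs to one oven or furnace -- refit for each unit rather than reusing one curve plant-wide.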