Chronicling the development of noncontact infrared temperature measurement tools provides a backdrop for process use.

All organic and inorganic bodies emit infrared (IR) energy, even bodies colder than ambient temperature. Therefore, if our eyes were sensitive to infrared wavelengths, we could see in the dark. We can live with this limitation because other forms of light are visible, and because heat manifests itself to us, though not always obviously. We learn, for instance, that a glowing ember is very hot and a frozen lake isn't, though we don't know the precise temperature of either, and human vision is of limited use in interpreting temperatures between such extremes. For that, we may rely on our sense of touch, but only when direct contact or a very short distance is involved.

Of all human feats of unassisted noncontact temperature measurement, perhaps none is more impressive than that of the blacksmith, who can tell from subtle changes in a piece of heated iron's color when it has the right malleability. Glassmakers, too, judge by sight the precise moment to begin shaping heated rods into finished works. Still, we have nothing that compares to the biological infrared sensors of rattlesnakes and other pit vipers. Their small facial holes, or pits, alert them to the body temperatures of prey and predator.

Because the visible spectrum represents a fraction of the sun's emitted energy, human intelligence has had to compensate for the limitations of human vision with tools. In process heating applications, one tool is the noncontact infrared temperature sensor, which offers the ability to measure temperature from a distance. Just as the eye uses a lens to focus emitted radiation on the retina, which signals the brain -- properly “calibrated” by experience -- to reliably interpret visible light, a noncontact sensor/transmitter receives an object's invisible infrared radiation through a lens. To arrive at a precise temperature, it uses integrated equations that factor in the object's material, its surface qualities and the ambient heat.

Infrared thermometers are used for quality control in food processing.

From the Beginning

Galileo invented the thermometer some 200 years before German-born English astronomer William Herschel, the discoverer of Uranus, demonstrated (ca. 1800) the existence of infrared light. Using glass prisms to produce a sunlight spectrum and mercury thermometers to measure its components, Herschel found that temperature rose toward the spectrum's red end. Just past the red end, where Herschel expected the heat effect to disappear, temperature rose even further. This invisible region of the spectrum became known as infrared, which means “below the red.” More than half a century passed before scientists realized that infrared radiation has all the properties of visible light waves except that it produces no sensation of light on the retina of the eye.

In 1873, Scottish physicist James Clerk Maxwell presented the equations that comprise the basic laws of electromagnetism. These equations show that an oscillating electric charge radiates waves through space at definite frequencies, and that those frequencies determine the radiation's place in the electromagnetic spectrum -- now understood to include not only infrared waves but also other invisible radiation (radio waves, microwaves, ultraviolet waves, X-rays and gamma rays). Maxwell's work predicted the entire electromagnetic spectrum.

Several years before Maxwell began working on his equations, Gustav Robert Kirchhoff's law of thermal radiation had stated that the capacity of a substance to emit light is equivalent to its ability to absorb light at the same temperature. This law had led Kirchhoff, a German physicist, to a fundamental concept in radiation thermometry: the “black body,” an object (not necessarily black) that absorbs all frequencies of radiation falling on it, while reflecting or transmitting no radiation at all. As a perfect radiator, a blackbody emits all frequencies intrinsic to its temperature.

Austrian physicist Josef Stefan established the relationship between a body's radiated energy and temperature. He studied how hot bodies emit radiation as they cool and, in 1879, found that the total emitted radiation of a blackbody varies as the fourth power of its absolute temperature in Kelvin. Five years later, his former student, Ludwig Boltzmann, used Stefan's experimentally derived law, along with thermodynamic principles and Maxwell's electromagnetic work, to formulate the Stefan-Boltzmann law. This law states that the hotter an object gets, the more infrared radiation it emits. Stefan used it to make the first accurate estimate of the sun's surface temperature -- 11,000°F (6,000°C).
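In modern notation, the Stefan-Boltzmann law gives a blackbody's total radiated power per unit area as M = σT^4, where T is the absolute temperature and σ, the Stefan-Boltzmann constant, is about 5.67 x 10^-8 W/(m^2·K^4). Doubling an object's absolute temperature therefore multiplies its radiated power sixteenfold.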

Investigations of spectral radiance vs. temperature in a blackbody followed, with German physicist Wilhelm Wien measuring blackbody radiation's wavelength distribution. In 1893, he showed that the wavelength of a blackbody's peak emission varies inversely with temperature (Wien's displacement law). Thus, peak wavelength shortens as temperature increases -- the color of the emitted light progresses from red to orange to yellow to white. Wien later tried to formulate a corresponding empirical equation for the full distribution, which worked for high-frequency radiation (short wavelengths) but not for low frequencies (long wavelengths).
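Quantitatively, the displacement law says the peak wavelength equals a constant divided by absolute temperature: λmax = b/T, with b about 2,898 µm·K. An object at 6,000 K (roughly the sun's surface) peaks near 0.5 µm, in the visible range, while one at room temperature (about 300 K) peaks near 10 µm, which is why the thermal radiation of everyday objects falls entirely in the infrared.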

In the mid-1890s, a group of Berlin physicists, including Max Karl Ernst Ludwig Planck, a former student of Kirchhoff's, investigated blackbody spectral emissions. Armed with increasingly precise measurements of how the emitted energy was distributed across wavelengths -- measurements that refused to fit the existing formulas -- the physicists set out to find a theory that could account for the observed spectrum.

After several false starts, in 1900 Planck found a formula that could predict the observed energy of blackbody radiation at any given wavelength and temperature. He posited that light, heat and other forms of energy radiate in discrete units rather than in a steady stream. His discovery of a universal constant based on physical theory -- Planck's constant -- meant that science could finally compute the observed spectrum. The computation assumes that radiant energy consists of discrete units Planck called quanta, and that the energy, E, carried by each quantum is given by the equation E = hν = hc/λ, where ν is the frequency of the radiation, λ is its wavelength, c is the speed of light and h is Planck's constant. Directly relating the energy of radiation to its frequency accounted for the observation that higher-energy radiation has a higher frequency.
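Written out for spectral radiance as a function of wavelength, Planck's law reads B(λ,T) = (2hc^2/λ^5) x 1/(e^(hc/λkT) - 1), where k is Boltzmann's constant. At short wavelengths it reproduces Wien's approximation, and at long wavelengths, where Wien's formula failed, it matches the measurements -- which is why it fits the entire blackbody spectrum.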

Planck's constant is fundamental to the theory of blackbody radiation. No longer could heat be considered a fluid composed of repulsive particles capable of combining chemically with material atoms. According to that older view -- the caloric theory -- the mutual repulsion of heat particles created pressure, which is what a thermometer detected.

With his quantum theory, Planck ushered in a new scientific era. In 1905, Albert Einstein, who had studied Maxwell's electromagnetic work, used the quantum as a theoretical tool to explain the photoelectric effect. He published his paper in the journal Annalen der Physik, and subsequent experiments proved him correct. That same year, in the same journal, Einstein published his theory of special relativity. When Einstein won the Nobel Prize in 1921, the Nobel committee cited his explanation of the photoelectric effect but made no explicit reference to relativity, which was destined to become one of the most celebrated scientific theories of all time.

Applications for noncontact infrared sensors include tempering and annealing of glass and metals.

The Birth of the Radiation Thermometer

About the same time that Einstein found a fundamental process of nature at work in the mathematical equation that had explained blackbody radiation, the first patent for a total radiation thermometer was granted. This device had a thermoelectric sensor, produced an electrical output signal and allowed unattended operation. Total radiation sensors reached the market in 1931.

The first modern infrared quantum sensors -- lead sulfide photo detectors originally developed for the military -- became available after World War II. Since then, a host of industries has embraced noncontact temperature measurement: chemical, pharmaceutical, automotive, food, plastics, metals, utilities, construction material, pulp and paper, medical, scientific. Within this diverse group, applications for noncontact infrared sensors (also known as pyrometers) include extrusion, lamination and drying of plastics, paper and rubber; curing of resins, adhesives and paints; forming, tempering and annealing of glass and metals; and quality control in food processing.

What conditions make “infrared” (shorthand for “noncontact infrared temperature measurement”) preferable to contact thermometry? Those in which direct contact is impractical or impossible, such as industrial or laboratory processes involving hard-to-reach objects, moving objects or temperature extremes. Infrared also is the method of choice where contact might contaminate, scratch, tear or otherwise damage equipment or products.

Industrial infrared thermometers are formidable tools in their own right. Many handheld models have an “infrared gun” design in which a laser beam marks the spot on the object to be measured. A passive system collects infrared radiation through a lens, filters out atmospheric interference such as moisture, converts the radiation to an electrical signal, adjusts for the object's emissivity and the ambient temperature, and instantly calculates temperature.
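As a rough illustration of that final calculation step, the sketch below assumes an idealized wide-band sensor described by the Stefan-Boltzmann relation, in which the detector sees the target's own emission plus ambient radiation reflected off the target. Real instruments integrate over a specific spectral band and apply factory calibration, so the simple fourth-power model and the function names here are illustrative only.

# Illustrative only: an idealized wide-band (total radiation) model.
# Assumes the detector signal is proportional to emitted plus reflected power:
#   S = eps * sigma * T_obj**4 + (1 - eps) * sigma * T_amb**4
SIGMA = 5.670374e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def apparent_radiance(t_obj_k, emissivity, t_amb_k):
    """Power per unit area reaching the detector from a gray, opaque target."""
    return emissivity * SIGMA * t_obj_k**4 + (1 - emissivity) * SIGMA * t_amb_k**4

def object_temperature(signal, emissivity, t_amb_k):
    """Invert the model: recover the target temperature from the measured signal."""
    emitted = signal - (1 - emissivity) * SIGMA * t_amb_k**4
    return (emitted / (emissivity * SIGMA)) ** 0.25

# Example: a painted surface (emissivity ~0.95) at 500 K in a 295 K room.
s = apparent_radiance(500.0, 0.95, 295.0)
print(round(object_temperature(s, 0.95, 295.0), 1))  # ~500.0 K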

Sidebar: Emissivity: A Factor to Consider

Radiation energy can be classified as emitted, transmitted or reflected. Emissivity is the ratio of the energy emitted by an object at a given temperature to the energy emitted by a perfect radiator, or blackbody, at the same temperature. Emissivity values fall between 0.0 and 1.0. The emissivity of a blackbody is 1.0.
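Written as an equation for an opaque object, Kirchhoff's law implies that emissivity and reflectivity sum to one (emissivity + reflectivity = 1), so a highly reflective surface is necessarily a poor emitter.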

Because real-world objects are not blackbodies, infrared thermometers must compensate for different emissivities. In general, the higher an object's emissivity, the easier it is to get accurate infrared temperature measurements. Organic substances such as wood, cloth and plastic have an emissivity of about 0.95. Rough or painted surfaces also typically have high emissivity.

Objects with very low emissivity (below 0.2) are problematic. Some polished, shiny metal surfaces such as aluminum are so reflective in the infrared range that reduced power and spurious reflections at the sensor may preclude accurate measurements.

To ensure accurate infrared temperature measurements, factor in a material's emissivity with any of the following methods:

1. Using a precise sensor, heat a sample of the material to a known temperature, then read the sample with an infrared thermometer. Adjust the emissivity value in the infrared thermometer to force the correct reading.

2. Where low temperatures (up to 500°F [260°C]) are involved, cover the part of the object to be measured with masking tape, which has a known emissivity of 0.95. Adjust the infrared thermometer's emissivity to correspond.

3. Where high temperatures are involved, drill a hole in the object. The depth of the hole must be at least six times the diameter. The emissivity of this hole will be close to 1.0, so in effect, it is a blackbody. With a contact sensor, measure the temperature in the hole. Next, measure the object's temperature with an infrared thermometer, adjusting emissivity until the thermometer displays the correct temperature.

4. Paint the object with dull black paint, which has an emissivity of about 1.0. Set the infrared thermometer's emissivity adjustment accordingly.

5. Consult an emissivity table, which provides emissivity values for many common materials. Note that this is a last resort because such values are approximations at best.

6. Consider using an infrared thermometer that senses two separate wavelength ranges. Such thermometers calculate temperature from the ratio of the energy measured by the two sensors, as sketched below.
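By way of illustration, here is a minimal sketch of the ratio (two-color) idea under the Wien approximation, assuming a gray body whose emissivity is the same at both wavelengths and therefore cancels when the two signals are divided. The wavelengths, signal model and function names are illustrative, not those of any particular instrument.

import math

# Illustrative only: ratio (two-color) pyrometry under the Wien approximation.
# Assumes a gray body, so emissivity is identical at both wavelengths and
# cancels when the two signals are divided.
C2 = 1.4388e-2  # second radiation constant hc/k, in m*K

def band_signal(t_k, wavelength_m, emissivity):
    """Relative detector signal in a narrow band (Wien approximation)."""
    return emissivity * wavelength_m**-5 * math.exp(-C2 / (wavelength_m * t_k))

def ratio_temperature(s1, s2, l1, l2):
    """Recover temperature from the ratio of two narrow-band signals."""
    r = s1 / s2
    return C2 * (1.0 / l1 - 1.0 / l2) / (5.0 * math.log(l2 / l1) - math.log(r))

# Example: a gray target at 1,200 K viewed at 0.90 um and 1.05 um.
l1, l2 = 0.9e-6, 1.05e-6
s1 = band_signal(1200.0, l1, 0.4)
s2 = band_signal(1200.0, l2, 0.4)
print(round(ratio_temperature(s1, s2, l1, l2), 1))  # ~1200.0 K, regardless of emissivity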

Sidebar: Contact vs. Noncontact Temperature Measurement

Use infrared (noncontact) when the surface is:
  • Too hot to measure with thermocouples
  • Too large to measure without a large number of thermocouples
  • So much in motion that thermocouple lead wire may break
  • So high in electrical potential that using a thermocouple would be dangerous
  • So low in mass that the thermocouple itself would influence surface temperature
  • Too fragile or wet to accommodate thermocouple contact
  • Too chemically active to accept a thermocouple
  • Too atmospherically hostile to a thermocouple
  • Inaccessible to a thermocouple or its instrumentation
  • Too close to noise-producing electric or magnetic fields

