The temperature environment used for calibration usually is provided by a drywell (dry-block) calibrator or a microbath. Both offer portability and a range of temperatures. Drywells use high stability metal blocks with drilled wells to accept the reference and the unit under test. Drywells typically cover ranges from -49 to 2,192°F (-45 to 1,200°C), and microbaths cover ranges from -22 to 392°F (-30 to 200°C). Microbaths are similar in size to drywells but use a small tank of stirred fluid instead of a metal block. Because the fluid eliminates the thermal contact problems that result from a poor fit in a block, microbaths are better suited for calibrating short or odd-shaped probes.
The actual temperature of the bath or drywell is determined by a reference thermometer, which may be either a thermometer internal to the heat source or an external reference thermometer operating independently of the heat source.
External vs. Internal

Microbaths and drywells have a built-in sensor to provide a feedback loop to the unit's controller and to provide a temperature reading to the user. The heat source manufacturer or a third-party laboratory can calibrate the sensor so the unit displays a traceable temperature within a stated uncertainty. For some applications, this uncertainty level, typically ±1 to 2°F, is adequate. Using an internal reference sometimes is preferred because it requires fewer instruments and enhances portability for field applications (figure 1).
The reference system, however, should be more accurate than the process system being calibrated. The generally accepted test uncertainty ratio (TUR) is 4:1; that is, the reference should be four times more accurate than the sensor or system being calibrated. Therefore, if a process thermometer is being relied on for correct readings within ±2°F, the test system typically should be ±0.5°F or better at each temperature in question.
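The 4:1 rule is simple arithmetic; as an illustration (the function name is ours, not from the article), the maximum allowable test-system uncertainty for a given process tolerance can be computed as:

```python
def required_test_uncertainty(process_tolerance_f: float, tur: float = 4.0) -> float:
    """Return the largest test-system uncertainty (in the same units as
    the process tolerance) that still satisfies the desired test
    uncertainty ratio (TUR)."""
    return process_tolerance_f / tur

# A process thermometer relied on for +/-2 degF readings, at a 4:1 TUR,
# calls for a test system of +/-0.5 degF or better.
print(required_test_uncertainty(2.0))  # -> 0.5
```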
Where uncertainty requirements are more rigorous, external reference thermometers can help improve system uncertainty (figure 2). External reference thermometers -- usually platinum resistance thermometers (PRTs) or thermistors -- often can be calibrated to a few hundredths of a degree and can be read by electronic readout devices that contribute little to total measurement uncertainty. These systems can provide measurements with uncertainties as low as ±0.05°F or ±0.02°F. The reference probe and readout should be recalibrated periodically, preferably by an accredited calibration laboratory.
Because external thermometers are more accurate, they increase the relative significance of other components of calibration uncertainty such as uniformity and stability. It is, of course, critical in any calibration to account for all sources of uncertainty in the process.
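One common way to account for independent uncertainty components (reference probe calibration, readout, block uniformity, stability) is to combine them in quadrature, i.e., root-sum-square. A minimal sketch with illustrative values, not measurements from the article:

```python
import math

def combined_uncertainty(components: list[float]) -> float:
    """Root-sum-square combination of independent uncertainty
    components (all expressed in the same units, e.g., degF)."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical budget in degF: reference probe, readout,
# drywell uniformity, and drywell stability.
budget = [0.05, 0.01, 0.10, 0.02]
print(round(combined_uncertainty(budget), 3))  # -> 0.114
```

Note how the 0.10 °F uniformity term dominates the budget once the reference probe is calibrated to 0.05 °F -- the point the paragraph above makes about external thermometers raising the relative significance of the other components.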
System or Component?

Most temperature sensors used in processes are read by transmitters that send a 4 to 20 mA signal to a control panel, which then displays the temperature for process monitoring. Such systems involve three instruments, all of which require periodic calibration. Of these three, the largest errors usually are found in the temperature sensor.
Several calibration methodologies are used in the process plant. The most representative method is to calibrate the complete measurement system -- from sensor through transmitter to indicator or controller. Alternatively, each measurement system component can be calibrated individually.
The sensor can be individually calibrated using a drywell or microbath heat source to simulate process temperature. If the temperature sensor is electrical, a readout device measures its output. Adjustments then are made to the thermometer or its coefficients.
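For PRTs, the coefficients mentioned above are commonly those of the Callendar-Van Dusen equation. A sketch using the nominal IEC 60751 values for the t >= 0 °C branch; an actual calibration would replace these with coefficients fitted to the individual probe:

```python
import math

# Nominal IEC 60751 coefficients for a 100-ohm PRT (valid for t >= 0 degC).
# A calibration laboratory fits R0, A and B to the individual probe.
R0, A, B = 100.0, 3.9083e-3, -5.775e-7

def resistance(t_c: float) -> float:
    """Callendar-Van Dusen resistance (ohms) at temperature t_c (degC)."""
    return R0 * (1.0 + A * t_c + B * t_c * t_c)

def temperature(r_ohm: float) -> float:
    """Recover temperature (degC) from measured resistance by solving
    the quadratic; the '+' root is the physically meaningful one."""
    return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - r_ohm / R0))) / (2.0 * B)
```

With the nominal coefficients, `resistance(100.0)` gives about 138.5 ohms, the familiar Pt100 value at 100 °C, and `temperature` inverts it.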
The transmitter is calibrated using a simulator to generate the resistance or voltage output from the temperature sensor and input to the transmitter. The simulator also measures the resulting transmitter current or voltage output. The transmitter is adjusted to ensure that the output follows the input. For example, looking at a 4 to 20 mA transmitter with a range of 32 to 392°F (0 to 200°C), 4 mA corresponds to 32°F and 20 mA corresponds to 392°F. The simulator provides input and output ranges to cover all resistance thermometer and thermocouple types.
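The scaling in that example is linear over the configured span; a short sketch (function names are illustrative):

```python
def temp_to_ma(t_f: float, lo_f: float = 32.0, hi_f: float = 392.0) -> float:
    """Linear 4-20 mA output for a transmitter ranged lo_f..hi_f (degF)."""
    return 4.0 + 16.0 * (t_f - lo_f) / (hi_f - lo_f)

def ma_to_temp(ma: float, lo_f: float = 32.0, hi_f: float = 392.0) -> float:
    """Inverse mapping, as used when checking an indicator or controller
    against a simulated current input."""
    return lo_f + (hi_f - lo_f) * (ma - 4.0) / 16.0

print(temp_to_ma(32.0), temp_to_ma(392.0))  # -> 4.0 20.0
print(temp_to_ma(212.0))                    # mid-span -> 12.0
```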
The indicator or controller also is calibrated using a precision simulator to generate the transmitter's resistance or current input. The indicator or controller is adjusted so that the display variable matches the simulated input.
The complete system is calibrated using the drywell or microbath to compare the reference probe and unit under test. The transmitter is adjusted to ensure that the indicator or controller agrees with the reference probe readout. This calibration method is most representative of the real process.
Accredited Calibration

Calibration of the thermometer standards used to calibrate industrial thermometers provides traceability to both national and international standards. Traceability to international standards ensures that measurements made in one country agree with measurements in another country. This is particularly important for companies using similar manufacturing processes at different locations around the world.
More and more calibration laboratories throughout the United States are being accredited to international standards such as ISO Guide 25. Accreditation ensures that a laboratory's quality systems, uncertainty levels and traceability statements have been examined and verified independently.