Temperature — second only to time when it comes to measuring physical quantities — plays a key role in numerous industrial and commercial processes. Get the temperature wrong and your beer could be foul, what you eat could be undercooked or spoiled, or metals might not melt properly.
Indeed, examples of temperature’s critical importance to industry abound, including the monitoring of cooking temperature in food and beverage processing, measuring the temperature of molten steel in a mill, verifying the temperature in a cold storage warehouse or refrigeration system, or regulating temperatures in the drying rooms of a paper manufacturer.
In turn, calibration of critical temperature instruments is required to ensure consistency and product quality while allowing companies to plan the use of resources effectively, all factors in staying competitive in the marketplace. Key reasons for temperature calibration include:
- Product quality.
- Regulatory compliance.
- Personal safety.
- Competitiveness.
Safety Comes First
When calibrating temperature instrumentation, a 4 to 20 mA loop with a 24 V power source is considered low voltage technology, but by no means should calibration be performed without safety precautions. Older instrumentation can use 48 and 96 V supplies, and many instruments require 120 VAC line power or switch systems at 120, 240 or even 480 VAC.
Thus, there is a good reason for safe practices. Never assume low voltage, and always take the same precautions you would take while working with high power electrical circuits.
Best practices around temperature calibration center on testing the control loop. In an industrial setting, a temperature transmitter uses a sensor to measure the temperature and converts that measurement into a 4 to 20 mA signal sent to a control system, which in turn drives a control element that affects the temperature. This control element might be a valve that opens or closes to allow more steam into a heating process or more fuel to a burner.
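For readers who want to see the arithmetic behind that signal, the short Python sketch below shows the linear scaling a transmitter performs. The 0 to 150°C range is taken from the worked example later in this article and is an assumption here, not a universal value.

```python
# Minimal sketch of the linear scaling a 4-20 mA temperature transmitter
# performs. The 0-150 degC range is assumed for illustration only.

LOW_TEMP_C = 0.0      # temperature at 4 mA (0 % of span)
HIGH_TEMP_C = 150.0   # temperature at 20 mA (100 % of span)

def temperature_to_ma(temp_c: float) -> float:
    """Convert a measured temperature to the ideal 4-20 mA loop current."""
    fraction_of_span = (temp_c - LOW_TEMP_C) / (HIGH_TEMP_C - LOW_TEMP_C)
    return 4.0 + 16.0 * fraction_of_span

if __name__ == "__main__":
    for t in (0.0, 75.0, 150.0):
        print(f"{t:6.1f} degC -> {temperature_to_ma(t):5.2f} mA")
```

The calibration procedures later in this article exercise exactly this relationship at the 0, 50 and 100 percent points of the transmitter's range.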
Temperature calibration professionals are counted on to keep temperatures within their proper range, in part by ensuring that all temperature-related devices are calibrated to within their allowable variance. Technicians use handheld temperature calibration tools to test and calibrate temperature instrumentation devices in situ to ensure they meet the specific requirements for the application.
It should be noted that calibrations must be performed using a calibrator traceable to a nationally or internationally recognized standard. In the United States, that is the National Institute of Standards and Technology (NIST). Ideally, the calibrator will be at least four times as accurate as the unit being tested.
Verifying a loop includes testing the output of the transmitter, the wiring, the input to the control system as well as the control system input card, and the return wiring back to the transmitter (figure 1).
Shooting the Loop
One way to do a handy go-no-go test on the transmitter is to “shoot the loop” with a temperature calibrator (table 1).
1. Disconnect the transmitter.
2. Connect the calibrator so that it simulates the milliamp signal while drawing power from the loop’s 24 V power supply.
3. Set the calibrator function to “mA simulate” and set a value within the 4 to 20 mA range.
4. Check the indicator to determine whether it matches the value set.
5. Step or ramp the milliamp signal across the range and check the indicator at each point (a sketch of this check follows the list).
6. Adjust the transmitter to specifications.
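If the sourced values and indicator readings are written down, the go-no-go comparison in step 5 can be scripted. The sketch below is a hypothetical Python illustration: the sourced points, the indicator readings and the ±0.25 percent-of-span tolerance are assumptions, not values prescribed by the procedure.

```python
# Hypothetical go-no-go check for a "shoot the loop" test: compare what the
# indicator displayed (expressed in mA) against the value sourced by the
# calibrator. The tolerance and readings below are assumed for illustration.

TOLERANCE_PCT_OF_SPAN = 0.25          # assumed acceptance limit
SPAN_MA = 16.0                        # 4-20 mA span

# (sourced mA, indicated mA) pairs recorded during the test
readings = [(4.0, 4.02), (8.0, 8.05), (12.0, 11.96), (16.0, 16.01), (20.0, 20.08)]

for sourced, indicated in readings:
    error_pct = (indicated - sourced) / SPAN_MA * 100.0
    status = "PASS" if abs(error_pct) <= TOLERANCE_PCT_OF_SPAN else "FAIL"
    print(f"sourced {sourced:5.2f} mA  indicated {indicated:5.2f} mA  "
          f"error {error_pct:+.2f} % of span  {status}")
```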
A quick way to verify only the 4 to 20 mA signal is to connect a clamp meter to the signal wiring. The absence of a signal in the control loop can indicate a bad power supply, wiring problems or even an I/O problem.
The two most common types of temperature-sensing devices are the thermocouple and the resistance temperature detector (RTD). In addition to calibrators that test both thermocouples and RTDs, specialized single-purpose calibrators measure and simulate several different types of each of those devices.
Calibrating a HART Temperature Transmitter
The rise of smart devices and digital protocols such as HART (an acronym for Highway Addressable Remote Transducer) has led to tools dedicated specifically to calibrating them. Capabilities required to properly service HART instruments include precision analog source and measure capability, and digital communication capability.
HART smart transmitters require digital adjustment if found to be out of specification. Until recently, this required two separate tools: a calibrator and a communicator. Today, the capabilities of those two tools are available in special calibrators that can help you quickly and effectively service HART instruments.
Don’t Forget the Sensor
Whether using a thermocouple or RTD, testing the sensor itself is an important part of complete temperature calibration. Testing the sensor and transmitter together verifies both the sensor and the transmitter electronics. Adding a test of the control system input completes a full loop calibration. In addition, these tests can account for degradation of the sensor, which is exposed to temperature cycling as well as vibration and is often ignored until the device fails.
To test a sensor, remove it, along with the transmitter, from the process or thermowell and insert the probe into a dry well (also called a dry block) set at a known temperature. A calibration bath can be used for this function in place of the dry well.
Connect the milliamp connections to the documenting temperature calibrator. In addition, the uncertainty of a dry block can be reduced by using a reference thermometer with an SPRT (standard platinum resistance thermometer) probe.
The temperature reading in the dry well is compared with the reading on the appropriate display and control device, either within the control loop or on a separate calibration device simulating the temperature.
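As a rough sketch of that comparison, the Python snippet below converts a measured loop current back into an indicated temperature and reports the deviation from the dry well setpoint. The 0 to 150°C, 4 to 20 mA range matches the worked example elsewhere in this article; the setpoints and measured currents are illustrative assumptions.

```python
# Minimal sketch: turn a measured loop current back into a temperature and
# compare it with the dry well setpoint. Range and readings are assumptions.

LOW_TEMP_C, HIGH_TEMP_C = 0.0, 150.0   # assumed transmitter range

def ma_to_temperature(current_ma: float) -> float:
    """Inverse of the 4-20 mA scaling: loop current -> indicated temperature."""
    return LOW_TEMP_C + (current_ma - 4.0) / 16.0 * (HIGH_TEMP_C - LOW_TEMP_C)

# (dry well setpoint degC, measured loop current mA) pairs - illustrative
checks = [(50.0, 9.37), (100.0, 14.71), (150.0, 19.98)]

for setpoint_c, measured_ma in checks:
    indicated_c = ma_to_temperature(measured_ma)
    print(f"setpoint {setpoint_c:6.1f} degC  loop reads {indicated_c:6.2f} degC  "
          f"deviation {indicated_c - setpoint_c:+.2f} degC")
```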
Another way to do a quick test on a sensor is within the thermowell. If enough room is available within the thermowell, an appropriate digital thermometer probe can be slipped inside, and the reading can be compared to the process control readout.
In conclusion, when performed by a professional with the proper tools, temperature calibration can also ease the generation of the documentation required as proof of successful calibration. These processes help you quickly and reliably calibrate your critical temperature instrumentation and keep your operation on track.
How to Calibrate a Thermocouple Input Transmitter
An integrated temperature calibrator performs three key functions for calibrating a temperature transmitter: sourcing a temperature, providing loop power and measuring the resulting output current. The following example, using a Model 724 temperature calibrator, shows how to calibrate a Type K thermocouple transmitter that is ranged from 32 to 302°F (0 to 150°C) and generates an output current range of 4 to 20 mA.
Basic Calibrator Setup
1. Connect the temperature calibrator test leads to the thermocouple transmitter. The output from the thermocouple jacks will simulate a temperature input to the transmitter. The red and black test leads will provide loop power to the transmitter and will measure the output current that results as the sourced temperature changes.
2. Power on the calibrator and configure for milliamp measurement with 24 V loop power supplied.
3. Configure the source function to simulate a Type K thermocouple.
4. Select the correct measurement units.
5. Set the zero percent range value for this transmitter. To do this, set the display initially to 0.0°C. You can use the up and down arrow keys to change the output value. Use the left and right arrows to control which decade value of the display is being changed. When the display reads 0.0, hold down the “0 %” key on the calibrator and observe that “0 %” is displayed in the lower right corner of the screen. This establishes the zero point for calibration.
6. Set the span (100 percent) value. To do so, set the display to the desired span value for calibration. In this example the display should read 150°C. Depress the “100 %” key and observe that “100 %” is displayed in the lower right corner of the screen. This establishes the span point for calibration.
Performing an “As Found” Test
1. Depress the “0 %” key and record the applied temperature and the corresponding milliamp measurement.
2. Depress the “25 %” key two times. Record the applied temperature and the corresponding milliamp measurement.
3. Depress the “100 %” key and record the applied temperature and the corresponding milliamp measurement.
4. Calculate the errors for each of the three points using the following formula:
ERROR = {[(I - 4) / 16] - (T / TSPAN)} × 100
where
Error is in percent of span.
I is your recorded mA measurement.
T is the recorded temperature.
TSPAN is the temperature input span (0 % to 100 % points).
Table 1 shows how to apply the formula to actual recorded measurements, and the sketch following this list shows the same calculation.
5. If your calculated errors are less than the specified instrument tolerance, the transmitter has passed the as-found test. If the test does not pass, perform adjustments as necessary.
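For readers who prefer to see the formula applied, the sketch below implements the same percent-of-span calculation in Python. The recorded readings and the ±0.5 percent tolerance are assumptions for illustration; substitute the values and tolerance for your own instrument.

```python
# Sketch of the as-found error calculation from the formula above:
# ERROR (% of span) = [ (I - 4) / 16 - T / TSPAN ] * 100
# The recorded readings and tolerance below are assumed for illustration.

TSPAN_C = 150.0                 # 0 % to 100 % temperature span (0-150 degC)
TOLERANCE_PCT_OF_SPAN = 0.5     # assumed acceptance limit

def error_pct_of_span(temp_c: float, current_ma: float) -> float:
    """Percent-of-span error for one test point."""
    return ((current_ma - 4.0) / 16.0 - temp_c / TSPAN_C) * 100.0

# (applied temperature degC, measured current mA) at the 0 %, 50 % and 100 % points
as_found = [(0.0, 4.03), (75.0, 12.06), (150.0, 20.05)]

worst = 0.0
for temp_c, current_ma in as_found:
    err = error_pct_of_span(temp_c, current_ma)
    worst = max(worst, abs(err))
    print(f"{temp_c:6.1f} degC  {current_ma:5.2f} mA  error {err:+.3f} % of span")

print("As-found result:", "PASS" if worst <= TOLERANCE_PCT_OF_SPAN else "FAIL")
```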
Adjusting the Transmitter
1. Depress the “0 %” key to source the proper temperature for a 4 mA output. Adjust the zero potentiometer until the current reading is 4.00 mA.
2. Depress the “100 %” key to source the proper temperature for a 20 mA output. Adjust the span potentiometer until the current reading is 20.00 mA.
3. Depress the “0 %” key again and, if necessary, adjust the zero potentiometer again to get a 4.00 mA output.
Performing an “As Left” Test
Repeat the steps for performing an “as found” test to complete the full calibration procedure on your temperature transmitter.