Risk can take many forms in food-and-beverage processes. From proper hygiene standards for employees to regular calibration of instrumentation on critical processes, there are dozens of ways that process owners can mitigate risk to avoid potentially costly or hazardous results.

Of course, risk does not just mean employee safety. It also means the risk of violating a regulation and being fined, or the risk of producing a bad batch of product that has to be recalled at huge expense. How do process owners mitigate risk exposure without implementing excessively costly or time-consuming risk-avoidance measures?

In this article, I will look at the need for frequent calibrations in the food-and-beverage industry. I will also explain how sensor technology developments can make calibrations easier and less risky.

Required Calibrations

Certain processes in the food-and-beverage industries may require frequent calibration of the temperature instrumentation. In many cases, such calibrations require shutting down a batch reactor or other complex device every 6 to 12 months to remove and replace an instrument (figure 1). The removed instrument is taken to a laboratory and calibrated before it is reinstalled in the process.

While redundant sensors can allow for continued operation of the process, this can be a costly solution. As a result, in many cases, plant personnel simply take the process offline during the calibration cycle. Given that a typical food-and-beverage facility can have hundreds of temperature-measurement points, this can be a time-consuming process that can negatively affect plant throughput.

U.S. Food and Drug Administration (FDA) 21 CFR 117.165 outlines the requirements for “verification of implementation and effectiveness” related to instrumentation used for process monitoring and control. Other standards, such as the Safe Quality Food Institute’s SQF code and the International Organization for Standardization’s ISO 22000, among others, define requirements for calibration or verification of instrumentation.

In addition, ISO 9001:2008 clause 7.6, Good Manufacturing Practices (GMP) and World Health Organization (WHO) regulations and standards all require instrumentation to be calibrated or verified at specified intervals against measurement standards traceable to international or national standards. The main issue with scheduled calibration cycles is how instruments perform between calibrations.

Even with calibration intervals as short as six months, an instrument can deviate over time before being recalibrated back to initial specifications. Temperature-sensing deviations stem from sensor drift, aging and other factors, and the possibility of an undetected out-of-specification situation gradually increases over time, raising the risk of product quality issues. What is the accuracy of the instrument one week after calibration? One month? What about five months after calibration? With each day removed from the last calibration, the uncertainty in the temperature measurement increases, creeping closer and closer to the maximum allowable level.
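To make the growing uncertainty concrete, the sketch below models a worst-case linear drift. The initial uncertainty, drift rate and tolerance are hypothetical illustration values, not specifications for any particular sensor.

```python
# A minimal sketch of how worst-case measurement uncertainty can grow
# between calibrations. The initial uncertainty, drift rate and tolerance
# are hypothetical illustration values, not sensor specifications.

INITIAL_UNCERTAINTY_C = 0.10      # uncertainty right after calibration, deg C
DRIFT_RATE_C_PER_DAY = 0.002      # assumed worst-case drift, deg C per day
MAX_ALLOWED_UNCERTAINTY_C = 0.50  # assumed process tolerance, deg C

def uncertainty_after(days_since_calibration: float) -> float:
    """Worst-case uncertainty, assuming drift accumulates linearly."""
    return INITIAL_UNCERTAINTY_C + DRIFT_RATE_C_PER_DAY * days_since_calibration

for days in (7, 30, 150):  # one week, one month, five months
    u = uncertainty_after(days)
    status = "within tolerance" if u <= MAX_ALLOWED_UNCERTAINTY_C else "OUT OF TOLERANCE"
    print(f"{days:>3} days after calibration: +/-{u:.2f} deg C ({status})")
```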

If, when the next calibration cycle is reached, temperature sensors are found to be outside the accepted deviation, how does the facility determine when the deviation began, or whether it was significant enough to affect the product? How much product was made during that period, and does it have to be recalled?

A drifting temperature sensor can increase operating costs as well. For example, consider a process in which steam is used to heat a process to 253.4°F (123°C). What if, due to a drifting temperature sensor, the steam is actually heated to 258.8°F (126°C)? There would be no impact on product quality, but the added cost of the excess heating could be significant.
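A rough, back-of-the-envelope calculation shows how such a small offset adds up. Every figure below (steam demand, specific heat, operating hours and energy price) is assumed for illustration only.

```python
# Back-of-the-envelope estimate of the energy wasted by overheating steam
# to 126 deg C instead of 123 deg C because of a drifting sensor. The steam
# demand, specific heat, operating hours and energy price are all assumed.

steam_flow_kg_per_h = 1000.0   # assumed steam demand
cp_steam_kj_per_kg_k = 2.0     # approximate specific heat of superheated steam
overheat_k = 126.0 - 123.0     # 3 K of unnecessary superheat
hours_per_year = 8000.0        # assumed annual operating hours
price_per_kwh = 0.05           # assumed energy cost, $/kWh

extra_kj_per_h = steam_flow_kg_per_h * cp_steam_kj_per_kg_k * overheat_k
extra_kwh_per_year = extra_kj_per_h / 3600.0 * hours_per_year
print(f"Wasted energy: {extra_kwh_per_year:,.0f} kWh/yr, "
      f"about ${extra_kwh_per_year * price_per_kwh:,.0f}/yr per loop")
```

Across the hundreds of temperature-measurement points in a typical facility, even modest per-loop losses add up.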


FIGURE 1. One RTD sensor monitors its measured temperature and compares it to a predefined standard temperature, thereby self-monitoring its calibration and temperature-sensing accuracy. It performs such a self-check during each steam-in-place cleaning procedure.

These risks to operational efficiency, product quality and safety are what make accurate, reliable temperature measurement so critical.

Regulatory guidelines offer little help in determining the frequency of calibration cycles, other than to say that critical processes must be calibrated regularly. What qualifies a process as critical and a calibration cycle as regular is left to the process owners. Those definitions often reflect how risk-averse the owners feel they must be to protect product quality.

So, when should you calibrate? Some processes in food and beverage are designed to run continuously for a long time — days, weeks or even months. If a temperature sensor drifts during this time, it could affect literally tons of product, which, in the worst case, would have to be recalled.

Ideally, temperature sensors could be calibrated before starting each batch or process; however, this is generally cost prohibitive. As a compromise, instruments often are calibrated on a predetermined schedule, whether needed or not. The costs of unnecessary calibrations include labor and lost production, and a certain amount of risk is involved in handling, and perhaps damaging, the instrument. Calibrating too often results in unacceptable production losses; calibrating too infrequently can result in out-of-specification product.
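One way to frame this tradeoff is as a simple cost model: calibration costs fall as the interval grows, while the expected cost of undetected drift rises. The sketch below uses hypothetical costs and assumes risk grows linearly with time since the last calibration; real figures are plant-specific.

```python
# A rough framing of the calibration-interval tradeoff: calibration costs
# fall as the interval grows, while the expected cost of undetected drift
# rises. Every figure here is hypothetical and plant-specific in practice.

COST_PER_CALIBRATION = 500.0   # assumed labor plus lost production per event
RISK_GROWTH_PER_MONTH = 100.0  # assumed drift-risk cost accrual, $/month
                               # per month elapsed since the last calibration

def annual_cost(interval_months: float) -> float:
    calibrations_per_year = 12.0 / interval_months
    # If risk grows linearly with time since calibration, the expected
    # risk cost per cycle is the integral of rate * t: rate * T^2 / 2.
    risk_per_cycle = RISK_GROWTH_PER_MONTH * interval_months ** 2 / 2.0
    return calibrations_per_year * (COST_PER_CALIBRATION + risk_per_cycle)

for months in (1, 3, 6, 12):
    print(f"calibrate every {months:>2} months: ${annual_cost(months):,.0f}/yr")
```

With these assumed numbers, the total cost bottoms out at an interval of about three months; calibrating monthly or annually both cost substantially more.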

Sensor Technology Developments

Developments in temperature sensor technology make it possible for plant personnel to determine if an instrument’s sensor needs calibration. Such self-monitoring sensors can help plants avoid unnecessary laboratory calibrations. When a sensor needs calibration, other technologies could cut the time needed.

Self-Calibrating Sensors. Many food-and-beverage processes include a clean-in-place (CIP) or a sterilize-in-place (SIP) procedure to clean or sterilize reactors, vessels, valves, flanges and other components. CIP and SIP procedures typically occur before the start of a new batch.

CIP is a cleaning process that consists of injecting hot water, then introducing a base to neutralize acids, followed by another injection of hot water. Once done, the entire vessel is rinsed with water. SIP is a sterilization process that consists of injecting steam into the vessel for up to an hour.

Self-calibrating temperature sensors can calibrate themselves during the cooling cycle of processes such as a SIP procedure (figure 1). These sensors have a temperature reference built into the temperature sensor itself. The reference sensor is based on the Curie point or Curie temperature — the temperature at which the ferromagnetic properties of a material abruptly change. This change in properties can be detected electronically. This provides a physical fixed point that can be used as a reference for comparison with the actual RTD temperature sensor.

For applications using SIP operations, where steam at 250°F (121°C) is used to sterilize equipment, one self-calibrating RTD uses a reference material with a Curie point of 244°F (118°C). When the SIP process reaches 244°F, the reference sensor in the device sends a signal. Simultaneously, the RTD measures the temperature. Comparison between these two values results in a calibration, identifying any errors in the temperature sensor. If both sensors read a value of about 244°F, the RTD sensor is still in calibration. If the measured deviation is outside set limits, the device sends an alarm or error message to the automation system, which is also displayed via a local LED.
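The comparison itself is simple enough to sketch in a few lines. The snippet below illustrates the logic only; the deviation limit is an assumed value, and actual devices implement this check in firmware.

```python
# A minimal sketch of the self-check logic described above: when the
# Curie-point reference fires, the device compares the RTD reading to the
# known reference temperature. The deviation limit is an assumed value.

CURIE_POINT_C = 118.0    # fixed physical reference point (244 deg F)
DEVIATION_LIMIT_C = 0.5  # assumed acceptance limit for the self-check

def self_check(rtd_reading_c: float) -> tuple[float, bool]:
    """Run at the instant the reference sensor detects the Curie point.

    Returns the measured deviation and whether the RTD is still in calibration.
    """
    deviation = rtd_reading_c - CURIE_POINT_C
    return deviation, abs(deviation) <= DEVIATION_LIMIT_C

deviation, in_calibration = self_check(rtd_reading_c=118.3)
if in_calibration:
    print(f"RTD in calibration (deviation {deviation:+.2f} deg C)")
else:
    print(f"ALARM: deviation {deviation:+.2f} deg C exceeds limit")
```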


FIGURE 2. A quarter-turn neck simplifies removing and reinstalling a temperature sensor for calibration.

Similar automatic, in-situ calibration could be accomplished for CIP processes with a reference material with a Curie point lower than typical CIP loop temperatures (e.g., a material with a Curie point at 131°F [55°C]).

The calibration data acquired is sent electronically and can be read using asset-management software or in the automation system. This also enables an auditable certificate of calibration to be created automatically. Most importantly, the material used as the reference point in the sensor is fully traceable to known international standards, making it a true calibration rather than simply a verification.
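As an illustration only, a record produced by such a self-calibration might resemble the following. The field names and values here are hypothetical, not any vendor's actual format.

```python
# Illustration only: the kind of auditable record an in-situ self-calibration
# might hand to asset-management software. Field names and values are
# hypothetical, not any vendor's actual format.

import json
from datetime import datetime, timezone

record = {
    "instrument_tag": "TT-4711",  # hypothetical instrument tag
    "timestamp_utc": datetime.now(timezone.utc).isoformat(),
    "reference_curie_point_c": 118.0,  # traceable physical fixed point
    "rtd_reading_c": 118.3,
    "deviation_c": 0.3,
    "deviation_limit_c": 0.5,
    "result": "pass",
}
print(json.dumps(record, indent=2))
```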

Calibration can be accomplished automatically during every CIP or SIP process. Unless a process runs continuously for six months or more without being cleaned, calibrations will occur on a fairly regular basis — perhaps every day.

Such frequent calibrations can help reduce the risk of bad batches and allow a plant to reduce the frequency of manual calibration intervals.

Eliminating Wiring Issues. As noted earlier, certain food-and-beverage regulations may require sensors to be calibrated on a regular basis, and this typically involves removing the sensor from the process and taking it to a laboratory. Most sensors require disconnecting the wires while removing the sensor and then reconnecting them after calibration. Such a procedure typically takes about 30 minutes. While the procedure is fairly simple, wiring errors can occur.

Another sensor technology development is RTD sensors that do not require disconnection of the wires when removing the sensor (figure 2). Instead, a technician twists the sensor a quarter turn to remove it and the reverse to reinstall it. Eliminating the need to disconnect and reconnect sensor wiring can cut calibration time.

Summary

FDA and other agencies require regular calibration of temperature sensors because accurate temperature measurement and control are critical in food-and-beverage processes. Most plants calibrate sensors every six months, but sensors can drift between calibrations.

Sensor technology developments make it possible for RTDs to calibrate themselves at the end of each batch during cleaning cycles. Another development eliminates the need to disconnect wires when removing sensors for calibration. Together, these developments can help plant operators optimize processes and mitigate risk.