Terms such as the Internet of Things (IoT), the Industrial Internet of Things (IIoT), Big Data and Industry 4.0 are bandied about in industrial contexts. Implementing these initiatives promises benefits that improve business outcomes through new insights on production results.
Opportunities for an integrated plant are driven by a new generation of wired and wireless sensors. Such sensors enable generating, gathering and storing data economically, and in quantities never previously available. The data then can be sent to process control and monitoring systems via plant networks or through the internet.
Organizations implementing IIoT solutions can improve real-time control and help plant engineers and operators make better decisions regarding operations and maintenance. Data gathered also can be directed to data-analytics software, which plant personnel can exploit to find additional information to increase efficiency, diagnose equipment problems and improve safety (figure 1).
The three most common ways to deploy such a system in an industrial plant or facility are brownfield, greenfield and “servicization.” All three scenarios can be implemented independently, or they can coexist within the same plant or facility.
- Brownfield deployments add new sensors to existing control or plant networks. Common brownfield scenarios include adding a wireless network and sensors to expand operator visibility and asset-monitoring capabilities.
- Greenfield scenarios are deployments in plants or facilities just coming online with IIoT projects. Typically, they are associated with cloud-based monitoring systems because the project is not designed around an on-premises control and monitoring system.
- Servicization involves a supplied asset and remote monitoring and services provided by a vendor. Vendors of heat exchangers, compressors, pumps, valves and other asset types are introducing subscription services whereby their equipment is installed on customer premises and the supplier provides ongoing monitoring of the equipment.
Regardless of the deployment method, these IIoT solutions collect, integrate and organize data for engineers and scientists to analyze and improve production output. Possible outcomes include improved maintenance programs for higher reliability at lower cost, optimized resource usage and predictive analytics on future outcomes in time to address issues.
FIGURE 1. Data-analytics software can be used to improve production outcomes in process heating applications across a variety of process industries.
These and other benefits are why enthusiasm and growth in IIoT have been so strong over the past few years. Unfortunately, the rapid growth does not mean the industry has agreed on naming conventions. Instead, various terms are in circulation, including those used earlier in this article as well as others like digital transformation, smart manufacturing and the Fourth Industrial Revolution. Yet, regardless of what a connected, sensor-equipped and integrated factory is called, the competitive advantages enabled by access to accurate, data-driven insights on production are too important for many firms to ignore.
With an understanding of the implementation methods and the theoretical benefits of an IIoT strategy, the final step before deployment is reviewing use cases to ascertain actual benefits and draw parallels with your company’s plants and facilities. Much of the deployment language from software vendors is vague, referring to ambiguous platforms and analytics without specific context for either. This vagueness is sometimes deliberate, presenting a product as the solution to any type of data-analysis problem.
The unspecific language also can obscure what an IIoT implementation actually entails. The use of “platforms,” for example, can imply custom code development and consultants rather than an off-the-shelf solution. Likewise, “analytics” has taken on a thoroughly ambiguous meaning: Many products and offerings claim to enable or provide analytics, but there is no consensus on what the term means for specific applications.
Instead of relying on vague terminology, users should ask to see concrete examples where a supplier’s solutions have been applied to solve real-world problems. Practical examples demonstrate IIoT technologies in action.
Four use cases show how off-the-shelf solutions can be used to rapidly achieve insights in production environments. These use cases focus on demonstrating positive impacts to the bottom line via the application of data analytics.
Large Molecule Pharma Laboratories Analysis
Merrimack Pharmaceuticals was experiencing scale-up issues with a new upstream bioreactor process. As in many bioreactors, and in reactors used in chemical, refining and other industrial applications, measurement and control of temperature and heat sources are critical to operation.
Protein degradation, evident as a low molecular weight species, was appearing over time in the production bioreactor, resulting in a low concentration of the desired protein. The viable cell density and titer data suggested the culture was successful at the 1,000-liter (~264 gal) scale. In reality, the process needed modifications to achieve the desired final product concentration before scale-up could continue.
FIGURE 2. A historian (bottom right) is often used to store collected data, with data-analytics software (not shown) interfacing to the historian to provide insight.
In response, significant resources were quickly deployed to develop and test science-based hypotheses using the workhorses of the laboratory: multiple master cell bank vials and 3-liter and 100-liter bioreactors. The sum of the additional upstream data, downstream data and corresponding offline analytics data created a challenge typical of scale-ups: a manual, laborious spreadsheet-based approach that confounded the efforts of scientists and engineers to derive insight from their data.
To address these issues, the IIoT technology provided the following approach:
- Define the physical situation to determine the key physics involved.
- Identify the key variables and determine all the data streams: online time-series data such as O2 flow rates, glucose addition rates, acid or base addition rates, temperatures and pH, plus offline contextual data such as integrated viable cell density, titer and media component concentrations.
- Recognize how and where this key data is collected and stored.
- Leverage an effective data analysis, visualization and reporting application alongside laboratory-scale and pilot-scale experimentation using data-analytics software.
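A central part of the second step is pairing sparse offline measurements with the continuous online record. As a minimal sketch of that alignment, the following Python snippet attaches each offline sample to its nearest-in-time online reading. The variable names and sample values are invented for illustration; a commercial data-analytics package would perform this automatically against a historian.

```python
from bisect import bisect_left

def align_offline_to_online(online, offline):
    """For each offline sample, attach the nearest-in-time online reading.

    online:  list of (hour, value) tuples, sorted by hour
    offline: list of (hour, value) tuples
    Returns a list of (hour, offline_value, nearest_online_value).
    """
    times = [t for t, _ in online]
    aligned = []
    for t, off_val in offline:
        i = bisect_left(times, t)
        # pick whichever online neighbor is closer in time
        if i == 0:
            j = 0
        elif i == len(times):
            j = len(times) - 1
        else:
            j = i if times[i] - t < t - times[i - 1] else i - 1
        aligned.append((t, off_val, online[j][1]))
    return aligned

# Hypothetical bioreactor data: hourly pH (online) and periodic titer samples (offline)
ph = [(h, 7.0 - 0.01 * h) for h in range(0, 48)]
titer = [(6, 0.2), (18, 0.5), (30, 0.9), (42, 1.4)]

for t, g_per_l, ph_val in align_offline_to_online(ph, titer):
    print(f"hour {t:2d}: titer {g_per_l} g/L at pH {ph_val:.2f}")
```

Once offline and online values share a common time base, they can be plotted, filtered and modeled together, which is the capability the four-step approach depends on.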
The data-analytics software provided plant personnel with quick access to time-series features and contextual data by automatically connecting to historians and manufacturing systems. This made aggregation, management and modeling of the required data from disparate data sources and types a simple and expeditious process.
The data-analytics software also supported the required standard internal workflows for both analysis and communication of results. It included tools for sharing insights among colleagues and for export to other applications for reporting and dashboard creation. The software accessed time-series data from the Emerson DeltaV historian, along with data from multiple analytical devices (figure 2).
Through implementation of the data-analytics software, alongside appropriate investments in historians and databases, Merrimack Pharmaceuticals was able to avoid the manual, time-consuming data investigation and analysis typically required. Merrimack’s scientists and engineers could quickly assess what was happening at the cell level and the process level across multiple scales and operating conditions.
The data-analytics software yielded insights into these relationships by providing visualization of individual process variables. It also provided insights by utilizing an internal calculation engine to determine relationships such as cell-specific oxygen uptake rates — an important metric when comparing the micro-environment across equipment scales.
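The cell-specific oxygen uptake rate mentioned above is, by definition, the volumetric oxygen uptake rate divided by the viable cell density. A minimal sketch of the calculation follows; the input readings are hypothetical, not Merrimack's actual values.

```python
def specific_our(our_mmol_per_l_h, vcd_million_per_ml):
    """Cell-specific oxygen uptake rate, qO2, in pmol O2 per cell per hour.

    our_mmol_per_l_h:   volumetric oxygen uptake rate (mmol O2 / L / h)
    vcd_million_per_ml: viable cell density (1e6 cells / mL)

    Unit check: 1 mmol/L/h = 1e9 pmol/L/h, and 1e6 cells/mL = 1e9 cells/L,
    so the 1e9 factors cancel and qO2 = OUR / VCD numerically.
    """
    if vcd_million_per_ml <= 0:
        raise ValueError("viable cell density must be positive")
    return our_mmol_per_l_h / vcd_million_per_ml

# Hypothetical readings from two scales of the same process
print(specific_our(2.0, 4.0))   # 3-L bench reactor
print(specific_our(9.0, 18.0))  # 1000-L production reactor
```

If the two scales report similar qO2 values, the cellular micro-environments are comparable even though the volumetric rates differ, which is exactly why the metric is useful for scale-up comparisons.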
In the example outputs (figure 3), each of the required variables was visualized, and the resulting calculations were developed. All were performed within a single processing environment while leaving the original data untouched in its original location.
Using this newly implemented data strategy, additional work was done within the data-analytics software to review biological growth and productivity data while testing the remaining science-based hypotheses. Additional process parameters were investigated along with additional offline analytics data, giving plant personnel the power to assess the impact of several important parameters on the bioreactor process.
For Merrimack Pharmaceuticals, this approach to data management, analysis, visualization and reporting resulted in the following specific improvements:
- It uncovered important issues related to bioreactor scale-up, including quick comparisons of key process steps across multiple batches.
- It quickly calculated and compared durations of sensitive growth periods during an intensive cell-growth phase.
- It enabled analysis of related downstream process operations. For example, rapid evaluation of chromatography curves using pattern-recognition features allowed downstream processing data to be visualized easily alongside other process data.
- It provided an environment aligned with the need to evaluate multiple unit operations together in a single place such as bioreactors with downstream chromatography.
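As one illustration of the chromatography pattern recognition mentioned in the list above, a batch chromatogram can be scored against a reference curve with a simple similarity measure. This sketch uses Pearson correlation and invented absorbance values; commercial tools apply more sophisticated profile-search methods, but the idea of flagging curves that deviate from a known-good shape is the same.

```python
from math import sqrt

def pearson(a, b):
    """Pearson correlation between two equal-length signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sqrt(sum((x - ma) ** 2 for x in a))
    vb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

# Hypothetical UV-absorbance chromatograms sampled at the same elution volumes
reference = [0.0, 0.1, 0.6, 1.0, 0.5, 0.1, 0.0]
batch_ok  = [0.0, 0.12, 0.58, 0.97, 0.52, 0.11, 0.0]
batch_bad = [0.0, 0.4, 0.9, 0.3, 0.1, 0.05, 0.0]   # peak eluting too early

for name, curve in [("batch_ok", batch_ok), ("batch_bad", batch_bad)]:
    r = pearson(reference, curve)
    flag = "OK" if r > 0.95 else "REVIEW"
    print(f"{name}: r = {r:.3f} -> {flag}")
```

The 0.95 threshold is an assumption for illustration; in practice the cutoff would be set from historical good batches.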
FIGURE 3. In this example, the results from using software to perform the rapid data analysis and investigation are shown. The software outlines the course of action for a particular bioreactor process.
Power Plant Asset Optimization
Operators at one power plant knew the feedwater heaters tended to foul and lose efficiency in a predictable manner. Despite that, their efforts to quantify the process so maintenance could be optimized had never been successful.
An analysis of the boiler’s heat rate provided the information necessary to quantify the efficiency recovered by each cleaning. With that specific value in hand, operators can now optimize cleaning frequency based on the cost/benefit relationship.
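The cost/benefit arithmetic behind such a decision can be sketched in a few lines. Every number below (unit size, heat rates, fuel price, cleaning cost, operating hours) is hypothetical and chosen only to illustrate the calculation; none comes from the plant in this use case.

```python
def annual_fuel_penalty(output_kw, hr_clean, hr_fouled, fuel_cost_per_mmbtu,
                        hours_per_year=8000):
    """Extra fuel cost per year from running at a degraded heat rate.

    hr_clean / hr_fouled: net heat rate in Btu/kWh just after and just
    before feedwater-heater cleaning (illustrative values).
    """
    extra_btu_per_h = output_kw * (hr_fouled - hr_clean)   # Btu/h wasted to fouling
    extra_mmbtu = extra_btu_per_h * hours_per_year / 1e6   # MMBtu per year
    return extra_mmbtu * fuel_cost_per_mmbtu               # dollars per year

# Hypothetical 300 MW unit: fouling adds 100 Btu/kWh to the heat rate
penalty = annual_fuel_penalty(300_000, 9_500, 9_600, fuel_cost_per_mmbtu=3.0)
cleaning_cost = 250_000  # assumed cost of one cleaning outage
print(f"fuel penalty avoided: ${penalty:,.0f}/yr vs cleaning cost ${cleaning_cost:,}")
```

Comparing the avoided fuel penalty against the cost of a cleaning outage is what lets operators choose a cleaning interval rather than clean on a fixed schedule.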
Cement Production Energy Efficiency
Given the high energy intensity of cement manufacturing, one cement producer struggled to optimize all the variables related to feed rates, fuel consumption, heat recovery and environmental regulations using conventional tools.
The ability to handle data from numerous sources and make it comparable and predictive helped resolve tradeoffs when working with alternative fuels and feedstocks while still maintaining product quality and efficiency across varying production levels.
Metals Production Process Optimization
At a metals producer, heavily regulated and energy-intensive production steps such as smelting were difficult to characterize using conventional spreadsheets. Consequently, balancing multiple factors and making optimal tradeoffs among costs, environmental factors and energy efficiency was almost impossible.
The ability to analyze tradeoffs among multiple variables made it much easier to control the process based on which variable was the most critical at any given time. Adjusting the mix for evolving costs or new regulations was simplified by trying out various what-if scenarios.
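A what-if evaluation of this kind can be sketched briefly: score each candidate operating scenario on cost and emissions, discard those that violate a constraint, and keep the cheapest of the rest. The fuel properties, blend fractions and emissions limit below are invented for illustration; a real analysis would pull them from plant and permit data.

```python
def blend_metrics(fractions, fuels):
    """Cost ($/MMBtu) and CO2 (lb/MMBtu) of a fuel blend, weighted by energy fraction."""
    cost = sum(f * fuels[name]["cost"] for name, f in fractions.items())
    co2 = sum(f * fuels[name]["co2"] for name, f in fractions.items())
    return cost, co2

# Hypothetical fuel properties for an energy-intensive process
fuels = {
    "coal":        {"cost": 2.5, "co2": 210},
    "natural_gas": {"cost": 4.0, "co2": 117},
    "waste_fuel":  {"cost": 1.0, "co2": 180},
}

co2_limit = 190  # assumed permit limit, lb CO2 / MMBtu
scenarios = [
    {"coal": 1.0, "natural_gas": 0.0, "waste_fuel": 0.0},
    {"coal": 0.6, "natural_gas": 0.2, "waste_fuel": 0.2},
    {"coal": 0.3, "natural_gas": 0.3, "waste_fuel": 0.4},
]

feasible = []
for s in scenarios:
    cost, co2 = blend_metrics(s, fuels)
    ok = co2 <= co2_limit
    print(f"{s}: ${cost:.2f}/MMBtu, {co2:.0f} lb CO2/MMBtu {'OK' if ok else 'over limit'}")
    if ok:
        feasible.append((cost, s))

best_cost, best_blend = min(feasible, key=lambda t: t[0])
print("cheapest compliant blend:", best_blend)
```

Swapping in new fuel prices or a tighter emissions limit and rerunning the loop is the "what-if" exercise described above; data-analytics software does the same thing with live plant data instead of hand-entered scenarios.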
These use cases show how data analytics is no longer just another overused buzzword. Instead, it is being applied by process industries worldwide to improve production outcomes.