Process-Informed Security: Why the Traditional Security Triad Fails in OT
Introduction
The CIA triad (Confidentiality, Integrity, Availability) tends to generate strong opinions, yet it continues to surface in discussions of cyber-physical risk. From the perspective of the process industries—refining, chemical plants, pipelines, and offshore facilities—it is worth revisiting with care.
The CIA triad was developed for enterprise IT in the 1980s, yet it continues to appear in industrial cybersecurity conversations—often promoted by engineers with an OT background who adopted it into process automation contexts. While this may have been done with the intent of aligning with established security frameworks, it has led to a mismatch between the model's original purpose and the realities of industrial operations. Some suggest extending it with additional concepts like Safety, Reliability, or Resilience. While these suggestions acknowledge valid concerns, they also obscure the more fundamental issue: that industrial environments depend on tightly coupled control between digital and physical domains. Effective security must therefore reflect how these systems actually operate in practice — where automated control logic, real-time feedback, and equipment coordination are tightly linked to physical outcomes. Ensuring safety and reliability requires safeguarding the core operational functions — those that allow the system to detect relevant physical conditions, apply correct control responses, and maintain operation within its intended parameters.
Why the Triad Falls Short in Industrial Contexts
Traditionally, process safety has focused on random hardware or component failures, with risks mitigated through redundancy and probabilistic design. However, cyber-physical attacks introduce intentional and coordinated manipulations of automation components, which traditional safety models did not account for. This has led to a false separation between OT security and process safety. In reality, industrial automation is an integral part of the physical process, and failures of control and safety functions—whether due to malfunction or malicious interference—are inherently process safety concerns. Therefore, process safety and cybersecurity should be addressed as an integrated discipline. A cyber-physical attack doesn’t just expose data as in IT—it can trigger unsafe states, damage equipment, or halt production by deliberately bypassing controls or executing multiple failures in a specific sequence.
Shifting from IT to OT Mindset
In enterprise IT, data confidentiality is often paramount, and risks are primarily information-centric. In contrast, industrial systems operate at the intersection of software and physical action. Here, the automation system doesn't just support the process — it is part of it. A compromise doesn’t just lead to data loss or system downtime; it can result in incorrect actuation, unsafe physical states, or damage to equipment. A breach of controllability, observability, or operability doesn't just degrade performance — it threatens the integrity and safety of the process itself. Recognizing this distinction is key to moving from an IT-centric model to a cyber-physical risk perspective.
Dr. Marina Krotofil’s Approach
In her 2015 research presentation "Damn Vulnerable Chemical Process," Dr. Marina Krotofil illustrated how attackers can exploit structural weaknesses in the controllability, observability, and operability of industrial systems. These aspects are not abstract properties — they define the practical ability to run, monitor, and influence a physical process within safe limits. Her work showed that when these functional dependencies are compromised, attackers can cause real-world damage without needing to break traditional IT security models.
Rather than extending the CIA triad, Krotofil's approach highlights the need to align cybersecurity with the core operational mechanisms of industrial systems. By focusing on controllability, observability, and operability, we move the conversation toward physical impact and process stability — ensuring protection of what ultimately matters: safety, continuity, and plant integrity.
The Origin and Purpose of the CIA Triad — and Its Misapplication in Industrial Contexts
The CIA triad was developed to guide security practices in enterprise IT by focusing on three fundamental properties:

- Confidentiality: preventing unauthorized disclosure of information
- Integrity: ensuring information is accurate and has not been improperly modified
- Availability: ensuring information and systems are accessible when needed

Together, these principles aim to maintain trust in digital information and the systems that manage it — particularly in environments where the primary asset is data itself.
This model was developed for protecting information in enterprise environments, not for safeguarding control over physical processes. In operational technology (OT), the automation system is not merely an overlay — it is part of the process. Control functions operate valves, pumps, compressors, furnaces, and reactors in real time, translating software logic directly into physical actions.
In industrial settings, control systems depend on a combination of data, algorithms, and procedures that represent and influence the physical state of the process — including pressures, levels, temperatures, flows, and batch operations. Any deviation — whether due to malfunction, misconfiguration, or a cyber attack — can alter automated system behavior or mislead operators, resulting in unsafe or unintended actions.
Because control logic and sensor input are so tightly bound to physical reality, even subtle manipulations can drive the process out of its intended range. The separation between the digital and physical is nearly nonexistent; manipulating the information is effectively manipulating the system. The core objective remains the same: to maintain safe and stable operation within defined limits. In this context, process safety, operability, and equipment protection depend on the integrity and availability of the system’s control and information functions — not for their own sake, but because they anchor safe physical performance.
Applying the CIA model without adaptation can result in focusing on abstract properties rather than the functions that directly govern process safety and operational behavior.
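The claim that manipulating the information is effectively manipulating the system can be made concrete with a toy example. The sketch below (all numbers, gains, and names are hypothetical, not from any real plant) shows a proportional level controller: biasing the sensor reading ten points low shifts the true steady-state level to 60% while the value the operator sees sits exactly on the 50% setpoint.

```python
# Minimal, illustrative sketch: a proportional level controller whose
# sensor reading is biased by an attacker. Hypothetical values throughout.

def simulate(steps: int, sensor_bias: float = 0.0) -> tuple[float, float]:
    """Return (true_level, reported_level) after `steps` control cycles."""
    setpoint = 50.0      # desired tank level (%)
    level = 40.0         # true physical level (%)
    kp = 0.5             # proportional gain
    for _ in range(steps):
        reported = level + sensor_bias   # what the controller "sees"
        error = setpoint - reported
        level += kp * error              # actuation changes the real level
    return level, level + sensor_bias

true_ok, seen_ok = simulate(200)                        # honest sensor
true_bad, seen_bad = simulate(200, sensor_bias=-10.0)   # spoofed 10 points low

print(f"honest:  true={true_ok:.1f}  reported={seen_ok:.1f}")
print(f"spoofed: true={true_bad:.1f}  reported={seen_bad:.1f}")
```

The controller faithfully regulates what it sees; because the feedback signal is its only link to physical reality, corrupting that signal silently moves the physical steady state.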
IT vs. Industrial Systems: The Essential Difference
This contrast highlights why information-centric security approaches can fail in cyber-physical systems: they ignore that data and physical action are inseparable. A corrupted value is not just incorrect information — it's a distorted view of the physical process, and often, a trigger for unsafe or unintended behavior.
Why Extending the Triad Doesn’t Help
To fix this mismatch, some propose adding new terms such as Safety, Reliability, or Resilience to the triad. However, this introduces unnecessary complexity. All of these concepts already hinge on core process control attributes: observability, controllability, and operability. Adding them to the CIA triad may appear helpful, but it risks diluting the focus needed to manage physical process risk effectively.
Reframing CIA Using Process Control Fundamentals
Building on a process-aware security view, we could map CIA to process-control equivalents:
| CIA Property | Control System View |
| --- | --- |
| Confidentiality | Protection of sensitive operational or business information from unauthorized access — relevant when disclosure has potential physical, financial, or regulatory consequences in specific sectors |
| Integrity | Confidence that process data, control signals, and actuation faithfully represent and enforce the intended physical behavior |
| Availability | Ability to observe and intervene in real time |
However, it’s more effective to frame industrial security through the lens of cyber-physical risk, centered on controllability, observability, and operability.
These functional capabilities define whether the system can operate safely, respond to deviations, and recover within design constraints. In cyber-physical systems, there is virtually no separation between control information and physical behavior — a compromised input or instruction is effectively a compromised process action. Understanding how controllability, observability, and operability can be disrupted is essential for identifying how cyber intrusions can lead to real-world consequences: unsafe states, production loss, or equipment damage.
Framed this way, controllability, observability, and operability form a functional security triad for industrial systems — grounded not in information theory, but in physical process control. Each of these dimensions represents a critical functional dependency that supports safe and correct operation of industrial systems:

- Controllability: the ability to drive the process to its intended state through actuators and control logic
- Observability: the ability to determine the true state of the process from trusted measurements
- Operability: the ability to keep the process running within its design envelope, with required utilities and equipment constraints intact
As with CIA, these dimensions are interdependent. But unlike CIA, which is grounded in abstract information properties, the COO triad is based on how industrial processes actually function. It reflects how systems act, react, and stay within safe boundaries in the physical world.
Functional Losses as the Real Security Hazards
When viewed from a process control perspective, cyber-physical threats manifest as disruptions to the system’s ability to maintain controllability, operability, or observability — whether through manipulation of digital inputs or through effects that degrade the system’s physical function.
These failures directly lead to unsafe states, production loss, or equipment damage. They also resonate with operations and safety teams, who can more clearly see how a cyber intrusion translates to real-world risks.
Example Attack Scenario: Consider an attacker who gains remote access to a water treatment plant’s PLC. By spoofing sensor data (loss of observability), they hide the actual chemical concentrations while simulating underdosing. Meanwhile, they override the chemical dosing control logic (loss of controllability), causing excessive chemical addition. Operators, unaware of the real chemical state, continue normal operations. This compromises water quality, posing a direct public health risk to the community. In addition, prolonged overdosing may lead to corrosion and degradation of tanks and piping. The issue isn’t unauthorized access to sensitive information — it’s the use of that access to manipulate data and control functions, ultimately disrupting the physical process.
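The scenario above can be sketched as a toy simulation. This is a deliberately simplified model with hypothetical dosing dynamics and limits, not a real water-treatment process: the attacker pins the dosing pump at maximum (loss of controllability) while replaying a plausible "underdosing" value to the HMI (loss of observability), so the true concentration climbs past a safe limit while the display stays reassuring.

```python
# Illustrative sketch of the dual COO compromise: forced actuation plus
# spoofed indication. All constants are assumed, not from a real plant.

SAFE_LIMIT = 4.0   # max safe chemical concentration (mg/L), assumed

def run_plant(cycles: int, compromised: bool):
    concentration = 2.0          # true concentration (mg/L)
    setpoint = 2.0
    history = []                 # (true_concentration, hmi_reading) per cycle
    for _ in range(cycles):
        if compromised:
            dose = 1.0                       # pump forced to 100%
            hmi_reading = 1.5                # replayed "underdosing" value
        else:
            error = setpoint - concentration
            dose = max(0.0, 0.5 * error)     # normal dosing control
            hmi_reading = concentration
        concentration += dose * 0.8          # dosing raises concentration
        concentration *= 0.9                 # decay/consumption each cycle
        history.append((concentration, hmi_reading))
    return history

normal = run_plant(50, compromised=False)
attack = run_plant(50, compromised=True)
print("normal final (true, HMI):", normal[-1])
print("attack final (true, HMI):", attack[-1])
print("unsafe during attack?", any(t > SAFE_LIMIT for t, _ in attack))
```

Under normal control the concentration settles well below the limit; under attack it converges toward roughly 7.2 mg/L while the HMI reports 1.5, which is exactly the observability-plus-controllability failure the scenario describes.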
Applying the COO Model to Industrial Risk Practice
1. Map Process Functions to COO Requirements
Action: Identify critical process functions that affect safety or production (e.g., reactor temperature control, compressor surge control, pipeline pressure regulation). For each function, outline your needs for:

- Controllability: the actuators and control actions that must remain available and trustworthy
- Observability: the measurements that must remain accurate and timely for operators and control logic
- Operability: the operating ranges, utilities, and equipment constraints that must be preserved
Benefit: Establishes a functional baseline for the system, making it easier to pinpoint vulnerabilities most likely to cause unsafe events.
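One way to capture this mapping is a simple structured inventory. The sketch below is a minimal illustration; the class shape, function names, and requirement wording are hypothetical examples, not a prescribed schema.

```python
# Hypothetical sketch of a COO requirements baseline for critical
# process functions. Entries are illustrative examples only.

from dataclasses import dataclass

@dataclass
class COORequirement:
    function: str          # critical process function
    controllability: str   # what must remain commandable
    observability: str     # what must remain measurable and trustworthy
    operability: str       # what operating envelope must remain achievable

baseline = [
    COORequirement(
        function="Reactor temperature control",
        controllability="Coolant valve must respond to controller output",
        observability="Redundant, trusted temperature measurements",
        operability="Cooling capacity sufficient across the load range",
    ),
    COORequirement(
        function="Pipeline pressure regulation",
        controllability="Pressure control valve and pump trips commandable",
        observability="Pressure transmitters cross-validated",
        operability="Relief and surge capacity within design limits",
    ),
]

for req in baseline:
    print(f"{req.function}:")
    print(f"  C: {req.controllability}")
    print(f"  O: {req.observability}")
    print(f"  Op: {req.operability}")
```

Even a flat list like this gives security and operations teams a shared artifact: each row states what must keep working, which is the functional baseline the step describes.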
2. Integrate Cybersecurity with Process Hazards Analysis
Action: Merge cybersecurity considerations into safety assessments (e.g., HAZOP, LOPA, Bowtie). During a HAZOP review, ask whether each deviation could be induced deliberately, whether an attacker could mask it from operators, and whether the safeguards credited against it could themselves be defeated.
Benefit: Ensures safety and security teams jointly assess avenues to unsafe conditions, including malicious manipulations that defeat traditional safeguards.
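A minimal sketch of what this integration could look like in a HAZOP worksheet: the deviation record is extended with three cyber-causation flags. The field names and the example entry are illustrative assumptions, not taken from any standard HAZOP tool.

```python
# Hypothetical extension of a HAZOP deviation record with cyber flags.

from dataclasses import dataclass, field

@dataclass
class Deviation:
    node: str
    parameter: str                      # e.g. "pressure", "flow"
    guide_word: str                     # e.g. "more", "less", "reverse"
    causes: list[str] = field(default_factory=list)
    cyber_causable: bool = False        # could an attacker induce this deviation?
    cyber_maskable: bool = False        # could an attacker hide it from operators?
    safeguard_defeatable: bool = False  # could the credited safeguard be defeated?

dev = Deviation(
    node="Separator V-101",
    parameter="pressure",
    guide_word="more",
    causes=["blocked outlet", "control valve fails closed"],
    cyber_causable=True,        # e.g. setpoint or output manipulation
    cyber_maskable=True,        # e.g. spoofed pressure transmitter reading
    safeguard_defeatable=True,  # e.g. networked safety logic could be altered
)

# Flag deviations that safety and security teams should review jointly
needs_joint_review = dev.cyber_causable and (
    dev.cyber_maskable or dev.safeguard_defeatable
)
print("joint review required:", needs_joint_review)
```

The point of the flags is procedural: any deviation that an attacker could both cause and hide (or whose safeguard can be defeated) is exactly the case a purely probabilistic safety review would miss.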
3. Perform a “COO-Focused” Threat Model
Action: Identify threats or attack vectors explicitly targeting each COO element:

- Modifying or disabling required process sequences (e.g., start-up, shutdown, or batch operations), causing unsafe transitions or blocking normal production.
- Reconfiguring equipment operating windows (pressure, temperature, level, and flow ranges) or mechanical constraints, pushing the process outside design boundaries and risking damage or unsafe states.
- Restricting essential utilities (cooling water, instrument air, etc.) or interfering with critical support systems, forcing partial operation or unplanned shutdowns.
Benefit: Highlights the highest-risk targets threatening key process functions, guiding targeted mitigation efforts.
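The threat list above can be organized per COO element and ranked so the highest-risk targets surface first. The sketch below uses an assumed impact-times-likelihood score and illustrative entries; both the scoring scheme and the numbers are assumptions for demonstration.

```python
# Illustrative COO-focused threat ranking. Vectors, scores, and the
# simple impact * likelihood formula are all assumed for the sketch.

threats = [
    {"vector": "Modify start-up/shutdown/batch sequence logic",
     "coo": "controllability", "impact": 5, "likelihood": 2},
    {"vector": "Reconfigure pressure/temperature operating windows",
     "coo": "operability", "impact": 4, "likelihood": 3},
    {"vector": "Spoof or suppress critical sensor readings",
     "coo": "observability", "impact": 5, "likelihood": 3},
    {"vector": "Restrict cooling water / instrument air supply",
     "coo": "operability", "impact": 3, "likelihood": 2},
]

for t in threats:
    t["risk"] = t["impact"] * t["likelihood"]   # simple assumed risk score

ranked = sorted(threats, key=lambda t: t["risk"], reverse=True)
for t in ranked:
    print(f"{t['risk']:>2}  [{t['coo']}] {t['vector']}")
```

Tagging each vector with the COO element it degrades keeps the ranking anchored to process functions rather than to assets or data, which is the shift this step argues for.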
From CIA Triad Thinking to COO Triad Thinking
The CIA triad reflects priorities from enterprise IT, where data protection is paramount. In contrast, the **COO triad** (Controllability, Operability, and Observability) is anchored in the functional realities of process automation and cyber-physical risk. In industrial cybersecurity, the goal isn't just to prevent data breaches but to prevent unsafe process conditions and ensure the continuity of operations—core objectives for process safety and production integrity.
Rather than adding buzzwords to CIA, the COO triad re-centers security on the fundamental conditions for safe, stable operation of industrial processes. Each dimension can be compromised individually, yet all three must work together to maintain process safety, production continuity, and equipment integrity.
This framework clarifies why OT security and process safety must converge. Control systems are inseparable from physical operations, and a failure in control—whether accidental or malicious—can quickly become a process safety incident. Addressing cyber-physical risk thus requires treating OT security and process safety as a unified discipline.
Adopting the COO triad is more than just changing terminology; it represents a shift to align cybersecurity with the engineering principles that shape industrial process control and safety. By emphasizing controllability, operability, and observability, we protect systems according to the same logic used in their design—maintaining stable operation within intended limits and preventing failures that could endanger people, equipment, and the environment. In essence, securing cyber-physical systems means ensuring they function and perform exactly as designed.