The Air Gap Was Always a Fiction
The first time someone told me a power substation was air-gapped, I believed them.
The second time, I asked how the protection-relay firmware got updated. The answer involved a vendor engineer, a laptop, a USB stick, and a quiet drive to a yard somewhere in the East of England. I stopped believing.
What I didn’t realise was that I was thirty years late to the party. The air gap had been mostly fictional since the first Windows server got plugged into a control network. The architectural model that gave us the air gap in the first place — the Purdue Reference Model, drawn in the early 1990s — already had an asterisk against it by the early 2000s. Stuxnet just made the asterisk visible.
The interesting question now isn’t whether IT and OT are converging. They converged years ago. The interesting question is what the attackers found when they got through — and what the protocols they found on the other side were never designed to defend against.
A model that assumed the wires were air
In the early 1990s, Theodore J. Williams at Purdue published the Purdue Enterprise Reference Architecture (PERA). Buried inside it was a layered diagram of the manufacturing plant — Levels 0 through 4, from the physical instrumentation at the bottom to the corporate ERP at the top.
The reason the model caught on isn’t that the layering was novel. It’s that the gaps between the layers were assumed to be physical. Level 4 was an Ethernet LAN with TCP/IP. Levels 0-3 were serial cables, proprietary fieldbuses, Modbus over RS-485, and Allen-Bradley DH+. They didn’t speak the same protocol; they couldn’t speak the same protocol. The boundary between IT and OT was a transceiver mismatch, not a firewall rule.
This was fine — for about ten years.
The first leak: OPC and the Windows server
The trouble started when plant operators wanted Windows-based HMIs. Then Windows-based historians. Then a way for the corporate office to read production data without walking down to the plant.
The bridge that emerged was OPC Classic — the OLE for Process Control specification, which standardised how Windows applications could read industrial process data. By the early 2000s OPC was everywhere: plant historians, ERP integration, mobile dashboards, all of it.
OPC Classic had one notorious problem: it was built on Microsoft DCOM, which needed port 135 plus a dynamic range of 1024-65535 to traverse a firewall. In practice, most sites just left the firewall off between the historian and the control network. The “air gap” had become a Cisco ACL with a comment that said temporary, will tighten later.
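To make the scale of the problem concrete, here is a small sketch (the function name is illustrative, not from any firewall product) of what an allowlist for unconstrained OPC Classic traffic actually has to contain — the endpoint mapper on 135 plus the default DCOM dynamic range:

```python
# Illustrative only: the TCP ports a firewall must permit for
# unconstrained OPC Classic (DCOM) -- the endpoint mapper on 135
# plus the default dynamic range DCOM allocates server ports from.
def opc_classic_allowed_ports():
    return {135} | set(range(1024, 65536))

allowed = opc_classic_allowed_ports()
total = 65535  # usable TCP ports 1..65535
print(f"{len(allowed)} of {total} TCP ports open "
      f"({100 * len(allowed) / total:.1f}%)")
# A 'firewall rule' like this permits roughly 98% of all TCP ports --
# which is why so many sites simply turned the firewall off instead.
```

DCOM could be pinned to a narrower static range via registry configuration, but few plants did it; the path of least resistance was the rule above, or no rule at all.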
The Purdue diagram quietly added a new level — Level 3.5, the industrial DMZ. Historian on the IT side, replicas on the OT side, a firewall in between, and a story you could tell auditors about the boundary. Whether it was actually a boundary depended on who configured the firewall.
Stuxnet: the air gap was already fiction
In June 2010, an antivirus engineer at the Belarusian firm VirusBlokAda flagged a piece of malware that had been quietly working its way through Iranian industrial systems for at least a year. The malware was Stuxnet, and what it did was unprecedented — four Windows zero-days in a single payload, stolen code-signing certificates, the first publicly known PLC rootkit, and a ladder-logic substitution that drove uranium-enrichment centrifuges to destruction while reporting nominal speeds to the control room.
Stuxnet ended the air-gap conversation. Natanz, the Iranian enrichment facility, was as close to a textbook air-gapped industrial environment as existed anywhere. There were no IP routes from the corporate network to the centrifuge controllers. Stuxnet got in anyway, on a USB stick carried by a maintenance contractor — exactly the sort of route I’d been told didn’t exist in East Anglia.
Six years later, Industroyer (also called CRASHOVERRIDE) hit the Pivnichna substation outside Kyiv on 17 December 2016 and showed that the same playbook worked on power-grid kit. Industroyer didn’t need a USB stick — it spoke IEC 60870-5-104 and IEC 61850 natively. The protocol stack the substation was designed to speak became the attack surface. Industroyer2 followed in April 2022, tailored to one specific Ukrainian transmission substation; a joint CERT-UA and ESET response caught it before execution, but the playbook had clearly been refined.
CHERNOVITE’s PIPEDREAM toolkit, disclosed in CISA advisory AA22-103A in April 2022, was the first publicly attributed pre-deployment ICS toolkit — generic across multiple PLC vendors.
In April 2026, Forescout’s Vedere Labs disclosed BRIDGE:BREAK — twenty-two vulnerabilities across Lantronix and Silex serial-to-IP converters, with nearly 20,000 devices exposed on the public internet. These are the devices that bridge legacy serial instruments onto Ethernet — the exact Purdue Level 1-to-2 boundary. The flaws include authentication bypass, firmware tampering, and remote code execution. A compromised converter can alter sensor data in transit, suppress or forge commands, and provide lateral movement into the control network. The air gap didn’t fail at the firewall. It failed at the cable adapter.
The pattern by the mid-2020s was clear. Air gaps don’t fail catastrophically. They fail quietly, at the seams, where a contractor’s laptop or a vendor’s remote-support tunnel or a cloud-hosted historian dissolves the boundary just enough for an attacker to walk through. The audit report still says air-gapped. The network diagram still has the dashed line in the middle. Reality has moved on.
Virtualisation, cloud, and the line that vanished entirely
If Stuxnet was the moment the air gap became visibly fictional, the 2015-2020 period was when it stopped being even rhetorically useful.
EMS and ADMS platforms in control centres had been moving to virtualised infrastructure since the early 2010s. Plant historians shifted to cloud — AWS IoT Core launched in October 2015, Azure IoT Hub in February 2016, and the major DCS vendors all built cloud-connected variants. COVID accelerated remote vendor access from “occasionally” to “permanently”. RTUs that had previously been physical appliances at the substation edge started being deployed as virtual machines in the same control-centre cluster as the EMS that consumed their data.
By 2024, a typical modernised utility ran something like this: Sampled Values and trip GOOSE on the substation LAN with PRP redundancy; a virtualised RTU and OT-IDS sensor on the same vSphere or KVM cluster; DNP3 or IEC 60870-5-104 northbound to a virtualised SCADA front-end in a regional data centre; the historian replicating to cloud for analytics; remote vendor access via a jump host with MFA and session recording. Six different “boundaries”, none of them physical.
NERC took until 2025 to formally update CIP-005, CIP-007, and CIP-010 to acknowledge virtualised cyber assets — FERC Order No. 919, with an enforcement deadline of 1 April 2028. For a decade the standards had told North American utilities that virtualisation was either prohibited or in a regulatory grey area. The standards were the last to admit what the architectures had already done.
The UK regulator Ofgem, enforcing the NIS Regulations 2018, got there earlier in spirit — the NIS regime is principles-based and does not need to enumerate every architectural pattern — but is now hardening its expectations through the Cyber Security and Resilience Bill working its way through Parliament.
What the attackers found on the other side
The air gap dissolved. The protocols on the other side of it were never designed to survive in a world where it didn’t exist.
DNP3, IEC 60870-5-104, and even the GOOSE and Sampled Values multicast on the substation LAN were all designed without authentication or encryption. A DNP3 command to open a breaker looks exactly the same whether it comes from the control-centre SCADA system or from a compromised laptop on the same network. There is no signature, no challenge, no session token. The protocol trusts the wire, and the wire was supposed to be air-gapped.
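The point is easiest to see by constructing one of these commands by hand. Here is a minimal sketch of an IEC 60870-5-104 I-format APDU carrying a single command (type C_SC_NA_1), assuming standard 104 framing; the common address, information object address, and sequence numbers are illustrative. Note that every byte is plain structure — there is nowhere in the frame for a signature or session token to go:

```python
import struct

def iec104_single_command(common_addr, ioa, close=False,
                          send_seq=0, recv_seq=0):
    """Build an IEC 60870-5-104 I-format APDU carrying C_SC_NA_1
    (type ID 45, single command) with cause of transmission
    'activation'. No field authenticates the sender."""
    sco = 0x01 if close else 0x00        # SCS bit: 1 = close/on, 0 = open/off
    asdu = struct.pack(
        "<BBBBH",
        45,           # type ID: C_SC_NA_1, single command
        0x01,         # VSQ: one information object
        0x06, 0x00,   # COT: activation, originator address 0
        common_addr,  # common (station) address
    ) + ioa.to_bytes(3, "little") + bytes([sco])
    control = struct.pack("<HH", send_seq << 1, recv_seq << 1)
    apdu = control + asdu
    return bytes([0x68, len(apdu)]) + apdu  # start byte + APDU length

# A breaker-open command: 16 bytes on the wire, identical regardless
# of whether a SCADA master or an intruder's laptop produced it.
frame = iec104_single_command(common_addr=1, ioa=5000, close=False)
print(frame.hex())
```

Anything that can reach the RTU’s TCP port and maintain the 104 sequence numbers can emit this frame; the receiving device has no protocol-level way to tell the two cases apart.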
Underneath all of them, on the auxiliary side — battery monitors, HVAC, metering — there’s almost always Modbus, which has the same plain-text problem with even less of a security retrofit available.
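Modbus is even starker. A sketch of a Modbus/TCP “Write Single Coil” request, per the published framing (transaction ID, unit ID, and coil address here are illustrative):

```python
import struct

def modbus_write_coil(transaction_id, unit_id, coil_addr, on):
    """Build a Modbus/TCP 'Write Single Coil' request (function 0x05).
    MBAP header + PDU; as with DNP3 and IEC 104, nothing in the frame
    authenticates the sender -- any host that can reach TCP/502 on the
    device can issue it."""
    value = 0xFF00 if on else 0x0000     # per spec: FF00 = on, 0000 = off
    pdu = struct.pack(">BHH", 0x05, coil_addr, value)
    mbap = struct.pack(">HHHB",
                       transaction_id,
                       0x0000,           # protocol ID 0 = Modbus
                       len(pdu) + 1,     # length field counts the unit ID
                       unit_id)
    return mbap + pdu

req = modbus_write_coil(transaction_id=1, unit_id=1,
                        coil_addr=0x0013, on=True)
print(req.hex())  # 12 bytes, plain text, readable by anyone on the wire
```

Twelve bytes, no handshake beyond TCP, and the auxiliary systems on the receiving end will do exactly what the frame says.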
Dragos’s analysis of ELECTRUM — the activity group behind Industroyer and Industroyer2 — is the public reference for what a targeted attack on substation control looks like. The attacker didn’t exploit a bug in the protocol. The attacker spoke the protocol. Industroyer issued legitimate IEC 60870-5-104 commands to open breakers. The commands were syntactically correct, properly formatted, and the RTUs obeyed them — because nothing in the protocol stack could distinguish a legitimate operator command from an attacker’s command originating from the same network segment.
That’s the threat model the industry is now working backwards from. Not a zero-day in a PLC. Not a supply-chain implant in a firmware update. Just a plain-text protocol that does what it’s told, spoken by someone who shouldn’t be on the network but is.
Layered on top of the protocol-security retrofit is OT-specific intrusion detection — Dragos, Claroty, and Nozomi all sell passive sensors that deep-parse IEC 61850, DNP3, IEC 60870-5-104, Modbus, and the rest, build asset inventories from observed traffic, and detect both anomalies and the tradecraft of named threat groups. The sensor itself ships as a VM, which slots into the same vSphere or KVM cluster as the virtualised RTUs it’s monitoring. The detection is necessary because the protocols can’t protect themselves.
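The core idea behind that passive detection can be sketched in a few lines. Because the protocol carries no authenticity, the sensor falls back on context — which hosts are expected to issue control commands. This toy example (the allowlist, addresses, and function name are illustrative, and this is emphatically not any vendor’s implementation) flags IEC 60870-5-104 control ASDUs arriving from an unexpected source:

```python
# Toy sketch: allowlist-based detection of 104 control traffic.
# The address below is a hypothetical SCADA front-end, not a real one.
AUTHORISED_MASTERS = {"10.20.1.5"}

def inspect_iec104(src_ip, apdu):
    """Flag IEC 60870-5-104 control ASDUs from unexpected sources.
    Type IDs 45-51 cover the command family (single command, double
    command, regulating step, setpoints)."""
    if len(apdu) < 7 or apdu[0] != 0x68:
        return None                  # not a well-formed 104 frame
    if apdu[2] & 0x01:
        return None                  # S- or U-format frame: no ASDU inside
    type_id = apdu[6]                # first ASDU octet is the type ID
    if 45 <= type_id <= 51 and src_ip not in AUTHORISED_MASTERS:
        return f"ALERT: control ASDU (type {type_id}) from {src_ip}"
    return None

# A single-command frame (type 45) from an unknown host trips the alert.
frame = bytes([0x68, 0x0e, 0, 0, 0, 0, 45, 1, 6, 0, 1, 0,
               0x88, 0x13, 0, 0])
print(inspect_iec104("192.168.7.99", frame))
print(inspect_iec104("10.20.1.5", frame))  # authorised master: no alert
```

Real sensors do vastly more — full protocol state tracking, asset inventory, behavioural baselines, threat-group analytics — but the asymmetry is the same: the detection logic supplies the context the protocol never carried.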
The boundary that needs to be drawn deliberately
The air gap was always a fiction. We told ourselves it was real because we needed the boundary to exist for the architecture to make sense, and because nobody wanted to do the engineering work of designing a boundary that would survive contact with reality. Stuxnet, Industroyer, Colonial Pipeline, and a hundred quieter incidents in between have made the fiction harder to sustain.
What’s replacing it isn’t another air gap. It’s a deliberately engineered boundary — drawn in standards documents, enforced in software, and audited by humans. The standard the industry has settled on for drawing that boundary is IEC 62443, and the standard for bolting cryptography onto the protocols that were never designed to have it is IEC 62351.
The next post is about both — how 62443’s zones and conduits replace the air gap as the unit of security architecture, how 62351’s protocol-security mechanisms give those conduits something to be made of, and what happens when you actually try to apply them to a real-shaped substation.
The vendor engineer is still going to drive to a yard somewhere in the East of England with a laptop. The question is whether the laptop is in its own zone, what conduit it traverses, and what’s logged when it does.
References
Architectural foundations
- Theodore J. Williams — The Purdue Enterprise Reference Architecture (Computers in Industry, 1994)
- ISA-95 / IEC 62264 — Enterprise-control system integration
Stuxnet and ICS attacks
- Symantec — W32.Stuxnet Dossier (Falliere, O Murchu, Chien, 2011)
- VirusBlokAda — initial Stuxnet disclosure, June 2010
- Dragos — CRASHOVERRIDE white paper
- CISA TA17-163A — CRASHOVERRIDE
- ESET — Industroyer2: Industroyer reloaded (12 April 2022)
- CISA AA22-103A — APT Cyber Tools Targeting ICS/SCADA Devices (PIPEDREAM)
Serial-to-IP converter vulnerabilities
- Forescout Vedere Labs — BRIDGE:BREAK (April 2026) — 22 flaws in Lantronix and Silex serial-to-Ethernet converters
OPC and IT/OT integration
- OPC Foundation
- OPC Classic firewall guidance — DCOM port 135 plus dynamic 1024-65535