Post-Stuxnet market failures and socialisation of risks?  2.2.12

More than a year ago, we learned that Stuxnet would be a game changer. Indeed, no advisor in all things security failed to mention that the hack and blow-up of Iranian uranium enrichment facilities, allegedly originated by the U.S. and Israel (Langner), was a showcase of future attacks on our beloved infrastructures and industrial production sites. While one might argue that the transfer of the world’s production sites to China keeps the scare from going entirely wild, there are still some Industrial Control Systems implemented and running within, say, the EU or the U.S. With Stuxnet discussed ad nauseam both at security conferences and in global mainstream media, with policy awareness raised up to the level of the leaders of the universe, with calls for decisive policy responses on all policy levels, for cyber-defense programmes against prospective attacks in cyber-warfare (by parties other than the U.S. and Israel), and for national and international critical infrastructure protection programmes – with all that stuff one would assume that at least some of the most obvious steps had been taken.

And then you read an update by the commercial community of technical experts on Industrial Control Systems. According to their assessment, the ICS industry acts deaf and akin to the automotive industry in “Fight Club” (mentioned in the scene in which the automotive-industry white-collar insomniac protagonist meets Tyler Durden on the airplane): it’s cheaper to let systems go bust occasionally and pay for some clean-up than to preventively fix them. Industrial control systems are still highly buggy, as a group of ICS security researchers around the consultancy Digitalbond set out to showcase at their SCADA Security Scientific Symposium (S4). For experts in the field, this has been common knowledge for more than a decade.

The technical ICS geniuses at the S4 conference put all the blame on the vendors, such as Siemens, General Electric, Schneider Modicon, Rockwell Automation, SEL, or Koyo Automation. But is it that easy? My experience from general IT, admittedly not ICS, tells me that life is more complicated. Independent consultancies, which are not bound to specific vendors, certainly have no incentive to blame existing or prospective customers. More substantially, while there might be customers with inadequate security procedures out there, I highly doubt that knowledge about the notorious insecurity of a particular set of artefacts doesn’t exist somewhere in customer companies and doesn’t climb up the communication ladder to the CIOs or CSOs. If owners are not interested in getting their 20-year-old ICS fixed, a vendor interested in subsequent orders wouldn’t want to embarrass itself and its clients by being utterly explicit about the risks or the security hiccups of the installed base of legacy systems.

The financial sector and the nuclear industry serve as nice role models for dealing with, as we institutional-economics-infected researchers call it, negative externalities of societal or technical systems. For both system vendors and owners of such infrastructures, inactivity is a viable response to the publication of vulnerabilities. Why would you want to spend millions on hardening your chemical facilities against a rather hostile hack into their control systems? If shit hits the fan, writing off your production site and transferring the external costs to the public is probably the most economic approach. Just make sure that the downfall of one site doesn’t bring down the complete parent group, as with those TEPCO guys who failed to install proper economic firewalls inside their group. There are no columns or rows for the rhetoric of cyber-warfare in the Excel sheets on which executive boards of infrastructure owners rely in their decision making. The ongoing installation of insecure systems and components is certainly worrying.
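To make that spreadsheet logic concrete, here is a back-of-the-envelope sketch in Python. Every number in it is invented for the sake of the argument: as long as the owner’s exposure is capped at writing off the site, prevention loses against inaction in the private calculation, even though it would win hands down in the public one.

```python
# Back-of-the-envelope illustration of the externality argument above.
# Every figure here is invented purely for the sake of the example.

prevention_cost = 5_000_000        # hardening the control systems
incident_probability = 0.01        # chance of a serious compromise per year
private_loss = 200_000_000         # write-off of the site (owner's capped exposure)
social_loss = 5_000_000_000        # clean-up, outages, public damage borne by others

expected_private_loss = incident_probability * private_loss   # 2,000,000
expected_social_loss = incident_probability * social_loss     # 50,000,000

# The owner's spreadsheet: prevention (5m) costs more than the expected
# private loss (2m), so doing nothing looks "rational".
print(prevention_cost > expected_private_loss)   # True -> don't invest

# Society's spreadsheet: prevention (5m) is far cheaper than the expected
# total loss (50m); this is the gap regulation would have to close.
print(prevention_cost < expected_social_loss)    # True -> should invest
```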

The great potential realigner of incentives, aka public authorities, has remained rather calm on this issue, too. For Europe, Kroes is gunning for “providing the right incentives”, but we don’t know yet what the Commission will come up with eventually. Hohlmeier, rapporteur of the European Parliament on cybersecurity issues and with a constituency in Siemens land, has been likewise silent on this, Google tells us. Inaction by incompetent or unwilling operators of information and industrial infrastructures might pose risks for the public at large. The public might want to live with some risks. Or prefer to have incentives realigned, i.e. get regulations installed that force vendors, customers or third parties to invest in security measures. For the last couple of years, policy makers, researchers and public authorities have been obsessed with “incentivising” third parties such as ISPs to make up for the failures of vendors and customers of ICT systems. For industrial control systems, I don’t see this option. It’s either the vendors and/or the customers (the owners of the infrastructures) that need to foot the bill. Or learn to live with the risks. Just like we did with financial and nuclear systems.

“unauthorized enrichment facilities” as IO targets in a May 2010 article  24.10.11

History, but anyhow. Jon Baumgartner, “Computers as Weapons of War”, IO Journal, May 2010, pp. 5-8:

Similar IO attacks could be conducted against nation states that have violated international treaties in order to carry out uranium enrichment for nuclear weapons. Most of the unauthorized enrichment facilities in these cases are constructed deep underground. Conventional munitions, including bunker busters, could have difficulty penetrating and damaging these hardened structures. Cyber munitions, however, could be used to destroy key equipment used in the enrichment process. One of the primary IO targets would be the gas centrifuges used to create weapons grade uranium. The rotors within these centrifuges operate at extremely high speeds (e.g. 50,000 RPM). A cyber attack that increased the RPMs beyond normal safety levels could result in a catastrophic failure of a single centrifuge. Implementing this IO attack across thousands of centrifuges has the potential to disrupt enrichment operations for considerable periods of time.

A couple of months before the Stuxnet news broke.
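Purely as an aside, the over-speed failure mode described in the quote can be pictured with a toy model. Everything below (the class name, the crude first-order speed dynamics, the 55,000 RPM failure threshold) is made up for illustration and describes no real controller or centrifuge; only the 50,000 RPM nominal speed is taken from the quote above.

```python
# Toy illustration only: a drastically simplified centrifuge speed model.
# All names, numbers, and the failure threshold are hypothetical.

from dataclasses import dataclass

@dataclass
class ToyCentrifuge:
    rated_rpm: float = 50_000.0     # nominal operating speed from the quote
    failure_rpm: float = 55_000.0   # hypothetical mechanical limit
    rpm: float = 0.0
    failed: bool = False

    def step(self, setpoint_rpm: float) -> None:
        """Move rotor speed toward the controller's setpoint; fail past the limit."""
        if self.failed:
            return
        # crude first-order approach toward the setpoint
        self.rpm += 0.5 * (setpoint_rpm - self.rpm)
        if self.rpm > self.failure_rpm:
            self.failed = True

def run(setpoint_rpm: float, steps: int = 50) -> ToyCentrifuge:
    c = ToyCentrifuge()
    for _ in range(steps):
        c.step(setpoint_rpm)
    return c

if __name__ == "__main__":
    normal = run(50_000)       # legitimate setpoint
    tampered = run(60_000)     # setpoint pushed beyond the safe envelope
    print(f"normal:   rpm={normal.rpm:,.0f} failed={normal.failed}")
    print(f"tampered: rpm={tampered.rpm:,.0f} failed={tampered.failed}")
```

Run it and the tampered setpoint trips the failure flag within a handful of control steps, while the legitimate one never comes near the limit, which is the whole point of the scenario.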

Anonymous cyber terror  23.10.11

Dan Kaplan, SC Magazine:

In my eyes, this seems to be another step by U.S. officials, without exactly coming out and saying it, to label Anonymous as a cyber terrorist organization, bent on indiscriminate destruction of digital property and infrastructure.

The DHS in the “National Cybersecurity and Communications Integration Center Bulletin”, A-0020-NCCIC / ICS-CERT –120020110916:

“The loosely organized hacking collective known as Anonymous has recently expressed an interest in targeting industrial control systems (ICS). (…) Anonymous’ increased interest may indicate intent to develop an offensive ICS capability in the future.”

Kaplan continues, on Duqu, the alleged Stuxnet offspring:

Which reminds me: I’m waiting for DHS to publish a warning based on a potential real critical infrastructure issue that popped up just yesterday — evidence that the Stuxnet authors are back with new malware. I’m sure the bulletin will arrive any minute now.

Even a year later, Langner sticks to his assessment:

Thinking about it for another minute, if it’s not aliens, it’s got to be the United States.