The severity of the cyber attack at a regional water purification plant was not immediately clear.
It was only after the third week of taking responsibility for cyber security at the plant that we knew we were in trouble. The business reports indicated that sales and profits were down.
By week five, we realised just how much trouble we were in. Thousands of people were without drinking water and company losses were high and mounting.
We were all conscious of reports by Symantec that more than 1,000 energy companies in Europe and North America had been compromised by an Eastern European hacking collective.
Researchers had found that since 2013, a group known as Dragonfly had been targeting organisations that use industrial control systems (ICS) to manage electrical, water, oil, gas and data systems.
One of the group’s latest cyber espionage campaigns was aimed at energy grid operators. This made an attack on our water treatment plants seem all the more likely.
We began to realise we had been targeted, but no one on our cyber security team was able to say just where we had gone wrong. The task had seemed straightforward enough.
We had done everything by the book, or so we thought, and yet things had gone horribly wrong.
From the start, we had all acknowledged that raising awareness of all IT system users around cyber security was essential.
We had planned to take care of training from the start, but we had been forced to delay for two weeks.
Our immediate priority had been to patch all systems and carry out a systems audit after warnings of attacks by hacktivist group Anonymous. With the resources available, there was only so much we could do in a week.
The easiest approach would have been to cut all connection to the internet, but without links to vital business intelligence systems, we knew the plant would be uncompetitive and fail to meet profit targets. The business would not tolerate any downtime. The internet connection had to remain.
The system audit revealed two unknown internet connections into the industrial control system network. We were also alerted to a link with a previously unrecorded engineering workstation.
Armed with this information, we installed firewalls on all internet-connected systems. We also prioritised installing anti-virus protection on all systems.
The cyber security team felt confident that all systems were protected, but then we started receiving reports that the water quality was not of the required standard.
Chlorine levels in the production line were inconsistent and the water quality was failing to meet customer requirements.
The most logical reason was that the programmable logic controller (PLC) for the chlorine dosage in the disinfection tank was malfunctioning.
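The kind of dosing logic such a PLC runs can be sketched as a simple proportional control loop. The sketch below is purely illustrative, with hypothetical setpoint and gain values, but it shows why tampered PLC software produces inconsistent chlorine levels: if malware quietly rewrites the setpoint or gain, every control cycle drives the dose away from specification.

```python
# Illustrative sketch of a proportional chlorine-dosing loop, as a
# disinfection-tank PLC might run it. All values are hypothetical.

SETPOINT_MG_L = 0.8   # target residual chlorine, mg/L (illustrative)
GAIN = 0.5            # proportional gain (illustrative)

def dose_adjustment(measured_mg_l, setpoint=SETPOINT_MG_L, gain=GAIN):
    """Return the change in dosing-pump rate for one control cycle."""
    error = setpoint - measured_mg_l
    return gain * error

# Normal operation: measured level below target, so the pump rate rises.
print(dose_adjustment(0.6))

# A tampered setpoint (e.g. rewritten PLC logic) silently under-doses:
# the same measurement now produces a negative adjustment.
print(dose_adjustment(0.6, setpoint=0.2))
```

The point of the sketch is that nothing in the loop itself looks broken; only comparing the programmed parameters against the intended ones reveals the compromise.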
But we had implemented a host of controls to ensure that attackers could not introduce any malware designed to alter the PLCs. No one on the team could understand what was wrong.
We received reports that other water treatment plants had been hit by malware attacks, but we had taken all measures available to block attacks from the very first.
It was at this point that we decided to check and fix all PLC software. But our renewed confidence soon turned to dismay when it became clear that the problem had not been fixed as losses continued to grow.
By contrast, some of our competitors in the region were managing to maintain water quality and their businesses were thriving.
Fortunately, the greatest damage was to our pride – we were one of six teams taking part in a live interactive session of the Kaspersky Industrial Protection Simulation (KIPS) role-playing game.
The game, simulation software and interactive app were developed by security firm Kaspersky Lab to provide first-hand experience to participants of the challenges involved in providing advanced industrial cyber security.
So where did we go wrong?
It was frustrating to learn that failure to conduct staff training in cyber security awareness meant that several employees targeted by phishing emails had clicked on malicious links.
This had allowed attackers to install malware on the network before we had implemented all the control measures that had occupied our initial rounds of the game.
Our second crucial mistake was to fail to replace all the PLCs after we had put in all the controls to ensure any malware already on the systems was eradicated.
We had spent all our time and money on bolting and barring the entrances and exits to the network, but the malware was already inside. We were doomed from the very start.
Our third mistake had been not opting to set up a security information and event management (SIEM) system.
This meant that when things started going wrong, we did not have a complete picture of what was happening.
As a result, we were not able to trace the problem immediately to the disinfection tank, losing time and business.
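The value a SIEM would have added here is cross-system correlation: events that look minor in isolation stand out when one source reports several alert types. The sketch below is a minimal illustration of that idea, with hypothetical source names and alert types, not any real product's API.

```python
# Minimal sketch of SIEM-style correlation: flag any source that has
# reported two or more distinct alert types. Event data is hypothetical.

from collections import defaultdict

def correlate(events):
    """events: iterable of (timestamp, source, alert_type) tuples.
    Returns the sources reporting multiple distinct alert types."""
    by_source = defaultdict(set)
    for ts, source, alert_type in events:
        by_source[source].add(alert_type)
    return [s for s, types in by_source.items() if len(types) >= 2]

events = [
    (100, "plc-disinfection", "setpoint_changed"),
    (160, "plc-disinfection", "firmware_rewrite"),
    (200, "hmi-station-3", "login_failure"),
]
print(correlate(events))  # ['plc-disinfection']
```

With aggregated events like these, the disinfection-tank PLC would have surfaced as the anomaly immediately, rather than after weeks of losses.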
The winners, judged on the largest profits over the simulation period, had found the key to success: educate users, put in controls, and eliminate existing malware.
In hindsight, our mistakes were obvious. But even though we had missed the trick of replacing existing PLCs, we had known all along that training was important.
The simulation gave us first-hand experience of how easy it is to fail to do the right thing, or to do the right thing at the wrong time or for the wrong reason, when faced with time, money and business pressures.
The simulation immediately put into context, and made much more meaningful, a presentation earlier in the day on targeted attacks on industrial control systems as the biggest threat to critical national infrastructure.
According to Kaspersky Lab, such attacks take place regularly, but are rarely made public.
However, a decoy set up by the security firm on an ICS typically used to control national infrastructure saw 1,300 attempts to gain unauthorised access in a single month.
“Of these, 400 were successful,” said Andrey Nikishin, special projects director of future technologies at Kaspersky Lab.
The successful attacks included 34 connections to integrated [software] development environments (IDEs), seven downloads of PLC firmware, and one case of reprogramming a PLC with the hacker’s software, he said.
“This is especially worrying in the light of the fact that we have found lots of examples of industrial control systems that are connected to the internet, which means they are hackable,” said Nikishin.
The first stage to any targeted attack, he said, is information gathering and preparation. “Attackers will scour social media for information on ICS operators who can be targeted through well-crafted email phishing attacks to infect control systems,” said Nikishin.
This was all the more meaningful after we had experienced the consequences of this type of attack in the simulation.
Our experience also made Kaspersky Lab's prediction all the more chilling: attackers will increasingly target industrial control systems because industrial networks potentially offer an easier way into the more heavily protected corporate IT systems.
Attackers are also expected to target and compromise remote control systems, where such systems have been introduced to cut costs and ease use, without due consideration of security.
“All sorts of things, including public transport systems and even battleships, are being equipped with remote control systems, which are conceivably open to compromise,” said Nikishin.
Kaspersky Lab is also warning of potential attacks on building control systems for CCTV, air-conditioning, lighting and access control.
“The analysis of malware has revealed components for controlling office doors, so attackers could theoretically block access to targeted buildings, which could be dangerous,” said Nikishin.
“From experience, my feeling is that the situation is likely to go from bad to worse.”
Perhaps it is this lack of first-hand experience that leads organisations, including suppliers of critical national infrastructure, to underestimate the threat.
The Kaspersky Industrial Protection Simulation was a real eye-opener and should be made mandatory for all security professionals, especially those working for suppliers of national infrastructure.
Thanks to the simulation, this security journalist, for one, has a far better understanding of the challenges and pressures involved.
This was first published in August 2014