Countdown to Zero Day: Stuxnet and the Launch of the World's First Digital Weapon
Kim Zetter
In 2010, when the Symantec researchers discovered that Stuxnet was designed to sabotage Siemens PLCs, they believed it was the first documented case in which digital code had been used to physically destroy equipment. But three years earlier, on this Idaho plain, the Aurora Generator Test had demonstrated the viability of such an attack.
It was around eleven thirty a.m. that March day when a worker back in Idaho Falls got the signal to launch a stream of vicious code against the target. As the generator’s 5,000-horsepower diesel engine roared over speakers in the lab’s small theater, the spectators stared intently at a screen, searching for signs of the code’s effects. At first, there were none. But then they heard a loud snap, like a heavy chain slapping against a metal drum, and the steel behemoth rattled briefly as if shaken awake. Several seconds passed and they heard another snap—this time the generator lurched and shuddered more violently as if jolted by a defibrillator. Bolts and bits of rubber grommet ejected from its bowels toward the camera, making the observers wince. About fifteen seconds passed before another loud snap sent the machine lurching again. This time, after the vibrations subsided, the generator spit out a puff of white smoke. Then suddenly, bam! the machine heaved again before coming to a final rest. After a lengthy pause, when it seemed the beast might have survived the assault, a plume of angry black smoke billowed from its chambers.
Only three minutes had elapsed since the test began, but that was all it took to reduce the colossal machine to a smoldering, lifeless mess of metal and smoke. When it was all done, there was no applause in the theater, just stunned silence. To rock a piece of equipment the size of a tank should have required exceptional force. Yet all it had taken in this case was twenty-one lines of malicious code.
The test had been exhaustively planned and modeled for weeks, yet the force and violence of the attack still took its engineers by surprise—“a moment of incredible vividness,” Michael Assante, one of the architects of the test, said.
It was one thing to simulate an attack against a small motor perched atop a table, but quite another to watch a twenty-seven-ton machine bounce like a child’s toy and fly apart.
The test provided certified proof that a saboteur didn’t need physical access to destroy critical equipment at a power plant but could achieve the same result remotely with just a piece of well-crafted code. Three years later, when Stuxnet was found on machines in Iran, no one who worked on the Aurora project was surprised that a digital attack could cause physical destruction. They were only surprised that it had taken so long for such an attack to show up.
WHEN THE SYMANTEC researchers discovered in August 2010 that Stuxnet was designed for physical sabotage of Siemens PLCs, they weren’t the only ones who had no idea what a PLC was. Few people in the world had ever heard of the devices—this, despite the fact that PLCs are the components that regulate some of the most critical facilities and processes in the world.
PLCs are used with a variety of automated control systems that include the better-known SCADA system (Supervisory Control and Data Acquisition) as well as distributed control systems and others that keep the generators, turbines, and boilers at power plants running smoothly.
The systems also control the pumps that transport raw sewage to treatment plants and prevent water reservoirs from overflowing, and they open and close the valves in gas pipelines to prevent pressure buildups that can cause deadly ruptures and explosions, such as the one that killed eight people and destroyed thirty-eight homes in San Bruno, California, in 2010.
There are less obvious, but no less critical, uses for control systems as well. They control the robots on car assembly lines and dole out and mix the proper portion of ingredients at chemical and pharmaceutical plants.
They’re used by food and beverage makers to set and monitor temperatures for safely cooking and pasteurizing food to kill deadly bacteria. They help maintain consistent temperatures in the furnaces and kilns where glass, fiberglass, and steel are made to ensure the integrity of skyscrapers, cars, and airplanes. They also control traffic lights, open and close cell doors at high-security federal prisons, and raise and lower bridges on highways and waterways. And they help route commuter trains and freight trains and prevent them from crashing. On a smaller scale, they control the elevators in high-rise buildings and the heating and air conditioning in hospitals, schools, and offices. In short, control systems are the critical components that keep industries and infrastructures around the world functioning properly. They need to be reliable and secure. Yet, as Stuxnet clearly showed, they are anything but.
And now with that code available in the wild for anyone to study and copy, the digital weapon can serve as a blueprint to design other attacks targeting vulnerable control systems in the United States and elsewhere—to manipulate valves in a gas pipeline, for example, or to release sewage into waterways, or possibly even to take out generators at a power plant. It wouldn’t necessarily require the resources of a wealthy nation to pull off such attacks. With most of the core research and development already done by Stuxnet’s creators to expose the vulnerabilities in these systems, the bar has been lowered for other attackers, state and nonstate players alike, to get in the game. From anarchic hacker groups like Anonymous and LulzSec to extortionists looking to hold the controls of a power plant hostage to hackers-for-hire working for terrorist groups, the door is now open for a variety of attackers who never have to venture beyond their borders, or even their bedrooms, to launch an assault. And although Stuxnet was a surgical attack targeting specific machines while leaving others untouched, not all attacks would be so targeted or skilled, raising the possibility of assaults that create widespread disruption or damage—whether intentionally or not.
Attackers wouldn’t need to design a sophisticated worm like Stuxnet, either. An ordinary run-of-the-mill virus or worm can have detrimental effects as well.
In 2003, train-signaling systems on the East Coast went dark after computers belonging to CSX Corporation in Florida got infected with the Sobig virus. CSX operates rail systems for passenger and freight trains in twenty-three states, and as a result of the signals going out, trains running between Pennsylvania and South Carolina and in the DC Beltway had to be halted.
Similarly, the Slammer worm took out the safety monitoring system and process control network at the Davis-Besse nuclear power plant in Ohio for about five hours that same year.
On a scale of one to ten measuring the preparedness of US critical infrastructure to withstand a destructive cyberassault, with one being least prepared and ten being most prepared, NSA Director Gen. Keith Alexander told a Senate committee in 2013 that the nation was at a three, due in part to the lack of security in control systems.
“We’ve been working on offensive cyber capabilities for more than a decade in the Department of Defense,” Jim Lewis of the Center for Strategic and International Studies has said. “But … I think people … just don’t realize that behind the scenes, there’s this new kind of vulnerability that really puts a lot of things at risk.”
In truth, the problems with control systems are not new; Stuxnet just exposed them for the first time to the public. But some control-systems experts had known about them for years.
PLCS WERE FIRST developed in the 1960s, when computer hackers and viruses were still the stuff of science fiction.
They were designed for the automotive industry to replace hardwired relay logic systems that controlled the assembly lines on factory floors. With hardwired relay systems, the only way to make an adjustment to a line was to send an electrician to physically rewire the relays. PLCs made it easy to update the systems with just a few hundred lines of code, though technicians still had to make the changes in person, traveling out to devices in the field to upload the commands from a tape cartridge.
As the use of digital control systems grew in the ’90s, operators pressured vendors to provide them with the ability to log into systems remotely via dial-up modem. Hackers were by then becoming legion, but operators still weren’t concerned about the security of their systems, because control systems ran on standalone networks, using custom protocols to communicate and running proprietary software that was incompatible with other programs and systems. You couldn’t just plug any computer into a control system and communicate with it. And even if you did have a system that could talk to the machines, the universe of people who understood how control systems worked and had the ability to manipulate them was small.
All of this began to change in the late ’90s, however. Congress passed environmental laws requiring companies to monitor and control their factory emissions, and the Federal Energy Regulatory Commission began to require access to electricity transmission systems to monitor their output and distribution. Suddenly compliance officers and corporate executives demanded access to data and systems that were previously accessible only to plant operators. Out went proprietary operating systems that no one could communicate with or understand, and in came control systems that ran on commercial operating systems, such as Windows and Linux, making it easy for other computers on a company’s corporate network to connect and communicate with them. The switch to Windows, however, meant that control systems were now vulnerable to the same viruses and worms that plagued personal computers. And as the systems became increasingly connected to the internet, or to dial-up modems to make them remotely accessible to operators, they also became increasingly vulnerable to remote attack from hackers.
In March 1997, a teenage hacker in Massachusetts who went by the name “Jester” gave a small preview of what could occur when he dialed into the Bell Atlantic computer system via modem and knocked out systems that managed phone and radio communications for the air traffic control tower at Worcester Airport, as well as phone service for six hundred homes in a nearby town. Communications for the airport’s security and fire departments were down for six hours, as was the system pilots used to activate the runway lights. Air traffic controllers had to use cell phones and battery-powered radios to direct planes during the outage.
No accidents occurred, but an air traffic control manager told CNN, “We dodged a bullet that day.”
That same year, the specially convened Marsh Commission published a report examining the vulnerability of critical infrastructure systems to attack—both physical and digital. The commission had been charged with investigating the matter after Timothy McVeigh blew up a federal building in Oklahoma City in 1995 and took out a number of key data and communication centers in the process. The commissioners warned of the increasing perils created by connecting critical systems for oil, gas, and electricity to the internet. “The capability to do harm … is growing at an alarming rate; and we have little defense against it,” they wrote. The right commands sent over a network to a power-generating station’s control computer, they wrote, “could be just as devastating as a backpack full of explosives.… We should attend to our critical foundations before we are confronted with a crisis, not after. Waiting for disaster would prove as expensive as it would be irresponsible.”
A second report released the same year by the White House National Security Telecommunications Advisory Committee warned that the nation’s power grid and the utilities feeding it were pockmarked with security holes that made them vulnerable to attack. “An electronic intruder … could dial into an unprotected port and reset the breaker to a higher level of tolerance than the device being protected by the breaker can withstand,” investigators wrote, anticipating the Aurora Generator Test a decade before it occurred. “By doing this, it would be possible to physically destroy a given piece of equipment within a substation.”
Despite these early warnings, there were no signs yet that anyone was interested in conducting such attacks. That is, until 2000, when a former worker sabotaged the pumps at a sewage treatment plant in Australia, in what is considered to be the first publicly reported case of an intentional control-system hack.
MAROOCHY SHIRE ON Queensland’s Sunshine Coast is the kind of place made for picture postcards, with a lush rain forest, rugged volcanic peak, and azure coastal waters bordered by white sandy beaches. But in early 2000, the shire’s beauty took an ugly turn when, over the course of four months, a hacker caused more than 750,000 gallons of raw sewage to spill from a number of wells and pour into public waterways.
At first it was just a small amount of sewage spilling from a well at the Hyatt Regency Hotel into a lagoon on the five-star resort’s PGA golf course. But after workers cleaned it up, the well overflowed again and again. The worst spills occurred, however, in Pacific Paradise, a suburb along the Maroochy River. Here several hundred thousand gallons of sewage poured into a tidal canal, endangering the health of children playing in backyards abutting the canal, and into the Maroochy River itself, where it killed off fish and other marine life.
The problems began on New Year’s Eve 1999, after Maroochy Water Services installed a new digital management system. The treatment plant’s control system had been installed in stages by Hunter WaterTech, a contract firm, and was just nearing completion when settings for the pump stations responsible for moving sewage to the treatment plant began to mysteriously change. Pumps would turn off or continue to run in defiance of operator instructions, and the two-way radio network used to broadcast instructions to pump stations would become clogged with traffic, preventing operators from communicating with the stations. Alarms that should have sounded when things went awry didn’t.