Beresford reported his findings to ICS-CERT, which worked with Siemens to get the vulnerabilities fixed. But not all of them could be. Some, like the transmission of unencrypted commands and the lack of strong authentication, were fundamental design flaws rather than programming bugs; fixing them required Siemens to upgrade the firmware on its systems or, in some cases, re-architect them. And these weren’t just problems for Siemens PLCs; they were design issues common to many control systems, a legacy of their pre-internet days, when the devices were built for isolated networks and didn’t need to withstand attacks from outsiders.
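To see why cleartext, unauthenticated commands are a design problem rather than a bug, consider an open protocol like Modbus/TCP, which many control systems still speak. (Beresford’s work concerned Siemens’ proprietary S7 protocol, but the weakness is the same.) The sketch below is purely illustrative, with a placeholder device address: it builds a standard Modbus “read coils” request that carries no credentials of any kind, so any machine that can reach the device over the network can send it and get an answer.

```python
# Illustrative sketch only: a minimal Modbus/TCP "read coils" request.
# DEVICE_IP is a placeholder (TEST-NET address), not a real device.
import socket
import struct

DEVICE_IP = "192.0.2.10"
MODBUS_PORT = 502           # standard Modbus/TCP port

def build_read_coils(transaction_id=1, unit_id=1, start_addr=0, count=8):
    # MBAP header: transaction id, protocol id (always 0), byte count of the
    # remainder of the frame, unit id; then the PDU: function 0x01 (read coils),
    # starting coil address, number of coils to read.
    return struct.pack(">HHHBBHH",
                       transaction_id, 0, 6, unit_id,
                       0x01, start_addr, count)

request = build_read_coils()
with socket.create_connection((DEVICE_IP, MODBUS_PORT), timeout=5) as s:
    s.sendall(request)      # no login, no encryption: the frame goes in the clear
    reply = s.recv(1024)    # the device answers whoever asked
print(reply.hex())
```

A write request differs only in its function code; nothing in the protocol distinguishes an operator’s workstation from an intruder’s laptop.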
Beresford’s findings defied longstanding assertions by vendors and critical-infrastructure owners that their systems were secure because only someone with extensive knowledge of PLCs and experience working with the systems could attack them. With $20,000 worth of used equipment purchased online and two months working in his spare time, Beresford had found more than a dozen vulnerabilities and learned enough about the systems to compromise them.
Since Beresford’s findings, other researchers have uncovered additional vulnerabilities in Siemens and other control systems. According to a database of control-system vulnerabilities managed by Wurldtech Security, a maker of systems for protecting critical infrastructure, about 1,000 vulnerabilities have been found in control systems and control-system protocols since 2008. Most of them would simply allow an attacker to prevent operators from monitoring their system, but many of them would also allow an attacker to hijack the system.
31
In 2011, a security firm hired by a Southern California utility to evaluate the security of controllers at its substations found multiple vulnerabilities
that would allow an attacker to control its equipment. “We’ve never looked at a device like this before, and we were able to find this in the first day,” Kurt Stammberger, vice president of Mocana, said. “These were big, major problems, and problems frankly that have been known about for at least a year and a half, but the utility had no clue.”
32
The security problems with control systems are exacerbated by the fact that the systems don’t get replaced for years and don’t get patched on a regular basis the way general computers do. The life-span of a standard desktop PC is three to five years, after which companies upgrade to new models. But the life-span of a control system can be two decades. And even when a system is replaced, new models have to communicate with legacy systems, so they often contain many of the same vulnerabilities as the old ones.
As for patching, some control systems run on outdated versions of Windows that are no longer supported by Microsoft, meaning that if any new vulnerabilities are discovered in the software, they will never get patched by the vendor. But even when patches are available, patching is done infrequently on control systems because operators are wary of buggy patches that might crash their systems and because they can’t easily take critical systems—and the processes they control—out of service for the several hours it can take to install patches or do other security maintenance.
33
All of these problems are compounded by a growing trend among vendors to package safety systems with their control systems. Safety systems used to be hardwired analog systems configured separately from control systems so that any problems with the control system wouldn’t interfere with the safety system’s ability to shut down equipment in an emergency.
But many vendors are now building the safety system into their control system, making it easier to disable them both in a single attack.
34
Many of the vulnerabilities in control systems could be mitigated if the systems ran on standalone networks that were “air-gapped”—that is, never connected to the internet or connected to other systems that are connected to the internet. But this isn’t always the case.
In 2012, a researcher in the UK found more than 10,000 control systems connected to the internet—including ones belonging to water-treatment and power plants, dams, bridges, and train stations—using Shodan, a specialized search engine that can locate internet-connected devices like VoIP phones, smart TVs, and control systems.
35
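Shodan exposes the same capability programmatically. As a rough illustration, assuming the publicly available shodan Python client and a valid API key, a query for a banner string associated with a common PLC returns the addresses of matching internet-facing devices; the query below is an example, not the researcher’s actual search.

```python
# Rough illustration of a Shodan query for internet-facing control systems.
# Assumes the official `shodan` Python client and a valid API key (placeholder below).
import shodan

API_KEY = "YOUR_SHODAN_API_KEY"
api = shodan.Shodan(API_KEY)

try:
    results = api.search("Siemens S7")       # example banner query
    print(f"{results['total']} matching devices indexed")
    for match in results["matches"][:10]:    # first few hits
        print(match["ip_str"], match["port"])
except shodan.APIError as err:
    print("Query failed:", err)
```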
In 2011, a hacker named pr0f accessed the controls for a water plant in South Houston after finding the city’s Siemens control system online. Although the system was password-protected, it used a three-character password that was easily guessed. “I’m sorry this ain’t a tale of advanced persistent threats and stuff,” pr0f told a reporter at the time, “but frankly most compromises I’ve seen have been a result of gross stupidity, not incredible technical skill on the part of the attacker.”
36
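How easily guessed? A back-of-the-envelope count of the keyspace makes the point. Even if each of the three characters could be any printable ASCII symbol, there are fewer than a million possibilities, a space a script can enumerate in moments; the character sets below are assumptions for illustration, not details of the South Houston system.

```python
# Back-of-the-envelope: how many three-character passwords are there?
import string

alphabets = [
    ("lowercase letters",  string.ascii_lowercase),                # 26 characters
    ("letters and digits", string.ascii_letters + string.digits),  # 62 characters
    ("printable ASCII",    string.printable.strip()),              # ~94 characters
]
for name, alphabet in alphabets:
    print(f"{name:18s}: {len(alphabet) ** 3:,} possible passwords")
```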
Once in the SCADA system, pr0f took screenshots showing the layout of water tanks and digital controls, though he didn’t sabotage the system. “I don’t really like mindless
vandalism. It’s stupid and silly,” he wrote in a post he published online. “On the other hand, so is connecting interfaces to your SCADA machinery to the internet.”
37
Many SCADA field devices, if not connected directly to the public internet, are accessible via modem and are secured only with default passwords. Switches and breakers for the power grid, for example, are often set up this way with default passwords so that workers who need to access them in an emergency will remember the password. For the same reason, control systems aren’t generally designed to lock someone out after several failed password attempts—a standard security feature in many IT systems to prevent someone from brute-forcing a password with multiple guesses—because no one wants a control system to lock out an operator who mistypes a password a few times in a state of panic. In 2011, a test team led by security researcher Marc Maiffret penetrated the remote-access system for a Southern California water plant and took control of equipment the facility used for adding chemicals to drinking water. The team needed just a day, and Maiffret said only a couple of additional steps would have been required to dump chemicals into the water and make it potentially undrinkable.
38
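The lockout policy described above is trivial to express, which is partly why it is standard in IT systems. The sketch below is a generic illustration with arbitrary thresholds, not code from any actual control system, and it also shows why operators resist the feature: a panicked operator who fat-fingers a password trips the counter just as surely as an attacker does.

```python
# Generic sketch of an account-lockout policy of the kind standard in IT
# systems but often omitted from control systems. Threshold and lockout
# window are arbitrary illustrative values.
import time

MAX_FAILURES = 5
LOCKOUT_SECONDS = 15 * 60

failures = {}   # username -> (consecutive failures, time of last failure)

def is_locked_out(user):
    count, last = failures.get(user, (0, 0.0))
    return count >= MAX_FAILURES and time.time() - last < LOCKOUT_SECONDS

def record_attempt(user, success):
    if success:
        failures.pop(user, None)                 # reset on a successful login
    else:
        count, _ = failures.get(user, (0, 0.0))
        failures[user] = (count + 1, time.time())

# The same counter that slows a brute-force attack also locks out an
# operator who mistypes a password five times during an emergency.
```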
Making critical systems remotely accessible from the internet creates obvious security risks. But if Stuxnet proved anything, it’s that an attacker doesn’t need remote access to attack a system—an autonomous worm can be delivered via USB flash drive or via the project files that engineers use to program PLCs. In 2012, Telvent Canada, a maker of control software used in the smart grid, was hacked by intruders linked to the Chinese military, who accessed project files for the SCADA system the company produced—a system installed in oil and gas pipelines in the United States as well as in water systems. Telvent used the project files to manage its customers’ systems. Though the company never indicated whether the attackers modified the project files, the breach demonstrated how easily an attacker might target oil and gas pipelines by infecting the project files of a company like Telvent.
39
Direct computer network intrusions aren’t the only concern when it comes to critical infrastructure, however. There are documented cases involving electromagnetic pulses interfering with SCADA systems and field devices. In November 1999, the radar system from a US Navy ship conducting exercises twenty-five miles off the coast of San Diego interrupted the wireless networks of SCADA systems at local water and electric utilities. The disturbance prevented workers from opening and closing valves in a pipeline, forcing them to dispatch technicians to remote locations to manually activate the valves and prevent water from overflowing reservoirs. Electromagnetic pulse (EMP) disturbances were also responsible for a gas explosion that occurred near the Dutch naval port of Den Helder in the late ’80s when a naval radar system caused the SCADA system for a natural gas pipeline to open and close a valve.
40
OVER THE YEARS, numerous doomsday scenarios have explored the possible consequences of a massive cyberattack.
41
But to date, no such attack has occurred, and unintentional events involving control systems have far outnumbered intentional ones. One need only look at accidental industrial disasters, however, to see the extent of damage a cyberattack could wreak, since the consequences of an industrial accident can often be replicated in an intentional attack. A smart hacker could simply study the causes and effects of an accidental disaster reported in the news and use them to design an attack that would achieve the same destructive results.
The NSA’s Keith Alexander has cited the catastrophic accident that occurred at the Sayano-Shushenskaya hydroelectric plant in southern Siberia as an example of what could occur in an attack.
42
The thirty-year-old dam, the sixth largest in the world, was eight hundred feet high and spanned about half a mile across a picturesque gorge on the Yenisei River. In 2009, a catastrophic accident at the plant killed seventy-five people.
Just after midnight on August 17, a 940-ton turbine in the dam’s power-generation plant was hit with a sudden surge of water pressure that knocked it off its bolts and sent it shooting into the air. A geyser of water flooded the engine room from the shaft where the turbine had been, causing massive damage to more than half a dozen other turbines, triggering multiple explosions, and caving in the roof.
The catastrophe was attributed in part to a fire at the Bratsk power station some five hundred miles away that caused the energy output from Bratsk to drop. This forced the turbines at Sayano-Shushenskaya to pick up the load. But one of those turbines was already at the end of its life and had been vibrating dangerously on and off for a while. A new control system had been installed months earlier to stabilize the machine, but vibrations from the added workload proved to be too much. The turbine sheared off the bolts holding it down and became unmoored. Surveillance images showed workers scrambling over equipment to flee the site. In addition to killing seventy-five workers and flooding the surrounding community, the disaster spilled 100 tons of oil into the Yenisei River and killed 4,000 tons of trout in local fisheries. Experts calculated that repairs would take four years and cost $1.3 billion.
43
The June 1999 pipeline explosion in Washington state also presented a blueprint for hackers to follow. In that case, a 16-inch-diameter pipeline belonging to the Olympic Pipe Line Company in Bellingham ruptured and spewed more than 237,000 gallons of gasoline into a creek in Whatcom Falls Park. Gas poured out of the pipe for ninety minutes before it ignited into a fireball that stretched 1.5 miles downstream, killing two ten-year-old boys and a teen and injuring eight others. Although multiple issues contributed to the disaster, including improperly configured valves and a backhoe that weakened part of the pipe, an unresponsive control system also played a role. “[I]f the SCADA system computers had remained responsive to the commands of the Olympic controllers,” investigators found, “the controller operating the accident pipeline probably would have been able to initiate actions that would have prevented the pressure increase that ruptured the pipeline.”
44
It took operators more than an hour to register the leak, and by then residents were already calling 911 to report a strong smell of petroleum in the creek. Although the gas leak wasn’t caused by hackers, investigators found a number of security problems with Olympic’s system that made it vulnerable to attack. For example, the company had set up remote dial-in access for its SCADA control system that was secured only with a username and password, and its business and SCADA networks were interconnected. Although they were connected by a bridge that provided some security from a casual intruder, the connection lacked a robust firewall as well as virus protection and access monitoring, raising the possibility that a determined attacker could break into the business network from the internet, then jump to the critical SCADA network.
The natural-gas pipeline explosion in San Bruno, California, in 2010 was another worst-case scenario that served as a cautionary tale. The explosion occurred after maintenance on an uninterruptible power supply unit, or UPS, caused electricity to the SCADA system to go out. A control valve on the pipeline was programmed to fall open automatically if the SCADA system lost power; as a result, gas poured into the pipeline unimpeded, causing pressure to build in the aging structure until it burst. Since the SCADA system had lost power, operators couldn’t see what was happening in the pipeline.
45
Then there was the collapse of a dike in Missouri in December 2005. The disaster began when sensors on the dam wall became detached from their mounts and failed to detect when the dam’s 1.5 billion-gallon reservoir was full. As pumps continued to feed water to the reservoir, a “fail-safe” shutdown system also failed to work.
46
The overflow began around 5:10 a.m. and within six minutes a 60-foot section of the parapet wall gave way. More than a billion gallons of water poured down Proffit Mountain, sweeping up rocks and trees in its massive embrace before entering Johnson’s Shut-Ins State Park and washing away the park superintendent’s home—with him and his family still in it—and depositing them a quarter of a mile away.
47
No one was seriously injured, but cars on a nearby highway were also swept up in the torrent, and a campground at the park was flooded. Luckily, because it was winter, the campsite was empty.
Railway accidents also provide blueprints for digital attacks. The systems that operate passenger trains combine multiple, often interconnected components that provide possible avenues for attack: access-control systems to keep nonticketed pedestrians out of stations, credit-card processing systems, digital advertising systems, lighting management, and closed-circuit TVs, not to mention the more critical systems for fire and emergency response, crossings and signals control, and the operation of the trains themselves. In the past, these systems were separate and did not communicate with one another except through wires. But today the systems are increasingly digital and interconnected, including systems that communicate via radio signals and transmit commands in the clear. Although rail systems have redundancies and fail-safe mechanisms to prevent accidents, interconnecting so many systems creates opportunities for misconfiguration that could allow someone to access the safety systems and undermine them.