Ulasen had been hired by the antivirus firm while still in college. He was hired to be a programmer, but the staff at VirusBlokAda was so small, and Ulasen’s skills so keen, that within three years, at the age of twenty-six, he found himself leading the team that developed and maintained its antivirus engine. He also occasionally worked with the research team that deconstructed malicious threats. This was his favorite part of the job, though it was something he rarely got to do. So when the tech-support team asked him to weigh in on their mystery from Iran, he was happy to help.
Ulasen assumed the problem must be a misconfiguration of software or an incompatibility between an application installed on the machine and the operating system. But then he learned it wasn’t just one machine in Iran that was crashing but multiple machines, including ones that administrators had wiped clean and rebuilt with a fresh installation of the operating system. So he suspected the culprit might be a worm lurking on the victim’s network, reinfecting scrubbed machines each time they were cleaned. He also suspected a rootkit was hiding the intruder from their antivirus engine. Ulasen had written anti-rootkit tools for his company in the past, so he was confident he’d be able to hunt this one down if it was there.
After getting permission to connect to one of the machines in Iran and remotely examine it, Ulasen and Kupreev zeroed in on six suspicious files—two modules and four other files—they thought were the source of the problem.
Then with help from several colleagues in their lab, they spent the next several days picking at the files in fits and starts, hurling curses at times as they struggled to decipher what turned out to be surprisingly sophisticated code. As employees of a small firm that mostly developed antivirus products for government customers, they weren’t accustomed to taking on such complex challenges: they spent most of their days providing routine tech support to customers, not analyzing malicious threats. But they pressed forward nonetheless and eventually determined that one of the modules, a driver, was actually a “kernel-level” rootkit, as Ulasen had suspected.
Rootkits come in several varieties, but the most difficult to detect are kernel-level rootkits, which burrow deep into the core of a machine to set up shop at the same privileged level where antivirus scanners work. If you think of a computer’s structure like the concentric circles of an archer’s target, the kernel is the bull’s eye, the part of the operating system that makes everything work. Most hackers write rootkits that operate at a machine’s outer layers—the user level, where applications run—because this is easier to do. But virus scanners can detect these—so a truly skilled hacker places his rootkit at the kernel level of the machine, where it can subvert the scanner. There, it serves as a kind of wingman for malicious files, running interference against scanners so the malware can do its dirty work unhindered and undetected. Kernel-level rootkits aren’t uncommon, but it takes sophisticated knowledge and a deft touch to build one that works well. And this one worked very well.
Kupreev determined that the rootkit was designed to hide four malicious .LNK files—the four other suspicious files they’d found on the system in Iran. The malware appeared to be using an exploit composed of these malicious files to spread itself via infected USB flash drives, and the rootkit prevented the .LNK files from being seen on the flash drive. That’s when Kupreev called Ulasen over to have a look.
Exploits that spread malware via USB flash drives aren’t as common as those that spread them over the internet through websites and e-mail attachments, but they aren’t unheard of, either. All of the USB exploits the two researchers had seen before, however, used the Autorun feature of the Windows operating system, which allowed malicious programs on a USB flash drive to execute as soon as the drive was inserted in a machine. But this exploit was more clever.
Windows .LNK files are responsible for rendering the icons for the contents of a USB flash drive or other portable media device when it’s plugged into a PC. Insert a USB flash drive into a PC, and Windows Explorer or a similar tool automatically scans it for .LNK files to display the icon for a music file, Word document, or program stored on the flash drive.
But in this case, the attackers embedded an exploit in a specially crafted .LNK file so that as soon as Windows Explorer scanned the file, it triggered the exploit to spring into action to surreptitiously deposit the USB’s malicious cargo onto the machine, like a military transport plane dropping camouflaged paratroopers onto enemy territory.
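To make the mechanics concrete, here is a minimal Python sketch (an illustration, not code from the attack) that reads the fixed 76-byte shortcut header documented in Microsoft’s public [MS-SHLLINK] specification; the file name sample.lnk is a placeholder. The point it demonstrates is simply that a shortcut is structured data Explorer must parse the instant it draws an icon.

```python
import struct

# Minimal illustration: read the fixed 76-byte ShellLinkHeader of a Windows
# shortcut (.LNK) file, per Microsoft's public [MS-SHLLINK] specification.
# "sample.lnk" is a placeholder path, not a file from the attack.

HAS_LINK_TARGET_ID_LIST = 0x0001   # shortcut carries an ItemIDList
HAS_ICON_LOCATION       = 0x0040   # shortcut names a custom icon source

with open("sample.lnk", "rb") as f:
    header = f.read(76)

header_size, = struct.unpack_from("<I", header, 0)   # should be 0x4C
link_flags,  = struct.unpack_from("<I", header, 20)
icon_index,  = struct.unpack_from("<i", header, 56)

print(f"header size : {header_size:#x}")
print(f"link flags  : {link_flags:#010x}")
print(f"icon index  : {icon_index}")

# Explorer walks these structures for every shortcut it displays, which is
# why merely rendering a flash drive full of .LNK files means parsing
# attacker-controlled data (the flaw later catalogued as CVE-2010-2568
# lived in how that parsed data was handled).
if link_flags & HAS_LINK_TARGET_ID_LIST:
    print("shortcut contains an ItemIDList")
if link_flags & HAS_ICON_LOCATION:
    print("shortcut specifies a custom icon location")
```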
The .LNK exploit attacked such a fundamental feature of the Windows system that Ulasen wondered why no one had thought of it before. It was much worse than Autorun exploits, because those could be easily thwarted by disabling the Autorun feature on machines—a step many network administrators take as a matter of course because of Autorun’s known security risk. But there is no way to easily disable the .LNK function without causing other problems for users.
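The Autorun mitigation that administrators relied on, by contrast, comes down to a single documented registry policy. A minimal sketch, in Python for Windows and assuming administrative rights, of that standard tweak (it blunts Autorun-style attacks only, not the .LNK flaw):

```python
import winreg  # Windows-only module from the Python standard library

# Standard mitigation sketch: the documented NoDriveTypeAutoRun policy set
# to 0xFF disables AutoRun for every drive type, so nothing launches
# automatically when removable media is inserted. Requires administrative
# rights; it does nothing against the .LNK icon-parsing flaw itself.
KEY_PATH = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "NoDriveTypeAutoRun", 0, winreg.REG_DWORD, 0xFF)

print("AutoRun disabled for all drive types (takes effect at next logon)")
```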
Ulasen searched a registry of exploits for any others that had used .LNK files in the past, but came up with nothing. That was when he suspected he was looking at a zero-day.
He took a USB flash drive infected with the malicious files and plugged it into a test machine running Windows 7, the newest version of the Microsoft operating system. The machine was fully patched with all the latest security updates. If the .LNK exploit was already known to Microsoft, patches on the system would prevent it from dropping the malicious files onto the machine. But if the .LNK exploit was a zero-day, nothing would stop it. He waited a few minutes, then examined the computer, and sure enough, the malicious files were there.
He couldn’t believe it. VirusBlokAda, a tiny security firm that few in the world had ever heard of, had just discovered that rarest of trophies for a virus hunter. But this wasn’t just any zero-day exploit; it was one that worked against every version of the Windows operating system released since Windows 2000: the attackers had bundled four versions of their exploit together—in four different .LNK files—to make sure their attack worked against every version of Windows it was likely to encounter.
Ulasen tried to wrap his head around the number of machines that were at risk of infection from this. But then something equally troubling struck him. The malicious driver module, and another driver module that got dropped onto targeted machines as part of the malicious cargo, had installed themselves seamlessly on their test machine, without any warning notice popping up on-screen to indicate they were doing so. Windows 7 had a security feature that was supposed to tell users if an unsigned driver, or one signed with an untrusted certificate, was trying to install itself on their machine. But these two drivers had loaded with no problem. That was because, Ulasen realized with alarm, they were signed with what appeared to be a legitimate digital certificate from a company called RealTek Semiconductor.
Digital certificates are trusted security documents, like digital passports, that software makers use to sign their programs to authenticate them as legitimate products of their company. Microsoft digitally signs its programs and software updates, as do antivirus firms. Computers assume that a file signed with a legitimate digital certificate is trustworthy. But if attackers steal a Microsoft certificate and the private cryptographic “key” that Microsoft uses with the certificate to sign its files, they can fool a computer into thinking their malicious code is Microsoft code.
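What a machine actually checks can be seen in a few lines. The sketch below, in Python with the third-party cryptography library, prints the fields a verifier relies on; signer.pem is a placeholder certificate, not RealTek’s, and the sketch inspects the certificate only, without verifying a file’s signature.

```python
from cryptography import x509  # third-party: pip install cryptography

# Sketch of what a machine trusts in a signed file: the certificate names
# the publisher, the issuer (a certificate authority) vouches for that
# name, and the validity window bounds when the cert may sign new code.
# "signer.pem" is a placeholder, not the actual RealTek certificate.
with open("signer.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

print("subject :", cert.subject.rfc4514_string())   # who the code claims to come from
print("issuer  :", cert.issuer.rfc4514_string())    # who vouched for that identity
print("valid   :", cert.not_valid_before, "to", cert.not_valid_after)
print("serial  :", hex(cert.serial_number))

# The signature check itself only proves the file was signed by whoever
# holds the certificate's private key, which is exactly why a stolen key
# and cert defeat the whole scheme.
```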
Attackers had used digital certificates to sign malicious files before. But they had used fake, self-signed certificates masquerading as legitimate ones, or had obtained real certificates through fraudulent means, such as creating a shell company to trick a certificate authority into issuing them a certificate under the shell company’s name.
In both scenarios, attackers ran the risk that machines would view their certificate as suspicious and reject their file. In this case, the attackers had used a valid certificate from RealTek—a trusted hardware maker in Taiwan—to fool computers into thinking the drivers were legitimate RealTek drivers.
It was a tactic Ulasen had never seen before, and it raised a lot of questions about how the attackers had pulled it off. One possibility was that they had hijacked the computer of a RealTek software developer and used his machine and credentials to get their code secretly signed.
But it was also possible the attackers had simply stolen the signing key and certificate, or cert. For security reasons, smart companies stored their certs and keys on offline servers or in hardware security modules that offered extra protection. But not everyone did this, and there were clues suggesting that RealTek’s cert had indeed been nabbed. A timestamp on the certificates showed that both of the drivers had been signed on January 25, 2010. Although one of the drivers had been compiled a year earlier, on January 1, 2009, the other one was compiled just six minutes before it was signed. The rapid signing suggested the attackers might have had the RealTek key and cert in their possession.
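The comparison that made the six-minute gap stand out is easy to reproduce. Below is a Python sketch using the third-party pefile library to read a driver’s compile timestamp; driver.sys is a placeholder name, and the signing time is supplied by hand (the January 25, 2010 date the team observed) rather than parsed out of the Authenticode signature.

```python
from datetime import datetime, timezone
import pefile  # third-party: pip install pefile

# Sketch: a PE file's header carries the compiler's timestamp, which can be
# compared with the time the file was signed. "driver.sys" is a placeholder;
# the signing time below is the reported January 25, 2010 date, entered by
# hand instead of being extracted from the signature blob.
pe = pefile.PE("driver.sys")
compiled = datetime.fromtimestamp(pe.FILE_HEADER.TimeDateStamp, tz=timezone.utc)
signed = datetime(2010, 1, 25, tzinfo=timezone.utc)

print(f"compiled: {compiled:%Y-%m-%d %H:%M}")
print(f"signed  : {signed:%Y-%m-%d %H:%M}")
print(f"gap     : {signed - compiled}")

# A gap of minutes rather than weeks is consistent with the key and cert
# sitting right next to the build machine.
```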
The implications were disturbing. The use of a legitimate digital certificate to authenticate malicious files undermined the trustworthiness of the computer world’s signing architecture and called into question the legitimacy of any file signed with digital certificates thereafter. It was only a matter of time before other attackers copied the tactic and began stealing certificates as well.
Ulasen needed to get the word out.
Responsible disclosure dictated that researchers who found vulnerabilities in software notify the relevant vendors before going public with the news, to give the vendors time to patch the holes. So Ulasen dashed off e-mails to both RealTek and Microsoft, notifying them of what his team had found.
But after two weeks passed with no response from either company, Ulasen and Kupreev decided they couldn’t keep quiet.
The rest of the security community needed to know about the .LNK exploit. They had already added signatures to VirusBlokAda’s antivirus engine to detect the malicious files and were seeing infections pop up on machines all over the Middle East and beyond. The worm was on the loose and spreading quickly. They had to go public with the news.
So on July 12, Ulasen posted a brief announcement about the zero-day to his company’s website and to an online English-language security forum, warning that an epidemic of infections was about to break out.
He divulged few details about the hole the exploit attacked, to avoid giving copycat hackers information that would help them abuse it. But members of the forum grasped the implications quickly, noting that it had the potential to be “deadly to many.”
Three days later, tech journalist Brian Krebs picked up the announcement and wrote a blog post about it, summarizing what little was known about the vulnerability and exploit at the time.
The news raced through the security community, causing everyone to brace for a wave of assaults expected to come from the worm and copycat attacks using the same exploit.
In the meantime, the head of an institute in Germany that researched and tested antivirus products brokered an introduction between Ulasen and his contacts at Microsoft, prompting the software company to begin work on a patch.
But with news of the vulnerability already public, Microsoft decided to release an immediate advisory about the critical flaw to customers, along with a few tips advising them how to mitigate their risk of infection in the meantime. Without a patch, however, which wouldn’t be released for another two weeks, it was far from a cure.
The computer security industry also rumbled into action to address the worm that now had a name—“Stuxnet,” an alias Microsoft conjured from letters in the name of one of the driver files (mrxnet.sys) and another part of the code. As security companies added signatures to their engines to detect the worm and its exploit, thousands of malicious files started showing up on the machines of infected customers.
Almost immediately, another surprise emerged. On July 17, an antivirus firm in Slovakia named ESET spotted another malicious driver that appeared to be related to Stuxnet. This one was also signed with a digital certificate from a company in Taiwan, though not from RealTek. Instead, it came from a company called JMicron Technology, a maker of circuits.
The driver was discovered on a computer by itself, without any of Stuxnet’s other files, but everyone assumed it must be related to Stuxnet since it shared similarities with the other drivers that VirusBlokAda had found.
There was something notable about the compilation date of this driver, however. When hackers ran their source code through a compiler to translate it into the binary code that a machine could read, the compiler often placed a timestamp in the binary file. Though attackers could manipulate the timestamp to throw researchers off, this one appeared to be legitimate. It indicated that the driver had been compiled on July 14, two days after VirusBlokAda had gone public with news of Stuxnet. Had the Stuxnet hackers unleashed the driver in a new attack, completely oblivious to the fact that an obscure antivirus firm in Belarus had just blown their cover? Or had they known their stealth mission was about to be exposed and were racing to get Stuxnet onto more machines before it would be blocked? There were clues that the attackers had missed a few steps while signing the driver with the JMicron cert, which suggested they may indeed have been in a hurry to get their attack code out the door and onto machines.
One thing was clear, though: the attackers had needed this new certificate to sign their driver because the RealTek certificate had expired a month earlier, on June 12. Digital certificates have a limited life-span, and once RealTek’s expired, the attackers could no longer use it to sign new files. The certificate was also revoked by certificate authorities once Stuxnet was exposed, which meant that Windows machines would now reject or flag any files that had already been signed with it.
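The constraint that forced the switch is mechanical, as this short Python sketch with the third-party cryptography library illustrates; realtek.pem is a placeholder file, and July 14, 2010 stands in for the date the new driver was compiled.

```python
from datetime import datetime, timezone
from cryptography import x509  # third-party: pip install cryptography

# Sketch: a signature on new code is only acceptable while the certificate
# is inside its validity window, and revocation (published via CRLs/OCSP)
# can pull trust even from files signed before expiry. "realtek.pem" is a
# placeholder, not the real certificate.
with open("realtek.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

signing_date = datetime(2010, 7, 14, tzinfo=timezone.utc)
not_after = cert.not_valid_after.replace(tzinfo=timezone.utc)

if signing_date > not_after:
    print(f"certificate expired {not_after:%Y-%m-%d}; cannot be used to sign new files")
else:
    print(f"certificate valid for signing until {not_after:%Y-%m-%d}")
```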