Black Code: Inside the Battle for Cyberspace

Ronald J. Deibert

The goal of preventing the next 9/11, and of rooting out shadowy terrorist cells in Iraq, Yemen, Afghanistan, and elsewhere, led to another information revolution: big-data analytics. The challenge was clear: to find a way to seamlessly integrate all of the disparate data sets trapped in silos across government agencies and proprietary databases. One of the first attempts to address the “data fusion” problem ended up backfiring, at least in the public eye. The idea behind the Total Information Awareness (TIA) system, spearheaded by retired admiral John Poindexter, was simple: find a way to integrate as much data as possible about everyone – not only data that is classified, but also open-source information – into a single, searchable platform. Credit card transactions, tax records, flights, train trips, and numerous other pieces of information were thought fair game by Poindexter. His involvement in the 1980s Iran-Contra affair, coupled with the frightening civil liberties implications of what TIA embodied, provoked howls of protest among privacy advocates, effectively short-circuiting the project before it could win the necessary Congressional approvals. But TIA did not get shelved. According to Shane Harris, author of The Watchers, the TIA went “black budget” and was supported under a secret umbrella, one not subject to public scrutiny. TIA was driven underground, but most experts believe that it remained operative.
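The “data fusion” idea Poindexter championed can be pictured in miniature: records that live in separate silos are keyed on a shared identifier and merged into a single searchable profile. The sketch below is a purely hypothetical Python illustration; every record, field name, and function in it is invented, and nothing about it reflects how TIA or any commercial platform was actually built.

```python
# A minimal, purely illustrative sketch of "data fusion": disparate record
# sets (all invented here) are merged into one searchable profile per person,
# keyed on a shared identifier.

from collections import defaultdict

# Hypothetical silos, each with its own schema.
credit_card_txns = [
    {"ssn": "123-45-6789", "merchant": "AirMart", "amount": 412.50},
]
flight_manifests = [
    {"ssn": "123-45-6789", "flight": "UA 88", "route": "SFO-PEK"},
]
tax_records = [
    {"ssn": "123-45-6789", "year": 2002, "filed": True},
]

def fuse(*silos):
    """Merge records from every silo into one profile per identifier."""
    profiles = defaultdict(list)
    for silo in silos:
        for record in silo:
            key = record["ssn"]           # the shared identifier
            profiles[key].append(record)  # keep the original record intact
    return profiles

def search(profiles, keyword):
    """Return identifiers whose fused profile mentions the keyword."""
    keyword = keyword.lower()
    return [k for k, recs in profiles.items()
            if any(keyword in str(v).lower() for r in recs for v in r.values())]

if __name__ == "__main__":
    merged = fuse(credit_card_txns, flight_manifests, tax_records)
    print(search(merged, "sfo-pek"))   # ['123-45-6789']
```

In practice, of course, different silos rarely share a clean common key, which is what makes real-world record linkage so much harder than this toy example suggests.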

At the same time as the TIA went dark, other projects were being seeded with the same goal, and the result today is a major new defence industry around data fusion and analytics. One of the chief driving forces behind this push was the CIA's investment arm, In-Q-Tel, which financed a number of startups in the data fusion and analytics arena. One of them, Palantir – a company whose origins lie in the PayPal fraud detection unit and whose founders were given early advice by Poindexter – has become a darling of the defence and intelligence community, but a bit of an outcast among civil libertarians. In February 2011, an Anonymous operation breached the networks of the security company HBGary, and then publicly disclosed plans they had uncovered involving HBGary, the Bank of America, Palantir, and others to attack WikiLeaks servers and spread misinformation about WikiLeaks supporters, including the journalist Glenn Greenwald. Although Palantir's CEO apologized and then distanced his company from the misguided plan, the taint of the association still lingers among many. (Full disclosure: In 2008, Palantir donated a version of their analytical platform to the Citizen Lab, and we employed it only as a minor research tool during the GhostNet and Shadows investigations.)

Although an industry darling, Palantir is only one among a growing complex of data analysis companies that orbit the law enforcement, military, and intelligence communities. A two-year investigation by the Washington Post, called “Top Secret America,” provides some startling insights: “Some 1,271 government organizations and 1,931 private companies work on programs related to counterterrorism, homeland security and intelligence in about 10,000 locations across the United States.” Companies that work specifically on data analysis and information technology include Alion Science and Technology, Altera Corp., BMC Software, Cubic Corporation, Dynetics, Inc., ESRI, Informatica, ManTech International, MacDonald, Dettwiler and Associates, Inc., Verint Systems Inc., and many, many others. It's a multi-billion-dollar annual business. These companies' systems are used to parse enormous databases, scour all existing social networking platforms, integrate data from the vast troves in the hands of telecommunications companies and ISPs, and piece it all together to provide decision makers with actionable intelligence.
As former CIA director David Petraeus explained at In-Q-Tel’s CEO Summit in March 2012, “New cloud computing technologies developed by In-Q-Tel partner companies are driving analytic transformation in the way organizations store, access, and process massive amounts of disparate data via massively parallel and distributed IT systems … among the analytic projects underway with In-Q-Tel startups is one that enables the collection and analysis of worldwide social media feeds, along with projects that use either cloud computing or other methods to explore and analyze big data. These are very welcome additions to the initiatives we have underway to enable us to be the strongest swimmers in the ocean of big data.”

One of the more lucrative of these markets, and potentially the most troubling for privacy, is for biometrics and facial recognition systems. While developed for military, law enforcement, and intelligence purposes – approximately 70 percent of current spending – the broader consumer market is growing fast. Many social media and mobile platforms use facial recognition technology in their digital photo apps so that users can tag, categorize, and verify their own and their friends' identities – apps like Photo Tag Suggest, Tag My Face, FaceLook, Age Meter, Visidon AppLock, and Klik (produced by Face.com, an Israel-based company that was acquired by Facebook for $100 million in July 2012). In 2010, Cogent, a company that provides finger, palm, face, and iris ID systems to governments and private firms, estimated that the biometric market stood at $4 billion worldwide. Cogent then expected this figure to grow by about 20 percent a year, driven mostly by governments and law enforcement agencies interested in identification systems, a forecast that appears to have been proven correct.

In 2011, Google Executive Chairman (and former CEO) Eric Schmidt explained that he was “very concerned personally about the union of mobile tracking and face recognition.” Just the same, Google had purchased several facial recognition start-ups, including Viewdle and PittPatt, and the German biometric company Neven Vision. In the same year that Schmidt expressed his concerns, Google launched an “opt-in” photo-tagging feature, Find My Face, for its social network. Just as the NASA space program partially justified its existence on the basis of civilian benefits and spinoffs (and we ended up with freeze-dried food and memory foam as a result), so too we can expect the military and intelligence big-data fusion market to find its way into the civilian marketplace. Regulators are beginning to take note. In 2012, the Hamburg Commissioner for Data Protection and Freedom of Information in Germany issued an administrative order requiring Facebook to cease its automatic facial recognition system until it could bring its operations in line with European data privacy policies. The question remains: Will regulations like this be enough to stem the tide?

•  •  •

It is easy to demonize the companies involved, but big data is related to our own big habits and big desires. It is we users, after all, who share and network through social media, and it is we who have entrusted our information to “clouds” and social networking services operated by thousands of companies of every shape, size, and location. We are the ones who have socialized ourselves to share through clicking, through attachments and hyperlinks. Today, surveillance systems penetrate every aspect of life, and individuals can be mapped in space and time with an extraordinary degree of precision. All of this has emerged with our generally unwitting consent, but also with our desire for fame, consumption, and convenience. The “who” that makes up cyberspace is as important as the “what” – and the “who” is changing fast.

4. The China Syndrome

On November 16, 2009, under the headline “UN Slated [sic] for Stifling Net Debate” and the subhead “The UN has been criticised for stifling debate about net censorship after it disrupted a meeting of free-speech advocates in Egypt,” BBC News reported the following: “UN security demanded the removal of a poster promoting a book by the OpenNet Initiative (ONI) during a session at the Internet Governance Forum in Egypt. The poster mentioned internet censorship and China’s Great Firewall. The UN said that it had received complaints about the poster and that it had not been ‘pre-approved.’ ‘If we are not allowed to discuss topics such as Internet censorship, surveillance, and privacy at a forum on Internet governance, then what is the point of the IGF?’ Ron Deibert, co-founder of the OpenNet Initiative, told BBC News.”

It came out of nowhere. Our plan to hold a book reception for Access Controlled, the second in a three-part series of books on cyberspace, probably would have been yet another sleepy affair had Chinese authorities not intervened. Instead their intervention had a “Streisand Effect,” a phenomenon, according to Wikipedia, “whereby an attempt to hide or remove a piece of information has the unintended consequence of publicizing the information more widely.” Had the Chinese government not been censoring access to Wikipedia, they might have heard of the “Streisand Effect” and left the book launch alone. Alas, no.

In 2009, the annual Internet Governance Forum (IGF) meeting was held in Sharm el-Sheikh, Egypt. An incongruous location – the massive conference centre sits in the middle of the desert like a postmodern pyramid – but swarms of attendees from around the world descended on the facility, lanyards draped around necks, shoulders drooping from the weight of overstuffed conference tote bags. The IGF was set up by the UN as a forum to encourage multi-stakeholder discussions on Internet governance, discussions that included civil society groups and the private sector in an arena typically reserved for governments. It might as well have had a red target painted on it for those intent on preventing that very thing from happening.

Our book launch was disrupted before it even began by UN officials and security guards. First, they approached me demanding that we cease distributing pamphlets advertising the event, circling in pen a reference to Tibet in our promotional material. I was confused. “What’s going on here?” I asked. After some back and forth, I agreed to stop distributing promotional material as the event was about to begin and the benefits of additional PR seemed negligible. Moments later, however, the security officials returned, this time insisting that we remove our banner displayed outside the room. An enlarged version of the book cover, the banner included back-of-jacket promotional blurbs and descriptions. When I asked what the problem was, an official pointed to the description of the “Great Firewall of China” and spoke in hushed tones about a “member state” issuing a formal complaint. “You can’t be serious,” I said.

The banner was brought into the room and inelegantly laid on the floor, the debate with security guards continuing as curious onlookers gathered around. Activists, diplomats, and scholars circled us, the number of people in the room growing as word spread. Inevitably, smartphones were pulled out, pictures and videos taken, most of them immediately posted on the Internet. (Many are still available online.) Though we protested vehemently, the banner was gathered up and escorted out of the room, followed by a procession of catcalls and mock applause. What I expected to be a nonevent had turned into a political melee. Press inquiries began almost as soon as the book launch was over, and I found myself fielding calls from the BBC, CBC, and news outlets from around the world. “Did the United Nations censor your book launch?” “Did they really tear up your poster?” I was repeatedly asked. The IGF’s president, Markus Kummer, made matters worse by excusing the shutdown as a problem not with what was printed on the banner, but with the banner itself. “Commercial banners are not permitted at the IGF,” he claimed, his remarks plainly contradicted by dozens of other banners and posters spread throughout the facility. Kummer appeared less concerned about offending free speech than about offending the Chinese government. Included in the crowd was the UN Special Rapporteur on Freedom of Opinion and Expression, Frank La Rue. A few months later, he issued a statement proclaiming the Internet “a human right.” I wondered to what extent the events in that room inspired his actions.

Internet pundits like to think of autocratic countries like China as aging dinosaurs flailing about trying to stay afloat amidst the tsunami of digital information. In truth, while the West invented the Internet, gave it its original design and vision, and, latterly, assigned to it all sorts of powers and capabilities for rights and freedoms, what cyberspace will look like in the future and how it will be used will largely depend on who uses it. The Internet may have been born in Silicon Valley or Cambridge, Massachusetts, but its destiny lies in Shanghai, Delhi, and the streets of Rio de Janeiro, the places where its next billion users are going to come from. China’s actions at the IGF may have been ham-fisted, but they were not an accidental or ad hoc reaction. Rather, they were part of a concerted effort to control cyberspace, an effort we all need to understand in its details.

•  •  •

“In America, the Internet was originally designed to be free of choke points, so that each packet of information could be routed quickly around any temporary obstruction,” James Fallows wrote in the Atlantic Monthly in March 2008. “In China, the Internet came with choke points built in.” Fallows is referring to the Great Firewall of China (GFW), the country’s Internet censorship system, subject of the IGF banner controversy, and a descriptor that is both metaphor and not.

Secrecy surrounds the GFW, but it is China’s Internet backbone and guardian, the country’s deepest layer of communications infrastructure through which all Internet traffic must eventually pass, specifically at three international gateways that connect China’s Internet to the wider world: in the Beijing-Tianjin-Qingdao area, in Shanghai, and in Guangzhou. For a country with more than 500 million Internet users surfing, texting, downloading, and emailing, this is a small set of funnels, but the routers automatically inspect all traffic moving in and out, acting as a kind of border patrol. Requests for content that contains banned keywords, domains, or IP addresses are punted back unceremoniously. Unlike other countries that impose national Internet censorship regimes and present the user with a “blocked” or “forbidden” page, the Chinese system throws a wrench into the user’s connection: a “reset” packet that disables the connection and sends back a standard error message, giving the impression that the content requested doesn’t exist (“file not found”) or that something is wrong with the Internet. It’s an ingenious way to frustrate users: if you make some websites persistently inaccessible, slow, or maddeningly unreliable for long enough, most people will eventually look elsewhere. Meanwhile, certain Chinese-based content is made widely and freely available for those who want to surf “a” Net, if not “the” Net. What other functionalities are contained in these gateway routers – surveillance through deep packet inspection of email, for instance – is anyone’s guess, but most cyberspace analysts suspect that the gateways are designed not just to block content but also to siphon up and monitor communications.
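To make the mechanics concrete, the toy Python sketch below mimics the kind of in-path keyword filtering described above: a middlebox inspects each request and, on a match against a blocklist, forges TCP “reset” packets to both ends of the connection instead of serving a “blocked” page. The keywords, domains, and function names are all invented for illustration; this is not the GFW’s actual implementation, which remains secret.

```python
# Toy simulation of in-path keyword filtering with connection resets.
# Blocklist entries and request strings are invented for illustration only.

BANNED_KEYWORDS = {"falun", "tiananmen"}   # hypothetical keyword blocklist
BANNED_DOMAINS = {"example-blocked.org"}   # hypothetical blocked domain

def inspect(request: str) -> str:
    """Decide what an in-path filter does with a single HTTP request."""
    lowered = request.lower()
    host = ""
    for line in lowered.splitlines():
        if line.startswith("host:"):
            host = line.split(":", 1)[1].strip()
    if host in BANNED_DOMAINS or any(k in lowered for k in BANNED_KEYWORDS):
        # Instead of answering, the filter spoofs RST packets to client and
        # server, so the user sees "connection reset" rather than any notice
        # that the content was censored.
        return "INJECT_RST"
    return "FORWARD"

if __name__ == "__main__":
    allowed = "GET /news HTTP/1.1\r\nHost: example.com\r\n\r\n"
    blocked = "GET /search?q=tiananmen HTTP/1.1\r\nHost: example.com\r\n\r\n"
    print(inspect(allowed))   # FORWARD
    print(inspect(blocked))   # INJECT_RST
```

From the user’s side, the only symptom is a failed connection, which is precisely why a reset rather than a block page is so effective at making censorship look like ordinary network flakiness.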
