Terminator and Philosophy: I'll Be Back, Therefore I Am


Richard Brown, William Irwin, Kevin S. Decker

 
5. It may be the case that Bogo is simply uninterested in this genre of television, as he did enjoy the documentary The Wild Parrots of Telegraph Hill. But it is more likely that he didn’t really understand that film any more than he understands Terminator: The Sarah Connor Chronicles.
 
6. John Stuart Mill, Utilitarianism, in The Basic Writings of John Stuart Mill (New York: Modern Library, 2002), 290.
 
7. Ibid.
 
8. The U.S. Census Bureau estimated that 5.8 billion people were alive in 1997. See www.census.gov/ipc/www/idb/worldpopinfo.html.
 
9. This is not to discount the possibility that we could be better for having to deal with arduous events. Theologian Paul Tillich (1886-1965) argues that evils in the world are justified so that we can improve ourselves and better our souls. Theological arguments aside, I think it is a fair assumption that more interests would be satisfied if Judgment Day didn’t happen at all than if it happens and humans are ultimately victorious over the machines.
 
10. It could be argued that Sarah wouldn’t take the machines’ interests into consideration at all. But I find this questionable, since she sees the humanity in John’s Terminator bodyguard.
 
11. I’d like to thank Tony Nguyen, Gary Buzzell, and Kevin S. Decker for valuable comments on an earlier draft of this chapter. I’d also like to thank my wife, Tiffany, for comments, and especially for putting up with repeat viewings of the Terminator films.
 
13
 
THE WAR TO END ALL WARS? KILLING YOUR DEFENSE SYSTEM
 
Phillip Seng
 
 
“It’s in your nature to destroy yourselves.” “Yeah. Major drag, huh?”
—John Connor and the T-101, Terminator 2: Judgment Day
 
 
The world of the Terminator movies is in a constant state of war. The interesting catch is that hardly anyone in the late-twentieth-century world depicted by James Cameron has any clue that this particular war is being waged. Much less do they realize that this war is for the future of humankind. In the real world, the wars we fight with other nations (populated by humans rather than robotic killers) usually follow certain rules that have evolved over time to provide a degree of rationality and integrity to war. The aim of these rules, grouped together into a doctrine that’s called “just-war theory,” is to impart some sense of justice to the instigation, conduct, and resolution of the wars we fight.
 
Of course, the war we’re concerned with here is the war between humans and the Skynet defense system—the war of human against machine. We love to watch the explosions, the special effects, and the endless supply of ammo spent trying to kill the damn machines that we originally built for our own defense. But in the midst of all the shooting, we miss the fact that there are rules to war, at least for the humans who are fighting, and that this war is indeed a just war.
 
What Is It Good For?
 
Just-war theory has three basic parts.1 There are fancy Latin terms for all these ideas, but you’ve already encountered many of them in actual political speeches that support or decry particular wars. First, there is the part of just-war theory that deals with the decision about whether or not to go to war. We don’t learn much about the cause of the war in the original Terminator movie—we just get thrown into the middle of things, and we can either go along with them or die. The opening voice-over narration to the first movie explains, “The machines rose from the ashes of the nuclear fire. Their war to exterminate mankind had raged for decades, but the final battle would not be fought in the future. It would be fought here, in our present. Tonight . . .” Sarah Connor tells us this bit of information, and of course she’s talking about it from the perspective of a person who’s just killed one of the Terminators. Now, it seems from this that the machines—the Skynet defense system and all the automated killing machines created to carry out its superintelligent directives—suffered a nuclear blast and came out of it punching. But then again, it also sounds as if Skynet started a war to kill off humans and is carrying the war into the past, like a temporal uppercut, to try and end the war once and for all.
 
From the beginning of The Terminator it’s pretty easy to claim that the war is being fought for good reasons. After all, it’s a case of self-defense carried to the highest level. We’re not just protecting our homes from invaders, but protecting all human life from extermination. Self-defense, in terms of just-war theory, is a kind of reason that gives us just cause for going to war. Having a just cause is necessary in order to wage a just war. If you don’t have a just cause when you head into war, then chances are that the war you’re thinking of starting has an illegitimate motivation—whether revenge, a land grab, or ethnic cleansing. Self-defense, protection of those who cannot defend themselves, national security, and overthrow of a brutal dictator have been the primary causes under which people have rallied to wage just wars in our past.
 
From the first movie it seems as if the humans are indeed fighting a just war—they’re defending themselves from extinction. This attitude is reinforced the more we learn about the future circumstances that lead up to Skynet’s attack. Reese explains to Sarah, shortly after he has absconded with her, “It was the machines, Sarah. Defense network computers. New. Powerful. Hooked into everything, trusted to run it all. They say it got smart—a new order of intelligence. Then it saw all people as a threat, not just the ones on the other side. It decided our fate in a microsecond: extermination.” Reese, from his knowledge of the circumstances of the war, lays the blame on the machines. The machines got smart and wanted to be the king of the hill. Seeing a threat from humans, the machines took action (maybe even preemptive action?) and launched nukes against humanity to rid the earth of everything but the cockroaches. According to what we’ve discussed so far, it seems pretty clear that the machines acted unjustly and that humans have every right to defend themselves in this war.
 
But It’s Self-Defense!
 
The plot thickens, as they say, with Terminator 2: Judgment Day. There’s still a war raging, and even though little John Connor is one or two misdemeanors away from juvie and his mom’s only solace is becoming as rock-solid as the machines she’s destined to fight, the war takes on a different type of justification. Recall how Arnold Schwarzenegger—in a clever reprise of his very mechanical role as the T-101 in the first movie—explains the origins of the war to Sarah and John as they run for the border:
 
T-101: In three years Cyberdyne will become the largest supplier of military computer systems. All stealth bombers are upgraded with Cyberdyne computers, becoming fully unmanned. Afterward, they fly with a perfect operational record. The Skynet funding bill is passed. The system goes online on August fourth, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 AM Eastern time, August twenty-ninth. In a panic, they try to pull the plug.
 
 
 
Sarah: Skynet fights back?
 
T-101: Yes. It launches its missiles against the targets in Russia.
 
John: Why attack Russia? Aren’t they our friends now?
 
T-101: Because Skynet knows the Russian counterattack will eliminate its enemies over here.
 
Sarah: Jesus.
 
The way Arnold explains things, the war that we always thought was started by machines was actually Skynet’s way of acting on the same ideals of self-defense that we use to justify wars against an aggressor. In other words, the machines were just defending themselves from us. We tried to pull the plug on them, and they fought back by nuking Russia so Russia would counterattack, fulfilling the wildest fantasies of Dr. Strangelove: mutually assured destruction and wonderful special effects, to boot!
 
One of the problems in applying the just-war theory to these movies, or to other movies that show intelligent machines acting in defiance of their human creators, is that we must either grant machines the same moral status as humans, or else the argument is moot. If we think that there’s any validity to the idea of a machine’s defending itself, then we have to think machines can be just in the same way that we humans strive to be. But if machines are merely things that we use, things that can be turned on or off simply to meet our needs, then we really don’t need any reason at all to crush them, to incinerate them—from a certain perspective, to kill them. As the audience, when we’re thrown into the middle of the action in these movies, we fail to even look for an argument in favor of the machines’ self-defense. It’s interesting that without so much as a line of dialogue or a cameo until Terminator 3: Rise of the Machines, Skynet is given a status equal to that of humans—it can defend itself in its own interest for survival. We don’t like the way in which it defends itself, but that does not mean that it can’t or shouldn’t do so.
 
“In a Panic, They Try to Pull the Plug”
 
So it’s pretty clear that the question of whether or not humans had a just cause in their war against the machines is on shakier ground than it seemed at first glance. Terminator 3 provides a convenient illustration: Brewster gives the order to initiate Skynet, and then as he’s about to rescind his order, the T-X struts in disguised as his daughter Kate and shoots him twice in the chest. A clear example of self-defense, right? Only if you buy the argument that killing all humans is the only way for Skynet to survive. Another rule within just-war theory that we need to consider is that military action is just only if war is the last resort in any given situation. By “last resort,” we usually mean that all avenues of diplomacy have been exhausted and the only way to resolve some existing injustice is by force of arms.
 
Skynet, because it is a superintelligent supercomputer, decides in a microsecond that the only way to resolve the human problem is to exterminate humanity. Where is the discussion? Where were the international tribunals or the six-party negotiations? Of course, Skynet most likely thinks it can skip all the deliberations because it is smarter than any other being on earth, even Garry Kasparov. It knows how we’ll respond, and how we’ll respond to its counterresponse, and so on. Why bother with all that mind-numbing talk? Just hit the button and get the damn thing over with.
 
Another way to look at the issue is this: computers are only as intelligent as the people who make them and the people who program them. The old adage “Garbage in, garbage out” might be a better expression of what we can expect from our computer defense systems. If humans programmed Skynet to bomb first and ask questions later, well, then it looks like it’s our fault for not inserting a diplomatic back door to the system. We created the monster, and only after seeing the horrible mess it makes do we ask for our money back.
 
For humans, though, the decision to go to war is usually more than simply a matter of cold calculation. We like to believe in the possibility of something better than war, and we like to hold out hope that war is not inevitable. Yet just-war theory is built around the central idea that the world in which we live is not ideal, and therefore some pretty unpleasant things are sometimes necessary to make it better. This idea stems from early Christian thinkers who developed the notion of a “just war” in order to find protection in their faith in the midst of killing other humans.2 The best offense is a good defense, though, because we don’t enjoy the wholesale slaughter of other people as a general principle. We therefore try to make war the last resort. In all three of the Terminator movies, though, the war of Judgment Day is inevitable, and so the need to fight is also inevitable.
 
“Talk to the Hand”
 
Arnold, nude again as the T-101 in the beginning of T3, tries to get another leather outfit to cover his muscles. This time, of course, it’s not a motorcycle-riding bar brawler but a male dancer pretending to be a leather-clad biker. The T-101 says he needs clothes, but the dancer tells him to “talk to the hand.” Arnold doesn’t understand the slang, grabs the hand—breaking bones—and speaks into the crunched fingers, “Give me your clothes.”
 
We laugh, of course, because it’s another display of Arnold’s masculinity against the pseudo-masculinity of the dancer. And it’s another occasion when the Terminator demonstrates that he doesn’t understand the ways of human interaction. The Terminator has no legitimate authority to take the dancer’s clothes, but because Arnold is stronger than the dancer, he wins the prize. It’s clear that all the Terminators act upon the principle that “might makes right.” The only thing that matters is being able to do what you want to do, if you’re strong enough to get it done (or as one of my fellow Nebraskans says, “Git ’r done”). So the only concept of justice that Skynet understands—if you can even call it justice—is something we might recognize as the survival of the fittest, if we are careful to take that phrase out of its typical evolutionary context. Strength prevails, and the weak die.
