Foundation and Earth
by Isaac Asimov

“Perfectly, sir. I was manufactured, and existed for a time—how brief a time it seems to me, now—on the Spacer world of Aurora.”

“The one with the—” Trevize paused.

“Yes, sir. The one with the dogs.”

“You know about that?”

“Yes, sir.”

“How do you come to be here, then, if you lived at first on Aurora?”

“Sir, it was to prevent the creation of a radioactive Earth that I came here in the very beginnings of the settlement of the Galaxy. There was another robot with me, named Giskard, who could sense and adjust minds.”

“As Bliss can?”

“Yes, sir. We failed, in a way, and Giskard ceased to operate. Before the cessation, however, he made it possible for me to have his talent and left it to me to care for the Galaxy; for Earth, particularly.”

“Why Earth, particularly?”

“In part because of a man named Elijah Baley, an Earthman.”

Pelorat put in excitedly, “He is the culture-hero I mentioned some time ago, Golan.”

“A culture-hero, sir?”

“What Dr. Pelorat means,” said Trevize, “is that he is a person to whom much was attributed, and who may have been an amalgamation of many men in actual history, or who may be an invented person altogether.”

Daneel considered for a moment, and then said, quite calmly, “That is not so, sirs. Elijah Baley was a real man and he was one man. I do not know what your legends say of him, but in actual history, the Galaxy might never have been settled without him. In his honor, I did my best to salvage what I could of Earth after it began to turn radioactive. My fellow-robots were distributed over the Galaxy in an effort to influence a person here—a person there. At one time I maneuvered a beginning to the recycling of Earth’s soil. At another much later time, I maneuvered a beginning to the terraforming of a world circling the nearby star, now called Alpha. In neither case was I truly successful. I could never adjust human minds entirely as I wished, for there was always the chance that I might do harm to the various humans who were adjusted. I was bound, you see—and am bound to this day—by the Laws of Robotics.”

“Yes?”

It did not necessarily take a being with Daneel’s mental power to detect uncertainty in that monosyllable.

“The First Law,” he said, “is this, sir: ‘A robot may not injure a human being or, through inaction, allow a human being to come to harm.’ The Second Law: ‘A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.’ The Third Law: ‘A robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.’ —Naturally, I give you these laws in the approximation of language. In actual fact they represent complicated mathematical configurations of our positronic brain-paths.”

“Do you find it difficult to deal with those Laws?”

“I must, sir. The First Law is an absolute that almost forbids the use of my mental talents altogether. When dealing with the Galaxy it is not likely that any course of action will prevent harm altogether. Always, some people, perhaps many people, will suffer, so that a robot must choose minimum harm. Yet, the complexity of possibilities is such that it takes time to make that choice and one is, even then, never certain.”

“I see that,” said Trevize.

“All through Galactic history,” said Daneel, “I tried to ameliorate the worst aspects of the strife and disaster that perpetually made itself felt in the Galaxy. I may have succeeded, on occasion, and to some extent, but if you know your Galactic history, you will know that I did not succeed often, or by much.”

“That much I know,” said Trevize, with a wry smile.

“Just before Giskard’s end, he conceived of a robotic law that superseded even the First Law. We called it the ‘Zeroth Law’ out of an inability to think of any other name that made sense. The Zeroth Law is: ‘A robot may not injure humanity or, through inaction, allow humanity to come to harm.’ This automatically means that the First Law must be modified to be: ‘A robot may not injure a human being, or, through inaction, allow a human being to come to harm, except where that would conflict with the Zeroth Law.’ And similar modifications must be made in the Second and Third Laws.”

Trevize frowned. “How do you decide what is injurious, or not injurious, to humanity as a whole?”

“Precisely, sir,” said Daneel. “In theory, the Zeroth Law was the answer to our problems. In practice, we could never decide. A human being is a concrete object. Injury to a person can be estimated and judged. Humanity is an abstraction. How do we deal with it?”

“I don’t know,” said Trevize.

“Wait,” said Pelorat. “You could convert humanity into a single organism. Gaia.”

“That is what I tried to do, sir. I engineered the founding of Gaia. If humanity could be made a single organism, it would become a concrete object, and it could be dealt with. It was, however, not as easy to create a superorganism as I had hoped. In the first place, it could not be done unless human beings valued the superorganism more than their individuality, and I had to find a mind-cast that would allow that. It was a long time before I thought of the Laws of Robotics.”

“Ah, then, the Gaians are robots. I had suspected that from the start.”

“In that case, you suspected incorrectly, sir. They are human beings, but they have brains firmly inculcated with the equivalent of the Laws of Robotics. They have to value life, really value it. —And even after that was done, there remained a serious flaw. A superorganism consisting of human beings only is unstable. It cannot be set up. Other animals must be added—then plants—then the inorganic world. The smallest superorganism that is truly stable is an entire world, and a world large enough and complex enough to have a stable ecology. It took a long time to understand this, and it is only in this last century that Gaia was fully established and that it became ready to move on toward Galaxia—and, even so, that will take a long time, too. Perhaps not as long as the road already traveled, however, since we now know the rules.”

“But you needed me to make the decision for you. Is that it, Daneel?”

“Yes, sir. The Laws of Robotics would not allow me, nor Gaia, to make the decision and chance harm to humanity. And meanwhile, five centuries ago, when it seemed that I would never work out methods for getting round all the difficulties that stood in the way of establishing Gaia, I turned to the second-best and helped bring about the development of the science of psychohistory.”

“I might have guessed that,” mumbled Trevize. “You know, Daneel, I’m beginning to believe you are twenty thousand years old.”

“Thank you, sir.”

Pelorat said, “Wait a while. I think I see something. Are you part of Gaia yourself, Daneel? Would that be how you knew about the dogs on Aurora? Through Bliss?”

Daneel said, “In a way, sir, you are correct. I am associated with Gaia, though I am not part of it.”

Trevize’s eyebrows went up. “That sounds like Comporellon, the world we visited immediately after leaving Gaia. It insists it is not part of the Foundation Confederation, but is only associated with it.”

Slowly, Daneel nodded. “I suppose that analogy is apt, sir. I can, as an associate of Gaia, make myself aware of what Gaia is aware of—in the person of the woman, Bliss, for instance. Gaia, however, cannot make itself aware of what I am aware of, so that I maintain my freedom of action. That freedom of action is necessary until Galaxia is well established.”

Trevize looked steadily at the robot for a moment, then said, “And did you use your awareness through Bliss in order to interfere with events on our journey to mold them to your better liking?”

Daneel sighed in a curiously human fashion. “I could not do much, sir. The Laws of Robotics always hold me back. —And yet, I lightened the load on Bliss’s mind, taking a small amount of added responsibility on myself, so that she might deal with the wolves of Aurora and the Spacer on Solaria with greater dispatch and with less harm to herself. In addition, I influenced the woman on Comporellon and the one on New Earth, through Bliss, in order to have them look with favor on you, so that you might continue on your journey.”

Trevize smiled, half-sadly. “I ought to have known it wasn’t I.”

Daneel accepted the statement without its rueful self-deprecation. “On the contrary, sir,” he said, “it was you in considerable part. Each of the two women looked with favor upon you from the start. I merely strengthened the impulse already present—about all one can safely do under the strictures of the Laws of Robotics. Because of those strictures—and for other reasons as well—it was only with great difficulty that I brought you here, and only indirectly. I was in great danger at several points of losing you.”

“And now I am here,” said Trevize. “What is it you want of me? To confirm my decision in favor of Galaxia?”

Daneel’s face, always expressionless, somehow managed to seem despairing. “No, sir. The mere decision is no longer enough. I brought you here, as best I could in my present condition, for something far more desperate. I am dying.”

102.

Perhaps it was because of the matter-of-fact way in which Daneel said it; or perhaps because a lifetime of twenty thousand years made death seem no tragedy to one doomed to live less than half a percent of that period; but, in any case, Trevize felt no stir of sympathy.

“Die? Can a machine die?”

“I can cease to exist, sir. Call it by whatever word you wish. I am old. Not one sentient being in the Galaxy that was alive when I was first given consciousness is still alive today; nothing organic; nothing robotic. Even I myself lack continuity.”

“In what way?”

“There is no physical part of my body, sir, that has escaped replacement, not only once but many times. Even my positronic brain has been replaced on five different occasions. Each time the contents of my earlier brain were etched into the newer one to the last positron. Each time, the new brain had a greater capacity and complexity than the old, so that there was room for more memories, and for faster decision and action. But—”

“But?”

“The more advanced and complex the brain, the more unstable it is, and the more quickly it deteriorates. My present brain is a hundred thousand times as sensitive as my first, and has ten million times the capacity; but whereas my first brain endured for over ten thousand years, the present one is but six hundred years old and is unmistakably senescent. With every memory of twenty thousand years perfectly recorded and with a perfect recall mechanism in place, the brain is filled. There is a rapidly declining ability to reach decisions; an even more rapidly declining ability to test and influence minds at hyperspatial distances. Nor can I design a sixth brain. Further miniaturization will run against the blank wall of the uncertainty principle, and further complexity will but assure decay almost at once.”

Pelorat seemed desperately troubled. “But surely, Daneel, Gaia can carry on without you. Now that Trevize has judged and selected Galaxia—”

“The process simply took too long, sir,” said Daneel, as always betraying no emotion. “I had to wait for Gaia to be fully established, despite the unanticipated difficulties that arose. By the time a human being—Mr. Trevize—was located who was capable of making the key decision, it was too late. Do not think, however, that I took no measure to lengthen my life span. Little by little I have reduced my activities, in order to conserve what I could for emergencies. When I could no longer rely on active measures to preserve the isolation of the Earth/moon system, I adopted passive ones. Over a period of years, the humaniform robots that have been working with me have been, one by one, called home. Their last tasks have been to remove all references to Earth in the planetary archives. And without myself and my fellow-robots in full play, Gaia will lack the essential tools to carry through the development of Galaxia in less than an inordinate period of time.”

“And you knew all this,” said Trevize, “when I made my decision?”

“A substantial time before, sir,” said Daneel. “Gaia, of course, did not know.”

“But then,” said Trevize angrily, “what was the use of carrying through the charade? What good has it been? Ever since my decision, I have scoured the Galaxy, searching for Earth and what I thought of as its ‘secret’—not knowing the secret was you—in order that I might confirm the decision. Well, I have confirmed it. I know now that Galaxia is absolutely essential—and it appears to be all for nothing. Why could you not have left the Galaxy to itself—and me to myself?”

Daneel said, “Because, sir, I have been searching for a way out, and I have been carrying on in the hope that I might find one. I think I have. Instead of replacing my brain with yet another positronic one, which is impractical, I might merge it with a human brain instead; a human brain that is not affected by the Three Laws, and will not only add capacity to my brain, but add a whole new level of abilities as well. That is why I have brought you here.”

Trevize looked appalled. “You mean you plan to merge a human brain into yours? Have the human brain lose its individuality so that you can achieve a two-brain Gaia?”

“Yes, sir. It would not make me immortal, but it might enable me to live long enough to establish Galaxia.”

“And you brought me here for that? You want my independence of the Three Laws and my sense of judgment made part of you at the price of my individuality? —No!”

Daneel said, “Yet you said a moment ago that Galaxia is essential for the welfare of the human—”

“Even if it is, it would take a long time to establish, and I would remain an individual in my lifetime. On the other hand, if it were established rapidly, there would be a Galactic loss of individuality and my own loss would be part of an unimaginably greater whole. I would, however, certainly never consent to lose my individuality while the rest of the Galaxy retains theirs.”

Daneel said, “It is, then, as I thought. Your brain would not merge well and, in any case, it would serve a better purpose if you retained an independent judgmental ability.”
