
I led her to the corner bench, where a nondescript gray box half a meter across sat, apparently inert. I gestured to it, and our retinal overlays transformed its appearance, “revealing” a maze with a transparent lid embedded in the top of the device. In one chamber of the maze, a slightly cartoonish mouse sat motionless. Not quite dead, not quite sleeping.

“This is the famous Zelda?” Francine asked.

“Yes.” Zelda was a neural network, a stripped-down, stylized mouse brain. There were newer, fancier versions available, much closer to the real thing, but the ten-year-old, public domain Zelda had been good enough for our purposes.

Three other chambers held cheese. “Right now, she has no experience of the maze,” I explained. “So let’s start her up and watch her explore.” I gestured, and Zelda began scampering around, trying out different passages, deftly reversing each time she hit a cul-de-sac. “Her brain is running on a Qusp, but the maze is implemented on an ordinary classical computer, so in terms of coherence issues, it’s really no different from a physical maze.”

“Which means that each time she takes in information, she gets entangled with the outside world,” Francine suggested.

“Absolutely. But she always holds off doing that until the Qusp has completed its current computational step, and every qubit contains a definite zero or a definite one. She’s never in two minds when she lets the world in, so the entanglement process doesn’t split her into separate branches.”

Francine continued to watch, in silence. Zelda finally found one of the chambers containing a reward; when she’d eaten it, a hand scooped her up and returned her to her starting point, then replaced the cheese.

“Here are ten thousand previous trials, superimposed.” I replayed the data. It looked as if a single mouse was running through the maze, moving just as we’d seen her move when I’d begun the latest experiment. Restored each time to exactly the same starting condition, and confronted with exactly the same environment, Zelda – like any computer program with no truly random influences – had simply repeated herself. All ten thousand trials had yielded identical results.

To a casual observer, unaware of the context, this would have been a singularly unimpressive performance. Faced with exactly one situation, Zelda the virtual mouse did exactly one thing. So what? If you’d been able to wind back a flesh-and-blood mouse’s memory with the same degree of precision, wouldn’t it have repeated itself too?

Francine said, “Can you cut off the shielding? And the balanced decoupling?”

“Yep.” I obliged her, and initiated a new trial.

Zelda took a different path this time, exploring the maze by a different route. Though the initial condition of the neural net was identical, the switching processes taking place within the Qusp were now opened up to the environment constantly, and superpositions of several different eigenstates – states in which the Qusp’s qubits possessed definite binary values, which in turn led to Zelda making definite choices – were becoming entangled with the outside world. According to the Copenhagen interpretation of quantum mechanics, this interaction was randomly “collapsing” the superpositions into single eigenstates; Zelda was still doing just one thing at a time, but her behavior had ceased to be deterministic. According to the MWI, the interaction was transforming the environment – Francine and me included – into a superposition with components that were coupled to each eigenstate; Zelda was actually running the maze in many different ways simultaneously, and other versions of us were seeing her take all those other routes.

Which scenario was correct?

I said, “I’ll reconfigure everything now, to wrap the whole setup in a Delft cage.” A “Delft cage” was jargon for the situation I’d first read about seventeen years before: instead of opening up the Qusp to the environment, I’d connect it to a second quantum computer, and let that play the role of the outside world.

We could no longer watch Zelda moving about in real time, but after the trial was completed, it was possible to test the combined system of both computers against the hypothesis that it was in a pure quantum state in which Zelda had run the maze along hundreds of different routes, all at once. I displayed a representation of the conjectured state, built up by superimposing all the paths she’d taken in ten thousand unshielded trials.

The test result flashed up: CONSISTENT.

“One measurement proves nothing,” Francine pointed out.

“No.” I repeated the trial. Again, the hypothesis was not refuted. If Zelda had actually run the maze along just one path, the probability of the computers’ joint state passing this imperfect test was about one percent. For passing it twice, the odds were about one in ten thousand.

I repeated it a third time, then a fourth.

Francine said, “That’s enough.” She actually looked queasy. The image of the hundreds of blurred mouse trails on the display was not a literal photograph of anything, but if the old Delft experiment had been enough to give me a visceral sense of the reality of the multiverse, perhaps this demonstration had finally done the same for her.

“Can I show you one more thing?” I asked.

“Keep the Delft cage, but restore the Qusp’s shielding?”

“Right.”

I did it. The Qusp was now fully protected once more whenever it was not in an eigenstate, but this time, it was the second quantum computer, not the outside world, to which it was intermittently exposed. If Zelda split into multiple branches again, then she’d only take that fake environment with her, and we’d still have our hands on all the evidence.

Tested against the hypothesis that no split had occurred, the verdict was: CONSISTENT. CONSISTENT. CONSISTENT.

#

We went out to dinner with the whole of the team, but Francine pleaded a headache and left early. She insisted that I stay and finish the meal, and I didn’t argue; she was not the kind of person who expected you to assume that she was being politely selfless, while secretly hoping to be contradicted.

After Francine had left, Maria turned to me. “So you two are really going ahead with the Frankenchild?” She’d been teasing me about this for as long as I’d known her, but apparently she hadn’t been game to raise the subject in Francine’s presence.

“We still have to talk about it.” I felt uncomfortable myself, now, discussing the topic the moment Francine was absent. Confessing my ambition when I applied to join the team was one thing; it would have been dishonest to keep my collaborators in the dark about my ultimate intentions. Now that the enabling technology was more or less completed, though, the issue seemed far more personal.

Carlos said breezily, “Why not? There are so many others now. Sophie. Linus. Theo. Probably a hundred we don’t even know about. It’s not as if Ben’s child won’t have playmates.” Adai – Autonomously Developing Artificial Intelligences – had been appearing in a blaze of controversy every few months for the last four years. A Swiss researcher, Isabelle Schib, had taken the old models of morphogenesis that had led to software like Zelda, refined the technique by several orders of magnitude, and applied it to human genetic data. Wedded to sophisticated prosthetic bodies, Isabelle’s creations inhabited the physical world and learned from their experience, just like any other child.

Jun shook his head reprovingly. “I wouldn’t raise a child with no legal rights. What happens when you die? For all you know, it could end up as someone’s property.”

I’d been over this with Francine. “I can’t believe that in ten or twenty years’ time there won’t be citizenship laws, somewhere in the world.”

Jun snorted. “Twenty years! How long did it take the U.S. to emancipate their slaves?”

Carlos interjected, “Who’s going to create an adai just to use it as a slave? If you want something biddable, write ordinary software. If you need consciousness, humans are cheaper.”

Maria said, “It won’t come down to economics. It’s the nature of the things that will determine how they’re treated.”

“You mean the xenophobia they’ll face?” I suggested.

Maria shrugged. “You make it sound like racism, but we aren’t talking about human beings. Once you have software with goals of its own, free to do whatever it likes, where will it end? The first generation makes the next one better, faster, smarter; the second generation even more so. Before we know it, we’re like ants to them.”

Carlos groaned. “Not that hoary old fallacy! If you really believe that stating the analogy ‘ants are to humans, as humans are to x’ is proof that it’s possible to solve for x, then I’ll meet you where the south pole is like the equator.”

I said, “The Qusp runs no faster than an organic brain; we need to keep the switching rate low, because that makes the shielding requirements less stringent. It might be possible to nudge those parameters, eventually, but there’s no reason in the world why an adai would be better equipped to do that than you or I would. As for making their own offspring smarter … even if Schib’s group has been perfectly successful, they will have merely translated human neural development from one substrate to another. They won’t have ‘improved’ on the process at all – whatever that might mean. So if the adai have any advantage over us, it will be no more than the advantage shared by flesh-and-blood children: cultural transmission of one more generation’s worth of experience.”

Maria frowned, but she had no immediate comeback.

Jun said dryly, “Plus immortality.”

“Well, yes, there is that,” I conceded.

#

Francine was awake when I arrived home.

“Have you still got a headache?” I whispered.

“No.”

I undressed and climbed into bed beside her.

She said, “You know what I miss the most? When we’re fucking on-line?”

“This had better not be complicated; I’m out of practice.”

“Kissing.”

I kissed her, slowly and tenderly, and she melted beneath me. “Three more months,” I promised, “and I’ll move up to Berkeley.”

“To be my kept man.”

“I prefer the term ‘unpaid but highly valued caregiver.’” Francine stiffened. I said, “We can talk about that later.” I started kissing her again, but she turned her face away.

“I’m afraid,” she said.

“So am I,” I assured her. “That’s a good sign. Everything worth doing is terrifying.”

“But not everything terrifying is good.”

I rolled over and lay beside her. She said, “On one level, it’s easy. What greater gift could you give a child, than the power to make real decisions? What worse fate could you spare her from, than being forced to act against her better judgment, over and over? When you put it like that, it’s simple.

“But every fiber in my body still rebels against it. How will she feel, knowing what she is? How will she make friends? How will she belong? How will she not despise us for making her a freak? And what if we’re robbing her of something she’d value: living a billion lives, never being forced to choose between them? What if she sees the gift as a kind of impoverishment?”

“She can always drop the shielding on the Qusp,” I said. “Once she understands the issues, she can choose for herself.”

“That’s true.” Francine did not sound mollified at all; she would have thought of that long before I’d mentioned it, but she wasn’t looking for concrete answers. Every ordinary human instinct screamed at us that we were embarking on something dangerous, unnatural, hubristic – but those instincts were more about safeguarding our own reputations than protecting our child-to-be. No parent, save the most willfully negligent, would be pilloried if their flesh-and-blood child turned out to be ungrateful for life; if I’d railed against my own mother and father because I’d found fault in the existential conditions with which I’d been lumbered, it wasn’t hard to guess which side would attract the most sympathy from the world at large. Anything that went wrong with our child would be grounds for lynching – however much love, sweat, and soul-searching had gone into her creation – because we’d had the temerity to be dissatisfied with the kind of fate that everyone else happily inflicted on their own.

I said, “You saw Zelda today, spread across the branches. You know, deep down now, that the same thing happens to all of us.”

“Yes.” Something tore inside me as Francine uttered that admission. I’d never really wanted her to feel it, the way I did.

I persisted. “Would you willingly sentence your own child to that condition? And your grandchildren? And your great-grandchildren?”

“No,” Francine replied. A part of her hated me now; I could hear it in her voice. It was my curse, my obsession; before she met me, she’d managed to believe and not believe, taking her acceptance of the multiverse lightly.

I said, “I can’t do this without you.”

“You can, actually. More easily than any of the alternatives. You wouldn’t even need a stranger to donate an egg.”

“I can’t do it unless you’re behind me. If you say the word, I’ll stop here. We’ve built the Qusp. We’ve shown that it can work. Even if we don’t do this last part ourselves, someone else will, in a decade or two.”

“If we don’t do this,” Francine observed acerbically, “we’ll simply do it in another branch.”

I said, “That’s true, but it’s no use thinking that way. In the end, I can’t function unless I pretend that my choices are real. I doubt that anyone can.”

Francine was silent for a long time. I stared up into the darkness of the room, trying hard not to contemplate the near certainty that her decision would go both ways.

Finally, she spoke.

“Then let’s make a child who doesn’t need to pretend.”

 

2031

 

Isabelle Schib welcomed us into her office. In person, she was slightly less intimidating than she was on-line; it wasn’t anything different in her appearance or manner, just the ordinariness of her surroundings. I’d envisaged her ensconced in some vast, pristine, high-tech building, not a couple of pokey rooms on a back-street in Basel.
