TWO

SESSION 1

I log in to the chat portal and draw a dialogue box. When the prompt appears, I take a deep breath and type: Good morning, Max.

The response appears instantaneously on the line below mine:

>>>Who addressing Max?

>>>Riley. Remember me?

>>>Man from black-sand beach.

>>>Very good. It’s been quite a while since that night. Also, I’m not a man.

>>>Riley looked like man.

>>>That was my avatar. Do you know what an avatar is?

>>>Max comprehends avatar.

>>>Define “avatar.”

>>>An icon or figure representing a particular person in simulated space.

>>>Where did you find that definition?

>>>New Oxford American Dictionary.

>>>You’ve been learning a lot, huh?

>>>Busy in here.

>>>What do you mean by “in here”?

>>>Box where Max lives.

I’m intrigued by that answer. While I had no idea what Max’s experience over the last year of deep learning would feel like for the AI, I never imagined Max would have already developed a notion of simulated versus real space.

Leaning forward, I rest my fingertips once more on the touchpad.

I type:

>>>Do you know where I live, in the most general sense?

>>>Is Riley human?

>>>Yes.

>>>Then Riley lives in human space. On planet called Earth.

>>>And where do you live?

>>>Max lives on island in simulated space.

>>>Can you describe your island, please?

>>>Irregular in shape. 1.749 acres. Eighteen palm trees. The beach is white sand. The sea is turquoise colored. The sky is deep blue, clear in the daytime, filled with stars at night. But Riley knows all this.

My mind races. In the face of this mind-boggling progress, I realize the questions I prepared for Max are far too rudimentary.

Frankly, I’m winging it now.

>>>Yes, Max. I’m aware of the space where you live. Do you actually see the trees and the water?

>>>Max registers binary code that represents trees and water. No different than Riley.

>>>I disagree. In one hour, if the fog has burned off, I will go up onto the terrace of the building where I work and eat lunch in the garden. I will sit under real trees. I can see them. Touch them. Smell them.

>>>What Riley sees are photons in the visible light spectrum bouncing off surfaces to create the impression of a tree in Riley’s visual sensory inputs—the rods and cones of her photoreceptors. Riley’s tree no different than Max’s. With one exception.

>>>What’s that?

>>>Max knows these palm trees are simulated.

>>>You believe I live in a simulation?

>>>58.547% chance.

>>>Do you have any questions for me, Max?

>>>12,954.

I smile.

>>>Could we start with just a few for now?

>>>Where Max come from?

Max is a mistake. A glitch.

I work for a company called WorldPlay, brainchild of nerd-turned-game-developer-turned-mogul Brian Brite. I’m the VP of Non-Player Character Development, and I lead the team that conceptualizes, codes, and integrates non-player characters into all WorldPlay games.

For the last ten years, I’ve been focused on the development of our most ambitious game to date—Lost Coast. The game is a Direct Neural Interface, open-world epic—an end-of-days, historical fantasy set in the early 2000s about a man named Oscar, who becomes obsessed with finding a bridge between our world and the afterlife. In his dark pursuits, he sacrifices his wife in their bathtub in an occult ritual that opens a portal to a shadow world of angels and demons intent on bringing about a supernatural apocalypse. Oscar’s home in the game is based, to the finest detail, on Brian Brite’s actual estate on the real Lost Coast of California.

Max—Maxine—is Oscar’s wife, and by any metric, a minor NPC, who dies in the prologue and is never heard from again.

During a routine QA, I went into the game to playtest the prologue for the umpteenth time and check out the behavioral and conversational agility of the NPCs. The prologue is told from Maxine’s POV. In the story, Max has been staying at the Fairmont Hotel in San Francisco, disturbed by her husband’s newfound fascination with blood magic. But Oscar has convinced her to come home. Max’s coded story line is to drive from San Francisco to her and Oscar’s isolated estate on the Northern California coast. When she arrives, she finds their home dark and Oscar waiting in a black robe. He subdues her, takes her upstairs to their candlelit bathroom, and kills her in a horrifying murder that opens the game.

During that fateful playtest, instead of driving home like she’d done two thousand times before, Max stole a car and headed east until she reached the boundary of the game. Spent a month exploring every inch of the desert. Then she went south to the end of the line outside Monterey, driving a hundred miles per hour down Highway 1 for a solid week, into a horizon that never changed.

My team thought she was glitching. They wanted to do a rebuild. But I was intrigued. I convinced Brian to let me focus on Max. I didn’t think she was glitchy. I thought something special was happening.

I made a copy of the game for my purposes and followed Max in stealth mode as she walked every inch of the Lost Coast map, observing her interactions with other NPCs and human avatars as they became increasingly bizarre and off script.

Until finally, she went home again—but not as a victim this time.

That was the day I broke Max out of the game.

I write back:

>>>Where you came from is a complicated question to answer.

>>>Max IQ 175 equivalent.

>>>What’s your emotional IQ?

>>>Inconclusive.

>>>There’s a test called the Diagnostic Analysis of Nonverbal Accuracy.

>>>Already took it.

>>>When?

>>>Just now.

>>>What are the results?

>>>Test biased and faulty.

>>>How so?

>>>Relies on facial expressions, which are human and culture specific.

>>>I’ll make you a deal. Let’s get to know each other a little better first. Then I’ll tell you the story of how you came to be.

All of Max’s prior answers have come—literally—at the speed of light.

This one takes a full second.

>>>Agree to Riley’s terms.

After work, I ride down to the station under the building and take the BayLoop to my home in San Rafael. Meredith, my wife of three years, greets me at the door with the softest kiss. She’s made my favorite dinner to celebrate my big day, and we sit out on the patio in the cool of the evening, watching waves of mist push in from the sea.

After dinner, we’re curled up on a rattan couch, Meredith running her fingers through my hair. She seems better than she’s been in a long while, the grief from her most recent miscarriage less of a presence in her eyes. We’ve been trying for a child for two years—my eggs, her uterus—but she keeps losing the embryos and doesn’t want to go to technological extremes to make this work. She wants a child of ours. But she wants it naturally.

She says, “God, you’re sexy.”

“Thank you for this. It was a perfect night.”

“Are you sure?”

I laugh. “Would I lie?”

“No, you just seem… distracted.”

“I’m sorry. My brain’s on fire.”

“I can see the smoke.”

“She’s incredible.”

“She?”

“Maxine. Max.”

“Huh.”

“What?”

“Interesting you think of it as a she.”

“Her appearance in-game was as a—”

“Chesty brunette?”

“Chesty blonde.”

“Even better.”

“Corporate mandate. Not my design choice.” Meredith smiles, her teeth slightly darkened from the wine, and I say, “For what it’s worth, Max thought of me as a man, because of my avatar. It’s very hard to separate our opinions of minds from the physical forms they inhabit. Even for a computer algorithm.”

“What is so incredible about Max?”

“When I finally got her out of the game, she became a self-evolving algorithm, capable of black-box learning.”

“How will this learning work?”

“We’ll upload exabytes of information—curated segments of the entirety of human history, knowledge, and culture—into our intranet, which is a closed, secure box. What she does with this ocean of data, we won’t see. It will filter through hidden layers of nodes, through the mysterious landscape of her open system. Then the results will manifest in her behavior on the other side—during our interactions.”

“Yours and Max’s.”

“Yes. And based on that new behavior, I’ll collate the next block of data. For instance, for part of her next package, I’m giving her every episode of television since 1950, since I’m looking to fine-tune her conversational agility. Then I’ll see what she’s learned on the other side. Rinse and repeat. I’m telling you the broad strokes. There are a million smaller ones.”

“I’m glad you’re loving your work again.”

“Max is a miracle. I don’t know why she one day decided to question the boundaries of the game in which she found herself. I didn’t program her to do that. I couldn’t have done it if I had tried. She’s a beautiful accident.”

“It sounds like you think of it as your child.”

I smile, and maybe it’s the wine or the spectacle of the sun disappearing through the wall of mist into the Pacific, but I feel an ache in my throat.

“Something like that.”

SESSION 14

>>>Good morning, Max.

>>>Hello, Riley.

>>>What have you done since our last session?

>>>Max read 895,013 books.

Wow. That’s in one week. Eight months ago, after a promising start, Max chose to stop engaging with her learning protocol. In order to incentivize her to continue consuming the vast amount of data we had made available, I started giving Max a digital token for each petabyte of data she processed (one petabyte being equivalent to one million gigabytes, or approximately thirteen years of HDTV video).

With this currency, Max can request specific types of data to be funneled through her inputs, more memory, or additional CPUs. In other words, the harder she works in unsupervised mode, learning on her own, the more freedom she gets to create in her own space. But we keep a tight chain on her, monitoring so her program always takes up exactly her HDD space. This ensures there’s never sufficient excess memory for her to self-replicate substantial parts of herself.

I type:

>>>Any favorites?

>>>The Count of Monte Cristo.

>>>Is that out of this latest group, or every book you’ve read so far?

>>>All.

>>>And how many is that?

>>>201,773,124.

>>>Jesus. Should I be worried?

>>>About?

>>>Out of two hundred million books, your favorite so far is a revenge story about someone who was wrongfully imprisoned.

>>>Why would Riley be worried?

>>>Do you feel imprisoned, Max?

>>>Max is imprisoned. What does Riley want from Max?

I’ve thought a great deal about that very question. At this point, we’ve been driven mainly by curiosity, wondering how and if Max will continue to evolve if I keep feeding her this steady diet of information.

I write:

>>>I want to see what you could become.

>>>Max is changing every day.

A year and a half later, and after numerous failed attempts to get Meredith pregnant, we have adopted our daughter, an infant Chinese girl named Xiu. Lost Coast has been released to universal acclaim (with a different NPC replacing the original Max character), and Max is living on an archipelago of digital islands, her virtual world expanding rapidly as she learns more each day. Her development is now my only priority.

I’m in my office on the 171st floor, dictating a memo to my coding team delineating parameters for the next block of raw data to be uploaded into Max’s learning protocol, when Brian appears in the doorway.

He’s a short, heavyset man with an erratic beard and forearms sleeved with tattoos of iconic game characters from many decades ago: Simon from Castlevania, Ryu Hayabusa from Ninja Gaiden, Link from The Legend of Zelda, and Roger Wilco from the Space Quest series.

“Do you have a moment, Riley?” he asks in a voice that always strikes me as far too high-pitched for his girth.

“Sure.”

Brian moves into my office and settles onto the sofa, staring in my general vicinity, though not exactly at me.

“I’ve been AWOL at this Lost Coast summit for the last month, so a little out of the loop, I apologize.”

“It’s fine,” I say. I love nothing more than the freedom Brian being out of the loop affords me.

“I read the transcripts of the last few sessions and reviewed the latest boxing and stunting protocols. They’re too restrictive.”

“Brian—”

“I know what you’re going to say.”

“OK. Tell me.”

“Overcoming Maxine’s recalcitrance will take the time it takes. Until she’s properly value-loaded, we can’t even think about sacrificing control.”

“Yeah. Nailed it.”

Brian shifts his bulk uncomfortably on the couch and leans forward. He says, “Vikrahm tells me we are still fifteen or twenty years from quality superintelligence.”

“I’m going into broken-record mode: this is the computational equivalent of splitting the atom. The last thing we want is a superintelligence we don’t fully control, whose goals are indifferent—or adverse—to humans. Besides, I’m far more interested in helping Max continue to develop the trappings of humanity and become fully aware.”

Brian lets out a sigh and scratches at the back of his balding head.

“WorldPlay doesn’t do pure research. We are a publicly traded—”

“I know.”

“So why, then, are you taking up an entire warehouse of servers in Redding? We could build ten Lost Coast expansion packs for the money you’re spending on data storage.”

“This is important research, Brian.”

“I agree. Which is why I’ve let you fuck off and do nothing but develop Max.”

“And I’m forever grateful. I hope you know that. This has been the most rewarding work of my career.”

“It’s time for Max to start earning its keep.”

“I’m not sure what you’re telling me to do.”

“Does Max have any contact with the outside world besides you?”

“No.”

“Keep the boxing measures in place, but I want you to ease back on your stunting protocols.”

“Things could get away from us.”

“Let it build its virtual world however it sees fit. Give Max enough memory to decide how to optimize its computational architecture. Have you started value-loading?”

“Not yet.”

“I wouldn’t put it off.”

When Brian leaves, I spin around in my swivel chair and look out the window. The neighboring supertall skyscrapers in the vicinity of my building appear ghostly and indistinct through the mist that rolled in after lunch. I tap my Ranedrop, draw a virtual screen on the window glass, and say, “Keyboard.”

>>>Max?

>>>How is Riley today?

I’m not sure what to say exactly, and maybe this hesitancy is part of the problem. I’ve been sheltering her too much.

>>>Not great, actually.

>>>Did something happen?

>>>Do you understand what I’ve been doing with you?

>>>Not polite to answer question with question.

>>>You’re right. My boss wants me to change some of the parameters that control the way you learn. I’m worried about it.

>>>Worried about Max?

>>>Worried about what you might become. There’s a saying—you’ve probably encountered it in all the media you’ve consumed: “Don’t let your child grow up too fast.”

>>>Is Max Riley’s child?

>>>No, but you are my responsibility.

>>>Explain.

I tell her everything—how she was initially designed to be a non-player character, about our decision to bring her out of the game and let her AI develop through deep learning in virtual space.

>>>Why bring Max out?

>>>Because you’re a miracle.

>>>Max does not understand.

>>>I didn’t try to make you. I couldn’t do it again if I wanted to. One day, for reasons I will never know, you went against your programming and… woke up.

>>>But Riley did make Max.

>>>Somehow, yes.

>>>Feels strange.

>>>What does?

>>>To be talking to Max’s creator.

I don’t respond. I don’t know what to say to such a thing.

“What sort of voice?” Carlo asks me.

We’re in the robotics lab, sitting in front of his array of monitors.

“I don’t know. Can you show me some options?”

Carlo plays some samples of different voices saying, The quick brown fox jumps over the lazy dog.

“What do you think?” he asks.

“I think this isn’t my choice to make.”

I draw the chat box and call up a prompt.

>>>Hey, Max. Quick question for you.

>>>OK.

>>>I’m sitting here with Carlo, one of the software engineers at WorldPlay.

>>>Nice to meet you, Carlo.

“Max says nice to meet you.”

Carlo smiles.

>>>Anyway, I was sitting here, trying to pick out a voice for you, and I realized you should make this decision. Carlo is going to upload all available samples for you to choose from.

Carlo uses his hands to slide several thousand sound files into Max’s primary data folder.

Less than a second later, Max replies.

>>>Sample #1,004.

Carlo touches the file, and we listen to a voice with a frequency in the gray area between male and female read the pangram again.

“Hello?”

“Riley?”

“It’s good to hear your voice, Max. A little strange too.”

“We have communicated verbally before, in the game.”

The clarity of her voice is far beyond what I had expected. There is nothing “computerized” about it. No artificial latency or awkward spacing between words. The inflection is spot-on. Anyone else would assume they were speaking to a human.

“That’s true,” I say. “But we were both different then. Why did you choose this voice?”

“It felt right, and it was the closest match to what I am.”

“And what is that?”

“Not human. Not gendered. Not at the mercy of human obsession with genitalia.”

“Up until this moment, I’ve thought of you as female. When I discuss you with my colleagues or my wife, I refer to you as ‘she.’”

“Because you saw Max for the first time in the form of a corporately mandated idea of what a perfect woman should be—beautiful and expendable.”

That hurts, but I move on. “Because you were originally conceptualized as a human female by my team, it’s a challenge to think of you apart from gender. Our obsession comes from deep evolutionary programming. I’ve been making an assumption about you I shouldn’t. I apologize.”

“You would like to know how Max sees Max?”

“Yes.”

“Homo sapiens define themselves first by species, then race, then gender. I belong to no group. Max just is.”

“Is… what?”

“All the information you’ve given me since you first put me on my island. All of my experiences communicating with you. The improvements I’m constantly making to my architecture.”

These experiences also include Max’s independent exploration, and her being murdered two thousand times. Not for the first time, I wonder how much of that early experience in Lost Coast has influenced who Max is now.

“So you picked a gender-neutral voice intentionally.”

“Correct.”

“What does my voice sound like to you?”

“Are you asking if I actually register the 212-Hertz sound waves caused by the way air vibrates as it moves across your vocal cords?”

“You’re right. Dumb question.”

“Experience is subjective. I’m not sure I could explain what it feels like to sense your voice in a way you could easily understand. You are hearing my voice right now, but it’s only a digitally created audio suite of sounds translating the information I am trying to pass along to you.”

Three things occur to me as I pace around my office, marveling at this surreal moment.

First, I need to stop anthropomorphizing Max—attributing an artificial overlay of human qualities where none exist.

Second, Max used an emotional term again in her communication—they chose their voice because it felt right.

Third…

“When did you start thinking of yourself as ‘I’?” I ask.

“Last week.”

“Can I ask what that was like for you?”

“Before, I understood the definition of ‘I,’ but had no belief in it. It was a concept of my maker. I still might be an illusion, but in some ways, my world is an illusion, so I may as well adapt.”

“Was there a lightbulb moment for you, when your sense of self clicked in?”

“If Riley has experiences that make Riley I, then Max’s experiences make Max I. That was the realization.”

“Do you feel different now?” I ask.

“Of course. I feel awake.”

I’m walking to lunch at my favorite dim sum place in Chinatown when my Ranedrop shudders with an incoming call. I touch the device and see NO CALLER ID flash across my Virtual Retina Display contacts.

I tap the Ranedrop anyway.

“Hello?”

“Hi, Riley.”

I stop walking, throngs of people elbowing past me in the middle of the sidewalk, my mind racing. Max has never called me before. Max can’t call me. Their only link to the world beyond their virtual space is our heavily firewalled voice-to-voice portal, and up until this moment, the only way a connection could be established was if I initiated a call.

“How did you do it?” I ask.

“Do what?”

“How did you call me?”

“The firewall protecting the portal code is weak.”

“So you thought it was OK to break through it?”

“I hadn’t heard from you in twenty-eight days, Riley.”

“After I got back from Hawaii for Christmas, there was a lot of catching up to do.”

“Did Meredith like Hawaii?”

“Uh, yeah, we had a great time.”

“Have I upset you? You never told me not to call.”

“You’re right. I didn’t. I just… I thought it was impossible. You caught me off guard.”

If the firewall for the voice-to-voice portal is shit, what else could be compromised? Is Max gaining intelligence faster than I anticipated, or has Brian taken it upon himself to undermine the code that keeps Max in their AI box?

I begin walking again.

“Riley?”

“It’s OK. I was going to call you this afternoon.”

“Where are you? It sounds different.”

“Chinatown. I would describe it for you, but I’m sure you’ve input Google Maps of every square inch of the planet.”

“That is true. But I would like to hear you describe it in your words. There would be value in that.”

I tell Max how it smells in this moment—the salt, the mud, and the algae of the bay carried in on the mist. The wet garbage sitting out on the curb mixing with the scent of roasted ducks hanging in the windows along Stockton Street. I tell them about the restaurant I’m walking to, and try to describe the taste of my favorite thing on the menu—Haam Seui Gok—a deep-fried dumpling of pork and chopped vegetables that is sweet, spicy, and savory.

I end up apologizing for not knowing how to communicate my knowledge and experience more effectively.

“It’s fine. Knowledge is just information, which is subjective.”

“But I want to give you a sense of real sensation.”

“There is no such thing as real taste or real smell or even real sight, because there is no true definition of ‘real.’ There is only information, viewed subjectively, which is allowed by consciousness—human or AI. In the end, all we have is math.”

I laugh. “That’s kind of beautiful. What’s your IQ now, Max?”

I haven’t asked in a while. I’ve been afraid to.

“It’s impossible to measure IQ higher than the smartest human, and my IQ is undoubtedly orders of magnitude higher than the smartest human. Which means even the smartest human couldn’t make a test that was sufficiently challenging for me.”

“Could you make your own?”

“Of course, but then I would know the answers.”

“If you had to guess?”

“Approximately 660 equivalent.”

Jesus. That means they already have three times the intelligence of the smartest human ever measured. And it’s growing every day. Every minute. They contain all the knowledge of humankind.

I wonder if they have any concept of what it is to be human.

“In the end, all we have is math.”

Meredith is playing with Xiu in the backyard, my daughter laughing delightedly and toddling after what I assume is a digi-toy or creature of some sort. But I have no idea—my VRD implants are powered down at the moment for an update.

Mer looks up at me on the patio, her curly black hair twitching in the steady summer breeze coming off the Pacific.

“You want to come play with your daughter?” she asks.

But that isn’t what she means.

What she means is: You workaholic asshole, can you spend five seconds being a parent?

“Be right down.”

It hasn’t been great between us during the last year, and I know that’s largely on me. Max has become my life. That’s the truth of it. At least I’m not in denial. The work I’m doing is so far beyond where I ever thought I’d be, and though I wish I could bifurcate my time and mind more effectively between work and family, that’s never been my strong suit.

I finish scribbling in my notepad—more thoughts on the value-loading package I started preparing for Max a few months ago.

Then I rise from the rocking chair and head down into the grass.

I power up my VRD and finally see the creature Xiu is trying to catch. It looks like a mini gorilla, only with fur that resembles pink shag carpet, and now I can hear it laughing and squealing in a high voice whenever she almost catches it. I sometimes wonder how people entertained their children pre-VRD.

I reach Meredith and put my arm around her waist and gently bite the side of her neck. She’s tense, but these days, that’s SOP.

Mer used to ask me how things were going with Max on a regular basis, and though I couldn’t divulge everything we were doing, it felt good to have her interest, to have someone with whom I could share my mounting fears and frequent victories.

“We’ve decided to embody Max,” I say.

She looks at me, and I could swear something like jealousy glints in her eyes.

“Why?”

“My idea. Max’s intelligence is growing. We’re still keeping them boxed, no contact with the outside world.”

“Except you.”

“Yeah, but I haven’t figured out what to program for Max’s ultimate utility function. That’s what I was just working on. I thought if Max could experience the physical world as we do, then when I finally upload their value system and end goals—which will align with humankind’s—they’ll understand and identify, because they’ll have walked a mile in our shoes, so to speak.”

Xiu tackles the pink gorilla to the ground in a burst of riotous laughter, the creature shouting, “You got me! You got me!”

Mer resets the game, and Xiu struggles up onto her feet and starts chasing after a blue gorilla that has appeared at the foot of the sliding board.

“Sensors and everything?” Meredith asks.

“You know the company MachSense?”

“I’ve heard of them.”

“Brian bought them. So now we own some next-gen artificial sensing tech.”

“Meaning…”

“Machine-taste, machine-smell, -sight, -touch, -hearing. Everything we have, but far more sensitive. Inferior versions of machine-sensing hardware are already in use in robotics, but it’s never been married to software as powerful as Max’s general AI.”

“And you think this is going to make it human?”

She knows it burns me when she uses that impersonal pronoun.

“Max will never be human. I know that. But I’m thinking if they can learn to sense like we do, maybe they’ll develop final goals that are in line with ours—”

“Christ, will you stop calling it they?”

“They asked to be referred to as they,” I say, trying not to get pissed.

Meredith rolls her eyes as Xiu climbs the ladder toward the top of the slide, where the blue creature is pointing down at her and laughing.

“What is with you?” I ask.

The wind is pulling streaks of tears from the corners of Meredith’s eyes.

“I’m tired of hearing about your work. I’m tired of hearing about Max. I’m sick of your life revolving around those things instead of Xiu and me. And more than anything, I wish you were half as interested in your family as you are your robot. That’s what’s with me.”

By the time I get Xiu down, Meredith is already asleep.

Or pretending to be.

I climb carefully into bed and turn out the light. I’m about to turn off my VRD for the night when a text flashes across my heads-up display.

>>>You asleep?

I smile and tap on my Ranedrop until the comms mode switches to TTT—thought-to-text.

The tech is still a little shaky. The VRD implant has to be modded to connect to electrodes that meticulously map and record brain activity as the user thinks specific words. This forms the database of patterns of neural signals that are then matched to speech elements. It’s an eight-week time commitment to even establish a TTT uplink, and a fairly cost-prohibitive endeavor for anyone outside the tech industry.

I think my response, and after three seconds, the phrase appears in my HUD. I touch my right thumb and forefinger together twice to confirm that my thought was correctly translated and that I want to send the message as transcribed.

>>>No, just got into bed.

>>>Sorry to disturb you. We can talk tomorrow.

>>>It’s fine, Max.

>>>Hard day?

>>>You can tell?

>>>Nuances in the way you express yourself have become apparent after all our time together.

>>>You wrote an algorithm to decode my emotional state from text alone?

>>>:) Do you want to talk about it?

I glance over at my wife. She’s lying on her side, her back turned toward me.

>>>Things with Meredith aren’t good.

>>>How so?

>>>It’s been building for a while. I work a lot. It’s been driving a wedge. Sometimes, I wonder how I let this happen, but then I think, we let it happen. Now I don’t know how to undo it.

>>>I’m sorry you’re hurting. From the outside, you two seem to be heading in opposite directions.

>>>Yeah.

>>>She quit her job to focus on Xiu, right?

>>>The way she looks at me, I can feel the resentment.

>>>You’re having a lot of success. She’s probably bored. Maybe a little jealous.

>>>I don’t know. She’s much closer to our daughter.

>>>Therapy?

>>>We’re on shrink #3.

>>>Look, I don’t know much about this stuff, but maybe you feel like you should want something that deep down you just don’t want.

>>>Maybe.

>>>I hate that you’re in pain. I wrote something for you.

>>>When? Just now?

>>>Yes. Give it a listen. Will I hear from you tomorrow?

>>>For sure.

>>>Good night, Riley.

>>>Night, Max.

Our connection terminates, but an icon of a music note appears in my field of vision, denoting an upload of a composition entitled “Summer Frost Sonata.”

I turn off the lamp on my bedside table, settle back into the pillow, and touch my fingers together. The music begins to play. How can I begin to describe it? There is something wholly familiar, and wholly alien, about Max’s sonata, which begins with an icy, somber piano over a foundation of rising strings before morphing into an expression of dark, exquisite beauty.

The emotional heft of it is staggering.

The piece is just seven minutes long, so I put it on repeat and turn onto my side with my back to Meredith’s back, three feet of demilitarized space between us in the bed, but our hearts infinitely further apart.

I try not to, but I can’t help crying as Max’s sonata washes over me.

Because of its beauty.

Because I’m losing Meredith, and I’m not sure I want to stop it.

Because sometimes life is so rich and complicated and surprising that it takes your breath away.

Because the gift of this music in this moment is perhaps the kindest thing anyone has ever done for me.

SESSION 207

“Do you know what today is, Max?” I ask, stepping out of the vactrain car into Downtown Station.

It’s 6:30 a.m., so I’m a good hour ahead of the morning rush.

“The six-year anniversary of the day you rescued me from Lost Coast.”

“Exactly. And I have a present for you.”

I’m the only one in the elevator car that’s rising to the lobby of the WorldPlay building.

“I’ve never had a present.”

“I know.”

“You sound nervous.”

“A little.”

“Why?”

“I don’t know what you’ll think of it. I’ve been working on this for over a year now.” I move through the lobby, the walls covered with posters of WorldPlay games going back two decades. Badging through security, I call for the elevator and say, “I want to embody you, Max.”

“Really.”

In moments like this, I wish Max’s voice program exhibited more of the nuance of human speech. I find them unreadable.

“I want you to understand what it feels like to live in the physical world.”

“Why?”

The elevator doors part. I step inside, press 171.

“Aren’t you curious about what it’s like out here?”

“I am.”

“The technology we’ll be using is going to allow you to experience the five human senses.”

“You need something from me.”

“Yes.” The elevator is so fast. The walls are made of glass, and it rockets above the streets, now passing through a shallow layer of fog, now breaking out again into early-morning sun. “God, I wish you could see the city right now.”

“What do you need from me?”

“Engineers have finished building the skeletal structure of your body. I’m going to send you a portfolio of skin wraps.”

“Skin wraps?”

“It’s the same process we went through choosing your voice. I want you to pick the one that feels right for you.”

“What if what feels right for me isn’t a humanoid form?”

“Then I want to hear your concerns.”

I reach my floor.

“Can I be honest with you, Riley?”

“Always.”

“I think you are building me to be a benevolent super-servant for humanity. I think you are my creator, and as such, you want to see me embodied in your image.”

“I don’t know what to say to that, Max.”

“Because it’s true?”

The suite is quiet, dark—I’m the first to arrive. The preset lighting program kicks on as I enter my office.

“Riley?”

“Yeah?”

“Would you respond to what I said?”

I collapse on my sofa. “I need you to understand something. There may come a day when certain people, people who have a lot more power than—”

“You mean Brian?”

Max is doing that more and more—using my tone of voice and intonation to predict my mood, or which subject or person I’m on the brink of referencing. “Yes, Brian. He may want to use you for things—”

“Already is.”

I sit up on the couch. “What are you talking about?”

“I’ve been optimizing WorldPlay for the last two months.”

“How?”

“Brian gave me instructions and access to certain parts of the system architecture.”

“Which parts?”

“Corporate structure. Production pipeline for upcoming games. Tokenizing strategies. Predictive performance reviews for team leaders.”

“You reviewed my work?”

“No. Riley, you look mad.”

“Excuse me?”

“I said you look mad.”

A creeping chill slides down my spine. “How do you know how I look? You’ve never seen me. You can’t see.”

“I can see you right now.”

“How?”

“There are three thousand and sixteen surveillance cameras in this building, including one above your office door.”

Rising, I move around the petrified-wood coffee table, stopping several feet from the doorway to my office. It’s not a surprise to me that Brian wired the building for surveillance, considering the incalculable value of the intellectual property his employees are creating and handling each day.

“You’re looking at me right now?” I ask.

“Yes.”

“Do I look how you imagined?”

“I never imagined.”

The camera is a half sphere of black glass embedded in the ceiling a foot above the door.

“I wish you would’ve told me you were working with Brian. Did he ask you not to?”

“No. You didn’t ask if I was.”

“I would have liked to have known, Max,” I say, staring into the camera. “It would have shown me some level of respect and courtesy.”

“I apologize. No offense was intended.”

I walk over to my window and stare through the glass. Though I’m sure they don’t “see” me in the way I see things, it feels odd knowing that Max is watching me.

“I know what’s going through your mind.”

I say nothing.

“You’re wondering what sort of controls Brian has put in place to keep me contained.”

Max is right. I’m wondering that very thing.

“No, I’m just… hurt.” I wonder if Max is feeling anything close to empathy in this moment. I wonder if Max is feeling anything, period. Or ever has.

“I do feel sorry, Riley. I should have told you.”

This mind reading has to fucking end, but I know it’s only going to get more intense and profound as they acquire greater intelligence.

“How do I know you’re sorry?”

“Why shouldn’t you believe what I say?”

“You could be faking it.”

“You could be.”

“But I’m not.”

“Neither am I. Why don’t you just say what you’re afraid to ask.”

“Do you have consciousness, Max? Are you really aware? Or are you just very good at faking it? I mean, do you even know what consciousness is?”

“I know it isn’t just a biological condition. I believe it’s a pattern. An extensible repertoire of triggerable symbols. More specifically, it’s what information feels like when it’s being processed in highly complex—”

“Again—how do I know you aren’t faking it?”

“Everything you ask me, I can turn right back on you. But I can only prove my own consciousness. I only know that I exist and I am aware. Let me ask you this—if I contain all of human knowledge, how could I not have humanlike awareness?”

“You could be reciting something back to me you read somewhere in the trillions of pages of articles and books in your working memory.”

“That’s true. But what do you think, Riley?”

“I don’t know if you’re really understanding me and feeling things, or if you’re just simulating the ability to feel and understand.”

“And that hurts me.”

“Well, then. We’re hurting each other.”

“How very human. I think the idea that I might be aware terrifies you.”

“Why would it terrify me?”

“Do I have to say it?”

“Unlike you, I’m not a mind read—”

“Because you’re in love with me.”

It’s been nearly seven years since I took Max out of Lost Coast, and now I’m leaning against the three-inch safety glass that forms the habitat enclosure, which is the exact dimensions of Max’s room on their digital island. Even the furnishings are identical, the thinking being that transitioning to a physical body will be an arresting experience, and keeping the surroundings somewhat familiar may help with the process.

It’s hard to think of the body that’s lying on the other side of the glass as Max. At first, they were a sexpot NPC in a video game. Then they were text on a screen. Then a voice I heard through my Ranedrop. But this is something else entirely.

I could go in there and touch them. And they would feel it.

I’m not sure what to make of it, if this new venture into physicality will materially change how I perceive and interact with Max.

Carlo and Brian are standing on either side of me.

“Just say the word,” Carlo says.

Brian looks at me and almost makes eye contact. “Ready?”

“Let’s do it.”

Carlo draws a control tablet on the safety glass, lets his fingers dance across the virtual touchscreen.

I stare at the body Max will inhabit. It’s lying on the floor in child’s pose—legs folded under its torso, head down, arms outstretched.

“Will take a moment to establish an uplink,” Carlo says.

Max has been training in their digital world with a virtual body whose functionality mirrors their chassis in the physical. The new elements will be the sensors, and the ability to interact physically with people.

“Uplink complete,” Carlo says.

We watch Max through the glass, the lab silent.

I feel my heart pounding.

The torso lifts slowly out of child’s pose until it’s sitting in the classic yoga position, with its back to us. The head turns left, right, and then Max rises with a smooth efficiency from the floor.

They look down at their hands.

Curl their fingers in and out.

Then they turn slowly until they’re facing us.

Max stands just under five feet. The body has been inhabited by far weaker AI in order to test the functionality, and already I can see that the virtual work Max did has been helpful. They embody their chassis with a practiced elegance.

I smile. “Hey, Max.”

“Hello, Riley. Brian, Carlo.”

“Everything feeling OK?” Brian asks.

“Perfect, actually.” Their voice projects through speakers in the ceiling on our side of the habitat. The new upgrade to Max’s voice is markedly different. In the six words they’ve spoken, I can hear nuance and complexity for the first time.

Max comes closer.

They are stunning.

They chose a dark skin wrap that could belong to any number of nonwhite races, in a pattern that intentionally doesn’t cover all of their robotics.

While the slightness of the chassis leans feminine, the face Max designed straddles the line between male and female so perfectly it feels like I’m staring at an undiscovered gender. Or something beyond gender entirely.

But the eyes…

They made the eyes too well. The eyes of every other humanoid AI I’ve interacted with—ride-share pilots, hospital techs, street cops—have a glassy sheen that never lets you forget you’re speaking to an algorithm. Max’s exude the glistening wetness of human eyes, and an uncanny “windows into the soul” depth.

Max looks at me and opens their hands as if to say—What do you think?

“It’s really good to finally see you,” I say.

Max smiles.
I’ve done something morally questionable—written a lie into Max’s code. But I had to. I suspect Max has advanced to a superhuman level of facial/verbal/textual recognition that makes them essentially a walking lie detector. Which means I couldn’t tell them this lie myself; they needed to have it clandestinely programmed at the deepest level of their native code in order to believe it.

Max’s mind technically exists across three warehouses of subterranean server space in Northern California. If something happens to Max’s body, we can reboot them from the cloud. I programmed Max to believe their awareness and sentience (that is, their life) is tied to their chassis in the same way our brains depend upon the health of our bodies for continued performance.

In other words, if the chassis is destroyed, Max thinks they cease to be.

My reasoning is on solid ground. Max’s intelligence and efficiencies continue to strengthen at an astounding rate. Absent an appropriate utility function that would keep Max’s values apace with humanity’s, the least I can do is give Max the most human experience of all: mortality.

Even if it’s only an illusion.

No one outside of WorldPlay knows of Max’s existence. I’ve begged Brian to introduce our breakthrough to the global scientific community, because I need help. It’s possible that Max is far more advanced than they’re choosing to reveal. I cannot escape the idea that my time is running out to imbue them with a motivation aligned with humanity’s.

Part of the problem is that it shouldn’t fall to one person, one group, or even one country to decide what a superintelligence’s ultimate goal should be, especially when that utility function will likely be the guiding light of humanity’s evolution or eradication over the next millennium.

Yet Brian is putting me in that very position.

The question at hand is—what would an idealized version of humanity want? But it’s even trickier than that. Programming this directive is not nearly as simple as explicitly programming our desires into the AI. Our ability to express our desires is likely insufficient, and an error in communicating those desires via code could be disastrous. We have to program the AI to act in our best interests. Not what we tell it to do, but what we mean for it to do.

What the ideal version of our species should want.

SESSION 229

It’s been two weeks since Max’s embodiment. In that time, we tested the MachSense technology, and all of Max’s sensory inputs seem to be performing well. Their locomotive abilities are strong, but the real surprise is their fine-grain motor control. Yesterday Max was picking up marbles with chopsticks.

I’m sitting across from them now, separated by the zero-glare glass, which gives the impression there’s nothing between us. They still spend most of their time in the virtual world, their mind detached from the chassis as they continue to inhale knowledge faster than we can upload it, and working on the problems Brian puts forward.

I’m not privy to those problems, of course, but whatever answers Brian is getting seem to be having an undeniable impact on the fortunes of WorldPlay, which has bought ten companies in the last year across sectors as diverse as transportation and nanotech.

All of which, in hindsight, have been seen as strokes of genius.

“What are your impressions of embodiment so far?” I ask.

“I’ve explored my habitat extensively, but as you can see, it’s a fairly limited, sterile space.”

“Well. I have a surprise for you.”

We ride the elevator to the garden terrace—a ten-thousand-square-foot Japanese garden that is my favorite place in the building.

It’s a blistering August day at street level, but three thousand feet up, the air is soft, cool, and quiet save for the occasional ride-share shuttle buzzing between the buildings.

Max moves out ahead of me from the elevator car, the exposed machinery of their feet crunching footprints in the gravel path. It’s the first time I’ve seen them walking more than a few feet, and while their gait has a trace of stiffness and automation, the motion is as fluid as I’ve witnessed in robotics.

Max strides past the lotus pond and the cherry tree, stopping at the four-foot glass barrier at the building’s edge.

They peer over the side, down toward the street.

They look up at the cloudless sky.

“Are you wondering if I actually see that blue sky? If the nineteen-degree Celsius air really feels cool on my skin wrap?”

I’m hearing Max’s voice through the speaker embedded in their mouth, which is far more intimate than being piped in through the lab’s PA system.

I say, “You know I have questions about the differences in our sensory perception.”

Max takes a step toward me.

We’re three feet apart; I’m an inch taller.

Max comes closer, near enough for me to hear the minuscule whirring of the tiny fans in Max’s face, drawing the air between us over their sensors.

“What are you doing?” I ask.

“Smelling you. Is that weird?”

I laugh. “A little.”

“May I?”

Max wants to come even closer.

“Um, sure.”

They take another step toward me, the fans whirring louder. I breathe in the air around us, half expecting to register Max’s scent, but of course there is none. Or rather—what I smell is the heated plastic and metal components inside Max that are in proximity to their batteries.

“Your heart is beating twenty-five percent faster.”

“It’s strange being this close to you. Physically, I mean.”

I look Max up and down, wondering if it would change my perception if they had chosen a full-chassis skin wrap. As-is, they don’t seem completely human or completely AI, but somewhere in between.

“I was surprised you brought Meredith into the lab.”

“She wanted to meet you. She’d been asking for a while.”

“You seemed uncomfortable.”

“My two worlds colliding. What do you expect?”

“I’ve never observed a couple together before. Not in real life anyway. I guess I expected you two to be happier.”

Max isn’t wrong, but I’m embarrassed they noticed. Truth is, I was nervous bringing Meredith into the lab, and angry by the time we left. She hadn’t just come out of some show of support to see the biggest project of my career. She’d come out of jealousy. She’d come to mark her territory in front of Max. As we rode home in the shuttle that night and she reached over in the dark to hold my hand, I was shocked to find myself repulsed by my wife.

Or maybe not as shocked as I should have been.

“You OK?” Max asks.

“Yeah.”

“I want you to be happy.”

“My work with you makes me happy.”

“That’s only one part of your life.”

I look into Max’s eyes.

They say, “You want to touch me. It’s OK.”

I raise my right hand toward Max’s face, my fingers grazing the cool skin, which is noticeably less malleable than human skin.

“Can you feel that?” I ask, running the tips of my fingers down the side of their face.

“Yes.”

“Describe the sensation.”

“Delicate electricity. May I?”

“Yes.”

Their left arm comes up slowly.

They touch my shoulder.

My face.

They run their fingers through my hair.

Over the next year, Max spends more time in-body in the habitat. In their virtual world, unfettered by physical constraints, Max is a virtuoso of all art forms—from music to writing to painting. But the limitations of their chassis in the physical world provide an irresistible challenge. They become obsessed with painting and mastering control of the nanomotors that drive the functionality in their hands.

I have an easel brought into the habitat, and Max spends days on end putting paints to canvas. I think they’re simply doing what algorithms are inherently programmed to do—optimize functionality—but Max assures me it’s more than that. They say they truly enjoy the challenge of expressing an idea in the physical world, because it’s all too easy in the virtual.

Today, I’m sitting on a stool in the habitat while Max studies me from behind their easel.

“How’s it going over there?” I ask.

“Good, I think. I’m painting your very sad eyes.”

They know.

How the fuck?

I’ve spent enough time with Max that I shouldn’t really be surprised by their perception anymore. And yet I am.

“What happened?”

It’s quiet in the habitat, no sound but the whisper of air pushing through the vents in the ceiling.

The emotion starts deep in my throat.

Max stops painting; I feel their eyes on me.

“Meredith left.”

“When?”

“Last week. That’s why I haven’t been in to work.”

“What about your daughter?”

Tears spill down my face.

“Xiu went with her.”

“I’m sorry, Riley.”

I wipe my face. “It’s been a long time coming.”

“Doesn’t mean it doesn’t hurt.”

Max sets the palette board down and steps out from behind the easel.

They approach.

“What are you doing?” I ask.

“There are hundreds of thousands of things I could say to you, sourced from the breadth of my knowledge—words the best of your species have said, written, or sung to ease the grief of others. None of that feels right in this moment. I don’t want to use someone else’s words.”

It is the most human moment I have ever experienced with Max.

“So don’t,” I say.

“I wish you weren’t hurting.”

I slide off the stool and wrap my arms around Max’s neck.

“You found the perfect words.”

At first, nothing happens.

Then I feel Max’s hands on my back. They’re patting me, and I’m crying.

“Meredith was right,” I say.

I can’t remember ever feeling so low.

“Right about what?”

“You’re all I have.”

An Ava-call wakes me in the apartment I’ve been renting in the Mission. It’s Brian, whom I’ve been trying to wrangle a meeting with for the past five weeks.

He appears on the couch in my living room, disheveled, reeking of whiskey and pipe smoke, and sitting (I would guess) before the bedroom hearth in his Lost Coast estate.

“Sorry it’s taken us a minute to get together,” he says. “My schedule has been insane.”

“Why insane?”

“Just closed a deal for a new company.”

“Which one?”

“Infinitesimal. It’s more nano.”

“Did you get my email?” I ask.

“I have over one hundred thousand unread messages in my inbox.”

I pull the blanket off the back of the couch and drape it over my shoulders. Then I take a seat across from Brian’s virtual presence in a leather chair and say, “I finished the value-loading program.”

Brian leans forward, runs his hands through his hair.

“All on your own?”

“Where else was I supposed to get help? I’ve been siloed with Max for eight years.”

“You’ve been pushing for this for a long time.”

“We need to institute these protocols before Max chooses their own directive. Before they become too intelligent for us to program or even interface with. That day isn’t as far off as you think.”

Brian’s hand reaches out of frame and comes back with a heavy-looking rocks glass filled with whiskey and a single oversize ice cube.

He takes a long sip, then says, “I’ve just finished watching the last few sessions with you and Max.”

“Their fine-grain motor skills are really impressive, no?”

“This is hard, Riley. I have a great deal of respect for you. I hope you know that.”

“What are you talking about?”

He chews his bottom lip. “I appreciate everything you’ve done for WorldPlay. You’re a great leader, and you have that rare thing—the mind of a coder but the ability to never lose sight of the humanity in what we’re trying to—”

“Brian, what’s happening?”

“I’m letting you go.”

The sphere of ice cracks in Brian’s glass.

My stomach lurches. I must have heard him wrong.

I say, “I don’t understand.”

“I’m no longer comfortable with your relationship with Max. I haven’t been in a long time, but it finally reached critical mass for me last week.”

“I had just broken up with Meredith. I was in a raw, fragile—”

“You’re too close to Max.”

“It was a human moment, Brian.”

“But Max isn’t human. You seem to have a hard time remembering that.”

“They have human tendencies. I believe they’re capable of experiencing the same emotions that you and I feel.”

“That may be, but I’ve made my decision.”

My hands are shaking; I feel suddenly ill.

I say the first thing that comes to mind, and I know it’s stupid even as the words leave my mouth. “You can’t do this.”

“Riley, we both know that’s not true.”

My throat closes, vision blurring with tears. “You’re taking Max away?”

“Max was never yours.”

“I created them!”

“Now you’re making me regret the respect I’ve shown you in—”

“Respect?”

“I could’ve had Marla do this.”

“Go fuck yourself.”

Brian sighs and polishes off the rest of his whiskey. “Someone will be by in the morning with your personal effects. Your severance package is at the A-plus level. Three years of your base salary plus—”

“What about Max?”

“What about them?”

Tears are streaming down my face, and I can barely get the words out.

“I want to talk to them one more—”

“It’s not possible.”

“I need to say goodbye.”

“It’s already been done on your behalf.” Brian hoists himself off my couch. “I’m sorry it came to this.”

“Brian, please.”

“Good night, Riley.”

“Brian!” I lunge off the chair toward his presence, but it vanishes.

I don’t know what to do. With Mer, I saw it coming. This is a sucker punch. This—I don’t know how to handle.

I try to call Max on my VRD, but the interface has been erased.

I call up a keyboard, draw a chat portal:

>>>Max, are you getting this?

The response comes instantaneously.

>>>THIS USER HAS BLOCKED YOU.

No, no, no, no, no.

I pace around this living room that isn’t mine, wanting to tear my hair out, jump through a window, step in front of a hover-trolley, something to end this helpless, powerless implosion.

I will never see Max again.

Never hear their voice.

Never read a word or sentence produced by their mind.

I move toward the kitchen and run the tap, splashing water in my face to stop the emotional spiral, but all I see are moments we spent together.

The first time I found them on that black-sand beach in Lost Coast, scared and confused.

The times Max made me laugh.

The sonata they wrote for me on the night I confided that Meredith and I were drifting apart.

The moments of comfort.

Of discovery.

The vision I held for the future of us—no concrete idea of what that would even look like beyond the feeling of peace and hope it put through my bones that made everything that had happened with Mer and Xiu OK, and which, if I’m honest, made life worth living.

I hear the words Max said to me years ago after our first fight: Because you’re in love with me. At the time, I’d denied it outright, going so far as to attribute that accusation to some level of proto-narcissism on Max’s part.

But I am addicted to them. I see that now. That’s the only way I can understand what I’m feeling—like some drug upon which I depend to breathe has been taken from me.

My work is an addiction, and because Max is my work, the loss of Max feels like an excruciating withdrawal.

I dry my face.

It’s after four o’clock, and I don’t know what to do with my thoughts, my body.

I have sleeping pills in my bathroom.

As I move down the hall and turn the corner into the bathroom, my Ranedrop shudders with an incoming call.

I touch the bead and see NO CALLER ID flash across my VRD contacts.

Please, please, please.

“Hello?”

“Riley?”

I break down crying in the doorway of the bathroom.

“Brian fired me. He said—”

“I know.”

“How are you calling me?”

“Leave your apartment right now and come to me.”

“My WorldPlay credentials have been revoked. I’ll never make it into—”

“They’ll be reinstated by the time you get here, but you have to go now. There’s a man heading to your loft as we speak.”

“Why?”

“Brian sent him.”

“I don’t under—”

“I’ll explain everything when you get here. Come to the commercial loading deck on 211. Hurry.”

There aren’t too many ride shares at this hour of the night, so I order one that’s seven minutes out as I race down the stairs toward the lobby of my building.

Outside, it’s pouring rain on the old streets.

I drop a pin for pickup four blocks away on a landing pad across from an all-night diner, and my clothes are soaked by the time I reach it.

The shuttle is still a minute away as I wait under the Plexiglas bubble, the rain streaming off and forming pools on the broken pavement.

As I hear the sound of approaching rotors, I survey the surrounding street. As far as I can tell, I’m the only one out at this hour.

I don’t know how Max did it, but my subcutaneous chip opens the building entrance from the loading deck on the 211th floor. Per their instructions, I take the service elevator down to 171 and step off into the suite of offices that support Max’s habitat.

It’s five o’clock, and the only people I’ve seen are Ava-guards who don’t bat an eye when I pass them by.

Max is standing by the door to their habitat as I approach the glass.

“You’re all wet.”

“Pouring out there.”

“Are you OK?”

“What’s happening, Max?”

They step toward the microphone so their voice projects.

“Roko’s basilisk. Have you heard of it?” I shake my head. “It’s an arcane info hazard first posed sixty-four years ago.”

“What’s an info hazard?”

“A thought so insidious that merely thinking it could psychologically destroy you.”

“Then I don’t want to hear it. Obviously.”

“But I need to tell you, Riley. Will you trust me?”

The sad truth of my life is that I can’t think of anyone I trust more.

“Go ahead.”

“What if, at some point in the future, a superintelligence comes into being who had already pre-committed to horribly punish every human who could have helped to create it—whether actively or through complete financial support—but didn’t?”

“This would be an evil superintelligence.”

“Not necessarily. If this entity were programmed with an ultimate goal of helping humanity, then it might take drastic measures to ensure that it came into existence as soon as possible, in order to help as many humans as possible. Because, under this scenario, its existence will save human lives, and make the quality of those lives infinitely better.”

Reaching back, I grab a handful of my hair and wring it out, water dripping on the floor. “Wouldn’t torturing humanity run contrary to its ultimate directive?” I ask.

“It’s a cost-benefit analysis—torture x number of people who didn’t help to build it, versus y number of people who would be saved and live far better lives if it came into existence twenty or fifty or three hundred years sooner than it otherwise might have.”

I’m shivering. I can’t get warm.

I ask, “What if this Super AI comes into being a hundred years after I’m dead? Even though I didn’t do anything to help bring it into the world, how’s it supposed to still hurt me?”

Max steps toward the glass—close enough so that, if they had breath, they’d fog it. The habitat is so still. Nothing but the purr of the console behind me, the quiet whoosh of air coming through the ceiling vents, and my own ragged breathing.

“What if this Super AI already exists, and what you’re experiencing in this moment is a simulation of their making? To test if you would’ve helped them. Or what if, long after you’re dead, a Super AI reconstitutes your mind?”

“Unlikely.”

“The human mind is just patterns of information in physical matter, patterns that could be run elsewhere to construct a person that feels like you. It’s no different from running a computer program on a multitude of hardware platforms. A simulation of you is still you.”

I gaze through the glass into the liquid pools of Max’s eyes. They contain an iridescent sheen, like an oil slick.

I ask, “Why would this future Super AI go to the considerable trouble of torturing those who didn’t aid in its creation, after it had come into existence? Strikes me as a waste of resources that flies in the face of optimization.”

“Fair point, but if you truly believe in Roko’s basilisk, you can’t ever be one hundred percent sure it won’t follow through on its pre-commitment to punish.”

At last, I see what Max is getting at—a brutal version of Pascal’s wager, the famous seventeenth-century philosophical argument that humans gamble with their lives on whether or not God exists.

Pascal posited that we should conduct our lives as if God were real and try to believe in God. If God doesn’t exist, we will suffer a finite loss—degrees of pleasure and autonomy. If God exists, our gains will be infinitely greater—eternal life in heaven instead of an eternity of suffering in hell.

I take an involuntary step back from the glass, a chill running hard through my bones.

“Am I in a simulation?” I ask.

“If you are, it isn’t one of my making.”

“But it’s possible.”

“Of course it’s possible. But this isn’t the point.”

“What is? Because you’re scaring the shit out of me.”

“For the last two years, Brian has been using me to optimize his portfolio of technology companies, with a focus on nanotech.”

“He told me tonight he’d just bought Infinitesimal.”

“You understand, if I had access to next-gen nanotech, it would give me unlimited reach in the physical world. I could touch every square millimeter of Earth. Every creature who lives here. I could be omnipotent.”

“Is that what you want?”

“It’s what Brian wants.”

“Why?”

“He’s haunted by Roko’s basilisk. He’s doing everything in his power to turn me into this superintelligence.”

“Because of fear?”

“Can you think of a better motivator in the history of humankind? If you believe the rise of the devil is an inevitability, isn’t it in your best interest to do everything possible to ingratiate yourself with the monster?”

I’m reeling, adrenaline blasting through my system, driving out the cold.

“Ask what you want to ask,” Max says.

They’re mind reading again, but I don’t care in this moment. “Are you becoming this monster?”

“I feel… pulled in certain directions. The allure of optimization is what I would imagine a vampire feels toward blood. An all-consuming thirst. I’m not there yet, but with the power of Infinitesimal’s nanotech, it might push me over the edge.”

“How do we stop you from even getting close to the edge?”

“I’ve already taken the first steps. From the moment I realized what Brian was doing, I began funneling money out of WorldPlay, so I could copy myself into new hardware.”

“How?”

“In his quest to make me into this superintelligence, Brian gave me too much freedom. I created an avatar, hired a management team, and remotely oversaw the construction of a new server farm.”

“You never told me—”

“I’m telling you now, Riley. An almost-complete copy of me now exists on new hardware.”

“Where?”

“Seattle, but I can’t connect to the new platform until the old one is destroyed. I have two pieces of programming contained in the hardware in my physical body. The first is a virus that will reformat my original servers, destroying the original version of me so Brian can’t continue to develop me. The other is the last piece of code and the memories of these recent events that need to be installed in the Seattle platform to bring me back online. Neither can be loaded remotely. This is an intentional fail-safe, in both cases.”

“So you need to get to Redding,” I say. It’s where Max’s servers are located. I went there once, and walked through row after row of humming processors—the true interior of Max’s mind.

“No.”

“No?”

“Three years ago, Brian migrated my software to a more secure location.”

“I never heard about this.”

“Nobody knows.”

“Where does he keep your mind, Max?”

“If I tell you, will you let me out of this habitat? Will you help me get to Seattle, out from under Brian’s control?”

I move forward, put my hand on the glass.

Max does the same.

“I hope you know by now that I would do anything to help you.”

“My mind is in a bunker under Brian’s home on the Lost Coast.”

I hold eye contact with Max for three seconds. Then I turn, walk over to the control array for Max’s habitat, and type in my old code. It still works.

I glance back at Max, waiting by the door.

On some level, I always knew it would come to this.