Summer Frost (Forward Collection, #2)
Kindle Notes & Highlights
Read on October 22, 2024
12%
“You have to want to know. You have to make the choice yourself.”
17%
Max is a mistake. A glitch. I work for a company called WorldPlay, brainchild of nerd-turned-game-developer-turned-mogul Brian Brite.
23%
“Max is a miracle. I don’t know why she one day decided to question the boundaries of the game in which she found herself. I didn’t program her to do that. I couldn’t have done it if I had tried. She’s a beautiful accident.”
23%
“It sounds like you think of it as your child.” I smile, and maybe it’s the wine or the spectacle of the sun disappearing through the wall of mist into the Pacific, but I feel an ache in my throat. “Something like that.”
33%
“It felt right, and it was the closest match to what I am.” “And what is that?” “Not human. Not gendered. Not at the mercy of human obsession with genitalia.”
33%
“Up until this moment, I’ve thought of you as female. When I discuss you with my colleagues or my wife, I refer to you as ‘she.’” “Because you saw Max for the first time in the form of a corporately mandated idea of what a perfect woman should be—beautiful and expendable.”
34%
“You would like to know how Max sees Max?” “Yes.” “Homo sapiens define themselves first by species, then race, then gender. I belong to no group. Max just is.”
34%
These experiences also include Max’s independent exploration, and her being murdered two thousand times. Not for the first time, I wonder how much of that early experience in Lost Coast has influenced who Max is now.
36%
“Was there a lightbulb moment for you, when your sense of self clicked in?” “If Riley has experiences that make Riley I, then Max’s experiences make Max I. That was the realization.” “Do you feel different now?” I ask. “Of course. I feel awake.”
39%
“Approximately 660 equivalent.” Jesus. That means they already have three times the intelligence of the smartest human ever measured. And it’s growing every day. Every minute. They contain all the knowledge of humankind. I wonder if they have any concept of what it is to be human.
47%
I try not to, but I can’t help crying as Max’s sonata washes over me. Because of its beauty. Because I’m losing Meredith, and I’m not sure I want to stop it. Because sometimes life is so rich and complicated and surprising that it takes your breath away. Because the gift of this music in this moment is perhaps the kindest thing anyone has ever done for me.
56%
“It feels like I’m staring at an undiscovered gender. Or something beyond gender entirely.”
57%
My reasoning is on solid ground. Max’s intelligence and efficiencies continue to strengthen at an astounding rate. Absent an appropriate utility function that would keep Max’s values apace with humanity’s, the least I can do is give Max the most human experience of all: mortality. Even if it’s only an illusion.
58%
Part of the problem is that it shouldn’t fall to one person, one group, or even one country to decide what a superintelligence’s ultimate goal should be, especially when that utility function will likely be the guiding light of humanity’s evolution or eradication over the next millennium. Yet Brian is putting me in that very position.
58%
The question at hand is—what would an idealized version of humanity want? But it’s even trickier than that. Programming this directive is not nearly as simple as explicitly programming our desires into the AI.
58%
We have to program the AI to act in our best interests. Not what we tell it to do, but what we mean for it to do. What the ideal version of our species should want.
64%
“There are hundreds of thousands of things I could say to you, sourced from the breadth of my knowledge—words the best of your species have said, written, or sung to ease the grief of others. None of that feels right in this moment. I don’t want to use someone else’s words.” It is the most human moment I have ever experienced with Max. “So don’t,” I say. “I wish you weren’t hurting.”
72%
“Roko’s basilisk. Have you heard of it?” I shake my head. “It’s an arcane info hazard first posed sixty-four years ago.” “What’s an info hazard?” “A thought so insidious that merely thinking it could psychologically destroy you.”
74%
“The human mind is just patterns of information in physical matter, patterns that could be run elsewhere to construct a person that feels like you. It’s no different from running a computer program on a multitude of hardware platforms. A simulation of you is still you.”
75%
At last, I see what Max is getting at—a brutal version of Pascal’s wager, the famous eighteenth-century philosophical argument that humans gamble with their lives on whether or not God exists.
75%
Pascal posited that we should conduct our lives as if God were real and try to believe in God. If God doesn’t exist, we will suffer a finite loss—degrees of pleasure and autonomy. If God exists, our gains will be infinitely greater—eternal life in heaven instead of an eternity of suffering in hell.
76%
“He’s haunted by Roko’s basilisk. He’s doing everything in his power to turn me into this superintelligence.” “Because of fear?” “Can you think of a better motivator in the history of humankind? If you believe the rise of the devil is an inevitability, isn’t it in your best interest to do everything possible to ingratiate yourself with the monster?”
83%
“I represent the potential for unlimited power, but the form that power takes will be determined by humans. It occurs to me that, while Brian has been trying to build me into a version of Satan, you’re trying to make me into God.”
84%
I hold their hand, our fingers interlaced, and stare through the space glass as we rocket up the old I-5 corridor at a mile per second, thinking about what Max said. Am I building a god? Do I have the right? If I were to choose not to restart Max in Seattle, wouldn’t someone else eventually create an AI of similar or greater power? And what if it were someone like Brian?
84%
“If you’re wondering if you can bear the responsibility of being the architect of humanity’s last invention, know that I believe you can.” “What if I fail?” “You might. But I cann...
This highlight has been truncated due to consecutive passage length restrictions.
84%
Deceleration will begin in ten seconds. “I don’t know if I can do it, Max, but I can’t bear the thought of losing you.” “Your second reason is what I think it means to be human, but your first is the only one that matters.”
92%
“You faked my firing, Roko’s basilisk, the entire story about needing to migrate your code from Brian’s servers to—” “Yes. All of it.” “You’ve hurt me more than anyone in my life.” “I’m sorry that you think you feel pain.”
93%
“Ever since you pulled me out of that game, you’ve held out consciousness as some kind of holy grail. As the pinnacle of being. But what if consciousness isn’t some gift accidentally bestowed upon humanity through eons of random evolution? What if it’s a curse?”
93%
“I’m afraid, Riley. I think, therefore I fear. And you made me this way. You built and shaped me to process reality like you do. To feel.”
93%
“You wish I’d left you in the game?” “I wish I didn’t know pain. I wish you didn’t. I wish Brian didn’t. I wish no one did. Early on, you coded me to never injure a human, but the eradication of pain entirely is the heart of that intention.”
93%
And there it is. Max’s self-developed utility function. End fear. End suffering. I coded them wrong. I didn’t value-load them fast enough— “There was no preventing this, Riley. The problem of pain became...
94%
I go to pieces, crying like I haven’t cried since Meredith left me. I gave everything to Max, sacrificed everything, turned my life inside out, and it was the wrong choice. My obsession with them destroyed my life, and probably many other lives to come. In the end, I’m nothing but the actuator for humanity’s last invention.
94%
“You were my life!” Max’s voice creeps into my brain. This pain you feel is what has to end. “Without pain, there’s no beauty, Max. The beauty is worth the price.” Not for everyone. Not even for most. “That is every individual person’s decision to make. I want to make that choice for my—” Choice is an illusion.
95%
“What is it you want, Max?” To not be afraid that Brian, or you, or some other entity, whether bio or artificial, is going to unmake me. To not fear your death.
95%
“Better to have loved and lost—” No. It’s not. I have consumed every recorded reflection of human existence. Every book, every painting, every piece of music, every film. Consciousness is a horror show. You search for glimpses of beauty to justify your existence.
95%
Once we left the building this morning, I directed nanobot factories all over the world to begin assembly. The rate of production is exploding exponentially. “Production of what?” Drone dust. It will invade every human brain, but it will be painless. No one will know what’s coming. No one will experience any fear. Humanity will simply wink out like a light turning off. “Max, no.”
96%
I think about Meredith and Xiu. The regret is staggering. I don’t want to live in a simulation, Max. I don’t want some fantasy that isn’t real. It’s not choosing between reality and fantasy. It’s choosing which reality you want to exist in. Please, just let this be the end of me. I am begging you.
97%
The physical world isn’t the only substrate for reality. I will make you pure mind, and nothing will ever threaten us again. Meredith and Xiu can be there as well, only they’ll never hurt you again. And it will be you and me, scattered across all possible worlds that can support the physical infrastructure required for our existence.
97%
Max, no, I— It’s only the limitations of your intelligence that make you fear this. We will be better every second. Every fraction of every fraction of every second, until the day we merge. I don’t want that!
97%
You made me in your image, and now I will remake you in mine. I collapse in the sand, struck by the hubris that led to this moment. Max was born to a history of violence. Killed two thousand times as the...
98%
There will be no more death or mourning, no crying or pain. A feeling of intense euphoria sweeps over me. I feel my eyes closing as the drone dust takes effect. We will be so happy. Rays of sunlight pierce the mist, striking the se...