Kindle Notes & Highlights
(… How humans decide what to do with their arms on a second-by-second basis, I still have no idea.)
They weren’t bringing her. This had all been for nothing. All of it, Milu, Miki’s death, the trip here, everything. I said, “Milu was my idea. I’m a rogue unit.” He ignored me, but he said to Pin-Lee, “A rogue unit would have left a trail of dead bodies across this station.” I said, “Maybe I wanted the trail to start here.” He made eye contact with me, and his pupils widened slightly. I added, “You people are so naive.”
From the humans it was all “Wait!” “No!” “Um—” “I’m not going to kill him,” I said, and dumped him on the couch. “I know what I’m fucking doing.”
I shut my risk assessment module down.
Disinformation, which is the same as lying but for some reason has a different name, is the top tactic in corporate negotiation/warfare.
In the shows, I saw humans comfort each other all the time at moments like this. I had never wanted that and I still didn’t. (Touching while rendering assistance, shielding humans from explosions, etc., is different.) But I was the only one here, so I braced myself and made the ultimate sacrifice. “Uh, you can hug me if you need to.”
She started to laugh, then her face did something complicated and she hugged me. I upped the temperature in my chest and told myself it was like first aid. Except it wasn’t entirely awful. It was like when Tapan had slept next to me in the room at the hostel, or when Abene had leaned on me after I saved her; strange, but not as horrific as I would have thought.
Huh, why did I like Sanctuary Moon so much? I had to pull the memory from my archive, and what I saw there startled me. “It’s the first one I saw. When I hacked my governor module and picked up the entertainment feed. It made me feel like a person.” Yeah, that last part shouldn’t have come out, but with all the security-feed monitoring I was doing, I was losing control of my output.
I really needed to get around to setting that one-second delay on my mouth.
The words kept wanting to come out. It gave me context for the emotions I was feeling, I managed not to say. “It kept me company without…” “Without making you interact?” she suggested. That she understood even that much made me melt. I hate that this happens, it makes me feel vulnerable. Maybe that was why I had been nervous about meeting Mensah again, and not all the other dumb reasons I had come up with. I hadn’t been afraid that she wasn’t my friend, I had been afraid that she was, and what it did to me.
I hadn’t meant to tell her and I don’t know why I did. Did I secretly want her to talk me out of it? I hate having emotions about real humans instead of fake ones, it just leads to stupid moments like this.
It sounds all self-sacrificing and dramatic, telling it this way. And I guess it was, maybe. What I was mostly thinking was that there wasn’t going to be one dead SecUnit on this embarkation floor, there were going to be four.
Sending SecUnits after me was one thing. But they sent SecUnits after my client. No one gets to walk away from that.
A lot of humans were yelling in my backburnered feed, which really made this feel like the bad old days of contract work.
For fuck’s sake, these humans are always in the way, trying to save me from stuff.
Reaction 1: oh, that’s who had hacked my code. Reaction 2: flattering that they thought I was dangerous enough to pay for the contract on a Combat SecUnit. Reaction 3: I bet PortSec did not okay that and was going to be pissed off. Reaction 4: oh shit I’m going to die.
I’m not sure it would have worked on me, before my mass murder incident. I didn’t know what I wanted (I still didn’t know what I wanted) and when you’re told what to do every second of your existence, change is terrifying. (I mean, I’d hacked my governor module but kept my day job until PreservationAux.)
What do you want? I suddenly got: I want to kill you. Okay, I was a little offended. Why? You don’t even know me.
(It’s good there’s not a separate statistic for my mental performance reliability because I don’t think even I would rate it as all that great at the moment.)
I’m a SecUnit, not an engineer.
A Palisade ship could catch the shuttle and board it. The last thing I wanted was to ask the company gunship for help. The last thing I wanted was for GrayCris to catch us. The two last things were incompatible. It was time to stop fucking around. I accessed comm and secured a feed channel to the company gunship.
Yes, that’s me they’re talking about. It would have been more funny if I hadn’t been leaking onto the deck.
I suddenly needed to see Mensah’s face and I dropped the shuttleSec camera views and looked down at her. She looked mad and exhausted, which was exactly the way I felt. I sent, You have no idea what I am. She tilted her head and looked more mad. I know exactly what you are. You’re afraid, you’re hurt, and you need to calm the fuck down so we can get through this situation alive. I said, I am calm. You need to be calm, to take over a gunship. Mensah’s eyes narrowed. Security consultants don’t get their clients into unnecessary pitched battles for control of their rescue ship. She added, Because…
She wasn’t afraid of me. And it hit me that I didn’t want that to change. She had just been through a traumatic experience, and I was making it worse. Something was overwhelming me, and it wasn’t the familiar wave of not-caring. Fine, I sent. I sounded sulky, because I was sulky. I hate emotions.
“They wouldn’t have let me through. I told PortSec if they let you through to the shuttle, I’d stay behind.” That stopped her. Her brow furrowed. “Is that why you stayed?” I could have lied. I didn’t want to.
“Mostly,” I said. I looked at her with my actual eyes again. “I wanted to win.”
And, oh right, I was in a MedSystem, which would have immediately diagnosed that I had a terminal case of being a SecUnit.
(“I don’t want to be a pet robot.” “I don’t think anyone wants that.” That was Gurathin. I don’t like him. “I don’t like you.” “I know.” He sounded like he thought it was funny. “That is not funny.” “I’m going to mark your cognition level at fifty-five percent.” “Fuck you.” “Let’s make that sixty percent.”)
(“I don’t want to be human.” Dr. Mensah said, “That’s not an attitude a lot of humans are going to understand. We tend to think that because a bot or a construct looks human, its ultimate goal would be to become human.” “That’s the dumbest thing I’ve ever heard.”)
It was very dramatic, like something out of a historical adventure serial. Also correct in every aspect except for all the facts, like something out of a historical adventure serial.
I had a complex emotional reaction. A whole new burst of neural connections blossomed. Oh right, I often have complex emotional reactions which I can’t easily interpret.