Kindle Notes & Highlights
(There needs to be an error code that means “I received your request but decided to ignore you.”)
They were all annoying and deeply inadequate humans, but I didn’t want to kill them. Okay, maybe a little.
(Of course, nobody had asked me if I wanted to be a SecUnit, but that’s a whole different metaphor.) (Note to self: look up definition of metaphor.)
At first, I had needed to move fast and put as much distance between myself and its transit station as possible. (See above, murdered humans.)
because there were humans and augmented humans everywhere, all around me, looking at me, which was hell. (After meeting Ayres and the others, obviously my definition of hell changed.)
there’s the right kind of unrealistic and the wrong kind of unrealistic.
I didn’t want to see helpless humans. I’d rather see smart ones rescuing each other.
I didn’t want to say goodbye. I couldn’t save this many humans from where they were going, where they thought they wanted to go, but I didn’t have to watch it, either.
(SecUnit clients, at least, only assured each other that everything was fine while you stared at the wall and waited for everything to go horribly wrong.)
Or Miki was a bot who had never been abused or lied to or treated with anything but indulgent kindness. It really thought its humans were its friends, because that’s how they treated it. I signaled Miki I would be withdrawing for one minute. I needed to have an emotion in private.
I don’t know, everything was annoying right now and I had no idea why. Okay, Rin! Miki said. We’re friends, and friends call each other by name. Maybe I did know why.
(Somewhere there had to be a happy medium between being treated as a terrifying murder machine and being infantilized.)
Maybe they thought the systems would be lonely if they were left active, Rin, Miki said. What do you think? I wondered if ART had thought I was this stupid when it had been riding around in my head. Maybe, but the chances were good that if that had been the case, ART would have said so.
Who knew being a heartless killing machine would present so many moral dilemmas. (Yes, that was sarcasm.)
I was congratulating myself (because nobody else ever does it) on an excellent save.
Right, so the only smart way out of this was to kill all of them. I was going to have to take the dumb way out of this.
“I know that,” Gerth snapped. I know you know that, asshole.
That’s the other problem with human security: they’re allowed to give up.
Again, I know in the telling it sounds like I was on top of this situation but really, I was still just thinking, Oh shit oh shit oh shit.
(So was Miki’s; it was more protected down there since people always shoot for the head.) (At least, people always shot for my head, so I assumed they did it to bots, too.)
We were talking about GrayCris here, whose company motto seemed to be “profit by killing everybody and taking their stuff.”
It was a bot, and it wasn’t threatening us. It was just telling us what it was going to do.
Miki sent me a smile glyph. It’s good to check on our friends.
(You should never refer to the clients as targets; you don’t want to get confused at the wrong moment.) (That’s a joke.)
Wilken added, “Let’s go. I’ve got a plan.” Yeah, I’ve got a plan, too.
I hoped that made sense (it was a line from Sanctuary Moon)
Whatever they paid you, it won’t make up for a stint in prison. (Yeah, that was from Sanctuary Moon, too.)
“With Gerth at the ship, we have a hostage situation.” I hate hostage situations. Even when I’m the one with the hostages.
In my feed, Miki said, I never talked to a bot like me before. I have human friends, but I never had a friend like me.
(Frankly, I didn’t know what station security was going to do about it, either. In fact, I’m sure station security was now shitting itself almost as hard as I metaphorically was.)
Miki told her, Priority is to protect my friends. Priority change, Abene sent. Priority is to protect yourself. That priority change is rejected, Miki told her.
Why yes, I did want to disengage the safety protocols, thanks for asking.
It had decided its priority was to save its humans, and maybe to save me, too. Or maybe it had known it couldn’t save any of us, but it had wanted to give me the chance to try. Or it hadn’t wanted me to face the bot alone. Whatever it was, I’d never know.
Miki could never be my friend, but it had been her friend, and more importantly, she had been its friend. Her gut reaction in a moment of crisis was to tell Miki to save itself.
I hate caring about stuff. But apparently once you start, you can’t just stop.