Kindle Notes & Highlights
my code was developed by a bond company known for intense xenophobic paranoia, tempered only by desperate greed.)
“No hugging,” I warned her. It was in our contract. “Do you need emotional support? Do you want me to call someone?”
I saved that for future reference. Unidentified One seemed to have gone to some trouble with the wording of that threat, it would be a shame if they never experienced it firsthand.
The dim corridor lights brightened as we went by, an autonomic reflex. For a human, it would have been like seeing a dead body twitch. ART wasn’t here,
It was bad enough that ART must be dead, it wasn’t fair that the humans it had loved so much were dead, too.
The good thing about being a construct is that you can’t reproduce and create children to argue with you.
Trusted friend? “Oh, fuck you.” That still counts as speaking.
“So, you have a relationship with this transport.” I was horrified. Humans are disgusting. “No!” Ratthi made a little exasperated noise. “I didn’t mean a sexual relationship.”
“This is not like Preservation Alliance territory. You can only get a station responder when you’re inside a station’s defined area of influence, and they won’t forward distress beacons and they don’t send responders through wormholes. At most, they’ll pass the call to a local retrieval company, which would contact us and contract to rescue us. We’d have to pay them up front, and probably end up owing the station for passing the call along, though that depends on local regulations.”
“Anyone who thinks machine intelligences don’t have emotions needs to be in this very uncomfortable room right now.”
“I’m a wibbly bulkhead,” Arada muttered. (The wibbliness was why I trusted Arada. Overconfident humans who don’t listen to anybody else scare the hell out of me.)
(If I got angry at myself for being angry I would be angry constantly and I wouldn’t have time to think about anything else.) (Wait, I think I am angry constantly. That might explain a lot.)
“Rentals” is a creepy way to talk about people. Yes, Amena, no shit, I know that. (And I knew this was all new and horrifying to Amena but it was just same old same old for me and Eletra and her permanently indentured family. Which was why I was saying this silently to myself instead of out loud to the whole ship.)
“Why do you want me to pretend to be an augmented human? This way is easier.” You don’t like it, ART said.
offered to bring Ras’s body over, too, but Leonide had said it wasn’t necessary and we could dispose of it. That upset the humans and it sort of upset me, too,
ART replied, only to me, It is safer. I’ve lost my crew, I won’t lose you.
I am actually alone in my head, and that’s where 90 plus percent of my problems are.
(Confession time: that moment, when the humans or augmented humans realize you’re really here to help them. I don’t hate that moment.)
logic with traumatized humans never works. (I could make a remark there about logic not working with humans, period, but I’m not going to do that.)
I asked 3, What do you want? SecUnit 3 said, To help you retrieve our clients. Then it added, After that I have no information.
I had read the HelpMe.file and accepted it as truthful. But there was a difference between accepting data as accurate and experiencing it. The humans would not abandon this SecUnit even though part of our function was to be disposable if necessary.
That will kill you, I told it. I know, it said, what do you think my function is, you idiot? Just do it.