Exit Strategy (The Murderbot Diaries, #4)
1%
WHEN I GOT BACK to HaveRatton Station, a bunch of humans tried to kill me. Considering how much I’d been thinking about killing a bunch of humans, it was only fair.
3%
(I like endless historical family drama serials, but in real life, ghosts are way more annoying.)
4%
(Humans never think to tell their bots things like, say, don’t respond to random individuals wandering the outside of the station. Bots are instructed to report and repel theft attempts, but no one ever tells them not to answer polite requests from other bots.)
4%
(Possibly I was overthinking this. I do that; it’s the anxiety that comes with being a part-organic murderbot. The upside was paranoid attention to detail. The downside was also paranoid attention to detail.)
5%
had been in crowds of humans enough times by now I shouldn’t panic anymore—I had ridden on a transport with a whole crowd of humans who thought I was an augmented human security consultant and talked at me nonstop nearly the whole time. Except there was a little panic. I should be over this by now.
8%
When I put the new clothes on, I had a strange feeling I usually associated with finding a new show on the entertainment feed that looked good. I “liked” these clothes. Maybe I actually liked them enough to remove the quotation marks around “liked.” I don’t like things in general that can’t be downloaded via the entertainment feed.
16%
(For one thing, the shows and serials were trying to communicate accurately with the viewer. As far as I could tell, real humans usually didn’t know what the hell they were doing.)
22%
I’d seen similar, and better, on my shows, but seeing it in person was different. My camera angles weren’t as good, for one thing.
28%
but I wasn’t sure. Real humans don’t act like the ones in the media.
32%
I was having an emotion, and I hate that. I’d rather have nice safe emotions about shows on the entertainment media; having them about things real-life humans said and did just led to stupid decisions
33%
“Either I’m Mensah’s property, and I work for her, or I’m a free agent and I work for myself.” Glare intensifying. “Okay, so what did you hire yourself to do?”
34%
The company is like an evil vending machine, you put money in and it does what you want, unless somebody else puts more money in and tells it to stop.
35%
I took a pod up to the room and of course there was no security feed inside because of the stupid hotel wanting to lure humans in with promises of room privacy so it could record them in the public spaces.
38%
By tricky I meant I was getting an average of an 85 percent chance of failure and death, and it was only that low because my last diagnostic said my risk assessment module was wonky.
39%
“I’m the security expert. You’re the humans who walk in the wrong place and get attacked by angry fauna.
39%
This lobby was on multiple levels and had large square biozones depicting different ecologies, with furniture arranged around them. It looked nice, inviting humans to sit around and discuss proprietary information in the hotel’s choked feed so the hotel could record it and sell it to the highest bidder.
45%
I’d just discarded Plan Actually Not All That Terrible and shifted to Plan Approaching Terrible.
46%
(Which implies I did it intentionally, but I had been in a hurry and just slammed down everything with a signal.) (Yeah, so much for making this a stealth operation.)
46%
So the plan wasn’t a clusterfuck, it was just circling the clusterfuck target zone, getting ready to come in for a landing.
48%
He ignored me, but he said to Pin-Lee, “A rogue unit would have left a trail of dead bodies across this station.” I said, “Maybe I wanted the trail to start here.”
49%
(I guess you could pay off the management to let you bring in a SecUnit and weapons and do a hostage exchange, but they drew the line at giving you free feed access.)
51%
Disinformation, which is the same as lying but for some reason has a different name, is the top tactic in corporate negotiation/warfare.
54%
He hit the platform and I leaned down to give him just enough of a tap on the head to make resistance unlikely.
57%
In the shows, I saw humans comfort each other all the time at moments like this. I had never wanted that and I still didn’t. (Touching while rendering assistance, shielding humans from explosions, etc., is different.) But I was the only one here, so I braced myself and made the ultimate sacrifice. “Uh, you can hug me if you need to.”
65%
It sounds all self-sacrificing and dramatic, telling it this way. And I guess it was, maybe. What I was mostly thinking was that there wasn’t going to be one dead SecUnit on this embarkation floor, there were going to be four.
65%
Sending SecUnits after me was one thing. But they sent SecUnits after my client. No one gets to walk away from that.
66%
It would have been hilarious if I wasn’t about to die. It was still a little hilarious.
68%
For fuck’s sake, these humans are always in the way, trying to save me from stuff.
69%
That’s when it dawned on me that Hostile One was a Combat SecUnit. Reaction 1: oh, that’s who had hacked my code. Reaction 2: flattering that they thought I was dangerous enough to pay for the contract on a Combat SecUnit. Reaction 3: I bet PortSec did not okay that and was going to be pissed off. Reaction 4: oh shit I’m going to die.
70%
it was hard to come up with a decent argument for free will. I’m not sure it would have worked on me, before my mass murder incident. I didn’t know what I wanted (I still didn’t know what I wanted) and when you’re told what to do every second of your existence, change is terrifying.
84%
(“I don’t want to be a pet robot.” “I don’t think anyone wants that.” That was Gurathin. I don’t like him. “I don’t like you.” “I know.” He sounded like he thought it was funny. “That is not funny.” “I’m going to mark your cognition level at fifty-five percent.” “Fuck you.” “Let’s make that sixty percent.”)
85%
(The bad thing about having emotions is, you know, OH SHIT WHAT THE HELL HAPPENED TO ME.)
85%
I don’t want to be human.” Dr. Mensah said, “That’s not an attitude a lot of humans are going to understand. We tend to think that because a bot or a construct looks human, its ultimate goal would be to become human.” “That’s the dumbest thing I’ve ever heard.”)
87%
It was very dramatic, like something out of a historical adventure serial. Also correct in every aspect except for all the facts, like something out of a historical adventure serial.