Kindle Notes & Highlights
But you can’t put something as dumb as a hauler bot in charge of security for anything without spending even more money for expensive company-employed human supervisors. So they made us smarter. The anxiety and depression were side effects.
When a major character died in the twentieth episode I had to pause seven minutes while it sat there in the feed doing the bot equivalent of staring at a wall, pretending that it had to run diagnostics. Then four episodes later the character came back to life and it was so relieved we had to watch that episode three times before it would go on.
Why should I care? I liked humans, I liked watching them on the entertainment feed, where they couldn’t interact with me. Where it was safe. For me and for them.
I had no reason to trust it. Except the way it kept wanting to watch media about humans in ships, and got upset when the violence was too realistic.
I put on my best neutral expression, the one I used when the extra download activity had been detected and the deployment center’s supervisor was blaming the human techs for it. I walked up to the table and said, “Hello.”
I said, “Sometimes people do things to you that you can’t do anything about. You just have to survive it and go on.”
There was still power in the batteries, though not much. It had been left here, forgotten, slowly dying in the darkness as the hours ticked away. Not that I was feeling morbid, or anything.
Finally the tube’s scan picked up a blockage ahead and threw an alarm code. I had five episodes of different drama series, two comedies, a book about the history of the exploration of alien remnants in the Corporation Rim, and a multi-part art competition from Belal Tertiary Eleven queued and paused, but I was actually watching episode 206 of Sanctuary Moon, which I’d already seen twenty-seven times.
Young humans can be impulsive. The trick is keeping them around long enough to become old humans. This is what my crew tells me and my own observations seem to confirm it.
Picking up on my reaction, ART said, What does it want? To kill all the humans, I answered. I could feel ART metaphorically clutch its function. If there were no humans, there would be no crew to protect and no reason to do research and fill its databases. It said, That is irrational. I know, I said, if the humans were dead, who would make the media? It was so outrageous, it sounded like something a human would say.
I said, “I’ve got to go,” and walked away down the mall. Fading, already disengaging from its lock, ART said in my feed, Be careful. Find your crew. I tapped the feed in acknowledgment, because if I tried to say anything else I was going to sound stupid and emotional.