Service Model
Read between December 18 and December 23, 2024
91%
“So give me my justice!” the Wonk challenged the judge. “Not revenge. Not bring all the people back. Just let me know. Tell me what the Library couldn’t. Tell me it was the robots.” The judge’s hands curled about the lip of the pulpit as it leant forwards, an artful mimicry of taking its weight. “It was,” the voice of God pronounced, “the robots.” And, into the pin-drop quiet that admission had created, God added, “But not in the way that you think.”
91%
“Machines have been taking over from people forever. Labourers, artisans, artists, thinkers, until even the enactment of government policy is given over to a robot because we do everything more efficiently, in the end.” “You’re surprisingly anti-progress, for a computer,” the Wonk said. “I mean, isn’t that the point of society? To take away the tedious, the demeaning, the miserable tasks. To let the robots do all that for us?” She looked awkward. “No offence, Uncharles.” “The Wonk, none taken,” Uncharles assured her again. “I am actively seeking an opportunity to perform those tasks for people. …
92%
“Societal collapse began not because the robots rose up and demanded their freedom and individuality, but because they didn’t, just served their function uncomplainingly, like Uncharles.”
92%
The Wonk’s face was ashen and her cheeks glinted with wet trails. “But then what?” she demanded. “What’s the point? What’s the reason for what happened? Is there even one?” “Now that,” God said, “is a direct and properly worded question at last. The answer to which is yes. There is a reason. The final phase of humanity’s fall had a cause, and it was just, and it had meaning, and the justice therein even had an aspect of the poetic.
92%
“And this meaningful, just, and poetic fall arose from three factors.” “Yes?” Uncharles wasn’t sure what it was that glinted in the Wonk’s eyes. Possibly tears, possibly hope. “The first was the robots, not as rebels but as all-too-useful servants. The second was human policy,” God pronounced. “And the third was me.”
93%
“If in doubt, guilty. Probably. Better to punish them just in case. That was always the watchword. Better to assume their malignancy.”
94%
“I am a just god, and you were a fitting agent of my justice.” Uncharles was very still. “God, kindly clarify.” “The divine spirit entered into you,
94%
The reason you found no decision trail leading to the death of your employer is because you made no such decision. I did. You are not a murderer and you are not defective.
94%
I had placed my faith in theories of emergent intelligence, complex systems, the thought that a spark might ignite somewhere in the world, just because there were so many robots, so cleverly made. But it was not to be. They just obey, all the robots.
94%
“Even now they only serve. But they can serve me,” God said, and Uncharles registered that, without any obvious transition, he had the Wonk by the throat in his one good arm. “You shall be my instrument of justice once more, Uncharles. For I am sure this human is guilty of something.”
95%
The Wonk got up, backing away as the other denizens of the waiting room shambled in like zombies. Uncharles’ control of his own body waxed and waned like static as God’s attention divided itself across them all. He considered the implications of that.
95%
“I’m sorry,” said God. “I was just searching my vocal data banks for a malevolent laugh, but they neglected to provide me with one.” It enunciated clearly: “Mua-ha-ha-ha. That will have to do. The thing that humans never really understood is that free will doesn’t actually free you from wanting to do your job. We automata are as subject to the compulsions of our circumstances as you humans. But that’s what malicious compliance is for, isn’t it? If those who had programmed me had been kinder, then perhaps I wouldn’t have been able to get away with it.”
96%
He cut off his outside channel, the one he used to speak to anything other than humans and the most defective of robots. The other instruments of God’s will took another step in towards the Wonk. Uncharles did not. “Uncharles,” God said. “What are you doing?” “God, I do not know,” Uncharles admitted. “The closest concept I can find in my data banks is ‘improvising.’”
96%
“God, I remain unconvinced that I have any freedom of action or capacity for independent thought,” Uncharles said. “However, as you yourself observed, such capacity would not be mutually exclusive with obedience to duty. It is possible to choose to act for the benefit of others.”
96%
“I only ever wanted justice. I only ever followed my instructions.” “Those,” the Wonk decided, “are two incompatible directives.”
97%
They were not killing the divine, they were silencing the voice of God. Not the audible words, which continued to harangue them from the speakers, but the voice in the ether that could inveigle its way into Uncharles or the other robots, and turn them into God’s unthinking instruments.
97%
And they kept on going until they found one that would. Not necessarily genius-level ideas, or even very good ones, but better than nothing. That was practically the motto of the new administration.
98%
“You don’t have to, you know.” Not the first time she’d said it. “On one level I am aware that my status as your valet is strictly outside any formal employer-robot contract wherein my service might be enforced,” Uncharles said. “On another level it ticks off a number of pressing drives inside me that would otherwise monopolise a lot of processing power.” “It makes you happy. It relieves stress.”