The first approach has been tried many times in both science fiction and reality. In this new novella, at over 30,000 words, his longest work to date, Ted Chiang offers a detailed imagining of how the second approach might work within the contemporary landscape of startup companies, massively-multiplayer online gaming, and open-source software. It's a story of two people and the artificial intelligences they helped create, following them for more than a decade as they deal with the upgrades and obsolescence that are inevitable in the world of software. At the same time, it's an examination of the difference between processing power and intelligence, and of what it means to have a real relationship with an artificial entity.
150 pages, Hardcover
First published July 31, 2010
“You become responsible, forever, for what you have tamed.”
- Antoine de Saint-Exupéry, “The Little Prince”

Creating virtual pet-like “digital entities” (digients) capable of developing intelligence (childlike at first) and speech by being raised and cared for in a manner not so different from raising human children — while looking like cute animated characters and, being software, coming with the option of being suspended when the novelty wears off or reset to a previous checkpoint if something unwanted happens — seemed like a marketable idea, and for a while it generated profits and buzz.
Then the novelty wore off, the fad passed, and only a few obsessed owners stuck with their digients, continuing to raise and care for them even as the platform they ran on became obsolete. For these owners, the digients became more than the intended virtual pets — the years devoted to their upbringing produced sentient, reasonably intelligent nonhuman virtual entities, persons in every sense of the word except the legal one (though even that can be achieved via incorporation; as we all learned in the not-so-recent past, corporations are people).
“[…] If legal personhood is to be more than a form of wordplay, it has to mean granting a digient some degree of autonomy.”

Ana is raising her digient Jax; Derek is raising Marco and Polo. It is not much different from raising children, although unlike child-rearing, digient-rearing invites some raised eyebrows. Derek gets a divorce; Ana knows that if forced to choose between her real-life boyfriend and her virtual digient, she would choose the “unreal” one. The former childlike pets are now teenagers, testing limits and craving connections, as the world of their mostly abandoned digital platform constricts around them and the funds needed to port them to a more populated virtual reality are only really offered by the porn industry.
“The idea of love with no strings attached is as much a fantasy as what Binary Desire is selling. Loving someone means making sacrifices for them.”
“Pearson nods again, his suspicions confirmed. “That’s a deal-breaker for us. It’s nice that they’re fun to talk to, but all the attention you’ve given your digients has encouraged them to think of themselves as persons.”

What do we want AI for? The obvious answer is an obedient, likable, and empathetic but distinctly “not a person” servant, really. But does personhood depend on whether you are software-based or meatware-based? If the interactions and love are real, if there is sentience and intelligence and personality, if the years you devoted have raised an actual *being*, albeit a distinctly non-human one — does it matter that the person exists in virtual space? Does it cheapen your interactions, does it invalidate your love or responsibility? Are we prepared to see AIs as persons, making their own decisions, choices, mistakes, transactions? Or do we want a product, convenient but safe and disposable?
“Why is that a deal-breaker?” But she knows the answer already.
“We aren’t looking for superintelligent employees, we’re looking for superintelligent products. You’re offering us the former, and I can’t blame you; no one can spend as many years as you have teaching a digient and still think of it as a product. But our business isn’t based on that kind of sentiment.”
Ana has been pretending it wasn’t there, but now Pearson has stated it baldly: the fundamental incompatibility between Exponential’s goals and hers. They want something that responds like a person, but isn’t owed the same obligations as a person, and that’s something she can’t give them.
“If these digients were going to be products, the potential profits might be worth the risk. But if all they’re going to be is employees, that’s a different situation; we can’t justify such a large investment for so little return.”
“It’s possible he doesn’t fully appreciate the consequences of what he’s suggesting, but Derek can’t shake the feeling that Marco in fact understands his own nature better than Derek does. Marco and Polo aren’t human, and maybe thinking of them as if they were is a mistake, forcing them to conform to his expectations instead of letting them be themselves. Is it more respectful to treat him like a human being, or to accept that he isn’t one?”

I love the concept, love Chiang’s writing, love how much it made me think. But I can’t help feeling that it should have gone further, continued to develop what it started, shown us more consequences of the decisions made (and they are not easy decisions, either).
"Low expectations are a self-fulfilling prophecy. If we aim high, we'll get better results."
"Every quality that made a person more valuable than a database was a product of experience."
"…I don't think you understand what they want to do."
Marco gives him a look of frustration. "I do. They make me like what they want me like, even if I not like it now."
Derek realizes Marco does understand. "And you don't think that's wrong."
"Why wrong? All things I like now, I like because Blue Gamma made me like. That not wrong."
Derek feels himself growing exasperated. "So do you want to be a corporation and make your own decisions, or do you want someone else to make your decisions? Which one is it?"
Marco thinks about that. "Maybe I try both. One copy me become corporation, second copy me work for Binary Desire."
"You don't mind having copies made of you?"
"Polo copy of me. That not wrong."