Kindle Notes & Highlights
The goal for those who make the apps is to link surveillance with the feeling that we are cared for. If our apps take “care” of us, we are not focused on what they take from us.
In our new data regime, the goal is for everyone to be unaware, or at least to forget in the moment, that surveillance exists. The regime works best if people feel free to “be themselves.” That way they can provide “natural data” to the system.
We conform because what is shown to us online is shaped by our past interests.
The web promises to make our world bigger. But as it works now, it also narrows our exposure to ideas.
“The most successful tyranny is not the one that uses force to assure uniformity, but the one that removes awareness of other possibilities.”
Little by little, as new things show up on the screen, you watch passively while the web actively constructs its version of you.
“Crowdsourcing” your reading preferences, says Richards, drives you “to conformity and the mainstream by social pressures.”
In the 1960s, a computer program called ELIZA, written by MIT’s Joseph Weizenbaum, adopted the “mirroring” style of a Rogerian psychotherapist. So, if you typed, “Why do I hate my mother?” ELIZA might respond, “I hear you saying that you hate your mother.”
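The mirroring technique itself is simple enough to sketch in a few lines of code. What follows is a minimal illustrative reconstruction in Python, not Weizenbaum's original program (ELIZA was written in MAD-SLIP, and its DOCTOR script used a richer keyword-ranking mechanism); the patterns, pronoun table, and response templates here are assumptions chosen only to reproduce the exchange quoted above.

```python
import re

# Pronoun swaps applied to the user's words before they are reflected back,
# so that "my mother" in the input becomes "your mother" in the reply.
REFLECTIONS = {
    "i": "you",
    "me": "you",
    "my": "your",
    "am": "are",
    "you": "I",
    "your": "my",
}

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

# (pattern, response template) pairs, tried in order; the first match wins.
# These rules are illustrative, not a transcription of the DOCTOR script.
RULES = [
    (r"why do i (.*)", "I hear you saying that you {0}."),
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i (.*)", "I hear you saying that you {0}."),
    (r"(.*)", "Please tell me more."),
]

def respond(statement: str) -> str:
    """Return a Rogerian-style reflection of the user's statement."""
    cleaned = statement.strip().rstrip(".!?").lower()
    for pattern, template in RULES:
        match = re.match(pattern, cleaned)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."

print(respond("Why do I hate my mother?"))
# -> I hear you saying that you hate your mother.
```

The design point is how little machinery is needed: a pronoun table and a handful of pattern-to-template rules produce replies that feel attentive, which is exactly why people confided in ELIZA.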
Machines with voices have particular power to make us feel understood.
It takes serious mental effort to distinguish human speech from the machine-generated kind.
Machines with humanlike faces have particular power as well.
In humans, the shape of a smile or a frown releases chemicals that affect…
The face communicates, “Thou shalt not kill me.” We are bound by the face even before we know what stands behind it, even before we might learn it is the face of a machine that cannot be killed.
We are in fact triggered to seek empathy from an object that has none to give.
Children see sociable robots as “alive enough” to have their own agendas. Children attach to them not with the psychology of projection but with the psychology of relational engagement, more in the way they attach to people.
Children consider winning the heart of a sociable robot to be a personal achievement. You’ve gotten something lovable to love you.
“Emotionally, what positive thing would we have given to these children if the robots had been in top form?” Why do we propose machine companionship to children in the first place? For a lonely child, a conversational robot is a guarantee against rejection, a place to entrust confidences. But what children really need is not the guarantee that an inanimate object will simulate acceptance. They need relationships that will teach them real mutuality, caring, and empathy.
In the case of a robot babysitter, you already have a problem when you have to explain to a child why there isn’t a person available for the job.
Even as we treat machines as if they were almost human, we develop habits that have us treating human beings as almost-machines.
We regularly put people “on pause” in the middle of a conversation in order to check our phones.
When people give us less, talking to machines doesn’t seem as much of a downgrade.
Until a machine replaces the man, surely he summons in us the recognition and respect you show a person. Sharing a few words at the checkout may make this man feel that in his job, this job that could be done by a machine, he is still seen as a human being.
We want more from technology and less from each other.
It used to be that we imagined our mobile phones were there so that we could talk to each other. Now we want our mobile phones to talk to us.
We want technology to step up as we ask people to step back.
People are lonely and fear intimacy, and robots seem ready to hand. And we are ready for their company if we forget what intimacy is. And having nothing to forget, our children learn new rules for when it is appropriate to talk to a machine.
If Tara can “be herself” only with a robot, she may grow up believing that only an object can tolerate her truth.
In psychotherapy, conversation cures because of the relationship with the therapist.
“If people say they would be happy talking to a robot, if they want a friend they can never disappoint, if they don’t want to face the embarrassment or vulnerability of telling their story to a person, why do you care?”
Why not turn this question around and ask, “Why don’t we all care?” Why don’t we all care that when we pursue these conversations, we chase after a fantasy? Why don’t we think we deserve more?
Its premise: Whenever robots take over a human function, the next thing that people get to do is a more human thing.
The argument has two parts. First, robots make us more human by increasing our relational options because now we get to relate to them, considered as a new “species.”
Second, whatever people do, if a robot can take over that role, it was, by definition, not specifically human.
We redefine what is human by what technology can’t do.
We declare computers intelligent if they can fool us into thinking they are people. But that doesn’t mean they are.
Robots, when improved, are proposed as teachers, home assistants, and best friends to the lonely, both young and old. But particularly to the old.
Too many older people, not enough younger ones to take care of them.
I’ve heard echoes of “There are no people for these jobs” in conversations with people who are not in the robot business at all—carpenters, …
When they say this, they often suggest that the people who are available for “these jobs” are not the right people. They might steal. They might be inept or even abusive. Machines would be less risky.
“I would rather have a robot take care of my mother than a high school dropout. I know who wo...
So what are we talking about when we talk about conversations with machines? We are talking about our fears of each other, our disappointments with each other. Our lack of community. Our lack of time.
We live at the robotic moment, not because the robots are ready for us, but because we are counting on them.
Relationship-wise, you’re not going to be afraid of a robot cheating on you, because it’s a robot. It’s programmed to stay with you forever.
Robots offer relationship without risk and “nothing bad is going to happen” from having a robot as a friend or, as this girl imagines it, a romantic partner.
The first problem: the time we spend with robots is time we’re not spending with each other.
The second problem: although always-available robot chatter is a way to never feel alone, we will be alone, engaged in “as-if” conversations.
Although simulated thinking might be thinking, simulated feeling is never feeling, simulated love is never love.
Nurturance turns out to be a “killer app.” Once we take care of a digital creature or teach or amuse it, we become attached to it, and then behave “as if” the creature cares for us in return.
Children become so convinced that sociable robots have feelings that they are no longer willing to see people as special because of their emotional lives.
A fifteen-year-old boy remarks that every person is limited by his or her life experience, but “robots can be programmed with an unlimited amount of stories.” So in his mind, as confidants, the robots win on expertise.