The Future of Feeling: Building Empathy in a Tech-Obsessed World
I called O’Neil because I had read her book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, about all the ways algorithms can end up creating more problems than they solve and what can be done to prevent that.
“We want to be careful about imputing motives, but you should have a spectrum of assumptions,” she said. “One of them is that they wished it was easier to fix, but they didn’t actually care about it enough or have enough time to work on it or whatever. The second possibility is that they didn’t think about it at all, they just did the easiest possible thing that seemed to work. The third is that they wanted it to be hard.”
“All of us are jacked into this system,” Tristan Harris, a former Google employee, told Lewis. “All of our minds can be hijacked. Our choices are not as free as we think they are.”
Apple, Google, and Facebook have all recently introduced so-called digital wellness features. The latest Apple operating system includes a screen-time tracker that shows not just how often you’re using your phone, but what you’re using it for. I have friends who have started to challenge themselves to get the number down—one averages about four hours per day, with the majority of that time spent on Instagram. My average is closer to two hours, depending on how much I’m craving distraction, and the vast majority of that is Instagram for me too. I know it isn’t good for me to spend so much time …
Luckily there are apps for that as well—from Moment, which tracks your usage and reminds you of daily limits, to Stay on Task, which will just pop up and ask if you are on task throughout the day, somewhat passive-aggressively reminding you to stay off social-networking and other apps and get to work.
“If people don’t see communities that are typically thought of as ‘other’ as human, I don’t think that a VR piece is going to humanize them.” It’s a false premise, Limbal argued, that relies on a trope of putting a “human face” on someone who is considered “other,” i.e., not the cultural norm of white, male, straight, cis, and able-bodied.
called The Changing Same: The Racial Justice Project.
“Especially in the emerging media space, we cannot allow the dominant culture to tell how our stories will unfold, or how they can be told, and who the audience is that we’re appealing to,” she said. “We’re putting a stake in the ground.”
“Technologies that allow us to track and sense emotions and share or broadcast them with others are going through a huge explosion of development and breakthrough research right now,” said Jane McGonigal, director
I try to ask myself the following questions: How might this improve my life or experience, or those of others? What is the potential for it to be manipulated, and are there safeguards? Is there incentive for the people in charge to monitor this—do they have skin in the game? And ultimately, do I think the rewards will outweigh the risks?
Are terms, practices, and concerns transparent and open to critique by users? Does this technology, or do its founders and funders, have a reputation for intentionally or inadvertently harming and marginalizing others? Are they actively working to change/avoid that?
Those are the parameters I have decided are most important to me right now as I consider my own relationship to technology, as well as in a not-too-distant future in which my niece and nephews—and eventually, maybe, my own children—interact daily with tools and bots I can’t even imagine yet. Your questions might be different, and they will likely evolve as time goes on. But we should continue to ask them, because barring something cataclysmic, our future will likely be even more tech-focused than t...