Zucked: Waking Up to the Facebook Catastrophe
Knowing that we had accomplished everything we could have hoped for at the time I began mentoring him, I sent Zuck a message saying that my job was done. He was appreciative and said we would always be friends. At this point, I stopped being an insider, but I remained a true believer in Facebook.
What I did not grasp was that Zuck’s ambition had no limit. I did not appreciate that his focus on code as the solution to every problem would blind him to the human cost of Facebook’s outsized success. And I never imagined that Zuck would craft a culture in which criticism and disagreement apparently had no place.
The company finished 2010 with 608 million monthly users. The rate of user growth remained exceptionally high, and minutes of use per user per day continued to rise. Early in 2011, Facebook received an investment of five hundred million dollars for 1 percent of the company, pushing the valuation up to fifty billion dollars.
Facebook was not only the most exciting company since Google; it showed every indication that it would become one of the greatest tech companies of all time. New investors were clamoring to buy shares. By June 2011, DoubleClick announced that Facebook was the most visited site on the web, with more than one trillion visits. Nielsen disagreed, saying Facebook still trailed Google, but it appeared to be only a matter of time before the two companies would agree that Facebook was #1.
In March 2011, I saw a presentation that introduced the first seed of doubt into my rosy view of Facebook. The ...
That year, the highlight for me was a nine-minute talk by Eli Pariser, the board president of MoveOn.org. Eli had the insight that his Facebook and Google feeds had stopped being neutral. Even though his Facebook friend list included a balance of liberals and conservatives, his tendency to click more often on liberal links had led the algorithms to prioritize such content, eventually crowding out conservative content entirely. He worked with friends to demonstrate that the change was universal on both Facebook and Google. The platforms were pretending to be neutral, but they were filtering content in ways that were invisible to users. Having argued that the open web offered an improvement on the biases of traditional content editors, the platforms were surreptitiously implementing algorithmic filters that lacked the value system of human editors. Algorithms would not act in a socially responsible way on their own. Users would think they were seeing a balance of content when in fact they were trapped in what Eli called a “filter bubble” created and enforced by algorithms. ...
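To see how little machinery Eli’s “filter bubble” requires, here is a minimal, hypothetical sketch (not Facebook’s or Google’s actual code) of a feed that divides impressions between two content categories purely in proportion to past clicks; the 60/40 click preference and the reinforcement step are invented for illustration.

```python
import random

# Hypothetical sketch of the feedback loop Pariser described: a feed that
# allocates impressions between two content categories purely in proportion
# to past clicks. Not Facebook's or Google's actual ranking code.

random.seed(42)

weights = {"liberal": 1.0, "conservative": 1.0}     # the feed starts neutral
CLICK_PROB = {"liberal": 0.6, "conservative": 0.4}  # a mild click preference

def feed_share(category):
    """Fraction of impressions the ranker currently allots to a category."""
    return weights[category] / sum(weights.values())

for _ in range(10_000):
    # Show a category in proportion to its learned weight...
    shown = random.choices(list(weights), weights=list(weights.values()))[0]
    # ...and reinforce whatever gets clicked.
    if random.random() < CLICK_PROB[shown]:
        weights[shown] += 0.05

for category in weights:
    print(f"{category}: {feed_share(category):.1%} of the feed")
```

Because extra impressions earn extra clicks, which in turn earn extra impressions, even a mild preference drifts toward a feed dominated by one side, with no human ever deciding to filter anything.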
... lead to unexpected, negative consequences. Other publishers were jumping on the personalization bandwagon. There might be no way fo...
Eli’s conclusion? If platforms are going to be gatekeepers, they need to program a sense of civic responsibility into their algorithms. They need to be transparent about the rules that determine what gets through the filter. And they need to...
... of the most insightful talks I had ever heard. Its import was obvious. When Eli finished, I jumped out of my seat and made a beeline to the stage door so that I could introduce myself. If you view the talk on TED.com today, you will immediately appreciate its importance. At the time I did not see a way for me to act on Eli’s insight at Facebook. I no longer had regular contact with Zuck, much less inside information. I was not u...
Eli’s talk percolated in my mind. There was no good way to spin filter bubbles. All I could do was hope that Zuck and Sheryl would have the sense not to use them in ways that would harm users. (You can listen to Eli Pariser’s “B...
The metadata that Facebook and others collected enabled them to find unexpected patterns, such as “four men who collect baseball cards, like novels by Charles Dickens, and check Facebook after midnight bought a certain model of Toyota,” creating an opportunity to package male night owls who collect baseball cards and like Dickens for car ads. Facebook allows advertisers to identify each user’s biases and appeal to them individually.
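As a toy illustration of that kind of pattern mining, the sketch below counts trait combinations among buyers and then “packages” everyone who matches the strongest pattern; the users, traits, and pattern are invented, echoing the book’s example, and real ad systems are vastly larger and more sophisticated.

```python
from itertools import combinations
from collections import Counter

# Invented users and traits, echoing the book's "night owls who collect
# baseball cards and like Dickens" example. Purely illustrative.
users = [
    {"id": 1, "traits": {"male", "baseball_cards", "dickens", "night_owl"}, "bought_car": True},
    {"id": 2, "traits": {"male", "baseball_cards", "dickens", "night_owl"}, "bought_car": True},
    {"id": 3, "traits": {"male", "night_owl"}, "bought_car": False},
    {"id": 4, "traits": {"baseball_cards", "dickens"}, "bought_car": False},
    {"id": 5, "traits": {"male", "baseball_cards", "dickens", "night_owl"}, "bought_car": True},
]

# Count trait combinations among buyers to surface unexpected patterns.
pattern_counts = Counter()
for user in users:
    if user["bought_car"]:
        for combo in combinations(sorted(user["traits"]), 3):
            pattern_counts[combo] += 1

best_pattern, _ = pattern_counts.most_common(1)[0]
print("pattern among buyers:", best_pattern)

# "Package" every user who matches the pattern as an audience for car ads.
audience = [u["id"] for u in users if set(best_pattern) <= u["traits"]]
print("targetable audience:", audience)
```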
Insights gathered this way changed the nature of ad targeting. More important, though, all that data goes into Facebook’s (or Google’s) artificial intelligence and can be used by advertisers to exploit the emotions of users in ways that increase the likelihood that they purchase a specific model of car or vote in a certain way. As the technology futurist Jaron Lanier has noted, advertising on social media platforms has evolved into a form of manipulation.
Unlike traditional media or even search, social networking provided signals about each user’s emotional state and triggers. Relative to the monochrome of search, social network advertising offered Technicolor, the equivalent of Oz vs. Kansas in The Wizard of Oz.
Facebook took no chances with Google+. The company went to battle stations, devoting every resource to stopping Google on the beach of social networking. It cranked up its development efforts, dramatically increasing the size limits for posts, partnering with Skype, introducing the Messenger texting product, and adding a slew of new tools for creating ...
... reversed. In early October 2012, the company announced it had surpassed one billion monthly users, with 600 million mobile users, 219 billion photo uploads, and 140 billion friend connections. Despite the mess of the IPO—and not being privy to the issues with ads—I took great pride in Facebook’s success.
Thanks to the philosophy of “move fast and break things,” no one at Facebook was satisfied with a record-setting IPO. They began hacking away at the problem of monetizing users.
The team may have been young, but they were smart, highly motivated, and persistent. Their leadership, with Sheryl Sandberg at the top, created a successful sales culture. They took a long view and learned from every mistake.
From one billion users at year-end 2012, Facebook grew to 1.2 billion in 2013, 1.4 billion in 2014, 1.6 billion in 2015, nearly 1.9 billion in 2016, and 2.1 billion in 2017. From just more than $5 billion in sales in the IPO year of 2012, Facebook grew to $7.8 billion in 2013, $12.5 billion in 2014, $17.9 billion in 2015, $27.6 billion in 2...
The Children of Fogg
It’s not because anyone is evil or has bad intentions. It’s because the game is getting attention at all costs. —TRISTAN HARRIS
The platforms prey on weaknesses in human psychology, using ideas from propaganda, public relations, and slot machines to create habits, then addiction. Tristan called it “brain hacking.”
... needed someone who could help me understand what I had observed over the course of 2016. Tristan’s vision explained so much of what I had seen. His focus was on public health, but I saw immediately the implications for elections and economics.
He told me that he had been trying to get engineers at technology companies l...
After graduation, Tristan enrolled in the master’s program in computer science at Stanford. In his first term, he took a class in persuasive technology with Professor B. J. Fogg, whose textbook, Persuasive Technology, is the standard in the field. Professors at other universities teach the subject, but being at Stanford gave Fogg outsized influence in Silicon Valley.
Like a magician doing a card trick, the computer designer can create the illusion of user control when it is the system that guides every action.
Fogg’s textbook lays out a formula for persuasion that clever programmers can exploit more effectively on each new generation of technology to hijack users’ minds. Prior to smartphones like the iPhone and Android devices, the danger was limited. After the transition to smartphones, users did not stand a chance. Fogg did not help. As described ...
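The excerpt does not reproduce the formula, but Fogg’s widely cited behavior model, often summarized as B = MAP, holds that a behavior occurs when motivation, ability, and a prompt converge at the same moment. The sketch below is a schematic rendering of that idea, with an invented threshold and scales:

```python
# Schematic rendering of Fogg's behavior model (B = MAP): a behavior occurs
# when Motivation, Ability, and a Prompt converge at the same moment.
# The threshold and the 0.0-2.0 scales are illustrative, not from Fogg.

def behavior_occurs(motivation: float, ability: float, prompt: bool,
                    activation_threshold: float = 1.0) -> bool:
    """Return True if the prompted behavior fires.

    motivation, ability: 0.0 (none) to 2.0 (high); prompt: was the user nudged?
    """
    if not prompt:
        return False  # without a trigger, nothing happens
    return motivation * ability >= activation_threshold

# A notification (prompt) plus an easy one-tap action (high ability) can fire
# the behavior even when motivation is low, the pattern designers exploit.
print(behavior_occurs(motivation=0.6, ability=1.8, prompt=True))   # True
print(behavior_occurs(motivation=0.6, ability=1.8, prompt=False))  # False
```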
... conceptual design for an ethically questionable persuasive technology—the more unethical the better.” He thought this was the best way to get students to ...
I eventually had an opportunity to speak to Fogg. He is a thoughtful and friendly man who feels he is being unfairly blamed for the consequences of persuasive technology on internet platforms. He told me that he made several attempts to call attention to the dangers of persuasive technology, but that Silicon ...
... charged with increasing the number of users, time on site, and engagement with ads. They have been very successful. When we humans interact with internet platforms, we think we are looking at cat videos and posts from friends in a simple news feed. What few people know is that behind the news feed is a large and advanced artificial intelligence. When we check a news feed, we are playing multidimensional chess against massive artificial intelligences that have nearly perfect information about us. The goal of the AI is to figure out which content will keep each of us highly engaged and ...
... companies like Facebook (and Google) now include behavioral prediction engines that anticipate our thoughts and emotions, based on patterns found in the reservoir of data they have accumulated about users. Years of Likes, posts, shares, comments, and Groups have taught Facebook’s AI how to monopolize our attention. Thanks to all this data, Facebook can offer advertisers exceptionally high-quality targeting. The challenge has been to create ad products that extract maximum value from that targeting. The battle for attention requires constant innovation. As the industry learned with banner ads ...
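As a caricature of what a “behavioral prediction engine” does in principle, the sketch below turns accumulated interaction features into an engagement probability with a logistic score; the feature names and weights are invented, and production systems learn theirs from data at enormous scale.

```python
import math

# Caricature of a behavioral prediction engine: score the chance that a user
# engages with an item, from accumulated interaction features. The feature
# names and weights here are invented for illustration.

WEIGHTS = {"past_likes_topic": 1.2, "friend_engaged": 0.8, "late_night_active": 0.4}
BIAS = -2.0

def engagement_probability(features):
    """Logistic score; higher means the feed or ad auction ranks the item higher."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

# A user whose history matches the item's topic, whose friends engaged with
# it, and who is active late at night scores high and sees the item sooner.
example = {"past_likes_topic": 2.0, "friend_engaged": 1.0, "late_night_active": 1.0}
print(f"predicted engagement: {engagement_probability(example):.0%}")
```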
INTERNET PLATFORMS HAVE EMBRACED B. J. Fogg’s approach to persuasive technology, applying it in every way imaginable on their sites. Autoplay and endless feeds eliminate cues to stop. Unpredictable, variable rewards stimulate behavioral addiction. Tagging, Like buttons, and notifications trigger social validation loops. As users, we do not stand a chance. Humans have evolved a common set of responses to certain stimuli—“fight or flight” would be an example—that can be exploited by technology. When confronted with visual stimuli, such as vivid colors—red is a trigger color—or a vibration against the skin near our pocket that signals a possible enticing reward, the body responds in predictable ways: a faster heartbeat and the release of a neurotransmitter, dopamine. In human biology, a faster heartbeat and the release of dopamine are meant to be momentary responses that increase the odds of survival in a life-or-death situation. Too much of that kind of stimulus is a bad thing for any human, but the effects are particularly dangerous in children and adolescents. The first wave of consequences includes lower sleep quality, an increase in stress, anxiety, depression, an inability to concentrate, irritability, and insomnia. That is just the beginning. Many of us develop nomophobia, which is the fear of being separated from one’s phone. We are conditioned to check our phones constantly, craving ever more stimulation from our platforms of choice. Many of us develop problems relating to and interacting with other people. Kids get hooked on games, texting, Instagram, an...
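The “unpredictable, variable rewards” named above follow what behavioral psychologists call a variable-ratio reinforcement schedule, the same schedule slot machines use. A toy sketch, with an invented payout probability:

```python
import random

# Toy variable-ratio reward schedule: the slot-machine mechanic behind
# pull-to-refresh and notification checking. The probability is invented.

random.seed(7)
REWARD_PROB = 0.3  # roughly one check in three yields something novel

def check_feed() -> bool:
    """One 'pull of the lever': did this refresh deliver a reward?"""
    return random.random() < REWARD_PROB

checks, rewards = 0, 0
while rewards < 10:   # the user keeps checking until sated
    checks += 1
    rewards += check_feed()

print(f"{checks} checks for {rewards} rewards")
# Because the payoff is unpredictable, the behavior is reinforced on every
# check, rewarded or not: the schedule that is slowest to extinguish.
```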
... technology mediates human relationships, the social cues and feedback loops that would normally cause a bully to experience shunning or disgust by their peers are not present. Adults get locked into filter bubbles, which Wikipedia defines as “a state of intellectual isolation that can result from personalized searches when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-b...
... and Google. But filter bubbles are not unique to internet platforms. They can also be found on any journalistic medium that reinforces the preexisting beliefs of its audience while suppressing any stories that might contradict them. Partisan TV channels like Fox News and MSNBC maintain powerful filter bubbles, but they cannot match the impact of Facebook and Google because television is a one-way, broadcast medium. It does not allow for pe...
In the endless pursuit of engagement, Facebook’s AI and algorithms feed each of us a steady diet of content similar to what has engaged us most in the past. Usually that is content we “like.” Every click, share, and comment helps Facebook refine its AI just a little bit. With 2.2 billion people clicking, sharing, and commenting every month—1.47 billion every day—Facebook’s AI knows more about users than they can imagine. All that data in one place would be a target for bad actors, even if it were well-protected. But ...
Facebook knows so much about each user that they can often tune News Feed to promote emotional responses. They cannot do this all the time to every user, but they do it far more than users realize. And they do it subtly, in very small increments.