Kindle Notes & Highlights
Read between March 3 - March 7, 2021
we have fallen for the idea that these services are “free.” In reality, we pay with our data into a business model of extracting human attention.
Platforms started to mimic casinos, with innovations like the infinite scroll and addictive features aimed at the brain’s reward systems.
But with the advent of the Internet, it became possible to create commodities out of our lives—our behavior, our attention, our identity. People were processed into data. We would serve as the raw material of this new data-industrial complex.
America is now living in the aftermath of the first scaled deployment of a psychological weapon of mass destruction.
Facebook is no longer just a company, I told them. It’s a doorway into the minds of the American people, and Mark Zuckerberg left that door wide open for Cambridge Analytica, the Russians, and who knows how many others. Facebook is a monopoly, but its behavior is more than a regulatory issue—it’s a threat to national security. The concentration of power that Facebook enjoys is a danger to American democracy.
If you’re trying to hack a person’s mind, you need to identify cognitive biases and then exploit them.
I once met the primatologist Jane Goodall, and she said something that always stuck with me. Mingling at a reception, I asked why she researched primates in the wild instead of in a controlled lab. It’s simple, she said: Because they don’t live in labs. And neither do humans. If we are to really understand people, we have to always remember that they live outside of data sets.
This leads to a psychological bias called affect heuristic, where people use mental shortcuts that are significantly influenced by emotion. It’s the same bias that makes people say things they later regret in a fit of anger—in the heat of the moment they are, in fact, thinking differently.
This primes people for identity-motivated reasoning, which is a bias that essentially makes people accept or reject information based on how it serves to build or threaten group identity rather than on the merits of the content. This motivated reasoning is how Democrats and Republicans can watch the exact same newscast and reach the opposite conclusion.
Bannon’s quest was quasi-religious, with him assuming the role of messiah.
Racism can be aversive, where a person consciously or subconsciously avoids a racial group (e.g., gated communities, sexual and romantic avoidance, etc.), and racism can be symbolic, where a person holds negative evaluations of a racial group (e.g., stereotypes, double standards, etc.).
just-world hypothesis (JWH). This is a cognitive bias where some people rely on a presumption of a fair world: The world is a fair place where bad things “happen for a reason” or will be offset by some sort of “moral balancing” in the universe. We found that people who displayed the JWH bias were, for example, more prone to victim-blaming in hypothetical scenarios of sexual assault. If the world is fair, then random bad things should not happen to innocent people, and therefore there must have been a fault in the victim’s behavior. Finding ways to blame victims is psychologically prophylactic
Kogan’s research was well suited to targeting voters with authoritarian personality traits, identifying narratives that would activate their support. After Kogan joined Cambridge Analytica’s project, CA’s internal psychology team started replicating some of his research from Russia: profiling people who were high in neuroticism and dark-triad traits. These targets were more impulsive and more susceptible to conspiratorial thinking, and, with the right kind of nudges, they could be lured into extreme thoughts or behavior.
For most of the time I was at SCL and Cambridge Analytica, none of what we were doing felt real, partly because so many of the people I met seemed almost cartoonish. The job became more of an intellectual adventure, like playing a video game with escalating levels of difficulty. What happens if I do this? Can I make this character turn from blue to red, or red to blue? Sitting in an office, staring at a screen, it was easy to spiral down into a deeper, darker place, to lose sight of what I was actually involved in.
The reaction to any problem, even one as serious as a threat to the integrity of our elections, is not “How can we fix it?” Rather, it’s “How can we monetize
The terms and conditions of Facebook’s mobile app asked for microphone and camera access. Although the company is at pains to deny pulling user audio data for targeted advertising, there is nonetheless a technical permission sitting on our phones that allows access to audio capabilities.
But Facebook’s “community” was building separate neighborhoods just for people who look like them.
articles that do nothing but reinforce a unidimensional point of view and take users to extremes to keep them clicking. What we’re seeing is a cognitive segregation, where people exist in their own informational ghettos. We are seeing the segregation of our realities. If Facebook is a “community,” it is a gated one.
segregation rests at the heart of the architectures of the Internet.
The destruction of mutual experience is the essential first step to othering, to denying another perspective on what it means to be one of us.
When Facebook goes on yet another apology tour, loudly professing that “we will try harder,” its empty rhetoric is nothing more than the thoughts and prayers of a technology company content to profit from a status quo of inaction. For Facebook, the lives of victims have become an externality of their continued quest to move fast and break things.
Because the objective of this hostile propaganda is not simply to interfere with our politics, or even to damage our companies. The objective is to tear apart our social fabric. They want us to hate one another. And that division can hit so much harder when these narratives contaminate the things we care about in our everyday lives—the clothes we wear, the sports we watch, the music we listen to, or even the coffee we drink.
We are all vulnerable to manipulation. We make judgments based on the information available to us, but we are all susceptible to manipulation when our access to that information becomes mediated. Over time, our biases can become amplified without our even realizing it. Many of us forget that what we see in our newsfeeds and our search engines is already moderated by algorithms whose sole motivation is to select what will engage us, not inform us. With most reputable news sources now behind paywalls, we are already seeing information inch toward becoming a luxury product in a marketplace where
We can already see how algorithms competing to maximize our attention have the capacity to not only transform cultures but redefine the experience of existence. Algorithmically reinforced “engagement” lies at the heart of our outrage politics, call-out culture, selfie-induced vanity, tech addiction, and eroding mental well-being.
We risk creating a society obsessive about remembering, and we may have overlooked the value of forgetting, moving on, or being unknown.
Privacy is the very essence of our power to decide who and how we want to be. Privacy is not about hiding—privacy is about human growth and agency.
This story took the leadership of dedicated women, immigrants, and queers to ignite a public awakening about the discreet colonizing power of Silicon Valley and the digital technologies they have created to surround us.
there is a fine line between an algorithm defining you in order to represent who you really are and an algorithm defining you to create a self-fulfilling prophecy of who it thinks you should become.
the rabbit hole of personalization

