Kindle Notes & Highlights
But by directly communicating select messages to select voters, the microtargeting of the Obama campaign had started a journey toward the privatization of public discourse in America.
When campaigns were conducted in private, the scrutiny of debate and publicity could be avoided.
With the ascendancy of social media, we have been forced to place our trust in political campaigns to be honest, because if lies are told, we may never notice. There is no one there to correct the record inside a private ad network.
In reality, we pay with our data into a business model of extracting human attention.
Presidential debates suddenly shifted from policy positions into bizarre arguments about what was real news and what was fake news. America is now living in the aftermath of the first scaled deployment of a psychological weapon of mass destruction.
I moved fast, I built things of immense power, and I never fully appreciated what I was breaking until it was too late.
Facebook is no longer just a company, I told them. It’s a doorway into the minds of the American people, and Mark Zuckerberg left that door wide open for Cambridge Analytica, the Russians, and who knows how many others. Facebook is a monopoly, but its behavior is more than a regulatory issue—it’s a threat to national security. The concentration of power that Facebook enjoys is a danger to American democracy.
We are socialized to place trust in our institutions—our government, our police, our schools, our regulators. It’s as if we assume there’s some guy with a secret team of experts sitting in an office with a plan, and if that plan doesn’t work, don’t worry, he’s got a plan B and a plan C—someone in charge will take care of it. But in truth, that guy doesn’t exist. If we choose to wait, nobody will come.
Behind the campaign was the emerging practice of microtargeting, where machine-learning algorithms ingest large amounts of voter data to divide the electorate into narrow segments and predict which individual voters are the best targets to persuade or turn out in an election.
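The mechanics sketched in that passage can be illustrated with a toy model: per-voter features feed a scoring function, and a threshold carves out the narrow segment worth targeting. This is a minimal sketch under invented assumptions; the voter fields, weights, and threshold are all made up for illustration and do not describe any campaign's actual pipeline.

```python
# Toy microtargeting sketch: score voters on predicted persuadability,
# keep only the top segment. All features and weights are invented.
import math

# Hypothetical per-voter features (values are made up).
voters = [
    {"id": "v1", "turnout_history": 0.2, "issue_engagement": 0.9},
    {"id": "v2", "turnout_history": 0.95, "issue_engagement": 0.1},
    {"id": "v3", "turnout_history": 0.5, "issue_engagement": 0.7},
]

# Logistic-regression-style weights standing in for a trained model;
# a real pipeline would learn these from historical contact data.
WEIGHTS = {"turnout_history": -2.0, "issue_engagement": 3.0}
BIAS = -0.5

def persuasion_score(voter):
    """Logistic score: high engagement + low past turnout => persuadable."""
    z = BIAS + sum(WEIGHTS[k] * voter[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def target_segment(voters, threshold=0.5):
    """The narrow segment: only voters whose score clears the threshold."""
    return [v["id"] for v in voters if persuasion_score(v) >= threshold]

print(target_segment(voters))
```

The point of the sketch is the division of labor the passage describes: the model does the segmenting, and the campaign only ever speaks to the segment it selects.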
The five-factor model of personality represents personality as a set of ratings on five scales: openness, conscientiousness, extraversion, agreeableness, and neuroticism.
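A minimal way to represent such a profile in code, assuming a 0-to-1 rating per trait (the trait names come from the model itself; the scale and the sample values here are assumptions for illustration):

```python
# A personality profile as five 0-1 ratings (the "OCEAN" traits).
# The 0-1 scale and the example values are assumptions, not real data.
from dataclasses import dataclass

@dataclass(frozen=True)
class OceanProfile:
    openness: float
    conscientiousness: float
    extraversion: float
    agreeableness: float
    neuroticism: float

    def dominant_trait(self) -> str:
        """Return the name of the highest-rated trait."""
        ratings = vars(self)  # dataclass fields as a name -> value dict
        return max(ratings, key=ratings.get)

profile = OceanProfile(0.8, 0.4, 0.3, 0.6, 0.7)
print(profile.dominant_trait())
```

Representing each person as five numbers is what makes the model convenient for the kind of segmentation described above: profiles become points that can be compared, clustered, and targeted.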
When a society jerks into extremism, so does its fashion.
perspecticide—the active deconstruction and manipulation of popular perception
The most effective form of perspecticide is one that first mutates the concept of self. In this light, the manipulator attempts to “steal” the concept of self from his target, replacing it with his own. This usually starts with attempting to smother the opponent’s narratives and then dominating the informational environment around the target. Often this involves gradually breaking down what are called psychological resilience factors over several months. Programs are designed to create unrealistic perceptions in the targets that result in confusion and damage self-efficacy. Targets are …
People high on the narcissism scale are susceptible because they are more prone to feelings of envy and entitlement, which are strong motivators of rule-breaking and hierarchy-defying behavior. This means these targets will be more likely to develop an exaggerated suspicion of harassment, persecution, victimhood, or unfair treatment. This is the “low-hanging fruit” for initiating the subversion of a larger organization. Later, this learning would serve as one of the foundations for Cambridge Analytica’s work catalyzing an alt-right insurgency in America.
I told him that if you can’t define something, you can’t measure it, and if you can’t measure it, you can’t know if you are changing it.
To make a population more resilient to extremism, for example, you would first identify which people are susceptible to weaponized messaging, determine the traits that make them vulnerable to the contagion narrative, and then target them with an inoculating counter-narrative in an effort to change their behavior. In theory, of course, the same strategy could be used in reverse—to foster extremism—but that was not something I had even considered.
In psychological warfare, the weak points are flaws in how people think. If you’re trying to hack a person’s mind, you need to identify cognitive biases and then exploit them.
In psychology, this is called priming. And this is, in essence, how you weaponize data: You figure out which bits of salient information to pull to the fore to affect how a person feels, what she believes, and how she behaves.
This is in part because friends, colleagues, spouses, and parents typically see only part of your life, where your behavior is moderated by the context of that relationship.
This is what is known as “white fragility”: White people in North American society enjoy environments insulated from racial disadvantages, which fosters an expectation among white people of racial comfort while lowering their ability to tolerate racial stress. In our research, we saw that white fragility prevented people from confronting their latent prejudices. This cognitive dissonance also meant that subjects would often amplify their responses expressing positive statements toward minorities in an effort to satiate their self-concept of “not being racist.”
Bannon transformed CA into a tool for automated bullying and scaled psychological abuse. The firm started this journey by identifying a series of cognitive biases that it hypothesized would interact with latent racial bias.
Users were told by Facebook that the enterprise was about bringing people together. But Facebook’s “community” was building separate neighborhoods just for people who look like them. As the platform watched them, read their posts, and studied how they interacted with their friends, its algorithms would then make decisions about how to classify users into digital neighborhoods of their kind—what Facebook called their “Lookalikes.” The reason for this, of course, was to allow advertisers to target these homogeneous Lookalikes with separate narratives just for people of their kind. Most users …
If Facebook is a “community,” it is a gated one.
From social isolation comes the raw material of both conspiracism and populism: mistrust.
Because the objective of this hostile propaganda is not simply to interfere with our politics, or even to damage our companies. The objective is to tear apart our social fabric. They want us to hate one another. And that division can hit so much harder when these narratives contaminate the things we care about in our everyday lives—the clothes we wear, the sports we watch, the music we listen to, or even the coffee we drink.
With most reputable news sources now behind paywalls, we are already seeing information inch toward becoming a luxury product in a marketplace where fake news is always free.
In this new economy of surveillance capitalism, we are the raw materials. What this means is that there is a new economic incentive to create substantial informational asymmetries between platforms and users. In order to be able to convert user behavior into profit, platforms need to know everything about their users’ behavior, while their users know nothing of the platform’s behavior.
If we exist in an environment that always watches, remembers, and labels us, according to conditions or values outside our control or awareness, then our data selves may shackle us to histories that we prefer to move on from.
Privacy is the very essence of our power to decide who and how we want to be. Privacy is not about hiding—privacy is about human growth and agency.
An “Internet utility” is a service, application, or platform whose presence has become so dominant on the Internet that it becomes affected with the public interest by the very nature of its own scale.

