Kindle Notes & Highlights
Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to product or service improvement, the rest are declared as a proprietary behavioral surplus, fed into advanced manufacturing processes known as “machine intelligence,” and fabricated into prediction products that anticipate what you will do now, soon, and later. Finally, these prediction products are traded in a new kind of marketplace for behavioral predictions that I call behavioral futures markets. Surveillance capitalists have . . .
Eventually, surveillance capitalists discovered that the most-predictive behavioral data come from intervening in the state of play in order to nudge, coax, tune, and herd behavior toward profitable outcomes. Competitive pressures produced this shift, in which automated machine processes not only know our behavior but also shape our behavior at scale. With this reorientation from knowledge to power, it is no longer enough to automate information flows about us; the goal now is to automate us.
surveillance capitalism births a new species of power that I call instrumentarianism. Instrumentarian power knows and shapes human behavior toward others’ ends. Instead of armaments and armies, it works its will through the automated medium of an increasingly ubiquitous computational architecture of “smart” networked devices, things, and spaces.
Google invented and perfected surveillance capitalism in much the same way that a century ago General Motors invented and perfected managerial capitalism.
Surveillance capitalism operates through unprecedented asymmetries in knowledge and the power that accrues to knowledge. Surveillance capitalists know everything about us, whereas their operations are designed to be unknowable to us. They accumulate vast domains of new knowledge from us, but not for us.
Nonetheless, the danger of closing doors to rooms that will no longer exist is very real. The unprecedented nature of surveillance capitalism has enabled it to elude systematic contest because it cannot be adequately grasped with our existing concepts.
Surveillance capitalism is not technology; it is a logic that imbues technology and commands it into action. Surveillance capitalism is a market form that is unimaginable outside the digital milieu, but it is not the same as the “digital.”
Just as Ford tapped into a new mass consumption, Apple was among the first to experience explosive commercial success by tapping into a new society of individuals and their demand for individualized consumption.
the digital era finally offered the tools to shift the focus of consumption from the mass to the individual, liberating and reconfiguring capitalism’s operations and assets.
“In the age of new consensus financial policy stabilization,” one US economist wrote, “the economy has witnessed the largest transfer of income to the top in history.”
In this context, Piketty cites the ways in which financial elites use their outsized earnings to fund a cycle of political capture that protects their interests from political challenge.51 Indeed, a 2015 New York Times report concluded that 158 US families and their corporations provided almost half ($176 million) of all the money that was raised by both political parties in support of presidential candidates in 2016, primarily in support of “Republican candidates who have pledged to pare regulations, cut taxes . . . and shrink entitlements.”
Many scholars have taken to describing these new conditions as neofeudalism, marked by the consolidation of elite wealth and power far beyond the control of ordinary people and the mechanisms of democratic consent.55 Piketty calls it a return to “patrimonial capitalism,” a reversion to a premodern society in which one’s life chances depend upon inherited wealth rather than meritocratic achievement.56
This is the existential contradiction of the second modernity that defines our conditions of existence: we want to exercise control over our own lives, but everywhere that control is thwarted. Individualization has sent each one of us on the prowl for the resources we need to ensure effective life, but at each turn we are forced to do battle with an economics and politics from whose vantage point we are but ciphers.
We live in the knowledge that our lives have unique value, but we are treated as invisible.
Why did Google’s Gmail, launched in 2004, scan private correspondence to generate advertising?
Facebook founder Mark Zuckerberg shut the program down under duress, but by 2010 he declared that privacy was no longer a social norm and then congratulated himself for relaxing the company’s “privacy policies” to reflect this self-interested assertion of a new social condition.
Online “contracts” such as terms-of-service or terms-of-use agreements are also referred to as “click-wrap” because, as a great deal of research shows, most people get wrapped in these oppressive contract terms by simply clicking on the box that says “I agree” without ever reading the agreement.68
A new breed of economic power swiftly filled the void in which every casual search, like, and click was claimed as an asset to be tracked, parsed, and monetized by some company, all within a decade of the iPod’s debut.
Privacy, they said, was the price one must pay for the abundant rewards of information, connection, and other digital goods when, where, and how you want them.
This new market form is a unique logic of accumulation in which surveillance is a foundational mechanism in the transformation of investment into profit.
Surveillance capitalism commandeered the wonders of the digital world to meet our needs for effective life, promising the magic of unlimited information and a thousand ways to anticipate our needs and ease the complexities of our harried lives.
Under this new regime, the precise moment at which our needs are met is also the precise moment at which our lives are plundered for behavioral data, and all for the sake of others’ gain.
Surveillance capitalism has taken root so quickly that, with the exception of a courageous cadre of legal scholars and technology-savvy activists, it has cunningly managed to evade our understanding and agreement.
The new harms we face entail challenges to the sanctity of the individual, and chief among these challenges I count the elemental rights that bear on individual sovereignty, including the right to the future tense and the right to sanctuary.
“The operator of a search engine is liable to affect significantly the fundamental rights to privacy and to the protection of personal data. In the light of the potential seriousness of the interference” with those interests, “it cannot be justified by merely the economic interest which the operator of such an engine has in that processing.”
“The Luxembourg Court felt that free flow of information matters, but not as much, ultimately, as the safeguarding of dignity, privacy, and data protection in the European rights regime.”
The court conferred upon EU citizens the right to combat, requiring Google to establish a process for implementing users’ de-linking requests and authorizing citizens to seek recourse in democratic institutions, including “the supervisory authority or the judicial authority . . .”
In reasserting the right to be forgotten, the court declared that decisive authority over the digital future rests with the people, their laws, and their democratic institutions. It affirmed that individuals and democratic societies can fight for their rights to the future tense and can win, even in the face of a great private power.
If the digital future is to be our home, then it is we who must make it so. We will need to know. We will need to decide. We will need to decide who decides. This is our fight for a human future.
In other words, Google would no longer mine behavioral data strictly to improve service for users but rather to read users’ minds for the purposes of matching ads to their interests, as those interests are deduced from the collateral traces of online behavior. With Google’s unique access to behavioral data, it would now be possible to know what a particular individual in a particular time and place was thinking, feeling, and doing.
Google’s invention revealed new capabilities to infer and deduce the thoughts, feelings, intentions, and interests of individuals and groups with an automated architecture that operates as a one-way mirror irrespective of a person’s awareness, knowledge, and consent, thus enabling privileged secret access to behavioral data.
Google would now secure more behavioral data than it needed to serve its users. That surplus, a behavioral surplus, was the game-changing, zero-cost asset that was diverted from service improvement toward a genuine and highly lucrative market exchange.
It dismissed the moral and legal content of individual decision rights and recast the situation as one of technological opportunism and unilateral power.
Most of all, he credited the discovery of behavioral surplus as the game-changing asset that turned Google into a fortune-telling giant . . .
In the new operation, users were no longer ends in themselves but rather became the means to others’ ends.
The last thing that Google wanted was to reveal the secrets of how it had rewritten its own rules and, in the process, enslaved itself to the extraction imperative. Behavioral surplus was necessary for revenue, and secrecy would be necessary for the sustained accumulation of behavioral surplus.
As Schmidt told the New York Times, “You need to win, but you are better off winning softly.”
Google policies had to enforce secrecy in order to protect operations that were designed to be undetectable because they took things from users without asking and employed those unilaterally claimed resources to work in the service of others’ purposes.
Sandberg understood that through the artful manipulation of Facebook’s culture of intimacy and sharing, it would be possible to use behavioral surplus not only to satisfy demand but also to create demand.
we are the objects from which raw materials are extracted and expropriated for Google’s prediction factories. Predictions about our behavior are Google’s products, and they are sold to its actual customers but not to us. We are the means to others’ ends.
that ignorance is a condition of this ubiquitous rendition; that decision rights vanish before one even knows that there is a decision to make; that there are consequences to this diminishment of rights that we can neither see nor foretell;
Google’s machine intelligence capabilities feed on behavioral surplus, and the more surplus they consume, the more accurate the prediction products that result.
Google has learned to be a data-based fortune-teller that replaces intuition with science at scale in order to tell and sell our fortunes for profit to its customers, but not to us.
Surveillance capitalism’s profits derive primarily from these behavioral futures markets.
“People will generate enormous amounts of data. . . . Everything you’ve ever heard or seen or experienced will become searchable. Your whole life will be searchable.”
human experience is subjugated to surveillance capitalism’s market mechanisms and reborn as “behavior.” These behaviors are rendered into data . . .
The commodification of behavior under surveillance capitalism pivots us toward a societal future in which market power is protected by moats of secrecy, indecipherability, and expertise.
We are the native peoples now whose tacit claims to self-determination have vanished from the maps of our own experience.
Digital dispossession is not an episode but a continuous coordination of action, material, and technique, not a wave but the tide itself.
One way that Google’s founders institutionalized their freedom was through an unusual structure of corporate governance that gave them absolute control over their company.

