
The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power

The challenges to humanity posed by the digital future, the first detailed examination of the unprecedented form of power called "surveillance capitalism," and the quest by powerful corporations to predict and control our behavior.

In this masterwork of original thinking and research, Shoshana Zuboff provides startling insights into the phenomenon that she has named surveillance capitalism. The stakes could not be higher: a global architecture of behavior modification threatens human nature in the twenty-first century just as industrial capitalism disfigured the natural world in the twentieth.

Zuboff vividly brings to life the consequences as surveillance capitalism advances from Silicon Valley into every economic sector. Vast wealth and power are accumulated in ominous new "behavioral futures markets," where predictions about our behavior are bought and sold, and the production of goods and services is subordinated to a new "means of behavioral modification."

The threat has shifted from a totalitarian Big Brother state to a ubiquitous digital architecture: a "Big Other" operating in the interests of surveillance capital. Here is the crucible of an unprecedented form of power marked by extreme concentrations of knowledge and free from democratic oversight. Zuboff's comprehensive and moving analysis lays bare the threats to twenty-first century society: a controlled "hive" of total connection that seduces with promises of total certainty for maximum profit--at the expense of democracy, freedom, and our human future.

With little resistance from law or society, surveillance capitalism is on the verge of dominating the social order and shaping the digital future--if we let it.

Table of contents

1. Home or exile in the digital future

2. August 9, 2011: Setting the stage for Surveillance Capitalism
3. The discovery of behavioral surplus
4. The moat around the castle
5. The elaboration of Surveillance Capitalism: Kidnap, corner, compete
6. Hijacked: The division of learning in society

7. The reality business
8. Rendition: From experience to data
9. Rendition from the depths
10. Make them dance
11. The right to the future tense

12. Two species of power
13. Big Other and the rise of instrumentarian power
14. A utopia of certainty
15. The instrumentarian collective
16. Of life in the hive
17. The right to sanctuary

18. A coup from above


691 pages, Hardcover

First published October 4, 2018


About the author

Shoshana Zuboff

23 books · 605 followers
Shoshana Zuboff is the Charles Edward Wilson Professor emerita, Harvard Business School. She is the author of In the Age of the Smart Machine: The Future of Work and Power and The Support Economy: Why Corporations Are Failing Individuals and the Next Episode of Capitalism. She received her Ph.D. from Harvard University and her BA from the University of Chicago.

Ratings & Reviews



Community Reviews

5 stars: 3,889 (38%)
4 stars: 3,776 (37%)
3 stars: 1,711 (16%)
2 stars: 518 (5%)
1 star: 171 (1%)
Trevor
1,293 reviews · 21.7k followers
May 17, 2019
I was talking to a friend at work about this book and we agreed it was both very good and very long, perhaps even too long. Sometimes, when you are at a symphony concert, the first movement will end with ‘da – da – da – daaaaaa’ and some people in the audience will clap, something that annoys all of those who know you are only meant to clap right at the end of the piece. Like I said, this book is very long and in three parts – and at the end of the first part I was getting ready to clap and thought, hang on, there seems to be an awful lot of this left.

This book is disturbing – and not just a little disturbing, I’ve been thinking of ways to do without Facebook and Google – that level of disturbing.

I think a large part of the problem people might have with this book is the same problem a lot of people have with advertising. I think most people think advertising might work on other people, other people being, by definition, a bit thick – but to suggest it might work on them is close to the biggest insult you can offer them. Sure they drink Coke – but not because they want to teach the world to sing in perfect harmony, but because, you know, it tastes good. People are so convinced they operate as free agents and that they know this because they base their decisions on a rational appraisal of the evidence before them, that the notion they might be ‘influenced’ by advertising has to be rejected outright because it would undermine their self-opinion and their notions of self-worth. All of which is particularly good news for the advertisers.

The author starts this book with a story about her house catching fire and while she was waiting for someone to come put it out, she rushed about ‘saving’ things. The problem was that once the fire brigade arrived she was the thing they felt they most needed to save. She had never been in a house fire before and so this unprecedented situation seemed much safer to her than it actually was. This is the situation she says we are in with surveillance capitalism. We have never had this level of surveillance before, we believe we are getting stuff for nothing (google, facebook and so on) but actually our lives are being used as the raw behavioural material that is being used to manipulate us in ways we can barely understand. And it will only get worse.

And this influence isn’t happening in ways we might think it is. That is, this isn’t about you writing to your mum that you have been thinking about buying a drone and then you getting lots of ads on your feed for places to buy drones. Yeah, I’m sure that shit happens, but it is nothing at all compared to what is really happening. There is a lovely bit in this where she discusses poker machines and how they have been designed by psychologists to provide the perfect environment of reward so that it stops being about winning and losing, but about getting you to continue to play. The longer you play, the more you lose. But the environment has been constructed so as to ensure that even looking away for a second feels like a kind of loss.

And this is about as good a description of facebook (or as someone I know calls it face-crack) as I can think of. Except with facebook the rewards are to do with your own self-esteem, so looking away is even harder.

And this isn’t even the worst of it. I’ve read quite a few books on behavioural economics – essentially it says that you can provide nudges to people and they will behave in ways that you want them to. This is generally presented as something good – like in Nudge itself, where people save more for their retirement if you construct a form in one way rather than in another. Or the example of changing the form for organ donation from opt-in to opt-out, with the rates of donation increasing accordingly. You know, who is going to argue with that? Saving for your retirement is a good thing, donating your organs is a good thing. Except, the people who run facebook and google actually don’t make their money from you donating your organs – they make it from selling your behavioural data to people who want you to buy shit you don’t need or who want to work out what frightens the shit out of you so they can get you to vote for Trump or for Brexit. And they did. And it works. Even if you didn’t vote for either of these, it still works. Democracy doesn’t work by shifting those at the ends of the distribution – it works by shifting those in the middle. But we are all left with the consequences. 48% of the people of Britain are living with the consequences of what looks from the outside like – holy Jesus...

What the gathering and analysis of the dross of our lives (aka data) allows is the most disturbing and powerful means imaginable for influencing us – and the worst of it is that we will not even know we are being influenced. In fact, this I-know-you-better-than-you-know-yourself is often presented as a feature of the new system – you’ll walk into a bar and the bartender will already know your favourite drink and might even have it ready for you. Yeah, thanks for that...

A lot of this book discusses notions of operant conditioning and therefore it is a reprise of something like Plato’s Republic – except his republic was to be run by the disinterested philosopher kings, people uninterested in money and worldly goods, who were judged (by whom?) to be the best and brightest and to be those worthy of organising society for the rest of us – even of telling us lies to make our little lives feel more comfortable. The republic we are entering with our eyes tightly closed is one where the philosopher kings have been replaced by the selfish and the greedy and we have given them the means to influence us in worlds they create and construct so that those worlds meet their own desires for us. I’ve always seen Plato’s Republic as a dystopia – but what we are living in has no comparison. It is terrifying beyond words.

I wish this book wasn’t nearly so long – I would recommend it wholeheartedly then. You still do need to read it, though. Sorry.
Dominik
112 reviews · 74 followers
March 22, 2019
An important, albeit flawed, book. Viewing the rise of Google and Facebook through the lens of sociology, this makes for some heavy reading as one swims among the book's unique vocabulary ("the will to will," "division of learning in society," "double movement," "shadow text," "extraction imperative," "prediction imperative"). Eventually the phrases begin to make their own sort of strange sense, but it definitely feels foreign. Perhaps I haven't read much sociology, so this failing may be my own. I do wish the book were written in simpler, less highfalutin prose.

Surprisingly, there are no data at all in this book. None. I went in expecting analyses of the economic gains from Google and Facebook contrasted with some approximation of the social costs imposed by them. Or of small businesses enabled by these platforms. Or any economic analysis at all. There's none. This isn't that sort of book. Google and Facebook's "dispossession" of "people's lives" is presented as obvious. Showing a targeted ad based on a search or a user profile is apparently, to the author, something tantamount to stealing their very soul.

The book's arguments are weakened by its purple prose. "Rivers of blood" are not, in fact, flowing because of surveillance capitalism. We digital natives are not, in actuality, being mercilessly slaughtered by Conquistadors Page, Zuckerberg, and Nadella (the author explicitly makes this analogy over several pages). I understand the author's desire to provoke "astonishment and indignation" but I found these hyperbolic passages less than persuasive.

Despite the book's heft, there's much it doesn't cover. There's no discussion of pre-internet data brokers (credit card companies and credit bureaus, anyone?). There's no consideration of the privacy controls that Google and Facebook offer (much of the "extraction imperative" can be entirely turned off in each one's privacy settings), nor that many valuable services are provided for free that do directly improve folks' lives. Nor any examination of other, far darker players in ad tech and social media. There's no discussion of k-anonymity, sparingly little on encryption, one fleeting mention of blockchain (expressing disapproval that smart contracts subvert the fundamental human values imbued in the ancient practice of contracts). The treatment of ML is decidedly non-technical and more akin to seeing ML as magic fairy dust than wrangling with its very real shortcomings (surprisingly, ML fairness doesn't get a mention).

The author ends by envisioning a nightmare scenario of B.F. Skinner's wildest dreams where, like players in Pokemon Go, our entire lives become subject to the careful nudges and variable rewards promulgated by the "high priesthood" of data scientists carefully tuning the all-knowing, all-seeing machine learning system. Prediction becomes control and the machines take over "the human hive." Ooo... kay.

All that said, this book is still worth the read, if you can stomach the hyperbole and dense prose and can exercise empathy toward the author's very real fears of seeing democracy and free society swallowed up by powerful corporations that tap into and manipulate human hopes and fears. There is love poured into these pages, love of humanity in its culture and its unpredictable freedom, and that in itself makes the book worth the investment. At the same time though, there's little love for Google or Facebook or technology in general -- and quite a lot of fear, uncertainty, and doubt.

Disclosure: I am an employee of a big tech company, but the review above is solely my own opinion and not that of my employer. I've also tried to review the book without bias and through neutral eyes.
Lucas
345 reviews · 28 followers
January 30, 2019
I’m giving this book 2 stars, in hopes that the surveillance capitalists at Amazon will not recommend others like it to me.

In terms of research, this book deserves a higher rating. It is incredibly thorough and well sourced. But here is an example of a sentence the author, Shoshana Zuboff, uses: “This time, we have sent them into the raw heart of a rogue capitalism that amassed its fortune and power through behavioral dispossession parlayed into behavior modification in the service of others’ guaranteed outcomes.” All the great research she’s done is described in the most alarming wording possible. It is a book filled with hyperbolic conclusions that are often not justified by anything other than the alarming wording she uses.

I have two theories about why that is: 1) She assumes that anyone willing to read the book already agrees with her conclusion that the collection of data by Google represents nothing short of the end of individual freedom. So if everyone agrees, she doesn’t need to explain why something like Pokemon Go taking users to stores is bad, because it’s just so obvious. 2) She’s not actually writing for a contemporary audience, but an audience in the distant future when everything she warns about has come true. This might sound like I’m joking, but she talks in the introduction about her work being inspired by Marx, because he saw what capitalism would be when it was still an unprecedented phenomenon, and later says that one of her conclusions unites very well with Thomas Paine. So the stakes are very high for Zuboff.

Beyond the conclusions being what they are, the writing in general is too much for me. I usually don’t complain about writing being too dense/academic/pretentious, but here is another sentence that exists in this book: “Orwell’s chilling final passages fulfill the life of that dry seed planted at the turn of the century in Italy’s impoverished soil and nourished by war, deprivation, and humiliation to flower in the nightmare of Nazi Germany and the apocalypse of Stalin’s Russia, finally to bear fruit in Orwell’s imagination: a testament for all time to what Mussolini had called the “ferocious totalitarian will” and the souls on which it feeds.” It is a generally grating read, and it doesn’t help that the book is twice as long as it needs to be.

I’m not sure that all “behavior modification” by social media or data science is bad. (I should disclose that I work in data science, so ignore this whole review if you want). For example, a lot of people on the Goodreads app probably read more often than they would without the app, because of the social sharing aspect and the recommendation systems. This makes Amazon more money when people buy books from them. Is this bad? Most people on this app would probably agree that reading is good. Targeted advertising is not inherently bad either. For instance, I will never buy a truck. I don’t have anything against people who have trucks, but I have no interest in ever owning one. I hope every AI system in the world picks that sentence up, because it is my dream to watch a football game on the weekend and never see another truck commercial for the rest of my life.

Am I saying that there is nothing to be concerned about here? Of course not. As Zuboff points out, a lot of the data that companies have on consumers now leads to an unprecedented level of information and power. Power is vulnerable to abuse. Data can be used in ways that impair freedom and I’m concerned about that. That’s why I bought this book. But data can also be used in a lot of positive ways. Self-driving cars can virtually eliminate accidents from the road. Zuboff misleadingly quotes an internal memo at Google about self-driving cars and their ability to learn from collective car data storage, and uses it as an example of surveillance capitalists “lamenting” that people aren’t more like machines. It’s a very important subject, but Zuboff’s portrayal and conclusions are just way too much for me.
Michael
655 reviews · 966 followers
May 29, 2020
Paints a frightening portrait of the rise of mass surveillance since the start of the Information Age. In ornate, often opaque, prose Zuboff charts the development of a new form of global capitalism that aims to surveil all facets of human existence and, using vast stores of privately held, ruthlessly gathered data, predict and modify user behavior to align with desired commercial outcomes.
Anna
1,686 reviews · 636 followers
November 18, 2019
This will be a long review, so let me summarise it with tweet-like succinctness: ‘The Age of Surveillance Capitalism’ is Black Mirror for people who hate fun. I definitely mean that as a compliment. It synthesises and analyses a wide range of ideas I’ve come across in leisure and work reading during the past few years, mostly in articles online. As fragments, those ideas filled me with concern and confusion. Combined into the clear and systematic structure of a book, they fill me with dread, but the alleviation of confusion is very powerful. Zuboff sets out a convincing and shocking analysis of the recent turn global capitalism has taken towards intensive data-gathering, behavioural prediction, and pervasive surveillance. While I think it could have been equally effective at slightly shorter length, that is probably influenced by the unwieldiness of the hardback I got from the library. I really appreciated the measured pace and excellent explanations. Zuboff coins a number of useful descriptive phrases, none more helpful than that in the title. The vagueness of ‘late capitalism’ has always irritated me; ‘surveillance capitalism’ has a punchy accuracy. Zuboff is a great writer, with a consistent ability to identify key points without becoming reductive or sensationalist:

Surveillance capitalism’s ability to keep democracy at bay produced these stark facts. Two men at Google who do not enjoy the legitimacy of the vote, democratic oversight, or the demands of shareholder governance exercise control over the organisation and presentation of the world’s information. One man at facebook who does not enjoy the legitimacy of the vote, democratic oversight, or the demands of shareholder governance exercises control over an increasingly universal means of social connection along with the information concealed in its networks.

Zuboff centres her overall enquiry into surveillance capitalism on three fundamental questions: who knows? Who decides? And who decides who decides? The answers are disquieting, to say the least. It amazes me that so many people I know seem unconcerned about the amount of data the big five tech companies (Google, Apple, Facebook, Microsoft, Amazon) have about them and how it is used. Not only do these firms have far more data about us than we can readily understand, its individual value is irrelevant in comparison to the value of it all in aggregate. Zuboff uses the term ‘economies of scale and scope’ for this. Big data is valuable because machine learning models require vast amounts to produce useful results. For work reasons, I recently taught myself data mining and basic machine learning in R. It was alarming to realise how easy to use yet fundamentally opaque big data analytics are. Neural networks aren’t really analogous to human brains in structure or function. The real similarity is that in neither case is it known why you get the result you do. Moreover, like human brains, neural networks make mistakes. However, their mistakes are very different from those of humans, and generally depend on how much data they’ve been trained on and what forms it took.

Zuboff does not discuss such technicalities. If you want an introduction to machine learning, I suggest this youtube video. She is, rightly, more interested in the data that google and others gather to feed machine learning models, which then predict our behaviour in order to sell us stuff. Once limited to your computer, the imperative to gather more and more behavioural data increasingly invades daily life via the internet of things:

The very idea of a functional, effective, affordable product or service as a sufficient basis for economic exchange is dying. Where you might least expect it, products of every sort are remade by the new economic requirements of connection and rendition. Each is reimagined as a gateway to the new apparatus, praised for being ‘smart’ while traditional alternatives are reviled for remaining ‘dumb’. It is important to acknowledge that in this context, ‘smart’ is euphemism for rendition: intelligence that is designed to render some tiny corner of lived experience as behavioural data. Each smart object is a kind of marionette; for all its ‘smartness’, it remains a hapless puppet dancing to the puppet master’s hidden economic imperatives.

Zuboff is especially good at explaining how it came to this: how big tech seized a specific historic moment when neoliberal economics, the war on terror, and advances in information technology converged. The big five’s tactics for avoiding regulatory control or even admitting what they actually do are set out chillingly well. While all this has certainly been discussed before, it is expressed especially well here. A slew of short articles over years are hard to distil sense from, whereas this book sets out the situation with admirable clarity. Chapter eleven lists characteristics that have allowed surveillance capitalism to take root, despite the fact that the Western population consistently claim to value privacy. The range and impact of these characteristics certainly makes sense of how we got here: lack of precedent, declaration as invasion, historical context, fortifications, the dispossession cycle, dependency, self-interest, inclusion, identification, authority, social persuasion, foreclosed alternatives, inevitabilism, the ideology of human frailty, ignorance, and velocity.

The most novel part for me was an exploration of the philosophy underlying surveillance capitalism. Since big tech aggressively avoids articulating such a thing, based on the spurious claim that data is totally neutral, this was especially interesting. Zuboff labels it instrumentalism and contrasts it powerfully with totalitarianism:

Totalitarianism operated through the means of violence, but instrumentarian power operates through the means of behavioural modification, and this is where our focus must shift. Instrumentarian power has no interest in our souls or any principle to instruct. There is no training or transformation for spiritual salvation, no ideology against which to judge our actions. [...] It is profoundly indifferent to our meanings and motives. Trained on measurable action, it only cares that whatever we do is accessible to its ever-evolving operations of rendition, calculation, modification, monetisation, and control. [...] Totalitarianism was a political project that converged with economics to overwhelm society. Instrumentarianism is a market project that converges with the digital to achieve its own unique brand of social domination.

I particularly appreciated the link Zuboff made with behavioural economics and its rejection of the rationality assumption, while keeping all the other reductive and dubious assumptions of free market economics. The ‘nudge’ ethos of behavioural modification to optimise outcomes is entirely consistent with surveillance capitalism. Whenever I’ve read behavioural economics books over the years, the same questions come to mind: first off, why are you so amazed to have discovered very basic psychology? Secondly, whose behaviour are you nudging, why, and for whose benefit? This idea of nudging or tuning behaviour is deeply unsettling and contains potentially massive hidden contradictions, quite apart from its ethical implications. What if two companies in the big tech oligopoly try to push behaviour in different directions? Surely the vague aspiration of making the world run more smoothly and efficiently (whatever that means and for whom) is in conflict with the anger and violence social media stokes in politics?

I was slightly surprised that only towards the end of the book does Zuboff broach the corrosive political effects of social media, such as the spread of fake news and an adversarial, reductive, and angry political culture. In a way, she hardly needs to. The prior chapters set this up well, by explaining the ‘radical indifference’ that big tech has for the actual content it feeds to its users. The only aim is to increase revenues via a business model of maximising attention and engagement on the platform(s). If divisive, dangerous, and totally inaccurate material gets clicks and comments, then that’s good enough for facebook and google. They take zero responsibility for the consequences this has on politics, culture, and society, despite profiting massively from them. I’m actually glad this wasn’t mentioned earlier in the book, as it’s so depressing that it would have pulled focus from the economic and philosophical foundations beneath the surface.

As has probably become clear, I consider this a deeply thought-provoking and helpful book that has made my view of the world we live in a little clearer. That is the pinnacle of what you can hope for in non-fiction, in my view. Nonetheless, I didn’t agree with every word of it. Zuboff treats surveillance capitalism as a successor to industrial capitalism, stating several times that the latter wrecked the environment and now the former is wrecking the human soul. While I don’t disagree with this, I think surveillance capitalism is also making it much harder to deal with the consequences of industrial capitalism (which still exists as well! Smart phones don’t just manifest from the aether). Action to deal with climate change has been derailed by reactionary populist politics and a false equivalency between scientific research and conspiracy theories. The complex and long-term nature of environmental problems is totally unsuited to the acceleration and superficiality of social media. Moreover, surveillance capitalism is still capitalism, thus all about economic growth, increasing consumption, and wasteful energy use. I think these links should have been acknowledged a little more. Much like financial capitalism, surveillance capitalism is a parasite upon industrial capitalism; will it drain its host until they both die, I wonder?

I think the weakest material is in the final chapter, which considers how young people are growing up with pervasive internet surveillance that stunts their sense of self. This is more speculative and lacks the rigor and conviction of the other chapters. Which is not to say that I find the concept uninteresting or unimportant. Here, though, it is treated as something of an afterthought. The psychological effects of constant connectivity and a norm of performative content sharing, especially on children, deserve their own books. Mixing macro and micro-level analysis can be risky; this is a macro book and that is its great strength.

It is salutary to compare ‘The Age of Surveillance Capitalism’ with Paul Mason’s Postcapitalism: A Guide to Our Future, which I read in 2015. Mason covered some similar ground, but drew very different conclusions that now appear remarkably naive. The contradictions that he suggested would bring down neoliberalism are resolved by surveillance capitalism. Mason wrote, and I agreed when I read it, that big tech’s control over data was fragile and unsustainable. I no longer believe that; the past four years have seen consolidation and expansion of google and facebook’s control over data. Over the same period, it has become evident that such data can be put to dangerous purposes with a total absence of democratic accountability. According to free market economic theory, the infinite supply of data should make it worthless. Quite the opposite occurs, as data becomes more and more valuable as its scale and complexity increases, because it can be used to make quicker and more accurate behaviour predictions, and to influence behaviour. Certainly not in a free market, though. Google, amazon, and facebook are in unassailable economic positions. Any company that tries to compete is bought by them.

The only threats to their dominance come from outside the market: regulation, essentially. Breaking up their monopolistic positions is part of public discourse, for example the proposals of Elizabeth Warren, a potential Democratic presidential candidate in the US. As with oil companies, though, there is great reluctance to face the fundamental problem: their business model. Oil companies have no place in any world that takes climate change seriously, because we must stop burning oil. Likewise, pervasive surveillance and data gathering have no place in any world that values privacy. Reliance on secretive behavioural monitoring and modification should also stop, but a ban on them seems even further away than a ban on burning oil. At present they appear inextricably linked with the internet, just as energy systems seem inextricably linked with fossil fuels. In both cases, the two developed interdependently, but their linkage isn't inevitable. The possibility exists of other energy systems and other forms of internet. To my mind, the first step to imagining better is understanding the flaws in what we have.

I am more pessimistic and negative about social media than most people I know, quite possibly most people in general. While it can have positive consequences, the fact that it is optimised by a handful of companies to take our data and sell us shit makes it fundamentally flawed. The internet has a lot of potential to bring people together; social media as currently constituted is more likely to push them further apart. I wonder whether Trump could have become president without twitter and facebook? Frankly I doubt it. The irony of my posting this on a social media site owned by amazon is not lost on me; this is how we live now. ‘The Age of Surveillance Capitalism’ takes the reader beyond the endless noise of twitter et al in an attempt to explain the underlying theory and structure of 21st century capitalism. I found it an invaluable guide that solidified ideas I already had, as well as introducing new concepts and raising new questions. Be warned: I’m probably not going to shut up about this one for a long while. Probably best to read it now, so you can make up your own mind.
Mehrsa
2,234 reviews · 3,657 followers
May 11, 2019
THIS is the book I have been waiting to read on the new internet era. It's a mix of Neil Postman, Marshall McLuhan, and Huxley. If I have one complaint, it's that she gets super carried away with metaphors and flowery language--it was actually quite annoying. But Zuboff takes a long view of history and situates the new era of surveillance capitalism within parallel trends in markets, culture, and law. She makes some brilliant observations--her comparison of surveillance capitalism with totalitarianism was especially interesting. Sometimes I think she overstates the dangers, but it's a nice and necessary challenge to the techno-utopians and the denialists who claim that nothing is new. She also challenges some of the older fears about the surveillance state. She's not worried about state surveillance. She's worried about the intrusion of markets into all of our private spheres. I appreciated her references to Polanyi and Arendt because it was useful to connect political totalitarianism and the great transformation of the markets into newer spheres.
Profile Image for Henk.
851 reviews
January 18, 2022
Sweeping and interesting, but also repetitive and alarmist. Definitely thought-provoking, but more concrete examples and fewer comparisons to totalitarianism would have aided my appreciation of this book.
Assistance and personalization being the poetry of being able to sell more to you.

First of all, definitely check the privacy settings tab on social media and Google (as I did, so kudos for spurring the reader into action) and you’ll be shocked! Facebook in my case received information from 469 external sites to make ads more personal. The interests Google coupled to my account ran to more than 7 web pages of two columns each.
The statement We now pay for our domination that Shoshana Zuboff uses early on in the book definitely feels more real after just a brief glance behind the curtain of Big Tech.

Rise to prominence
The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power starts off with the advent of Google in the first tech boom, and how quickly the idea of making knowledge more accessible turned into something that needed to be monetized.
With this reorientation from Knowledge to Power, as Zuboff calls it, there came into being a positive feedback loop: as Google acquired more data on its users, it could also make its services better and hence increase its competitive advantage over other search engines. As a "surplus" to this user data, the company could target users more effectively, essentially better predicting what kind of ads would work on what kind of users. Zuboff uses the term "behavioral futures markets" for this, with Google and other Big Tech companies essentially using the datasets they've built about us as a resource for better predictions of future behavior. We are not the customers of these companies, the advertisers are, and Zuboff even defines us as a human resource (we are the means to others’ ends), to the end of being able to sell better predictions of behavior. She uses the term Google’s prediction factories, with the bets on future decisions and choices, “harvested” from user interactions, being the end product.
It's a development she defines as a new phase of capitalism: from industrial capitalism exploiting natural resources to surveillance capitalism exploiting human social interactions and behavior.

Apple’s inversion of capitalism (from mass-produced and low-cost to an individualized and changeable experience) with the iPod and iTunes store is also detailed in this section of the book. Zuboff mentions research showing that it would have taken 76 days to read all the terms of service one encountered in a year of surfing the internet in 2015. The internet still feels new, but cookies were already invented in 1994. But the main focus definitely is Google and, later on, Facebook.

The aforementioned prediction products can be made more valuable by increasing their breadth (to all knowledge as Google calls it or by connecting everyone as Facebook sees it) or depth (by tying sources of information to each other and building ever more detailed profiles of consumers). Furthermore, what greater certainty of future choices can be gained than through actively influencing behavior? This control of action through nudging and behavioral modification is something that is detailed later on in the book.

The decrease of privacy at an individual level versus the increase of inscrutability at a corporate level, also fueled by the war on terror and the support the US government sought from Big Tech firms, is another topic Zuboff touches upon. The pivot to surveillance, and the tying together of databases and information, was seen as vital for security. There were 199 data mining projects in the US government and intelligence community in 2004, and the CIA even set up a Silicon Valley venture capital fund.

Google is said to have been developing artificial intelligence since 2003, developing search to further AI instead of the other way around.
Better forecasts of click-through rates of online advertisements are the main sought-after outcome. Google doesn’t sell user data but predictive products based on the data it has acquired. Hence it is taking things from outside the market sphere (emergent social interaction data) and transforming them into new “surveillance” asset classes.

Above the law and moving into the physical world
Old institutions, like laws, haven’t kept up with technological development. Laws can’t be right if they are 50 years old, written long before the internet - a quote from a developer conference by one of the Google founders.

Eric Schmidt of Google being a senior advisor to the Obama campaigns doesn't help in regulating the industry. By 2016, 250 Google employees had joined government advisory boards (or government officials had joined Google). Google is also the corporation with the highest lobbying budget in the US and the second highest in the EU.
Schmidt defines Google as a customer satisfaction company, expanding into 150 different products to get a more complete view of human behavior and hence increase the quality of its forecasts of future behavior. Android licensing is a good example: provided for free to get the data of mobile phone users, including location, and leading to the Google Play store, all resulting in more user data being available to the company.

Google Street View cars captured personal wifi router information, including passwords, emails and URLs, with Google only fined USD 25,000, plus a settlement of USD 7,000,000 in the US and a fine of EUR 150,000 in the EU. This part of the book made me think a lot of the strategies Uber used to get into cities: make people familiar with its products, making any resistance by municipalities hard, as detailed in Super Pumped: The Battle for Uber.

This move into the physical world leads Zuboff to say that places and people must be known before they can be controlled, making comparisons with Spanish conquistadors mapping America. The Terms of Service agreements imposed on users are similar to the declarations the Spanish conquistadors read to natives before taking their lands.

In this broader trend, with an almost inevitable rise to prominence of surveillance capitalist corporations, the total flop of Verizon's acquisitions of Yahoo! and AOL is something that seems to be missed. Still, it's a compelling and interesting narrative.

Behavior and certain outcomes, the holy grail
Nudging, and hence influencing decisions directly, hugely increases the accuracy of forecasts of future behavior; this is the next step now that the companies have broad and deep information about their users. Make the future for the sake of predicting it, in some variant, is something we have all stumbled across in our corporate environments as a kind of encouraging slogan, but here it is taken to a whole other level.

Deloitte, Capgemini, Kearney and McKinsey are quoted in respect to insurance advisory work.
Data companies are moving from being suppliers to car companies, to car companies becoming suppliers of car usage and location data.

Technological determinism: side effects are blamed on technology and algorithms instead of being the product of decisions made by people at Google, Facebook and other tech firms. Inevitability as a response to opposition.

Accept the terms and conditions or receive a severely less usable product or service.
A Nest thermostat, including all the parties it delivers data to, would require the review of 1,000 contracts/terms of service. Zuboff calls this The dictatorship of no alternatives.
Wearables are becoming ubiquitous, cheap and non-intrusive, bolstering the ever more granular capture of user data. Google (Now assistant) needing to know you better, to know your needs before you have realized you have them, is an actual professed purpose by Google executives.
Children's toys upload conversations with children to Nuance.
Alexa is being sold to third-party developers like Android was, making it more ubiquitous.
Personality and emotions are recorded and classified to maximize advertising effectiveness.
Facebook has run more than 1,000 experiments on users' News Feeds, without an oversight board on the experiments.

Future tense
In the last part of the book my enjoyment started to tank; my observations, like Zuboff's narrative, become more fragmentary.

The comparison to totalitarianism does not aid the book. I would also say that the whole 700-page thesis would be stronger with more specific current examples and fewer general statements on the will to will, the right to the future tense and other terms invented by Zuboff, which are repeated quite often. Panvesive - is this even a word?

Would the chapters about B.F. Skinner's work not be better placed at the start of the book as a kind of theoretical basis? Now it feels very weird to return to something decades old after dealing with the Internet of Things.
Freedom is ignorance, an accident to be remedied by knowledge - unaccounted-for human behavior compared to the weather, previously unknown but now largely predictable.
The problem of privacy, as some thinkers even call it.

The positive effects of Microsoft’s Internet of Things Edge on factory operations seem very benign; is the extrapolation to society as a whole not a bit extreme?
Collective knowledge superseding the total liberty of choice of individuals is not new; it’s the basis of civilization, in a sense.
Who decides the greater good? remains a valid question.

Free will is acknowledged as existent, but manifest in only a few percent of behaviors, hence not needing to be modelled by surveillance capitalist companies.

Living in the hive, as we are driven to what is predictable, common and “better” from the perspective of getting certain outcomes.

Isn’t it a bit simplistic to cast the decrease of trust in society as an outcome of social media?
GM employed more people during the Great Depression than Google did in 2019.
The link between the American revolution and consumerism in the conclusion is rather tenuous.

A chilling book, sometimes marred by too much invented jargon and theoretical concepts, but definitely clear-eyed on the dangers of the direction we find ourselves heading in. Go check your privacy settings, guys!
Profile Image for Emily B.
426 reviews · 421 followers
March 28, 2021
‘Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioural data’

This was definitely an informative, illuminating and yet unsettling read. I feel everyone could benefit from reading this thought-provoking book. However, it was also a very long read and I feel it could have been condensed somewhat.
Profile Image for Laura Noggle.
677 reviews · 387 followers
June 21, 2019
“At its core, surveillance capitalism is parasitic and self-referential. It revives Karl Marx’s old image of capitalism as a vampire that feeds on labor, but with an unexpected turn. Instead of labor, surveillance capitalism feeds on every aspect of every human’s experience.”

I can't stop thinking about this book, or recommending it enough. Awareness is the first line of defense.

"Consider that the internet has become essential for social participation, that the internet is now saturated with commerce, and that commerce is now subordinated to surveillance capitalism. Our dependency is at the heart of the commercial surveillance project, in which our felt needs for effective life vie against the inclination to resist its bold incursions. This conflict produces a psychic numbing that inures us to the realities of being tracked, parsed, mined, and modified."

This book is amazing, terrifying, and absolutely essential if you value even the abstract idea of privacy.

So many notes, will be digesting for a while — full review to come.

“If democracy is to be replenished in the coming decades, it is up to us to rekindle the sense of outrage and loss over what is being taken from us . . . the human expectation of sovereignty over one’s own life and authorship of one’s experience.”

"Everyone needs to read this book as an act of digital self-defense."
- Naomi Klein

"Das Kapital of the digital age."
- Hugo Rifkind, The Times

Most Recently: In Court, Facebook Blames Users for Destroying Their Own Right to Privacy
Profile Image for Murtaza.
664 reviews · 3,401 followers
April 8, 2019
A few years ago I read Yuval Noah Harari's book Homo Deus, a whimsical look at our looming technological dystopia. Harari's book struck me as being happily resigned to the end of human freedom and indeed the end of humanity as we know it. This book could be described as the pessimistic and despairing counterpart to Harari's work. It goes over many of the same themes: the predictive power of Big Data, the loss of human freedom and the intrusion of surveillance technology into every corner of our lives. These things are all genuinely bad and troubling.

Having said that, I found the author's analysis to be somewhat overly bleak. There is a fundamental trade-off in having a free service like Google Maps. We get powerful, unprecedented assistance in navigation. In turn, they get our data, which they use to refine their predictive models of human behavior. This probably isn't a fair trade once you understand the full scope of what you are losing, and maybe we should all demand a new modus vivendi. The book fails to describe it as a trade-off, however. Instead it is depicted as us being literally conquered and enslaved by a foreign force, repeatedly analogized to the conquest of the Tainos by the Spanish conquistadors. This hyperbole is woven throughout the generally dense prose of the book. There is tons of information here, but it doesn't necessarily say much that will be new to a relatively informed audience. I found this a bit disappointing given the breathless reception that the book has received. The repeated invocations of philosophers and renaissance poetry also felt a bit overwrought.

Having said that, I am in sympathy with much of the book's argument. We definitely have regressed to some sort of feudal economic arrangement, even while retaining the minds and desires of modern people. No wonder so many feel unhappy, despite their relative material wealth. The ubiquitous presence of sensory and data tracking equipment is setting human beings up for something unimaginable. Every movement, thought and impulse is on the way to being tracked and recorded in a powerful predictive text that can only be read by our new tech industry overseers. We are on the way to becoming laboratory rats in a maze designed to fine-tune our behavior, mainly for the purpose of controlling us and extracting our wealth. Every new gadget and app, often given for free or sold at cost price, is a new spy intended to capture as much behavioral data as possible. You don't need to enjoy the author's lurid prose to see that her words contain some truth.

Are there any solutions here? Only a vague gesture towards the need for more democracy. While that may be true, I found it to be a predictable and somewhat lazy way to conclude the book. I feel it is not right to hammer out hundreds of pages of dire information without even formulating a plausible solution. Yes, the problems articulated are real and critical. As the author acknowledges, though, the tide is so overwhelming that most have resigned themselves to drift along its course. How do we counteract this feeling of resignation? That would be a significant question to answer.

Were it possible I would give this book 2.5 stars out of five. I would instead recommend any of the numerous essays on this subject out there, as well as Harari's more engaging overview.
Profile Image for Gary Barnes.
47 reviews · 3 followers
February 7, 2020
Ms Zuboff has a number of outstanding points to make in this weighty tome. Unfortunately she seems to have attempted to make them in Klingon. A 250-page book without the repetitive, dense, unnecessarily high-flown prose would have been perfectly okay. As it is, this book will go down as a laborious, soul-destroying pile of paper. 5 stars for the content; deduct three for the writing style.
Profile Image for Paul.
789 reviews · 17 followers
February 14, 2020
The reviews of this book were very positive and the blurb suggested it was just the book I was looking to read. "...as surveillance capitalism advances from Silicon Valley into every sector of the global economy, she brings its consequences to life".

Unfortunately she doesn't. Over almost 700 pages (including 140 pages of footnotes to highlight the amount of research that has gone into this book) we do not get to see the consequences of living with the always-on devices in our homes, on our wrists and in our cars. You feel a critique of Facebook, Google, Amazon, etc. and their data-gathering techniques, and how they use them, would be like shooting fish in a barrel, but repeatedly she misses the target. We get instead a history of behavioural psychology, and endless flashbacks to comparisons with Ford's production lines.

The language is full of jargon and neologisms and is endlessly circular. There are a few interesting insights hidden in there, such as the Chinese model of collecting personal data on millions of people and using it to affect life chances and advancement, but this is dismissed as unworkable, and "the Chinese are used to being observed anyway". There seems no curiosity about how much our western governments are doing the same thing to profile their citizens. The repeatedly examined catchy examples are of Pokémon Go directing players to businesses and an automated vacuum cleaner secretly mapping your rooms. There is little mention of surveillance capitalism's more sinister uses in putting always-on microphones into millions of households. Also, Edward Snowden's and Cambridge Analytica's exposés of big data's political uses, e.g. in trying to swing elections and referendums, get barely a mention.

The conclusion feels like a good introduction, summing up what we already know, but that is the point we only reach by the end of the book. There is a sense of naivety in the author, who seems to have only just realised that our actions, decisions, the routes we take and our private conversations are being catalogued. She talks about teenagers being in the vanguard of those observed by Facebook (I would suggest teenagers are elsewhere, whilst Facebook's biggest audience is aged 20-40), and makes other slightly wrong claims about how technology is used and accepted in the day-to-day lives of many people, who know they are being watched but don't care. This book fails to let us know why we should care.

I honestly think most newspaper reviewers read the intro, looked at the chapter titles and then read the conclusion, not noticing that there is little new in between.
Profile Image for Alex Orr.
143 reviews · 7 followers
February 5, 2021
This is so poorly written, so horrendously edited (was it edited?), and so stupefyingly redundant that it's really hard to adequately capture in words. You really have to slog through it yourself to understand the depths of its faults. So, let me save you the frustration. The author basically is saying that the aim of much of our current cutting-edge technology is to gather as much data about us as possible through everything from smart homes, smart cars, and (obviously) social media in order to construct a perfect simulacrum of ourselves, in order to not just sell us more stuff, but to mold our behaviors...nay...our whole lives...as the companies deploying this tech see fit. That's about it. This book is a Ted Talk gone off the rails in the hands of an author who never met an ancillary idea she didn't want to include, and who really, really, REALLY loves to restate her hypotheses over, and over, and over... In the third part of the book, she literally devotes a whole chapter to summarizing the previous chapter. Every second or third paragraph exists solely to restate the preceding paragraphs. Whole paragraphs exist only to lay out what she will tell us in the following paragraphs, and then later (you guessed it) she devotes whole paragraphs to summarizing the previous paragraphs. As for her central ideas? Ehhh.... Look, if you're aware of this book then you're already well aware of her central ideas, because they're some of the most commonly discussed and hotly debated in the realms of big data, online privacy, and the growing sway Silicon Valley holds over our lives. The people who are unaware of these issues really could use a shorter, tighter, and just plain easier-to-read book to perhaps wake them up. The biggest problem I found (other than this being an atrociously written and edited book) is the author's extreme confidence in her beliefs about how this is all going to play out.
By the end of the book she is talking of a near future defined by a sort of techno-fascism in which the individual self has been exterminated from the world. Subtlety is not in her intellectual toolkit. If you decide to tackle this massive misfire, at least take some comfort in knowing that it gets easier to read, because you become far more comfortable skimming dozens of pages in a matter of seconds, since most of it is usually just stuff she said earlier in the book, repeated, often identically, for no other reason than...well...I don't know. This book reads like a wild and woolly first draft, not like a thoughtful, streamlined, well-laid-out final version.
Profile Image for Hadrian.
438 reviews · 222 followers
June 27, 2020
[O]ur lives are scraped and sold to fund their freedom and our subjugation, their knowledge and our ignorance about what they know.

This is the kind of outrage that could be expected from a Berkeley or NYU sociology department, but seeing it from a Professor Emerita of the Harvard Business School demands attention. Zuboff speaks of "overthrow", the end of democracy as we know it, the reshaping of all human nature. She turns to a metaphor of the Taino before meeting Columbus, as they were wholly ignorant of the atrocities of war he was about to unleash upon them.

"Surveillance capitalism", a term the author coined, is not all imagined. It is still a term that can refer to the bargain that users make when they use free websites such as this. The users of a website are not the consumer, they are the product, as their use data and product history is sold to big marketers or advertising firms. Zuboff calls this 'behavioral surplus'. On Goodreads, this is benign - I can talk about books with friends for free, and I get recommended new ones. No harm done.

The problem lies in what is referred to as the "internet of things": household objects and new devices which track more of the user's location and behavior. Exercise equipment, thermostats, refrigerators, phones, mattresses, and doorknobs are connected to the internet. All of this is fed into "behavioral futures markets", and the general public simply does not know their personal information is treated this way. Legal regimes on monopolies or privacy law do not encompass the current situation - and who reads the terms of service anyway?

Zuboff is best when she does the work of journalism and digs through patent files and internal documents. The antagonists of her story are Facebook and Google, with their grand ambitions for human society couched in business jargon. But where Zuboff loses me is her exaggeration of their hold on society, of how much change they can carry out, and of how powerless most people are. I personally share her skepticism, but not her "inevitabilism", to use her term.

Where ad-blocking tools are powerful and widely shared, and where targeted lists of consumer data can still be sabotaged (see what happened to the Tulsa rally), advertising is not as powerful as is thought. What makes me most doubtful at present is Facebook's own scandal, in which it inflated by a significant margin its count of how many users viewed its ads; big companies can and do pull their money and go elsewhere. If digital advertising really is so easily ignored, then why would big businesses bother casting such a wide net at all?

While there are social costs to intrusive advertising, and Zuboff is right to raise the alarm, I can't admit things are as bad as she says. Those ships have not sailed.
Profile Image for Marija.
20 reviews · 62 followers
December 27, 2020
I began reading The Age of Surveillance Capitalism expecting to learn a little more about the influence of social networks and the deleterious impact they have on the modern human psyche. Instead, Shoshana Zuboff opened a much broader chapter of analysis, tracing the evolution of capitalism itself. This work serves as an academic umbrella to many particular questions regarding our digital lives, among which social networks are just one part. If you are patient enough, buckle up, and Shoshana Zuboff will take you on a long journey.

Zuboff begins by noting that "surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioural data." As such, our digital behaviour encompasses a wide range of preferences and choices that we voluntarily or involuntarily share online. This inevitably becomes "a proprietary behavioural surplus that is fed into advanced manufacturing processes known as 'machine intelligence' and fabricated into prediction products that anticipate what you will do now, soon, and later." Surveillance capitalists wrap themselves in the fashions of support and emancipation, appealing to and exploiting the fears of the day, while the real action remains hidden behind the scenes.

The hidden actions behind the scenes are the main interest of her research, which deals extensively with the origin of this newly appearing capitalist system and its social and intellectual meaning. "We are the sources of surveillance capitalism’s crucial surplus. Surveillance capitalism’s actual customers are the enterprises that trade in its markets for future behaviour." More broadly, the quintessence of her work concerns the idea that "just as industrial civilization flourished at the expense of nature and now threatens to cost us the Earth, an information civilization shaped by surveillance capitalism and its new instrumentarian power will thrive at the expense of human nature and will threaten to cost us our humanity." Another important point brought to light is the frequent confusion between surveillance capitalism and the technologies it employs: "Surveillance capitalists would have us believe that their practices are inevitable expressions of the technologies they employ."

The migration of surveillance capitalism from the online environment to the real world serves to highlight examples of changes that have occurred since the technologies were deployed. “Instrumentarian power aims to organize, herd, and tune society to achieve a similar social confluence, in which group pressure and computational certainty replace politics and democracy, extinguishing the felt reality and social function of an individualized existence.” With this observation, the book deals in depth with the “imposition of a totalizing collectivist vision of life in the hive, with surveillance capitalists and their data priesthood in charge of oversight and control.” Zuboff challenges the premise of individualization, claiming that it “should not be confused with the neoliberal ideology of ‘individualism,’ which shifts all responsibility for success or failure to a mythical, atomized, isolated individual doomed to a life of perpetual competition and disconnected from relationships, community, and society.” Individualization has sent each of us on the hunt for the resources we need to live effectively, but at every turn we are forced to contend with an economy and politics from whose point of view we are mere cyphers. We live knowing that our lives have unique value, but we are treated as invisible.

“In our enthusiasm and growing dependency on technology, we tended to forget that the same forces of capital from which we had fled in the ‘real’ world were rapidly claiming ownership of the wider digital sphere. This left us vulnerable and caught unawares when the early promise of information capitalism took a darker turn.” In addition to individualization, Zuboff is interested in interpreting the concept of disruption, which is often used as an argument for a pre-existing value system. Under the guise of disruption, corporations have successfully covered up so many examples of brutal surveillance capitalism. “Google is to surveillance capitalism what the Ford Motor Company and General Motors were to mass-production–based managerial capitalism,” she writes.

“Users provided the raw material in the form of behavioral data, and those data were harvested to improve speed, accuracy, and relevance and to help build ancillary products such as translation.” This is called the behavioral value reinvestment cycle, in which all behavioral data are reinvested in the improvement of the product or service. Google’s invention unveiled new ways to understand the thoughts, feelings, intentions, and interests of individuals and groups with an automated architecture that functions as a one-way mirror, independent of a person’s awareness, understanding, and consent, allowing privileged, secret access to behavioural data. “With click-through rates as the measure of relevance accomplished, behavioral surplus was institutionalized as the cornerstone of a new kind of commerce that depended upon online surveillance at scale.” “Google has been careful to camouflage the significance of its behavioral surplus operations in industry jargon. Two popular terms—‘digital exhaust’ and ‘digital breadcrumbs’—connote worthless waste: leftovers lying around for the taking.” Zuboff also denounces the word “targeted,” calling it another euphemism that evokes notions of precision, efficiency, and competence.

“Surveillance capitalism originates in this act of digital dispossession, brought to life by the impatience of over-accumulated investment and two entrepreneurs who wanted to join the system. This is the lever that moved Google’s world and shifted it toward profit.” She goes on to describe a fortification of Google’s practices, citing four key factors: 1) “competitive advantage in electoral politics”; 2) “a deliberate blurring of public and private interests through relationships and aggressive lobbying activities”; 3) “a revolving door of personnel who migrated between Google and the Obama administration”; and 4) “Google’s intentional campaign of influence over academic work and the larger cultural conversation.”

“Google had discovered that successful dispossession is not a single action but rather an intricate convergence of political, social, administrative, and technical operations. The four stages of the cycle are incursion, habituation, adaptation, and redirection.”
The book makes a solid case for why surveillance capitalism is profoundly anti-democratic, since its striking power does not originate from the state, as such power historically has. Its consequences cannot be reduced to or explained by technology or the bad intentions of evil people; they are the consistent and predictable outcomes of an inherently constant and successful logic of accumulation. “That the luxuries of one generation or class become the necessities of the next has been fundamental to the evolution of capitalism during the last five hundred years.” The main argument against the big tech corporations is the hints of broader ambitions, in which “emotion as a service” expands from observation to modification. A more human-centred worldview is expressed in her notion of the “will to will”. The scientists and engineers she interviewed distinguished three main paths to the economics of action, each of which aims to accomplish behaviour modification: 1) “tuning”, 2) “herding”, and 3) “conditioning”.

“The arc of behavioral modification at scale integrates the many operations that we have examined: ubiquitous extraction and rendition, actuation (tuning, herding, conditioning), behavioral surplus supply chains, machine-intelligence–based manufacturing processes, fabrication of prediction products, dynamic behavioral futures markets, and “targeting,” which leads to fresh rounds of tuning, herding, conditioning, and the coercions of the uncontract, thus renewing the cycle”

There are growing signs of the psychic toll of life in the hive, in which the behavioural expertise of surveillance capital conflicts with the centuries-old human impulse toward self-construction. “The self-objectification associated with social comparison is also associated with other psychological dangers. First, we present ourselves as data objects for inspection, and then we experience ourselves as the “it” that others see.” “What we witness here is a bet-the-farm commitment to the socialization and normalization of instrumentarian power for the sake of surveillance revenues.”

Another of Zuboff's human-centred ideas is the right to sanctuary, as technology increasingly takes over our homes and devices track our every moment, both online and offline. “Right now we are at the beginning of a new arc that I have called information civilization, and it repeats the same dangerous arrogance. The aim now is not to dominate nature but rather human nature. The focus has shifted from machines that overcome the limits of bodies to machines that modify the behavior of individuals, groups, and populations in the service of market objectives.”

The main questions remain, as Zuboff puts them: “Who knows? Who decides? Who decides who decides?”

It took some time to finish this great book, but it was worth it. I expect it to become even more important in the years to come, as Zuboff declares that this study is only the beginning, not the end, of a new period.

Nilesh Jasani
987 reviews · 137 followers
March 6, 2019
Data privacy and the unauthorised (and wrongly permitted) use of an individual’s private data by others are critically important topics. However, this comprehensively one-sided book does not even scratch the surface of the issues at hand.

The whole book is a repeated polemic against almost any data gathering, any analytics, and any company that has built a successful business on them. The highly coloured and simplistic view of the evils of Google and Facebook (it is largely about these two, and a bit about Microsoft) in such a long book ensures that the author never even gets started on discussing the real problem, let alone any solutions beyond calling for all of us to say “No”.

The author - who yearns for the pre-digital utopia of almost any time in history, including as late as the 1980s - is a twenty-first-century, Rousseau-like romantic. She happens to be a critic of our era's modernity, but had she lived in some other era, she might have hated television or automobiles or combustion engines with equal gusto. In this book, she comes across as someone who hates the collection of almost any data, and any analysis based on them, even though, if confronted with these charges, she would flatly deny them. The author never acknowledges that there is no turning back from the current point, that the digital era has some enormous positives along with many negatives, that most negatives cannot be wished away, that many of the negatives are structural and intricately joined to the massive positives, and that authorities cannot fight or even understand many evils that are transforming continuously.

Let’s build slowly to see how complicated the issues of data ownership are before we even look at the commercial aspects:

- There is that age-old question: if a tree falls in a jungle with no “one” to observe it, does it make a sound, or does it “count”? If you twist this question into digital contexts, how do you define the “one” doing the measurement? Is my privacy invaded if my “personal” data - however defined - are “seen” only by programs?
- Based on what one reads in the book, the author would say a resounding yes to the question posed above. But if I have a program living on my machine analyzing my private data to give me some results on what I am looking for - like a friend’s contact details - is my privacy invaded?
- What if some incoming data - say a barrage of ten billion ads or news articles - are sorted by some program on my machine to filter out the ones likely unimportant to me, based on my past habits as stored on my device - is this an invasion?
- If not, how does the same thing become an invasion when it is stored or processed on hardware located thousands of miles away?
- How does anyone explain complex analytical tools to me? Must my apps analyze only to the extent that I, or humans generally, can understand? If so, how will we ever progress in fields like quantum physics, genetics, or the behavioural sciences?
One can go on and on. The point is that the issues of what counts as private data or understandable analysis are complex: the author seems to hate any analysis of “my” data for almost any commercial use. However, the same author would despise:

- if search and many other similar functions were charged for, and were so expensive due to the lack of scale that only the elite of the world benefited from them
- or if my lack of data-sharing meant that her health, traffic, or many similar problems went unsolved
- or if somebody prevented smart analysis of “my” data that might have helped me find the right education, right connections, right entertainment, or right news amongst the plethora of choices out there
- or if companies forced not to “nudge” offered no choice architecture at all (like no search), which is far worse than some choice, or offered everyone the same “choice” without customisation
- or if the lack of free storage of my data - like health records over decades - denied me the same health or financial advantages as somebody who could afford such storage

The author does not see how free or cheap access to information, analysis, storage, health, entertainment, and so on has benefitted the underprivileged of the world since the onset of the digital revolution. Is the surplus enjoyed by the digital elite extracted at the cost of the rest? Or is there a win-win, where we may charge them with keeping a disproportionate share, but not accuse them of plunder?

The new-era technologies have many negatives that need to be discussed and solved. Most of them are not what this author talks about in this book. The book is written by someone who yearns to be a digital hermit and feels she cannot be one. Her frustration is understandable: it is nigh impossible to go completely dark now, as it is no longer just about the devices under my control, given the surveillance of millions of other varieties. That said, there are likely to be extremely few who could truly go back to such isolation for long.
Graeme Newell
201 reviews · 50 followers
November 29, 2019
I found the topic of this book really fascinating. I’m so anxious to better understand how the tech giants like Google, Amazon and Facebook gather and use the vast amounts of data they collect.

Unfortunately, this book was a swing and a miss. The author is so passionately obsessed with vilifying these companies that the book regresses into a 700+ page vendetta. It’s obvious she did some amazing research, but the portent of doom that pervades this book just got too annoying. Okay, I get it. These companies are invading our privacy. Can we get to the information now?

The book starts with a mind-numbing 50-page tirade on how evil she thinks these companies have become. Why can’t she just let me make up my own mind after hearing the facts?

She also has the annoying tendency of continually wandering into the weeds. She’ll just get going on the story of how a tech giant uses information and then suddenly we’re whisked off to 16th century Europe, or for some reason we’re discussing Homer. All of this is a vain attempt to make some tangential metaphorical point.

I gave up on this book after 100 pages. Too bad. It’s a captivating topic. I know there is a lot of good insight buried somewhere inside this tome. I just wish someone with more focus and less of a score to settle had written this book.
Bryan Alexander
Author · 4 books · 277 followers
June 16, 2019
We read this for our online book club in the spring and summer of 2019.

Here are our discussions by chapter: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, and 18.

My concluding thoughts?

Generally Zuboff succeeds in sketching out a dystopian business model, one predicated on turning the details of our lives into corporate profit.  This is most famously or notoriously demonstrated by Facebook and Google.  The structures and strategies Surveillance Capitalism lays out are very useful tools, like the kidnap/corner/compete playbook.

Zuboff gives us new ways to think about the digital world in 2019, drawing out an archaeology of its development and pushing back on current understanding.  In one neat move, for example, she takes issue with the famous “you are not the consumer; you are the product” axiom.  Instead, Zuboff would rather we think thusly: “we are the sources of raw-material supply.” (69-70)

The book is mostly analytical, a wide-ranging and deeply probing exploration of the business model.  Developing a solution to surveillance capitalism is a secondary consideration, and not a very inspiring one.  Zuboff has some hopes for creative and popular resistance, and ultimately sees governmental regulation as the best option.  I'm not sure that this is convincing.

To begin with, as Zuboff acknowledges, governments often engage in surveillance that is arguably more terrifying than Facebook's, as they are backed up with laws and armed might.  Many governments also practice the nudging Surveillance Capitalism decries.  Yes, states can and often do two opposing things at the same time, but the tension should be addressed.  Otherwise the argument runs the risk of asking us to appeal to bullies for protection.

Moreover, the politics involved can become challenging, especially in the United States.  This month several Democratic presidential candidates are going after some of Silicon Valley; regulating tech firms could become an epically partisan issue, especially for a Republican party keen to protect and extend big business.  On the other hand, some Republicans, including Trump, are incensed at what they see as FAANG's anti-conservative bias.   The Trump administration is also weighing regulatory options against the digital giants.  A leading conservative blogger called for antitrust action. Is a bipartisan consensus possible?

Put another way, can Congress escape Silicon Valley's lobbying might, or will it be sufficiently captured to not enact anything meaningful?  Further, if Evgeny Morozov is right, as Barbara Fister suggests, and surveillance capitalism is really about capitalism itself, what kind of political organization is available now to respond? Perhaps the bipartisan possibility I noted above will fall apart as a socialism-interested left wing feuds with Republicans and centrist Democrats alike.   Or, if surveillance capitalism is about extraction, as Zuboff insists, is a better model anti-colonialism, as Vanessa Vaile suggests?  That could lead to an international politics, whereby some other nations oppose Silicon Valley for fomenting an updated, digital colonialism.

Or should we think about this instead as a health care issue, since so many of the privacy violations Zuboff abhors occur in the body or mind?  If so, the odds aren't good, as Noel De Martin observes.  Indeed, surveillance capitalism may have succeeded in implanting itself too deeply in our psyches to be uprooted, as Mark Spradley ponders.

Furthermore, the book notes several times that the surveillance capitalism model doesn't stem entirely from the technology sector.  Indeed, the financial sector played a key role in shaping and driving it (cf Mark Corbett Wilson's fine comment).  That sector is enormously powerful, both economically and politically.  How can a society and culture oppose its strategy?  Arguably America failed to do so after the 2008 financial disaster (recall Occupy).  Again, political challenges and complexity are rampant on this score.

Shifting from politics to economics, Zuboff would like us to support alternative funding models.  What are they?  Barbara Fister identifies DuckDuckGo and paid(walled) journalism. Noel De Martin points to Netflix as one where we pay for content - although it's really a hybrid model, as Netflix mines our viewing habits to surface recommendations.

Can other businesses compete by openly resisting surveillance capitalism?  Carl Rosenfeld thinks this might be happening with VPN providers.  Apple has lately made a play for being taken seriously as a pro-privacy actor. Alan Baily notes that Apple makes hardware and might be too far behind Google etc. to catch up.  Perhaps their alleged shift to being a media company will lead them to follow Netflix's hybrid path.  The computer gaming industry - immense and weirdly absent from Zuboff's book - largely sells artifacts and services without managing to stalk our inner data-thoughts (except through canny design); Steam is not great at recommendations.

On a different register, I think Age of Surveillance Capitalism fails to understand why so many consumers volunteer to enter the universe of decreased, monetized privacy.  The book compares this business model to military conquest, but doesn't account well for our conscious embrace of it.  As Ken Soto points out, low- or no-cost services of high quality are quite appealing to consumers.  Think of how Gmail outcompeted email clients, or how Facebook crafted a better social experience than its predecessors.  Google Earth, Google Books: these are effective tools without serious competition.  As Nicholas Carr argues (and it's not often I agree with him),
While Zuboff’s assessment of the costs that people incur under surveillance capitalism is exhaustive, she largely ignores the benefits people receive in return — convenience, customization, savings, entertainment, social connection, and so on. The benefits can’t be dismissed as illusory, and the public can no longer claim ignorance about what’s sacrificed in exchange for them.

I admit to being torn on this in my personal experience.  Despite my dread of their datamining, Amazon's recommendation system is better than the suggestions I'll get from 99% of bookstores.   I use many Google tools (Drive, Gmail, Maps, etc) because the price is good and the quality high.  Facebook still gives me a bigger social network than any other platform, no matter how badly Zuckerberg behaves.  Voice activated tools are handy for me when I'm cooking or driving.  Convenience and quality are powerful forces and help enable the age of surveillance capitalism; the titular book needs to account for them, even though that would weaken its rhetorical stance.

Beyond myself, I think the personal experience of many other people helps explain why Facebook, Google et al can get away with this.  You see, Zuboff posits an opposition between a good life with privacy and the bad life after social media, yet that duality doesn't withstand scrutiny.  Before Web 2.0 many people already lived with many privacy violations.  The world of work can compromise privacy in a variety of ways, from surveilled email to intense bodily scrutiny; actually, as Katie Fitzpatrick points out, Zuboff seems more concerned with leisure than work.  The war on (some) drugs has habituated many Americans to yielding up our bodily fluids to bureaucratic processing.  The many people who serve in or work closely with the military have a very different privacy experience than the ideal one Zuboff holds out.  In fact, the war on terror has systematically degraded American civil liberties.  Even without war, many who would access some public services are long used to opening up their lives to the gaze of civil servants.  Next to any of these, letting Google trawl one's email to shape some small ads is far less threatening.  If I can slightly misread Blayne Haggart, the book overstates its claims to novelty.

Where does this leave us?

The Age of Surveillance Capitalism is a powerful book that should be read.  It does feel incomplete, however, like a business book that falls short of politics, or one crafted with a Manichean zeal that misses the nuances of history and daily life.  I recommend it for its utility and the conversations it should start.

Looking ahead, I think Zuboff outlines an unfolding politics.  We should pursue that thinking.

(And many thanks, once more, to the readers in our online book club.)
Kusaimamekirai
652 reviews · 217 followers
February 25, 2019
Who knows? Who decides? Who decides who decides?

I remember thinking a few years back, when Siri was introduced to Apple iPhones, “why exactly do we need this?” Had it really become such an inconvenience to type a few words into our ubiquitous phones to get information? Who was clamoring for this?
Today Siri seems almost quaint. We’ve moved at the speed of light from voice recognition that finds a restaurant for us to “smart homes” that play our music, regulate temperatures, adjust our mattresses, and (in theory at least) keep us safe from the outside world. These technologies have infiltrated our lives with such speed that they seem like they have always been here, and to question their place in our lives seems almost like heresy. Any thought of a world without them seems antiquated and inconvenient. It is the future, and everyone knows the future is inevitable. Or is it?
Shoshana Zuboff’s “The Age of Surveillance Capitalism” asks just that question. Just how inevitable are these seemingly relentless incursions into our private lives? More importantly, at what cost do we allow them in?
One of the main problems to overcome before answering these questions, however, is distinguishing between technology and what Zuboff calls “surveillance capitalism”. The former is in and of itself neutral. Its effect on society is entirely at the mercy of how it is used. It has no intrinsic morality. In contrast, “surveillance capitalism” is quite different:

“Surveillance capitalism is not technology; it is a logic that imbues technology and commands it into action….That surveillance capitalism is a logic in action and not a technology is a vital point because surveillance capitalists want us to think that their practices are inevitable expressions of the technologies they employ….We cannot evaluate the current trajectory of information civilization without a clear appreciation that technology is not and never can be a thing in itself, isolated from economics and society. This means that technological inevitability does not exist. Technologies are always economic means, not ends in themselves"

“Surveillance capitalism” seeks to monetize the routines of our daily lives by collecting excruciatingly minute data about our every interaction and using it to modify our behavior and herd us toward a particular financial outcome. That emoji you posted on Facebook? That time you got angry at a friend on your Android? That video chat where you shifted uncomfortably in your chair? All logged and crunched into an extensive data profile that insurance companies can use to determine the cost of your policy, banks can use to determine if you’re a credit risk, and, in China, authorities can use to determine if you are eligible for certain kinds of employment. This is not simply a digital footprint of websites you’ve visited where you can clear your history and move on. What companies like Facebook and Google are doing now is a radical reshaping of societies around the world. Consider:

“In addition to Facebook’s already complex computational machinery for targeting ads, by 2016 the News Feed function depended upon one of the world’s most secretive predictive algorithms, derived from a God view of more than 100,000 elements of behavioral surplus that are continuously computed to determine the ‘personal relevancy’ score of thousands of possible posts as it scans and collects everything posted in the past week by each of your friends, everyone you follow, each group you belong to, and every Facebook page you’ve liked.”

This is being done for financial gain, to be sure (Pokemon Go directing players to paying advertisers, or insurance companies purchasing customer profiles and remotely shutting down the cars of drivers they judge to be at risk), but more importantly it is shaping how we behave. As Zuboff writes, surveillance capitalists despise and seek to alleviate “friction”. Friction here means the unpredictability of human beings and their purchasing habits. It is no longer enough to simply hope they will want to buy a particular product; we can now manipulate people toward desired outcomes based on their emotions and life patterns. It is truly terrifying, and yet there is seemingly little outrage about this fundamental altering of who we are as human beings.
Zuboff argues that this is in part due not only to the false feeling of inevitability these companies assure us of, but also due to the unprecedented nature of what is happening. As Zuboff writes:

"In contrast, surveillance capitalism is a new actor in history, both original and sui generis. It is of its own kind and unlike anything else: a distinct new planet with its own physics of time and space, its sixty-seven-hour days, emerald sky, inverted mountain ranges, and dry water."

When contesting something that is familiar, we combat it with methods that are also familiar. Methods that have been tried, tested, and effective.
However, the unprecedented nature of surveillance capitalism upends all of that. Using existing law to legally restrain these companies takes years, during which time the incursions being resisted have either morphed into something totally different or have become such a habitual part of our lives that we forget just what it was we were fighting. Zuboff calls this stage habituation:

“Whereas lawsuits and investigations unwind at the tedious pace of democratic institutions, Google continues the development of its contested practices at high velocity. During the elapsed time of FTC and FCC inquiries, court cases, judicial reviews, and EU Commission investigations, the new contested practices become more firmly established as institutional facts, rapidly bolstered by growing ecosystems of stakeholders. People habituate to the incursion with some combination of agreement, helplessness, and resignation. The sense of astonishment and outrage dissipates. The incursion itself, once unthinkable, slowly worms its way into the ordinary. Worse still, it gradually comes to seem inevitable. New dependencies develop. As populations grow numb, it becomes more difficult for individuals and groups to complain.”

With surveillance capitalism moving at light speed, new methods of combatting it that move equally quickly must be used.
Particularly considering that surveillance capitalism, aware of the existential danger it faces, is deeply embedded in the governments in charge of regulating it. Consider the case of Google:

“By April 2016, 197 individuals had migrated from the government into the Googlesphere, and 61 had moved in the other direction. Among these, 22 White House officials went to work for Google, and 31 Googlesphere executives joined the White House or federal advisory boards with direct relevance to Google’s business.”

With tech companies firmly putting a finger on the scale of regulation, a new way of controlling them is vital. Since we are increasingly not given a choice as to whether we assent to relinquishing the details of our personal lives, we must raise our voices and demand that choice.
As surveillance capitalism creeps further and further into every facet of our lives, obliterating the concept of privacy while labeling it as outdated, we must reclaim our humanity from its tyranny.
lindsi
54 reviews · 42 followers
June 25, 2022
Lots of mixed feelings here, but I still highly, highly recommend reading this.

Zuboff lays out the mechanics of surveillance capitalism clearly and eloquently. I understand this market form magnitudes better than I did before reading her book. Her ability to identify new phenomena and coin terms for them is incredibly useful. In particular, I found her development of a theory of instrumentarian power to be paradigm-shifting.

My frustration comes with her prognosis. She emphasizes repeatedly that surveillance capitalism is so dangerous because it is unprecedented - yet then recommends reverting to earlier forms of civic action that were used to combat industrial capital in order to combat surveillance capital. Does an unprecedented threat not warrant an unprecedented solution?

Regardless, her diagnosis is spectacular, and the book is beautifully written. I truly enjoyed her elegant prose and did not find it to be too wordy or unclear.
Rick Wilson
646 reviews · 226 followers
December 31, 2022
What an interesting book. I feel like I’m going to be mentally chewing on this one for a while. It’s good, with the caveat that I don’t actually agree with the author’s conclusions. This is kind of like reading the most compelling case against Betamax circa the early 1990s. The arguments are well-made and very convincing, but I can’t help but feel it’s a warning from a snapshot of the pre-pandemic world that doesn’t exist anymore.

The author points at the rise of big data and the advent of artificial intelligence, and then sounds a warning siren against the collection of that data and its use to steer human decisions. She does this by looking at Google and Facebook primarily, but Amazon and others are thrown in there. The language gets a bit hyperbolic and at times over the top (if you’ve read my other reviews you know I’m not one to talk here). There’s a compelling case made that the overwhelming ubiquity of these platforms and their data collection is detrimental to the long-term health of society.

I think the core problem with this book is that, even a few years removed, it seems a lot like Tipper Gore complaining about “explicit language in rap music.” It was published in 2019 and it already feels out of touch. And I can’t quite tell if it’s because all of these technologies have crossed the Rubicon and are so enmeshed with our daily life that I can’t disentangle them from any sort of imagined alternative, or if the author actually missed the boat. Because of some issues I’ve had and my personal experiences, I’m inclined to believe the latter.

It’s strange because I started this book over a year ago and set it down, only to pick it back up from the beginning this week. I remember that on my initial read-through I struggled with the language and agreed with the concepts. This time it was the opposite. The language is the typical obtuse buffoonery I would expect from a Harvard professor, but once you get into it you can develop a rhythm and cadence. It was actually the ideas that I struggled with this time around.

Back from 2016-2018ish I worked at a marketing company that was very briefly on the bleeding edge of digital lead generation and ended up being acquired by a company you’ve probably interacted with. I’ve seen behind the scenes, witnessed the great and magical Oz, and it’s really disappointing. It’s just a bunch of dudes. And these guys are typically smart, but not that smart. And while they prioritize making money over, say, global peace and prosperity or even the betterment of mankind, there’s not some pernicious blood ritual focused on grinding behavioral surplus into cash. At least I was never initiated into it.

So hearing the arguments against this kind of technology is interesting. But it seems kind of detached from the core of what’s actually happening. Artificial intelligence isn’t a thing. Machine learning is improving, but nowhere near the point where it can have malicious intent. There was a story going around a couple years ago about how a teenage girl changed her type of face moisturizer and Target’s “marketing AI” figured out she was pregnant because of that. She got ads for pregnancy stuff and her dad got all upset. And everyone was so astounded at the level of intelligence that Target’s marketing department had. But what came out about that story later is that she literally bought a pregnancy test with her debit card. Her debit card that was tied to her account. Like, this stuff is not that advanced. It’s creepy. And I think we should be seriously discussing where the limits are, talking about public and private spheres. The ability to opt out, and more visibility into what’s collected about you. Apple has taken a baby step in that direction. On a macro level there should be a level of consumer protection, the same way we used the FCC to regulate television when it was emerging as the dominant form of information and entertainment.

Professor Zuboff gets into this sort of conspiratorial tone where Google is painted as the big evil and Facebook as a great puppet master that controls you without you even realizing it. I don’t buy it. I don’t have any love lost for either of those companies, and I think they both probably should be dismantled under antitrust law. But I’ve worked with Xooglers and ex-Facebookers, and while they’re usually smart, in my experience they’re also usually kind of weird and not super aware of anything on a macro level beyond their very narrow band of expertise. I would’ve found a lot of the arguments more credible if, instead of this sort of conspiratorial tone of “Stalin plotting against the naysayers,” there was more of a caution about dorky code jockeys maximizing their own KPIs. I think the author is a little too convinced of just how far B.F. Skinner can take us here.

That combined with the last two years of pandemic world left me shaking my head at the end of this book. I think three years ago, if I had read this book when it first came out, I would’ve agreed inherently with just about everything that was said. But having some distance from the industry and seeing some of the buffoonery that tech companies have engaged in, I think my concern is less this maniacal slide into FAANG controlling and knowing your every action. And more of a comedy of errors and mistakes that will lead us into what I would consider the destabilization of society. I think the scarier thing to me is not that Zuckerberg or Bezos have some master plan, it’s that there is no plan beyond trying to keep their empires afloat.

So I’m left with, I think, a real feeling of conflict. I think there are issues raised here that are very present and vital. I also think that some of this really misses the mark, and the fact that it doesn’t hold up less than three years after it was written is really concerning to me. What I hope is that this book can be a warning that never has to be empirically tested. There are definite deep-seated issues with the technology that we are so rapidly creating. The author does a great job of laying those out. This technology is advancing at hypersonic speed. I’d rather be overly cautious of the doomsday scenario painted here than be wrong and only realize it too late.
Author · 18 books · 28 followers
February 1, 2019
In The Age of Surveillance Capitalism, Shoshana Zuboff argues relentlessly that the rise of Google, Facebook, etc. has done something terribly wrong to society. She's alarmed by all the surreptitious information-gathering that goes on in the background. She's especially troubled by the way we're deluged with personalized ads every time we go on the Internet. And she wants us to be alarmed, too.

Her book touches on some fascinating questions. Do we want a world that cherishes sanctuary, or that provides effortless socialization? Do we want the right to be forgotten, or easier opportunities to become known? If we want a bit of both, how do we balance these trade-offs?

I would have loved to read a book that combined her relentless probing of tech's potential excesses with a realistic sense of what we've gained and lost at each step of the way. But that's not what we get. Instead TAoSC is weirdly colored -- in fact sometimes outright poisoned -- by endless nostalgia for a supposedly better time, long ago.

I've lived through the multi-decade transformation that Zuboff describes. Things weren't better back then! We're talking about the days of taking outdated paper maps on road trips, instead of having in-dash nav. Or looking up hotels in a Frommer's guide, instead of tapping into the detailed listings and reviews on TripAdvisor or Airbnb. Looking back on it, information was maddeningly costly, slow and hard to obtain back then.

Even if some aspects of today's digital connectivity seem silly or unnerving, when you add it all up, we're enjoying much better lives with Internet-era tools. To cite one more example, we've evolved from narrow knowledge cartels (where encyclopedia salespeople in the 1970s preyed on poor families) to a much more equitable world where easily accessible Google searches can get you up to speed on almost anything, no matter what your income.

I'd hoped this book would be a landmark contribution to the dialogue about how to handle tech's allures and pitfalls better. Instead, it's as frustrating as a partial encyclopedia set that stops at the letter D. For me, at least, TAoSC loses much of its effectiveness (and readability) by treating modernity as the apocalypse. All we're left with is a preservationist's anguished denunciation of everything that's changed.

There's a wonderful quote from English author Douglas Adams that covers this terrain better than anything I can add:

“1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
3. Anything invented after you're thirty-five is against the natural order of things.”
Profile Image for Chris Chapman.
Author · 3 books · 27 followers
January 29, 2020
When I like people immensely I never tell their names to any one. It seems like surrendering a part of them. You know how I love secrecy. It is the only thing that can make modern life wonderful or mysterious to us. The commonest thing is delightful if one only hides it.
Oscar Wilde, The Picture of Dorian Gray

This is a book of extraordinary erudition and intelligence. Zuboff identifies not only the problem but also where it is all leading, and why we are blithely going along with it. Arendt, Kafka, Rousseau, Paine, Piketty and many others are marshalled to her cause.

So - in short, googlefacebookmicrosoftamazonverizon are making oodles of money from targeted ads, based on predictions of our behaviour which in turn are based on our activity online. Yeah yeah yeah, you say, we know this. Fine. But she goes further - their ultimate goal is not to stop at predicting our behaviour but shaping it. Cambridge Analytica, the targeted fake news aimed at influencing voting, was only the tip of the iceberg.

The promise of the promise and the will to will run deeper than these deformities. They remind us of that place again where we humans heal the breach between the known and the unknowable, navigating the seas of uncertainty in our vessels of shared promises. In the real world of human endeavour there is no perfect information and no perfect rationality. Life inclines us to take action and to make commitments even when the future is unknown. Anyone who has brought a child into the world or has otherwise given her or his heart in love knows this to be true. God knows the future, but we move forward, take risks, and bind ourselves to others despite the fact that we can't know everything about the present let alone the future. This is the essence of our freedom, expressed as the elemental right to the future tense. With the construction and ownership of the new means of behavioural modification, the fate of this right conforms to a pattern that we have already identified. It is not extinguished, but rather it is usurped, commandeered, and accumulated by surveillance capital's exclusive claims on our futures. (Ch. 11.2: We Will to Will)

As you can see, she invents a new language to talk about this stuff. Don't be put off. You learn it pretty quickly (she makes very clever use of repetition -- not too much, just the right amount so you retain the key themes, which are rhetorically anchored in her narrative).
Profile Image for Lucy.
26 reviews
October 19, 2022
The unnecessarily ornate writing style makes the content harder to comprehend and retain.
Profile Image for Naopako dete.
118 reviews · 43 followers
August 6, 2021
This could be a key book on capitalism and the way digital technology is changing the shape of capitalist systems. The author has coined an interesting term -- surveillance capitalism -- to describe a new form of social order that is still in its infancy, yet is at the same time ahead of and beyond all legal regulation; what's more, it is the surveillance capitalists themselves who shape legal regulation, independently of the state and its institutions. All of this, of course, means one thing: corporations like Alphabet, which among other companies owns Google, have become more powerful than the state, which in practice leads to a situation in which the world's social order rests in the hands of a few individuals. None of this would be so frightening if the mechanisms of surveillance capitalism did not rely on underhanded methods to obtain what they have no right to: their users' data, on which their profit is based. Through various techniques -- from euphemistic contracts, to declarations written in language no one can understand, to blocking the use of a device unless the terms of service (consent to data collection) are accepted -- surveillance capitalists gather data to which they have no claim, not only to profit from it but to dismantle the notion of an intimate world and of sanctuary, to dissolve individuality as an existential category, and to steal our capacity for moral decision-making. In other words, the idea of surveillance capitalism is to transform society so that only an apparent individualism remains: everyone will have a right to everything, but a few will decide what that "everything" is. The author compares this state of affairs to a hive, in which ideas and representations of the world flow from a single source that dictates every possible movement within society.

The book is genuinely thought-provoking, especially because it distances itself quite firmly from the various conspiracy theories about someone in the shadows running our lives. It is not that machines will rule us or that robots will take over the world; it is that a small number of people exploit the majority, in practice through blackmail and conditioning against which individuals are powerless. That minority simultaneously declares the world stupid and uneducated, which is just one more mechanism of confusion, because the language in which they address that majority is incomprehensible and maddening.
Profile Image for Keith Swenson.
Author · 15 books · 50 followers
April 25, 2019
It is not your grandfather's internet any more. What we are living through is a brand new kind of technology, and a brand new kind of business built on and for that. It is astoundingly hard to grasp the nature of a paradigm shift while it is happening. Shoshana Zuboff puts it all together in a single book: the history, the discovery, the development, from Google taking responsibility for finding the right place to put the ad, to predicting behavior from digital exhaust, to the surprising technique of guaranteed outcomes.

You already know that, as users of Google search, we are not the customer. But we are not the product either. We are merely the source of something else: behavioral surplus. Google, Facebook, and Amazon vacuum up that surplus, which consists of old search queries, logs of mouse clicks, records of sites visited, and, most importantly, "like" button clicks. None of this is of any value to us, but when aggregated it can tell a powerful story about each and every one of us. That behavioral surplus can be used to predict what we will do. If it were only prediction, it would not be a particular problem. But this is where it turns dark: if you have a model that can predict what a person will do, then you can use that model to make people do things.

Our democracy and our capitalism are built on the idea of a free market. Adam Smith described the "invisible hand" of the market: while millions of people made independent decisions for their own personal welfare, the market was structured to find a nearly optimal distribution of goods and services. The hand is invisible because it was impossible to know everything that everyone was doing. The aggregate behavior of all people was unknowable.

Until now.

Google and Facebook have detailed profiles on most of the people in the country: some thorough and some not, but it is all getting much more complete as new devices emit ever more digital exhaust to be collected. It is surveillance-as-a-service. We can be quite sure the 2016 Brexit vote and the Trump victory were due in no small part to manipulation with the tools of surveillance capitalism. What happens to democracy when a company can sell guaranteed outcomes for a price? What happens to the free market when the invisible hand disappears and is replaced by plutocrats who simply manipulate the public will to their favor?

We don't know the answers to these questions, and Zuboff doesn't either. It is too new. It is all just forming. Most of our legislators have only the faintest grasp of what is going on, much less what to do about it. But this is clearly the most important development of our time. If you want to understand it, this book is a monumental resource: well organized, well researched, well written.

This is a timely book, not too soon, and not too late. On a five-star scale, I am giving it a six. It is that important.

Wait ... we have had mass propaganda before that persuaded people and in some sense controlled them. Why is it any different this time? The difference is that, through the collection of behavioral surplus, a very detailed model of every citizen can be constructed. From this model, algorithms can fashion a detailed plan for exactly what to display, unique to each person. But it does not stop there: your behavior is further monitored to see which things you reacted to and which you didn't. This is used to further tune the model. Advertisements, memes, opinions, political statements, and images are all thrust into the sidebars of the web pages. No two people get the exact same mix. Your response to every one can be captured and categorized independently. There has never in history been the ability to customize propaganda campaigns for individuals and to measure exactly how each individual responds. We have no precedent for this, and we have no idea where it is going, or even how to change course if we see something amiss.

Zuboff gives us detailed example after example of how this came about, how it is being used, and what the legal implications are. Even Roger McNamee, who has just published a similarly themed book called "Zucked," admits that the big difference is that Zuboff has all the data. She compares this emergence to that of totalitarianism just 100 years ago. Totalitarianism enveloped the individual in a state-wide system of total control. We are not to fear that this is all heading toward totalitarianism -- that was last time, and this time is different. She coins the term "Instrumentarianism" to describe the way that unknown algorithms are at the root of collecting the data, building the models, and crafting the plan of persuasion.

I guarantee that in the next 5 to 10 years you will be hearing a lot about Surveillance Capitalism and Instrumentarianism. We will need these new terms, and many of the other concepts presented, just to understand what is happening. Only then can we hope to navigate these stormy seas to a (hopefully) safe harbor, once we know how to tame this new beast.
Profile Image for Sasha Mircov.
19 reviews · 9 followers
June 20, 2019
The Age of Surveillance Capitalism is a very welcome and overdue attempt to provide a theory and critique of the information economy. Professor Shoshana Zuboff does a great job of the former, but the absolute and inescapable apocalypse her theory predicts is unconvincing.

The two main concepts in Zuboff’s book are “surveillance capitalism” and “instrumentarian power” or, in less charged terms, the information economy and knowledge as power.

In the spirit of some of the most prominent social theorists of the last century, such as Adorno, Arendt, Weber, Marx and Foucault, Zuboff provides a coherent picture of the inner workings of the big tech behind the information economy, notably Google and Facebook. She eloquently explains the effect of the quantified and all-connected world in which big tech is harvesting our "digital behavioral surplus" for profit, often without our informed and explicit consent.

Zuboff’s prowess as a business professor shines through as she breaks down big tech’s business practices. From the business model of Google AdWords and Sheryl Sandberg’s influence to Google's lobbying strategies and Facebook's constant experimentation, the author gets it all right. She even ties it all together with the Internet of Things and The Fourth Industrial Revolution.

As masterful as it is, the book is not without its shortfalls and deficiencies. The economic impact of the surveillance economy is barely discussed. In 2017, Google alone generated around $238B in economic activity. Google and Facebook are often the best-performing advertising platforms for millions of small and medium-size businesses. Another noticeable no-show in Zuboff's book is Amazon, perhaps because of its association with the concept of "advocacy capitalism" - a more favorable form of capitalism as described in Zuboff’s previous book, The Support Economy.

The least convincing part of Zuboff’s most recent book is around what is at stake - our humanity, our freedom and even our free will. Free will, a finicky and controversial philosophical concept, is mutated into its meta-version, a will to will, which is the prerequisite for freedom and ultimately what defines us as humans. And it is precisely the ability to envision a future for ourselves that the “instrumentarian power” through “surveillance capitalism” is stealing away from us.

Zuboff is not wrong. People are complex, dynamic systems and the very attempt to predict one’s future behavior can trigger adaptation. The information economy, mediated by prediction algorithms, does precisely that - it changes us. However, to fear the change because it will strip us of our free will leaves her worries hanging on a shaky nail. Zuboff is careful not to get entangled in the metaphysics of free will, leaving those inclined to question the concept with unanswered questions.

Finally, for a polemic of more than 700 pages, the book is noticeably short on solutions. Zuboff acknowledges that regulations such as the EU's GDPR are steps in the right direction, and praises artists making clothing that can trick the image recognition software deployed by the surveillance capitalists.

On a philosophical level, she calls for the development of a synthetic declaration on surveillance capitalism "to define and support other variants of information capitalism that participate in the social order, value people, and reflect democratic principles". Blockchain, the one technology that sprang out of the need for privacy, is part of the problem according to Zuboff, especially smart contracts, or "the anti-contracts", as she calls them.

In spite of the flaws, The Age of Surveillance Capitalism should be required reading for those who work in or with information technology. After all, what Professor Zuboff is doing in the book is predicting the future. And, in the same way Facebook's prediction and recommendation algorithms are slowly changing us, so will this book, but hopefully in the opposite direction.
Profile Image for Matthew Sun.
73 reviews
June 4, 2021
It took me so long to get through this book but here we finally are! I'm honestly still considering how much of Zuboff's central thesis I agree with, but I appreciate how expansive her arguments were & the wide variety of evidence used to substantiate her claims. Some thoughts (not a comprehensive review, just the first few things that are top of mind now upon finishing it):
- As others have probably stated, the book is far too long. To be fair, I think Zuboff was not aiming for brevity but rather to rouse the reader. But I felt that some of the more idiosyncratic phrases/topics (Big Other, instrumentarian power, the uncontract, etc.) started to blend into each other by the end, to the point that they almost lost their meaning.
- I wish Zuboff had taken more seriously the possibility that the instrumentarian project is itself a massive farce! For example, many scholars argue that despite hundreds of millions of dollars being thrown into targeted ads, they actually influence our actions very, very little, and that online advertising is a massive bubble (see Subprime Attention Crisis by Tim Hwang for more details). For all the language about the "subjugation of free will," there's surprisingly little evidence that this is actually happening. Zuboff's discussion of sponsored PokeStops in Pokemon Go made me lose a little faith in her analysis; her depiction of Pokemon Go as a massive surveillance project designed to psychologically manipulate users into purchasing...Starbucks drinks...is not persuasive. I tend to think most of these products make their money on a very tiny percentage of users whose actions at the margin are influenced by technological nudges.
- There's a bit of a weird "China is the dystopian society that the US must avoid becoming at all costs" dynamic in this book and surprisingly little engagement with people who actually live in China, or China scholars. This isn't to say that Zuboff's analysis is necessarily wrong - I just found this part of the analysis to be particularly thin. In general, I find that Zuboff is quite selective with the evidence she brings in to support her points - for example, she insists that social media is bad for mental health, which doesn't seem to accurately capture the nuances of current academic debates around social media and mental health.
- Zuboff's yearning for a sense of home / sanctuary from technology definitely resonates. She writes beautifully in the last few chapters about the very real sense of encroachment that technology introduces into our lives. But though I feel this, I am a bit skeptical of universalizing this idea that people are fundamentally incapable of finding their own ways to negotiate their relationships with technology.
- My reading is that Zuboff is actually not anti-capitalist at all? Which was a bit surprising when I started reading the book, especially since some people have said that this is "my generation's Das Kapital." In general, she expresses a lot of faith/hope in democracy and highly regulated capitalism.
146 reviews · 1 follower
August 18, 2020
I really wanted to love this book, as the subject is fascinating and horrifying and right up my alley. However, it has a few things about it that I don't love. As I read it I was thinking, man, this is like someone's PhD thesis (i.e. impenetrable, with a lot of "as we saw in chapter 1 blah blah" and "we will see in chapter 2 blah blah" -- it's like, if you didn't do that constantly, the book would be half as long). And then I checked out the author, and she is a Harvard academic. So that explains it. Not that everything should be dumbed down, but this feels like it is purposefully trying to be hyper-intellectual, and the result is a giant yawn fest. My other issue is that she doesn't articulate clearly enough what the problem with surveillance capitalism actually is. Or rather, her argument is not that compelling to me. There is a lot of "surveillance capitalism is causing us to lose the will to will" (wtf) and it is "stealing our right to a future tense" (also wtf). But, as I mentioned in an earlier comment, my dad made a much better argument when he said that surveillance capitalism is bad for the environment. The author goes on about modifying human behaviour, but fails to spell out that it is modifying our behaviour so we BUY MORE STUFF. And consumption is what is destroying the planet. Because she focuses on the "right to the future tense" stuff, and relegates environmental destruction to something caused by the industrial revolution onwards, rather than something caused by how we and corporations behave now, I think she is missing more than half the story (which is quite an achievement in a book this long). All that is not to say I don't have a renewed distrust of Google and Facebook et al. They really are the absolute worst.
Profile Image for Alex.
646 reviews · 88 followers
December 2, 2020
Not going to finish. Moments of brilliance, but also weighed down by academic jargon and by overstating phenomena for the sake of the thesis.