Ten Arguments for Deleting Your Social Media Accounts Right Now
1%
even though we love dogs, we don’t want to be dogs, at least in terms of power relationships with people, and we’re afraid Facebook and the like are turning us into dogs. When we are triggered to do something crappy online, we might call it a response to a “dog whistle.” Dog whistles can only be heard by dogs. We worry that we’re falling under stealthy control.
1%
This book is about how to be a cat. How can you remain autonomous in a world where you are under constant surveillance and are constantly prodded by algorithms run by some of the richest corporations in history, which have no way of making money except by being paid to manipulate your behavior?
2%
if you have the latitude to quit and don’t, you are not supporting the less fortunate; you are only reinforcing the system in which many people are trapped. I am living proof that you can have a public life in media without social media accounts.
2%
in the last five or ten years, nearly everyone started to carry a little device called a smartphone on their person all the time that’s suitable for algorithmic behavior modification.
2%
We’re being hypnotized little by little by technicians we can’t see, for purposes we don’t know. We’re all lab animals now.
3%
Now everyone who is on social media is getting individualized, continuously adjusted stimuli, without a break, so long as they use their smartphones. What might once have been called advertising must now be understood as continuous behavior modification on a titanic scale.
4%
This book argues in ten ways that what has become suddenly normal—pervasive surveillance and constant, subtle manipulation—is unethical, cruel, dangerous, and inhumane. Dangerous? Oh, yes, because who knows who’s going to use that power, and for what?
4%
We need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever.… It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.… The inventors, creators—it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people—understood this consciously. And we did it anyway … it literally changes your relationship with society, with each other.… It probably interferes with productivity …
4%
The short-term, dopamine-driven feedback loops we’ve created are destroying how society works.… No civil discourse, no cooperation; misinformation, mistruth. And it’s not an American problem—this is not about Russian ads. This is a global problem.… I feel tremendous guilt. I think we all knew in the back of our minds—even though we feigned this whole line of, like, there probably aren’t any bad unintended consequences. I think in the back, deep, deep recesses of, we kind of knew something bad could happen.… So we are in a really bad state of affairs right now, in my opinion. It is eroding the …
5%
The core process that allows social media to make money and that also does the damage to society is behavior modification. Behavior modification entails methodical techniques that change behavioral patterns in animals and people. It can be used to treat addictions, but it can also be used to create them.
5%
The damage to society comes because addiction makes people crazy. The addict gradually loses touch with the real world and real people. When many people are addicted to manipulative schemes, the world gets dark and crazy.
6%
To a degree, you’re an animal in a behaviorist’s experimental cage. But the fact that something is fuzzy or approximate does not make it unreal.
6%
Using symbols instead of real rewards has become an essential trick in the behavior modification toolbox. For instance, a smartphone game like Candy Crush uses shiny images of candy instead of real candy to become addictive. Other addictive video games might use shiny images of coins or other treasure.
6%
Various kinds of punishment have been used in behaviorist labs; electric shocks were popular for a while. But just as with rewards, it’s not necessary for punishments to be real and physical. Sometimes experiments deny a subject points or tokens. You are getting the equivalent of both treats and electric shocks when you use social media.
7%
it’s not that positive and negative feedback work, but that somewhat random or unpredictable feedback can be more engaging than perfect feedback.
7%
A touch of randomness is more than easy to generate in social media: because the algorithms aren’t perfect, randomness is intrinsic. But beyond that, feeds are usually calculated to include an additional degree of intentional randomness. The motivation originally came from basic math, not human psychology. Social media algorithms are usually “adaptive,” which means they constantly make small changes to themselves in order to try to get better results; “better” in this case meaning more engaging and therefore more profitable. A little randomness is always present in this type of algorithm.
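[Editor's sketch] The two highlights above describe an engagement-optimizing feed that adapts to feedback while deliberately mixing in a little randomness. As a rough illustration only, not the book's code or any platform's actual algorithm (Item, engagement_score, EPSILON, and the update rule are all assumptions of mine), an epsilon-greedy-style ranker captures the pattern:

```python
# Minimal sketch of an adaptive ranker that mostly exploits what has engaged a
# user before but reserves a few slots for random "exploration" items.
import random
from dataclasses import dataclass

EPSILON = 0.1  # assumed fraction of slots given to random items

@dataclass
class Item:
    item_id: str
    engagement_score: float  # predicted engagement, updated from feedback

def rank_feed(candidates: list[Item], slots: int) -> list[Item]:
    """Fill most slots with the highest-predicted items, a few with random ones."""
    pool = sorted(candidates, key=lambda i: i.engagement_score, reverse=True)
    feed = []
    for _ in range(min(slots, len(pool))):
        pick = random.choice(pool) if random.random() < EPSILON else pool[0]
        pool.remove(pick)
        feed.append(pick)
    return feed

def record_reaction(item: Item, engaged: bool, lr: float = 0.05) -> None:
    """Adaptive step: nudge the prediction toward what the user actually did."""
    item.engagement_score += lr * ((1.0 if engaged else 0.0) - item.engagement_score)
```

The adaptation lives in record_reaction: every observed reaction re-tunes the scores, so the system keeps experimenting on each user, which is the loop the following highlights describe.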
8%
Adaptive systems often include such a leaping mechanism. An example is the occurrence of useful mutations in natural evolution, which is usually animated by more incremental selection-based events in which the genes from an individual are either passed along or not. A mutation is a wild card that adds new possibilities, a jarring jump. Every once in a while a mutation adds a weird, new, and enhancing feature to a species.
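[Editor's sketch] As a toy illustration of that "leaping mechanism" (my own sketch, not the author's), the loop below mostly makes small incremental adjustments but occasionally takes a large random jump, which is what lets an adaptive search escape a local optimum:

```python
# Incremental adaptation punctuated by rare, large "mutation" jumps.
import random

def adapt(score, start: float, steps: int = 1000,
          small_step: float = 0.01, jump_rate: float = 0.02) -> float:
    current = start
    for _ in range(steps):
        if random.random() < jump_rate:
            candidate = random.uniform(-10.0, 10.0)               # wild-card jump
        else:
            candidate = current + random.gauss(0.0, small_step)   # small tweak
        if score(candidate) > score(current):                     # keep improvements only
            current = candidate
    return current
```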
8%
it turns out that the randomness that lubricates algorithmic adaptation can also feed human addiction. The algorithm is trying to capture the perfect parameters for manipulating a brain, while the brain, in order to seek out deeper meaning, is changing in response to the algorithm’s experiments; it’s a cat-and-mouse game based on pure math. Because the stimuli from the algorithm don’t mean anything, because they genuinely are random, the brain isn’t adapting to anything real, but to a fiction. That process—of becoming hooked on an elusive mirage—is addiction. As the algorithm tries to escape a …
8%
The power of what other people think has proven to be intense enough to modify the behavior of subjects participating in famous studies like the Milgram Experiment and the Stanford Prison Experiment. Normal, noncriminal people were coerced into doing horrible things, such as torturing others, through no mechanism other than social pressure.
9%
Negative emotions such as fear and anger well up more easily and dwell in us longer than positive ones. It takes longer to build trust than to lose trust. Fight-or-flight responses occur in seconds, while it can take hours to relax. This is true in real life, but it is even more true in the flattened light of algorithms.
10%
The prime directive to be engaging reinforces itself, and no one even notices that negative emotions are being amplified more than positive ones. Engagement is not meant to serve any particular purpose other than its own enhancement, and yet the result is an unnatural global amplification of the “easy” emotions, which happen to be the negative ones.
10%
In the bigger picture, in which people must do more than conform in order for our species to thrive, behaviorism is an inadequate way to think about society. If you want to motivate high value and creative outcomes, as opposed to undertaking rote training, then reward and punishment aren’t the right tools at all. There’s a long line of researchers studying this topic, starting with Abraham Maslow in the 1950s and continuing with many others, including Mihaly Csikszentmihalyi (joined by writers like Daniel Pink). Instead of applying the simple mechanisms of behaviorism, we need to think about …
10%
But there’s something about the rigidity of digital technology, the on-and-off nature of the bit, that attracts the behaviorist way of thinking. Reward and punishment are like one and zero. It’s not surprising that B. F. Skinner was a major player in the earliest days of digital networking, for instance.15 He saw digital networks as an ideal way to train a population for the kind of utopia he sought, where we’d all just finally behave. One of his books was called Beyond Freedom and Dignity. Beyond!
10%
We still call the customers of social media companies “advertisers”—and, to be fair, many of them are. They want you to buy a particular brand of soap or something. But they might also be nasty, hidden creeps who want to undermine democracy. So I prefer to call this class of person a manipulator.
11%
What started as advertising morphed into what would better be called “empires of behavior modification for rent.” That transformation has often attracted new kinds of customers/manipulators, and they aren’t pretty.
11%
Social media is biased, not to the Left or the Right, but downward. The relative ease of using negative emotions for the purposes of addiction and manipulation makes it relatively easier to achieve undignified results. An unfortunate combination of biology and math favors degradation of the human world. Information warfare units sway elections, hate groups recruit, and nihilists get amazing bang for the buck when they try to bring society down.
11%
The unfortunate result is that once an app starts to work, everyone is stuck with it. It’s hard to quit a particular social network and go to a different one, because everyone you know is already on the first one. It’s effectively impossible for everyone in a society to back up all their data, move simultaneously, and restore their memories at the same time. Effects of this kind are called network effects or lock-ins. They’re hard to avoid on digital networks.
12%
Our early libertarian idealism resulted in gargantuan, global data monopsonies.
12%
One of the main reasons to delete your social media accounts is that there isn’t a real choice to move to different social media accounts. Quitting entirely is the only option for change. If you don’t quit, you are not creating the space in which Silicon Valley can act to improve itself.
12%
ADDICTION AND FREE WILL ARE OPPOSITES
Addiction gradually turns you into a zombie. Zombies don’t have free will. Once again, this result isn’t total but statistical. You become more like a zombie, more of the time, than you otherwise would be.
12%
the problem isn’t behavior modification in itself. The problem is relentless, robotic, ultimately meaningless behavior modification in the service of unseen manipulators and uncaring algorithms.
12%
Hypnosis might be therapeutic so long as you trust your hypnotist, but who would trust a hypnotist who is working for unknown third parties? Who? Apparently billions of people.
13%
Tech company lawyers have testified under oath that the companies couldn’t have known when Russian intelligence services sought to disrupt elections or foment divisions to weaken societies,
13%
social media has been successfully deployed to disrupt societies,19 and we know that the price to do so is remarkably low. We know that relevant companies take in an astounding amount of money and that they don’t always know who their customers are. Therefore, there are likely to be actors manipulating us—manipulating you—who have not been revealed. To free yourself, to be more authentic, to be less addicted, to be less manipulated, to be less paranoid … for all these marvelous reasons, delete your accounts.
13%
The problem isn’t the smartphone, as suggested by a flood of articles with titles like “Has the Smartphone Destroyed a Generation?”1 The problem isn’t the internet, which is also routinely accused of ruining the world.2 Something is ruining the world, but it isn’t that we’re connecting with people at a distance using bits, or that we’re staring into little glowing screens. To be sure, you can overdo staring at the little screen,3 just as you can overdo a lot of things, but that’s not an existential problem for our species. There is one particular high-tech thing, however, that is toxic even in …
13%
The problem is in part that we are all carrying around devices that are suitable for mass behavior modification.
14%
The problem is not only that users are crammed into online environments that can bring out the worst in us. It’s not only that so much power has concentrated into a tiny number of hands that control giant cloud computers.
14%
The problem occurs when all the phenomena I’ve just described are driven by a business model in which the incentive is to find customers ready to pay to modify someone else’s behavior.
14%
If we could just get rid of the deleterious business model, then the underlying technology might not be so bad.
14%
Deleting your accounts now will improve the chances that you’ll have access to better experiences in the future.
14%
When it became undeniable that lead was harmful, no one declared that houses should never be painted again. Instead, after pressure and legislation, lead-free paints became the new standard.6 Smart people simply waited to buy paint until there was a safe version on sale. Similarly, smart people should delete their accounts until nontoxic varieties are available.
15%
“Behaviors of Users Modified, and Made into an Empire for Rent”? BUMMER.
15%
Even at their best, BUMMER algorithms can only calculate the chances that a person will act in a particular way. But what might be only a chance for each person approaches being a certainty on the average for large numbers of people. The overall population can be affected with greater predictability than can any single person.
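[Editor's sketch] A small simulation (my own illustration, with invented numbers) makes the statistical point concrete: a nudge that barely changes any one person's odds of acting still produces a reliable, predictable shift across a large population.

```python
import random

def count_actions(n_people: int, p_act: float) -> int:
    """How many of n_people act, if each acts independently with probability p_act."""
    return sum(random.random() < p_act for _ in range(n_people))

n = 1_000_000
baseline, nudged = 0.050, 0.055    # assumed: a half-point shift per person

print(count_actions(n, baseline))  # roughly 50,000 people act
print(count_actions(n, nudged))    # roughly 55,000 act: ~5,000 extra actions that
                                   # are predictable in aggregate, even though no
                                   # individual can be said to have been compelled
```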
15%
Since BUMMER’s influence is statistical, the menace is a little like climate change. You can’t say climate change is responsible for a particular storm, flood, or drought, but you can say it changes the odds that they’ll happen.
15%
Like climate change, BUMMER will lead us into hell if we don’t self-correct.
15%
A is for Attention Acquisition leading to Asshole supremacy
B is for Butting into everyone’s lives
C is for Cramming content down people’s throats
D is for Directing people’s behaviors in the sneakiest way possible
E is for Earning money from letting the worst assholes secretly screw with everyone else
F is for Fake mobs and Faker society
16%
Ordinary people are brought together in a setting in which the main—or often the only—reward that’s available is attention. They can’t reasonably expect to earn money, for instance. Ordinary users can gain only fake power and wealth, not real power or wealth. So mind games become dominant.
16%
If you’re reading this on an electronic device, for instance, there’s a good chance an algorithm is keeping a record of data such as how fast you read or when you take a break to check something else.
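[Editor's sketch] To make that concrete, here is a purely hypothetical sketch of the kind of passive reading telemetry the highlight describes; the class, field names, and batching are my assumptions, not any e-reader vendor's documented API.

```python
# Hypothetical reading-telemetry recorder: logs how long each page stays open.
import json
import time

class ReadingTelemetry:
    def __init__(self) -> None:
        self.events = []
        self._open = None  # (page, timestamp) of the page currently on screen

    def page_shown(self, page: int) -> None:
        self._open = (page, time.time())

    def page_left(self) -> None:
        if self._open is not None:
            page, t0 = self._open
            self.events.append({"page": page,
                                "seconds_on_page": round(time.time() - t0, 1)})
            self._open = None

    def flush(self) -> str:
        """Serialize the batch; a real app would upload this to its servers."""
        payload = json.dumps(self.events)
        self.events = []
        return payload
```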
17%
From the point of view of the algorithm, emotions, happiness, and brand loyalty are just different, but similar, signals to optimize.
17%
The algorithms are rarely interrogated, least of all by external or independent scientists, in part because it’s hard to understand why they work. They improve automatically, through feedback. One of the secrets of present-day Silicon Valley is that some people seem to be better than others at getting machine learning schemes to work, and no one understands why. The most mechanistic method of manipulating human behavior turns out to be a surprisingly intuitive art. Those who are good at massaging the latest algorithms become stars and earn spectacular salaries.