Kindle Notes & Highlights
by P.W. Singer
Read between October 9, 2018 - February 23, 2019
The engineers behind social media had specifically designed their platforms to be addictive. The brain fires off tiny bursts of dopamine as a user posts a message and it receives reactions from others, trapping the brain in a cycle of posts, “likes,” retweets, and “shares.”
Diplomacy has become less private and policy-oriented and more public and performative.
War is political. And politics will always lie at the heart of violent conflict, the two inherently linked.
Attacking an adversary’s most important center of gravity—the spirit of its people—no longer requires massive bombing runs or reams of propaganda. All it takes is a smartphone and a few idle seconds. And anyone can do it.
From the world’s most powerful nations to the pettiest flame war combatants, all of today’s fighters have turned social media into a weapon in their own national and personal wars, which often overlap.
Narrative, emotion, authenticity, community, and inundation are the most effective tools of online battles, and their mastery guides the efforts of most successful information warriors.
First, the internet has left adolescence.
Second, the internet has become a battlefield. As integral as the internet has become to business and social life, it is now equally indispensable to militaries and governments, authoritarians and activists, and spies and soldiers.
Third, this battlefield changes how conflicts are fought. Social media has rendered secrets of any consequence essentially impossible to keep. Yet because virality can overwhelm truth, what is known can be reshaped. “Power” on this battlefield is thus measured not by physical strength or high-tech hardware, but by the command of attention. The result is a contest of psychological and algorithmic manipulation, fought through an endless churn of competing viral events.
Fourth, this battle changes what “war” means.
Fifth, and finally, we’re all part of this war.
Although nobody knew it at the time, the introduction of the iPhone also marked a moment of destruction. Family dinners, vacations, awkward elevator conversations, and even basic notions of privacy—all would soon be endangered by the glossy black rectangle Jobs held triumphantly in his hand.
Before almost anyone realized it, mobile tech, carefully policed app stores, and corporate consolidation had effected another massive change in the internet—who controlled it.
At its core, crowdsourcing is about redistributing power—vesting the many with a degree of influence once reserved for the few.
Does a retweet actually mean endorsement? For Dion Nissenbaum, the answer to this question landed him in a Turkish prison.
Humans continually test their beliefs against those of the perceived majority and often quietly moderate their most extreme positions in order to get along better with society as a whole. By creating an atmosphere in which certain views are stigmatized, governments are able to shape what the majority opinion appears to be, which helps steer the direction of actual majority opinion.
Yet the web has also given authoritarians a tool that has never before existed. In a networked world, they can extend their reach across borders to influence the citizens of other nations just as easily as their own.
Put simply, people like to be right; they hate to be proven wrong. In the 1960s, an English psychologist isolated this phenomenon and put a name to it: “confirmation bias.”
Across the board, just one-tenth of professional media coverage focused on the 2016 presidential candidates’ actual policy positions. From the start of the year to the last week before the vote, the nightly news broadcasts of the “big three” networks (ABC, CBS, and NBC) devoted a total of just thirty-two minutes to examining the actual policy issues to be decided in the 2016 election!
Human minds are wired to seek and create narrative. Every moment of the day, our brains are analyzing new events.
During the 2016 election, Carnegie Mellon University researchers studied and ranked the complexity of the candidates’ language (giving it what is known as a Flesch-Kincaid score). They found that Trump’s vocabulary measured at the lowest level of all the candidates, comprehensible to someone with a fifth-grade education.
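The Flesch-Kincaid grade-level score mentioned above is a standard readability formula (not a quote from the book); it weights average sentence length against average syllables per word:

Grade level = 0.39 × (total words / total sentences) + 11.8 × (total syllables / total words) - 15.59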
In each case, recruits to extremist causes are lured by a warmth and camaraderie that seems lacking in their own lonely lives. In each case, such recruits build communities that attract people from across the world but that show almost no diversity of thought.
One data scientist found that in the twenty-four hours that followed Donald Trump’s election night win of November 8, 2016, the word “fuck” appeared nearly 8 million times on Twitter.
Importantly, Trump’s larger follower pool was made up of not just real-world voters, but—as we’ve discussed previously—a cavalcade of bots and sockpuppet accounts from around the world that amplified his every message and consequently expanded his base of support.
In 2006, a young MIT postgraduate named Jonah Peretti cofounded a “viral lab.” Peretti’s intention was to understand what content took off and what didn’t. Within a decade, the spinout company called BuzzFeed would grow to become a billion-dollar network with hundreds of employees and offices scattered around the world.
The lesson was clear: Not only did modern war require a well-planned military campaign; it required a viral marketing campaign as well.
Today, the battles between Israelis and Palestinians continue, both in the occupied territories and online. Yet they are only one tiny front in a world of wars. The social media accounts of every military organization, diplomatic envoy, world leader, soldier, and civilian exist in the same digital milieu.
Every tweet or public statement, in other words, is a new front of LikeWar waiting to happen.
Whether the conflict is a civil war echoing across YouTube, a dispute over missile tests that culminates in one leader tweeting threats at another, a swaggering Facebook argument between gangs, or just a celebrity flame war, all of these socially mediated conflicts are overtly theatrical.
The internet makes it even easier to strike and then prolong the agony. Social media algorithms work by drawing attention to content that trends on their networks, even (and especially) when people are outraged by it. The result is the virtual equivalent of a grease fire, where widespread condemnation of something ensures that new groups of users see it and condemn it in turn. Because virality is incompatible with complexity, as content trends, any context and details are quickly stripped away. All that remains is the controversy itself, spread unwittingly by people who feel the need to “weigh in.”
The thread that runs through all these strange internet skirmishes is that they take place simultaneously, in the same space.
Although Facebook engineers were essentially putting a drug in users’ pockets, it wasn’t their—or anyone’s—job to consider the potential side effects.
Ultimately, the greatest challenge that confronts these social media giants has nothing to do with software code. It is a problem of corporate incentives, clashing cultures, and a historic revolution that has left both politics and Silicon Valley reeling. It is a problem of entrusting carefree engineers uninterested in politics with grave, nation-sized political responsibilities.
And although this is a problem with endless dimensions, at its heart it has always revolved around the same three questions: Should these companies restrict the information that passes through their servers? What should they restrict? And—most important for the future of both social media and the world—how should they do it?
The LikeWars of tomorrow will be fought by highly intelligent, inscrutable algorithms that will speak convincingly of things that never happened, producing “proof” that doesn’t really exist. They’ll seed falsehoods across the social media landscape with an intensity and volume that will make the current state of affairs look quaint.
Regardless of how old they are, humans as a species are uniquely ill-equipped to handle both the instantaneity and the immensity of information that defines the social media age.
For all the sense of flux, the modern information environment is becoming stable.
The internet is a battlefield. Like every other technology before it, the internet is not a harbinger of peace and understanding. Instead, it’s a platform for achieving the goals of whichever actor manipulates it most effectively.
The best and worst aspects of human nature duel over what truly matters most online: our attention and engagement.
This battlefield changes how we must think about information itself. If something happens, we must assume that there’s likely a digital record of it—an image, video, or errant tweet—that will surface seconds or years from now. However, an event only carries power if people also believe that it happened. The nature of this process means that a manufactured event can have real power, while a demonstrably true event can be rendered irrelevant.
War and politics have never been so intertwined. In cyberspace, the means by which the political or military aspects of this competition are “won” are essentially identical. As a result, politics has taken on elements of information warfare, while violent conflict is increasingly influenced by the tug-of-war for online opinion.
We’re all part of the battle. We are surrounded by countless information struggles—some apparent, some invisible—all of which seek to alter our perceptions of the world. Whatever we notice, whatever we “like,” whatever we share, becomes the next salvo. In this new war of wars, taking place on the network of networks, there is no neutral ground.
Yet recognizing the new truths of the modern information environment and the eternal aspects of politics and war doesn’t mean admitting defeat. Rather, it allows us to hone our focus and channel our energies into measures that can accomplish the most tangible good.
Accordingly, information literacy is no longer merely an education issue but a national security imperative. Indeed, given how early children’s thought patterns develop and their use of online platforms begins, the process cannot start early enough.
Like it or not, social media now plays a foundational role in public and private life alike; it can’t be un-invented or simply set aside.
Instead, if we want to stop being manipulated, we must change how we navigate the new media environment. In our daily lives, all of us must recognize that the intent of most online content is to subtly influence and manipulate.
One of the underlying themes of Plato’s cave is that power turns on perception and choice. It shows that if people are unwilling to contemplate the world around them in its actuality, they can be easily manipulated. Yet they have only themselves to blame.
In this new world, the same basic law applies to us all: You are now what you share. And through what you choose, you share who you truly are.