Kindle Notes & Highlights
by P.W. Singer
Read between July 2, 2020 and January 22, 2021
published a series of especially witty letters by
Exercise apps have inadvertently revealed everything from the movements of a murderer committing his crime to the location of a secret CIA “black site” facility in the Middle East. (A heat map made from tracing agents’ daily jogs around the perimeter of their base provided a near-perfect outline of one installation.)
For instance, Ashley Madison is a social network that links people thinking of cheating on their spouses. Its algorithms mine social media to detect when business travelers
arrive at a hotel (and thus are more likely to stray from their marriages).
In 2016, one victim of an airplane hijacking scored the ultimate millennial coup: taking a selfie with his hijacker.
Luke Stark, a researcher in the sociology department at Dartmouth College, explains that accumulated online postings provide “something much more akin to medical data or psychiatric data.” Even the most trivial details can be unexpectedly revealing. Consistent use of black-and-white Instagram filters and single-face photos,
for instance, has proven a fairly good identifier of clinical depression.
As The Economist explained, this was, in fact, one of the key factors that fueled the years-long Syrian civil war. Fighters sourced needed funds by learning “to crowdfund their war using Instagram, Facebook and YouTube. In exchange for a sense of what the war was really like, the fighters asked for donations via PayPal. In effect, they sold their war online.”
They’d found their smoking gun.
With today’s OSINT, anyone can gather and process intelligence in a way that would have been difficult or impossible for even the CIA or KGB a generation ago. One OSINT analyst explained to us just how simple it can be, through a story of his pursuit of Iranian
arms smuggling. He began by searching for some common weapons-related words in Farsi, courtesy of Google Translate. Soon he discovered an article that profiled a young Iranian CEO, who had launched a company specializing in drones. The journalist also found a video online that showed the CEO’s face. This allowed him to locate the CEO in a registry of Iranian aerospace professionals, which, in turn, yielded an email address as well as phone and fax numbers. Translating the CEO’s name into Farsi and searching Facebook (cross-referencing it with the email address), he quickly found the CEO’s
...more
who liked to wear bunny ears and a six-inch skirt. Her own posts, in turn, revealed her to be an alumna of the “International Sex School.” He ended the search there. In just one hour of digging, he’d been able to compile a list of leads that might once have taken an intelligence agency months to find, as well as to tease out a fertile opportunity for blackmail. Although he’d begun his work as a...
This highlight has been truncated due to consecutive passage length restrictions.
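The pivot chain the analyst describes (article to face, face to registry, registry to email, email to social profile) is at heart a series of key lookups across independent data sources, where each answer becomes the query for the next source. A minimal sketch of that pattern, using entirely fabricated data and hypothetical source names of our own invention:

```python
# Toy model of OSINT "pivoting": each lookup yields the key for the
# next source. All names, contacts, and sources below are fabricated.
articles = {"drone startup profile": {"ceo": "J. Doe"}}          # open press
registry = {"J. Doe": {"email": "jdoe@example.com",
                       "phone": "+00-555-0100"}}                 # trade registry
social   = {"jdoe@example.com": {"handle": "@jdoe",
                                 "friends": ["@associate1"]}}    # social network

def pivot_chain(article_title):
    """Follow identifiers from an article to a social-media profile,
    recording each hop so the evidence trail stays auditable."""
    trail = []
    ceo = articles[article_title]["ceo"]
    trail.append(("article", article_title, ceo))
    email = registry[ceo]["email"]
    trail.append(("registry", ceo, email))
    handle = social[email]["handle"]
    trail.append(("social", email, handle))
    return trail

for source, key, found in pivot_chain("drone startup profile"):
    print(f"{source}: {key} -> {found}")
```

The point of keeping the trail, rather than just the final answer, is the same one the story illustrates: each hop is only as strong as the cross-reference that produced it.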
Indeed, both technology and international law crossed a new frontier in 2017, when the International Criminal Court indicted Mahmoud Al-Werfalli for a series of mass killings in Libya. He was the first person ever to be charged with war crimes based solely on evidence found on social media.
Predata is a small company founded by James Shinn, a former CIA agent. Shinn modeled his unique service on sabermetrics, the statistics-driven baseball analysis method popularized by Michael Lewis’s book Moneyball.
His firm signed a shady $530,000 deal with a company linked to the Turkish government, which became doubly
questionable when Flynn failed to register as a foreign lobbying agent. He accepted $45,000 to speak at a glitzy Russian government–sponsored gala in Moscow. Photos of the ex-general sitting next to Vladimir Putin at the dinner shocked many in the U.S. security establishment.
To censor a term, they’d say, was to “harmonize” it. Eventually, the censors caught on and banned the use of the word “harmony.”
In 2017, the lovable bear Winnie-the-Pooh was disappeared from the Chinese internet. Censors figured out “Pooh” was a reference to President Xi, as he walks with a similar waddle.
“The Kremlin’s idea is to own all forms of political discourse, to not let any independent movements develop outside its walls,” writes Peter Pomerantsev, author of Nothing Is True and Everything Is Possible. “Moscow can feel like an oligarchy in the morning and a democracy in the afternoon, a monarchy for dinner and a totalitarian state by bedtime.”
The point of this barrage was to instill doubt—to make people wonder how, with so many conflicting stories, one could be more “right” than any other.
Indeed, a full three years after the flight MH17 tragedy, we tested the strength of the Russian disinformation machine for ourselves by setting what’s known as a “honeypot.”
The term traditionally referred to a lure—in fiction, usually a sexy female agent—which enemy operatives couldn’t resist. Think Vesper Lynd’s seduction of James Bond in Casino Royale, or her real-life counterpart, Anna Chapman, the redheaded KGB agent who worked undercover in New York and then, after she was caught by the FBI and deported back to Russia, began a second career as a Facebook lingerie model.
As one 17-year-old girl explained at the nightclub, watching the teen tycoons celebrate from her perch at the bar, “Since fake news started, girls are more interested in geeks than macho guys.”
In numerous studies, across numerous countries,
involving millions of people, researchers have discovered a cardinal rule that explains how information disseminates across the internet, as well as how it shapes our politics, media, and wars. The best predictor is not accuracy or even content; it is the number of friends who share the content first. They are more likely to believe what it says—and then to share it with others who, in turn, will believe what they say. It is all about us, or rather our love of ourselves and people like us. This phenomenon is called “homophily,” meaning “love of the same.” Homophily is what makes humans social
...more
It didn’t matter if the story was untrue; it didn’t even matter if the story was preceded by a warning that it might be fake. What counted most was familiarity. The more often you hear a claim, the less likely you are to assess it critically. And the longer you linger in a particular
community, the more its claims will be repeated until they become truisms—even if they remain the opposite of the truth.
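The rule described here, that sharing is driven not by a story's accuracy but by how many of one's friends have already shared it, is essentially a threshold cascade model. A toy sketch of that dynamic (our own illustration, not from the book): each user reposts a story once at least two friends have, with truth playing no role at all.

```python
def simulate_cascade(friends, seeds, threshold=2, rounds=10):
    """Toy homophily cascade: a user shares a story once at least
    `threshold` of their friends have shared it; the story's
    accuracy never enters the model."""
    shared = set(seeds)
    for _ in range(rounds):
        newly = {user for user, flist in friends.items()
                 if user not in shared
                 and sum(f in shared for f in flist) >= threshold}
        if not newly:
            break
        shared |= newly
    return shared

# A tight four-person cluster (a-d) linked by one bridge (d-e) to a
# second cluster (f-h).
friends = {
    "a": ["b", "c", "d"], "b": ["a", "c", "d"],
    "c": ["a", "b", "d"], "d": ["a", "b", "c", "e"],
    "e": ["d", "f"],      "f": ["e", "g", "h"],
    "g": ["f", "h"],      "h": ["f", "g"],
}

# Seeding two members saturates their whole cluster, but the story
# stalls at the bridge: "e" never sees two sharing friends.
print(sorted(simulate_cascade(friends, {"a", "b"})))  # ['a', 'b', 'c', 'd']
```

The stall at the bridge node is the flip side of homophily: the same tight clustering that makes a claim feel unanimously endorsed inside a community is what keeps it circulating there, repeated until it becomes a truism.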
In California, the percentage of parents applying a “personal belief exception” to avoid vaccinating their kindergartners quadrupled between 2000 and 2013, and disease transmittal rates among kids soared as a result. Cases of childhood illnesses like whooping cough reached a sixty-year high, while the Disneyland resort was rocked by an outbreak of measles that sickened 147 children. Fighting an infectious army of digital conspiracy theorists, the State of California eventually gave up arguing and passed a law requiring kindergarten vaccinations, which only provided more conspiracy theory
...more
U.S. Army colonel turned historian Robert Bateman summarizes it pointedly: “Once, every village had an idiot. It took the internet to bring them all together.”
Nor has the problem been limited to elections. Perhaps the most worrisome example occurred on Christmas Eve 2016, when Pakistani defense minister Khawaja Asif read a false online report that Israel was threatening to attack his country if it intervened in Syria. “We will destroy them with a nuclear attack,” the report had quoted a retired Israeli defense minister as saying. Asif responded with a real threat of his own, tweeting about Pakistan’s willingness to retaliate with nuclear weapons against Israel. Fortunately, Christmas was saved when the original report was debunked before the crisis
...more
When Syria began to disintegrate into civil war in 2011, the Assad regime used Twitter bots to flood its opponents’ hashtags with random soccer statistics. Those searching for vital information to fight the regime were instead greeted with a wall of nonsense. At the same time, the #Syria news hashtag was flooded with beautiful landscape images.
One was funded by the Koch brothers and the other by the group that had organized the “Swift Boat” negative advertising campaign
that had sunk the 2004 presidential bid of Democratic candidate John Kerry. Suddenly, bots popped up everywhere, all fighting for Brown. Fake accounts across Facebook and Twitter trumpeted Brown’s name as often as possible, seeking to manipulate search results. Most novel was what was then called a “Twitterbomb.” Twitter users interested in the election began to receive automated replies supporting Brown. Importantly, these solicitations hit users beyond Massachusetts, greatly enriching Brown’s coffers. When Brown became the first Republican to win a Massachusetts Senate seat since 1952,
...more
Behind this massive bot army lay a bizarre mix of campaign operatives, true believers, and some who just wanted to watch the world burn.
The most infamous went by the online handle “MicroChip.” A freelance software
developer, MicroChip claimed to have become a believer in the alt-right after the 2015 Paris terrorist attacks. With his tech background, he realized he could manipulate Twitter’s programming applications, initially testing such “anti-PC” hashtags as #Raperefugees to see what he could drive viral. By the time of the 2016 election, he labored twelve hours at a time, popping Adderall to stay focused as he pumped out pro-Trump propaganda. Described by a Republican strategist as the “Trumpbot overlord,” MicroChip specialized in using bots to launch hashtags (#TrumpTrain, #cruzsexscandal,
...more
The hateful fakes were mimicking real people, but then real people began to mimic the hateful fakes.
“Terrorism is theater,” declared RAND Corporation analyst Brian Jenkins in a 1974 report that became one of terrorism’s foundational studies. Command enough attention and it didn’t matter how weak or strong you were: you could bend populations to your will and cow the most powerful adversaries into submission. This simple principle has guided terrorists for millennia. Whether in ancient town squares, in colonial wars, or via ISIS’s carefully edited beheadings, the goal has always been the same: to send a message.
Not by coincidence, the field of study that seeks to counter this process of radicalization, known as countering violent extremism (CVE), also focuses on the powers of community-building.
“Only peer-to-peer relations can change minds,” she concluded. The only way to prevent radicalization was to assemble a crowd of authentic voices to fight back. Pandith determined that social media would be the key battleground in this fight.
In an interview shortly after the election, Trump reflected on how he had won. “I think that social media has more power than the money they spent, and I think . . . I proved that.”
In mid-2016, Twitter fired the first salvo, kicking the Breitbart writer and far-right provocateur Milo Yiannopoulos off its service. Having won fame with his race-baiting, Yiannopoulos had finally crossed the line when he organized a campaign of online harassment targeting an African American actress for the
crime of daring to star in a Ghostbusters remake. While Yiannopoulos would insist that he’d been wrongly smeared as a bigot—that he’d “just been trolling”—the evidence suggested otherwise. A year later, when a trove of Yiannopoulos’s files leaked online, it was revealed that he used email passwords like “Kristallnacht” (a November 1938 attack on German Jews in which dozens were murdered) and “LongKnives1290” (a reference to both the Night of the Long Knives, a 1934 Nazi purge that solidified
Hitler’s rule, and the year in which Jews were banished from...
When Facebook announced in 2017 that it was hiring 250 more people to review advertising on the platform, New York University business professor Scott Galloway rightly described it as “pissing in the ocean.”
In 2016, Facebook was reported to be developing such a “smart” censorship system in a bid to allow it to expand into the massive Chinese market. This was an ugly echo of how Sun Microsystems and Cisco once conspired to build China’s Great Firewall.
In turn, its State Department failed to increase efforts to battle online terrorist propaganda and Russian disinformation, even when Congress allocated hundreds of millions of dollars for the purpose.
Accordingly, information literacy is no longer merely an education issue but a national security imperative. Indeed, given how early children’s thought patterns develop and their use of online platforms begins, the process cannot start early enough.
At younger ages, these include programs that focus on critical thinking skills, expose kids to false headlines, and encourage them to play with (and hence learn from) image-warping software. Nor should the education stop as students get older. As of 2017, at least a dozen universities offered courses targeting more advanced critical thinking in media consumption, including an aptly named one at the University
of Washington: “Calling Bullshit: Data Reasoning in a Digital World.” These few pilot programs point the way, but they also illustrate how far we have to go in making them more widely available.