Kindle Notes & Highlights
The on-screen characters behave predictably, placed within a universe where their behavior is predetermined (Titanic is essentially a three-hour flashback). They exist to see their inescapable doom through to completion. Their only job is to go down with the ship.
In Hollywood, Star Wars superfans started living in tents on the sidewalk outside of movie theaters, lining up for the chance to buy tickets six weeks before opening night. This was even stupider than it sounds: At the time, no theater chain in the country had definitively secured the rights to show The Phantom Menace. It was possible—and somehow unsurprising—that people were living on the street in order to buy tickets for a movie that might not even be available.
To some, Jar Jar epitomized the coldest view of George Lucas as an auteur—a technical taskmaster who preferred designing actors on a computer so that he’d never have to confront living people with actual feelings.
Lucas had tried pretty goddamn hard to satisfy an entire generation of strangers who likely wouldn’t have been satisfied by anything he delivered. Did such a mean-spirited categorization bother him? Maybe. But not really. “I’m sorry if they don’t like it,” said Lucas. “They should go back and see The Matrix.”
It occurred without any warning, in a city assumed to have no political significance (a detail that amplified the universality of the fear).
He expressed no remorse, once telling a journalist that his coming execution only meant the final score was “168 to 1.”
Network anchorwoman Connie Chung said, “A U.S. government source has told CBS News that it has Middle East terrorism written all over it.”
When arrested, he was wearing a T-shirt that said, “Sic semper tyrannis,” the Latin phrase supposedly exclaimed by John Wilkes Booth after he’d assassinated Abraham Lincoln.
For someone like McVeigh—and everyone else, really—the experience of following the events in Waco was a creative process. It was like a TV series scripted by writers who’d run out of plotlines.
For most of the 51-day encounter, nothing in Waco was happening: There were unseen people inside the compound and there were militarized federal agents stationed at the perimeter of the encampment. We watched people watching people.
He watched tanks being driven through the walls of compound residences while a loudspeaker repeatedly broadcast the phrase “This is not an assault. This is not an assault.” Even in 1993, that level of ironic cognitive dissonance was too much.
Thomas exited the room after his initial appearance and did not return until Hill was finished. He denied everything Hill had said, while adding that he hadn't listened to one word of her testimony. Then he dropped the rhetorical equivalent of an atomic bomb.
Had Anita Hill been white, the proceedings would have adopted a classic racial tension (and that would have almost certainly hurt Thomas). But because both parties were Black (and because Thomas had used phrases like “high-tech lynching” in his defense), it presented a Sophie’s Choice for sympathetic liberals: Was this mostly about racism or was this mostly about sexism?
The very first sexual harassment case in U.S. history (under Title VII of the Civil Rights Act) had happened just fifteen years before this hearing.
A poll in USA Today showed Hill’s support among women was only around 26 percent.
Which is not to suggest no one believed Hill: Both The New York Times and the Los Angeles Times took editorial stands against Thomas’s confirmation. There was an entire episode of the CBS sitcom Designing Women about the hearings that generally sided with Hill (though one female character did side with Thomas). The 1992 song “Youth Against Fascism” by Sonic Youth includes the line “I believe Anita Hill,” which is not exactly a subtle expression of support.
Was it an early example of that perplexing nineties paradox where institutions were viewed cynically while institutional figures were believed?
Forty years of network programming had trained people to associate the performance of emotion with the essentialism of truth, and Thomas had been much more emotional than Hill. He seemed angry, sad, confused, and uncompromising. She just made a good argument, which—on television—is never enough.
It is a hinge moment in U.S. media history, ostensibly for its effect on race and celebrity but mostly for the way it combined tragedy and stupidity on a scope and scale that would foretell America’s deterioration into a superpower that was also a failed state.
There was a long history of Simpson’s physically abusing Nicole, once prompting her to directly tell police, “He’s going to kill me!”
It was a math equation: The fact that he stabbed two people to death had to be weighed against the history of racism in America.
Once you know the outcome of the chase, the actions leading up to that finale become not just boring but borderline painful. One cannot reconnect with the feeling that this event was ever captivating. The newscasters compulsively repeat the same phrases (some version of “What you are seeing right now is unbelievable”), speculate on minor details (such as what off-ramp the vehicle might take), and occasionally say nothing at all for long stretches of time.
Knowing what is now known, it’s hard to overlook how limited the potential outcomes really were—the Bronco could stop or the Bronco could keep going.
The detail always noted in remembrances of the Bronco chase is the throngs of bystanders cheering for Simpson as the car rolled down the freeway, congregating on overpasses and holding makeshift cardboard signs proclaiming, "The Juice Is Loose." It seemed perverse then and still seems perverse now. Yet this can also be understood as the primordial impulse of what would eventually drive the mechanism of social media: the desire of uninformed people to be involved with the news, broadcasting their support for a homicidal maniac not because they liked him, but because it was exhilarating to . . .
When a child in The Matrix bends a spoon with his mind, the child has done something surreal; when a child walks into a school cafeteria and shoots his classmates, he is doing something utterly and unspeakably genuine.
Columbine High was nowhere close to being the first American school shooting. Less than a year before Columbine, a mentally ill fifteen-year-old killed two of his classmates and wounded twenty-five others at Thurston High School in Springfield, Oregon. The history of such acts is disturbingly long, dating back to the dawn of public education in the New World.
It was the epitome of that three-phase creative process: the disorder and guesswork of the live event, the subsequent seventy-two hours of random speculation and false explanation, and ten years of debunking all the incorrect conjecture about what had motivated the killers to do what they did.
The persistence of these fabrications can be mainly attributed to a communal unwillingness to admit that there was no rational explanation behind this attack. Harris was a full-on psychopath who aspired to replicate the work of Timothy McVeigh. Klebold was (at a minimum) depressed and suicidal.
The final entry in the Harris diary, dated April 3, does indicate how insecurity and loneliness played a role in his desire to destroy (“I hate you people for leaving me out of so many fun things . . . you had my phone #”).
The artist most directly blamed for the shooting was Marilyn Manson, a knowingly controversial shock rocker surging in popularity (his most recent album, Mechanical Animals, had debuted at number 1 on the Billboard charts). The fact that Harris and Klebold were not fans of Marilyn Manson did not seem to matter (they preferred the German industrial group Rammstein).
Manson replied, “I wouldn’t say a single word to them. I would listen to what they have to say. That’s what no one did.” It was, to Manson’s credit, a generous reaction to an event he had nothing to do with (and that could have ended his career). Yet his words also felt a little like the last scene from the worst real-life After School Special the country had ever experienced: “What about the children? What are the children feeling?”
Alan Greenspan told you to feel less. He never said those exact words, but that was the tip of his intellectual spear. Greenspan was something that had never existed before and will likely never exist again: a rock star Federal Reserve chairman.
Greenspan pointedly dropped the phrase “irrational exuberance” into a 1996 speech and the worldwide stock market immediately tumbled.
The foundation of his ideology was initially grounded in two philosophies: (a) the notion that only verifiable facts are worthy of consideration, and (b) Ayn Rand’s Objectivist theory, promoting the idea that society would be better served if everyone always acted in their own self-interest. To say these theories are unpopular with progressives is a little like saying nuclear power is unpopular with people who owned hotels outside Chernobyl.
“I cannot listen to other people blaming their mothers,” the talk show host said in 1994. “I have to move on. We’re not gonna book a show where someone is talking about their victimization.”
By 1993, she’s the genre’s powerbroker, asking Michael Jackson if he’s a virgin in front of a television audience of 90 million (his response: “I’m a gentleman”).
In 2000, she creates her own monthly magazine and appears on every cover for twenty consecutive years.
Like other women, she hates being fat.
There was a sense of inevitability to the Weasel’s ascension, as if this thing no one had asked for was obviously the culmination of what pop culture had been careening toward for half a generation.
Celebrity simply connotes a person who is famous, and it’s not really possible for any other American to be more famous than the sitting president for a sustained period of time. It’s the ultimate form of celebrity, since it’s the only version of that designation that automatically interlocks with history.
There will be a time, in the not-too-distant future, when almost no one will remember that Robert Redford was the biggest box office star of 1975, or that 1975 saw the release of Born to Run. But there will always be a rough awareness that Gerald Ford was president in 1975, even though he was never elected and achieved almost nothing.
A president is the only celebrity remembered out of civic obligation. And that, usually, works to a president’s advantage.
Bad policies and political betrayals stay tethered to the past while the man who made them continues to live, humanized by the rudimentary act of staying alive.
Which is why the legacy of Bill Clinton is so difficult to elucidate to those who missed his tenure: He’s the rare example of a polarizing ex-president who saw the anger against him fade, only to have it resurface and spike upward within his own lifetime, often for the same reasons that made people like him originally.
Clinton was the last transcendent political figure of an era no one realized was ending.
the numerous other women who accused him of pursuing unwanted (or consensual-but-extramarital) sexual encounters, a number stretching into the double digits.
How could a married forty-nine-year-old “liberal” president (a) chronically seduce an unpaid subordinate less than half his age, (b) receive nonreciprocal oral sex inside the Oval Office, (c) get caught, (d) lie about it, (e) never directly apologize to the involved woman, and (f) still experience his highest presidential approval rating immediately after being impeached for lying under oath about the nature of that sexual relationship?