…where the number of accounts is relatively small. There’s a game I like to play with people when we’re at the bar, especially if they’re educated and drunk. The game has no name, but the rules are simple: The player tries to answer as many of the following questions as possible, without getting one wrong, without using the same answer twice, and without looking at a phone.
The first question is, “Name any historical figure who was alive in the twenty-first century.” (No one has ever gotten this one wrong.) The second question is, “Name any historical figure who was alive in the twentieth century.” (No one has ever gotten this one wrong, either.) The third question is, “Name any historical figure who was alive in the nineteenth century.”
Here’s what I mean: When something fits into a lucid, logical continuum, it’s generally remembered for how it (a) reinterprets the entity that influenced its creation, and (b) provides influence for whatever comes next. Take something like skiffle music—a musical genre defined by what it added to early-twentieth-century jazz (rhythmic primitivism) and by those individuals later inspired by it (rock artists of the British Invasion, most notably the Beatles). We think about skiffle outside of itself, as one piece of a multidimensional puzzle. That won’t happen with television.
To attack True Detective or Lost or Twin Peaks as “unrealistic” is a willful misinterpretation of the intent.
We don’t need television to accurately depict literal life, because life can literally be found by stepping outside.
But Mad Men defines the difference between ancillary verisimilitude and premeditated reconstruction. Mad Men cannot show us what life was like in the sixties. Mad Men can only show how life in the sixties came to be interpreted in the twenty-first century.
…consumer, I’d argue the opposite. But right now, I’m focused on a different type of appreciation. I’m trying to think about TV as a dead medium—not as living art, but as art history (a process further convoluted by the ingrained reflex to never think about TV as “art,” even when it clearly is).
Nothing on TV looks faker than failed attempts at realism. A show like The Bachelor is instantly recognized (by pretty much everyone, including its intended audience) as a prefab version of how such events might theoretically play out in a distant actuality.
The protagonist in Entourage was supposed to be a version of Entourage producer Mark Wahlberg, had Wahlberg experienced Leonardo DiCaprio’s career. There’s a venture capitalist on Silicon Valley based (at least partially) on a melding of billionaire Mark Cuban and online entrepreneur Sean Parker.
“[The NFL] is completely disconnected to the consequences of the sport that they are engaged in . . . They are off on this nineteenth-century trajectory which is fundamentally out of touch with the rest of us.”
“I love football. Love it. Love it. I think it’s the last bastion of hope for toughness in America in men, in males.”
My existence is split into two unequal, asymmetrical halves. The first half was when I lived in North Dakota, where I was an interesting version of a normal person. That lasted twenty-six years. The second half started when I moved to New York, where I became an uninteresting version of an abnormal person.
I was single and devoid of prospects, though I don’t recall any feelings of loneliness; on at least three evenings, I sat on my balcony and watched a hedgehog eat apples, an experience more satisfying than going on dates and talking to other forlorn strangers about how dating is hard.
Ohio is a wonderful place to ponder the state of American democracy, because you’re constantly being reminded that America is where you are. Ohio is a scale model of the entire country, jammed into 43,000 square miles. Cleveland views itself as the intellectual East (its citizens believe they have a rivalry with Boston and unironically classify the banks of Lake Erie as the North Coast). Cincinnati is the actual South (they fly Confederate flags and eat weird food). Dayton is the Midwest. Toledo is Pittsburgh, before Pittsburgh was nice. Columbus is a low-altitude Denver, minus the New World…
This electoral phenomenon is widely known and endlessly cited, so living in Ohio during an election cycle is madness. It feels like the media is talking directly at you, all the time. Your vote is so (theoretically) valuable that you forget it’s (statistically) irrelevant.
…old, so I considered myself too mature to take Rage Against the Machine seriously (that seemed like something you did when you were nineteen) and too cool to like their music as music (that seemed like something you did when you were twenty-seven). But I was still dumb enough to trust Michael Moore, so I liked this video.
Most Americans did, as is illustrated by the fact that no one seemed particularly outraged when the Supreme Court upheld Bush’s victory, except for those performative armchair revolutionaries who express reflexive outrage over everything.
This, I grant, is no profound revelation: The world evolves, so perspectives evolve with it.
When I went on dates—and maybe this explains why I was single—I would always talk about this hedgehog, inevitably noting a platitude that often applies to politics. The clever fox knows many things, states the proverb, but the old hedgehog knows one big thing. “I finally understand what that means,” I’d tell the confused woman sitting across from me. “The old hedgehog knows that gravity applies to fruit.” This banter, I must admit, did not lead to any canoodling (although most women did laugh, and one literally said, “You sure know a lot about hedgehogs,” which I almost count as a common-law…
In a plain sense, the adage simply means that some people know a little about many subjects while other people know a lot about one subject.
“…the soft sciences, with archaeology and anthropology and those kinds of things. The good part about that change is that historians are much more diligent about facts than they used to be, and much more careful and much more quantified, and they’re likely to talk about things like radiocarbon dating. They sound more like archaeologists. But the downside is—when you’re talking about stories that involve human beings—there’s a lot of it that’s just not quantifiable.”
Storytelling’s relationship to history is a little like interviewing’s relationship to journalism: a flawed process without a better alternative. We are socially conditioned to understand the universe through storytelling, and—even if we weren’t—there’s neurological evidence that the left hemisphere of our brain automatically organizes information into an explainable, reassuring narrative.
We then run through the various problems with Reagan’s presidential tenure, namely the lowering of the top marginal income tax rate on the super-rich from 70 percent to 28 percent and (what Carlin considers) the myth of Reagan’s destruction of the Soviet Union. “The reason the Soviet Union fell was that it was a system designed on an early-twentieth-century model that could not incorporate the changes necessary for the late twentieth century.”
If people feel optimistic about where they live, details don’t matter. But here’s the thing—you need to have an active, living memory of Reagan for any of this to seem plausible.
Every few months, something happens in the culture that prompts people to believe America is doomed. Maybe a presidential candidate suggests the pyramids were built to store wheat; maybe Miley Cyrus licks someone’s face at the Video Music Awards; maybe a student at Yale insists her college is not supposed to be an intellectual space, based on a fear of hypothetical Halloween costumes.
The collapse of Rome has been something alarmists have loved and worried about since 1776, the year British historian Edward Gibbon published The History of the Decline and Fall of the Roman Empire. That was, probably coincidentally, the same year the US declared its independence.
Pointing out how it’s not true in practice is so easy it doesn’t even require examples; all you need to do is look at the socioeconomic experiences of American citizens from varying races and opposing genders. But it’s not even true with people whose experiences are roughly identical. Take any two white males raised in the same income bracket in the same section of the same city, and assume they receive the same treatment from law enforcement and financial institutions and prospective employers. They’re still not equal. One of these people will be smarter than the other. One will be more…
…which means it’s illusionary. That’s the problem. I sometimes wonder if the pillars of American political culture are really just a collection of shared illusions that will either (a) eventually be disbelieved or (b) collapse beneath the weight of their own unreality. And that would certainly be the end of everything (or at least something that will feel like everything to those who live through the…
The Western world (and the US in particular) has invested so much of its identity into the conception of democracy that we’re expected to unconditionally support anything that comes with it. Voting, for example. Everyone who wants to vote should absolutely do so, and I would never instruct anyone to do otherwise. But it’s bizarre how angry voters get at non-voters. “It’s your civic responsibility,” they will say. Yet the purpose of voting is to uphold a free society, so one might respond that a free society would not demand that people participate in an optional civic activity. “But your…
But I would traditionally counter that Washington’s One Big Thing mattered more, and it actually involved something he didn’t do: He declined the opportunity to become king, thus making the office of president more important than any person who would ever hold it. This, as it turns out, never really happened. There is no evidence that Washington was ever given the chance to become king, and—considering how much he and his peers despised the mere possibility of tyranny—it’s hard to imagine this offer was ever on the table.
The first moment someone calls for a revolution is usually the last moment I take them seriously.
The ultimate failure of the United States will probably not derive from the problems we see or the conflicts we wage. It will more likely derive from our uncompromising belief in the things we consider unimpeachable and idealized and beautiful. Because every strength is a weakness, if given enough time.
“My argument in The End of Science is that science is a victim of its own success,” he tells me from his home in Hoboken. “Science discovers certain things, and then it has to go on to the next thing. So we have heliocentrism and the discovery of gravity and the fundamental forces, atoms and electrons and all that shit, evolution, and DNA-based genetics. But then we get to the frontier of science, where there is still a lot left to discover. And some of those things we may never discover. And a lot of the things we are going to discover are just elaborations on what we discovered in the past.”
“The question is posed like this: ‘Will there be a time in our future when our current theories seem as dumb as Aristotle’s theories appear to us now?’”
Many of them have already been infected by postmodernism and believe that knowledge is socially constructed, and they believe we’ll have intellectual revolutions forever.
“By the time I finally finished writing The End of Science, I’d concluded that people don’t give a shit about science,” Horgan says. “They don’t give a shit about quantum mechanics or the Big Bang. As a mass society, our interest in those subjects is trivial. People are much more interested in making money, finding love, and attaining status and prestige. So I’m not really sure if a post-science world would be any different than the world of today.”
But what specifically appalled Horgan was Fukuyama’s assertion about how a problem-free society would operate. Fukuyama believed that once mankind eliminated all its problems, it would start waging wars against itself for no reason, almost out of boredom.
“It’s the belief that what has always been in the past must always be in the future. To me, that’s a foolish position.”
The moment machines become self-aware, they will try to destroy people. What’s latently disturbing about this plot device is the cynicism of the logic. Our assumption is that computers will only act rationally.
If the ensuing assumption is that human-built machines would immediately try to kill all the humans, it means that doing so must be the most rational decision possible.
It must also be noted that Kurzweil initially claimed this event was coming in 2028, so the inception of the Singularity might be a little like the release of Chinese Democracy.
Still, for the first twenty-five years of my life, the concept of intelligence was intimately connected to broad-spectrum memory. If I was having an argument with a much older person about the 1970 Kent State shootings, I’d generally have to defer to her analysis, based on the justifiable fact that she was alive when it occurred and I was not. My only alternative was to read a bunch of books (or maybe watch a documentary) about the shooting and consciously retain whatever I learned from that research, since I wouldn’t be able to easily access the data again.
These were the kind of people who subscribed to Ray Gun magazine and made a point of mentioning how they started watching Seinfeld when it was called The Seinfeld Chronicles.
These are consumers who self-identify as being the first person to know about something (often for the sake of coolness, but just as often because that’s the shit they’re legitimately into). It’s integral to their sensibility. And the rippling ramifications of that sensibility are huge.
“My prejudices are innumerable, and often idiotic. My aim is not to determine facts, but to function freely and pleasantly.”
He was quite willing to concede that his most intensely held opinions weren’t based on factual data, so trying to determine what the factual data actually was would only make him depressed. It’s a worldview that—even if expressed as sarcasm—would be extremely unpopular today. But it’s quietly become the most natural way to think about everything, due to one sweeping technological evolution: We now have immediate access to all possible facts. Which is almost the same as having none at all.
Now, the Civil War is the most critical event in American history, and race is the defining conflict of this country. It still feels very much alive, so it’s not surprising that teachers and historians want to think about it on disparate micro and macro levels, even if the realest answer is the simplest answer.
The other side argues that all time is happening at once. This is difficult to comprehend. But replace the word “time” with “history,” and that phenomenon can be visualized on the Internet. If we think about the trajectory of anything—art, science, sports, politics—not as a river but as an endless, shallow ocean, there is no place for collective wrongness. All feasible ideas and every possible narrative exist together, and each new societal generation can scoop out a bucket of whatever antecedent is necessary to support their contemporary conclusions.
We can’t unconditionally trust the motives of people we don’t know, so we project a heightened sense of security upon those we do, even if common sense suggests we should do the opposite. If 90 percent of life is inscrutable, we need to embrace the 10 percent that seems forthright, lest we feel like life is a cruel, unmanageable joke. This is the root of naïve realism. It’s not so much an intellectual failing as an emotional sanctuary from existential despair.
Whenever we mention the possibility of relocating to Portland to anyone who reads magazines or listens to NPR or lives in New York, we are now asked, “But aren’t you worried about the earthquake?” My standard response equates to some rambling version of “Kind of, but not really.” It’s not something I think about, except when I’m writing this book. Then I think about it a lot.