But What If We're Wrong?: Thinking about the Present as If It Were the Past
It’s a dissonance that creates the most unavoidable of intellectual paradoxes: When you ask smart people if they believe there are major ideas currently accepted by the culture at large that will eventually be proven false, they will say, “Well, of course. There must be. That phenomenon has been experienced by every generation who’s ever lived, since the dawn of human history.” Yet offer those same people a laundry list of contemporary ideas that might fit that description, and they’ll be tempted to reject them all. It is impossible to examine questions we refuse to ask. These are the big ...more
“Pompous, overbearing, self-indulgent, and insufferable. This is the worst book I’ve ever read,” wrote one dissatisfied customer in 2014. “Weak narrative, poor structure, incomplete plot threads, ¾ of the chapters are extraneous, and the author often confuses himself with the protagonist. One chapter is devoted to the fact that whales don’t have noses. Another is on the color white.” Interestingly, the only other purchase this person elected to review was a Hewlett-Packard printer that can also send faxes, which he awarded two stars. I can’t dispute this person’s distaste for Moby-Dick. I’m ...more
The straightforward definition of naïve realism doesn’t seem that outlandish: It’s a theory that suggests the world is exactly as it appears. Obviously, this viewpoint creates a lot of opportunity for colossal wrongness (e.g., “The sun appears to move across the sky, so the sun must be orbiting Earth”). But my personal characterization of naïve realism is wider and more insidious. I think it operates as the manifestation of two ingrained beliefs: “When considering any question, I must be rational and logical, to the point of dismissing any unverifiable data as preposterous,” and “When ...more
On those rare occasions when The Book of Predictions is referenced today, the angle is inevitably mocking: The most eye-catching predictions are always the idiotic ones. As it turns out, there has not been a murder in outer space committed by a jealous astronaut, which is what lawyer F. Lee Bailey predicted would occur in 1990 (and evidently struck Bailey as more plausible than the possibility of defending a jealous Hall of Fame running back for an earthbound murder in 1994). According to population expert Dr. Paul Ehrlich, we should currently be experiencing a dystopian dreamscape where ...more
Yet what is most instructive about The Book of Predictions is not the things that proved true. It’s the bad calculations that must have seemed totally justifiable—perhaps even conservative—at the time of publication. And the quality all these reasonable failures share is an inability to accept that the status quo is temporary. The Book of Predictions was released in 1980, so this mostly means a failure to imagine a world where the United States and the Soviet Union were not on the cusp of war. Virtually every thought about the future of global politics focuses on either (a) an impending ...more
Yet as recently as twenty years ago, this question still mattered; as a college student in the early nineties, I knew of several long-term romantic relationships that were severed simply because the involved parties attended different schools and could not afford to make long-distance calls, even once a week. In 1994, the idea of a sixty-minute phone call from Michigan to Texas costing less than mailing a physical letter the same distance was still unimaginable. Which is why no one imagined it in 1980, either.
Occam’s Razor is how a serious person considers the past. Unfortunately, it simply doesn’t work for the future. When you’re gazing into the haze of a distant tomorrow, everything is an assumption. Granted, some of those competing assumptions seem (or maybe feel) more reasonable than others. But we live in a starkly unreasonable world. The history of ideas is littered with more failures than successes. Retroactively, we all concede this. So in order to move forward, we’re forced to use a very different mind-set. For lack of a better term, we’ll just have to call it Klosterman’s Razor: the ...more
Art history almost never works that way. In fact, it often seems like our collective ability to recognize electrifying genius as it occurs paradoxically limits the likelihood of future populations certifying that genius as timeless. “What ages [poorly], it seems, are ideas that trend to the clever, the new, or the merely personal,” Saunders continues. “What gets dated, somehow, is that which is too ego inflected—that hasn’t been held up against the old wisdom, maybe, or just against some innate sense of truth, and rigorously, with a kind of self-abnegating fervor. Again and again some yahoo ...more
When The New York Times released its 2014 “100 Notable Books” list, several readers noticed how there were exactly twenty-five fiction books by men, twenty-five fiction books by women, twenty-five nonfiction books by men, and twenty-five nonfiction books by women. Do I have a problem with this? I have no problem with this. But it does reflect something telling about the modern criteria for quantifying art: Symmetrical representation sits at the center of the process. It’s an aesthetic priority. Granted, we’re dealing with a meaningless abstraction, anyway—the list is called “notable” (as ...more
Every rational person knows this symmetry was conscious, and that this specific result either (a) slightly invalidates the tangible value of the list, or (b) slightly elevates the intangible value of the list. (I suppose it’s also possible to hold both of those thoughts simultaneously.) In either case, one thing is absolutely clear: This is the direction in which canonical thinking is drifting. Díaz’s view, which once felt like an alternative perspective, is becoming the entrenched perspective. And when that happens, certain critical conclusions will no longer be possible.
This is not a criticism of identity politics (even though I know it will be taken that way), nor is it some attempt at diminishing the work of new writers who don’t culturally resemble the old writers (because all writing is subjective and all writers are subjectively valid). I’m not saying this progression is unfair, or that the new version of unfairness is remotely equivalent to the old version of unfairness. Such processes are never fair, ever, under any circumstances. This is just realpolitik reality: The reason something becomes retrospectively significant in a far-flung future is ...more
This is how the present must be considered whenever we try to think about it as the past: It must be analyzed through the values of a future that’s unwritten. Before we can argue that something we currently appreciate deserves inclusion in the world of tomorrow, we must build that future world within our mind. This is not easy (even with drugs). But it’s not even the hardest part. The hardest part is accepting that we’re building something with parts that don’t yet exist.
It’s flat-out impossible to speculate on the future without (a) consciously focusing on the most obvious aspects of what we already know, and (b) unconsciously excluding all the things we don’t have the intellectual potential to grasp. I can’t describe what will happen in one hundred years if my central thesis insists that the best guess is always the worst guess. I can’t reasonably argue that the most important writer of this era is (for example) a yet-to-be-identified Irish-Asian skoliosexual from Juárez, Mexico, who writes brilliantly about migrant cannibalism from an anti-union ...more
There will be no shadow history of the 2008 financial crisis or the 2014 New England Patriots’ “Deflategate” scandal, because every possible narrative and motive was discussed in public, in real time, across a mass audience, as the events transpired. Competing modes of discourse no longer “compete.” They coexist. And the same thing is happening in the arts. The diverse literary canon Díaz imagines is not something that will be reengineered retroactively. We won’t have to go back and reinsert marginalized writers who were ignored by the establishment, because the establishment is now a ...more
But here’s where we taste the insecure blood from Klosterman’s Razor: The mere fact that I can imagine this scenario forces me to assume that it won’t happen. It’s a reasonable conclusion to draw from the facts that presently exist, but the future is a teenage crackhead who makes shit up as he goes along. The uncomfortable, omnipresent reality within any conversation about representation is that the most underrepresented subcultures are the ones that don’t even enter into the conversation. They are, by definition, impossible to quantify. They are groups of people whom—right now, in the present ...more
There is a misguided belief—often promoted by creative writing programs—that producing fiction excessively tied to technology or popular culture cheapens the work and detracts from its value over time. If, for example, you create a plot twist that hinges on the use of an iPad, that story will (allegedly) become irrelevant once iPads are replaced by a new form of technology. If a character in your story is obsessed with watching Cheers reruns, the meaning of that obsession will (supposedly) evaporate once Cheers disappears from syndication. If your late-nineties novel is consumed with Monica ...more
When any novel is rediscovered and culturally elevated, part of the process is creative: The adoptive generation needs to be able to decide for themselves what the deeper theme is, and it needs to be something that wasn’t widely recognized by the preceding generation. In one hundred years, it’s possible that the contemporary novel best illustrating media alienation will be something like Cormac McCarthy’s The Road, even though nobody makes that connection now. The defining 9/11 novel may end up being Infinite Jest, even though it was written five years before the actual event and has very ...more
When someone argued rock was “dead” in 1968 or 1977 or 1994 or 2005, that individual was making an aesthetic argument, grounded in whatever that person assumed to be the compromised motives of the artists of the time (customarily built on the conviction that the current generation of musicians were more careerist in nature, thus detracting from the amount of raw emotion they were allegedly injecting into the music). The popularity of the rock genre is irrelevant to this accusation. People insisted rock was dead in the mid-1980s, the absolute commercial peak for guitar-driven music. Normal ...more
“Rock” can now signify anything, so it really signifies nothing; it’s more present, but less essential. It’s also shackled by its own formal limitations: Most rock songs are made with six strings and electricity, four thicker strings and electricity, and drums. The advent of the digital synthesizer opened the window of possibility in the 1980s, but only marginally. By now, it’s almost impossible to create a new rock song that doesn’t vaguely resemble an old rock song. So what we have is a youth-oriented musical genre that (a) isn’t symbolically important, (b) lacks creative potentiality, and ...more
“Over time, critics and historians will play a larger role in deciding whose fame endures. Commercial factors will have less impact,” he writes. “I don’t see why rock and pop will follow any different trajectory from jazz and blues. For example: In 1956, Nelson Riddle and Les Baxter sold better than almost every rock ’n’ roll star not named Elvis Presley, but historians and critics don’t care about 1950s bachelor pad music. They’ve constructed a historical perspective on the period that emphasizes the rise of rock, and that pushes everything else into the background. In 1957, Tab Hunter’s ...more
Never Mind the Bollocks is part of the White House record library, inserted by Amy Carter just before her dad lost to Ronald Reagan.
Christopher (Donut): Citation needed. Amy Carter was 14 in 1980.
The album is overtly transgressive (and therefore memorable), while Saturday Night Fever has been framed as a prefab totem of a facile culture (and thus forgettable). For almost forty years, that’s been the overwhelming consensus. But I’ve noticed—just in the last four or five years—that this consensus is shifting. Why? Because the definition of “transgressive” is shifting. It’s no longer appropriate to dismiss disco as superficial. More and more, we recognize how disco latently pushed gay, urban culture into white suburbia, which is a more meaningful transgression than going on a British TV ...more
Interestingly (or maybe unavoidably), Lethem and Adams both think the better answer is Bob Dylan. But something tells me that their dual conclusion is too rooted in the world we still inhabit. It seems self-evident only because Dylan still feels culturally present. I keep imagining a college classroom in five hundred years, where a hipster instructor is leading a tutorial filled with students. These students relate to rock music with the same level of fluency as the music of Mesopotamia: It’s a style of music they’ve learned to recognize, but just barely (and only because they’ve taken this ...more
Architects fuse aesthetics with physics and sociology. And there is a deep consensus over who did this best, at least among non-architects: If we walked down the street of any American city and asked people to name the greatest architect of the twentieth century, most would say Frank Lloyd Wright. In fact, if someone provided a different answer, we’d have to assume we’ve stumbled across an actual working architect, an architectural historian, or a personal friend of Frank Gehry. Of course, most individuals in those subsets would cite Wright, too. But in order for someone to argue in favor of ...more
Talking to only these two men, I must concede, is a little like writing about debatable ideas in pop music and interviewing only Taylor Swift and Beyoncé Knowles. Tyson and Greene are unlike the overwhelming majority of working scientists. They specialize in translating ultra-difficult concepts into a language that can be understood by mainstream consumers; both have written bestselling books for general audiences, and I assume they both experience a level of envy and skepticism among their professional peers. That’s what happens to any professional the moment he or she appears on TV. Still, ...more
His unspoken reaction came across as “This is a fun, non-crazy hypothetical.” Tyson’s posture was different. His unspoken attitude was closer to “This is a problematic, silly supposition.” But here again, other factors might have played a role: As a public intellectual, Tyson spends a great deal of his time representing the scientific community in the debate over climate change. In certain circles, he has become the face of science. It’s entirely possible Tyson assumed my questions were veiled attempts at debunking scientific thought, prompting him to take an inflexibly hard-line stance. (It’s ...more
When Galileo later declared that Copernicus was right (and that the Bible was therefore wrong) in the seventeenth century, he was eventually arrested by the Inquisition and forced to recant—but not before the Catholic Church told him (and I’m paraphrasing here): “Hey, man. We all know you’re probably correct about this. We concede that you’re a wizard, and what you’re saying makes sense. But you gotta let us explain this stuff to the rest of the world very, very slowly. We can’t suddenly tell every pasta-gorged plebeian in rural Italy that we live in a heliocentric universe. It will blow their ...more
Like most people who enjoy dark rooms and Sleep’s Jerusalem, I dig the simulation argument. It is, as far as I can tell, the most reasonable scientific proposition no one completely believes. I have yet to encounter anyone who totally buys it; even the man most responsible for its proliferation places the likelihood of its validity at roughly 20 percent. But even a one-in-five chance presents the potential for a paradigm shift greater than every other historical shift combined. It would place the Copernican Revolution on a par with the invention of Velcro. The man to whom I refer is ...more
“That’s a rational possibility: that someday, in the future, we’ll be able to simulate universes with such verisimilitude that the beings within those simulations believe they are alive in a conventional sense. They will not know that they are inside a simulation,” says Greene. “And in that case, there is a simulator—maybe some kid in his garage in the year 4956—who is determining and defining the values of the constants in this new universe that he built on a Sunday morning on a supercomputer. And within that universe, there are beings who will wonder, ‘Who set the values of these numbers ...more
Since I was still crossing the East River in a taxi at the seven p.m. start time, the order of the speakers was flopped. Gladwell graciously spoke first. When I finally arrived, he was almost finished with his piece, a reported essay from The New Yorker about why NFL teams are habitually terrible at drafting quarterbacks. Upon finishing the reading, he took a handful of questions from the audience, almost all of which were about football. The last question was about the future of the sport. Gladwell’s response, at least at the time, seemed preposterous. “In twenty-five years,” he said, “no one ...more
Only Richard Brody of The New Yorker came close to saying this directly: “To justify his methods,” he writes, “[Simmons] tells [Teller] that the worst thing you can tell a young artist is ‘Good job,’ because self-satisfaction and complacency are the enemies of artistic progress . . . and it’s utter, despicable nonsense. There’s nothing wrong with ‘Good job,’ because a real artist won’t be gulled or lulled into self-satisfaction by it: real artists are hard on themselves, curious to learn what they don’t know and to push themselves ahead.” Socially, this is absolutely the way we have been ...more
It also denies the long-held assumption that physical games are a natural manifestation for a species that is fundamentally competitive, and that team sports are simply adult versions of the same impulse that prompts any two five-year-olds to race across the playground in order to see who’s faster. When I mentioned this theory to a friend who works for ESPN, he thought about it for a long time before saying, “I guess I just can’t imagine a world where sports don’t exist. It would seem like a totally different world.” Well, he’s right. It would be a totally different world. But different worlds ...more
My free time was spent drinking, sometimes with others but often alone. I was single and devoid of prospects, though I don’t recall any feelings of loneliness; on at least three evenings, I sat on my balcony and watched a hedgehog eat apples, an experience more satisfying than going on dates and talking to other forlorn strangers about how dating is hard. Nothing was happening in my life, which provided me the luxury of thinking about life and politics at the same time, almost as if they had an actual relationship.
For a time in the early 2000s, there was a belief that bloggers would become the next wave of authors, and many big-money blogger-to-author book deals were signed. Besides a handful of notable exceptions, this rarely worked, commercially or critically. The problem was not a lack of talent; the problem was that writing a blog and writing a book have almost no psychological relationship. They both involve a lot of typing, but that’s about as far as it goes. A sentence in a book is written a year before it’s published, with the express intent that it will still make sense twenty years later. A ...more
Even when the Internet appears to be nostalgically churning through the cultural past, it’s still hunting for “old newness.” A familiar video clip from 1986 does not possess virality; what the medium desires is an obscure clip from 1985 that recontextualizes the familiar one. The result is a perpetual sense of now. It’s a continual merging of the past with the present, all jammed into the same fixed perspective. This makes it seem like our current, temporary views have always existed, and that what we believe today is what people have always believed. There is no longer any distance between ...more
The program bills itself as a political roundtable featuring the “sharpest minds,” the “best sources,” and the “hardest talk.” All three of these statements are patently false, though it’s hard to isolate which detail is the most untrue, particularly since “best sources” is willfully unclear and “hardest talk” is wholly ambiguous in any non-pornographic context. The content is ostensibly about Beltway gossip, but it’s much closer to wide-angle political science for semi-informed lunatics. My wife refers to The McLaughlin Group as The Yelling Hour, which is technically incorrect twice—the ...more
My grandmother was born before the Wright Brothers’ virgin 852-foot flight and died after we’d gone to the moon so many times the public had lost interest. Everything in between happened within her lifetime. It might be unreasonable to expect any normal person to experience this level of constant change without feeling—and maybe without literally being—irrefutably nutzo. Consciously trying to keep up with what’s happening might actually make things worse. We spend our lives learning many things, only to discover (again and again) that most of what we’ve learned is either wrong or irrelevant. A ...more
A final note about hedgehogs: In “The Case Against Freedom,” I spend a few pages describing a period of my life when I watched a hedgehog from the balcony of my Akron apartment. It turns out there is a problem with this memory—hedgehogs are not native to North America. Whatever was chomping apples outside my window must have been either a groundhog or a woodchuck (although it was definitely something). I have to assume this is not a well-known fact, since I’ve been telling this anecdote for almost two decades and not one person has ever remarked, “Hey idiot—don’t you realize there are no ...more
Now, the easy counter to this suggestion is, “That’s crazy. Nobody uses the Deep Web for artistic purposes, and nobody ever would. That’s like saying the next great movie director might currently be involved with the production of snuff films.” But this response is already false. The British electronica artist Aphex Twin released the title and track listing for his 2014 album Syro on the hidden Deep Web service Tor. The reason this was done remains unclear—but that’s part of the value here. Clarity is not required.
This is probably obvious, but—just in case it isn’t—I should mention that whenever I call something “great,” I’m not arguing that I necessarily consider that particular thing to reflect any greatness to me personally, or even that I like (or fully understand) what that something is. I’m using it more like the editorial “we”: There is a general harmonic agreement that this particular thing is important and artful, both by people invested in supporting that assertion and (especially) by people who will accept that designation without really considering why. My own taste might play a role in the ...more