But What If We're Wrong?: Thinking about the Present as If It Were the Past
We now know (“know”) that Newton’s concept was correct. Humankind had been collectively, objectively wrong for roughly twenty centuries. Which provokes three semi-related questions:
Melville, a moderately successful author at the time of the novel’s release, assumes this book will immediately be seen as a masterwork. This is his premeditated intention throughout the writing process. But the reviews are mixed, and some are contemptuous (“it repels the reader” is the key takeaway from one of the very first reviews in the London Spectator). It sells poorly—at the time of Melville’s death, total sales hover below five thousand copies. The failure ruins Melville’s life: He becomes an alcoholic and a poet, and eventually a customs inspector. When he dies destitute in 1891, one…
A century after his death, Melville gets his own extinct super-whale named after him, in tribute to a book that commercially tanked. That’s an interesting kind of career.
The practical reality is that any present-tense version of the world is unstable. What we currently consider to be true—both objectively…
But increasing the capacity for the reconsideration of ideas is not the same as actually changing those ideas (or even allowing them to change by their own momentum).
We live in an age where virtually no content is lost and virtually all content is shared. The sheer amount of information about every current idea makes those concepts difficult to contradict, particularly in a framework where public consensus has become the ultimate arbiter of validity. In other words, we’re starting to behave as if we’ve reached the end of human knowledge. And while that notion is undoubtedly false, the sensation of certitude it generates is paralyzing.
Yet this wholly logical position discounts the overwhelming likelihood that we currently don’t know something critical about the experience of life, much less the ultimate conclusion to that experience. There are so many things we don’t know about energy, or the way energy is transferred, or why energy (which can’t be created or destroyed) exists at all. We can’t truly conceive the conditions of a multidimensional reality, even though we’re (probably) already living inside one.
We must start from the premise that—in all likelihood—we are already wrong. And not “wrong” in the sense that we are examining questions and coming to incorrect conclusions, because most of our conclusions are reasoned and coherent. The problem is with the questions themselves.
It’s not that the 1948 editors of Science Digest were illogical; it’s that logic doesn’t work particularly well when applied to the future.
Occam’s Razor: the philosophical argument that the best hypothesis is the one involving the lowest number of assumptions.
Language is more durable than content. Words outlive their definitions. Vinyl represented around 6 percent of music sales in 2015, but people continue to say they listen…
What critics in the nineteenth century were profoundly wrong about was not the experience of reading this novel; what they were wrong about was how that experience would be valued by other people. Because that’s what we’re really talking about whenever we analyze the past.
“It’s a terrifying thought,” George Saunders tells me, “that all of the things we—that I—take for granted as being essential to good literature might just be off. You read a ‘good’ story from the 1930s and find that somehow the world has passed it by. Its inner workings and emphases are somehow misshapen. It’s answering questions in its tone and form that we are no longer asking. And yet the Gaussian curve argues that this is true—that most of us are so habituated to the current moment that what we do will fade and lose its power and just be an historical relic, if that.”
In fact, it often seems like our collective ability to recognize electrifying genius as it occurs paradoxically limits the likelihood of future populations certifying that genius as timeless.
I suspect it will be controlled by the evolving, circuitous criteria for what is supposed to matter about anything. When trying to project which contemporary books will still be relevant once our current…
“All I can tell you is that in 100 years I seriously doubt that the list of the 100 best writers from our time is going to be as white, as male, as straight, as monocultural as the lists we currently produce about the 100 best writers of our time.” This is an e-mail from Junot Díaz…
It’s an aesthetic priority. Granted, we’re dealing with a meaningless abstraction, anyway—the list is called “notable” (as opposed to “best”), it’s politicized by the relationships certain authors have with the list makers, it annually highlights books that instantly prove ephemeral, and the true value of inclusion isn’t clear to anyone. Yet in the increasingly collapsible, eternally insular idiom of publishing, the Times’ “100 Notable” list remains the most visible American standard for collective critical appreciation.
When that evolution transpires, here’s the one critical conclusion that cannot (and will not) happen: “You know, I’ve looked at all the candidates, consciously considering all genders and races and income brackets. I’ve tried to use a methodology that does not privilege the dominant class in any context. But you know what? It turns out that Pynchon, DeLillo, and Franzen were the best. The fact that they were white and male and straight is just coincidental.” If you prioritize cultural multiplicity above all other factors, you can’t make the very peak of the pyramid a reactionary exception…
…significant in a far-flung future is detached from the reason it was significant at the time of its creation—and that’s almost always due to a recalibration of social ideologies that future generations will accept as normative. With books, these kinds of ideological transfers are difficult to anticipate, especially since there are over two million books published in any given year. But it’s a little…
The idea of a character choosing between swallowing a blue pill that allows him to remain a false placeholder and a red pill that forces him to confront who he truly is becomes a much different metaphor. Considered from this speculative vantage point, The Matrix may seem like a breakthrough of a far different kind. It would feel more reflective than entertaining, which is precisely why certain things get remembered while certain others get lost.
What’s weird is that all his greatest work came after he fell out of fashion, and also that there was such a strong dip in his reputation that he was barely remembered for a while . . . Kafka was conversant with a sophisticated literary conversation, and had, despite the strongly self-defeating tendencies to neither finish nor publish his writings, the attention of various alert colleagues.
Even if we accept the possibility that there is a literary canon—we’re really discussing multiple canons and multiple posterities. We are discussing what Lethem calls “rival claims”: in essence, the idea that the only reason we need a canon is so that other people can disagree with it. The work of the writers who get included becomes almost secondary, since they now exist only for the purposes of contradiction.
Meanwhile, we live in a time where the number of creators of literature has just exploded—and that plenitude is the field, and the context for the tiny, tiny number of things that get celebrated in the present, let alone recalled ten or twenty years later, let alone by the 22nd century.
He was paralyzed by both a hatred of his own writing and a buried arrogance over his intellectual superiority.
If you are heavily involved with normal Internet culture, you are partially involved with branding (even if you’re trying to be weird and obtuse on purpose). Internet writing is, by definition, public writing.
His voice can be trusted, because he (seemingly) had no ulterior motive. He was just typing into the abyss. Which is pretty much the definition of writing on a version of the Internet nobody sees. So this is the venue. This is where our candidate lives.
But I guess I’ll have to take that risk, since overthinking is the only way to figure out the opposite of something that hasn’t actually happened.
For most of the twentieth century, there was an ever-growing realization (at least among intellectuals) that the only way to understand the deeper truth about anything complicated was through “shadow histories”: those underreported, countercultural chronicles that had been hidden by the conformist monoculture and emerge only in retrospect.
Competing modes of discourse no longer “compete.” They coexist.
The mere fact that I can imagine this scenario forces me to assume that it won’t happen. It’s a reasonable conclusion to draw from the facts that presently exist, but the future is a teenage crackhead who makes shit up as he goes along.
The reason Vonnegut’s writing advice remains (mostly) correct has to do with the myth of universal timeliness. There is a misguided belief—often promoted by creative writing programs—that producing fiction excessively tied to technology or popular culture cheapens the work and detracts from its value over time.
The first is that it’s impossible to generate deep verisimilitude without specificity.
Jane Austen (as timeless a writer as there will ever be) wrote about courtship and matrimony in an essentially sexless universe. As a result, the unspoken sexual undercurrents are the main gravitational pull for modern readers.
The unpacking process is a big part of what you love. A book becomes popular because of its text, but it’s the subtext that makes it live forever. For the true obsessive, whatever an author doesn’t explicitly explain ends up becoming everything that matters most (and since it’s inevitably the obsessives who keep art alive, they make the rules).
Beowulf is mostly about what isn’t there.
The only detail we can all be certain of is that a novel’s (eventual) interpretation will (eventually) be different from its surface meaning—and if that doesn’t happen, the book won’t seem significant enough to retroactively canonize. So this, it seems, is the key for authors who want to live forever: You need to write about important things without actually writing about them.
My goal is to think about the present in the same way we think about the past, wholly aware that such mass consideration can’t happen until we reach a future that no longer includes us. And why do I want to do this? Because this is—or should be—why we invest time into thinking about anything that isn’t essential or practical or imperative. The reason so many well-considered ideas appear laughable in retrospect is that people involuntarily assume that whatever we believe and prioritize now will continue to be believed and prioritized later, even though that almost never happens. It’s a mistake…
But I’ve been a paid critic for enough years to know my profession regularly overrates many, many things by automatically classifying them as potentially underrated. The two terms have become nonsensically interchangeable.
The nonfiction wing of this level houses elemental tacticians like Robert Caro; someone like William T. Vollmann straddles both lines…
The reality of publishing is that most books just come out. They are written, edited, marketed, and publicized—but nothing else happens. They are nominally reviewed by the trade publications that specialize in reviewing everything, and that’s as far as it goes (if they receive any attention beyond that, it likely skews positive, but only because there’s no point in criticizing a book nobody else has heard of).
I think the social difference between 2016 and 2155 will be significantly more profound than the social difference between 1877 and 2016, in the same way that the 139-year gap between the publication of Anna Karenina and today is much vaster than the 139-year gap between 1877 and 1738. This acceleration is real, and it will be harder and harder for…
In as little as fifty years, the language and themes of The Corrections will seem as weird and primordial as Robinson Crusoe feels to the consumer of today…
History is a creative process (or as Napoleon Bonaparte once said, “a set of lies agreed upon”). The world happens as it happens, but we construct what we remember and what we forget. And people will eventually do that to us, too.
Cleveland radio DJ Alan Freed, a man who played black music for white audiences and unwittingly caused the Rock and Roll Hall of Fame to be built on the shores of Lake Erie, the artistic equivalent of naming North America after the first guy who happened to draw a map of it.
If you’re a successful tax attorney who owns a hot tub, clients will refer to you as a “rock star CPA” when describing your business to their less hip neighbors…
The defining music of the first half of the twentieth century was jazz; the defining music of the second half of the twentieth century was rock, but with an ideology and saturation far more pervasive. Only television surpasses its influence. And pretty much from the moment it came into being, people who liked “rock” insisted it was dead. The critic Richard Meltzer allegedly claimed that rock was already dead in 1968. And he was wrong to the same degree that he was right.
Rock is dead, in the sense that its “aliveness” is a subjective assertion based on whatever criteria the listener happens to care about. When someone argued rock was “dead” in 1968 or 1977 or 1994 or 2005, that individual was making an aesthetic argument, grounded in whatever that person assumed to be the compromised motives of the artists of the time (customarily built on the conviction that the current generation of musicians were more careerist in nature, thus detracting from the amount of raw emotion they were allegedly injecting into the music). The popularity of the rock genre is…
Normal consumers declare rock to be dead whenever they personally stop listening to it (or at least to new iterations of it), which typically happens about two years after they graduate from college. This has almost nothing to do with what’s actually happening with the artists who make it. There will always be a handful of musicians making new rock music, just as there will always be a handful of musicians making new mariachi music. The entire debate is semantic: Something that’s only metaphorically alive can never be literally dead.