But What If We're Wrong?

Chuck Klosterman

We live in an age where virtually no content is lost and virtually all content is shared. The sheer amount of information about every current idea makes those concepts difficult to contradict, particularly in a framework where public consensus has become the ultimate arbiter of validity. In other words, we're starting to behave as if we've reached the end of human knowledge. And while that notion is undoubtedly false, the sensation of certitude it generates is paralyzing.

----------

In her book Being Wrong, author Kathryn Schulz spends a few key pages on the concept of “naïve realism.” Schulz notes that while there are few conscious proponents of naïve realism, “that doesn't mean there are no naïve realists.” I would go a step further than Schulz; I suspect most conventionally intelligent people are naïve realists, and I think it might be the defining intellectual quality of this era. The straightforward definition of naïve realism doesn't seem that outlandish: It's a theory that suggests the world is exactly as it appears. Obviously, this viewpoint creates a lot of opportunity for colossal wrongness (e.g., “The sun appears to move across the sky, so the sun must be orbiting Earth”). But my personal characterization of naïve realism is wider and more insidious. I think it operates as the manifestation of two ingrained beliefs:

  1. “When considering any question, I must be rational and logical, to the point of dismissing any unverifiable data as preposterous,” and
  2. “When considering any question, I'm going to assume that the information we currently have is all the information that will ever be available.”

Here's an extreme example: the possibility of life after death. When considered rationally, there is no justification for believing that anything happens to anyone upon the moment of his or her death. There is no reasonable counter to the prospect of nothingness. Any anecdotal story about “floating toward a white light” or Shirley MacLaine's past life on Atlantis or the details in Heaven Is for Real are automatically (and justifiably) dismissed by any secular intellectual. Yet this wholly logical position discounts the overwhelming likelihood that we currently don't know something critical about the experience of life, much less the ultimate conclusion to that experience. There are so many things we don't know about energy, or the way energy is transferred, or why energy (which can't be created or destroyed) exists at all. We can't truly conceive the conditions of a multidimensional reality, even though we're (probably) already living inside one. We have a limited understanding of consciousness. We have a limited understanding of time, and of the perception of time, and of the possibility that all time is happening at once. So while it seems unrealistic to seriously consider the prospect of life after death, it seems equally naïve to assume that our contemporary understanding of this phenomenon is remotely complete. We have no idea what we don't know, or what we'll eventually learn, or what might be true despite our perpetual inability to comprehend what that truth is.

It's impossible to understand the world of today until today has become tomorrow.

This is no brilliant insight, and only a fool would disagree. But it's remarkable how habitually this truth is ignored. We constantly pretend our perception of the present day will not seem ludicrous in retrospect, simply because there doesn't appear to be any other option. Yet there is another option, and the option is this: We must start from the premise that—in all likelihood—we are already wrong. And not “wrong” in the sense that we are examining questions and coming to incorrect conclusions, because most of our conclusions are reasoned and coherent. The problem is with the questions themselves.

A Brief Examination as to Why This Book Is Hopeless (and a Briefer Examination as to Why It Might Not Be)

The library in my sixth-grade classroom contained many books that no one ever touched. It did, however, include one book that my entire class touched compulsively: The Book of Lists. Published in 1977, The Book of Lists was exactly what it purported to be—521 pages of lists, presented by The People's Almanac and compiled by three writers (David Wallechinsky, his sister Amy, and their father Irving). This was a book you didn't really read, per se; you just thumbed through it at random and tried to memorize information that was both deliberately salacious and generally unencumbered by the fact-checking process (I still recall the book's list of famous homosexuals, which included only three rock musicians—Janis Joplin, Elton John, and David Bowie, the last of whom was married to the same woman for more than twenty years). Sequels to the book were released in 1980 and 1983. What I did not realize, however, was that the creators of The Book of Lists also published a similar work titled The Book of Predictions, in 1980. (I stumbled across it in the late nineties, in the living room of a friend who liked to buy bizarre out-of-print books to peruse while stoned.) Like its more famous predecessor, The Book of Predictions describes itself: It's several hundred pages of futurists and scientists (and—somewhat distractingly—psychics) making unsystematic predictions about life on Earth in the coming fifty years.

On those rare occasions when The Book of Predictions is referenced today, the angle is inevitably mocking: The most eye-catching predictions are always the idiotic ones. As it turns out, there has not been a murder in outer space committed by a jealous astronaut, which is what lawyer F. Lee Bailey predicted would occur in 1990 (and evidently struck Bailey as more plausible than the possibility of defending a jealous Hall of Fame running back for an earthbound murder in 1994). According to population expert Dr. Paul Ehrlich, we should currently be experiencing a dystopian dreamscape where “survivors envy the dead,” which seems true only when I look at Twitter. Yet some of the book's predictions are the opposite of terrible: Several speculators accurately estimated the world population in 2010 would be around seven billion. A handful of technology experts made remarkably realistic projections about an imminent international computer network. Charlie Gillett, a British musicologist best known for writing the first comprehensive history of rock music (1970's The Sound of the City), somehow managed to outline the fall of the music industry in detail without any possible knowledge of MP3s or file sharing.3 Considering how difficult it is to predict what will still be true a year from now, any level of accuracy on a fifty-year guess feels like a win.

Yet what is most instructive about The Book of Predictions is not the things that proved true. It's the bad calculations that must have seemed totally justifiable—perhaps even conservative—at the time of publication. And the quality all these reasonable failures share is an inability to accept that the status quo is temporary. The Book of Predictions was released in 1980, so this mostly means a failure to imagine a world where the United States and the Soviet Union were not on the cusp of war. Virtually every thought about the future of global politics focuses on either (a) an impending nuclear collision between the two nations, or (b) a terrifying alliance between the USSR and China. As far as I can tell, no one in the entire Book of Predictions assumed the friction between the US and Russia could be resolved without the detonation of nuclear weapons. A similar problem is witnessed whenever anyone from 1980 attempts to consider the future of interpersonal communication: Even though widespread cell phone use was right around the corner—there was already a mobile phone network in Japan in '79—it was almost impossible to think this would ever replace traditional landlines for average people. All speculation regarding human interaction is limited by the assumption that landline telephones would always be the best way to communicate. On page 29, there are even escalating predictions about the annual number of long-distance calls that would be made in the US, a problem that's irrelevant in the age of free calling. Yet as recently as twenty years ago, this question still mattered; as a college student in the early nineties, I knew of several long-term romantic relationships that were severed simply because the involved parties attended different schools and could not afford to make long-distance calls, even once a week. In 1994, the idea of a sixty-minute phone call from Michigan to Texas costing less than mailing a physical letter the same distance was still unimaginable. Which is why no one imagined it in 1980, either.

This brand of retrospective insight presents a rather obvious problem: My argument requires a “successful” futurist to anticipate whatever it is that can't possibly be anticipated. It's akin to demanding someone be spontaneous on command. But there's still a practical lesson here, or at least a practical thought: Even if we can't foresee the unforeseeable, it's possible to project a future reality where the most logical conclusions have no relationship to what actually happens. It feels awkward to think like this, because such thinking accepts irrationality. Of course, irrational trajectories happen all the time. Here's an excerpt from a 1948 issue of Science Digest: “Landing and moving around the moon offers so many serious problems for human beings that it may take science another 200 years to lick them.” That prediction was off by only 179 years. But the reason Science Digest was so wrong was not technological; it was motivational. In 1948, traveling to the moon was a scientific aspiration; the desire for a lunar landing was analogous to the desire to climb a previously unscaled mountain. Science Digest assumed this goal would be pursued in the traditional manner of scientific inquiry—a grinding process of formulating theories and testing hypotheses. But when the Soviets launched the Sputnik satellite in 1957, the meaning of the enterprise changed. Terrified Americans suddenly imagined Khrushchev launching weapons from the lunar surface. The national desire to reach the moon first was now a military concern (with a sociocultural subtext over which country was intellectually and morally superior). That accelerated the process dramatically. By the summer of '69, we were planting flags and collecting moon rocks and generating an entirely new class of conspiracy theorists. So it's not that the 1948 editors of Science Digest were illogical; it's that logic doesn't work particularly well when applied to the future.

Any time you talk to police (or lawyers, or journalists) about any kind of inherently unsolvable mystery, you will inevitably find yourself confronted with the concept of Occam's Razor: the philosophical argument that the best hypothesis is the one involving the lowest number of assumptions. If (for example) you're debating the assassination of John F. Kennedy, Occam's Razor supports the idea of Lee Harvey Oswald acting alone—it's the simplest, cleanest conclusion, involving the least number of unverifiable connections. Occam's Razor is how a serious person considers the past. Unfortunately, it simply doesn't work for the future. When you're gazing into the haze of a distant tomorrow, everything is an assumption. Granted, some of those competing assumptions seem (or maybe feel) more reasonable than others. But we live in a starkly unreasonable world. The history of ideas is littered with more failures than successes. Retroactively, we all concede this. So in order to move forward, we're forced to use a very different mind-set. For lack of a better term, we'll just have to call it Klosterman's Razor: the philosophical belief that the best hypothesis is the one that reflexively accepts its potential wrongness to begin with.

A Quaint and Curious Volume of (Destined-to-Be) Forgotten Lore

Let's start with books.

Now, I realize the risk inherent in this decision: By the time the questions I'm about to ask are resolved, it's possible that books won't exist. Some will argue that such an inevitability borders on the probable. But I'm starting with books, anyway, and mainly for two reasons. The first is that this is a book, so if all books disappear, there's no way anyone will be able to locate my mistake. The second is that I suspect we will always use the word “book” to signify whatever we incorporate in its place, even if that new thing has almost no relationship to what we consider to be a “book” right now.

Language is more durable than content. Words outlive their definitions. Vinyl represented around 6 percent of music sales in 2015, but people continue to say they listen to “records” and “albums” and (on rare occasions) “LPs” whenever they're describing any collection of music. This is even true for music that was never pressed on vinyl at all. So-called long-playing records weren't introduced to the public until 1948 and didn't matter commercially until the sixties, but the term “record” has come to characterize the entire concept. And since books are way, way older—The Epic of Gilgamesh was written somewhere in the vicinity of 2000 BC—it seems impossible that we'll ever stop using that term, even if the future equivalent of a “book” becomes a packet of granulized data that is mechanically injected directly into the cerebral cortex. We also have too many ancillary adjectives connected to books (“He's only book smart,” “She's a real bookworm,” “The pigs are gonna throw the book at him”) to jettison the root word from the lexicon. Many people use physical books as art objects in their homes, and the Library of Congress would need to be hit by a nuclear weapon in order to disappear. It's possible that no one will buy (or read) books in some remote future, but we can (tentatively) assume that people of that era will at least know what “books” are: They are the collected units containing whatever writers write. So even though future writers might not be producing anything resembling present-day books, that's still how society will refer to whatever works they are producing.

----------

[I would love to promise that the rest of this book will not be as pedantic and grinding as the previous two paragraphs. I want to believe I won't spend thousands of words describing why various nouns won't evaporate into the cultural troposphere. But I can't make that promise. It's entirely possible that—two hundred pages from now—I will find myself describing what “food” is, and explaining that food is what we put in our mouths in order to avoid starvation, and arguing that we will always talk about food as something that exists. But take solace in the fact that you can quit at any time. I cannot.]

----------

A few pages back, I cited Moby-Dick as the clearest example of a book that people were just flat-out wrong about, at least during the life span of the author. But this doesn't mean that no one thought it was good, because certain people did. That's not the point. This has nothing to do with personal taste. What any singular person thought about Moby-Dick in 1851 is as irrelevant as what any singular person thinks about Moby-Dick today. What critics in the nineteenth century were profoundly wrong about was not the experience of reading this novel; what they were wrong about was how that experience would be valued by other people. Because that's what we're really talking about whenever we analyze the past. And when I refer to “other people,” I don't mean the literary sphere of 1851. I mean “other people” throughout the expanse of time, including those readers a critic in 1851 could never fathom. Which forces us to consider the importance—or the lack of importance—of plot mechanics.

Moby-Dick is about a dude hunting a whale. The novel includes autobiographical details from Herman Melville's own tenure on a whaling vessel, so one can conclude that he couldn't have written a novel with such specificity and depth if it had not been something he'd experienced firsthand. But what if the same Mr. Melville had lived a different kind of life: Could he have written a similar nine-hundred-page book about hunting a bear? Or climbing a mountain? Or working as a male prostitute? How much of this novel's transcendent social imprint is related to what it mechanically examines?

The short answer seems to be that the specific substance of a novel matters very little. The difference between a whale and a bear and a mountain is negligible. The larger key is the tone, and particularly the ability of that tone to detach itself from the social moment of its creation.

“It's a terrifying thought,” George Saunders tells me, “that all of the things we—that I—take for granted as being essential to good literature might just be off. You read a ‘good' story from the 1930s and find that somehow the world has passed it by. Its inner workings and emphases are somehow misshapen. It's answering questions in its tone and form that we are no longer asking. And yet the Gaussian curve4 argues that this is true—that most of us are so habituated to the current moment that what we do will fade and lose its power and just be an historical relic, if that. I've been reading a lot of Civil War history lately, and it is just astonishing how wrong nearly everyone was. Wrong—and emphatic. Even the people who were ‘right' were off, in their sense of how things would play out . . . The future we are now living in would have been utterly unimaginable to the vast majority of even the most intelligent thinkers and writers of that time.”

Saunders is an especially significant character in this discussion, based on the perception of his work within the living present. In January of 2013, The New York Times Magazine published a cover story with the earnest headline “George Saunders Has Written the Best Book You'll Read This Year.” That book, Tenth of December, was a slim, darkly humorous collection of short stories, most of which deal with the quintessence of kindness and the application of empathy. Though no writer can honestly be categorized as universally beloved, Saunders comes closer than any other white American male. He has never published an official novel, which plays to his advantage—the perception of his career does not hinge on the perception of any specific work. He is also viewed (with justification) as being unusually humble and extraordinarily nice to pretty much everyone he encounters.5 So when The New York Times Magazine published that story, and when Tenth of December subsequently made the bestseller list, there was a collective assumption that Saunders was—perhaps, maybe, arguably—this nation's greatest living author, and that it was wonderful that the person occupying that space seemed like a legitimately kind person (as opposed to some jerk we simply had to begrudgingly concede was better than everyone else). If George Saunders eventually becomes the distant historical figure who defines American writing at the turn of the twenty-first century, it seems like this would be good for everyone involved.

And yet . . . there is something about this notion that feels overwhelmingly impermanent. It doesn't seem plausible that someone could do exceptional work, be recognized as exceptional, and then simply remain in that cultural space for the rest of time. Art history almost never works that way. In fact, it often seems like our collective ability to recognize electrifying genius as it occurs paradoxically limits the likelihood of future populations certifying that genius as timeless.

“What ages [poorly], it seems, are ideas that trend to the clever, the new, or the merely personal,” Saunders continues. “What gets dated, somehow, is that which is too ego inflected—that hasn't been held up against the old wisdom, maybe, or just against some innate sense of truth, and rigorously, with a kind of self-abnegating fervor. Again and again some yahoo from 1863 can be heard to be strenuously saying the obvious, self-aggrandizing, self-protective, clever, banal thing—and that crap rings so hollow when read against Lincoln or Douglass. It gives me real fear about all of the obvious, self-aggrandizing, self-protective, clever, banal things I've been saying all my life.”

Here again, I'd like to imagine that Saunders will be rewarded for his self-deprecation, in the same way I want him to be rewarded for his sheer comedic talent. But I suspect our future reality won't be dictated by either of those qualities. I suspect it will be controlled by the evolving, circuitous criteria for what is supposed to matter about anything. When trying to project which contemporary books will still be relevant once our current population has crumbled into carbon dust and bone fragments, it's hopeless to start by thinking about the quality of the works themselves. Quality will matter at the end of the argument, but not at the beginning. At the beginning, the main thing that matters is what that future world will be like. From there, you work in reverse.

[2]

“All I can tell you is that in 100 years I seriously doubt that the list of the 100 best writers from our time is going to be as white, as male, as straight, as monocultural as the lists we currently produce about the 100 best writers of our time.” This is an e-mail from Junot Díaz, the Dominican-American novelist who won a Pulitzer Prize in 2008 and a MacArthur Fellowship in 2012. “In all frankness, our present-day evaluative criteria are so unfairly weighted towards whiteness, maleness, middle-class-ness, straightness, monoculturality—so rotted through with white supremacy—as to be utterly useless for really seeing or understanding what's going on in the field, given how little we really see and value of the art we're now producing because of our hegemonic scotoma. Who can doubt that the future will improve on that? No question that today, in the margins of what is considered Real Literature, there are unacknowledged Kafkas toiling away who are more likely women, colored, queer and poor.”

Díaz is a bombastic intellectual with a limitless career (his debut novel about an overweight weirdo, The Brief Wondrous Life of Oscar Wao, was named the best book of the twenty-first century by a panel of critics commissioned by the BBC). It's unsurprising that this is how he views society, and his argument is essentially bulletproof. It's a worldview that's continually gaining traction: You can't have a macro discussion about literary canons without touching on these specific points. When The New York Times released its 2014 “100 Notable Books” list, several readers noticed how there were exactly twenty-five fiction books by men, twenty-five fiction books by women, twenty-five nonfiction books by men, and twenty-five nonfiction books by women. Do I have a problem with this? I have no problem with this. But it does reflect something telling about the modern criteria for quantifying art: Symmetrical representation sits at the center of the process. It's an aesthetic priority. Granted, we're dealing with a meaningless abstraction, anyway—the list is called “notable” (as opposed to “best”), it's politicized by the relationships certain authors have with the list makers, it annually highlights books that instantly prove ephemeral, and the true value of inclusion isn't clear to anyone. Yet in the increasingly collapsible, eternally insular idiom of publishing, the Times' “100 Notable” list remains the most visible American standard for collective critical appreciation. This is why the perfect 25:25:25:25 gender split is significant. Does it not seem possible—in fact, probable—that (say) twenty-six of the most notable novels were written by women? Or that perhaps men wrote twenty-seven of the most notable nonfiction works?6 I suppose it's mathematically possible that an objective, gender-blind judging panel might look at every book released in 2014 and arrive at the same conclusion as The New York Times. Perfect statistical symmetry is within the realm of possibility. But no impartial person believes that this is what happened. Every rational person knows this symmetry was conscious, and that this specific result either (a) slightly invalidates the tangible value of the list, or (b) slightly elevates the intangible value of the list. (I suppose it's also possible to hold both of those thoughts simultaneously.) In either case, one thing is absolutely clear: This is the direction in which canonical thinking is drifting. Díaz's view, which once felt like an alternative perspective, is becoming the entrenched perspective. And when that happens, certain critical conclusions will no longer be possible.

Let's assume that—in the year 2112—someone is looking back at the turn of the twenty-first century, trying to deduce the era's most significant writers. Let us also assume Díaz's opinion about the present culture has metabolized into the standard view; let's concede that people of the future take for granted that the old evaluative criteria were “unfairly weighted towards whiteness, maleness, middle-class-ness, straightness, [and] monoculturality.” When that evolution transpires, here's the one critical conclusion that cannot (and will not) happen: “You know, I've looked at all the candidates, consciously considering all genders and races and income brackets. I've tried to use a methodology that does not privilege the dominant class in any context. But you know what? It turns out that Pynchon, DeLillo, and Franzen were the best. The fact that they were white and male and straight is just coincidental.” If you prioritize cultural multiplicity above all other factors, you can't make the very peak of the pyramid a reactionary exception, even in the unlikely event that this is what you believe (since such a conclusion would undoubtedly be shaped by social forces you might not recognize). Even more remote is the possibility that the sheer commercial force of a period's most successful writers—in the case of our period, Stephen King and J. K. Rowling—will be viewed as an argument in their historical favor. If you accept that the commercial market was artificially unlevel, colossal success only damages their case.
