But What If We're Wrong?
Chuck Klosterman
For a time in the early 2000s, there was a belief that bloggers would become the next wave of authors, and many big-money blogger-to-author book deals were signed. Besides a handful of notable exceptions, this rarely worked, commercially or critically. The problem was not a lack of talent; the problem was that writing a blog and writing a book have almost no psychological relationship. They both involve a lot of typing, but that's about as far as it goes. A sentence in a book is written a year before it's published, with the express intent that it will still make sense twenty years later. A sentence on the Internet is designed to last one day, usually the same day it's written. The next morning, it's overwritten again (by multiple writers). The Internet experience is not even that similar to daily newspaper writing, because there's no physical artifact to demarcate the significance of the original moment.
Yet this limitation is not a failure. It proved to be an advantage. It naturally aligns with the early-adoption sensibility that informs everything else. Even when the Internet appears to be nostalgically churning through the cultural past, it's still hunting for “old newness.” A familiar video clip from 1986 does not possess virality; what the medium desires is an obscure clip from 1985 that recontextualizes the familiar one. The result is a perpetual sense of now. It's a continual merging of the past with the present, all jammed into the same fixed perspective. This makes it seem like our current, temporary views have always existed, and that what we believe today is what people have always believed. There is no longer any distance between what we used to think and what we currently think, because our evolving vision of reality does not extend beyond yesterday.

And this, somewhat nonsensically, is how we might be right: All we need to do is convince ourselves we always were. And now there's a machine that makes that easy.

[4]
“I am often wrong,” wrote satirist and critic H. L. Mencken, a statement that would seem more disarming were it not for the fact that Mencken so often prefaced his statements by suggesting his forthcoming thoughts were worthless. “My prejudices are innumerable, and often idiotic. My aim is not to determine facts, but to function freely and pleasantly.”

I get this. I understand what he's getting at, and sometimes I relate to it: Since our interior thoughts are (ultimately) arbitrary and meaningless, we might as well think whatever we prefer thinking. This was especially important to a guy like Mencken, who was against US participation in World War II and hated Franklin Roosevelt. He was quite willing to concede that his most intensely held opinions weren't based on factual data, so trying to determine what the factual data actually was would only make him depressed. It's a worldview that—even if expressed as sarcasm—would be extremely unpopular today. But it's quietly become the most natural way to think about everything, due to one sweeping technological evolution: We now have immediate access to all possible facts. Which is almost the same as having none at all.

Back in the landlocked eighties, Dave Barry offhandedly wrote something pretty insightful about the nature of revisionism. He noted how—as a fifth-grader—he was told that the cause of the Civil War was slavery. Upon entering high school, he was told that the cause was not slavery, but economic factors. At college, he learned that it was not economic factors but acculturalized regionalism. But if Barry had gone to graduate school, the answer to what caused the Civil War would (once again) be slavery.
Now, the Civil War is the most critical event in American history, and race is the defining conflict of this country. It still feels very much alive, so it's not surprising that teachers and historians want to think about it on disparate micro and macro levels, even if the realest answer is the simplest answer. But the Internet allows us to do this with everything, regardless of a subject's significance. It can happen so rapidly that there's no sense the argument has even evolved, which generates an illusion of consistency.

I've been writing this book during a period when many retired eighties-era pro wrestlers have died—the Ultimate Warrior, Dusty Rhodes, Rowdy Roddy Piper, etc. The outpouring of media recognition regarding these deaths has been significant. The obituaries frame these men as legends, and perhaps that's how they deserve to be framed. But what's been weird about this coverage is the unspoken viewpoint. Logically, it seems like a remembrance of Dusty Rhodes should include some version of the following: “We didn't think this guy was important, but he was. Culturally, we were wrong about pro wrestling.” Because during the 1980s, almost no one thought pro wrestling mattered at all. Even the teenage males who loved it rarely took it seriously. But this is not how these remembrances were delivered. Instead, the unspoken viewpoint was of course these people were important, and of course we all accept and understand this, and of course there is nothing remotely strange about remembering Dusty Rhodes as a formative critic of Reagan-era capitalism. Somebody once believed this, which means it was possible for anyone to have believed this, which means everyone can retroactively adopt this view as what they've always understood to be true. No one was ever wrong about wrestling. We were always right about it.

In 1976, Renata Adler wrote the experimental novel Speedboat. It went out of print. When it was re-released in 2013, Speedboat was consumed and adopted as “old newness” (“Millennials, Meet Renata Adler,” demanded a headline in The New Republic). In a span of two years, Adler completely reentered the critical dialogue, almost as if she had been there the whole time. The thirty-plus years this book was ignored no longer exist. Technologically, 1976 and 2013 exist in the same moment.

There's a common philosophical debate about the nature of time. One side of the debate argues that time is happening in a linear fashion. This is easy to understand. The other side argues that all time is happening at once. This is difficult to comprehend. But replace the word “time” with “history,” and that phenomenon can be visualized on the Internet. If we think about the trajectory of anything—art, science, sports, politics—not as a river but as an endless, shallow ocean, there is no place for collective wrongness. All feasible ideas and every possible narrative exist together, and each new societal generation can scoop out a bucket of whatever antecedent is necessary to support their contemporary conclusions. When explained in one sentence, that prospect seems a little terrible. But maybe that's just because my view of reality is limited to river-based thinking.

I've slowly become an admirer of Edward Snowden, the former government employee who leaked thousands of classified documents and now lives in exile. I was initially skeptical of Snowden, until I saw the documentary Citizenfour. Granted, Citizenfour is a non-objective telling of his story, produced by the journalists Snowden was aligned with. It could be classified as a propaganda film. But it's impossible to watch Snowden speak without trusting the sincerity of his motives and the tenacity of his central argument. I believe Snowden more than I believe the government. He does, however, make one statement in Citizenfour that seems preposterous and wrong: While discussing the alleged greatness of the early (pre-surveillance) Internet, he notes that a child in one part of the world could have an anonymous discussion with a verified expert in another part of the world and “be granted the same respect for their ideas.” To me, that does not sound like a benefit. That sounds like a waste of time and energy, at least for the verified expert. The concept of some eleven-year-old in Poland facelessly debating Edward Witten on an equal platform, just because there's a machine that makes this possible, seems about as reasonable as letting dogs vote. But I suppose that's because I still can't accept the possibility of Witten being totally wrong, no matter how hard I try. I mean, if we found records of an eleven-year-old girl from 340 BC who contacted Aristotle and told him his idea about a rock wanting to sit on the ground was irrational bullshit, we'd name a college after her.

Only the Penitent Man Shall Pass

A large group of people are eating and drinking. They're together, but not really together. Some of the people know each other well and others are almost strangers; instead of one mass conversation, there are little pockets of conversations, sprinkled throughout the table. I am at this table. What I am talking about is unimportant, or—more accurately—will need to be classified as “unimportant,” as I will not be able to remember what it was when I awake in the morning. But it must be some topic where I'm expressing doubt over something assumed to be self-evident, or a subject where the least plausible scenario is the most interesting scenario, or a hypothetical crisis that's dependent on the actualization of something insane. I say this because someone at the table (whom I've met only once before) eventually joins my semi-private conversation and says, “It must be terrifying to think the world is actually like that.”

“What do you mean?” I ask. My memory of what she says next is sketchy, but it's something along the lines of: It must be terrifying to view the world from the perspective that most people are wrong, and to think that every standard belief is a form of dogma, and to assume that reality is not real. Her analysis is delivered in a completely non-adversarial tone; it is polite, almost like she is authentically concerned for my overall well-being. My response is something like “Well, I don't really think like that,” because I don't think I think the way she thinks I think. But maybe I do. And I get what she's driving at, and I realize that—from her vantage point—any sense of wide-scale skepticism about the seemingly obvious would be a terrifying way to live.

There's an accepted line of reasoning that keeps the average person from losing his or her mind. It's an automatic mental reflex. The first part of the reasoning involves a soft acceptance of the impossible: our recognition that the specific future is unknowable and that certain questions about the universe will never be answered, perhaps because those answers do not exist. The second part involves a hard acceptance of limited truths: a concession that we can reliably agree on most statements that are technically unprovable, regardless of whether these statements are objective (“The US government did not plan the 9/11 attacks”), subjective (“Fyodor Dostoyevsky is a better novelist than Jacqueline Susann”), or idealistic (“Murder is worse than stealing, which is worse than lying, which is worse than sloth”). It's a little like the way we're biologically programmed to trust our friends and family more than we trust strangers, even if our own past experience suggests we should do otherwise. We can't unconditionally trust the motives of people we don't know, so we project a heightened sense of security upon those we do, even if common sense suggests we should do the opposite. If 90 percent of life is inscrutable, we need to embrace the 10 percent that seems forthright, lest we feel like life is a cruel, unmanageable joke. This is the root of naïve realism. It's not so much an intellectual failing as an emotional sanctuary from existential despair.

It is not, however, necessary.

Is there a danger (or maybe a stupidity) in refusing to accept that certain espoused truths are, in fact, straightforwardly true? Yes—if you take such thinking to the absolute extreme. It would be pretty idiotic if I never left my apartment building, based on the remote mathematical possibility that a Komodo dragon might be sitting in the lobby. If my new postman tells me his name is Toby, I don't ask for state-issued identification. But I think there's a greater detriment in our escalating progression toward the opposite extremity—the increasingly common ideology that assures people they're right about what they believe. And note that I used the word “detriment.” I did not use the word “danger,” because I don't think the notion of people living under the misguided premise that they're right is often dangerous. Most day-to-day issues are minor, the passage of time will dictate who was right and who was wrong, and the future will sort out the past. It is, however, socially detrimental. It hijacks conversation and aborts ideas. It engenders a delusion of simplicity that benefits people with inflexible minds. It makes the experience of living in a society slightly worse than it should be.

If you write a book about the possibility of collective wrongness in the present day, there are certain questions people ask you the moment you explain what you're doing. Chief among these is, “Are you going to write about climate change?” Now, I elected not to do this, for multiple reasons. The main reason is that the Earth's climate is changing, in a documented sense, and that there is exponentially more carbon in the atmosphere than at any time in man's history, and that the rise of CO2 closely corresponds with the rise of global industrialization. Temperature readings and air measurements are not speculative issues. But the more insidious reason I chose not to do this is that I knew doing so would automatically nullify the possibility of writing about any non-polemic ideas even vaguely related to this topic. It would just become a partisan, allegorical battle over what it means to accept (or deny) the central concept of global warming. This is one of those issues where—at least in any public forum—there are only two sides: This is happening and it's going to destroy us (and isn't it crazy that some people still disagree with that), or this is not happening and there is nothing to worry about (and isn't it crazy how people will just believe whatever they're told). There is no intellectual room for the third rail, even if that rail is probably closer to what most people quietly assume: that this is happening, but we're slightly overestimating—or dramatically underestimating—the real consequence. In other words, the climate of the Earth is changing, so life on Earth will change with it. Population centers will shift toward the poles. Instead of getting wheat from Kansas, it will come from Manitoba. The oceans will incrementally rise and engulf the southern tip of Manhattan, so people will incrementally migrate to Syracuse and Albany. The average yearly temperature of London (45 degrees Fahrenheit) might eventually approach the average yearly temperature of Cairo (70.5 degrees), but British society will find a way to subsist within those barren conditions. Or perhaps even the pessimists are too optimistic; perhaps it's already too late, the damage is irrevocable, and humankind's time is finite. The international community has spent the last two decades collectively fixated on reducing carbon emissions, but the percentage of carbon in the atmosphere still continues to increase. Maybe we've already entered the so-called Sixth Extinction and there is no way back. Maybe the only way to stop this from happening would be the immediate, wholesale elimination of all machines that produce carbon, which would equate to the immediate obliteration of all industry, which would generate the same level of chaos we're desperately trying to avoid. Maybe this is how humankind is supposed to end, and maybe the downside to our species' unparalleled cerebral evolution is an ingrained propensity for self-destruction. If a problem is irreversible, is there still an ethical obligation to try to reverse it?

Such a nihilistic question is hard and hopeless, but not without meaning. It needs to be asked. Yet in the modern culture of certitude, such ambivalence has no place in a public conversation. The third rail is the enemy of both poles. Accepting the existence of climate change while questioning its consequence is seen as both an unsophisticated denial of the scientific community and a blind acceptance of the non-scientific status quo. Nobody on either side wants to hear this, because this is something people really, really need to feel right about, often for reasons that have nothing to do with the weather.

[2]
There's a phrase I constantly notice on the Internet, particularly after my wife pointed out how incessant it has become. The phrase is, “You're doing it wrong.” It started as a meme for photo captions but evolved into something different; it evolved into a journalistic device that immediately became a cliché. A headline about eyewear states, “Hey Contact Wearer, You're Doing It Wrong!” A story about how many people are watching streaming TV shows gets titled “Netflix Ratings: You're Doing It Wrong.” Newsweek runs a story with the headline “You're 100 Percent Wrong About Showering.” Time opens a banking story about disgust over ATM fees by stating, “You're doing it wrong: most Americans aren't paying them at all.” These random examples all come from the same month, and none are individually egregious. It could be argued that this is simply an expository shortcut, and maybe you think I should appreciate this phrase, since it appears to recognize the possibility that some widely accepted assumption is being dutifully reconsidered. But that's not really what's happening here. Whenever you see something defining itself with the “You're doing it wrong” conceit, it's inevitably arguing for a different approach that is just as specific and limited. When you see the phrase “You're doing it wrong,” the unwritten sentence that follows is: “And I'm doing it right.” Which has become the pervasive way to argue about just about everything, particularly in a Web culture where discourse is dominated by the reaction to (and the rejection of) other people's ideas, as opposed to generating one's own.

For a time, GQ magazine ran a monthly film column called “Canon Fodder,” where a writer would examine a relatively contemporary movie and assert that it deserves to be considered a classic. Now, this was not exactly a groundbreaking approach to criticism. It's been attempted forever. But the concept still bothered people, mostly for the way the writer, Natasha Vargas-Cooper, framed her mission in the debut essay about Terminator 2: “It's an obligation that every generation must take upon itself in order for art to thrive: tear down what's come before and hail our own accomplishments as good enough . . . Let's be untethered from history, ignore the tug of the familiar, and resolve that any movie made before, say, 1986 has received its due respect and move on . . . History does not inform the value of a film; you need never see a stylized Godard flick or Cary Grant comedy to understand the enthralling power of Fargo or Independence Day. Movies are a mass art and everyone should have opinions on them regardless of if they've seen The Deer Hunter or not.”

As a premise for a magazine column, this is fine, outside of the suggestion that Independence Day isn't complete dog shit. It has been pointed out to me (on two separate occasions) that it seems like something a younger version of myself might have written and believed. But the reason it annoyed certain serious (and self-serious) film consumers was the militancy of the tone, which might have been accidental (although I doubt it). It projects a heavy “You're doing it wrong” vibe. The proposal is not that some modern movies are also as good as those defined by prehistoric criteria, but that there is an “obligation” to reinvent the way cinematic greatness is considered. On the surface, it might seem like deliberately ignoring history and focusing on the merit of newer movies would increase our ability to think about the art form. But it actually does the opposite. It multiplies the avenues for small thoughts while annihilating the possibility for big ones. The easiest, most obvious example is (once again) Citizen Kane. Could it be argued that Citizen Kane has been praised and pondered enough, and that maybe it's time to move on to other concerns? Totally. But doing so eliminates a bunch of debates that will never stop being necessary. Much of the staid lionization of Citizen Kane revolves around structural techniques that had never been done before 1941. It is, somewhat famously, the first major movie where the ceilings of rooms are visible to the audience. This might seem like an insignificant detail, but—because no one prior to Kane cinematographer Gregg Toland had figured out a reasonable way to get ceilings into the frame—there's an intangible, organic realism to Citizen Kane that advances it beyond its time period. Those visible ceilings are a meaningful modernization that twenty-first-century audiences barely notice. It's an advance that raises a whole series of questions: Was it simply a matter of time before this innovation was invented, or did it have to come specifically from Toland (and would it have happened without the specific vision of Orson Welles)? And in either case, does the artist who invents something deserve a different level of credit from those who employ that invention later, even if they do so in a more interesting way? Is originality more or less important than we pretend?

Certainly, movies can be critically considered without worrying about these abstractions, just as they can be critically considered without any consideration over the visibility of ceilings. A writer can design whatever obstructions or limitations she desires. But when you do that, you're not really writing about canonical ideas (which wouldn't be a problem, except that this was the premise of the column).

I don't want to pop this too hard, because—having written for glossy magazines (including thousands of words for GQ)—I know how this process works. I assume the goal here was to create a film column that immersed itself in movies the mag's audience had directly experienced, so a high-minded reason was constructed to explain why this was being done (and the explanation for that reason was amplified to create a sense of authority). In a completely honest world, the column would have been titled “Here Are Movies We Arbitrarily Want to Write About.” But I note it because this particular attempt illustrates a specific mode of progressive wisdom: the conscious decision to replace one style of thinking with a new style of thinking, despite the fact that both styles could easily coexist. I realize certain modes of thinking can become outdated. But outdated modes are essential to understanding outdated times, which are the only times that exist.
