Trust Me, I'm Lying: Confessions of a Media Manipulator
By Ryan Holiday
Him: Why keep the headline up, since we now know it’s not true?
Blogger: You guys are so funny.
Bloggers often stick their updates way down at the bottom, because they are vain, just like the rest of us—they’d rather not shout their mistakes loudly for all to hear, or have them be the first thing the reader sees. In other cases blogs will just paste your e-mail at the bottom of the post, as though it’s “your opinion” that they’re wrong. Of course it isn’t just an opinion or they wouldn’t have been forced to post it. But they get to keep the article up by framing it as a two-sided issue. The last thing they want to do is rewrite or get rid of their post and throw away the few minutes of work they put into it.
BEING WRONG
Factual errors are only one type of error—perhaps the least important kind. A story is made of facts, and it is the concrescence of those facts that creates the news story. Corrections remove individual facts from the story—but the story and its thrust remain. Even writers who do acknowledge their errors will only under the rarest of circumstances follow the logic to its conclusion: a challenged fact requires a reexamination of the premises built on top of it. In other words: We don't need an update; we need a rewrite.
Like when Business Insider editor Henry Blodget reported "unconfirmed rumors" that three prominent journalists had been hired away from their old media jobs for blogging gigs with salaries of close to half a million dollars a year. He reported this despite the fact—as he admits, and as he quoted in the article—that a source told him the numbers were "laughable." The next day, in a post titled "DAILY BEAST: We're Not Paying Howard Kurtz $600,000 a Year!" he acknowledged that in response to his story another source had shot down his speculation, calling it "wildly inflated figures of hyper-active imaginations." Not to be discouraged, Blodget finished this update with some "new information": another set of rumors about what other journalists were being paid. All the same, he concluded—despite having the reasons for the conclusion demolished—"it looks like a new golden age for those in the news business."
The real golden age for journalists is the one when a guy like Blodget not only gets traffic by posting jaw-dropping rumors, but then also gets traffic the next day by shooting down the same rumors he created. And then he has the balls to start the cycle all over again with his very next breath. That he was wrong doesn’t even begin to cover it: The man has an aversion to the truth and not the slightest bit of guilt about it.
He’s not alone. I once heard Megan McCarthy (Gawker, TechMeme, CNET) speak at a SXSW panel about how false stories, such as a fake celebrity death, spread online. During the Q&A I got up and asked, “This is all well and good, but what about mistakes of a less black-and-white variety? You know, something a little more complex than whether someone is actually dead or not. What about subtle untruths or slight mischaracterizations? How does one go about getting those corrected?” She laughed: “I love your idea that there can be nuance on the Internet.”
THE PSYCHOLOGY OF ERROR
If it were simply a matter of breaking through the endemic arrogance of bloggers and publishers, iterative journalism might be fixable. But the reality is that learning iteratively doesn’t work for readers either—not even a little.
Think of Wikipedia, which provides a good example of the iterative process. By 2010 the article on the Iraq War had accumulated more than twelve thousand edits. Enough to fill twelve volumes and seven thousand printed pages (someone actually did the math on this for an artistic book project). Impressive, no doubt. But that number obscures the fact that though the twelve thousand changes collectively result in a coherent, mostly accurate depiction, it is not what most people who looked at the Wikipedia entry in the last half decade saw. Most of them did not consume it as a final product. No, it was read, and relied upon, in piecemeal—while it was under construction. Thousands of other Wikipedia pages link to it; thousands more blogs used it as a reference; hundreds of thousands of people read these links and formed opinions accordingly. Each corrected mistake, each change or addition, in this light is not a triumph but a failure. Because for a time it was wrongly presented as being correct or complete—even though it was in a constant state of flux.
The reality is that while the Internet allows content to be written iteratively, the audience does not read or consume it iteratively. Each member usually sees what he or she sees a single time—a snapshot of the process—and makes his or her conclusions from that.
An iterative approach fails because, as a form of knowledge, the news exists in what psychologists refer to as the “specious present.” As sociologist Robert E. Park wrote, “News remains news only until it has reached the persons for whom it has ‘news interest.’ Once published and its significance recognized, what was news becomes history.” Journalism can never truly be iterative, because as soon as it is read it becomes fact—in this case, poor and often inaccurate fact.
Iterative journalism advocates try to extend the expiration date of the news’s specious present by asking readers to withhold judgment, check back for updates, and be responsible for their own fact-checking.* Bloggers ask for this suspended state of incredulity from readers while the news is being hashed out in front of them. But like a student taking a test who tries to slow down time to get to the last few questions, it’s just not possible.
Suppressing one’s instinct to interpret and speculate until the totality of evidence arrives is a skill that detectives and doctors train for years to develop. It is not something we regular humans are good at; in fact, we’re wired to do the opposite. The human mind “first believes, then evaluates,” as one psychologist put it. To that I’d add, “as long as it doesn’t get distracted first.” How can we expect people to transcend their biology while they read celebrity gossip and news about sports?
The science shows that we are not only bad at remaining skeptical, we’re bad at correcting our beliefs when they’re proven wrong. In a University of Michigan study called “When Corrections Fail,” political scholars Brendan Nyhan and Jason Reifler coined a phrase for it: the “backfire effect.”3 After showing subjects a fake news article, half of the participants were provided with a correction at the bottom discrediting a central claim in the article—just like one you might see at the bottom of a blog post. All of the subjects were then asked to rate their beliefs about the claims in the article.
Those who saw the correction were, in fact, more likely to believe the initial claim than those who did not. And they held this belief more confidently than their peers. In other words, corrections not only don’t fix the error—they backfire and make the misperception worse.
What happens is that the correction reintroduces the claim into the reader’s mind and forces the reader to run it back through their mental processes. Instead of prompting them to discard the old thought, as intended, corrections appear to tighten the mind’s grip on the now disputed fact.
In this light, I have always found it ironic that the name of the Wall Street Journal corrections section is “Corrections & Amplifications.”* If only they knew that corrections actually are amplifications. But seriously, there can’t really be that many cases where a newspaper would ever need to “amplify” one of its initial claims, could there? What are they going to do? Issue an update saying that they didn’t sound haughty and pretentious enough the first go-round?
Bloggers brandish the correction as though it is some magical balm that heals all wounds. Here’s the reality: Making a point is exciting; correcting one is not. An accusation spreads far more quickly than a quiet admission of error issued days or months later. Upton Sinclair used the metaphor of water: the sensational stuff flows rapidly through an open channel, while administrative details like corrections hit the concrete wall of a closed dam.
Once the mind has accepted a plausible explanation for something, it becomes a framework for all the information that is perceived after it. We’re drawn, subconsciously, to contort all the subsequent knowledge we receive to fit that framework, whether it belongs there or not. Psychologists call this cognitive rigidity. The facts that built the original premise are gone, but the conclusion remains—the general feeling of our opinion floats over the collapsed foundation that established it.
Information overload, “busyness,” speed, and emotion all exacerbate this phenomenon. They make it even harder to update our beliefs or remain open-minded. When readers repeat, comment on, react to, and hear rumors—all actions blogs are designed to provoke—it becomes harder for them to see real truth when it is finally presented or corrected.
In another study researchers examined the effect of exposure to wholly fictional, unbelievable news headlines. Rather than cultivate detached skepticism, as proponents of iterative journalism would like, it turns out that the more unbelievable headlines and articles readers are exposed to, the more it warps their compass—making the real seem fake and the fake seem real. The more extreme a headline, the longer participants spend processing it, and the more likely they are to believe it. The more times an unbelievable claim is seen, the more likely they are to believe it.4
It is true that the iterative model can eventually get the story right, just as, in theory, Wikipedia perpetually moves toward higher-quality pages. The distributed efforts of hundreds or thousands of blogs can aggregate a final product that may even be superior to what one dedicated newsroom could ever make. When they do, I’ll gladly congratulate them—they can throw themselves a Twitter-tape parade for all I care—but I’ll have to remind them when it’s all over that it didn’t make a difference. More people were misled than helped.
The ceaseless, instant world of iterative journalism is antithetical to how the human brain works. Studies have shown that the brain experiences reading and listening in profoundly different ways; they activate different hemispheres for the exact same content. We place an inordinate amount of trust in things that have been written down. This comes from centuries of knowing that writing was expensive—that it was safe to assume someone would rarely waste the resources to commit something untrue to paper. The written word conjures up deep associations with authority and credence that are thousands of years old.
Iterative journalism puts companies and people in an impossible position: Speaking out only validates the original story—however incorrect it is—while staying silent and leaving the story as it was written means that the news isn’t actually iterative. But acknowledging this paradox would undermine the premise of this very profitable and gratifying practice. I can’t decide if it is more ironic or sad that the justification for iterative journalism needs its own correction. If only Jeff Jarvis would post on his blog: “Oops, turns out errors are a lot more difficult to correct than we thought…and trying to do so only makes things worse. I guess we shouldn’t have pushed this whole ridiculous enterprise on everyone so hard.”
That would be the day.
Instead, the philosophy behind iterative journalism is like a lot of the examples of bad stories I have mentioned. The facts supporting the conclusions collapse under scrutiny, and only the hubris of a faulty conclusion remains.
* Conveniently, this is also the reading style that generates the most pageviews for the blog.
* By comparison, the wire service Reuters puts its updates and new facts at the top of its articles and often reissues them over the wire to replace the older versions.