
The reviews themselves are often insipid or thoughtless, yet their value for those businesses is undeniable. A study published in the Economic Journal found that when a restaurant’s Yelp rating was bumped by just half a star, it correlated with an increased number of patrons, even while all other factors (price, quality of food, service) remained constant. This naturally leads many businesses to create false enthusiastic reviews in an effort to sway public opinion, or at least sway public wallets. Perhaps the surest sign of a Yelp review’s significance is the vehemence it can inspire: A restaurateur in Ottawa’s famous ByWard Market, for example, was found guilty of libel and sent to jail after she launched an aggressive Internet smear campaign targeted at the author of a critical review.

It’s this devotion to, and obsession with, a flattened critical world—one where amateurism and self-promotion take the place of the “elite” critical voices we once relied upon—that leads writers like Andrew Keen (author of The Cult of the Amateur) to baldly state: “Today’s internet is killing our culture.” We get mob opinion instead of singular voices; crowdsourced culture. Consider the Unbound Publishing project, which democratizes the selection of which books get written. Authors pitch ideas to users, who then choose whether or not to fund the writing of said books. “Books are now in your hands,” enthuses the Web site. This sounds like a splendid way to produce top-rate Twilight porn (and I tip my hat to the creators of such best sellers), but what is the spectrum of books that such an approach will produce, and what sort does it cancel out?

We’ve proceeded this far with the belief that the broadcasting of our voices is a positive—and certainly it can be—but now our job is to temper that belief with a second question: Might we suffer from opinion glut?

• • • • •

 

In my years as a critic for various papers, I’ve reviewed visual art, opera, chamber music, dance, books, and theater. And what I’ve heard from my fellow critics—during intermission at concert halls or filing out of some independent theater—is a resounding condemnation of the new critical order. “These fucking bloggers,” one music writer said to me at a cocktail event, “they swoop in and gobble up all the advertisers, pumping out totally uninformed, shittily written drivel. It just makes me wonder why I even bother doing research or interviews.” Another veteran critic—from the theater beat—was resigned to the fact that “every time the Internet expands, my job gets smaller. There’s less and less space for theater reviews in the paper. Or paid theater reviews, anyway.” Another simply noted: “God, I mean, you read these reviews online and they don’t, you know, even know how to use apostrophes. Don’t people care that they’re reading stuff written by people lacking a basic grasp of the language?” When everyone becomes an expert, the old experts fade away.

Professional critics have their uses, though—we can aggregate them. Web sites like Rotten Tomatoes (with ten million unique users visiting each month) use masses of data, a crowd of critical reviews, to create an average star rating for films. For reasons best left unexplored, I wrote a review in The Globe and Mail of Uwe Boll’s 2008 debacle, Postal, and Rotten Tomatoes has been using my grouchy opinion of the film ever since. (Naturally, I’ve not received a penny.) While it might be informative to know that 124 reviews of the Vince Vaughn comedy The Internship could be mashed down into a single sulking number—it got 34 percent on Rotten Tomatoes—what does such an aggregation mean for the livelihoods of the critics whose work has been the fodder for the Web site? Who would pay to read a single critic’s work when it’s already been processed by such a godly and free-of-charge algorithm? Our generation seems to be facing a crisis of critique. We want to know what’s best, we want to know where to eat and what movie to see, but we’ve begun to forget that real opinion, real critique, must always come out of an absence of voices—from a singular subjective viewpoint. You cannot aggregate taste. But in the flood of rating systems and collectivized percentage values, which guide us toward TV shows on Netflix or songs on iTunes, we don’t register the loss of that less aggressive suggestion system we always relied on before: face-to-face encounters and singular critics.

I was surprised to find a sympathetic listener in Matt Atchity, editor in chief over at Rotten Tomatoes. I told him I don’t love the idea of aggregating critical opinion, saying, “In some ways it’s anathema to the whole point of criticism, since it strips the critic of a subjective voice.” And Atchity told me, “My worry about that is the one thing that keeps me up at night.” I asked him how he thought of his own role in critical debates, and he told me his job is to amass the best opinions in the country for his millions of readers. “Sometimes I feel like I’m the town crier,” he told me. “I feel like I’m a herald.” Atchity may have good intentions, but the aggregation and flattening of critics still continues at Rotten Tomatoes.

Shall we engineer instead a kind of critical vacuum, an artificial absence of voices, in which comprehensible and highly subjective opinions can prosper?

Perhaps we’ll get more of a critic vacuum from companies like Songza, the music-streaming Web site that delivers playlists curated by experts (and occasionally celebrities, from Justin Bieber to New York City’s former mayor Mike Bloomberg). Songza is founded on a simple enough premise: If there are twenty-four million songs on the shelf, people become baffled by the panoply of content and fall back on the few songs they already know; access to everything encourages exploration of nothing. Songza’s job is to ask you what you’re in the mood for (taking a sunny stroll? preparing for bedtime?) and then introduce you to music you didn’t know you wanted for the occasion. It’s an approach that’s working. On any given day, seventy million minutes of activity are logged on Songza. I spoke with the company’s cofounder Elias Roman, a twenty-nine-year-old wunderkind from Queens who’s found his way onto the Forbes “30 Under 30” list. I admit I was relieved to hear his ideas about the future of music. “Some things are easy to crowdsource,” he told me, “but when you’re interested in constructing a playlist, a coherent whole, it’s more than just aggregating a bunch of binaries. I’m saying that there is a value to tastemaking.”

Tastemaking? The very term sounded antique, wonderfully elitist, coming from the founder of a digital start-up. “We have a desire here to be tastemakers,” he continued. “While our algorithms will sometimes offer music that a user has chosen in the past, we have a mandate that the site always brings forward songs you don’t know you want yet. There’s always going to be both comfort food and something surprising.”

Roman’s insistence on tastemaking flies in the face of most content providers, who seek only to gratify the known desires of users. And it’s an impulse that could go a long way toward countering something that Internet activist Eli Pariser has coined “the filter bubble.”

Here’s how a filter bubble works: Since 2009, Google has been anticipating the search results that you’d personally find most interesting and has been promoting those results each time you search, exposing you to a narrower and narrower vision of the universe. In 2013, Google announced that Google Maps would do the same, making it easier to find things Google thinks you’d like and harder to find things you haven’t encountered before. Facebook follows suit, presenting a curated view of your “friends’” activities in your feed. Eventually, the information you’re dealing with absolutely feels more personalized; it confirms your beliefs, your biases, your experiences. And it does this to the detriment of your personal evolution. Personalization—the glorification of your own taste, your own opinion—can be deadly to real learning. Only if sites like Songza continue to insist on “surprise” content will we escape the filter bubble. Praising and valuing those rare expert opinions may still be the best way to expose ourselves to the new, the adventurous, the truly revelatory.

• • • • •

 

Commensurate with the devaluing of expert opinion is the hypervaluing of amateur, public opinion—for its very amateurism. Often a comment field will be freckled with the acronym IMHO, which stands for the innocuous phrase “in my honest opinion” (or, alternatively, “in my humble opinion”). It crops up when someone wishes to say anything with impunity and has become the “get out of jail free” card of public debate. “IMHO Mexicans should learn to speak proper English if they’re going to work in our restaurants.” Can’t touch me! Just my opinion!

I’ve come to see “IMHO” as a harbinger of bullshit. IMHO usually portends a comment that is ill conceived and born of either private prejudice or a desire to trumpet. It’s part of a public debate culture in which the “honest opinion” is worthy of publication and consumption not because of any particular merit, but because it is “honestly” the writer’s “opinion.” In his charming little book On Bullshit, the moral philosopher Harry G. Frankfurt offers a useful equation for predicting the manufacture of the manure in question:

Bullshit is unavoidable whenever circumstances require someone to talk without knowing what he is talking about. Thus the production of bullshit is stimulated whenever a person’s obligations or opportunities to speak about some topic exceed his knowledge of the facts that are relevant to that topic.

 

By this reckoning, haven’t we created bullshit machines? In the more than one hundred million amateur travel reviews that fuel TripAdvisor, for example, isn’t it likely that our ability to speak publicly almost always exceeds our knowledge? The invitation to bullshit, anyhow, is constant.

When I find myself drowning in bullshit—my own and that of others—I think about what it’d be like to sit outdoors at some New York City café, circa 1930, and open a copy of The New Yorker, maybe read a book review by Dorothy Parker. What must that have felt like? To draw in a few hundred words of commentary, both discernible and, yes, discerning, completely void of miscellaneous commentary? Wipe away the democratic clamor of “honest opinion” and find beneath a single salient voice. Ms. Parker’s “honest opinions” were often briefly stated; she knew the soul of wit (“like the soul of lingerie”) was its brevity. When Parker reviewed A. A. Milne’s now-beloved The House at Pooh Corner, she made short work of it: After describing Pooh and Piglet humming in the snow, she demurs, “Oh darn—there I’ve gone and given away the plot.” And nobody jabbered a response. . . . Clarion calls like Parker’s weren’t smothered by dozens of voices clouding the air with half-baked comebacks.

The review read, the magazine folded and tossed aside, one decides to trust or not trust Parker’s opinion and leave it at that. Perhaps on rare occasions a letter is written to the editor (which might be published, if thoughtful enough), but mostly the discussion is one-way and finite. What a lovely thing, to shut up and listen and not broadcast anything back. There’s a certain serenity in it and even a kind of light grace.

There has always been an abundance of bullshit. But never before have so many been implicated in the bullshit rigmarole that is public conversation. Before, most of us could point at the bullshitters among us (the politicians and hotheaded pundits) and shake our heads. Today, no such finger-pointing can be allowed because we all swim in the mess. When the armchair philosophers are handed megaphones and the crowd of “honest opinion” starts to overwhelm the trained few, will we begin to think that bullshitting is the only and the natural way to make a call? Or will we engineer opinion vacuums, weed out the bullshit, and separate what is best from what is customary?

CHAPTER 5
Authenticity
 

But isn’t everything here green?

—Dorothy, in L. Frank Baum’s The Wonderful Wizard of Oz

Andrew Ng holds a position in the Computer Science Department at Stanford University, where he regularly lectures, year after year, to classrooms of roughly four hundred bright and privileged students. Back in 2008, a video project he launched called Stanford Engineering Everywhere (SEE) let him broadcast base approximations of those classes online. Ng simply stuck a camera at the back of the lecture hall and posted the whole thing on his site, like the world’s most boring bootlegged movie. Yet the response—excited viewers kept chatting him up at his Silicon Valley Starbucks—got Ng thinking. There was an appetite for world-class education among those without the means or wherewithal to attend an institution like Stanford. How far could that hunger be satisfied? Could the Internet, like other communication advances, allow us (even compel us) to redistribute monopolies of knowledge? Doesn’t all information in fact want to be free?

Over the next few years, Ng worked out of his living room, developing much of the technology and theory that’s used today in “massive open online courses” (MOOCs). Ng was driven by a single question: How can we develop a course that scales to arbitrarily large numbers of students? The answer came in the form of autograded quizzes, discussion forums, a more dynamic lecture recording style, and the startling proposal that peer grading could be as effective as grading from a single authority (if every student grades, and is graded by, five others). In the summer of 2011, Ng set up a new course online, and one hundred thousand students signed up. He did the math in his head: I’ll need to teach a class at Stanford for 250 years to reach that many people.

The MOOC revolution had begun. On April 18, 2012, Ng announced (along with Daphne Koller) the online learning platform Coursera.org. And Ng’s assumptions about that hidden appetite for higher learning proved correct. Latest numbers show Coursera hosts more than five million students who collectively enroll in 532 courses offered by 107 institutions around the globe, including Princeton and Yale.

The advantages of MOOCs are many and clear. Online videos of lectures are massively abbreviated, so an hourlong lecture might become a five-minute video focusing on a single, action-minded outcome. Instead of showing a lecturer pacing back and forth in front of bored students, Ng now overlays talk onto visuals that show graphics and handwritten key points—“just the content,” as Ng has it. “We also use video editing to cut out the boring bits,” he told me. “Some things, like a professor writing for ages on a board, you just don’t need to see.”

And then there’s the data. The piles and piles of data. Coursera doesn’t just educate you, it learns from you, too. Every keystroke, every mouse click, is logged in Coursera’s rapidly expanding data banks. When a student pauses a video, Coursera takes note; when a student needs more time than usual to answer a question, Coursera logs that, too; it knows when you skip a video, what questions you asked of peers, and what time of day you read the answer. Over its first year or so, Ng told me, “Coursera collected more educational data than the entire field of education has collected in its five-thousand-year history.”

To what end? Consider a single programming assignment that Ng devised for one of his MOOCs. Thousands of students submitted a wrong answer—but what struck Ng was that Coursera could scan its data and reveal that two thousand students had made exactly the same mistake in their work. “I was able to create a custom correction message, which would pop up when people had the usual misconception. In a normal class of one hundred students, you won’t notice these patterns. So, ironically, in order to achieve this level of personalization, what we needed was to teach a class to one hundred thousand people.” (I take his point, though I’m not sure that my definition of personalization is the same as Ng’s.) The hope, here, is that mass data analysis will allow Coursera, and other MOOC providers, to create software that personalizes education in much the same way that Netflix, Google, and Amazon already personalize your experience of movie watching, searching, and shopping. Imagine taking a class on twentieth-century literature and receiving helpful advice that “other learners like you” have found useful. The process of intellectual exploration, once highly idiosyncratic, becomes an opportunity to promote whatever material has the highest view count. “Until now,” Ng told me, “education has been a largely anecdotal science, and we can now make it data-driven.” This reminded me, of course, of Karthik Dinakar, eager to “harden” the soft sciences of psychology and psychiatry with reams of crowdsourced data.

The crowdsourcing of education is further highlighted by Ng’s interest in Wiki lecture notes. “At Stanford,” he explained to me, “I taught a class for a decade, and writing the lecture notes would take forever. And then, every year, students would find more bugs, more errors, in my notes. But for online classes, I put up a Wiki and invite students to write their own lecture notes; students watch my lectures and create the notes themselves. When you have one hundred thousand students reading and editing the same Wiki lecture notes, the result is a higher quality of text than I could create on my own. Bugs are rapidly squashed.” I ask whether the same principle that works for his engineering classes would work for classes on art history or creative writing. Ng pauses for a beat before replying: “I haven’t seen any evidence that would suggest otherwise.”

Nevertheless, MOOCs and the attendant dematerialization of the education process are creating a certain crisis of authenticity. A large Pew Research Center survey found that most people believe we’ll see a mass adoption of “distance learning” by 2020, and many are wondering whether that will brush aside the sun-dappled campuses, shared coffees, and lawn lolling that pre-Internet students considered so essential to their twenty-something lives.

There are also more concrete points to consider. Graduation rates, for starters: Another MOOC godfather at Stanford, Sebastian Thrun (of Udacity), was tantalized for a while by the possibility of bringing Ivy League education to the world’s unfortunates, but he later announced in Fast Company magazine that less than 10 percent of his MOOC students were actually completing courses. He had become deeply dissatisfied with the MOOC world he had helped to bring about: “We don’t educate people as others wished, or as I wished,” he said. “We have a lousy product.” After signing up nearly two million students for online courses, Thrun despaired at the dismal completion rates, and only about half of those who did complete courses appeared to be learning the material in a meaningful way.

Ng remains hopeful. “I think a lot of content can and will go online,” he told me. “The economics certainly work out better that way. But I don’t see us replicating the crucial mentor experience, the small-group experience. What I’d like to do is automate routine activities like grading papers and lectures, to free up the professor’s time for real conversations. The role of the traditional university is going to be transformed.” Meanwhile, the nonprofit enterprise edX announced in 2013 an artificial intelligence program that instantly grades essays and written answers, eliminating the need for a professor’s comments.

Ng himself often compares the digital revolution with the original Gutenberg moment, so it follows that he would assume a digital enlightenment is about to follow. “I think we can change the world one day,” he says matter-of-factly. “If a poor child in a developing country takes a dozen courses in computer sciences that he didn’t have access to before, and then can earn a decent wage, I think in that way we can change the world.” Who would deny such an enlightenment? But it may be worth noting here that most Coursera students are not from developing countries. At present, Africa makes up 3.6 percent of the students, while more than a third come from North America and a further third hail from Europe.

Neil Postman, the pioneering technology critic, argues in Technopoly that “school was an invention of the printing press and must stand or fall on the issue of how much importance the printed word has.” By this measure, Coursera and its ilk are a kind of necessity, a rearrangement of education that’s inevitable as our means of communication changes. “For four hundred years schoolteachers have been part of the knowledge monopoly created by printing,” continues Postman, “and they are now witnessing the breakup of that monopoly.” In the days of Thamus (see chapter 2), the written word was a kind of inauthentic knowledge, and then it became the only true form of knowledge. Is it so unlikely that we’re undergoing a similar reevaluation today?

The new knowledge monopoly will feel comparatively abstract, if history is any guide. Advances in cartography, for example, delivered maps that substituted an abstract space for firsthand experiences with natural landscapes; the mechanical clock parsed leisurely “natural time” into regimented sections so that the gong of a church bell had more sway over your comings and goings than your body’s own inclinations. Arguably, the larger and more productive world that our technologies deliver is simultaneously an impoverished version of the older one—a version that rejects direct experience and therefore rejects an earlier conception of reality that had its own value. We see more, yet our vision is blurred; we feel more things, yet we are numbed. Marshall McLuhan argues that whenever we amplify some part of our experience with a given technology, we necessarily distance ourselves from it, too. (A friend of mine saw those airplanes crash into the World Trade Center while sitting in her living room on the other side of the continent—and thought, against her will, of a movie.)

• • • • •

 

Some lens has been shuttered over our vision. We all have felt it. Even as we draw more of the world into our lives, gaining access to people and events we never had access to before, we feel that the things we gather lose some veracity in transit. It happens to me constantly. At my brother’s wedding, a hundred of us gathered in my parents’ backyard, beneath the glow of trailing paper lanterns strung throughout the trees and white tents. I remember breaking away from the festivities to check my phone, only to find that my friend was posting photos of the very wedding I’d stepped away from: pixelated simulacra of the moment I had left.

The most obvious reason a person would ditch the authentic is, of course, to gain access to a heightened version of dull reality. Enter the promise and wonder of Google Glass, released in 2013, which offers just that—augmented reality. The “wearable computer” is a (slightly futuristic, slightly dorky) headset fixed with a miniature display and camera, which responds to voice commands. We can tell it to take a picture of what we’re looking at or simply pull up Google Images’ archive of vintage Hulk Hogan photos because we want to compare the hairdo being sported by that guy on the metro. The company’s welcoming Web site smiles: “Welcome to a world through glass.” Welcome to augmented (read: inauthentic) reality.

Remember that the Emerald City in The Wonderful Wizard of Oz isn’t actually emerald. In the Hollywood film version, yes, Judy Garland and her gang traipse through a gorgeous, sparkling town. But in L. Frank Baum’s original book, Dorothy and the others are exhorted to put on “safety goggles” to protect their eyes. “If you do not,” they are warned, “the brightness and glory of the Emerald City would blind you.” Only much later do they discover that it was the green-tinted goggles all along that gave the city its apparent luster. The Emerald City (like the “wizard” behind the curtain) is a fake. “But isn’t everything here green?” asks Dorothy. “No more than in any other city,” replies Oz. “But my people have worn green glasses on their eyes so long that most of them think it really is an Emerald City.”

When we wear emerald glasses with the intention of augmenting reality, we’re always giving ourselves over to some authority’s vision and relinquishing a portion of our own independent sight.

All our screen time, our digital indulgence, may well be wreaking havoc on our conception of the authentic—how could it not? But, paradoxically, it’s the impulse to hold more of the world in our arms that leaves us holding more of reality at arm’s length. Coursera.org delivers the world’s great teachers to your living room but turns education into a screen interface; a child’s cell phone keeps her in constant touch with her friends but trains her to think of text messaging as a soulful communication.

When Walter Benjamin meditated on the advent of mechanical reproduction in 1936, he was already wondering at the uncanny changes that take place when “a cathedral quits its site to find a welcome in the studio of an art lover” or “a choral work performed in a hall or in the open air can be heard in a room.” When Benjamin went to the movies—which were now, amazingly, delivering talking images on the screen—he saw that they turned rare beauties into easily accessible experiences, all the while degrading the “aura” of that which they projected, their “genuineness.” He wrote: “The genuineness of a thing is the quintessence of everything about its creation that can be handed down, from its material duration to the historical witness that it bears.” What a strange concern, we might think—historical witness. It’s that antique notion of actually being there, of becoming richer by being one of the few people or things to have lived in a singular moment, a singular place. Benjamin even worried about the actors he saw on movie screens, noting that “for the first time . . . a person is placed in the position, while operating with his whole being, of having to dispense with the aura that goes with it. For that aura is bound to his here and now; it has no replica.” It’s a worry, a sensibility, that’s nearly demolished by YouTube and its ilk; we aren’t trained to care about the genuineness of things when digital copies give a zombie-scale crowding of content. This outdated concern for genuineness—for aura—requires absence, that one thing we have in short supply. The endgame is this: Without absence in our lives, we risk fooling ourselves into believing that things (a message from a lover, the performance of a song, the face of a human body) matter less. De Beers hoards its diamonds to invent a scarcity that equals preciousness. Perhaps we now need to engineer scarcity in our communications, in our interactions, and in the things we consume. Otherwise our lives become like a Morse code transmission that’s lacking breaks—a swarm of noise blanketing the valuable data beneath.
