
One way to think about this conundrum is to ask whether identity, intimacy, and imagination might have evolved in the manner described, even if computers had ceased to evolve after 1950—the date at which Riesman's and Erikson's pivotal books were published and, as it happens, about the time that Howard entered elementary school. In the sciences, we call this a “thought experiment”: no desktops, laptops, tablets, web, Internet, or social networks. Howard can easily imagine such a world, because that's the world into which he (and, for that matter, all previous generations) was born. It's difficult for Katie to imagine, and probably next to impossible for Molly or for Howard's grandchildren to envision—if late, or if lost, what would you do without your cell phone?

(Example: In one of the groups on which Howard was permitted to eavesdrop, young people were discussing how they phone family and friends while driving a car or walking their dog. If they could not check in at those times, they lamented, they'd probably never link up with these most valued other persons. Afterward, Howard reminded participants that, when their parents and grandparents were growing up, mobile phones had not yet been invented.)

In the pages of twentieth-century science fiction, we can discern anticipations of the twenty-first-century world—the kinds of worlds envisioned by writers like Isaac Asimov, Ray Bradbury, Robert Heinlein, Ursula K. Le Guin, or, for that matter, Anthony Burgess. Clearly a digital world could be imagined, even in its actual absence, with the utopian implications that excited, or the totalitarian implications that disturbed, imaginative writers and observers. But could the world that we've discerned actually have come about without technological innovations?

Here is our best guess. Some of the features we've described could have come to pass even if the technology had been frozen at midcentury. To give one example, the reluctance to take risks may emerge from a belief that there is a best way to do everything and that way is to find the right “app”; but it might also have come about if, for any reason, resources were sharply reduced or competition sharply increased. Getting into a desirable college and securing the right job have long been goals of young persons (and their parents!)—and understandably so. When opportunities were plentiful, there was less need to walk the straight and narrow, more opportunity to take chances, to forge new paths. (Howard's generation benefited from that brief, Camelot moment.) But when, for any number of reasons, these goals prove far more difficult to attain, a predilection to follow a well-trod path is readily understandable and just as readily justified.

Other features that we've discerned seem much more closely tied to the digital revolution. For example, it is hard to imagine how students could be connected to one another day and night in the absence of mobile phones—and that connectedness has clear implications for intimacy and, as we've suggested, for identity as well. The capacity of human beings to create and promulgate new knowledge is also radically transformed by digital ware. Information that took days or even weeks to track down in Howard's youth—he remembers countless (and often fruitless) treks through the stacks of libraries—can now be located in a matter of seconds on the web; by the same token, findings, claims, and counterclaims are also subject to the 24/7 deluge of information that changes fundamentally the contours, if not the frontiers, of knowledge. We've already seen the differential effect on creativity in two expressive media: literary language and graphic depiction. The pros and cons of “creativity by groups”—be they large “sourced” crowds or small e-salons—have yet to be determined for the digital era.

So far, we've only mentioned the possible causal roles of specific technologies—or of their absence. But of course, many other factors have been at work in the catalysis of generations in earlier times. Clearly, young people will have been defined in part by epochal political and military events—in the United States at the time of the Revolutionary or Civil War, in France or Russia or China at the time of their respective political revolutions, anywhere in battle during the First World War, the Second World War, or the Vietnam or Iraq (or Middle Eastern or Balkan) conflicts. And as we've noted, the consciousness of generations can be engendered by other occurrences, be they financial events (the Great Depression, the rise of the consumer society, the eruption of the mortgage crisis), natural disasters (fires, plagues, earthquakes, tsunamis), or manmade occurrences (the Apollo mission to the moon, the Challenger explosion, the attack on the Twin Towers).

The very identification of other causative factors is salutary and humbling. Even if the current generation is inconceivable in the absence of the technologies of the past half century, these technologies do not and cannot act in isolation. Doubtless, there will be interactions among technological, financial, political, military, natural, and manmade epoch-making events. The most careful students of generational consciousness can and should trace these factors and their interactions scrupulously.12 (Those who want to understand the American civil rights revolution or the concomitant assertion of women's rights are equally well advised to examine a congeries of events.) And yet we believe that there is a need for some observers to step back and to try to discern the “forest consciousness” that may undergird the many “contributing trees.” In introducing and coining the term App Generation, that's what we have tried to do.

One more point, which we need to state as clearly and forcibly as possible: Much of what we've written in this book can be seen as critical of the current generation. Characterizations such as “risk-averse,” “dependent,” “superficial,” and “narcissistic” have been asserted, even bandied about. We have to stress, accordingly, that even if these descriptors have merit, in no sense are we blaming members of the App Generation. Clearly, these characterizations have come about, at least in significant part, because of the ways in which young persons have been reared (or failed to be reared) by their elders—in this case, Howard's generation and the ones that immediately followed his. If there is a finger to be pointed, it should be aimed at earlier generations and not at the adolescents and young adults of our time.

At the head of the chapter, we've affixed a statement by the philosopher Alfred North Whitehead. Though it may be well known among the digerati (Howard first heard it quoted by a leading technologist), we had not encountered it until we were putting the finishing touches on this book. At first blush, the statement sounds just right. One finds oneself nodding in agreement—yes, we value those inventions that allow us to make habitual those thoughts and actions that could consume much time and effort. And indeed, we can think of many manmade devices (ranging from the creation of script to the invention of the credit card) that have allowed us to simplify formerly complex operations and to move on to other things. Is civilization even imaginable without a multitude of labor-saving devices that free our hands and minds? Thank goodness for the “flywheel of civilization”!

Yet on reflection, Whitehead's statement seems increasingly dual-edged to us. For sure, most of us would like to automatize as much as possible; our psychological antagonists, behaviorists and constructivists, would agree. But do we want to automatize everything? And who decides what is important? And where do we draw the line between an operation and the content on which the operation is carried out? The contrasting cases are brought up sharply by Anthony Burgess. The uncivilized Alex decides too much on his own, and that creates mayhem. But the overly civilized Alex has lost altogether the power of decision—all has been molded and modeled by outside forces. (Remember the end of The Adventures of Huckleberry Finn: “But I reckon I got to light out for the Territory ahead of the rest, because Aunt Sally she's going to adopt me and sivilize me and I can't stand it. I been there before.”)13
As we consider the effects of the digital (and, particularly, the app) revolution on our society, we need perennially to ask the question: Do we want to automate the most important operations or do we want to clear the deck so that we can focus, clear-eyed and with full attention, on the most important issues, questions, enigmas?

BEYOND THE THREE I'S: THE REALMS OF RELIGION AND ETHICS

As psychologically oriented scholars focused on youth, writing in a post-Riesman, post-Erikson era, we can justify our decision to focus here on the issues of identity, intimacy, and imagination. (Had we studied young children, we might have chosen to talk about trust or initiative or industry; had we focused on older persons, issues of generativity or integrity might have come to the fore.) But especially at a time when claims about life stages and life cycles are being reexamined, we should also touch on a few other spheres in which digital technologies may cast a wide shadow.

First, religion. In one sense, religion (especially as we have known it in the West) can easily be thought of in “app” terms. Many, perhaps most, of the rituals involved in regular religious practice can be thought of as “apps”—though of course they are human-initiated and human-choreographed rather than downloaded onto one's device. Indeed, the prayer or ritual only works if it is carried out according to the specified procedures. Stepping back, it is also possible to think of the religious life, well lived, or appropriately lived, as a kind of super-app—we must attempt as much as possible to emulate the lives of saints while avoiding the sins (and the sinners) of greed, envy, and other vices.

Yet, perhaps paradoxically, it seems that, in some ways, the app world is antipathetic to religion, or at any rate to traditional organized religion. At least in the United States and much of Europe, young people today are less religious, certainly less formally religious, more skeptical of organized religion, more willing to shift religions, to marry across religious boundaries, and the like. Clearly these trends are not particularly dependent on apps, in the literal sense. Some have unfolded over decades, if not centuries. And yet, the diversity of apps may push us toward defining our own religious practice, in our own way, even our own brand of spirituality, whether or not it happens to conform to that practiced across town or around the neighborhood or even in the next room. And indeed, such exploration can be aided by various apps, which range from Note to God (this app allows users to submit notes to a nondenominational God) to Buddha Box (this app provides chants and sounds to enhance meditation practices).14

Here as elsewhere, we encounter the lure of app-dependence as well as the option of app-enablement. Ready-made prayer or ritual apps make it easier than ever simply to rely on what the technology affords. The plethora of religion-related apps also makes it possible to choose from a variety of belief systems and practices: in a democratic society, there is little risk of a Big Brother–dictated religious regimen and, instead, much enabling of unusual or even unique theological mixtures. And of course, the apps themselves are only one variable. Users on the adventurous side will concoct their own religious (or atheistic) brew; others will remain ever on the lookout for the one true belief.

Closely related to the arena of religion is that encompassing morality and ethics. Having studied “good work” for many years, members of our research group found it natural to investigate the effect of newly emerging media on venerable, ethically suffused issues such as privacy, protection of intellectual property, trustworthiness, credibility, and citizenship. We did this as part of our Good Play Project.15
We realized, early on, that aspects of the new media—their speed, their public nature, the ease of accessing, transferring, and transforming information, the possibilities for anonymity or for multiple identities—were creating a virtual Wild West. Ethical issues that, in an earlier time, might have been considered settled were necessarily coming up for reexamination and, perhaps, for reconceptualization.

Our principal findings can be readily summarized. To begin with, as we look across age groups, there is not a radical difference in orientation toward ethical issues. That is to say, we find more similarity than difference across tweens, teens, and adults. Second, there is little evidence in any age group of proactive ethics or exemplary citizenship. When subjects tell us that they avoid missteps, they do so principally out of fear of punishment (“If I send this file illegally, I might get caught and punished”); few state or even imply other, purer ethical motives. The minority who embrace an ethical course are primarily individuals who have themselves seen the harms caused by ethical violations and want to do their part to discourage further ones (“I saw how I felt when someone took credit for lyrics that I had written”). On a more positive note, many young people lament the absence of effective mentors who could model how best to handle an ethical dilemma. Perhaps when such models emerge—and they can come from the ranks of the wise young, as well as the wise old—behavior online may seek and even meet a higher ethical standard.16

There may be a more insidious aspect to ethics in the digital era. Even as some individuals believe that ethics should be left to each person (political theorist Alan Wolfe terms this stance “moral freedom”), a surprising number of people assert that ethics is self-evident.17
We've heard this sentiment frequently from individuals who are part of the Silicon Valley scene or members of groups that promote digital freedom. In the spirit of Google's motto, “Don't be evil,” there is the apparent belief that people of goodwill can be counted on to behave in a righteous way. What we know about human behavior is that it is all too easy for individuals to believe that they have good motives and behave well, even though informed observers dispute this characterization.18
It is also easy to believe that others of goodwill necessarily concur with one's own views. (“It's obvious that we need to protect individual privacy” versus “What people really want is complete transparency about all matters.”) Being genuinely ethical requires much soul-searching, conversing with informed peers, a willingness to admit that one has been wrong, and striving to do better the next time. These steps are far more difficult to execute than a simple delineation of what is ethical and what is not. (“It's OK to mislead a novice in World of Warcraft, because, after all, it's only a game.”) Put differently, apps may help to raise consciousness about ethical conundrums, but they cannot confidently designate the best course of action in a particular situation. Take note, Professor Whitehead!
