If a representative sample of the American children of 1997 were to take that 1932 test, their average IQ would come out to be around 120. This would put about one-quarter of the 1997 population above the 130 cutoff for “very superior”—10 times as many as in 1932. Does that seem plausible?
If we go by more recent norms instead, we arrive at an equally bizarre conclusion. Judging the American children of 1932 by today’s standards—considering the scores they would have obtained if they had somehow taken a test normalized this year—we find that their average IQ would have been only about 80! Hardly any of them would have scored “very superior,” but nearly one-quarter would have appeared to be “deficient.” Taking the tests at face value, we face one of two possible conclusions: Either America is now a nation of shining intellects, or it was then a nation of dolts.
Two possibilities, both extreme, yet both in accord with the notion that pop culture makes us smarter, that better tools make better minds, that more wiring and more channels yield more intelligence. Either way, we should notice momentous signs and wonders of intellect all around us.
Flynn himself recognized in 1987 that the Effect should have thrown us into an era of genius, and that we should be enjoying “a cultural renaissance too great to be overlooked.” Needless to say, the 1980s fell far short, and Flynn found no evidence in Europe that mathematical or scientific discovery had increased in the preceding decades, adding as well that “no one has remarked on the superiority of contemporary schoolchildren.” So, he concluded, IQ scores must measure something less than general intelligence. Observing that results for verbal and mathematical tests hadn’t increased at nearly the rate that those for spatial-reasoning tests did, he characterized the intelligence measured by IQ tests as an “abstract problem-solving ability.” The more tests emphasize “learned content” such as vocabulary, math techniques, and cultural knowledge, the less the Flynn Effect shows up. The more they involve “culturally reduced” material, puzzles and pictures that require no historical or verbal context, the more the gains surface. Moreover, the significance of those gains apart from the test itself diminishes. “We know people solve problems on IQ tests; we suspect those problems are so detached, or so abstracted from reality,” Flynn remarked, “that the ability to solve them can diverge over time from the real-world problem-solving ability called intelligence.”
Flynn’s analysis explains the curious bifurcation in the intellectual lives of young adults. On one hand, they navigate the multimedia environment like pros, wielding four email accounts and two virtual identities, jumping from screen to keypad to iPod without pause, creating “content” and expressing themselves to the world. On the other hand, they know remarkably little about the wider world, about civics, history, math, science, and foreign affairs, and their reading and writing skills remain at 1970s levels. If Johnson, Richard Sweeney, Randy Bomer, and, more important, some serious cognitive researchers are right to link screen activities to higher intelligence, we may agree with them partway, but, with Flynn, limit the intelligence that screen activities produce. The screen doesn’t involve learning per se, but, as Sweeney says, a particular “learning style,” not literacy in general, but “viewer literacy” (Bomer’s term). It promotes multitasking and discourages single-tasking, hampering the deliberate focus on a single text, a discrete problem. “Screenmindedness” prizes using search engines and clicking 20 Web sites, not the plodding, 10-hour passage through a 300-page novel. It searches for information, fast, too impatient for the long-term acquisition of facts and stories and principles. As an elementary school principal told me last year, when the fifth-grade teachers assign a topic, the kids proceed like this: go to Google, type keywords, download three relevant sites, cut and paste passages into a new document, add transitions of their own, print it up, and turn it in. The model is information retrieval, not knowledge formation, and the material passes from Web to homework paper without lodging in the minds of the students.
Technophiles celebrate the ease and access, and teens and young adults derive a lesson from them. If you can call up a name, a date, an event, a text, a definition, a calculation, or a law in five seconds of key-punching, why fill your mind with them? With media feeds so solicitous, the slow and steady methods of learning seem like a bunch of outmoded and counterproductive exercises. In this circumstance, young people admit the next connection, the new gadget, smoothly into their waking hours, but the older “learning styles,” the parents’ study habits, are crowded out. Years of exposure to screens prime young Americans at a deep cognitive level to multitasking and interactivity. Perhaps we should call this a certain kind of intelligence, a novel screen literacy. It improves their visual acuity, their mental readiness for rushing images and updated information. At the same time, however, screen intelligence doesn’t transfer well to non-screen experiences, especially the kinds that build knowledge and verbal skills. It conditions minds against quiet, concerted study, against imagination unassisted by visuals, against linear, sequential analysis of texts, against an idle afternoon with a detective story and nothing else.
This explains why teenagers and 20-year-olds appear at the same time so mentally agile and culturally ignorant. Visual culture improves the abstract spatialization and problem solving, but it doesn’t complement other intelligence-building activities. Smartness there parallels dumbness elsewhere. The relationship between screens and books isn’t benign. As “digital natives” dive daily into three visual media and two sound sources as a matter of disposition, of deep mental compatibility, not just taste, ordinary reading, slow and uniform, strikes them as incompatible, alien. It isn’t just boring and obsolete. It’s irritating. A Raymond Chandler novel takes too long, an Emily Dickinson poem wears them down. A history book requires too much contextual knowledge, and science facts come quicker through the Web than through A Brief History of Time.
Bibliophobia is the syndrome. Technophiles cast the young media-savvy sensibility as open and flexible, and it is, as long as the media come through a screen or a speaker. But faced with 100 paper pages, the digital mind turns away. The bearers of e-literacy reject books the way eBay addicts reject bricks-and-mortar stores.
Or rather, they throw them into the dustbin of history. In the video “Are Kids Different Because of Digital Media?,” sponsored by the MacArthur Foundation as part of its 2006 Digital Learning project, one young person declares as a truism that “books are not the standard thing for learning anymore.” Another divulges, “My parents question me why I don’t have my books. . . . The books are online now, so you don’t really need the books.” They back their dismissal up with their purchases. In 1990, according to the Consumer Expenditure Survey (Bureau of Labor Statistics), the average person under 25 spent $75 on reading and $344 on television, radio, and sound equipment. In 2004, reading spending dropped to a measly $51, while TV, radio, and sound climbed to $500. Imagine the proportions if computers were added to the mix.
The rejection of books might be brushed aside as just another youthful gripe, today’s version of the longstanding resistance by kids to homework. But the kids aren’t the only ones opposing books to screens and choosing the latter. In 2005, Steven Johnson stated in a Canadian Broadcasting interview that “reading books and playing video games are equally beneficial, roughly and in distinct ways.” Around the same time, however, he wrote an article in Wired magazine with a provocative either/or subtitle: “Pop Quiz: Why Are IQ Test Scores Rising Around the Globe? (Hint: Stop reading the great authors and start playing Grand Theft Auto.)” A 16-year-old boy who hates reading could hardly concoct a better rationale for his C grades in English class. In the article, Johnson rehearses the Flynn Effect, then highlights the test questions that produce the largest gains, those involving spatial problem solving and pattern recognition. “This is not the kind of thinking that happens when you read a book or have a conversation with someone or take a history exam,” he observes, and for this kind of aptitude, “the best example of brain-boosting media may be video games.” Johnson inserts a few cautionary remarks, noting that the real test of media-inspired intelligence will come when the generation of kids who “learned to puzzle through the visual patterns of graphic interfaces before they learned to read” reaches adulthood and provides a new round of IQ scores. But his enthusiasm is clear, and it’s all on the side of games. Kids may look no further than the subtitle to justify their multitasking, nonreading lives. Forget your books, even the “great” ones, and grab the joystick. When your parents bother you about it, tell them you do it for your own brain development.
With a thousand marketers and stores to entice them, and popular commentators backing them up, teens and 20-year-olds hear the book-pushing voices of teachers and librarians as pesky whispers. They vote with their feet. One day after I entered the Apple Store at Lenox, as young customers flooded the floor on a Saturday afternoon, I went to the Buckhead library five minutes away. Only eight teens and 20-year-olds were there in the entire building, half of them at a computer station. Clusters of small children climbed around the kiddie section, a half-dozen elderly folks sat quietly in the periodicals room, and middle-aged women browsed the fiction stacks. The very young and the middle and elder ages might make the library a weekend destination, but the 13- to 29-year-old group had other plans.
On those days when teens and twenty-somethings do visit the library, they don’t give up their interests, the technology-over-books disposition. Fortunately for them, the library now offers both, and the kids make their choice clear. At every university library I’ve entered in recent years, a cheery or intent sophomore sits at each computer station tapping out emails in a machine-gun rhythm. Upstairs, the stacks stand deserted and silent. Audiovisuals at the library draw more attention every year, and books are losing out as the center of attraction for younger patrons. From 1994 to 2004 at the St. Louis Public Library, for instance, books fell from 82 to 64 percent of total item circulation, while audiovisuals doubled from 18 to 36 percent. In New Orleans from 1999 to 2003, book circulation dropped by 130,000 while audiovisuals rose 8,000. As a result, many librarians and administrators have determined that, to survive in the digital future, libraries must transform themselves from book-lending institutions into multimedia information centers. In just one year, 2000 to 2001, the number of public-use Internet terminals in public libraries jumped from 99,453 to 122,798, a 23 percent increase (National Center for Education Statistics, Public Libraries in the United States, 1993-2001). The title of the New York Public Library’s 2005 annual report declaims the shift: The Digital Age of Enlightenment. A story in the Wall Street Journal titled “Libraries Beckon, But Stacks of Books Aren’t Part of the Pitch” (see Conkey) found that even academic libraries join the trend: “Threatened with irrelevance, the college library is being reinvented—and books are being de-emphasized. The goal: Entice today’s technology-savvy students back into the library with buildings that blur the lines between library, computer lab, shopping mall and living room.”
The Apple Store couldn’t be happier. Apple favors the screens-over-books setup, and the message came through loud and clear in summer 2005 when I first spotted one of its outlets. It was in Arlington, Virginia, two blocks from the Clarendon Metro station, where a bunch of trademark shops had opened (Whole Foods, Ann Taylor, Crate & Barrel) and energized a suburban village, drawing hundreds of loungers and promenaders for the day. My wife and I sauntered along the street on a Sunday afternoon, pushing our son in a stroller, enjoying an ice cream, and hearing a Janet Jackson song pumping through speakers hidden in the planters lining the sidewalk. We paused outside the glistening façade, eyeing the sleek design and bustling atmosphere of the place. It had the same layout of goods as the Lenox Square location, and store personnel drifted throughout demonstrating the latest gadgets to curious customers as if they, too, marveled at the wizardry. No need to make a hard sell here. The machines marketed themselves.
Except for the entry. Apple had arranged a display inside the plate-glass window in front that posed a decisive lifestyle choice to anyone who passed by. It had three parts, top, middle, and bottom. At the top was a 15-foot-wide and three-foot-tall photograph hanging from the ceiling and mounted on posterboard showing three shelves of books, mostly classics of literature and social science. Along the bottom was a parallel photograph of serious books, two shelves running from the side wall to the store entrance. In the break between them was propped a real shelf, smooth and white and long, with five lustrous ivory laptops set upon it, their screens open to the street.
The shelf had a caption: “The only books you’ll need.” It didn’t say, “Here’s the new Apple model, the best ever!” No announcements about speed or weight or price. Nothing on compatibility or wireless features. Usually, advertisers present a good as preferable to other brands of the same good—say, auto insurers promising the best rates—but in this case, Apple offered the laptop as preferable to something entirely different: books. In fact, it proclaimed, the laptop has rendered all those books pictured in the display window obsolete. Who needs them? The computer is now the only book you need. Many books and journals you can find online, especially now that the Google project is well under way. And soon enough, e-publishers predict, all books will appear in digital format, making hardcover and paperback books something of a curiosity, like card catalogs and eight-track tapes. Already, with the Internet, we have enough handy information to dispense with encyclopedias, almanacs, textbooks, government record books, and other informational texts.