As appealing as it understandably is to many people, the idea that a core curriculum of great books is the solution to the diversification of ability and occupation among students and future citizens in a democracy is surely the weakest point in the general education program. For the “great books” don’t, taken together, express anything like a coherent worldview. They don’t even express a set of coherent individual worldviews. Skepticism about such coherence is precisely one of the things in which, in many cases, their greatness consists. It is probably enlightening for students to encounter this kind of skepticism; but it is not (whatever the term is supposed to mean) “binding.” Still, the Harvard report’s sensitivity to socioeconomic diversity (a subject rarely addressed in discussions of higher education today) is the frankest and the most admirable thing about it. It is the invocation of a homogenized conception of “culture” as the palliative to class difference, and the belief (elaborated on at length in the report) that educational institutions can replace the family, the church, and the community as the means of acculturation, that seem misconceived.
For there is great merit in the idea of “general education” when it is not circumscribed by a “great books” program. American colleges do fail to provide a common core of learning. Most students graduate without any exposure to knowledge about American political, legal, and business institutions; they are no better equipped to petition a congressman, or to write a will, or to buy stock than they were when they left high school. What they have received, for the most part, is specialized training in a scholarly discipline—the consequence of the curriculum having been handed over to the departments, whose members are selected on the basis of professional attainment rather than commitment to teaching or to “general” learning. Despite the widespread call for it in Conant’s time, general education has seldom been tried, even in the “great books” format. Where it has been, it has commonly taken the form of “distribution requirements”—that is, mandatory smattering.
What happened to Conant’s educational ideals? The three decades after the Second World War, from 1945 to 1975, were a period of enormous growth in American higher education. It is a period known in the literature on higher education as the Golden Age. The number of American undergraduates increased by almost 500 percent, the number of graduate students by nearly 900 percent.[21]
In the 1960s alone, enrollments more than doubled, from 3.5 million to just under eight million; the number of doctorates awarded annually tripled; and more faculty were hired than had been hired in the entire 325-year history of American higher education to that point.[22]
At the height of the expansion, between 1965 and 1972, new community college campuses were opening in the United States at the rate of one every week.[23]
This growth was fueled in part by the baby boom, in part by the sustained high domestic economic growth rate in the 1950s, and in part by cold war priorities. After the Second World War, the national government began the practice of contracting research out to universities, largely through the efforts of Conant and his government colleague Vannevar Bush, former vice president and dean of engineering at MIT and director of the Office of Scientific Research and Development during the war. After Sputnik, the National Defense Education Act of 1958 provided large government grants to universities, directed principally at science and foreign languages. In this expanding universe, the ideals of meritocracy, disinterested inquiry, and the general education curriculum centered on the “great books”—the ideals for which Conant stood—were not often questioned. They were part of the culture of assumptions in which higher education operated.
After 1975, though, the higher education system changed. Its growth leveled off, and the economic value of a college degree began to fall. In the 1970s, the income differential between college graduates and high school graduates dropped from 61 percent to 48 percent.[24]
The percentage of students going on to college therefore began to drop as well, and a system that had quintupled, and more, in the span of a single generation suddenly found itself with empty dormitory beds and a huge tenured faculty. One of the ways in which colleges and universities responded to this crisis was by expanding the pool of candidates for admission, since there were fewer white American males for selective schools to choose from.
After 1970, virtually every nonmilitary all-male college in the United States went coed. People had talked before 1970 about the educational desirability of coeducational and mixed-race student bodies, but in the end it was economic necessity that made them do it.[25]
In 1947, 71 percent of college students in America were men; as late as 1965, 94 percent of college students in the United States were classified as white. By 1998, a minority of college students, 44 percent, were men, and 71 percent were classified as white.[26]
Most of this diversification happened after 1975, and a single statistic makes the point. In the decade between 1984 and 1994, the total enrollment in American colleges and universities increased by two million, but not one of those two million new students was a white American-born man. They were all nonwhites, women, and foreign students. The absolute number of white American men in American higher education actually declined between 1984 and 1994.[27]
Faculty demographics changed in the same way, a reflection not so much of changes in hiring practices as of changes in the group that went to graduate school after 1975. Current full-time American faculty who were hired before 1985 are 28 percent female and about 11 percent nonwhite or Hispanic. Full-time faculty hired since 1985—that is, for the most part, faculty who entered graduate school after the Golden Age—are half again as female (40 percent) and more than half again as nonwhite (18 percent).[28]
In 1997, there were 45,394 doctoral degrees conferred in the United States; 40 percent of the recipients were women (in the arts and humanities, just under 50 percent were women), and only 63 percent were classified as white American citizens. The other 37 percent were nonwhite Americans and foreign students.[29]
The demographic mix in higher education, both students and faculty, completely changed in the span of about a generation.
As the new populations began to arrive in numbers in American universities after 1970, the meritocratic rationale was exploded. For it turned out that cultural differences were not only not so easy to bracket as men like Conant had imagined; those differences suddenly began to seem a lot more interesting than the similarities. This trend was made irreversible by Justice Lewis Powell’s decision in Regents of the University of California v. Bakke, handed down by the U.S. Supreme Court in 1978.[30]
Powell changed the language of college admissions by decreeing that if admissions committees wanted to stay on the safe side of the Constitution, they had to stop talking about quotas and start talking about diversity instead. Powell’s opinion blew a hole in meritocratic theory, because he pointed out what should have been obvious from the beginning, which is that college admissions, even at places like Harvard, have never been purely meritocratic. Colleges have always taken nonstandardized and nonstandardizable attributes into account when selecting a class, from musical prodigies to football stars, alumni legacies, and the offspring of local bigwigs. If you admitted only students who got top scores on the SATs, you would have a very boring class. “Diversity” is the very word Powell used in the Bakke opinion, and there are probably very few college catalogues in the country today in which the word “diversity,” or one of its cognates, does not appear.
In this radically more heterogeneous environment, the value of the SAT began to be questioned—in 2001 the University of California system announced it was dropping the test from its requirements for admission—and the curriculum began to undergo a series of changes. These changes have become visible in the recent emphasis on multiculturalism (meaning exposure to specifically ethnic perspectives and traditions) and values (the ethical implications of knowledge); in a renewed interest in service (manifested in the emergence of internship and off-campus social service programs) and in the idea of community; in what is called “education for citizenship”; and in a revival of a Deweyite conception of teaching as a collaborative process of learning and inquiry. The vocabulary of “disinterestedness,” “objectivity,” “reason,” and “knowledge,” and talk about things like “the scientific method,” the canon of great books, and “the fact-value distinction,” were replaced, in many fields, by talk about “interpretations” (rather than “facts”), “perspective” (rather than “objectivity”), and “understanding” (rather than “reason” or “analysis”). An emphasis on universalism and “greatness” was replaced by an emphasis on diversity and difference; the scientistic norms which once prevailed in many of the “soft” disciplines began to be viewed with skepticism; context and contingency were continually emphasized; attention to “objects” gave way to attention to “representations.”[31]
This transformation accompanied the change in the demographics of higher education; it was not caused by that change. For in many ways it was essentially a backlash against the excessive respect for scientistic norms that characterized the early cold war university. The transformation also demonstrates how historically specific the ideals of Conant and his generation of academic leaders, for all their patina of postideological universality, really were.
People like Conant did have a remarkable confidence in their beliefs; it’s one of the things that make them seem a little remote to most Americans on this side of the cold war. Conant once asked the Harvard librarian to undertake secretly an appraisal of the costs of microfilming the printed record of Western civilization, which he proposed to bury in various places around the country, thus preserving it for survivors of a nuclear war. The librarian advised that the costs would probably be huge, and Conant dropped the project, having convinced himself that university libraries outside major cities would escape destruction in a nuclear exchange. But he stuck with the idea. “Perhaps the fated task of those of us now alive in this country,” he wrote in Education in a Divided World, in 1948, “is to develop still further our civilization for the benefits of the survivors of World War III in other lands.”[32]
He had what seems today an almost naive faith in the virtues of the society for which he worked. It does not seem to have crossed his mind that the great works of a civilization that had ended in an act of self-destruction might not be the first thing the survivors of a nuclear holocaust would think it worthwhile to have.
William S. Paley became president of the Columbia Broadcasting System in 1928, when he was twenty-six, and he ran the company until 1983, when he retired and assumed the title of “founder chairman.” Already a wealthy man when he and his family bought Columbia (the Paleys were cigar manufacturers), he was deeply attached to the style of living that enormous amounts of money make possible, and he cultivated a taste in fine art, fine furniture, fine clothing, and fine food. He married two striking and intelligent women—Dorothy Hart Hearst, whom he divorced in 1947, and Barbara Cushing Mortimer, called Babe, whom he married soon after his divorce and who died in 1978—in a time when it was expected of striking and intelligent women of a certain class that they would devote themselves to the comfort and adornment of their husbands’ lives. He was, on social occasions and on most business occasions, a charming man who disliked unpleasantness and preferred to let others act as the agents of his disapproval, and as radio and then television grew to become the most influential and lucrative communications media in the history of the world, he had a personality equipped to extract a full measure of satisfaction from the power and prestige his position afforded him—something that was not lost on those who knew him. “He looks,” Truman Capote once remarked, “like a man who has just swallowed an entire human being.”
Sally Bedell Smith’s biography of Paley, In All His Glory, came out in 1990, the year of Paley’s death. The book epitomizes the reaction most people have to a life like Paley’s: it is written for two audiences—one that would like a peek at the glamour of Paley’s world, and another that would like to confirm its intuition that someone other people find so glamorous must actually be a person of rather limited accomplishment. For the first audience, the home furnishings, vacations, distinguished golfing partners, and romantic liaisons in Paley’s life are carefully catalogued. One appendix lists the bequests, mostly of expensive jewelry, in Babe Paley’s will; another gives the dollar value of Paley’s holdings in CBS stock for each year of his life, beginning in 1928. (It increased.) For the audience that wants to see what the legend looks like with the varnish removed, there are stories of coldness to friends and to children, of bad business decisions and of credit stolen from others for good ones, and of personal mythmaking on an imperial scale. First we are asked to admire the cake, and then we get to eat it. This is a common enough form of celebrity biography, and satisfying in its mildly opportunistic way. It’s nice to know how people who strike it rich spend their money, and it’s also nice to feel that if we struck it rich ourselves we’d deserve it a little more and spend the money a little less selfishly. When we read of Babe Paley’s being driven by her chauffeur to Kennedy Airport so that she can pick up the freshly shot game bird she has had flown in from Europe for her husband’s dinner, our disappointment at being financially incapable of this sort of thing is exactly balanced by our satisfaction in feeling morally incapable of it as well.
Smith’s chapters on Paley’s personal life contain many stories like the story of the imported fowl, but the story of the fowl is about as exciting as most of them get. For Paley aspired merely to live well, and in what he understood to be the best possible taste; and although this aspiration led him, given his means, to excesses, it precluded any genuine folly. He was, with a few significant philanthropic exceptions, much too prudent to waste his money on anything but himself. His single traditional vice, apparently, was philandering, which is neither the most unusual vice for a very rich man to have nor the most interesting.
Smith’s treatment of Paley’s career as a broadcaster is somewhat less breathless than her treatment of his career as a devotee of the high life. Her pages on the business side of Paley’s life are concerned mostly with debunking his reputation as a broadcasting genius. She does concede, as most commentators do, that Paley understood better and sooner than anyone else in broadcasting the importance of programming. (It seems odd that people in broadcasting ever doubted that the choice and quality of programs were important, but they did.) And she believes that he had a genuine instinct for guessing what most Americans wanted to hear and see. But she also points out that Paley went into radio not because he had a precocious sense of its potential, as he later claimed, but simply in order to get some executive training before returning to the cigar business; that he saw no money in television when it appeared, and discouraged efforts to move his company into it; and that during the years—from the late 1940s through the mid-1950s—when CBS assembled its television network and overtook NBC to become the dominant force in the industry, he was generally distracted by the enjoyment of his private life, and by a brief stint in public service, and it was really Frank Stanton, the president of the company (by then Paley had become chairman), who engineered CBS’s triumph.
This part of the cake has been sliced by others, as well—in Robert Metz’s CBS: Reflections in a Bloodshot Eye (1975) and in David Halberstam’s The Powers That Be (1979), a work whose sections on CBS so irritated Paley when he read a version of them in the Atlantic Monthly in 1975 that he (and a large staff) composed his own memoir, As It Happened, and arranged to have it published a few weeks before Halberstam’s book appeared. What distinguishes Smith’s book from those earlier efforts is all the stargazing attention it pays to Paley’s personal life. But the stargazing makes a point of its own. For the way Paley lived—the homes, the art collection, even the wives—had as much to do with the success of CBS, and of network television generally, as his business decisions did. There was nothing foreordained about the dominance of network television; it was achieved in defiance of the normal mechanisms of the market and the normal tinkering instincts of politicians. Network television was an empire protected by an image, and it was Paley’s real genius to understand why it was that every enhancement of his private life was also an investment in the continuing prosperity of the company he ran and the medium he helped to establish.
Americans who grew up in the postwar era are so accustomed to television as a fixture in their lives that its presence seems almost a dispensation of nature. Virtually everyone’s memory of it is the same. If you had a set in 1955, it had twelve VHF (very high frequency) channels, all except three of which probably broadcast static—unless, by performing calisthenics with your aerial, you could pick up a network station from a distant city, the ghostly twin of a local channel. The picture was black and white, and if you switched on the set very early or very late in the day, you could contemplate an eerie piece of electronic arcana, now nearly forgotten—a test pattern. Ed Sullivan had already been on the air for seven years.
In 1970 your set had an extra dial, for UHF, the ultra-high frequency spectrum (Channels 14 to 83). This was a piece of machinery required by Congress on all televisions made after 1963, and it was somehow awkward to operate: you always seemed to be dialing past your station. On the VHF dial, there was now an “educational” channel: the Public Broadcasting Service (PBS) had begun operating in 1969, the culmination of seventeen years of efforts to establish a national noncommercial network. You mostly watched the commercial networks, though, or an unaffiliated channel that, when it wasn’t showing sports or old movies, showed network reruns. The colors (color programming began in 1965) were so oversaturated that they seemed radioactive. Ed Sullivan was still on the air.
Today, even the Ed Sullivan impressionists are gone. You watch a sleek cube fed by a cable, and, by keeping your thumb pressed to a remote control, you can skim dozens of channels in a few seconds. One channel plays music videos all day; one broadcasts world news; one has a local talk show (a consequence of mandated “public access”); one or more, for a fee, show recent movies, uncensored; one has a psychic you can call for on-the-air advice; one displays jewelry you can shop for by phone. You can watch programs on which pickup trucks with oversized tires are driven across rows of parked cars, and programs on which naked people discuss sex in a manner so unstimulating as to make you turn back to watch the pickup trucks. There is always sport, and most of the local teams’ games are available. There are (since the arrival of Fox, in 1986) five “over the air” networks, and one or more “superstations,” beamed into the system by satellite. There is still, it’s true, nothing to watch, but you can turn to Channel 3 and put a rented movie in your videocassette or DVD player.
Because we tend to think of technological development as analogous with biological development, we’re likely to assume that changes in our experience of television reflect changes in television technology. It seems like a simple matter of evolution. We had to have black-and-white pictures before we could have color; we had to have twelve VHF channels before we could have seventy UHF channels, and to have national over-the-air networks before we could have cable, pay-per-view channels, and local programming. We lived with broadcasting so that one day we could have narrowcasting. In fact, the development of American television had almost nothing to do with technology. Network television was no more natural or inevitable than any of the other empires that locked the cold war world into place. It was no more accidental, either, but (like those other empires) it considered itself extremely vulnerable to accident, and understood eternal vigilance to be the price of its survival.
One of the problems with scholarship on television is that a technological and corporate history of the medium often brings us no closer to understanding television as a cultural phenomenon—though it is a common assumption of mass-culture scholarship that such an approach must. An analysis of the economics of television still gives us no way to choose among the various slants on the medium people generally take: television is escapist, and television is propagandistic; television reflects what people are thinking, and television tells people what to think; television is too commercial, and the commercialism of television is inevitable; television is run by liberal elites, television is a pawn of politicians, and television is the tool of corporate America. There is also the tendency to express surprise at the obvious—for example, television’s generally patriotic and consumerist biases. These are sometimes taken to have a brainwashing effect: scholars sometimes write as though they had forgotten that no one has ever been forced to watch a television show. It is pointless to blame everything that is wrong with television on capitalism, unless you are prepared to say that America has never produced any commercial culture worth caring about, which is something you would have to be culturally benumbed to believe. Still, where a biographer like Smith, in the interests of keeping her subject vividly before us, asks us to think of what we watched in the network years as largely a function of personality—“the flickering images on CBS represented the soul and sensibility of Bill Paley,”[1] as she puts it—technological and economic histories of the medium remind us that the style, the quality, the content, and even the color of network television programs were determined by forces much too strong for any personality, even an oversized one like Paley’s, to have resisted.[2]
Television predates the Second World War. NBC started regular broadcasting in 1939, and although America’s entry into the war delayed the development of national networks for several years, television technology had been quite fully explored by 1945. The return of the troops produced a massive potential audience and, for advertisers, massive consumer demand; and in 1948, when less than half of 1 percent of American households had television sets, national broadcasting made its real debut. Confronted with an industry poised to proliferate wildly and in need of elaborate technical coordination (standardization of signals and receivers, allocation of stations in the broadcast spectrum, and so on), the Federal Communications Commission (FCC) imposed a freeze on the licensing of new television stations in September 1948. The freeze turned out to be a kind of cultural Yalta; it lasted until 1952, by which time 33 percent of American homes had television sets, and NBC and CBS, the companies that had dominated radio broadcasting, and whose affiliates had secured most of the television licenses granted before the freeze, essentially controlled the field—which, along with the less powerful ABC (a company created in 1943 after the government ordered NBC to sell one of its two radio networks), they continued to do for more than thirty years.
The man who had the most to say about the way in which television was to enter the culture was not Paley but Paley’s rival, NBC’s David Sarnoff. NBC was the broadcasting arm of RCA, of which Sarnoff was the president. RCA also had a manufacturing arm, which produced television sets—sets engineered, as it happened, to receive twelve VHF channels and show a black-and-white picture. A color-picture technology had been developed by CBS in 1940; but Sarnoff, though he knew of this achievement, was not interested in color television. RCA had not pursued it, and CBS color technology was incompatible with RCA sets. In 1947, after Sarnoff promised that RCA was working on a color technology that could be used with its own sets, the FCC refused to approve the CBS—or any other—color system. (Several months after the decision was announced, the FCC’s chairman left to become a vice president of RCA.) By 1953, when the commission lifted the ban on the manufacture of color sets, RCA, which had evidently not been working on a color picture, had flooded the market: there were twenty-three million black-and-white sets already in use.
The licensing freeze left the UHF market permanently underdeveloped. The networks’ programs were designed to be broadcast on VHF, since their affiliates were VHF stations; and most sets could not receive UHF signals (which, incidentally, transmit color pictures better than VHF signals do). In 1953 the FCC began licensing UHF stations, but television manufacturers were reluctant to equip their sets to receive the signal—as late as 1960, only 7 percent of the sets in the country had UHF reception—for the simple reason that many set manufacturers owned VHF broadcasting stations (which might possibly explain why the UHF dial was such a nuisance to use). And cable, far from being a recent refinement, is a technology that predates broadcasting itself: transmitting electronic signals through wires is more rudimentary than transmitting them through the air (or through the ether, as the pioneers of radio imagined it). There were subscriber-supported cable systems for radio as early as 1923, and television networks have always used coaxial cable, leased from phone companies, to transmit their pictures to broadcasting stations.
In short, almost the only technologies used by television in the 1990s which could not have been used in the 1950s were satellite transmission and the VCR, and even those were not recent inventions. Video recorders went on the market in 1957 (though they were expensive, and did not use cassettes), and Telstar, the first television satellite, was launched in 1962. (The networks undertook to control development in those areas, too, for obvious reasons.) What we might have had for the last forty years is what, almost everywhere, we have had only since around 1990: a mixture of local and national programming and commercial-free pay services on a hundred channels—and all in living color.