Grand Expectations: The United States, 1945–1974
James T. Patterson
Biologists, medical researchers, and doctors seemed nearly omnipotent. Having developed penicillin and streptomycin in the 1940s, scientists came up with antihistamines, cortisone, and a range of new antibiotics in the next few years. The National Institute of Health, an insignificant government agency at its founding in 1930, received better congressional funding, expanded into an ever-larger number of disease-specific institutes, and had to be renamed (in 1948) the National Institutes of Health. In 1953 a team of researchers at the University of Cambridge, England, made a spectacular breakthrough by describing the structure of DNA (deoxyribonucleic acid), thereby stimulating unprecedented advances in genetics and molecular biology. One of the team members, James Watson, was an American biologist.[20]
Physicians, who as late as the 1930s had been able to do little more than diagnose patients and console them when they fell sick, found that they now had a huge pharmacopoeia at their disposal, and they used it.[21]
In 1956, 80 percent of the drugs being prescribed had reached the market in the previous fifteen years. These included tranquilizers such as Miltown ("don't give a damn pills," Time called them), which were first introduced in the mid-1950s. Sales of tranquilizers were beginning to boom by 1960, suggesting that prosperity, for all its blessings, was associated with anxieties of its own.[22]
Medical leaders fought confidently against other scourges. Heart disease was by far the number one killer, and doctors attacked it with open-heart surgery, artificial replacement of valves, and the installation of pacemakers. Two other much-feared killers of children as late as the 1930s were whooping cough and diphtheria; vaccines had greatly reduced the incidence of, and mortality from, both by the 1950s. Researchers also developed promising leads in the effort to prevent or control mumps, measles, and rubella; their efforts began to pay off in the 1960s. Doctors were happy to take credit for these advances and for the health of the American population. People were living longer (an average of 69.7 years by 1960, as opposed to 62.9 in 1940), growing to full stature earlier (by age 20 instead of by age 25 in 1900), and becoming taller and stronger.[23]
In fact, physicians and scientists claimed too much. Better nutrition—a blessing of affluence—accounted for a good deal of the improvement in life expectancy.[24]
Doctors continued to be far from expert about many things. Despite innumerable claims of "breakthroughs," cancer, the number two killer, remained a mysterious, dread disease.[25]
Some physicians, moreover, compromised themselves by touting cigarettes in ads, even after studies in the early 1950s had begun to demonstrate the serious health hazards of tobacco: the Journal of the American Medical Association still accepted cigarette advertisements at the time.[26]
Medical care was so expensive that millions of Americans, lacking health insurance, continued to rely on home medications, faith-healers, or fatalistic grin-and-bear-it.
Still, growing numbers of middle-class Americans, rapidly enrolling in private medical insurance plans and enjoying easier access to care, became enamored of the medical profession. Doctors reached the peak of their prestige and cultural status during the 1950s and 1960s, when they were celebrated in television series such as "Dr. Kildare," "Ben Casey," and "Marcus Welby, M.D." Norman Rockwell illustrations continued to lionize the friendly family doctor who came day or night, rain or shine, to heal the sick and console the dying. Men went out of their way to tip their hats (most people still wore them in the 1950s) to doctors on the street.[27]
Nothing did more to enhance the status of medical research—and to escalate already growing expectations about the capacity of science to save the world—than the fight against poliomyelitis, a deeply feared scourge of the era. Polio mainly struck children and young people, sometimes killing them, sometimes leaving them paralyzed or confined to "iron lungs" so that they could breathe.[28]
Because polio was known to be contagious, especially in warm weather, many schools closed earlier in the spring or opened later in the fall. Terrified parents kept their children out of crowded places, such as movie theaters, stores, or swimming pools. Those with money rushed their children to the country. Desperate for a cure, some 100,000,000 people—nearly two-thirds of the population in the early 1950s—contributed to the March of Dimes, the major organization sponsoring research against the disease. Still, the scourge persisted. An epidemic in 1950 afflicted nearly 32,000 children; another in 1952 affected nearly 58,000 and killed 1,400.
A crash research program then paid off, especially in the lab of Dr. Jonas Salk of the University of Pittsburgh Medical School. Having developed a killed-virus vaccine against the disease, Salk (with government help) mounted a nationwide inoculation program in 1954–55. The testing operated amid relentless publicity and increasingly nervous popular anticipation. Finally, on April 12, 1955, the tenth anniversary of the death of FDR (himself a victim of polio), came the announcement that the vaccine was safe and effective. It was one of the most exciting days of the decade. People honked their horns, rang bells, fired off salutes, stopped work, closed schools, and thanked God for deliverance. Within a few years, by which time most American children had been inoculated, polio ceased to be a major concern. There were only 910 cases in 1962.[29]
Economic growth and affluence, many contemporaries thought, were further eroding the class, ethnic, and religious divisions of American society. The onset of "post-industrial society," they said, was ushering in a world of relative social calm and of "consensus."[30]
This notion was appealing, especially when it was used to differentiate the United States, prosperous and apparently harmonious, from the harsh and presumably conflict-ridden Soviet Union and other Communist societies. It was also highly debatable, for affluence—great engine of change though it was—was neither all-embracing nor all-powerful.
Optimists who perceived the erosion of class distinctions pointed to undeniably significant changes in the world of work. By 1960 some of the larger corporations, such as IBM, offered their employees clean, landscaped workplaces as well as benefits such as employer-subsidized health care, paid holidays, and sick leave. Work weeks declined a bit, to an average of around forty hours in manufacturing by 1960. The greater availability of leisure time helped to drive the boom in recreation. By the early 1960s millions of American employees could count on annual paid vacations—an unthinkable blessing for most people in the 1930s.[31]
Workers also benefited from the expansion of the Social Security program, a contributory system that paid benefits to the elderly out of payroll taxes collected from employers as well as employees. By 1951 roughly 75 percent of employed workers and their survivors had become part of the system.[32]
Benefits were hardly high, averaging $42 per month for retired workers in 1950 and $70 by 1960. Retired women workers, most of whom had earned less while employed, generally received less than men, as did survivors. Still, growth in coverage and in benefits was of some help to millions of Americans. The number of families receiving Social Security checks increased from 1.2 million in 1950 to 5.7 million in 1960; in the same period the total paid in benefits rose from $960 million to $10.7 billion.
Labor unions, too, continued to secure improvements for working people. As in the past, these gains were far from universal. Some unions continued to exclude the unskilled, including large numbers of blacks and women. Labor leaders in the 1950s, moreover, largely abandoned hope of achieving governmental direction of such social policies as health insurance, concentrating instead on wringing benefits from employers. The result was that the United States continued to feature a social welfare system that was more private than those in other nations. Still, unions remained a force for many workers in the 1950s. In 1954 they represented nearly 18 million people. That was 34.7 percent of non-agricultural workers, a percentage second only to the 1945 high of 35.5 percent.[33]
Union leaders focused on bread-and-butter improvements, often succeeding in achieving higher wages, shorter hours, and better working conditions. Some secured guaranteed cost-of-living adjustments for their members; by the early 1960s more than 50 percent of major labor union contracts included them.[34]
Unions also struggled to win benefits—or fringes, as they came to be called. Many managed to negotiate contracts that solidified the advantages of employee seniority and that introduced clear-cut grievance procedures, sometimes with provisions for binding arbitration. These procedures were important, for they strengthened the rule of law in the workplace, provided much-cherished job security, and helped management and labor avert strikes.[35]
Friction hardly disappeared: a major strike in the steel industry, for instance, shook the nation in 1959. But the number of strikes (and worker-hours lost) did drop dramatically after the highs of the mid-1940s and early 1950s.[36]
As promising as these changes were, they did not fully capture the higher expectations that gripped sizeable numbers of American workers at the time. These workers, to be sure, recognized that American society remained unequal, and they were far too sensible to buy into myths about progress from rags to riches. But they were delighted to have the means to buy homes, cars, and household conveniences. These gave them a larger stake in capitalist society, enhancing their dignity as individuals and their sense of themselves as citizens. Many workers also believed (at least in their more hopeful moments) that the United States promised significant opportunity and upward mobility—in short, that social class was not a hard-and-fast obstacle. Those who became parents—a commonplace experience in the baby boom era—came to expect that their children would enjoy a better world than the one they themselves had grown up in during the "bad old days," relatively speaking, of the 1930s.
Were such expectations realistic? According to census definitions of occupation, the hopeful scenario seemed to have some validity. Thanks in large part to labor-saving technology, the percentage of people engaged in some of the most difficult and ill-paid work—in mining and agriculture—continued to plummet. As of 1956, the census also revealed, there were more Americans doing white-collar work than manual labor. Millions of these upwardly mobile people swelled the migration to suburbs, thereby depopulating factory-dominated neighborhoods where working-class styles of life had prevailed. Having broken away from their old neighborhoods, many of these migrants behaved in ways that at least superficially resembled those of the middle classes. For the first time they bought new cars and major household conveniences, shopped in supermarkets instead of mom-and-pop stores, ate processed and frozen foods (which boomed in the 1950s), and dressed—at least while off the job—like their white-collar friends and neighbors.[37]
Given such developments, it was hardly surprising that some contemporaries thought the United States was entering a post-industrial stage of capitalism in which class distinctions were withering away.[38]
In fact, however, nothing quite so dramatic happened either then or later. While the percentage of people defined by occupation as manual workers declined over time, the numbers of workers so employed continued to rise slowly but steadily (from 23.7 million in 1950 and 25.6 million in 1960 to 29.1 million in 1970). If one adds the 4.1 million farmers and farm workers and the 7.6 million "service workers" (a broad category that includes janitors, maids, waitpersons, firefighters, gas station attendants, guards, and domestics) to the 25.6 million people counted as manual workers in 1960, one gets a total of 37.3 million Americans who mainly used their hands on the job. This was 10.1 million more than the number designated as white-collar at the time. "White-collar," moreover, was a misleadingly capacious category: it included 14.4 million clerical and sales workers among its total of 27.2 million in 1960.[39]
Many of these people were ill paid and semi-skilled. However one looks at these numbers, two points are clear: blue-collar workers remained central to the economy of the 1950s, and classlessness—as defined by work—remained a mirage.[40]
Millions of American workers, moreover, hardly considered themselves white-collar or well-off in their jobs. Contemporary critics such as the sociologist C. Wright Mills and the social critic Paul Goodman insisted, accurately, that much work remained routine, boring, low-paid, and geared to the making, advertising, and selling of consumer gadgetry. People employed at such labor were often dissatisfied and angry. Absenteeism and sloppy workmanship plagued assembly lines.[41]
A growing share of these poorly paid employees were women, who in the 1950s were entering the labor force in record numbers.[42]
Moreover, blue-collar people who moved to suburbs such as the Levittowns did not suddenly become "middle-class."[43]
Rather, they tended to maintain their values and styles of life and continued to think of themselves as members of the "working classes."