Authors: Michael Lind

The United States was now the world’s largest debtor and Japan was the world’s largest creditor. The Japanese central bank bought huge quantities of US government bonds to keep the yen artificially low, thereby subsidizing Japanese exports while hurting American exports. The same mercantilist technique would be adopted on a much larger scale by China a few decades later, with disastrous results for the economy of the United States and the world.

THE END OF THE NEW DEAL

The New Deal era came to an end in 1976, not 1980. The Age of Reagan should be called the Age of Carter.

Carter, not Reagan, pioneered the role of the fiscally conservative governor who runs against the mess in Washington, promising to shrink the bureaucracy and balance the budget. Early in his administration, Carter was praised by some on the Right for his economic conservatism. Reagan even wrote a newspaper column entitled “Give Carter a Chance.” The most conservative Democrat in the White House since Grover Cleveland, Carter fought most of his battles with Democratic liberals, not Republican conservatives.

Today’s Democrats would like to forget that supply-side economics was embraced by many members of their own party during the Carter years, while it was resisted by many old-fashioned fiscal conservatives in the GOP. As the economist Bruce Bartlett points out in a history of supply-side economics, “By 1980, the JEC [Joint Economic Committee of Congress] was a full-blown advocate of supply-side economics, despite having a majority of liberal Democrats, such as Senators Edward Kennedy (D-MA) and George McGovern (D-SD). Its annual report that year was entitled, ‘Plugging in the Supply Side.’ ”62

In defense spending, as in supply-side economics, Reagan continued what his predecessor in the White House had begun. The reversal in the post-Vietnam decline of American military spending began under Carter, following the shock of the Iranian revolution and the Soviet invasion of Afghanistan. Carter called for raising defense spending from a starting point of 4.7 percent of GDP to 5.2 percent of GDP in his final budget for fiscal year 1981. The Carter administration called for defense spending to rise even further by 1987 to 5.7 percent of GDP—only a little below the 6.2 percent at which it peaked in 1986.63

In hindsight, the neoliberal cure was far worse than the New Deal liberal disease. The maturity of the New Deal’s system of regulated, managerial capitalism coincided with the post–World War II boom and the greatest expansion of the middle class in American history. Consumer advocates, however, blamed it for stifling diversity, libertarians and conservatives claimed it choked off economic progress, and political scientists denounced it for spawning “interest-group liberalism.”

To the applause of liberal Democrats and conservative Republicans alike, the New Deal system of regulation was dismantled in one sector of the economy after another in the late 1970s and 1980s. The result was not the flourishing diversity hoped for by liberal consumer activists nor the solid, sustainable economic growth promised by free-market ideologues. Instead, the result was the collapse of unions, the decline of private R&D, three decades of wage stagnation, and an economy driven by financialization, speculation, and rising debt rather than by productive industry and rising wages.

THE ARGUMENT

The Information Age began to transform daily life toward the end of the twentieth century. But the technologies that produced the personal computer and the World Wide Web originated in the mid-twentieth century.

Most of the transformative technologies of the third industrial revolution were products of research backed by the US government. R&D funded by the military, during World War II and the early Cold War, led to nuclear energy, computers, and the Internet.

Combined with a new global infrastructure based on container ships, cargo jets, and satellite communications, computer technology made possible the emergence of global corporations engaged in production in many countries and several continents. The first attempt at an information-age global economy, however, was profoundly flawed. China, Japan, Germany, and other export-oriented nations sought to maintain permanent manufacturing trade surpluses, while American consumers, supplementing stagnant wages with unsustainable levels of debt, provided the engine of growth for the world economy. Mediated by an ever more reckless financial industry and swollen by the flow of the gains from growth to the gambling few rather than the consuming many, global imbalances built up as they had in the 1920s until the world economy crashed in 2008.

Today, it is truer than ever that basic research is the pacemaker of technological progress. In the nineteenth century, Yankee mechanical ingenuity, building largely upon the basic discoveries of European scientists, could greatly advance the technical arts. Now the situation is different. A nation which depends upon others for its new basic scientific knowledge will be slow in its industrial progress and weak in its competitive position in world trade, regardless of its mechanical skill.1

—Vannevar Bush, 1945

Beneath a concrete marker in Flushing, New York, is a message to the people of the year AD 6939. Designed by Westinghouse as part of its exhibit at the New York World’s Fair of 1939, the time capsule was buried on the autumnal equinox, September 23, 1939. In addition to the Bible and “The Book of the Record of the Time Capsule,” which contains messages on special paper in nonfading ink from Albert Einstein among others, the buried cache contains an electric lamp socket and electric wall switch, pieces of industrial machinery, samples of alloys, Portland cement, a newsreel, a watch, an alarm clock, a camera, a safety razor, fountain pen, swatches of cloth and asbestos, a lady’s hat, coins and a dollar bill, and a Mickey Mouse cup.

At one of the darkest moments in history in 1939, the introduction to “The Book of the Record of the Time Capsule” was defiant in its optimism: “In our time many believe that the human race has reached the ultimate in material and social development; others that humanity shall march onward to achievements splendid beyond the imagination of this day, to new worlds of human wealth, power, life and happiness. We choose, with the latter, to believe that men will solve the problems of the world, that the human race will triumph over its limitations and its adversities, that the future will be glorious.”

In the three decades that followed World War II, that optimism was vindicated, at least in the United States and its Western European and East Asian allies. The Westinghouse Corporation buried another time capsule nearby during the 1964 World’s Fair. The later time capsule contained items that had not existed at the time of its predecessor, including graphite from the world’s first nuclear reactor under Stagg Field at the University of Chicago in 1942, a reentry heat shield from the Mercury Aurora 7 spacecraft, the synthetic fibers Orlon, Dacron, and Lycra, a plastic heart valve, a laser rod, a transistor radio, parts of a satellite, and credit cards. And while there was also a Bible, the inclusion of the Beatles single “A Hard Day’s Night,” a bikini, and birth control pills suggested the social changes that had taken place. The technological advances of the mid-twentieth century amounted to a third industrial revolution.2

VANNEVAR BUSH

Thirteen miles from ground zero, Vannevar Bush lay on a tarpaulin thrown over the desert sand in the darkness of the early morning. The fifty-five-year-old director of the Office of Scientific Research and Development (OSRD) waited expectantly, next to his deputy, James Conant. They listened to the countdown by the physicist Samuel Allison: “Three . . . two . . . one . . . zero.” At 5:29:45 Mountain War Time on July 16, 1945, civilization changed forever.

The flash lit the distant desert mountains. Through a piece of dark glass, Bush looked at the fireball rising above the New Mexico desert. The Trinity test was a success. The United States had exploded the first atomic bomb. Returning to the gate of the nearby base, Bush waited for the physicist J. Robert Oppenheimer, the chief scientist in the project, to drive past on his way to a vacation. Bush tipped his hat. Oppenheimer wrote later: “We knew the world would not be the same. A few people laughed, a few people cried, most people were silent. I remembered the line from the Hindu scripture, the Bhagavad-Gita. Vishnu is trying to persuade the Prince that he should do his duty and to impress him takes on his multi-armed form and says, ‘Now, I am become Death, the destroyer of worlds.’ ”3

On August 6, the United States dropped an atomic bomb on Hiroshima, Japan, killing 100,000 of its inhabitants. On August 9, a second bomb incinerated Nagasaki. On August 15, Emperor Hirohito announced the surrender of Japan. Earlier, on May 7, following the suicide of Adolf Hitler on April 30, the German government had formally surrendered. World War II was over. And the third industrial revolution was under way.

Many individuals contributed to the third industrial revolution of the mid-twentieth century, which engendered nuclear energy and information technology, as the second industrial revolution had produced electric power generation and the internal combustion engine and the first had given rise to the steam engine and the telegraph. But even when he was playing only a supporting role, Bush was present at key moments in the early years of the third industrial era, with personal connections to everything from the Manhattan Project and the computer to microwave ovens.

Thomas Edison was similarly ubiquitous during the second industrial revolution, inventing or contributing to the development of transformative technologies including electric-power generation, the incandescent lightbulb, the phonograph, and the motion picture. Edison was a folk hero in his day and his fame endures. In contrast, Bush is little known, except to historians of information technology. What explains Edison’s continuing celebrity and Bush’s relative obscurity?

One reason is publicity. Partly in order to raise funds from private investors, Edison did everything in a blaze of self-generated publicity, often making promises of imminent breakthroughs that he could not keep. In contrast, the wartime work of Bush and the Office of Scientific Research and Development was classified.

Another reason for the failure of Bush to find a place in the imagination of later generations has to do with the changing nature of technological innovation. Already by Edison’s time, corporate and government laboratories and university research institutes in Germany, Britain, and the United States were displacing the individual inventor, even as enormous, consolidated managerial companies were succeeding small firms owned and operated by their founders. By World War II, the personnel and resource requirements of basic scientific research were so immense that only governments like the US federal government could organize and fund them.

As an engineer and scientist in his own right, Bush made many contributions to the evolution of technology and inspired countless others. But his most lasting contribution may be not any particular technology but the institutional structure that generates technological breakthroughs. In his report and later book Science, the Endless Frontier, commissioned by President Roosevelt, and his work with Congress in establishing the National Science Foundation, Bush helped to lay the foundation for the creative collaboration among government, the academy, and industry from which most transformative innovations in recent generations have emerged.

“AS WE MAY THINK”

In July 1945, as Bush was overseeing the Trinity test, the Atlantic published his essay “As We May Think,” in which he speculated about the possibilities for technological augmentation of the human mind.4

Among the new technologies that Bush correctly predicted in “As We May Think” were a “thinking machine” (the calculator), a “vocoder” which would type in response to dictation (voice-activated software), and a “cyclops camera” worn on the forehead (this has yet to arrive, although cell phones with digital photography are close approximations).

The most important of the imaginary devices that Bush described in his 1945 essay was the memex, a version of the personal computer that became a universal appliance in developed societies by 2000. He got the mechanism wrong, speculating that the combination of pictures and text would be embodied in microtape. But his vision of a desk and a screen inspired the engineers who developed the monitor and the mouse. And his speculations about a universal network with a library that could be accessed through “trails” mimicking the associative nature of human thought anticipated the Internet, online dictionaries like Wikipedia, and hyperlinks.

The impact of “As We May Think” was magnified by media interest in the article. After its initial publication in the Atlantic, the essay was popularized on July 23, 1945, in Time under the title “A Machine That Thinks,” and on September 10, 1945, Life magazine published an illustrated condensation: “A top U.S. scientist foresees a possible future world in which man-made machines will start to think.”5

What prevented “As We May Think” from being a catalog of gadgets was Bush’s discussion of how information technology could be used to augment human intelligence. Popularization of his work notwithstanding, he was more interested in a machine to help thinking than in a thinking machine. Because human thought is based to a large degree on associations among ideas and images, Bush believed that there was a need for “associative indexing, the basic idea of which is a provision whereby any item may be caused at will to select immediately and automatically another. This is the essential feature of the memex. The process of tying two items together is the important thing.”

Bush’s memex would tie together items by means of “trails.” He provided an example: “The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English long bow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected [on the screen]. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item.”

In addition to main trails and side trails, there would be the “skip trail which stops only on the salient items.” Individuals would save their information trails and share them with friends and colleagues. Bush predicted: “Wholly new forms of encyclopedia will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.”
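Bush’s trails are, in modern terms, a user-built graph of linked records: tying two items together is the primitive operation, and a trail is a walk along those ties. A minimal sketch in Python of his own bow-and-arrow example (the class and function names are illustrative, not drawn from the essay):

```python
# A sketch of Bush's "associative trails": items tied together at will,
# with comments joined to the trail just like any other item.

class Item:
    """One record in the memex: an article, a book page, or a comment."""
    def __init__(self, title, text):
        self.title = title
        self.text = text
        self.links = []          # items this one has been tied to

def tie(a, b):
    """Tie two items together -- for Bush, 'the important thing.'"""
    a.links.append(b)
    b.links.append(a)

# Rebuild the trail from Bush's example.
encyclopedia = Item("Encyclopedia entry", "Sketchy article on the bow and arrow")
history = Item("History", "Turkish bow vs. English longbow in the Crusades")
comment = Item("Owner's comment", "Why was the short bow apparently superior?")

tie(encyclopedia, history)       # main trail: encyclopedia -> history
tie(history, comment)            # side trail: a comment joined to one item

# Follow the trail one step outward from the first item.
trail = [encyclopedia.title] + [linked.title for linked in encyclopedia.links]
```

The mutual `tie` operation, rather than a one-way pointer, is what distinguishes Bush’s trails from the hyperlinks that eventually realized them: a memex trail could be traversed, annotated, and shared in either direction.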

In tribute to Bush, this chapter will be organized as though it were a series of associative trails on an imaginary memex. By following them, we discover that Bush is celebrated as a pioneer of virtually every aspect of today’s computer technology, from its physical form to hyperlinks and the Internet.

MISTER SCIENCE

Let us begin with the main trail of Bush’s biography. “It is interesting that Mister Science looks so much like Mr. America,” Coronet magazine reported in a profile of Bush published in 1952. “He reminds you of somebody—Will Rogers? Uncle Sam? Anyway, you’ve seen this face before, and it belonged to a man you liked.”6

Bush was the quintessential New England Yankee inventor, born the son of a Universalist minister in Everett, Massachusetts, in 1890. Earning degrees from Tufts, Harvard, and MIT, he worked for the navy during World War I on the problem of detecting submarines and then became a professor at MIT. While working on a “network analyzer” that simulated electrical networks, Bush and his MIT team developed the differential analyzer. The differential analyzer was an early computer that used electromechanical gears and spinning disks to do calculations, a version of the “difference engine” of which the Victorian British scientist Charles Babbage had dreamed. Bush continued to improve the analyzer after being appointed vice president of MIT and dean of its School of Engineering.

In 1938, as the world moved toward the second global war in a generation, Bush’s appointment as president of the Carnegie Institution of Washington brought him to Washington, DC. In June 1940, Bush met with President Roosevelt and argued that there was a need for an organization that could coordinate research in military technology. With Roosevelt’s backing, Bush became chairman of the new National Defense Research Committee (NDRC). In his memoirs Bush wrote: “There were those who protested that the action of setting up NDRC was an end run, a grab by which a small company of scientists and engineers, acting outside established channels, got hold of the authority and money for the program of developing new weapons. That, in fact, is exactly what it was.”7

In June 1941, the NDRC was superseded by the OSRD. As its director, Bush reported directly to the president. He presided over the greatest R&D organization of all time. With inexhaustible military resources at his command and teams made up of great scientists who had fled Hitler, such as Leo Szilard, Enrico Fermi, and Niels Bohr, as well as brilliant Americans, Bush supervised one breakthrough after another, in fields as different as radar and nuclear energy, jet engines, and early computers. Science was now organized, marshaled, and mobilized in the service of the war against Hitler and his allies. The press called Bush “the general of science.”

Bush found the perfect patron and partner in Roosevelt. All his life FDR was entranced by visions of abundance and freedom made possible by technological advances. Following World War I, in his role as assistant secretary of the navy, Roosevelt promoted the infant American radio industry by creating the public-private Radio Corporation of America (RCA). In the only book he ever published, Whither Bound?, a 1926 lecture at the Milton Academy prep school, Roosevelt praised “the scientist people, and economists, and industrialists, and wild-eyed progressives” who were “bringing so many new things into everyday life.” He foresaw the day “when by the twist of a knob or the push of a button you see and talk with some true friend half the world away. . . . Cheaper power, canned power, compressed into light weight and small bulk, will make our present flying abilities childish within our own lives. So, too, with transportation by land.”8
