
Right about the time of Hollerith’s death, a German
engineer named Konrad Zuse approached some of the same challenges that
had confronted Charles Babbage a hundred years earlier: how to build
his own version of a universal computing machine that could reconfigure
itself depending upon the type of calculation the operator wanted to
perform. Zuse decided that instead of working with a machine that
operated on the decimal system, which limited the types of arithmetic
calculations it could perform, his machine would use only two numbers,
0 and 1, the binary system. This meant that he could process any type
of mathematical equation through the opening or closing of a series of
electromagnetic relays, switches that would act as valves or gates
either letting current through or shutting it off. These relays were
the same types of devices that the large telephone companies, like the
Bell system in the United States, were using as the basis of their
networks. By marrying an electrical power supply and electric switches
to the architecture of Babbage’s Analytical Engine and basing
his computations in a binary instead of a decimal system, Zuse had come
up with the European version of the first electrical digital computer,
an entirely new device. It was just three years before the German
invasion of Poland and the outbreak of World War II.
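
To make the relay idea concrete, here is a minimal sketch in modern Python, offered only as an illustration; the names to_relays, add_relays, and from_relays are invented for this example, not anything Zuse built. Each relay is modeled as an on/off value, a row of relays encodes a binary number, and addition ripples through gate-like AND, OR, and XOR decisions the way current would move through cascaded relay stages.

# Illustrative sketch only: each electromagnetic relay becomes a Python boolean.
def to_relays(n, width=8):
    # Encode the integer n as a row of relay states, least significant bit first.
    return [bool((n >> i) & 1) for i in range(width)]

def add_relays(a, b):
    # Add two relay rows using only gate-like AND/OR/XOR decisions,
    # with a carry rippling from one relay stage to the next.
    result, carry = [], False
    for x, y in zip(a, b):
        result.append(x ^ y ^ carry)
        carry = (x and y) or (carry and (x ^ y))
    return result

def from_relays(bits):
    # Read a row of relay states back as an ordinary integer.
    return sum(1 << i for i, bit in enumerate(bits) if bit)

print(from_relays(add_relays(to_relays(19), to_relays(23))))  # prints 42

The same open-or-closed switches, chained together, are enough for any arithmetic the operator cares to set up, which is the whole force of Zuse’s choice of the binary system.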

In the United States at about the same time as Zuse was
assembling his first computer in his parents’ living room,
Harvard mathematics professor Howard Aiken was trying to reconstruct a
theoretical version of Babbage’s computer, also using
electromagnetic relays as switching devices and relying on a binary
number system. The difference between Aiken and Zuse was that Aiken had academic credentials, and his background as an innovative mathematician
got him into the office of Thomas Watson, president of IBM, to whom he
presented his proposal for the first American digital computer. Watson
was impressed, authorized a budget of $1 million, and, right before the attack on Pearl Harbor, design work on the project began in Cambridge, Massachusetts. It was then moved to IBM headquarters in New
York during the war.

Because of their theoretical ability to calculate large sets
of numbers in a relatively short period of time, digital computers were
drafted into the war effort in the United Kingdom as code-breaking devices. By 1943, at the same time that IBM’s first shiny stainless steel version of Aiken’s computer was up and running in Endicott, New York, the British were using their dedicated cryptanalytical Colossus computer to break the German codes and defeat the code-creating ability of the German Enigma, the cipher machine that the Nazis believed made their transmissions indecipherable to the Allies. Unlike the IBM-Aiken computer at Harvard and Konrad Zuse’s experimental computer in Berlin, the Colossus used radio vacuum tubes as its switches and was therefore hundreds of times faster than any experimental computer using electromagnetic relays. The Colossus was a true breakthrough because it married the speed of vacuum tube technology with the component design of the Analytical Engine to create the first modern-era digital computer.

The British used the Colossus so effectively that they quickly
felt the need to build more of them to process the increasingly large
volume of encrypted transmissions the Germans were sending, ignorant of
the fact that the Allies were decoding every word and outsmarting them
at every turn. I would argue even to this day that the technological advantage the Allies enjoyed in intelligence-gathering apparatus, specifically code-breaking computers and radar, enabled us to win the war despite Hitler’s initial successes and his early weapon
advantages. The Allies’ use of the digital computer in World
War II was an example of how a superior technological advantage can
make the difference between victory and defeat no matter what kinds of
weapons or numbers of troops the enemy is able to deploy.

The American and British experience with computers during the
war and our government’s commitment to developing a viable
digital computer led to the creation, in the years immediately
following the war, of a computer called the Electronic Numerical
Integrator and Calculator, or ENIAC. ENIAC was the brainchild of Howard Aiken and one of our Army R&D brain trust advisers, the mathematician John von Neumann. Although it operated on a decimal instead of a binary system and had a very small memory, it relied on fast radio vacuum tube switching technology. For its time it was the first of what today are called “number crunchers.”

When measured against the way computers developed over the
years since its first installation, especially the personal computers
of today, ENIAC was something of a dinosaur. It was loud, hot,
cumbersome, fitful, and required the power supply of an entire town to
keep it going. It couldn’t stay up for very long because the
radio tubes, always unreliable even under the best working conditions,
would blow out after only a few hours’ work and had to be
replaced. But the machine worked, it crunched the numbers it was fed,
and it showed the way for the next model, which reflected the
sophisticated symbolic architectural design of John von Neumann.

Von Neumann suggested that instead of feeding the computer the
programs you wanted it to run every time you turned it on, the programs
themselves could be stored in the computer permanently. By treating the
programs themselves as components of the machine, stored right in the
hardware, the computer could change between programs, or the routines
of subprograms, as necessary in order to solve problems. This meant that larger routines could be broken down into subroutines, which
themselves could be organized into templates to solve similar problems.
In complex applications, programs could call up other programs again
and again without the need of human intervention and could even change
the subprograms to fit the application. Von Neumann had invented block programming, the basis for the sophisticated engineering and business programming of the late 1950s and 1960s and the great-great-grandmother of today’s object-oriented programming.
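
A loose modern analogue may help here, again only as a sketch in Python; the names memory, run, and sum_of_squares are invented for this illustration, not anything von Neumann wrote. The point is that the routines live in the same store as the data, so one routine can fetch and call another without an operator feeding anything back in.

# Illustrative sketch only: routines stored alongside the data they work on.
memory = {
    "square": lambda x: x * x,
    "double": lambda x: x + x,
    # A larger routine composed from stored subroutines, the way block
    # programming assembled bigger blocks out of reusable smaller ones.
    "sum_of_squares": lambda xs: sum(memory["square"](x) for x in xs),
}

def run(name, argument):
    # Fetch a routine out of the shared store by name and execute it,
    # the way a stored-program machine fetches its next instruction.
    return memory[name](argument)

print(run("double", 21))              # prints 42
print(run("sum_of_squares", [3, 4]))  # prints 25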

By 1947, it had all come together: the design of the machine,
the electrical power supply, the radio vacuum tube technology, the
logic of machine processing, von Neumann’s mathematical
architecture, and practical applications for the computer’s
use. But just a few years shy of the midpoint of the century, the
computer itself was the product of eighteenth- and nineteenth-century thinking and technology. In fact, given the shortcomings of the radio
tube and the enormous power demands and cooling requirements to keep
the computer working, the development of the computer seemed to have
come to a dead end. Although IBM and Bell Labs were investing huge sums
of development money into designing a computer that had a lower
operational and maintenance overhead, it seemed, given the technology
of the digital computer circa 1947, that there was no place it could
go. It was simply an expensive-to-build, expensive-to-run, lumbering elephant at the end of the line. And then an alien spacecraft fell out
of the skies over Roswell, scattered across the desert floor, and in
one evening everything changed.

In 1948 the first junction transistor - a microscopically thin
silicon sandwich of n-type silicon, in which some of the atoms have an
extra electron, and p-type silicon, in which some of the atoms have one
less electron - was devised by physicist William Shockley. The
invention was credited to Bell Telephone Laboratories, and, as if by
magic, the dead end that had stopped the development of the dinosaur-like ENIAC generation of computers melted away and an entirely new generation of miniaturized circuitry began. Where the radio tube circuit required an enormous power supply to heat it up, because the heat was what generated the flow of current, the transistor required very low levels of power and no warm-up time because it amplified the stream of electrons that flowed into its base. Because it required only
a low level of current, it could be powered by batteries. Because it
didn’t rely on a heat source to generate current and it was
so small, many transistors could be packed into a very small space,
allowing for the miniaturization of circuitry components. Finally,
because it didn’t burn out like the radio tube, it was much
more reliable. Thus, within months after the Roswell crash and the
first exposure of the silicon wafer technology to companies already
involved in the research and development of computers, the limitations on the size and power of the computer suddenly fell away, like a roadblock removed from a highway, and the next generation of computers went into development. This gave Army R&D, especially during the years I was there, the opportunity to encourage that development with defense contracts calling for the implementation of
integrated circuit devices into subsequent generations of weapons
systems.

More than one historian of the microcomputer age has written
that no one before 1947 foresaw the invention of the transistor or had
even dreamed about an entirely new technology that relied upon
semiconductors, which were silicon-based and not carbon-based like the Edison incandescent tube. Bigger than the idea of a calculating machine
or an Analytical Engine or any combination of the components that made
up the first computers of the 1930s and 1940s, the invention of the
transistor and its natural evolution to the silicon chip of integrated
circuitry was beyond what anyone could call a quantum leap of
technology. The entire development arc of the radio tube, from
Edison’s first experiments with filament for his incandescent
lightbulb to the vacuum tubes that formed the switching mechanisms of
ENIAC, lasted about fifty years. The development of the silicon
transistor seemed to come upon us in a matter of months. And, had I not
seen the silicon wafers from the Roswell crash with my own eyes, held
them in my own hands, talked about them with Hermann Oberth, Wernher
von Braun, or Hans Kohler, and heard the reports from these now dead
scientists of the meetings between Nathan Twining, Vannevar Bush, and
researchers at Bell Labs, I would have thought the invention of the
transistor was a miracle. I know now how it came about.

As history revealed, the invention of the transistor was only
the beginning of an integrated circuit technology that developed
through the 1950s and continues right through to the present. By the
time I became personally involved in 1961, the American marketplace had
already witnessed the retooling of Japan and Germany in the 1950s and
Korea and Taiwan in the late 1950s through the early 1960s. General
Trudeau was concerned about this, not because he considered these
countries our economic enemies but because he believed that American
industry would suffer as a result of its complacency about basic
research and development. He expressed this to me on many occasions
during our meetings, and history has proved him to be correct. General
Trudeau believed that the American industrial economy enjoyed a harvest
of technology in the years immediately following World War II, the
effects of which were still under way in the 1960s, but that it would
soon slow down because R&D was an inherently costly undertaking
that didn’t immediately contribute to a company’s
bottom line. And you had to have a good bottom line, General Trudeau
always said, to keep your stockholders happy or else they would revolt
and throw the existing management team right out of the company. By
throwing their efforts into the bottom line, Trudeau said, the big
American industries were actually destroying themselves just like a
family that spends all its savings.

“You have to keep on investing in yourself, Phil,” the General liked to say when he’d look up from his Wall Street Journal before our morning meetings and remark about how stock analysts always liked to place their value on the wrong thing. “Sure, these companies have to make a profit, but you look at the Japanese and the Germans and they know the value of basic research,” he once said to me.
“American companies expect the government to pay for all
their research, and that’s what you and I have to do if we
want to keep them working. But there’s going to come a time
when we can’t afford to pay for it any longer. Then
who’s going to foot the bill?”

General Trudeau was worrying about how the drive for new
electronics products based upon miniaturized circuitry was creating
entirely new markets that were shutting out American companies. He said
that it was becoming cheaper for American companies to have their
products manufactured for them in Asia, where companies had already
retooled after the war to produce transistorized components, than for
American companies, which had heavily invested in the manufacturing
technology of the nineteenth century, to do it themselves. He knew that the requirements of space exploration, of challenging the hostile EBEs in their own territory, depended on the development of an integrated circuit technology so that the electronic components of spacecraft could be miniaturized to fit the size requirements of rocket-propelled vehicles. The race to develop more intelligent missiles and ordnance
also required the development of new types of circuitry that could be
packed into smaller and smaller spaces. But retooled Japanese and
German industries were the only ones able to take immediate advantage
of what General Trudeau called the “new electronics.”

For American industry to get onto the playing field, the basic research would have to be paid for by the military. It was something
General Trudeau was willing to fight for at the Pentagon because he
knew that was the only way we could get the weapons only a handful of
us knew we needed to fight a skirmish war against aliens only a handful
of us knew we were fighting. Arthur Trudeau was a battlefield general
engaged in a lonely military campaign that national policy and secrecy
laws forbade him even to talk about. And as the gulf of time widened
between the Roswell crash and the concerns over postwar economic
expansion, even the people who were fighting the war alongside General
Trudeau were, one by one, beginning to die away. Industry could fight
the war for us, General Trudeau believed, if it was properly seeded
with ideas and the money to develop them. By 1961, we had turned our
attention to the integrated circuit.
