The Information

James Gleick

What happens if you switch on one of these mechanical computers but forget to turn it off before you leave for lunch? Well, I’ll tell you. The same thing would happen in the way of computers in America that happened to Australia with jack rabbits. Before you could multiply 701,945,240 by 879,030,546, every family in the country would have a little computer of their own.…

 

Mr. Shannon, I don’t mean to knock your experiments, but frankly I’m not remotely interested in even one computer, and I’m going to be pretty sore if a gang of them crowd in on me to multiply or divide or whatever they do best.

 

Two years after Shannon raised his warning flag about the bandwagon, a younger information theorist, Peter Elias, published a notice complaining about a paper titled “Information Theory, Photosynthesis, and Religion.”

There was, of course, no such paper. But there had been papers on information theory, life, and topology; information theory and the physics of tissue damage; and clerical systems; and psychopharmacology; and geophysical data interpretation; and crystal structure; and melody. Elias, whose father had worked for Edison as an engineer, was himself a serious specialist—a major contributor to coding theory. He mistrusted the softer, easier, platitudinous work flooding across disciplinary boundaries. The typical paper, he said, “discusses the surprisingly close relationship between the vocabulary and conceptual framework of information theory and that of psychology (or genetics, or linguistics, or psychiatry, or business organization).… The concepts of structure, pattern, entropy, noise, transmitter, receiver, and code are (when properly interpreted) central to both.” He declared this to be larceny. “Having placed the discipline of psychology for the first time on a sound scientific basis, the author modestly leaves the filling in of the outline to the psychologists.” He suggested his colleagues give up larceny for a life of honest toil.

These warnings from Shannon and Elias appeared in one of the growing number of new journals entirely devoted to information theory.

In these circles a notorious buzzword was entropy. Another researcher, Colin Cherry, complained, “We have heard of ‘entropies’ of languages, social systems, and economic systems and of its use in various method-starved studies. It is the kind of sweeping generality which people will clutch like a straw.”

He did not say, because it was not yet apparent, that information theory was beginning to change the course of theoretical physics and of the life sciences and that entropy was one of the reasons.

In the social sciences, the direct influence of information theorists had passed its peak. The specialized mathematics had less and less to contribute to psychology and more and more to computer science. But their contributions had been real. They had catalyzed the social sciences and prepared them for the new age under way. The work had begun; the informational turn could not be undone.


As Jean-Pierre Dupuy remarks: “It was, at bottom, a perfectly ordinary situation, in which scientists blamed nonscientists for taking them at their word. Having planted the idea in the public mind that thinking machines were just around the corner, the cyberneticians hastened to dissociate themselves from anyone gullible enough to believe such a thing.”

9 | ENTROPY AND ITS DEMONS
 
(You Cannot Stir Things Apart)
 

Thought interferes with the probability of events, and, in the long run therefore, with entropy.

—David L. Watson (1930)

 

IT WOULD BE AN EXAGGERATION TO SAY that no one knew what entropy meant. Still, it was one of those words. The rumor at Bell Labs was that Shannon had gotten it from John von Neumann, who advised him he would win every argument because no one would understand it.

Untrue, but plausible. The word began by meaning the opposite of itself. It remains excruciatingly difficult to define. The Oxford English Dictionary, uncharacteristically, punts:

1. The name given to one of the quantitative elements which determine the thermodynamic condition of a portion of matter.

 
 

Rudolf Clausius coined the word in 1865, in the course of creating a science of thermodynamics. He needed to name a certain quantity that he had discovered—a quantity related to energy, but not energy.

Thermodynamics arose hand in hand with steam engines; it was at first nothing more than “the theoretical study of the steam engine.”

It concerned itself with the conversion of heat, or energy, into work. As this occurs—heat drives an engine—Clausius observed that the heat does not actually get lost; it merely passes from a hotter body into a cooler body. On its way, it accomplishes something. This is like a waterwheel, as Nicolas Sadi Carnot kept pointing out in France: water begins at the top and ends at the bottom, and no water is gained or lost, but the water performs work on the way down. Carnot imagined heat as just such a substance. The ability of a thermodynamic system to produce work depends not on the heat itself, but on the contrast between hot and cold. A hot stone plunged into cold water can generate work—for example, by creating steam that drives a turbine—but the total heat in the system (stone plus water) remains constant. Eventually, the stone and the water reach the same temperature. No matter how much energy a closed system contains, when everything is the same temperature, no work can be done.
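The strength of that contrast can be put into a single number. As a rough gloss, not spelled out in the text above, the bound now named for Carnot says that the fraction of the flowing heat an ideal engine can turn into work is limited by the ratio of the two absolute temperatures (the symbols here are mine, not the book's):

    \eta_{\max} = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}}

For a stone at 373 kelvin plunged into water at 293 kelvin, the bound is about 21 percent; when stone and water reach the same temperature, it is zero, and no further work can be extracted.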

It is the unavailability of this energy—its uselessness for work—that Clausius wanted to measure. He came up with the word entropy, formed from Greek to mean “transformation content.” His English counterparts immediately saw the point but decided Clausius had it backward in focusing on the negative. James Clerk Maxwell suggested in his Theory of Heat that it would be “more convenient” to make entropy mean the opposite: “the part which can be converted into mechanical work.” Thus:

When the pressure and temperature of the system have become uniform the entropy is exhausted.

 
 

Within a few years, though, Maxwell did an about-face and decided to follow Clausius.

He rewrote his book and added an abashed footnote:

In former editions of this book the meaning of the term Entropy, as introduced by Clausius, was erroneously stated to be that part of the energy which cannot be converted into work. The book then proceeded to use the term as equivalent to the available energy; thus introducing great confusion into the language of thermodynamics. In this edition I have endeavoured to use the word Entropy according to its original definition by Clausius.

 
 

The problem was not just in choosing between positive and negative. It was subtler than that. Maxwell had first considered entropy as a subtype of energy: the energy available for work. On reconsideration, he recognized that thermodynamics needed an entirely different measure. Entropy was not a kind of energy or an amount of energy; it was, as Clausius had said, the unavailability of energy. Abstract though this was, it turned out to be a quantity as measurable as temperature, volume, or pressure.
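For readers who want the measure in symbols, the textbook form of Clausius's definition (the notation is mine, not the book's) says that when a small quantity of heat \delta Q flows reversibly into a body at absolute temperature T, the entropy S of that body changes by

    dS = \frac{\delta Q_{\mathrm{rev}}}{T}

The same heat counts for more entropy when it enters a cold body than when it leaves a hot one, which is why every ordinary flow of heat from hot to cold pushes the total upward.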

It became a totemic concept. With entropy, the “laws” of thermodynamics could be neatly expressed:

First law: The energy of the universe is constant.

 

Second law: The entropy of the universe always increases.

 
 

There are many other formulations of these laws, from the mathematical to the whimsical, e.g., “1. You can’t win; 2. You can’t break even either.”

But this is the cosmic, fateful one. The universe is running down. It is a degenerative one-way street. The final state of maximum entropy is our destiny.

William Thomson, Lord Kelvin, imprinted the second law on the popular imagination by reveling in its bleakness: “Although mechanical energy is indestructible,” he declared in 1862, “there is a universal tendency to its dissipation, which produces gradual augmentation and diffusion of heat, cessation of motion, and exhaustion of potential energy through the material universe. The result of this would be a state of universal rest and death.”

Thus entropy dictated the universe’s fate in H. G. Wells’s novel The Time Machine: the life ebbing away, the dying sun, the “abominable desolation that hung over the world.” Heat death is not cold; it is lukewarm and dull. Freud thought he saw something useful there in 1918, though he muddled it: “In considering the conversion of psychical energy no less than of physical, we must make use of the concept of an entropy, which opposes the undoing of what has already occurred.”

Thomson liked the word dissipation for this. Energy is not lost, but it dissipates. Dissipated energy is present but useless. It was Maxwell, though, who began to focus on the confusion itself—the disorder—as entropy’s essential quality. Disorder seemed strangely unphysical. It implied that a piece of the equation must be something like knowledge, or intelligence, or judgment. “The idea of dissipation of energy depends on the extent of our knowledge,” Maxwell said. “Available energy is energy which we can direct into any desired channel. Dissipated energy is energy which we cannot lay hold of and direct at pleasure, such as the energy of the confused agitation of molecules which we call heat.” What we can do, or know, became part of the definition. It seemed impossible to talk about order and disorder without involving an agent or an observer—without talking about the mind:

Confusion, like the correlative term order, is not a property of material things in themselves, but only in relation to the mind which perceives them. A memorandum-book does not, provided it is neatly written, appear confused to an illiterate person, or to the owner who understands it thoroughly, but to any other person able to read it appears to be inextricably confused. Similarly the notion of dissipated energy could not occur to a being who could not turn any of the energies of nature to his own account, or to one who could trace the motion of every molecule and seize it at the right moment.

 
 

Order is subjective—in the eye of the beholder. Order and confusion are not the sorts of things a mathematician would try to define or measure. Or are they? If disorder corresponded to entropy, maybe it was ready for scientific treatment after all.

As an ideal case, the pioneers of thermodynamics considered a box of gas. Being made of atoms, it is far from simple or calm. It is a vast ensemble of agitating particles. Atoms were unseen and hypothetical, but these theorists—Clausius, Kelvin, Maxwell, Ludwig Boltzmann, Willard Gibbs—accepted the atomic nature of a fluid and tried to work out the consequences: mixing, violence, continuous motion. This motion constitutes heat, they now understood. Heat is no substance, no fluid, no “phlogiston”—just the motion of molecules.

Individually the molecules must be obeying Newton’s laws—every action, every collision, measurable and calculable, in theory. But there were too many to measure and calculate individually. Probability entered the picture. The new science of statistical mechanics made a bridge between the microscopic details and the macroscopic behavior. Suppose the box of gas is divided by a diaphragm. The gas on side A is hotter than the gas on side B—that is, the A molecules are moving faster, with greater energy. As soon as the divider is removed, the molecules begin to mix; the fast collide with the slow; energy is exchanged; and after some time the gas reaches a uniform temperature. The mystery is this: Why can the process not be reversed? In Newton’s equations of motion, time can have a plus sign or a minus sign; the mathematics works either way. In the real world past and future cannot be interchanged so easily.
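A toy simulation, offered purely as an illustration and not drawn from the book, makes the statistical bridge concrete. The molecule count, the starting energies, and the crude collision rule below are all inventions of the sketch; only the one-way drift they produce is the point.

    import random

    random.seed(0)
    N = 1000                                  # molecules on each side of the divider
    energies = [10.0] * N + [1.0] * N         # side A starts hot, side B starts cold

    def side_average(values):
        return sum(values) / len(values)

    for step in range(6):
        a, b = energies[:N], energies[N:]
        print(f"step {step}: side A = {side_average(a):.2f}, side B = {side_average(b):.2f}")
        # Remove the diaphragm: randomly chosen pairs "collide" and share their energy.
        # Total energy is conserved throughout; only its distribution changes.
        for _ in range(5000):
            i, j = random.randrange(2 * N), random.randrange(2 * N)
            energies[i] = energies[j] = (energies[i] + energies[j]) / 2

Run forward, the two averages settle onto a common value within a few rounds of collisions and stay there. Nothing in the rules forbids the hot and cold averages from reappearing; no run of the loop will ever show it.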

“Time flows on, never comes back,” said Léon Brillouin in 1949. “When the physicist is confronted with this fact he is greatly disturbed.”

Maxwell had been mildly disturbed. He wrote to Lord Rayleigh:

If this world is a purely dynamical system, and if you accurately reverse the motion of every particle of it at the same instant, then all things will happen backwards to the beginning of things, the raindrops will collect themselves from the ground and fly up to the clouds, etc, etc, and men will see their friends passing from the grave to the cradle till we ourselves become the reverse of born, whatever that is.

 
 

His point was that in the microscopic details, if we watch the motions of individual molecules, their behavior is the same forward and backward in time. We can run the film backward. But pan out, watch the box of gas as an ensemble, and statistically the mixing process becomes a one-way street. We can watch the fluid for all eternity, and it will never divide itself into hot molecules on one side and cool on the other. The clever young Thomasina says in Tom Stoppard’s Arcadia, “You cannot stir things apart,” and this is precisely the same as “Time flows on, never comes back.” Such processes run in one direction only. Probability is the reason. What is remarkable—physicists took a long time to accept it—is that every irreversible process must be explained the same way. Time itself depends on chance, or “the accidents of life,” as Richard Feynman liked to say: “Well, you see that all there is to it is that the irreversibility is caused by the general accidents of life.”

For the box of gas to come unmixed is not physically impossible; it is just improbable in the extreme. So the second law is merely probabilistic. Statistically, everything tends toward maximum entropy.
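One back-of-the-envelope figure, not in the passage above, shows how extreme the improbability is. If each of N molecules is equally likely to be found in either half of the box, the chance of catching all of them in, say, the left half at a given instant is

    P = 2^{-N}, \qquad \text{so for } N = 100, \quad P \approx 8 \times 10^{-31},

and a real box of gas holds molecules on the order of 10^{23}, not a hundred. “Improbable in the extreme” is, if anything, an understatement.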
