Stephen Hawking

Authors: John Gribbin

Einstein used to say that the inspiration for his general theory of relativity (which is, above all, a theory of gravity)
came from the realization that a person inside a falling elevator whose cable had snapped would not feel gravity at all. We can picture exactly what he meant because we have seen film of astronauts orbiting the Earth in spacecraft. Such an orbiting spacecraft is not “outside” the influence of the Earth's gravity; indeed, it is held in orbit by gravity. But the spacecraft and everything in it is falling around the Earth with the same acceleration, so the astronauts have no weight and float within their capsule. For them, it is as if gravity does not exist, a phenomenon known as free fall. But Einstein had never seen any of this and had to picture the situation in a freely falling elevator in his imagination. It is as if the acceleration of the falling elevator, speeding up with every second that passes, precisely cancels out the influence of gravity. For that to be possible, gravity and acceleration must be exactly equivalent to one another.

The way this led Einstein to develop a theory of gravity was through considering the implications for a beam of light, the universal measuring tool of special relativity. Imagine shining a flashlight horizontally across the elevator from one side to the other. In the freely falling elevator, objects obey Newton's laws: they move in straight lines, from the point of view of an observer in the elevator, bounce off each other with action and reaction equal and opposite, and so on. And, crucially, from the point of view of the observer in the elevator, light travels in straight lines.

But how do things look to an observer standing on the ground watching the elevator fall? The light would appear to follow a track that always stays exactly the same distance below the roof of the elevator. But in the time it takes the light
to cross the elevator, the elevator has accelerated downward, and the light in the beam must have done the same. In order for the light to stay the same distance below the roof all the way across, the light pulse must follow a curved path as seen from outside the elevator. In other words, a light beam must be bent by the effect of gravity.

Einstein explained this in terms of bent spacetime. He suggested that the presence of matter in space distorts the spacetime around it, so that objects moving through the distorted spacetime are deflected, just as if they were being tugged in ordinary “flat” space by a force inversely proportional to the square of the distance. Having thought up the idea, Einstein then developed a set of equations to describe all this. The task took him ten years. When he had finished, Newton's famous inverse-square law reemerged from Einstein's new theory of gravity; but general relativity went far beyond Newton's theory, because it also offered an all-embracing theory of the whole Universe. The general theory describes all of spacetime and therefore all of space and all of time. (There is a neat way to remember how it works. Matter tells spacetime how to bend; bends in spacetime tell matter how to move. But, the equations also insisted, spacetime itself can also move, in its own fashion.)

The general theory was completed in 1915 and published in 1916. Among other things, it predicted that beams of light from distant stars, passing close by the Sun, would be bent as they moved through spacetime distorted by the Sun's mass. This would shift the apparent positions of those stars in the sky—and the shift might actually be seen, and photographed, during a total eclipse, when the Sun's blinding light is blotted
out. Just such an eclipse took place in 1919; the photographs were taken and showed exactly the effect Einstein had predicted. Bent spacetime was real: the general theory of relativity was correct.
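The size of the predicted deflection can be checked in a few lines of Python. The formula, deflection = 4GM/(c²R) for a ray grazing the Sun's edge, is the standard general-relativistic result (it is not quoted in the text above); the constants are standard published values.

```python
# Deflection of starlight grazing the Sun, as predicted by general relativity:
# theta = 4 * G * M / (c**2 * R), in radians.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30     # mass of the Sun, kg
R_sun = 6.957e8      # radius of the Sun, m
c = 2.998e8          # speed of light, m/s

theta_rad = 4 * G * M_sun / (c**2 * R_sun)
theta_arcsec = theta_rad * (180 / 3.141592653589793) * 3600
print(f"deflection: {theta_arcsec:.2f} arcseconds")  # about 1.75 arcseconds
```

The answer, about 1.75 seconds of arc, is tiny, which is why a total eclipse was needed to photograph star positions close enough to the Sun to reveal the shift.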

But the equations developed by Einstein to describe the distortion of spacetime by the presence of matter, the very equations that were so triumphantly vindicated by the eclipse observations, contained a baffling feature that even Einstein could not comprehend. The equations insisted that the spacetime in which the material Universe is embedded could not be static. It must be either expanding or contracting.

Exasperated, Einstein added another term to his equations, for the sole purpose of holding spacetime still. Even at the beginning of the 1920s, he still shared (along with all his contemporaries) the Newtonian idea of a static Universe. But within ten years, observations made by Edwin Hubble with a new and powerful telescope on a mountaintop in California had shown that the Universe is expanding.

The stars in the sky are not moving farther apart from one another. The individual stars we can see from Earth all belong to a huge system, the Milky Way Galaxy, which contains about a hundred billion stars and is like an island in space. In the 1920s, astronomers discovered with the aid of new telescopes that there are many other galaxies beyond the Milky Way, many of them containing hundreds of billions of stars like our Sun. And it is the galaxies, not individual stars, that are receding from one another, being carried farther apart as the space in which they are embedded expands.
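Hubble summed up the recession of the galaxies in a simple proportionality, now called Hubble's law: recession velocity = H₀ × distance. The text quotes no numbers, so the value of H₀ below (roughly the modern figure of 70 km/s per megaparsec) is purely an illustrative assumption.

```python
# Hubble's law: recession velocity is proportional to distance, v = H0 * d.
# H0 = 70 km/s per megaparsec is an assumed, illustrative modern value;
# the text itself quotes no figure.
H0 = 70.0  # km/s per Mpc

for d_mpc in (10, 100, 1000):
    v = H0 * d_mpc
    print(f"galaxy at {d_mpc:>4} Mpc recedes at about {v:,.0f} km/s")
```

The pattern is the key point: double the distance and the recession speed doubles, exactly what uniformly expanding space produces.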

If anything, this was an even more extraordinary and impressive prediction of the general theory than the bending
of light detectable during an eclipse. The equations had predicted something that even Einstein at first refused to believe but which observations later showed to be correct. The impact on scientists' perception of the world was shattering. The Universe was not static, after all, but evolving; Einstein later described his attempt to fiddle the equations to hold the Universe still as “the greatest blunder of my life.” By the end of the 1920s, the observations and the theory agreed that the Universe is expanding. And if galaxies are getting farther apart, that means that long ago they must have been closer together. How close could they ever have been? What happened in the time when galaxies must have been touching one another and before then?

The idea that the Universe was born in a super-dense, super-hot fireball known as the Big Bang is now a cornerstone of science, but it took over fifty years for the theory to be fully developed. Just at the time astronomers were finding evidence for the universal expansion, transforming the scientific image of the Universe at large, their physicist colleagues were developing the quantum theory, transforming our understanding of the very small. Attention focused chiefly on the development of the quantum theory over the next few decades, with relativity and cosmology (the study of the Universe at large) becoming an exotic branch of science investigated by only a few specialist mathematicians. The union of large and small still lay far in the future, even at the end of the 1920s.

As the nineteenth century gave way to the twentieth, physicists were forced to revise their notions about the nature of
light. This initially modest readjustment of their worldview grew, like an avalanche triggered by a snowball rolling down a hill, to become a revolution that engulfed the whole of physics—the quantum revolution.

The first step was the realization that electromagnetic energy cannot always be treated simply as a wave passing through space. In some circumstances, a beam of light, for example, will behave more like a stream of tiny particles (now called photons). One of the people instrumental in establishing this “wave-particle duality” of light was Einstein, who in 1905 showed how the way in which electrons are knocked out of the atoms in a metal surface by electromagnetic radiation (the photoelectric effect) can be explained neatly in terms of photons, not in terms of a pure wave of electromagnetic energy. (It was for this work, not his two theories of relativity, that Einstein received his Nobel Prize.)
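Einstein's explanation of the photoelectric effect boils down to one equation: the maximum kinetic energy of an ejected electron is K_max = hf − W, where f is the light's frequency and W is the energy needed to free an electron from the metal (its "work function"). The specific numbers below, violet light on sodium, are illustrative assumptions, not figures from the text.

```python
# Einstein's photoelectric equation: K_max = h*f - W.
# Each photon delivers energy h*f; W of it is spent freeing the electron.
h = 6.626e-34      # Planck's constant, J*s
e = 1.602e-19      # joules per electronvolt

f = 7.5e14         # violet light (~400 nm), Hz -- illustrative choice
W = 2.3 * e        # work function of sodium, ~2.3 eV -- typical value

K_max = h * f - W
print(f"K_max = {K_max / e:.2f} eV")  # positive, so electrons are ejected
```

The particle picture explains what a pure wave cannot: below the frequency where hf = W, no electrons emerge no matter how bright the light, because no single photon carries enough energy.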

This wave-particle duality changes our whole view of the nature of light. We are used to thinking of momentum as a property to do with the mass of a particle and its speed (or, more correctly, its velocity). If two objects are moving at the same speed, the heavier one carries more momentum and will be harder to stop. A photon does not have mass, and at first sight you might think this means it has no momentum either. But, remember, Einstein discovered that mass and energy are equivalent to one another, and light certainly does carry energy—indeed, a beam of light is a beam of pure energy. So photons do have momentum, related to their energy, even though they have no mass and cannot change their speed. A change in the momentum of a photon means that it has changed the amount of energy it carries, not its velocity; and
a change in the energy of a photon means a change in its wavelength.
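The chain described above, energy fixes momentum, wavelength fixes energy, can be made concrete with the standard photon relations E = hc/λ and p = E/c (these formulas are standard physics, not quoted in the text; the two wavelengths are illustrative).

```python
# A photon's energy is set by its wavelength (E = h*c / wavelength), and its
# momentum by its energy (p = E / c). Halving the wavelength doubles both.
h = 6.626e-34    # Planck's constant, J*s
c = 2.998e8      # speed of light, m/s

for wavelength in (600e-9, 300e-9):   # red light, then ultraviolet
    E = h * c / wavelength            # photon energy, joules
    p = E / c                         # photon momentum, kg*m/s
    print(f"{wavelength*1e9:.0f} nm: E = {E:.3e} J, p = {p:.3e} kg m/s")
```

Note that the speed never changes; only the wavelength (and with it the energy and momentum) does.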

When Einstein put all of this together, it implied that the momentum of a photon multiplied by the wavelength of the associated wave always gives the same number, now known as Planck's constant in honor of Max Planck, another of the quantum pioneers. Planck's constant (usually denoted by the letter h) soon turned out to be one of the most fundamental numbers in physics, ranking alongside the speed of light, c. It cropped up, for example, in the equations developed in the early decades of the twentieth century to describe how electrons are held in orbit around atoms. But although the strange duality of light niggled, the cat was only really set among the pigeons in the 1920s when a French scientist, Louis de Broglie, suggested using the wave-particle equation in reverse. Instead of taking a wavelength (for light) and using this to calculate the momentum of an associated particle (the photon), why not take the momentum of a particle (such as an electron) and use it to calculate the length of an associated wave?

Fired by this suggestion, experimenters soon carried out tests that showed that, under the right circumstances, electrons do indeed behave like waves. In the quantum world (the world of the very small, on the scale of atoms and below), particles and waves are simply twin facets of all entities. Waves can behave like particles; particles can behave like waves. A term was even coined to describe these quantum entities—“wavicles.” The dual description of particles as waves and waves as particles turned out to be the key to unlocking the secrets of the quantum world, leading to the development of a satisfactory theory to account for the behavior of atoms,
particles, and light. But at the core of that theory lay a deep mystery.

Because all quantum entities have a wave aspect, they cannot be pinned down precisely to a definite location in space. By their very nature, waves are spread-out things. So we cannot be certain where, precisely, an electron is—and uncertainty, it turns out, is an integral feature of the quantum world. The German physicist Werner Heisenberg established in the 1920s that all observable quantities are subject, on the quantum scale, to random variations in their size, with the magnitude of these variations determined by Planck's constant. This is Heisenberg's famous “uncertainty principle.” It means that we can never make a precise determination of all the properties of an object like an electron: all we can do is assign probabilities, determined in a very accurate way from the equations of quantum mechanics, to the likelihood that, for example, the electron is in a certain place at a certain time.
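Heisenberg's principle is usually written Δx · Δp ≥ ħ/2, where ħ is Planck's constant divided by 2π. A quick sketch (the atom-sized confinement below is an illustrative assumption) shows how pinning down an electron's position forces a huge spread in its velocity:

```python
# Heisenberg's uncertainty principle: delta_x * delta_p >= hbar / 2.
# Confining an electron to an atom-sized region forces a large momentum spread.
hbar = 1.055e-34     # reduced Planck's constant, J*s
m_e = 9.109e-31      # electron mass, kg

delta_x = 1e-10                    # roughly the size of an atom, m
delta_p = hbar / (2 * delta_x)     # minimum possible momentum uncertainty
delta_v = delta_p / m_e            # corresponding velocity uncertainty
print(f"velocity uncertainty: {delta_v:.2e} m/s")  # hundreds of km per second
```

The product of the two uncertainties is fixed by Planck's constant: squeeze the position spread and the momentum spread must grow to compensate.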

Furthermore, the uncertain, probabilistic nature of the quantum world means that if two identical wavicles are treated in an identical fashion (perhaps by undergoing a collision with another type of wavicle), they will not necessarily respond in identical fashions. That is, the outcome of experiments is also uncertain, at the quantum level, and can be predicted only in terms of probabilities. Electrons and atoms are not like tiny billiard balls bouncing around in accordance with Newton's laws.

None of this shows up on the scale of our everyday lives, where objects such as billiard balls do bounce off each other in a predictable, deterministic fashion, in line with Newton's laws. The reason is that Planck's constant is incredibly small:
in standard units used by physicists, it is a mere 6 × 10⁻³⁴ (a decimal point followed by 33 zeros and a 6 [i.e., 0.0000000000000000000000000000000006]) of a joule-second. And a joule is indeed a sensible sort of unit in everyday life—a 60-watt light bulb radiates 60 joules of energy every second. For everyday objects like billiard balls, or ourselves, the small size of Planck's constant means that the wave associated with the object has a comparably small wavelength and can be ignored. But even a billiard ball, or yourself, does have an associated quantum wave—even though it is only for tiny objects like electrons, with tiny amounts of momentum, that you get a wave big enough to interfere with the way objects interact.
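The same de Broglie formula, wavelength = h/p, makes the point numerically for an everyday object (the mass and speed below are illustrative assumptions for a billiard ball):

```python
# The de Broglie formula applied to an everyday object: the wavelength is so
# small that the object's wave aspect is utterly negligible.
h = 6.626e-34     # Planck's constant, J*s

m = 0.17          # mass of a billiard ball, kg (illustrative)
v = 1.0           # rolling at 1 metre per second
wavelength = h / (m * v)
print(f"billiard-ball wavelength: {wavelength:.2e} m")  # ~4e-33 m
```

A wavelength of about 4 × 10⁻³³ metres is unimaginably smaller than the ball itself, or indeed than an atomic nucleus, which is why billiard balls obey Newton's laws to any accuracy we can measure.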

It all sounds very obscure, something we can safely leave the physicists to worry about while we get on with our everyday lives. To a large extent, that is true, although it is worth realizing that the physics behind how computers or TV sets work depends on an understanding of the quantum behavior of electrons. Laser beams, also, can be understood only in terms of quantum physics, and every compact disc player uses a laser beam to scan the disc and “read” the music. So quantum physics actually does impinge on our everyday lives, even if we do not need to be a quantum mechanic to make a TV set or a hi-fi system work. But there is something much more important to our everyday lives inherent in quantum physics. By introducing uncertainty and probability into the equations, quantum physics does away once and for all with the predictive clockwork of Newtonian determinism. If the Universe operates, at the deepest level, in a genuinely unpredictable and indeterministic way, then we are given back our
free will, and we can after all make our own decisions and our own mistakes.
