Authors: Robert M. Hazen
Everything—particles, energy, the rate of electron spin—comes in discrete units, and you can’t measure anything without changing it.
Together, these two basic facts explain the operation of atoms, things inside atoms, and things inside things inside atoms.
Quantum mechanics is the branch of science devoted to the study of the behavior of atoms and their constituents.
“Quantum” is the Latin word for “how much” or “bundle,” and “mechanics” is the old term for the study of motion. Thus, quantum mechanics is the study of the motion of things that come in little bundles.
A particle like the electron must come in a “quantized” form. You can have one electron, or two, or three, but never 1.5 or 2.7. It’s not so obvious that something we usually think of as continuous, like light, comes in this form as well. In fact, the quantum or bundle of light is called the “photon” (you may find it useful to remember the “photon torpedoes” of Star Trek fame). It is even less obvious that quantities such as energy and how fast electrons spin come only in discrete bundles as well, but they do. In the quantum world, everything is quantized and can be increased or decreased only in discrete steps.
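To put a number on those bundles: the text does not give the formula, but the standard relation is that a photon’s energy equals Planck’s constant times the light’s frequency. The short Python sketch below is only an illustration; the 550-nanometer wavelength chosen for green light is an arbitrary example.

```python
# Energy comes in bundles: a beam of light can gain or lose energy
# only in whole multiples of the photon energy E = h * f.
# Illustrative values; the specific wavelength is an arbitrary choice.

PLANCK_H = 6.626e-34   # Planck's constant, joule-seconds
LIGHT_SPEED = 3.0e8    # speed of light, meters per second

def photon_energy(wavelength_m: float) -> float:
    """Energy of one quantum (photon) of light with the given wavelength."""
    frequency = LIGHT_SPEED / wavelength_m
    return PLANCK_H * frequency

green = 550e-9                      # green light, 550 nanometers
one_photon = photon_energy(green)   # about 3.6e-19 joules

# The beam's energy is always a whole number of these bundles:
for n in (1, 2, 3):
    print(f"{n} photon(s): {n * one_photon:.2e} J")
# There is no such thing as half a photon, so no value in between.
```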
The behavior of quanta is puzzling at first. The obvious expectation is that when we look at things like electrons, we should find that they behave like microscopic billiard balls—that the world of the very small should behave in pretty much the same way as the ordinary world we experience every day. But an expectation is not the same as a commandment. We can expect the quantum world to be familiar to us, but if it turns out not to be, that doesn’t mean nature is somehow weird or mystical. It just means that things are arranged in such a way that what is “normal” for us at the scale of billiard balls is not “normal” for the universe at the scale of the atom.
The strangeness of the quantum world is especially evident in the operation of the uncertainty principle, sometimes called the Heisenberg uncertainty principle after its discoverer, the German physicist Werner Heisenberg (1901–76). The easiest way to understand the uncertainty principle is to think about what it means to say that you “see” something. In order for you to see these words, for example, light from some source (the sun or a lamp) must strike the book and then travel to your eye. A complex chemical process in your retina converts the energy of the light into a signal that travels to your brain.
Think about the interaction of light with the book. When you look at the book, you do not see it recoil when the light hits it, despite the fact that floods of photons must be bouncing off every second in order for you to see the page. This is the classic Newtonian way of thinking about measurement. It is assumed that the act of measurement (in this case, the act of bouncing light off the book) does not affect the object being measured in any way. Given the infinitesimal energy of the light compared to the energy required to move the book, this is certainly a reasonable way to look at things. After all, baseballs do not jitter
around in the air because photographers are using flashbulbs, nor does the furniture in your living room jump every time you turn on the light.
But does this comfortable, reasonable, Newtonian viewpoint apply in the ultrasmall world of the atom? Can you “see” an electron in the same way that you see this book?
If you think about this question for a moment, you will realize that there is a fundamental difference between “seeing” these two objects. You see a book by bouncing light off it, and the light has a negligible effect on the book. You “see” an electron, on the other hand, by bouncing another electron (or some other comparable bundle) off it. In this case, the thing being probed and the thing doing the probing are comparable in every way, and the interaction cannot leave the original electron unchanged. It’s as if the only way you could see a billiard ball was to hit it with another billiard ball.
There is a useful analogy that will help you think about measurement in the quantum domain. Suppose there was a long, dark tunnel in a mountain and you wanted to know whether there was a car in the tunnel. Suppose further that you couldn’t go into the tunnel yourself or shine a light down it—that the only way you could answer your question was to send another car down that tunnel and then listen for a crash. If you heard a crash, you could certainly say that there was another car in the tunnel. You couldn’t say, however, that the car was the same after your “measurement” as it was before. The very act of measuring—in this case the collision of one car with the other—changes the original car. If you then sent another car down the tunnel to make a second measurement, you would no longer be measuring the original car, but the original car as it has been altered by the first measurement.
In the same way, the fact that to make a measurement on an
electron requires the same sort of disruptive interaction means that the electron (or any other quantum particle) must be changed whenever it is measured. This simple fact is the basis for the uncertainty principle and, in the end, for many of the differences that exist between the familiar world and the world of the quantum.
The uncertainty principle is a statement that says, in effect, that the changes caused by the act of measurement make it impossible to know everything about a quantum particle with infinite precision. It says, for example, that you cannot know both the position (where something is) and velocity (how fast it’s moving) exactly—the two pieces of data that are significant in describing any physical object.
The important thing about the uncertainty principle is that if you measure the position of a tiny particle with more and more precision, so that the error becomes smaller and smaller, the uncertainty in velocity must become greater to compensate. The more care you take to know one thing, the more poorly you know the other. The very act of measuring changes the thing you are measuring, so you must always be uncertain about something.
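The text states this trade-off in words only. In standard notation it is usually written as Δx · Δp ≥ ħ/2, where Δx is the uncertainty in position and Δp the uncertainty in momentum (mass times velocity). The Python sketch below, with illustrative numbers, shows why the trade-off matters for an electron but not for everyday objects.

```python
# Heisenberg's trade-off: (position uncertainty) * (momentum uncertainty) >= hbar / 2.
# Since momentum = mass * velocity, pinning down the position forces a minimum
# velocity uncertainty of hbar / (2 * mass * position_uncertainty).

HBAR = 1.055e-34  # reduced Planck constant, joule-seconds

def min_velocity_uncertainty(mass_kg: float, position_uncertainty_m: float) -> float:
    return HBAR / (2 * mass_kg * position_uncertainty_m)

# An electron located to within the size of an atom (illustrative numbers):
electron = min_velocity_uncertainty(9.11e-31, 1e-10)
# A baseball located to within a millimeter:
baseball = min_velocity_uncertainty(0.145, 1e-3)

print(f"electron: at least {electron:.0f} m/s of velocity uncertainty")    # roughly 580,000 m/s
print(f"baseball: at least {baseball:.1e} m/s of velocity uncertainty")    # roughly 3.6e-31 m/s
```

The baseball’s minimum uncertainty is immeasurably small, which is why the comfortable Newtonian viewpoint works so well for books and baseballs.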
The fact that one cannot measure a quantum system without changing it leads to an extremely important conclusion about the way that such systems must be described. Suppose that large objects, like airplanes, behaved the way electrons do. Suppose you knew that an airplane was flying somewhere in the Midwest and you wanted to predict where it would be a few hours later. Because of the uncertainty principle, you couldn’t know both where the plane was and how fast it was going, and you’d have to make some
compromise. You might, for example, locate the plane to within 50 miles and its velocity to within 100 miles per hour.
If you now ask where the plane will be in two hours, the only answer you can give is, “It depends.” If the plane is traveling 500 miles per hour, it will be 1,000 miles away; if it’s traveling 400 miles per hour, it will be only 800 miles away. And since we don’t know exactly where the plane started, there is an additional uncertainty about its final location.
One way to deal with this problem is to describe the final location of the plane in terms of probabilities: there’s a 30 percent chance it will be in Pittsburgh, a 20 percent chance it will be in New York, and so on. You could, in fact, draw a graph that would show the probable location of the plane at any point east of the Mississippi. For historical reasons, a collection of probabilities like this is called the “wave function” of the plane.
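In code, such a collection of probabilities is nothing more exotic than a table of possible locations and their chances. In this toy Python sketch, the Pittsburgh and New York figures come from the text; the other cities and numbers are invented to fill out the list.

```python
# A "wave function" in the sense used here: a list of possible outcomes,
# each tagged with the probability of finding the plane there.
# Pittsburgh and New York come from the text; the rest are made up for illustration.
wave_function = {
    "Pittsburgh": 0.30,
    "New York": 0.20,
    "Cleveland": 0.25,
    "Washington": 0.15,
    "Boston": 0.10,
}

# The probabilities of all the possibilities add up to 1: the plane is certainly somewhere.
assert abs(sum(wave_function.values()) - 1.0) < 1e-9

# The "peak" of this wave function is simply the most likely place to find the plane.
most_likely = max(wave_function, key=wave_function.get)
print(most_likely)  # Pittsburgh
```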
We normally don’t worry about wave functions for airplanes, because in the everyday world the amount of change caused by a measurement is so small as to be negligible, so the uncertainties in the plane’s position and speed are tiny. In the quantum world, however, every measurement causes appreciable change in the object being measured, and hence everything has to be described in terms of probabilities and wave functions. It was this unpredictable aspect of the quantum world that troubled Albert Einstein and caused him to remark that “God does not play dice with the universe.” (His old friend Niels Bohr is supposed to have replied, “Albert, stop telling God what to do.”)
We all run into trouble when we try to visualize a quantum object like an electron. Our inventory of mental images is limited
to the sorts of things we can see in our familiar world, and unfortunately, electrons just don’t fit anywhere on our mental file cards. Nowhere is this problem of visualization more difficult than in discussion of particles and waves in the quantum world.
In our normal world, energy can be transferred by particle or by wave, as you can see by thinking about a bowling alley. Suppose there was one pin standing at the other end and you wanted to knock it down. You would have to apply energy to the pin to do this, of course, and might choose to expend some energy to get a bowling ball moving, and then let the ball roll down the alley and knock the pin over. This process involves a particle—the bowling ball—carrying energy from you to the pin. Alternatively, you could set up a line of pins and then knock the first one over. It would knock over the second, which would knock over the third, and so on like dominoes until the final pin fell. In this case, the energy would be transmitted by the wave of falling pins, and no single object travels from you to your target.
When scientists started to explore the subatomic world, they naturally asked, “Are electrons particles or are they waves?” After all, an electron transfers energy, and if energy can be transferred only by particles and waves, then the electron must be one or the other.
Unfortunately, things aren’t that simple. Experiments performed on electrons have found that in some situations they seem to act as particles, in other situations as waves. Similarly, something we normally consider to be a wave—light, for example—can appear to be a particle under certain circumstances. In the early years of the twentieth century, this seemingly inexplicable behavior was called “wave-particle duality” and was supposed to illustrate the strangeness of the quantum world.
There is, however, nothing particularly mysterious about
“duality.” The behavior of electrons and light simply tells us that in the quantum world, our familiar categories of “wave” and “particle” do not apply. The electron is not a wave, and it’s not a particle—it’s something else entirely. Depending on the experiment we do, we can see wave-like or particle-like behavior. The wave-particle problem lies not with nature, but with our own minds.
Suppose you were a Martian who, for some reason or other, had been able to pick up radio broadcasts from Earth only in the French and German languages. You might very well come up with a theory that every language on Earth was either French or German. Suppose then that you came to Earth and landed in the middle of an American city. You hear English for the first time, and you note that some of the words are like French and some are like German. You would have no problem if you realized that there was a third type of language of which you had been previously ignorant, but you could easily tie yourself in philosophical knots if you didn’t. You might even develop a theory of “French-German duality” to explain the new phenomenon.
In the same way, as long as we are willing to accept that things at the atomic level are not like things in our everyday world, no problem arises with the question of whether things are waves or particles. The correct answer to the wave-particle question is simply “none of the above.”
Of course, this means that we cannot visualize what an electron is like—we can’t draw a picture of it. For creatures as wedded to visual imagery as we are, this is deeply troubling. Physicists and nonphysicists alike rebel and try to make mental images, whether they are “real” or not. The authors are no different, and, for the record, we imagine the electron as something like a tidal wave, located in one general area, like a particle, but with crests and troughs, like a wave.
The length of the “tidal wave” associated with different kinds of particles varies tremendously. That of an electron, for example, is smaller than that of an atom, while a photon of ordinary visible light is about three feet long. Viewed this way, both “waves” like light and “particles” like electrons have the same basic structure, and the classical distinction between wave and particle turns out to be meaningless in the quantum world.
The most important role quantum mechanics plays in science is explaining how the atom is put together. In the last chapter we described the peculiar tendency of electrons in the Bohr atom to adopt fixed orbits. These fixed orbits are a consequence of quantized electron energies: electrons can have only certain precise energies, and any quantum leap between orbits must correspond exactly to the difference in orbital energies. Each quantum leap by one electron leads to the absorption or the emission of one photon.
Electrons moving up and down in their orbits are analogous to a person moving up and down a staircase: It requires energy to climb, and energy is released upon descent. And, like a person on a staircase, an electron cannot be found between “steps”—in other words, it can be found only in allowed orbits.
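The text gives no actual energies, but the standard Bohr result for hydrogen puts the allowed steps at E_n = −13.6/n² electron volts. The Python sketch below uses that formula to show the staircase idea; the particular orbits chosen are just an example.

```python
# Allowed "steps" on the hydrogen staircase in the Bohr model: E_n = -13.6 / n**2 eV.
# A quantum leap between two steps absorbs or emits one photon whose energy
# is exactly the difference between the two allowed energies.

def orbit_energy_ev(n: int) -> float:
    return -13.6 / n**2

def photon_wavelength_nm(n_high: int, n_low: int) -> float:
    """Wavelength of the single photon emitted when an electron drops from n_high to n_low."""
    energy_ev = orbit_energy_ev(n_high) - orbit_energy_ev(n_low)
    return 1240.0 / energy_ev  # h * c is roughly 1240 eV * nm

print(photon_wavelength_nm(3, 2))  # about 656 nm: the familiar red line of hydrogen
# There is no orbit between n = 2 and n = 3, so no photon with an in-between energy.
```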
Although it is tempting to think of electrons in orbits as particles, little lumps of matter, scientists often picture them in terms of their wave functions. The peak in the electron “wave,” corresponding to the highest probability of finding the electron, is at
the place where the electron would be located if we pictured it as a particle.
From grocery lines to rock concerts, compact discs to the most advanced weaponry, lasers pervade our world and have changed the ways we use light.
“Laser” is an acronym for Light Amplification by Stimulated Emission of Radiation, an imposing name for a remarkable device. Lasers work like this: You start off with a collection of atoms, each with an electron in a high-energy orbit. The chromium atoms in crystals of ruby serve this function in many red lasers. Photons whose energy exactly matches the electrons’ downward jump are focused on those special atoms. When one of these photons comes near an atom, it “stimulates” the electron in the atom to jump down, emitting another photon in the process—one that is not only of the same wavelength as the original, but is precisely aligned, crest to crest, trough to trough. The two photons now pass through the material, stimulating other atoms until a flood of precisely aligned photons results. In this way, one photon “amplifies” itself.
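A crude Python sketch of that chain reaction follows. It ignores losses, mirrors, and the pumping that re-excites the atoms, and the million excited atoms are an arbitrary number, but it shows how one photon’s descendants roughly double on every pass.

```python
# Toy model of stimulated emission: each aligned photon that meets an excited atom
# triggers one more identical photon, so the aligned photons roughly double per pass.
# Losses, mirrors, and re-excitation ("pumping") are all ignored.

photons = 1                 # one photon of exactly the right energy starts the cascade
excited_atoms = 1_000_000   # arbitrary supply of atoms with an electron in a high-energy orbit

passes = 0
while excited_atoms > 0:
    stimulated = min(photons, excited_atoms)  # each photon can stimulate at most one atom per pass
    photons += stimulated           # every stimulated atom emits one more precisely aligned photon
    excited_atoms -= stimulated     # ...and its electron drops to the low-energy orbit
    passes += 1

print(passes, photons)  # about 20 passes turn 1 photon into roughly a million
```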