No computer system is sufficiently powerful to find the one interesting event in such a crowd of useless data. For this reason, experiments always include triggers—devices in which hardware and software elements act like nightclub bouncers and permit only potentially interesting events to be recorded. Triggers in CDF and D0 reduced the number of events that experimenters had to sift through to about one in one hundred thousand—still an enormously challenging task, but far more tractable than one in ten trillion.
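To give a flavor of how a trigger thins out the data stream, here is a minimal sketch in Python of a two-stage filter that keeps only events passing loose selection criteria. The event fields, thresholds, and two-level structure are illustrative assumptions on my part, not the actual CDF or D0 trigger logic.

```python
# A toy two-level trigger: each stage keeps only events that look
# potentially interesting, so that only a tiny fraction of all
# collisions is ever recorded. The field names and thresholds are
# illustrative placeholders, not real CDF or D0 criteria.

def level_1(event):
    # Fast, hardware-style cut: require at least one energetic deposit.
    return event["max_deposit_gev"] > 10.0

def level_2(event):
    # Slower, software-style cut: require a high-momentum track or
    # substantial missing energy (a possible sign of a neutrino).
    return event["max_track_pt_gev"] > 20.0 or event["missing_et_gev"] > 25.0

def trigger(events):
    # Yield only the events that survive every stage.
    for event in events:
        if level_1(event) and level_2(event):
            yield event

# Example: of these two toy events, only the second is kept.
sample = [
    {"max_deposit_gev": 3.0,  "max_track_pt_gev": 1.2,  "missing_et_gev": 0.5},
    {"max_deposit_gev": 45.0, "max_track_pt_gev": 32.0, "missing_et_gev": 40.0},
]
kept = list(trigger(sample))
```

In a real experiment the first stage runs in fast, dedicated hardware and the later stages in software, but the logic is the same: discard early, and discard often.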
Once information is recorded, physicists try to interpret it and reconstruct the particles that emerged from any interesting collision.
Because there are always many collisions and many particles and only a limited number of pieces of information, reconstructing the result of a collision is a formidable task, one that has stretched people’s ingenuity and is likely to lead to further data processing advances in the years to come.
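One small, idealized piece of that reconstruction can be made concrete: combining the measured energies and momenta of a set of decay products gives the invariant mass of whatever parent particle produced them. The Python sketch below assumes perfectly measured four-momenta in natural units, a luxury real analyses never have.

```python
import math

def invariant_mass(four_momenta):
    """Invariant mass (in GeV, natural units with c = 1) of a hypothetical
    parent particle, reconstructed from the (E, px, py, pz) four-momenta
    of its measured decay products."""
    E  = sum(p[0] for p in four_momenta)
    px = sum(p[1] for p in four_momenta)
    py = sum(p[2] for p in four_momenta)
    pz = sum(p[3] for p in four_momenta)
    m_squared = E**2 - (px**2 + py**2 + pz**2)
    # Guard against tiny negative values caused by measurement error.
    return math.sqrt(max(m_squared, 0.0))
```

If, for example, many pairs of oppositely charged muons reconstructed this way pile up near 91 GeV, that peak is the signature of the Z boson.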
By 1994, several of CDF’s working groups had seen events that looked like the top quark (see Figure 54 for an example), but they weren’t really sure. Although CDF couldn’t say with certainty that it had found the top quark that year, both D0 and CDF confirmed the discovery in 1995. A friend of mine on D0, Darien Wood, described the intensity of the final editorial board meeting at which D0 completed the data analysis and the paper that would report their results. The meeting went through the night and into the next day, with people occasionally napping on tabletops.
Figure 54. A top quark event as recorded by D0, which detects the decay products of the top quark and top antiquark that are produced simultaneously. The line in the upper right is a muon, which reaches the outer portion of the detector. The four rectangular blocks are four jets that were produced. The line to the right represents the missing energy carried off by a neutrino.
D0 and CDF received joint credit for discovering the top quark. A new particle was produced that had never been seen before. This newly discovered particle joined the ranks of other, established Standard Model particles. By now, so many top quarks have been seen that we know the top quark’s mass and its other properties extremely precisely. In the future, we expect higher-energy colliders to produce so many top quarks that there is a danger that the top quarks themselves will become the background that mimics and interferes with the discovery of other particles.
New physics is almost certainly there to be seen. We will soon see why unresolved Standard Model issues are telling us that new particles and physical processes should appear when colliders reach only slightly higher energies than is possible at present. Experiments at the Large Hadron Collider (LHC) will look for evidence of structure beyond the Standard Model. If those experiments are successful, the reward will be fabulous—a better understanding of the underlying structure of all matter. High energy, many collisions, and clever ideas will all contribute to accomplishing this difficult task.
Precision Tests of the Standard Model
We will now briefly move from the plains of Illinois to mountainous Switzerland—the location of CERN, the Conseil Européen pour la Recherche Nucléaire (now called the Organisation Européenne pour la Recherche Nucléaire or, in English, the European Organization for Nuclear Research, though the old acronym, CERN, has stuck). Many experiments have tested the Standard Model’s predictions, but none were as spectacular as those performed between 1989 and 2000 at the Large Electron-Positron collider (LEP) located at the CERN accelerator facility.
The CERN site was chosen for its central location within Europe. CERN’s main entrance is so close to the French border that the guard booth separating the two countries is almost directly outside. Many CERN employees live in France and cross the border twice daily. They are rarely bothered when crossing the border—unless their car isn’t up to Helvetic standards, in which case the Swiss won’t let them in. The only other danger is being an absent-minded professor, as one colleague can attest. The guards stopped and searched him when he didn’t stop at the border because he was distracted by thoughts about black holes.
The difference between the locations of Fermilab and CERN could not be more striking. CERN is adjacent to the beautiful Jura mountains (see Figure 55) and is only a short drive from Chamonix, a remarkable valley that runs between mountains covered with glaciers that descend practically to the road (though less so with global warming), and lies at the foot of Mont Blanc, the highest mountain in the Alps. At CERN, many fortunate physicists pass the winter with tanned faces despite the persistent cloud cover in town because of the time they spend skiing, snowboarding, or hiking in the nearby mountains.
Figure 55. The CERN site with the Alps in the background. The Large Hadron Collider ring, in which two beams of protons will circulate underground, is indicated.
CERN was created after World War II, in the nascent atmosphere of international collaboration. The original twelve member states were West Germany, Belgium, Denmark, France, Greece, Italy, Norway, the Netherlands, the United Kingdom, Sweden, Switzerland, and Yugoslavia (which left in 1961). Subsequently, Austria, Spain, Portugal, Finland, Poland, Hungary, the Czech and Slovak Republics, and Bulgaria have joined. Observer states involved in CERN activities include India, Israel, Japan, the Russian Federation, Turkey, and the United States. CERN is truly an international enterprise.
CERN, like the Tevatron, has many accomplishments to its credit. Carlo Rubbia and Simon van der Meer were awarded the 1984 Nobel Prize for Physics for designing the original CERN collider and discovering the weak gauge bosons, a success story that destroyed America’s monopoly on particle discoveries. CERN was also where an employee, the Englishman Tim Berners-Lee, came up with the World Wide Web, HTML (hypertext markup language), and HTTP (hypertext transfer protocol). He developed the Web so that many experimenters in scattered nations could be instantaneously linked to information and so that data could be shared among many computers. Of course, the repercussions of the Web have been felt far beyond CERN—it’s often difficult to foresee the practical applications of scientific research.
In a few years, CERN will be the nexus of some of the most exciting physics results. The Large Hadron Collider, which will be able to reach seven times the present energy of the Tevatron, will be located there, and any discoveries made at the LHC will almost inevitably be something qualitatively new. Experiments at the LHC will seek—and very likely find—the as yet unknown physics that underlies the Standard Model, confirming or rejecting models such as the ones I describe in this book. Although the collider is in Switzerland, the LHC will truly be an international effort; experiments for the LHC are currently being developed all over the globe.
But back in the 1990s, physicists and engineers built the unbelievable LEP at CERN, a Z boson “factory” that churned out millions of Zs. The Z gauge boson is one of the three gauge bosons that communicate the weak force. By studying millions of Zs, experimenters at LEP (and also at SLAC, the Stanford Linear Accelerator Center in Menlo Park, California) could make detailed measurements of the Z boson’s properties, testing the predictions of the Standard Model to an unprecedented level of precision. It would take us too far off track to describe each of these measurements in detail, but in a moment I’ll give you an idea of the stunning precision that was achieved.
The basic premise behind the Standard Model tests was very simple. The Standard Model makes predictions for the masses of the weak gauge bosons and the decays and interactions of the fundamental particles. We can test the consistency of the theory of weak interactions by checking whether the relationships among all these many quantities fit the theory’s predictions. If there were a new theory with new particles and new interactions that became important at energies near the weak scale, there would be new ingredients that could change the weak interaction predictions from their Standard Model values.
Models that go beyond the Standard Model therefore make slightly different predictions for the Z boson’s properties than those predicted by the Standard Model itself. In the early 1990s everyone used an incredibly cumbersome method for predicting the Z’s properties in these alternative models so that the predictions could be tested. The method was very hard to penetrate and was outlined in a document with more pages than I cared to carry. At the time, I was a postdoctoral fellow at the University of California, Berkeley. In the summer of 1992, while I was attending a summer workshop at Fermilab, I decided that the relationships among different physical quantities could not possibly be as cumbersome as the method in the multipage document implied.
With Mitch Golden, then a postdoc at Fermilab, I developed a more concise way to interpret experimental results about the weak interactions. Mitch and I showed how to systematically incorporate the effects of new heavy (as yet unseen) particles by adding only three new quantities to the Standard Model that would summarize all possible non-Standard Model contributions. I spent a few weeks trying to get it all straight, and the answers finally came together during one intense weekend of work. It was extremely rewarding to discover how all the processes that the Z-factories would measure could be elegantly related. Mitch and I felt we had developed a much clearer picture of how theory and measurements were related, and it was very satisfying. We were not alone in our discovery, however. Michael Peskin at SLAC and his postdoc Takeo Takeuchi did similar work concurrently, and others followed rapidly in our footsteps.
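The general strategy can be sketched schematically: if each electroweak observable is predicted to be its Standard Model value plus a linear combination of a few parameters that summarize heavy new physics, those parameters can be extracted from the measurements with an ordinary least-squares fit and checked for consistency with zero. The Python sketch below is only a toy version of that idea; the observables, coefficients, and measured values are placeholder numbers, not the actual parametrization or data from any of the analyses described here.

```python
import numpy as np

# Toy consistency test: each observable O_i is modeled as
#     O_i = O_i(SM) + c_i1 * x1 + c_i2 * x2 + c_i3 * x3,
# where x1, x2, x3 summarize possible heavy new physics.
# Every number below is a placeholder for illustration only,
# not a real measurement, prediction, or published coefficient.

sm_prediction = np.array([1.00, 0.23, 2.50, 0.17, 80.0])    # hypothetical O_i(SM)
coefficients  = np.array([[ 0.4, -0.1,  0.0],
                          [ 0.0,  0.2, -0.3],
                          [-0.5,  0.1,  0.1],
                          [ 0.2,  0.0,  0.4],
                          [ 0.1, -0.2,  0.2]])               # hypothetical c_ij
measured      = np.array([1.02, 0.22, 2.48, 0.18, 80.1])     # hypothetical data

# Fit the three new-physics parameters to the deviations of the
# measurements from the Standard Model predictions.
deviation = measured - sm_prediction
x, residuals, rank, _ = np.linalg.lstsq(coefficients, deviation, rcond=None)

# If the fitted parameters come out consistent with zero (within the
# experimental errors, which this toy ignores), the data show no
# evidence of physics beyond the Standard Model.
print("fitted new-physics parameters:", x)
```

A real analysis would weight each observable by its measurement error and quote uncertainties on the fitted parameters, but the underlying logic is this simple.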
But the real success story concerns LEP tests of the Standard Model, which were incredibly precise. I won’t go into the details, but I will tell you two anecdotes that demonstrate their impressive sensitivity. The first is about finding the exact energy at which the positrons and electrons collided. The experimenters needed to know this energy to determine the precise value of the Z boson’s mass. They had to take into account everything that might affect the value of this energy. But even after they had accounted for everything they could think of, they noticed that the energy seemed to rise and fall slightly when they measured it at particular times. What was causing the variation?
Incredibly, it turned out to be tides in Lake Geneva. The level of the lake rose and fell with the tides and with the heavy rain that year. This in turn affected the nearby terrain, which slightly altered the distance over which the electrons and positrons traveled inside the collider. Once the tidal effect was factored in, the spurious time-dependent measurement of the mass of the Z went away.
The second anecdote is also quite impressive. Electrons and positrons in the collider are kept in place by strong magnetic fields, which in turn require a large amount of power. It seemed that, periodically, the electrons and positrons would become slightly misaligned, indicating some variation in the collider’s magnetic fields. A worker on the site observed that this variation correlated well with passages of the TGV, the express train that travels between Geneva and Paris. Apparently, there were power spikes associated with the French railway’s DC current that slightly disrupted the accelerator. Alain Blondel, a Parisian physicist working at CERN, told me the funniest part of this story. The experimenters had a real opportunity to confirm this hypothesis definitively. Given that many of the TGV staff are French, there was inevitably a strike, so the experimenters were treated to a spike-free day!
What to Remember