There have been many examples of this process of discovery. Newton's laws of gravity hold true except when movement approaches the speed of light or when gravity becomes enormously strong, as it does near a black hole. Einstein's newer, more fundamental description in terms of space-time does not break down, as Newton's laws do, in these extreme circumstances. But Einstein's description presents problems of its own that challenge the assumption of unity and harmony. His equations predict that there will be singularities – points of infinite density – at the origin of the universe and at the centre of black holes. At a singularity, all the laws of physics break down. And so the search must go on for a more fundamental set of laws, on the Pythagorean assumption that at absolute bedrock there are laws that never break down, in any situation whatsoever. The underlying unchanging laws, whatever they are, and the nearest approaches to them that have been found, obviously allow a vast range of changes and events to occur, a vast range of behaviour and experience. How far we have come from the early Pythagoreans, as they hurriedly and superficially applied this same faith in numbers! How unfathomably deep beyond their imagination the true connections lie! Beyond ours, too, perhaps.
The first twentieth-century challenge to the Pythagorean assumption of rationality in the universe was Russell's paradox, Bertrand Russell's discovery that was discussed in Chapter 18. That happened early, in 1901. Another, in 1931, was the Austrian Kurt Gödel's 'incompleteness theorem'. Gödel was then a young man working in Vienna; he would later join Einstein at the Institute for Advanced Study in Princeton. Gödel's discovery was that in any mathematical system complex enough to include the addition and multiplication of whole numbers – hardly fringe territory; any schoolchild is familiar with that – there are propositions that can be stated, that we can see are true, but that cannot be proved or disproved mathematically within the system. This means that all significant mathematical systems are open and incomplete. Truth goes beyond the ability to prove that it is true. Gödel also showed that no system rich enough to include the addition and multiplication of whole numbers can prove, from within itself, that it is self-consistent.
These discoveries constituted a serious reversal of hopes for some, and a serious undermining of assumptions for others. The great mathematician David Hilbert and his colleagues had previously been able to demonstrate that logical systems less complex than arithmetic were consistent, and it seemed certain that they would be able to go on to demonstrate the same for all of arithmetic. Not so. With Gödel, the soaring Pythagorean staircase to sure knowledge, built of numbers, became something more resembling a staircase in an Escher drawing, and it is no wonder that the most famous book about Gödel is Douglas R. Hofstadter's Gödel, Escher, Bach. The Bach is Johann Sebastian Bach. Bertrand Russell was one of those who were badly shaken by Gödel's theorems – particularly so because he misread Gödel and thought he had proved that arithmetic was not incomplete but inconsistent. Instead, Gödel had demonstrated that no one ever would be able to prove whether it was consistent or not. David Hilbert was not so discouraged as Russell: until his death in 1943, he refused to recognise that Gödel had put paid to his hopes. The influence of Gödel's discoveries was profound, and yet, on one level, rather inconsequential. As John Barrow wrote in 1992, 'It loomed over the subject of mathematics in an ambiguous fashion, casting a shadow over the whole enterprise, but never emerging to make the slightest difference to any truly practical application of mathematics.'5
Though Gödel's discoveries may have undermined some forms of faith in mathematics, in a manner that seemed to resemble the Pythagorean discovery of incommensurability, Gödel's view of mathematics was, in fact, Pythagorean. He believed that mathematical truth is something that actually exists apart from any invention by human minds – that his theorems were 'discoveries' about objective truth, not his own creations.
This was not a popular idea in the 1930s. Many mathematicians disagreed. In fact, the concept of anything existing in an objective sense – waiting out there to be discovered and not in any way influenced by the actions of the investigator – had been called into question by a development in physics. A far more dramatic and far-reaching crisis than the one caused by Gödel's incompleteness theorem had occurred in the 1920s and was having a profound effect on the way scientists and others viewed the world. It was the discovery of the uncertainty principle of quantum mechanics.
The way cause and effect work had long seemed good evidence that the universe is rational. It also seemed that if cause and effect operate as they do on levels humans can perceive, they surely must operate with equal dependability in regions of the universe, or at levels of the universe, that are more difficult – or even impossible – to observe directly. Cause and effect could be used as a guide in deciding what happened in the very early universe and what conditions will be like in the far distant future. No one was thinking of belief in cause and effect as a 'belief' at all, though, in fact, there was nothing to prove that cause and effect would not cease to operate in an hour or so, or somewhere else in the universe. Then, in the 1920s, came developments that required reconsideration of the assumption that every event has an unbroken history of cause and effect leading up to it.
The quantum level of the universe is the level of the very small: molecules, atoms, and elementary particles. It is on that level that a commonsense description breaks down. Here there are uncaused events, happenings without a history of the sort it is normally assumed any event must have. Atoms are not miniature solar systems. You cannot observe the position of an electron orbiting the nucleus and predict where it will be at a later given moment and what path it will take to get there, or say where it was an hour ago – as you could with fair accuracy for the planet Mars in the solar system. An electron never has a definite position and a definite momentum at the same time. If you measure precisely the position of a particle, you cannot at the same time measure its momentum precisely. The reverse is also true. It is as though the two measurements – position and momentum – are sitting at opposite ends of a seesaw. The more precisely you pin down one, the more up-in-the-air and imprecise the other becomes. This is the Heisenberg uncertainty principle of quantum physics – the twentieth century's 'incommensurability'. It was first articulated by Werner Heisenberg in 1927. Not only did it undermine faith in a rational universe, it also seemed to undermine the notion that truth was something objective, something waiting out there to be discovered. On the quantum level, your measurement affects what you find.
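The seesaw trade-off described above has an exact mathematical form, which may be worth stating for readers who want it (the inequality itself is standard physics, not part of the original text): the uncertainties in position and momentum can never both be made arbitrarily small, because their product is bounded below by Planck's constant.

```latex
% Heisenberg's uncertainty relation: \Delta x is the spread in position,
% \Delta p the spread in momentum, and \hbar the reduced Planck constant.
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

Squeezing \(\Delta x\) toward zero forces \(\Delta p\) to grow without limit, and vice versa – the seesaw in symbols.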
On the other hand, the existence of quantum uncertainty itself was apparently a very unwelcome piece of objective truth waiting out there that no physicist could change, as much as he or she might wish to, no matter what observational methods he or she used. Einstein in particular rebelled at the notion that no future advance in science and no improvement in measuring equipment was ever going to resolve this uncertainty. Until his death, he went on trying to devise thought experiments to get around it. He never succeeded, nor has he succeeded posthumously, as others have found ways to carry out experiments he invented in his head. 'God does not play dice!' Einstein famously protested to Niels Bohr, who was far more ready to accept quantum uncertainty than Einstein. 'Albert, don't tell God what he can do!' Bohr answered. The Bohr-Einstein debate about how to interpret the quantum level of the universe continued and became famous.
It is easy to sympathise with Einstein. The quantum world and the paradoxes implicit in it did not seem to be the work of a rational mind. Einstein might have rephrased the complaint Kepler registered when faced with a similar problem: 'Heretofore we have not found such an ungeometrical conception in His other works!' How could what happened to one particle affect another over time and space with no link between them? How could a cat be both dead and alive at the same time – as one had to accept in the famous example of 'Schrödinger's cat'? How could something be a wave at some times and a particle at others, depending on the experimental situation? It was a Through the Looking Glass world – and still is, in spite of the reassurance that it is possible to predict things on the quantum level of the universe, if one can be satisfied with probabilities. It does seem that the staircase to knowledge about the universe can have a firm footing on the quantum level, with probabilities forming a sort of superstructure above the quagmire. All is far from lost for the Pythagorean climb.
The dawning awareness of a new aspect of the universe, in the chaos and complexity theories developed later in the twentieth century, was not nearly so great a shock as quantum uncertainty. However, it did seem to hint that science had been discovering one orderly, predictable system after another only because it was impossible, or at least terribly discouraging, to try to study any other kind of system in a meaningful fashion. The relatively easy-to-study predictable systems actually turned out to be the exception rather than the rule. But for those of a Pythagorean cast of mind, it was the discoveries of the repeating patterns in chaos – the pictures deep in the Mandelbrot and Julia sets, and also in nature itself – that gloriously seemed to uphold, as never before, the ancient conviction that beauty and harmony are hidden everywhere in the universe and have nothing to do with any invention of humans. Less immediately mind-boggling, but no less impressive, was the realisation in the study of chaos and complexity that there seem to be mysterious organising principles at work. By some calculations, the probability that the universe would have organised itself into galaxies, stars, and planets – or that life on this earth would have been organised into ecosystems and animal and human societies – is vanishingly low. Yet that is what has happened. Thus, as with the other challenges to faith in the Pythagorean assumptions underlying science, when scientists began to get a handle on chaos and complexity, the theories having to do with them became not threats but new avenues in the search for better understanding of nature and the universe.
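The endlessly repeating patterns of the Mandelbrot set mentioned above arise from a startlingly simple rule: repeatedly square a complex number and add a constant, and ask whether the result stays bounded. A minimal sketch of that rule follows (the iteration limit of 100 and the escape radius of 2 are the conventional choices, supplied here for illustration, not taken from the text):

```python
def in_mandelbrot(c: complex, max_iter: int = 100) -> bool:
    """Iterate z -> z*z + c starting from z = 0.

    The point c belongs to the Mandelbrot set if the orbit of z
    stays bounded; once |z| exceeds 2 it is guaranteed to escape
    to infinity, so we can stop early.
    """
    z = 0 + 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False
    return True

print(in_mandelbrot(0))    # True:  orbit stays at 0 forever
print(in_mandelbrot(1))    # False: orbit runs 0, 1, 2, 5, 26, ... and escapes
print(in_mandelbrot(-1))   # True:  orbit cycles 0, -1, 0, -1, ...
```

Colouring each point of the plane by how quickly its orbit escapes is all it takes to produce the famous pictures – a vivid case of vast visual complexity generated by a one-line rule.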
Twentieth-century 'postmodern' thinking, combined with suspicions raised by the discovery of quantum uncertainty and our inability to examine the quantum world without affecting it, led to fresh doubts about other Pythagorean pillars of science. Is there really such a thing as objective reality? Is anything real, waiting to be discovered? Does the fact that science continues to discover things that make sense, and suspects or dismisses anything that does not, mean that we are finding out more and more about a rational universe . . . or only that we are selecting the information and discoveries that fit our very Pythagorean expectations?
The assumption of rationality lies at the root of modern arguments about 'intelligent design'. It is true that the world's design, as the Pythagoreans found out, is intelligent to a degree that would send any discoverer of a new manifestation to his or her knees – but before what, or whom? Does discovering rationality necessarily mean one has glimpsed the Mind of God? On the other hand, does a good scientist have to repress the strong impression that it does? Those who attack belief in God do so from several directions. One is rather old-fashioned now, but still heard: everything is so perfectly laid out, in so tight and orderly a design, that there is no room for God to act at any point. It all goes like clockwork. Or, a newer argument: everything happens – and has always happened – entirely by chance. The impression of any underlying rationality in nature is an illusion. The 'anthropic principle' says that if things had not fallen out just the way they have, we could not be here to observe them – and that is the only reason we find a universe that is amenable to our existence. Or . . . our entire picture of the universe is created, by us, in the self-centred image of our own minds, and we are discovering something not far different from the ten heavenly bodies of the Pythagoreans. Plato might have enjoyed the late-twentieth-century discussions about whether mathematical rationality might be powerful enough to create the universe, without any need for God. Quantum theory made possible the suggestion that 'nothingness' might have been unstable in a way that made it statistically probable that 'nothingness' would decay into 'something'.
Pythagorean principles and issues also showed up in other ways in twentieth-century culture. Peter Shaffer's trilogy of plays – The Royal Hunt of the Sun, Equus, and Amadeus – offered profound explorations of the theme of rationality and irrationality and reflected the sort of love/hate humanity has for both: Is there a Mind behind the universe? Is that Mind sane or mad? Tennessee Williams dubbed the so-called 'rationality' of God the rationality of a senile delinquent. In music, 'twelve-tone' compositions were the most mathematically bound compositions ever written, but this form of music was also clear evidence that the Pythagorean insight had been correct: certain combinations of tones – and only certain combinations – have a deep link with what the human ear recognises as harmonious and beautiful. On Sesame Street, numbers came to life and danced and sang in a way that probably would have delighted the Pythagoreans – if they did not find it irreverent – but probably would have annoyed Aristotle.
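The Pythagorean insight about tone combinations rests on small whole-number frequency ratios: an octave doubles a tone's frequency, a perfect fifth multiplies it by 3:2, a perfect fourth by 4:3. A brief sketch of the arithmetic (the 440 Hz tuning reference for A is the modern convention, used here only for illustration):

```python
from fractions import Fraction

# The classic Pythagorean consonances, as frequency ratios.
intervals = {
    "unison": Fraction(1, 1),
    "perfect fourth": Fraction(4, 3),
    "perfect fifth": Fraction(3, 2),
    "octave": Fraction(2, 1),
}

A4 = 440.0  # Hz; the modern concert-pitch convention for the note A

for name, ratio in intervals.items():
    # Multiplying the base frequency by the ratio gives the upper tone.
    print(f"{name}: ratio {ratio} -> {A4 * ratio:.1f} Hz")
```

The simpler the ratio, the more consonant the interval sounds to the ear – the numerical regularity the Pythagoreans heard in a plucked string.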
The music of the spheres remained a popular metaphor, but in the second half of the century it moved beyond the 'spheres'.6 As Richard Kerr has put it, 'the idea of heavenly harmonics is now making a comeback among astronomers. Instead of listening to the revolutions of the spheres, modern astronomers are tuning in to the vibrations within stars'.7