
Keep your mind open, but not so open that your brains fall out.

Over the years, a number of new areas of math have emerged from a diversity of sources, inspired by questions in the real world, or extracted from abstract theories because someone thought they were interesting. Some have attracted media attention, including fractal geometry, nonlinear dynamics (“chaos theory”), and complex systems. Fractals are shapes that have detailed structure on all scales of magnification, like ferns and mountains. Chaos is highly irregular behavior (such as weather) caused by deterministic laws. Complex systems model the interactions of large numbers of relatively simple entities, such as traders in the stock market. In the professional literature and mathematical house magazines, you will occasionally find criticisms of these areas that have that all-too-familiar reactionary feel: dismissive of anything that hasn’t had a century-long track record or that the critic does not work on. What has really annoyed the critics is not the content of these new areas but the media exposure, which their own area, so obviously superior, is not getting.
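
To see how deterministic laws can produce irregular behavior, here is a minimal sketch of my own (an illustration added here, not an example from the text): the logistic map, a one-line deterministic rule whose output nevertheless looks random and is exquisitely sensitive to its starting value.

```python
# Illustrative sketch (added example): the logistic map x -> r*x*(1 - x) is a
# completely deterministic rule, yet at r = 4 its output is irregular and two
# nearby starting points quickly drift apart.
r = 4.0
x, y = 0.2, 0.2000001            # two almost identical initial conditions
for step in range(1, 41):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}:  x = {x:.6f}   y = {y:.6f}")
```

Two starting values that agree to seven decimal places disagree completely within a few dozen steps: the rule is deterministic, but long-range prediction is hopeless, which is the essence of chaos.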

It’s actually rather easy to assess the scientific influence of, for instance, fractals or chaos. All you have to do is read Nature or Science for a month, and you will see them being used to investigate such things as how molecules break up during a chemical reaction, how gas giant planets capture new moons, or how species in an ecosystem partition resources. The scientific community accepted them long ago, to the extent that their use is now routine and unremarkable. Yet some diehards, who apparently don’t sample broader reaches of science, still dispute that these areas have any importance. I’m afraid that they are about twenty years out of date. You can’t dismiss something as a nine-day wonder when it has survived for nine thousand days and is currently thriving.

These people need to get out more.

Both Kac and Hammersley were unusually creative in their own fields, where their attitudes were imaginative and forward-looking. So it is slightly unfair to hold them up as examples of reactionary mathematicians. They expressed attitudes that were common in their day. Kac made major advances in probability theory, and his “shape of a drum” paper is a gem. Hammersley’s 2004 obituary in the Independent on Friday had this to say about his work: “Hammersley . . . posed and solved some beautiful problems, among the best of which are self-avoiding [random] walks and percolation. He was delighted to learn in retirement of the recognition accorded thereto by mathematicians and physicists, and of the enormous progress made since his own pioneering work.” But it added, “Ironically, recent progress has been made via a general theory rather than by the type of hands-on technique favoured by Hammersley.”

This may be ironic but it is also entirely predictable. Hammersley was from the make-do-and-mend generation of applied mathematicians. Nowadays, more attention is paid to having the right tools for the job.

We live in a world whose technological abilities, and needs, are exploding. New questions require new methods, and purity of method remains vital, however practical the context. So do intuitive leaps, when they lead in creative directions, even if at first there are no proofs: new mathematics paves the way to new understanding. Which brings me back to Wigner and his classic essay “The Unreasonable Effectiveness of Mathematics in the Natural Sciences.” Wigner wasn’t just wondering why mathematics is effective in informing us about nature. Many people have picked up on this aspect of the issue, and offered what I think is an excellent answer: whether or not any particular mathematician notices, the development of mathematics is, and has always been, a two-way trade between real-world problems and symbolic or geometric methods devised to obtain answers. Of course math is effective for understanding nature; that, ultimately, is where it comes from.

But I think Wigner was worried—or pleasantly surprised—by something deeper. There is no reason to be astonished if someone starts from a real-world problem—say, the elliptical orbit of Mars—and develops the mathematics to describe it. This is exactly what Isaac Newton did with his inverse square law of gravity, his laws of motion, and calculus. But it is much more difficult to explain why the same tools (differential equations in this instance) provide significant insights into unrelated questions of aerodynamics or population biology. It is here that the effectiveness of math becomes “unreasonable.” It’s like inventing a clock to tell the time and then discovering that it’s really good for navigation, which actually happened, as Dava Sobel explained in Longitude.
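
To make the “same tools” point concrete, here is a small sketch of my own; the problems and numbers are chosen purely for illustration. One generic numerical routine for differential equations is applied first to a planet moving under inverse square gravity and then to a population growing logistically. Nothing in the routine knows which problem it is solving.

```python
# Sketch added for illustration: the same generic tool, a crude Euler stepper
# for dy/dt = f(t, y), applied to two unrelated problems.

def euler(f, y, t, dt, steps):
    """Advance dy/dt = f(t, y) by simple forward-Euler steps."""
    for _ in range(steps):
        dy = f(t, y)
        y = [yi + dt * di for yi, di in zip(y, dy)]
        t += dt
    return y

def gravity(t, s):
    # Planar orbit under an inverse square law; state = (x, y, vx, vy), GM = 1.
    x, y, vx, vy = s
    r3 = (x * x + y * y) ** 1.5
    return [vx, vy, -x / r3, -y / r3]

def logistic(t, s):
    # Population growth; state = (N,), growth rate 0.5, carrying capacity 100.
    n = s[0]
    return [0.5 * n * (1 - n / 100.0)]

print(euler(gravity, [1.0, 0.0, 0.0, 1.0], 0.0, 0.001, 6283))  # roughly one circular orbit
print(euler(logistic, [1.0], 0.0, 0.01, 2000))                 # population approaches 100
```

A method built to track a planet handles a population without modification; that transferability is part of what makes the tool so unreasonably useful.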

How can an idea extracted from a particular real-world problem resolve some totally different problem?

Some scientists believe that it happens because the universe really is made from mathematics. John Barrow argues the case like this: “For the fundamental physicist, mathematics is something that is altogether more persuasive. The farther one goes from everyday experience and the local world, the correct apprehension of which is a prerequisite for our evolution and survival, the more impressively mathematics works. In the inner space of elementary particles or the outer space of astronomy, the predictions of mathematics are almost unreasonably accurate. . . . This has persuaded many physicists that the view that mathematics is simply a cultural creation is a woefully inadequate explanation of its existence and effectiveness in describing the world. . . . If the world is mathematical at its deepest level, then mathematics is the analogy that never breaks down.”

It would be lovely if this were true. But there is a different explanation, less mystical, less fundamentalist. Possibly less convincing.

Both differential equations and clocks are tools, not answers. They work by embedding the original problem in a more general context, and deriving general methods to understand that context. This generality improves their chance of being useful elsewhere. This is why their effectiveness appears unreasonable.

You can’t always know in advance what uses you’ll find for a good tool. A round piece of wood mounted on an axle becomes a wheel, useful for moving heavy objects. Cut a groove in its circumference and drape a rope around it, and the wheel becomes a pulley with which you can not only move objects but lift them. Make the wheel out of metal instead of wood, add teeth instead of a groove, and you have a gear. Put your gears and pulleys together with a few other elements—a pendulum, some weights, a face abstracted from an ancient sundial—and you have a mechanism for telling time, which is something the wheel’s original inventors could never have anticipated. The pure mathematicians of the 1960s were forging tools that could be used by everyone in the 1980s. I have great respect for the Hammersleys of this world, much as I respect a large Alsatian dog I meet in the street. My respect for the dog’s teeth does not lead me to agree with its opinions. If everyone adopted the attitudes advocated by Kac and Hammersley, no one would develop the crazy ideas that create revolutions.

So: should you study pure math or applied math?

Neither. You should use the tools at hand, adapt and modify them to suit your own projects, and make new ones as the need arises.

16
Where Do You Get Those Crazy Ideas?

Dear Meg,

It’s easy to make research sound glamorous: grappling with problems at the cutting edge of human thought, making discoveries that will last a thousand years . . . There is certainly nothing quite like it. All it requires is an original mind, time to think, a place where you can work, access to a good library, access to a good computer system, a photocopier, and a fast Internet connection. All of these will be provided for you as part of your PhD course, except for the first item, which you will have to provide for yourself.

This is, of course, the sine qua non, the thing without which all the other items are useless. Normally, students are not admitted to a PhD course unless they have shown some evidence of original thinking, maybe in a project or a master’s thesis. Originality is one of those things that you either have or you don’t: it can’t be taught. It can be nurtured or suppressed, but there isn’t an Originality 101 course that will anoint you as able to think new thoughts provided you have read the textbook and passed the exam.

In saying this, I recognize that I am at odds with the prevailing view among educational psychologists, which is that anybody can achieve anything provided they undergo sufficient training. Observing that talented musicians practice a lot, the psychologists have deduced that it is practice that causes talent, and generalized that assertion to all other areas of intellectual activity. But their beliefs are founded on bad experimental design. What they must do, to test their theory, is to start with lots of people who lack musical talent, say, the certifiably tone-deaf. Train half of them, keeping the other half as a control group, and show that the training produces lots of highly talented musicians while its absence (predictably) does not. I am sure that training can lead to some improvement. I do not believe it can produce a decent musician unless the talent was there to begin with.

I am no Mozart. I have some musical talent, but not enough, and it’s not for lack of practice. Training can get me to a reasonable level of proficiency: as an undergraduate I played lead guitar in a rock band. But all the practice in the world could never turn me into a Jimi Hendrix or Eric Clapton, never mind Mozart. As Edward Bulwer-Lytton said, “Genius does what it must, and talent does what it can.” I have just enough musical talent to know what’s lacking.

I do have mathematical talent. Not at the Mozart level, but a big improvement on my guitar playing. By age ten I was already the best in my class at math, and believe me, it did not come from lots of regular practice. My dreadful secret is that I did very little work on math. I didn’t have to. My classmates thought I must have put in hours and hours of effort in order to wipe the floor with them in the math tests, and I had enough sense not to set them straight. They would have killed me if they had known how little time I spent on the math homework, compared to their own strenuous efforts.

When I was an undergraduate, at Churchill College, Cambridge, I had a friend who was also taking a math degree. He worked twelve hours a day, every day. I went to lectures, scribbled notes, spent an hour or two a week working through the problem sheets, and that was it until it was time to revise the material for the end-of-year exams. In the British system at that time, there were no end-of-term tests. You waited until June and then you took exams on everything you had studied over the year. So I worked harder in April and May than I did the rest of the year. But while my friend was studying late into the night, I was down at the pub having a beer and playing darts. And what was his reward for all that training? He barely scraped a pass. Whereas I got first-class marks (the British equivalent of straight A’s) and a College Scholarship.

It is true that talented people often train very hard. They have to, to stay at the pinnacle of their chosen field. A football player who did not spend hours every day on fitness training would quickly be replaced by one who did. But the talent has to be there initially in order for the training to be effective.

I suspect that psychologists overrate the role of training because they have fallen for a politically correct theory of child development that views all new young minds as “blank slates” upon which anything whatsoever can be written. This theory was comprehensively demolished by Steven Pinker in his book The Blank Slate, but definitive refutation has never been a match for fervent belief.

Anyway, Meg, since you have been accepted onto a PhD course, the mathematicians who run it clearly believe you possess sufficient originality to complete it successfully. I am in no doubt that you also possess another essential quality: commitment. You want to do research; you are hungry for it. One of my colleagues once said to me, “I really can’t tell who the best mathematicians are, but I can tell who is driven.” Some people believe that in career terms, once you assume a fairly ordinary level of competence, energy and drive actually matter more than talent.

Science fiction writers, members of another profession where originality is essential, are often asked, “Where do you get those crazy ideas?” The standard answer is, “We make them up.” I’ve written sci-fi novels, and I concur. But authors do not make up ideas from nowhere. They immerse themselves in activities that might generate ideas, such as reading science magazines, and they keep their antennae tuned for the faintest hint of an idea.

Mathematicians get ideas the same way. They read the math journals, they think about applications, and they keep their antennae tuned to “high.”

Still, the very best seem to have other ways of thinking fresh thoughts. It’s almost as if they lived on another planet. Srinivasa Ramanujan was a brilliant self-taught Indian mathematician whose life story is very romantic; it is well told in Robert Kanigel’s The Man Who Knew Infinity. I prefer to think of Ramanujan as Formula Man. He learned most of his early mathematics from a single, rather curious textbook, George Carr’s A Synopsis of Elementary Results in Pure and Applied Mathematics. It was a list of about five thousand mathematical formulas, starting with simple algebra and leading into complicated integrals in calculus and the summation of infinite series. The book must have appealed to Ramanujan’s turn of mind, or he would never have worked his way through it; on the other hand, it led him to think (because he had no one to tell him otherwise) that the essence of mathematics is the derivation of formulas.

There is more to math than that: proof, for a start, and conceptual structure. But new formulas play a part, and Ramanujan was a wizard at them. He came to the attention of Western mathematicians in 1913 when he sent a list of some of his formulas to Hardy. Looking at this list, Hardy saw some formulas he could recognize as known results, but many others were so strange that he had no idea where they could have come from. The man was either a crackpot or a genius; Hardy and his colleague John Littlewood retired to a quiet room with the list, determined not to come out until they had decided which.

The verdict was “genius,” and Ramanujan was eventually brought to Cambridge, where he collaborated with Hardy and Littlewood. He died young, of tuberculosis, and he left a series of notebooks that even today are a treasure trove of new formulas.
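
To give a flavor of what such formulas look like, here is one of Ramanujan’s celebrated series for π, quoted purely as an illustration; it comes from his later published work and is not necessarily one of the formulas on the 1913 list:

\[
\frac{1}{\pi} = \frac{2\sqrt{2}}{9801} \sum_{k=0}^{\infty} \frac{(4k)!\,(1103 + 26390k)}{(k!)^4\, 396^{4k}}.
\]

Each extra term of the sum supplies roughly eight more correct decimal digits of π.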

When asked where his formulas came from, Ramanujan replied that the Hindu goddess Namagiri came to him in dreams, and told them to him. He had grown up in the shadow of the Sarangapani temple, and Namagiri was his family deity. As I told you in an earlier letter, Hadamard and Poincaré emphasized the crucial role of the subconscious mind in the discovery of new mathematics. I think Ramanujan’s dreams of Namagiri were surface traces of the hidden activity of his subconscious.

One can’t aspire to be a Ramanujan. His kind of talent is uncanny; I suspect that the only way to understand it is to possess it, and even then it probably yields very little to introspection.

As a contrast, let me try to describe how I usually get new ideas, which is far more prosaic. I read a lot, often in fields unrelated to my own, and my best ideas often come when something I have read reminds me of something I already know about. That was how I came to work on animal locomotion.

The origin of this particular set of ideas goes back to 1983, when I spent a year in Houston working with Marty Golubitsky. We developed a general theory of space-time patterns in periodic dynamics. That is, we looked at systems whose behavior over time repeats the same sequence over and over again. The simplest example is a pendulum, which swings periodically from left to right and right to left. If you place a pendulum next to a mirror, the reflected version looks exactly the same as the original, but with one difference: when the reflection is at its extreme right position, the original is at its extreme left. These two states both occur in the original system, but there, they are separated by a time lag of exactly half the period. So the swinging pendulum has a kind of symmetry, in which a spatial change (reflect left–right) is equivalent to a temporal one (wait half a period). These space-time symmetries are fundamental to patterns in periodic systems.
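
To spell the symmetry out, take an idealized small-swing pendulum whose displacement is x(t) = A sin(2πt/T), where T is the period; this explicit formula is a simplification added here for illustration. Reflecting in the mirror sends x to −x, and

\[
-x(t) = -A\sin\!\left(\frac{2\pi t}{T}\right) = A\sin\!\left(\frac{2\pi t}{T} + \pi\right) = A\sin\!\left(\frac{2\pi (t + T/2)}{T}\right) = x\!\left(t + \frac{T}{2}\right),
\]

so the reflected pendulum at time t looks exactly like the original pendulum at time t + T/2: a spatial flip is the same as a half-period wait.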

We looked for applications of our ideas, and mostly found them in physics. For example, they organize and explain a host of patterns found in a fluid confined between two rotating cylinders. In 1985 we both went to a conference in Arcata, in northern California. After the conference was over, four of us—three mathematicians and a physicist—shared a rented car back to San Francisco. It was a very small car—calling it “subcompact” would be far too generous—and it had to hold all of our luggage as well as us. To make matters worse, we stopped off at a Napa Valley chateau so that Marty could pick up a crate of his favorite wine.

Anyway, on the journey we stopped every so often to admire the redwoods and giant sequoias, and in between Marty and I worked out how our theory applied to a system of oscillators joined together in a ring. (“Oscillator” is just a word for anything that undergoes periodic behavior.) We did the work entirely in our heads, not writing anything down because there wasn’t room to move. This exercise was mathematically pleasing, but it seemed rather artificial. It never occurred to us to look at biology instead of physics, probably because we didn’t know any biology.

At this point fate intervened. I was sent a book called Natural Computation to review for the magazine New Scientist. It was about engineers taking inspiration from nature, trying to develop computer vision by analogy with the eye, for instance. A couple of chapters were about legged locomotion: building robots with legs to move over rough terrain, that kind of thing. And in those chapters I came across a list of patterns in the movement of four-legged animals.

I recognized some of the patterns: they were space-time symmetries, and I knew that the natural place for them to occur was in a ring of four oscillators. Four legs . . . four oscillators . . . it definitely seemed promising. So I mentioned this curiosity in the review.

A few days after the book review appeared in print, my phone rang. It was Jim Collins, then a young research student visiting Oxford University, about fifty miles from where I lived. He knew a lot about animal movement, and was intrigued by the possible mathematical connection. He came to visit for a day, we put our heads together . . . to cut a long story short, we wrote a series of papers on space-time patterns in animal locomotion.

Many of the more radical changes of research direction in my life have come about in similar ways: spotting a possible connection between some math that I already knew and something I happened on by accident. Every link of this type is a potential research program, and the great beauty of it is, you have a pretty good idea how to get started. What features are crucial to the math? How might similar features appear in the real-world application? For example, in the locomotion story, the ring of mathematical oscillators relates to what neuroscientists call a “central pattern generator.” This is a circuit made from nerve cells that spontaneously produces the natural “rhythms,” the space-time patterns, of locomotion. So Jim and I quickly realized that we were trying to model a central pattern generator, and that a first stab was to treat it as a ring of nerve cells.
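
As a rough sketch of what “a ring of four oscillators” can mean in practice, here is a toy simulation; the coupling rule and every parameter are invented for illustration, and this is not the model from our papers. Four phase oscillators sit in a ring, each nudged toward lagging its predecessor by a quarter of a cycle, and they settle into a travelling wave in which the four “legs” fire in sequence, a quarter period apart.

```python
import numpy as np

# Toy sketch only: a ring of four phase oscillators ("legs"), each coupled to
# its predecessor and nudged toward lagging it by a quarter of a cycle. The
# coupling rule and all parameters are invented for illustration; real
# central-pattern-generator models are considerably more elaborate.
N = 4
omega = 2 * np.pi              # intrinsic frequency: one cycle per time unit
K = 2.0                        # coupling strength
lag = 2 * np.pi / N            # target phase lag between neighbours

# Start near the quarter-lag wave, nudged off it a little.
theta = -lag * np.arange(N) + np.array([0.30, -0.20, 0.10, -0.25])

dt = 0.001
for _ in range(20_000):                     # crude Euler integration
    neighbour = np.roll(theta, 1)           # oscillator i listens to oscillator i-1
    theta = theta + dt * (omega + K * np.sin(neighbour - theta - lag))

# Fraction of a cycle by which each oscillator lags oscillator 0.
lags = ((theta[0] - theta) % (2 * np.pi)) / (2 * np.pi)
print(np.round(lags, 3))                    # approximately [0, 0.25, 0.5, 0.75]
```

Moving attention one step around the ring then has the same effect as waiting a quarter of a period, a space-time symmetry of exactly the kind that matched the gait patterns in the book I was reviewing.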

We no longer believe our original model is correct: it is too simple, it has a technical flaw, and something slightly more complicated is needed. We have a fair idea of what that replacement looks like. That’s how research is: one good idea, and you’re set for years.

Read widely, keep your mind active, keep your antennae out; when they report something interesting, pounce. As Louis Pasteur famously said, chance favors the prepared mind.
