THAT’S THE WAY THE COOKIE CRUMBLES
by Dr. Joe Schwarcz
The National Portrait Gallery in London is an amazing place. It is so overwhelming that many visitors don’t know where to turn first. Should they search for the only known authentic portrait of Shakespeare? Or should they pass the time with Dickens, Henry VIII, or Isaac Newton? It’s a tough call. But I knew where I wanted to go. I wanted to find the room that featured the nineteenth-century scientists. I had to see the original of the picture that I had shown so often in lectures. I had to pay homage to William Henry Perkin, the man who, in 1856, single-handedly changed the course of chemistry.
During the early years of Queen Victoria’s reign, the Germans dominated the pursuit of chemical knowledge. Bunsen, Wöhler, and, particularly, Justus von Liebig were the leading lights. Britain had no comparable personalities, which Liebig underscored on one of his English lecture tours.
“England is not the land of science,” he insisted. “There is only widespread dilettantism; their chemists are ashamed to be known by that name because it has been assumed by the apothecaries, who are despised.” (Indeed, in England pharmacists are still known as chemists, although I suspect they are no longer despised.) Liebig’s remarks struck a chord with Prince Albert, the queen’s husband, who was a strong supporter of scientific inquiry. England needed to get on the chemical bandwagon, he thought, and it needed an institution that would allow it to do so. So he sponsored the establishment of the Royal College of Chemistry, and the college invited August Wilhelm Hofmann, a former student of Liebig’s, to be its director. And it was there, in 1853, that a historic meeting would take place between the German chemistry professor and fifteen-year-old William Henry Perkin.
As a youngster, Perkin had been keenly interested in art and the fledgling field of photography. But these pursuits would later take a back seat to chemistry. When he was about twelve, Perkin recalled, “a friend showed me some chemical experiments and the wonderful power of substances to crystallize. . . . My choice was fixed, and I determined, if possible, to become a chemist and I immediately commenced to accumulate bottles of chemicals and make experiments.” Perkin asked his father to enroll him in the City of London School, which was the only school offering practical chemistry lessons at that time. These courses were not part of the regular curriculum, and interested students had to pay extra to be taught by Thomas Hall, the chemistry master who had studied under Hofmann.
Hall immediately recognized young Perkin’s potential and arranged for him to enter the Royal College of Chemistry. Perkin Senior had his heart set on his son becoming an architect, and he did not approve of the boy’s infatuation with a subject that seemed to have no career potential. But he finally did agree to give his son the tuition money, an investment that would eventually pay very handsome dividends. Under Hofmann’s tutelage, young Perkin began to experiment with coal tar, the mucky residue left behind when coal was heated in closed vessels in the absence of oxygen to produce gas for gaslights. Hofmann’s interests had always tended towards the practical side of chemistry — his course syllabus included dyeing and bleaching, and extracting drugs from natural sources. But Perkin’s interest was aroused by a lecture on the potential use of coal tar derivatives to make drugs that were then only available from natural sources. Quinine, extracted from the bark of the South American cinchona tree, was the only treatment for malaria. And there wasn’t enough of it.
Hofmann had carried out a chemical analysis of quinine, noting that its composition suggested it could be made by somehow combining two molecules of allyltoluidine, a coal tar component, with oxygen. This idea intrigued Perkin, and, during the Easter holiday of 1856, he decided to investigate it in the little lab he had constructed at home. Things did not go well. When he added potassium dichromate, the chemical that was to serve as the source of oxygen, Perkin created a black goo. He tried again with aniline, a related compound, with much the same result. But then came the pivotal moment. When he mixed in alcohol to dissolve the goo, the solution turned a beautiful purple. Furthermore, the rag he used to wipe the bench was dyed the same glorious hue.
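On paper, Hofmann’s idea looked irresistible. The molecular formulas were known even though the structures were not, and the arithmetic of the proposed reaction can be reconstructed as a simple bookkeeping exercise:

2 C10H13N (allyltoluidine) + 3 O → C20H24N2O2 (quinine) + H2O

The atoms balance perfectly. But molecules do not combine just because their formulas add up, which is why Perkin’s flask yielded black goo rather than quinine.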
Perkin immediately recognized the importance of his accidental discovery. The color purple was all the rage in fashion circles. Manufacturers produced it either by extracting the dye from a certain species of lichen or by treating uric acid obtained from Peruvian guano (bird poop) with a series of reagents. But now Perkin could make it from cheap coal tar. He sent a sample to the well-known dye makers Pullar’s of Perth and requested their opinion. The answer was swift: Pullar’s told Perkin that he had made “one of the most valuable discoveries that has come out for a long time.” This was all the encouragement Perkin needed to leave the Royal College and — with financial backing from his father — set up a factory on a vacant lot at Greenford Green, a few miles from London.
By the time Queen Victoria appeared in a dress dyed with Perkin’s mauve (as the new shade was called) at the 1862 International Exhibition in London, chemists everywhere were trying to coax novel substances out of coal tar. If Perkin could make mauve, then what other secrets did that complex mixture harbor? Soon, chemists were reporting discoveries aplenty. First came a variety of dyes; over time, these were followed by assorted drugs, plastics, and synthetic fibers, all fabricated from coal tar. William Henry Perkin’s accidental discovery had set chemistry on its modern course. I would encourage anyone who is interested in delving further into the William Perkin story to read Simon Garfield’s delightful book Mauve.
In 1906, the fiftieth anniversary of the first synthesis of mauve was widely celebrated, and among the numerous gifts that Perkin received was a portrait by Arthur Cope. It’s a classic. It shows an elderly, bearded Perkin standing in front of some chemical glassware, brandishing a beautifully dyed piece of wool. That’s the picture I use in my lectures, the one I was so eager to see in the original. Alas, it was not to be. I was informed that the portrait was not on display — it was “in storage.” One of the greatest contributors to chemistry had been relegated to the basement. But a portrait of Mr. Bean was prominently displayed.
The green cloud rolled slowly towards the trenches at Ypres, Belgium, where the French and Algerian troops had dug in to wait for the German attack. They had never seen anything like it. And many would never see it, or anything else, again. The soldiers began to choke and cough violently as the sickly green vapor settled over the battlefield on that fateful day in 1915. Their lungs began to fill with fluid as the acrid gas stripped away their protective mucus lining. Soon, thousands lay dying, victims of a terrible new weapon. The Germans had unleashed the horror of gas warfare.
The signal to open the valves on the 5,730 canisters of chlorine gas was given by Corporal Fritz Haber, the man who had supervised the operation. That operation was deemed so successful that Haber was promoted to the rank of captain; just a year later, he would become the director of the German Chemical Warfare Service. Dr. Haber, trained as a chemist at the University of Berlin, regarded himself as a great German patriot and had accordingly volunteered his services to the military. He thought that gas warfare would be an effective way to flush the enemy from the trenches. He was convinced that it was essentially no different from shelling or bombing. The military had considered gas warfare “unsporting,” but at Haber’s urging they agreed to try it. On the home front, however, Haber faced stronger opposition. His wife, also a chemist, was outraged that her husband would exploit his chemical expertise in this fashion. When Haber refused to listen to her, she grabbed his revolver and killed herself. He departed the next day for the eastern front, leaving others to make her funeral arrangements. Little wonder Fritz Haber has gone down in history as the callous villain who introduced chemical weapons to the battlefield. But Haber is remembered for something else as well. He was responsible for one of the most important inventions of the twentieth century, an innovation that would save millions of people from starving to death. Talk about a paradox!
By the 1800s, scientists had begun to understand the nuances of agriculture. Carbon, which all plants need, was supplied by carbon dioxide in the air. Hydrogen came from water. But the soil supplied potassium, phosphorus, and nitrogen — nutrients vital for crop growth. Once the soil was depleted of these nutrients, it became infertile. One could replenish them and render the soil fertile again by plowing in manure or plant wastes. Yet scientists understood that even the most efficient methods of recycling waste products as fertilizer would not be enough to sustain the world’s growing population. Supplying nitrogen for crops was a particular problem. This may seem strange, since about seventy-eight percent of the air is made up of nitrogen. But, with the exception of legumes, plants cannot use nitrogen in this elemental form. Legumes have bacteria growing on their roots that can convert the gaseous nitrogen of the air into a usable form; all other crops have to rely on compounds of nitrogen in the soil.
During the nineteenth century, the major source of nitrogen fertilizer in Europe was guano imported from certain Pacific islands. Still, all the bird droppings Europeans could import were not enough to feed the growing agricultural industry. Then, as luck would have it, someone found large deposits of sodium nitrate, better known as Chile saltpeter, in the Chilean desert. Saltpeter was a great source of nitrogen, but it wouldn’t last forever. Scientists predicted mass starvation when the saltpeter ran out.
Fritz Haber turned his attention to this problem in 1904. As he saw it, the solution lay in finding a way to make use of the vast amount of nitrogen in the atmosphere. Two years earlier, Carl von Linde had succeeded in liquefying air by compressing it and then letting it expand rapidly, thereby cooling it. Through distillation, he could separate the liquid air he obtained into oxygen and nitrogen. Haber found that if he reacted this nitrogen with hydrogen under pressure, then ammonia (NH3) would form. This was a breathtaking discovery. It meant that we could pump ammonia gas directly into soil as a fertilizer, or, even better, we could convert it into solid ammonium nitrate by reaction with nitric acid, which itself could be made from ammonia. Ammonium nitrate was an ideal water-soluble fertilizer. In 1909, Haber demonstrated his process to the chemists at the Badische Anilin und Soda Fabrik (BASF), who, under the leadership of Carl Bosch, went on to work out the final details of the industrial manufacture of ammonia. BASF signed a contract with Haber, giving him a royalty for every kilogram of ammonia produced. This made the chemist a wealthy man.
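For the chemically curious, the two reactions at the heart of Haber’s scheme can be sketched in modern notation (the industrial version runs at high temperature and pressure over an iron catalyst):

N2 + 3 H2 ⇌ 2 NH3 (ammonia synthesis)
NH3 + HNO3 → NH4NO3 (ammonium nitrate)

Since the nitric acid is itself made by oxidizing ammonia, every nitrogen atom in a bag of ammonium nitrate fertilizer ultimately comes from the air.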
Ammonia and ammonium nitrate increased crop yields dramatically. The hungry masses could now be fed. Undoubtedly, the discovery of the ammonia synthesis was worthy of a Nobel Prize. And Haber received one in 1918 — but not without provoking controversy. Many scientists protested, insisting that the inventor of chemical warfare should not be honored with science’s most prestigious prize. In 1933, when the Nazis came to power, the renowned chemist went from being a German patriot to being “the Jew Haber.” Actually, Haber had never practiced his religion, and early in his career he’d even had himself baptized, joining a Protestant church to avoid anti-Semitism. Haber was not relieved of his position at the Kaiser Wilhelm Institute, but he was told he would have to replace all his Jewish scientists. He quit and departed for England, saying that he refused to judge a scientist’s qualifications based on the ancestry of his grandmother.
Fritz Haber was not received warmly in England. A number of prominent Britons would have nothing to do with the father of gas warfare. And so it is that Fritz Haber leaves us with a bizarre legacy. He saved millions from hunger by synthesizing ammonia, and he killed thousands with his poison gases. German gas attacks brought gas reprisals from the Allies, which resulted in the final aspect of Haber’s legacy. Gas-warfare injuries drove an Austrian corporal out of the army and into politics. His name was Adolf Hitler.
I think of the Radium Girls almost every night. It happens when I check the time on my glowing watch dial — usually just after I’ve been reminded of the passing years by nature’s nocturnal call. Let me fill you in.
In the summer of 1896, Professor Henri Becquerel went into the garden of Paris’s École Polytechnique and placed a photographic plate wrapped in black paper in the bright sunshine. On top of it, he carefully positioned a crystal of uranium sulfate. Becquerel was a physicist who had become interested in those amazing rays discovered just months earlier by Wilhelm Roentgen. X-rays, as they were called, caused certain substances to glow in the dark. If X-rays could produce a glow, maybe a glowing object could produce X-rays, Becquerel thought. He knew that uranium compounds fluoresced in the sun, so he embarked on his experiment. When he developed the plate and found an exposed spot that corresponded to the spot where he’d placed the crystals, he was elated. But he still had to verify the experiment.
The next day, however, the weather was cloudy, and Becquerel put his wrapped plate and uranium in a drawer. Somehow, the plate got mixed up with some exposed plates and was developed along with them. To Becquerel’s astonishment, it showed spots just like the ones uranium had produced in the sun. Sunshine was unnecessary; the uranium crystals were giving off some form of energy that exposed the photographic plate. One of Professor Becquerel’s students finally determined that this novel form of radiation derived from some sort of activity within the uranium atom and was not dependent on any external stimulus. She coined the term “radioactivity.” That student was Marie Curie.
Working with her husband, Pierre, Marie Curie discovered that uranium atoms were not unique in this respect. When the couple removed uranium from its common ore, pitchblende, the ore’s radioactivity did not disappear. Eventually, from tons of ore, they were able to isolate a few milligrams of two new radioactive elements: polonium and radium. The latter’s name was derived from the Latin word radius, meaning “ray,” because pure radium glowed in the dark with a stunning blue color. Nobody at the time knew the reason for this eerie glow. Nobody knew that radium was spontaneously changing into radon, a process that involved the release of very energetic alpha particles. Nobody knew that it was the collision of these particles with air molecules that caused the luminescence. And nobody knew that a collision of these particles with human tissue could kill.
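Today the source of that glow can be written down in one line. Radium spontaneously ejects an alpha particle (a helium nucleus) and turns into radon gas:

Ra-226 → Rn-222 + He-4 (the alpha particle)

And because radium-226 has a half-life of about 1,600 years, a speck of it keeps firing off alpha particles, and keeps glowing, for centuries.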
Radium was a novelty. Before long, entrepreneurs were busy capitalizing on its commercial potential. Glow-in-the-dark light cords flooded the stores. But what really captured the public’s attention was the glow-in-the-dark watch dial. By mixing a little radium sulfate with zinc sulfide, manufacturers could produce a brilliant green glow. The energetic alpha particles released by the radium boosted the electrons in zinc sulfide into a higher energy state. When these electrons returned to their ground level, they emitted the energy they had absorbed, but this time in the form of light. We could now create luminosity without electricity.
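The process the dial makers were exploiting can be summarized as a two-step sequence (ZnS* here is just shorthand for zinc sulfide with an electron in an excited state):

alpha particle + ZnS → ZnS* (energy absorbed)
ZnS* → ZnS + light (energy re-emitted as a green flash)

Each alpha particle produces one tiny flash; countless flashes per second blend into the steady glow of a watch dial.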
By the 1920s, the Radium Dial Company had established itself in Ottawa, Illinois. Most of the workers were young girls, attracted by the relatively high salary of eighteen dollars a week. Indeed, the Radium Girls, as people came to call them, were easy to recognize around town because they dressed well and drove fancy cars. They also loved fun. Some took to painting various parts of their bodies with the luminous paint to surprise their boyfriends in the dark. But they were in for a surprise.
Some of the girls began to complain of jaw pain, but no one took this seriously until a twenty-five-year-old worker died. The authorities became suspicious, and they soon focused their investigations on the “lip-pointing” technique the girls had developed to paint the fine numbers on the watch dials. After virtually every stroke, a girl would lick her brush to sharpen its tip. In the process, she would swallow a little radium. Radium is insidious in that it incorporates into bone, like calcium. There, it releases alpha particles that destroy not only the bone but also blood cells in the marrow. The girls developed anemia, leukemia, and jawbones so weak that they disintegrated when a tooth was pulled. By 1929, the consequences of working carelessly with radium had become clear, but by then thirty-three workers had died. The authorities instituted reforms, ensuring that everyone wore gloves, that they mixed paint in a fume hood, and that they stored radium in lead containers. But they couldn’t make the process risk-free, because they couldn’t prevent radium particles from becoming airborne. The last radium-painted watches were made in 1968, just before the Radium Dial Company plant was demolished.
Now you know why I think of the Radium Girls in the middle of the night when I look at my fake Swiss Army watch, which I bought on a New York street corner for seven bucks. I doubt that it contains radium. In any case, alpha particles cannot penetrate the casing, and I do not plan to eat the watch. I suspect the zinc sulfide is energized with promethium, a synthetic element named after Prometheus, who stole fire from the heavens. It is radioactive, but it produces beta rays, which are less dangerous than alpha particles. It can be used for all sorts of glow-in-the-dark objects — maybe even toilet seats to make those nocturnal visits easier.