Miracle Cure

  Pasteur’s reputation managed to remain largely unsullied by any similar scandal, at least during his lifetime. It was only a century later that one of his most widely publicized achievements, his rabies vaccine of 1885, was revealed, in a biography entitled The Private Science of Louis Pasteur, to be considerably less significant than it had appeared.

  By 1885, the search for a bacterium that caused rabies had failed—inevitably, since the disease is caused by a virus, one of those free-floating bits of genetic material wrapped in a protein coat that are far smaller than bacteria and that wouldn’t even be identified until 1892. However, Pasteur reasoned, since rabies had a very slow incubation period—anywhere from a month to a year—perhaps a vaccine could actually “cure” the disease by immunizing the victim after infection but before symptoms appeared. That is, a vaccination given after exposure to the bite of a rabid animal could serve to inoculate the victim against the disease, which was inevitably fatal once symptoms appeared.

  So, when nine-year-old Joseph Meister was bitten by a rabid dog in July 1885, and survived after Pasteur inoculated him with a weakened rabies virus that had been cultivated in live rabbits, the public acclaim was enormous. The French public practically deified Pasteur, and raised more than 2.5 million francs—at least 12 million dollars today—that enabled the Institut Pasteur to open its doors three years later.

  Meister made Pasteur a national hero, though at least partly through a misunderstanding of just what “curing” a disease means. A bite from a rabid dog will cause rabies in a human only about one time in seven (though, in 1885, that one would invariably die). Had Pasteur not inoculated Meister, the boy would still have had a good chance of surviving. Another, rather larger problem with the story of Pasteur’s heroic achievement is that Meister wasn’t the first victim to receive Pasteur’s vaccine; two weeks before, in June 1885, a girl named Julie-Antoinette Poughon had been given the vaccine but died shortly thereafter. Nor had Pasteur tested the method on dogs prior to giving it to Meister, though he claimed to have done so. Pasteur, perhaps understandably, neglected to mention either fact to journalists or other scientists.*

  For Koch and Pasteur, however, the real achievements remain so outsized that embarrassments that would destroy the reputation of garden-variety scientists seem barely to rise to the level of peccadillo. Pasteur, especially, was a hero from the time of his original fermentation discoveries, and not just in France. In 1867, the Englishman Joseph Lister was so taken with Pasteur’s researches that he wrote:

  Turning now to the question how the atmosphere produces decomposition of organic substances, we find that a flood of light has been thrown upon this most important subject by the philosophic researches of M. Pasteur, who has demonstrated by thoroughly convincing evidence that it is not to its oxygen or to any of its gaseous constituents that the air owes this property, but to the minute particles suspended in it, which are the germs of various low forms of life, long since revealed by the microscope, and regarded as merely accidental concomitants to putrescence, but now shown by Pasteur to be its essential cause, resolving the complex organic compounds into substances of simpler chemical constitution, just as the yeast plant converts sugar into alcohol and carbonic acid.

  When he wrote this, Lister was a working physician and the Regius Professor of Surgery at the University of Glasgow. He had been born in Essex, forty years before, to a prosperous and accomplished Quaker family. His father, Joseph Jackson Lister, was a respected physicist and pioneer of microscopy; a classic scientific amateur, he was elected a Fellow of the Royal Society, the world’s oldest and most respected scientific organization, for inventing the achromatic microscope.

  In 1847, Joseph graduated from University College London—even in the middle of the nineteenth century, both Oxford and Cambridge were still barred to Quakers—and entered the Royal College of Surgeons. In 1853, he became a house surgeon at University College Hospital, and three years later was appointed surgeon to the Edinburgh Royal Infirmary.

  In 1859, the newly married Lister moved to the University of Glasgow, and his story really began.

  The Glasgow Royal Infirmary had been built in the hope that it would prevent “hospital disease” (the name coined by the Edinburgh obstetrician James Young Simpson in 1869 for the phenomenon known today as “surgical sepsis” or “postoperative sepsis”). In this, it was a notable failure. In Lister’s own records of amputations performed at the Glasgow Royal Infirmary, “hospitalism” killed between 45 and 50 percent of his patients. Lister wrote, “Applying [Pasteur’s] principles to the treatment of compound fracture, bearing in mind that it is from the vitality of the atmospheric particles that all mischief arises, it appears that all that is requisite is to dress the wound with some material capable of killing those septic germs, provided that any substance can be found reliable for this purpose, yet not too potent as a caustic.”

  Credit: National Institutes of Health/National Library of Medicine

  Joseph Lister, 1827–1912

  Lister’s earlier work—on the coagulation of the blood, and especially the different microscopically visible stages of inflammation in the sick—convinced him that Pasteur had it right. The “pollen” idea, however, convinced him also that the microorganisms traveled exclusively through the air. This was wrong, but usefully so, since it argued for the most impassable barrier between the “infected” air and the patient.

  Not a physical barrier, but a chemical one. In 1834, the German chemist Friedlieb Runge had discovered that what he called Karbolsäure could be distilled from the tarry substance left behind when wood or coal is burned in furnaces or chimneys: creosote, the stuff that gives smoked meat its flavor. Sometime in the 1860s, Lister read an article about how a German town used creosote to eliminate the smell of sewage. Since he knew, thanks to Pasteur, that the smell of sewage was caused by the same chemical process that caused wounds to mortify, he reasoned that a compound that prevented one might, mutatis mutandis, halt the other. In the spring of 1865, he started testing coal tar extracts on patients, and, on August 12, he hit the jackpot: The substance known as phenol, or carbolic acid, stopped infections cold.* Two years later, he published his results: Surgical mortality at the Glasgow Infirmary had fallen from 45 percent to 15 percent. “As there appears to be no doubt regarding the cause of this change, the importance of the fact can hardly be exaggerated.”*

  It took years before Lister was able to persuade the medical establishment of the importance of what has come to be known as antisepsis, helped along more by practical and highly publicized results—in 1871, he safely drained an abscess under the arm of Queen Victoria—than by experimental validation. Dependence on antisepsis, however, had its own risks. Patients were frequently required to inhale the fumes of burning creosote, which was dangerous enough. Even worse, some were given injections of carbolic acid, which doesn’t kill just dangerous pathogens, but often enough, the patients themselves. As the German physiologist (and winner of the very first Nobel Prize in Medicine) Emil Behring pointed out in the 1880s, “It can be regarded almost as a law that the tissue cells of man and animal are many times more susceptible to the poisonous effects of disinfectants than any bacteria known at present. Before the antiseptic has a chance either to kill or inhibit the growth of the bacteria in the blood or in the organs of the body, the infected animal itself will be killed.”

  Lister’s reputation, and the importance of both antiseptic and aseptic surgical practice—not merely disinfecting wounds, which Lister pioneered, but maintaining a fully sanitary environment around patients, a technique he adopted far later—continued to grow over the rest of his life. He would become one of the heroes of nineteenth-century Britain, president of the Royal Society, founder of the British Institute of Preventive Medicine (renamed the Lister Institute of Preventive Medicine in 1903), and be made Baron Lister of Lyme Regis. In 1899, the Chinese minister to the Court of St. James’s, commanded by the emperor to produce biographies of the hundred greatest men in the world, announced that the three Englishmen to make the cut were William Shakespeare, William Harvey, and Lister himself. In retrospect, this seems modest enough. The germ theory of disease that had been developed and tested by Pasteur, Koch, and Lister himself produced an astonishing number of discoveries about the causes of disease: not merely anthrax, tuberculosis, and cholera—respectively the bacteria known as Bacillus anthracis, Mycobacterium tuberculosis, and Vibrio cholerae—but gonorrhea (Neisseria gonorrhoeae, discovered 1879), diphtheria (Corynebacterium diphtheriae, discovered 1883), bacterial pneumonia (Streptococcus pneumoniae, discovered 1886), gas gangrene (Clostridium perfringens, discovered 1892), bubonic plague (Yersinia pestis, discovered 1894), dysentery (Shigella dysenteriae, discovered 1898), syphilis (Treponema pallidum, discovered 1905), and whooping cough (Bordetella pertussis, discovered 1906). Moreover, the discovery of the nature of these infectious agents led directly to a powerful suite of defensive weapons: not merely antisepsis and vaccination, but, even more usefully, improved sanitation and hygiene. Since, by their very nature, preventive measures succeed when disease doesn’t even appear, it is impossible to know with certainty how many lives were saved by these expedients, but they are the most important reason that European life expectancy at birth increased from less than forty years in 1850 to more than fifty by 1900.

  Nonetheless, as valuable as these practices were in defending human life from pathogens, millions continued to fall ill from infectious disease every day. And when they did, medicine could do virtually nothing about it. Perversely, the greatest triumph in medical history—the germ theory of disease—destroyed the ideal of heroic medicine, replacing it with a kind of therapeutic fatalism.* As physicians were taught the bacterial causes of diseases, they also learned that there was little, if anything, to be done once a patient acquired one.

  In one of Aesop’s best-known fables, a group of frogs living in a pond prayed to the gods to send them a king; an amused Zeus dropped a log in the pond, and announced that this was, henceforth, the frogs’ king. The frogs, disappointed with their new king’s inactivity, prayed again for a king . . . this time one that would do something, upon which Zeus sent them a stork, who promptly ate the frogs. The Aesopian moral—always choose King Log over King Stork—is one that the Western world’s physicians took to heart, and from the 1860s to at least the 1920s, humility reigned. Only a few drugs had any utility at all (mostly for relieving pain), which made for skepticism about virtually all treatment. On May 30, 1860, Dr. Oliver Wendell Holmes, Sr., famously announced in an address before the Massachusetts Medical Society:

  Throw out opium, which the Creator himself seems to prescribe, for we often see the scarlet poppy growing in the cornfields, as if it were foreseen that wherever there is hunger to be fed there must also be a pain to be soothed; throw out a few specifics which our art did not discover, and it is hardly needed to apply; throw out wine, which is a food, and the vapors which produce the miracle of anaesthesia, and I firmly believe that if the whole materia medica [medical drugs], as now used, could be sunk to the bottom of the sea, it would be all the better for mankind,—and all the worse for the fishes.

  Holmes overstates, but not by much. The achievements of the nineteenth century in revolutionizing medical science are nothing to sneeze at, including the recognition that sneezing itself was a powerful source of dozens of infectious diseases. The great biologists of the era established a robust theory about disease, along with powerful tools for defending against it, and left behind a model for research, experimentation, and validation.

  It takes nothing away from the extraordinary discoveries of Pasteur, Koch, Lister, and others to wonder whether their most enduring contribution to the revolution in medicine wasn’t informational but institutional: the modern biological research laboratories. The Institut Pasteur was founded in 1888; the Lister Institute of Preventive Medicine was established in 1891, the same year that the Robert Koch Institute was founded, originally as the Royal Prussian Institute for Infectious Diseases. In 1890, the Royal Colleges of Surgeons and Physicians opened their first research laboratory in London. These establishments weren’t only fertile schools for training the next generations of researchers, or structures in which the best biologists and physiologists in the world could cooperate—and, truth be told, compete—with one another. They were also magnets for the resources that research demanded: the philanthropy of wealthy families and subsidies from national governments. As the nineteenth century turned into the twentieth, the life sciences had not yet become the enormously expensive proposition they would be decades hence. Nonetheless, even frugal research cost money, and institutional laboratories were, for a time, the most productive place to spend it.

  But for the next chapter in the story that leads from George Washington’s sickbed to the maternity wing of New Haven Hospital in 1942, there was an even more important development: the marriage between industrial chemistry and medicine.

  TWO

  “Patience, Skill, Luck, and Money”

  A nineteenth-century German opera, Der Freischütz—in English, The Marksman—tells the story of a young forester who must pass a test of shooting skill to win the heart of his true love. After losing a match to a young peasant, he is persuaded to improve the odds by using Freikugeln, or Zauberkugeln—enchanted bullets that cannot fail to hit their target.* The magic bullets—the first six under the control of the marksman, the seventh owned by the devil—are a constant presence in collections of European folktales and a familiar trope in nineteenth-century drama.

  Even more durably, they inspired the best-known modern usage of the term, which has little to do with the devil or folk mythology. In 1907, “magic bullets” were the theme of the Harben Lectures given at what was then known as Britain’s Royal Institute of Public Health. The lecturer, who used the term to describe a targeted drug, one that would attack a disease-causing microbe without harming the host suffering from the disease, was a German physician named Paul Ehrlich.

  Ehrlich was then fifty-two years old, and one of the most respected physicians and scientists in the world. Like Robert Koch, he was a product of a terrifyingly rigorous, but undeniably effective, German education. Beginning with the reforms of Wilhelm von Humboldt in Prussia in the early nineteenth century, Germany had taken a far more pragmatic view of modern subjects such as mathematics and science than had the United Kingdom or France, and certainly the United States. By 1872, state-controlled education had added algebra, chemistry, and physics to the nine years of Latin and Greek provided by the gymnasia—in the Realgymnasien and Oberrealschulen, and especially in the technischen Hochschulen (technical colleges). After decades of providing its citizens with history’s most rigorous educational program, Germany had become the world’s leader in virtually every field of scientific research. Moreover, since the original reforms had been explicitly made to support the commercial interests of the German state, there were no qualms about partnerships between schools—secondary and postsecondary—and industry. Paul Ehrlich was a notable beneficiary, as his education took him from the Maria-Magdalenen-Gymnasium in Breslau through universities in Strasbourg, Freiburg, and Leipzig. In 1878, at the age of twenty-four, he received his doctorate for a dissertation on a subject that would occupy him for years, and turn out to be the first link in a chain that led to the first true antibacterial chemical therapy.

  Credit: National Institutes of Health/National Library of Medicine

  Paul Ehrlich, 1854–1915

  —

  Ehrlich’s dissertation was titled “Contributions to the Theory and Practice of Histological Dyes.” Histology—the word is taken from a Greek root meaning “something that stands upright” and was adopted by nineteenth-century physiology to mean “tissue”—had just come into its own as a credible discipline. Ever more powerful microscopes had made distinguishing one sort of tissue from another possible, but not easy. Even with lenses that magnified cells hundreds of times, it was difficult if not impossible to identify distinct cell types without something to improve the contrast between, for example, different sorts of blood cells. That something was staining: Some chemicals have a special affinity for particular cell types, and can turn them a particular color while leaving other, similarly shaped cells unchanged. Ehrlich’s specific contribution was to use a dye with what he called “an absolutely characteristic behavior toward the protoplasmic deposits of certain cells” in blood plasma. He named the “certain cells” Mastzellen, from the German word for “fattening,” because he believed them to be involved in the process by which cells fed themselves. In this he was wrong—mast cells are part of the immune system (about which more below)—but the real triumph was the discovery that stains could differentiate the components of blood: leukocytes, lymphocytes, red blood cells, and so on. The newly minted doctor, who had become known to his classmates as “the man with blue, yellow, red, and green fingers,” had made a huge discovery: Different stains were absorbed by different cell types.