More on this book
Kindle Notes & Highlights
by
Thomas Goetz
Read between
August 6 - August 12, 2019
In the last half of the nineteenth century, at least one-quarter of all deaths were due to tuberculosis, a steady pall that loomed over every country in Europe, over the United States, and indeed worldwide.
Indeed, it is difficult to imagine, from the security of our twenty-first-century perspective, how familiar death was in the late nineteenth century. Suffering, injury, and disease were altogether routine, part of the ordinary experience of everyday life. People tend to use average life span to illustrate historic differences in health, but those figures—in 1870 the average life span in Europe and the United States was about thirty-six years, compared to about eighty years today—don’t nearly make the point strongly enough.
Death is so unfamiliar today that almost any expiration seems exceptional. We lament the passing of the generations as if each loss were a tragedy rather than biology. But in 1870, death was a constant presence, lurking around every corner, something that visited families and friends regularly (if not routinely). The dead came from all stages of life: the old, of course, but also those in the prime of life.
Infant mortality in England and the United States hovered around a stunning 20 percent of all live births—and in some areas, a rate of 30 to 40 percent wasn’t uncommon.
These ideas were just emerging, and the most basic measures of public health—starting with the idea of just counting who died of what—were controversial, seen by medical doctors as challenging their authority and their duties to individual patients.
THE LAST DECADES OF THE NINETEENTH CENTURY WERE REPLETE with new discoveries. A list of inventions from 1875 to 1900 serves as nothing less than an inventory of the toolbox for modern life: the telephone, the lightbulb, the phonograph, the fountain pen, the cash register, the dishwasher, the escalator, the vacuum cleaner, the modern bicycle, the internal combustion engine, the Kodak camera, the flashbulb, the X-ray machine, the radio, the tape recorder, the paper clip, the zipper, subways, electric power plants, and drinking straws. All these modern tools were invented in a burst of innovation
...more
There is no inevitable path for science; every fact won is hard fought and is self-evident only in retrospect. On the cold frontiers of science, there are no inevitabilities, no simple answers. And there are no easy remedies.
school. Unlike France, where Paris was the absolute center of science and academic research, Germany was a decentralized nation. The German state was not officially unified until 1871, when Otto von Bismarck consolidated the German peoples into one nation. Germany had thus developed a network of intellectual centers, in Berlin and Munich, but likewise in Heidelberg, Wittenberg, Halle, Göttingen, and other cities.
Leeuwenhoek had made it powerful, Lister made it reliable, and Carl Zeiss made it affordable: Those three components allowed the technology to flourish.
Later, he found a more dependable source of animals, after a friend sent Gertrud some white mice as pets. The animals began reproducing rapidly, and Koch began using the extra supply in his lab. These became one of his most iconic contributions to science: the white lab mouse.
The General Hospital had two maternity wards, divided by class: in the first, elite patients received care from physicians and medical students; in the second ward, lower classes were attended to by midwives. Semmelweis monitored the two clinics and found that, to his surprise, the death rate was far higher in the first ward than in the second. Fully 13 percent of women attended to by medical staff died, while just 2 percent of women with midwives did.
Faced with these numbers, Semmelweis came to a horrific realization: The medical students were working on cadavers in the hospital morgue barehanded and then attending to the expectant mothers directly afterward, without washing their hands between. As they moved from corpse to mother, they carried the germs (which would later be identified as Staphylococcus and Streptococcus) with them.
Semmelweis would die in an insane asylum in 1865, tormented by the agony of having his work thrown on the rubbish heap. His tragedy, as the writer Céline would later write, was that “his discovery was too great for the strength of his genius.”
Today, Snow is celebrated for his symbolic victory in having the handle removed from the culprit pump, and as the father of epidemiology.
Thomas Kuhn proposed a new way to look at science: not as “the piecemeal process by which . . . items are added, singly and in combination, to the ever growing stockpile that constitutes scientific technique and knowledge,” but rather as a succession of upheavals in which new paradigms displace the old.
New science, Kuhn argued, needs to be powerful enough not just to prove its point, but also to overwhelm the traditions that already explain the world.
As Bernard Barber put it just a year before Kuhn, in a 1961 Science essay entitled “Resistance by Scientists to Scientific Discovery,” “as men in society, scientists are sometimes the agents, sometimes the objects, of resistance to their own discoveries.”
First, he demonstrated unambiguously that specific bacteria cause specific infections. Second, he stipulated a set of threshold protocols for proving causation, standards that could be used by others to make their own demonstrations of causation.
In these principles, Koch was building on the twin pillars of German science, or Wissenschaft: Vollständigkeit, or completeness, and Nachvollziehbarkeit, clarity of methods and results (what is today known as reproducibility).
Koch’s process—isolating a bacterium from a diseased organism, growing it, introducing it into a healthy organism, and then establishing disease once more—came as close as science could to affirming causation.
This is no small feat; science today still grapples with the difference between correlation and causation, and Koch’s postulates, in slightly revised form, continue to be used as criteria for valid research. The mantra of “correlation is not causation” is so pervasive in modern science as to be a cliché, but it is nonetheless commonly overlooked, as overeager researchers (or overeager chroniclers of research) make a wishful leap from a correlation between a disease and an agent to an actual causative relationship.
“Do you know why it is so important to me to fight and defeat you?” Pasteur wrote to a proponent of spontaneous generation a few months before his “Germ Theory” speech. “It is because you are one of the main adherents to a medical doctrine that I consider extremely harmful to the art of healing.” Even for Pasteur, who’d made his career by battling the dragons of conventional wisdom, this was unusually blunt.
Pasteur’s disdain wasn’t just philosophical. He also abhorred germs in a literal sense. In fact, he may have been history’s first germophobe: He was compulsive about washing his hands, leaving his work repeatedly throughout the day to roll up his sleeves and lather up.
His original inquiry into fermentation, published in 1857, came at the request of a distillery in Lille, a city in northern France. Pasteur was a professor of chemistry at a nearby university when the distraught father of one of his students approached him. His distillery business, the man confessed, was failing for some mysterious reason. Rather than getting alcohol from his beet juice, the man told Pasteur, he was getting something like sour milk. Pasteur began an investigation and discovered that the juice was fermenting into lactic acid rather than alcohol. The cause, he discerned, was a
...more
This was landmark research, perhaps the first time that a definitive cause of disease had been identified—it was six years before Koch’s work on anthrax—and it would be a model for future investigations.
The smell, he realized, was just as Pasteur had described it. Perhaps, Lister thought, this might be the smell of rotting flesh. (These days, that particular odor is foreign to our noses, but it wasn’t so uncommon in the time before refrigeration and pasteurization. Indeed, once it smacks your nasal passages, the odor is unforgettable.)
Jenner’s new process (later known as vaccination, from the Latin root vacca, for “cow”) would soon become widespread.
(Smallpox would be declared eradicated from the face of the earth in 1980, the first and, as of now, still the only disease to be entirely eradicated from the human population.)
An essential part of this system, then as now, is competition. The base human instinct to beat the other guy is an essential characteristic of science, notwithstanding its tweedy reputation. Without competition, there would be no need for a citation—the coin of the realm for scientists. Without competition, there would be no triumph of discovery, no glory in being first to discern a truth where there had been only questions. Economists have long recognized the essential role of competition in fostering innovation and spurring technological progress. Competition pushes innovators and
...more
The feud between Pasteur and Koch is one of the great battles of science, in part because they were so foolishly public with it.
As late as 1883, Michel Peter, a Parisian physician held in high esteem by his colleagues, went so far as to denounce Pasteur’s work to his face, at an address at the National Academy of Medicine. “What do I care about your microbes? . . . I have said, and I repeat, that all this research on microbes is not worth the time spent on it or the fuss made about it, and that after all the work nothing would be changed in medicine, there would only be a few extra microbes. Medicine . . . is threatened by the invasion of incompetent and rash persons given to dreaming.”
“All of these facts taken together can lead to only one conclusion,” Koch said. “That the bacilli which are present in the tuberculosis substances not only accompany the tuberculosis process but are the cause of it. In the bacilli we have, therefore, the actual infective cause of tuberculosis.”
TO GRASP THE SIGNIFICANCE OF KOCH’S DISCOVERY, WE MUST FIRST get our heads around this: To live in the nineteenth century was to experience infectious disease as a constant, to have unexplained illnesses afflict and dispatch loved ones without warning. Simply put, more people died of more things back then than do now; the death rate in London in 1850 was twenty-five per thousand—more than five times today’s death rate.
HIV/AIDS has killed about 620,000 Americans since the first case in 1981, roughly the same number as die from heart disease or cancer in a single year. This isn’t to minimize the grave toll of HIV/AIDS. But by historical standards, the comparison demonstrates how detached we are from the experience of infectious disease and how terrifying an epidemic can be today, even for those at little risk.
The nineteenth century, though, was a one-hundred-year dirge from one horrid epidemic to another. Six waves of cholera ravaged the globe during the century:
Tuberculosis was altogether different. It was not an epidemic but an endemic disease. It didn’t come in waves or explode through a population; its presence was constant, pervasive, and persistent.
Indeed, the history of humanity is intertwined with the history of tuberculosis; it has been found in Egyptian mummies and in ancient Native American burial grounds, and it is mentioned in four-thousand-year-old Sanskrit texts from India.
Until Koch’s discovery of the bacillus, being coughed on, as many inevitably were, would not have prompted much concern.
One can get a sense of how ubiquitous it was by noting how many famous figures of the day died from it. Elizabeth Barrett Browning, Johann Wolfgang von Goethe, Friedrich Schiller, Henry David Thoreau, all three Brontë sisters, Anton Chekhov, Washington Irving, Guy de Maupassant, Edgar Allan Poe, Sir Walter Scott . . . The toll is so vast that there’s an entire Wikipedia entry devoted to the subject.
The slow, wasting nature of the disease made it fodder for romanticists and drove the plots of operas such as La Traviata and La Bohème.
Nineteenth-century civilization seems to have been custom-designed for a microbe such as tuberculosis. It was, of course, the great age of industrialization, when people were pouring into European and American cities for work. To say they lived on top of one another is to be fairly literal: The average household size in 1850s England was nearly seven persons, more than double the average today.
Measles and pertussis, meanwhile, can be extremely contagious, each commonly with an R0 over 15. (This is one reason public health officials are so adamant about vaccinating against these diseases: They must be kept out of a population altogether, since once they take hold, they’re extremely difficult to extinguish.)
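The link between R0 and vaccination can be made concrete with the standard epidemiological rule of thumb (not from the text itself): the fraction of a population that must be immune to halt sustained spread is 1 − 1/R0, assuming a homogeneously mixing population. A minimal sketch in Python:

```python
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of a population that must be immune to stop sustained
    spread, per the standard 1 - 1/R0 rule of thumb (assumes a
    homogeneously mixing population)."""
    if r0 <= 1:
        return 0.0  # with R0 <= 1, an outbreak dies out on its own
    return 1.0 - 1.0 / r0

# With an R0 of 15, as for measles or pertussis, over 93 percent of
# the population must be immune:
print(round(herd_immunity_threshold(15), 3))   # 0.933
# Compare a flu-like R0 of 1.5, where about a third suffices:
print(round(herd_immunity_threshold(1.5), 3))  # 0.333
```

This is why a disease with an R0 over 15 leaves almost no margin: even modest gaps in vaccine coverage put the population below the threshold, and the disease takes hold.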
The hereditary theory for TB dated back to Hippocrates, who noted that “consumptives beget consumptives.” It would prove an exceptionally hard notion to shake, insofar as the disease did spread through families, with the onset between family members often occurring generationally, delayed by years or decades. What’s more, the symptoms of consumption (lassitude, lack of appetite, emaciation) seemed to endorse the notion that a consumptive had inherited an overall physical weakness that culminated in the disease, rather than the other way around.
His complaint—that Koch, who had gotten so much glory, was getting still more—is known today as the Matthew effect. The term was coined by Robert Merton in a 1968 paper in Science: “The Matthew effect consists in the accruing of greater increments of recognition for particular scientific contributions to scientists of considerable repute and the withholding of such recognition from scientists who have not yet made their mark.” (The term gets its name from a passage in the Gospel of Matthew: “For unto every one that hath shall be given, and he shall have abundance: but from him that hath not
...more
In fact, Balas and Boren found an average of a seventeen-year delay between the original research establishing a best practice and the time when patients would be routinely treated accordingly.
Homeopathy, the sham practice of using that which causes disease to cure it, was also booming in the 1800s.
In 1867 the law was extended to all children fourteen years and younger and assigned penalties for refusal. Public antipathy to this law was widespread, fed by the perception that the state was exceeding its authority and invading citizens’ homes.

