Kindle Notes & Highlights
It is tempting to write the history of technology through products: the wheel; the microscope; the airplane; the Internet. But it is more illuminating to write the history of technology through transitions: linear motion to circular motion; visual space to subvisual space; motion on land to motion in air; physical connectivity to virtual connectivity.
The paucity of medicines has one principal reason: specificity. Nearly every drug works by binding to its target and enabling or disabling it—turning molecular switches on or off. To be useful, a drug must bind to its switches—but to only a selected set of switches; an indiscriminate drug is no different from a poison. Most molecules can barely achieve this level of discrimination—but proteins have been designed explicitly for this purpose. Proteins, recall, are the hubs of the biological world. They are the enablers and the disablers, the machinators, the regulators, the gatekeepers, the
…
Proteins are thus poised to be some of the most potent and most discriminating medicines in the pharmacological world. But to make a protein, one needs its gene—and here recombinant DNA technology provided the crucial missing stepping-stone.
The use of recombinant DNA to produce proteins thus marked a transition not just between one gene and one medicine, but between genes and a novel universe of drugs.
Many genes tug and push the pressure of blood in the body, like a tangle of strings controlling a puppet’s arms. If you change the length of any of these individual strings, you change the configuration of the puppet.
By itself, then, a mutant, or a mutation, can provide no real information about a disease or disorder. The definition of disease rests, rather, on the specific disabilities caused by an incongruity between an individual’s genetic endowment and his or her current environment—between a mutation, the circumstances of a person’s existence, and his or her goals for survival or success. It is not mutation that ultimately causes disease, but mismatch.
Nowhere, perhaps, was the increased visibility of “mutants” more evident than in that reliable barometer of American anxieties and fantasies—comic strips.
As the legal scholar Alexander Bickel described it, “The individual’s [i.e., mother’s] interest, here, overrides society’s interest in the first three months and, subject only to health regulations, also in the second; in the third trimester, society is preeminent.”
In 1973, a few months after Roe v. Wade, McKusick published a new edition of his textbook on medical genetics. In a chapter on the “prenatal detection of hereditary diseases,” Joseph Dancis, the pediatrician, wrote: In recent years the feeling has grown among both physicians and the general public that we must be concerned not simply with ensuring the birth of a baby, but one who will not be a liability to society, to its parents, or to itself. The “right to be born” is being qualified by another right: to have a reasonable chance of a happy and useful life. This shift in attitude is shown by,
…
Genetic illness and genetic wellness were not discrete neighboring countries; rather, wellness and illness were continuous kingdoms, bounded by thin, often transparent, borders.
In the history of science and technology too, breakthroughs seem to come in two fundamental forms. There are scale shifts—where the crucial advance emerges as a result of an alteration of size or scale alone (the moon rocket, as one engineer famously pointed out, was just a massive jet plane pointed vertically at the moon). And there are conceptual shifts—in which the advance arises because of the emergence of a radical new concept or idea.
That one of the most elemental diseases in human history happens to arise from the corruption of the two most elemental processes in biology is not a coincidence: cancer co-opts the logic of both evolution and heredity; it is a pathological convergence of Mendel and Darwin.
Venter and Smith launched the Haemophilus project in the winter of 1993. By July 1995, it was complete. “The final [paper] took forty drafts,” Venter later wrote. “We knew this paper was going to be historic, and I was insistent that it be as near perfect as possible.” It was a marvel: the Stanford geneticist Lucy Shapiro wrote about how members of her lab had stayed up all night reading the H. flu genome, “thrilled by the first glimpse at the complete gene content of a living species.” There were genes to generate energy, genes to make coat proteins, genes to manufacture proteins, to regulate
…
As Richard Dawkins puts it, “All animals probably have a relatively similar repertoire of proteins that need to be ‘called forth’ at any particular time. . . .” The difference between a more complex organism and a simpler one, “between a human and a nematode worm, is not that humans have more of those fundamental pieces of apparatus, but that they can call them into action in more complicated sequences and in a more complicated range of spaces.” It was not the size of the ship, yet again, but the way the planks were configured. The fly genome was its own Delphic boat.
Every great scientific paper is a conversation with its own history—and the opening paragraphs of the Nature paper were written with full cognizance of its moment of reckoning:
It encodes about 20,687 genes in total—only 1,796 more than worms, 12,000 fewer than corn, and 25,000 fewer genes than rice or wheat. The difference between “human” and “breakfast cereal” is not a matter of gene numbers, but of the sophistication of gene networks. It is not what we have; it is how we use it.
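Read literally, the comparison implies these approximate gene counts; this is a back-of-the-envelope tally from the figures quoted above, not a set of independent estimates:

\[
\text{worm} \approx 20{,}687 - 1{,}796 \approx 18{,}900, \qquad
\text{corn} \approx 20{,}687 + 12{,}000 \approx 32{,}700, \qquad
\text{rice or wheat} \approx 20{,}687 + 25{,}000 \approx 45{,}700 .
\]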
Every cell possesses a subcellular structure called a mitochondrion that is used to generate energy. Mitochondria have their own mini-genome, with only thirty-seven genes, roughly one six-hundredth the number of genes on human chromosomes. (Some scientists propose that mitochondria originated from some ancient bacteria that invaded single-celled organisms. These bacteria formed a symbiotic alliance with the organism; they provided energy, but used the organism’s cellular environment for nutrition, metabolism, and self-defense. The genes lodged within mitochondria are left over from this ancient
…
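For scale, the two gene counts quoted in these highlights, about 20,687 nuclear genes against 37 mitochondrial genes, give

\[
\frac{37}{20{,}687} \approx \frac{1}{559} \approx \frac{1}{600},
\]

so the mitochondrial mini-genome carries on the order of one six-hundredth as many genes as the chromosomes do.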
What is certain is that every perilous ocean-crossing left hardly any survivors—perhaps as few as six hundred men and women. Europeans, Asians, Australians, and Americans are the descendants of these drastic bottlenecks, and this corkscrew of history too has left its signature in our genomes. In a genetic sense, nearly all of us who emerged out of Africa, gasping for land and air, are even more closely yoked than previously imagined. We were on the same boat, brother.
Some genes certainly vary sharply between racial or ethnic groups—sickle-cell anemia is an Afro-Caribbean and Indian disease, and Tay-Sachs disease has a much higher frequency in Ashkenazi Jews—but for the most part, the genetic diversity within any racial group dominates the diversity between racial groups—not marginally, but by an enormous amount. This degree of intraracial variability makes “race” a poor surrogate for nearly any feature: in a genetic sense, an African man from Nigeria is so “different” from another man from Namibia that it makes little sense to lump them into the same
…
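The claim about intraracial variability is usually stated as a variance partition. Commonly cited estimates, going back to Lewontin’s 1972 apportionment analysis, place the within-group share at very roughly 85–95 percent of total human genetic variation; the figures are offered here only as the standard ballpark, not as values from this book:

\[
V_{\text{total}} = V_{\text{within groups}} + V_{\text{between groups}}, \qquad
\frac{V_{\text{within groups}}}{V_{\text{total}}} \approx 0.85\text{–}0.95 .
\]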
For race and genetics, then, the genome is a strictly one-way street. You can use a genome to predict where X or Y came from. But, knowing where A or B came from, you can predict little about the person’s genome. Or: every genome carries a signature of an individual’s ancestry—but an individual’s racial ancestry predicts little about the person’s genome. You can sequence DNA from an African-American man and conclude that his ancestors came from Sierra Leone or Nigeria. But if you encounter a man whose great-grandparents came from Nigeria or Sierra Leone, you can say little about the features of
…
Galton and his disciples, we might recall, were obsessed with the measurement of intelligence. Between 1890 and 1910, dozens of tests were devised in Europe and America that purported to measure intelligence in some unbiased and quantitative manner. In 1904, Charles Spearman, a British statistician, noted an important feature of these tests: people who did well in one test generally tended to do well in another test. Spearman hypothesized that this positive correlation existed because all the tests were obliquely measuring some mysterious common factor. This factor, Spearman proposed, was not
…
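A minimal numerical sketch of the pattern Spearman noticed, using simulated scores rather than the historical tests (the sample size, loadings, and random seed below are arbitrary choices for illustration): if several tests all tap a single latent factor, every pairwise correlation comes out positive and one component captures an outsized share of the variance.

```python
# Toy illustration of Spearman's "positive manifold": simulate test scores that
# share one latent factor, then check that all pairwise correlations are positive
# and that the largest principal component carries an outsized share of variance.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_tests = 1000, 6

g = rng.normal(size=n_people)                    # hypothetical common factor
loadings = rng.uniform(0.5, 0.9, size=n_tests)   # each test taps g to a different degree
noise = rng.normal(size=(n_people, n_tests))
scores = g[:, None] * loadings + noise           # observed test scores

corr = np.corrcoef(scores, rowvar=False)
off_diag = corr[~np.eye(n_tests, dtype=bool)]
print("all off-diagonal correlations positive:", bool((off_diag > 0).all()))

# The largest eigenvalue of the correlation matrix, divided by the number of
# tests, is the share of total variance explained by the first component.
eigvals = np.linalg.eigvalsh(corr)               # ascending order
print("share of variance on the first component: %.2f" % (eigvals[-1] / n_tests))
```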
In the 1950s, Americans commonly listed their IQs on their résumés, submitted the results of a test for a job application, or even chose their spouses based on the test. IQ scores were pinned on the babies who were on display in Better Babies contests (although how IQ was measured in a two-year-old remained mysterious).
General intelligence (g) originated as a statistical correlation between tests given under particular circumstances to particular individuals. It morphed into the notion of “general intelligence” because of a hypothesis concerning the nature of human knowledge acquisition. And it was codified into “IQ” to serve the particular exigencies of war. In a cultural sense, the definition of g was an exquisitely self-reinforcing phenomenon: those who possessed it, anointed as “intelligent” and given the arbitration of the quality, had every incentive in the world to propagate its definition. Richard
…
Intelligence, in short, is heritable (i.e., influenced by genes), but not easily inheritable (i.e., moved down intact from one generation to the next).
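In quantitative-genetics terms, the distinction can be written down. Heritability is a ratio of variances across a population, not a guarantee that any particular genetic combination is passed on intact; in the broad sense (and ignoring gene–environment interaction),

\[
H^2 = \frac{V_G}{V_P} = \frac{V_G}{V_G + V_E},
\]

where \(V_G\) is the phenotypic variance attributable to genetic differences and \(V_E\) the variance attributable to environment. A trait can have a high \(H^2\) and still fail to be “inheritable” in the everyday sense, because the specific combination of variants behind a parent’s phenotype is reshuffled in every generation.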
The Harvard sociologist Orlando Patterson, in the slyly titled “For Whom the Bell Curves,” reminded readers that the frayed legacies of slavery, racism, and bigotry had deepened the cultural rifts between whites and African-Americans so dramatically that biological attributes across races could not be compared in a meaningful way. Indeed, the social psychologist Claude Steele demonstrated that when black students are asked to take an IQ test under the pretext that they are being tested to try out a new electronic pen, or a new way of scoring, they perform well. Told that they are being tested
…
And therein lies the rub. The tricky thing about the notion of g is that it pretends to be a biological quality that is measurable and heritable, while it is actually strongly determined by cultural priorities. It is—to simplify it somewhat—the most dangerous of all things: a meme masquerading as a gene.
What had converged between my mother and her sister, I began to realize, was not personality but its tendency—its first derivative, to borrow a mathematical term. In calculus, the first derivative of a point is not its position in space, but its propensity to change its position; not where an object is, but how it moves in space and time. This shared quality, unfathomable to some, and yet self-evident to a four-year-old, was the lasting link between my mother and her twin. Tulu and Bulu were no longer recognizably identical—but they shared the first derivative of identity.
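The borrowed calculus term, made explicit: the first derivative measures not where a quantity stands but how fast it is changing,

\[
\frac{dx}{dt} = \lim_{\Delta t \to 0} \frac{x(t+\Delta t) - x(t)}{\Delta t},
\]

which is why it serves here as a figure for a shared tendency rather than a shared state.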
C had been able to learn many of the essential features of her acquired gender through social performance and mimesis, but she couldn’t unlearn the psychosexual drives of her genetic self.
At the top of the cascade, nature works forcefully and unilaterally. Up top, gender is quite simple—just one master gene flicking on and off. If we learned to toggle that switch—by genetic means or with a drug—we could control the production of men or women, and they would emerge with male versus female identity (and even large parts of anatomy) quite intact. At the bottom of the network, in contrast, a purely genetic view fails to perform; it does not provide a particularly sophisticated understanding of gender or its identity. Here, in the estuarine plains of crisscrossing information,
…
When two brothers, estranged at birth and separated by geographic and economic continents, were brought to tears by the same Chopin nocturne at night, they seemed to be responding to some subtle, common chord struck by their genomes.
Two men—both renamed Jim after adoption—had been separated from each other thirty-seven days after birth and had grown up eighty miles apart in an industrial belt in northern Ohio. Both had struggled through school. “Both drove Chevrolets, both chain-smoked Salems, and both loved sports, especially stock-car racing, but both disliked baseball. . . . Both Jims had married women named Linda. Both had owned dogs that they had named Toy. . . . One had a son named James Allan; the other’s son was named James Alan. Both Jims had undergone vasectomies, and both had slightly high blood pressure. Each
…
The picture that emerged from the Minnesota study was not that reared-apart twins were identical, but that they shared a powerful tendency toward similar or convergent behaviors. What was common to them was not identity, but its first derivative.
Genes had been linked to temperaments before: the extraordinary, otherworldly sweetness of children with Down syndrome had long been noted by psychologists, and other genetic syndromes had been linked with outbursts of violence and aggression. But Ebstein was not interested in the outer bounds of pathology; he was interested in normal variants of temperament. Extreme genetic changes could evidently cause extreme variants of temperament. But were there “normal” gene variants that influenced normal subtypes of personality?
The precise nature of stimulation varies from one context to the next. It can produce the most sublime qualities in humans—exploratory drive, passion, and creative urgency—but it can also spiral toward impulsivity, addiction, violence, and depression.
The most provocative human studies have cataloged the geographic distribution of the D4DR variant. Nomadic and migratory populations have higher frequencies of the variant gene. And the farther one moves from the original site of human dispersal from Africa, the more frequently the variant seems to appear as well. Perhaps the subtle drive caused by the D4DR variant drove the “out-of-Africa” migration, by throwing our ancestors out to sea. Many attributes of our restless, anxious modernity, perhaps, are products of a restless, anxious gene.
A predisposition cannot be confused with the disposition itself: one is a statistical probability; the other, a concrete reality. It is as if genetics can nearly beat its way to the door of human form, identity, or behavior—but it cannot traverse the final mile.
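A hypothetical illustration of that gap, with numbers invented for the example rather than drawn from any study: suppose a variant doubles a 1 percent baseline lifetime risk. Then

\[
P(\text{disease} \mid \text{variant}) = 2 \times 0.01 = 0.02, \qquad
P(\text{no disease} \mid \text{variant}) = 0.98,
\]

a doubled predisposition that still leaves the disposition itself absent ninety-eight times out of a hundred.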
It is a testament to the unsettling beauty of the genome that it can make the real world “stick.” Our genes do not keep spitting out stereotypical responses to idiosyncratic environments: if they did, we too would devolve into windup automatons. Hindu philosophers have long described the experience of “being” as a web—jaal. Genes form the threads of the web; the detritus that sticks is what transforms every individual web into a being. There is an exquisite precision in that mad scheme. Genes must carry out programmed responses to environments—otherwise, there would be no conserved form. But
…
In the 1980s, however, a more intriguing pattern emerged: when the children born to women who were pregnant during the famine grew up, they too had higher rates of obesity and heart disease. This finding too might have been anticipated. Exposure to malnourishment in utero is known to cause changes in fetal physiology. Nutrient-starved, a fetus alters its metabolism to sequester higher amounts of fat to defend itself against caloric loss, resulting, paradoxically, in late-onset obesity and metabolic disarray. But the oddest result of the Hongerwinter study would take yet another generation to
…
Not in the actual sequence of genes: if you sequence the genomes of a pair of identical twins every decade for fifty years, you get the same sequence over and over again. But if you sequence the epigenomes of a pair of twins over the course of several decades, you find substantial differences: the pattern of methyl groups attached to the genomes of blood cells or neurons is virtually identical between the twins at the start of the experiment, begins to diverge slowly over the first decade, and becomes substantially different over fifty years.
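A toy simulation, offered as a sketch for illustration rather than the twin studies’ actual method, of how two initially identical epigenomes drift apart: if each site independently flips its methylation state with a small yearly probability in each twin, discordance accumulates slowly over the first decade and becomes substantial by fifty years (the site count and flip rate below are arbitrary).

```python
# Toy model of epigenetic drift between twins: identical starting methylation
# patterns, independent stochastic flips each "year", growing discordance.
import numpy as np

rng = np.random.default_rng(1)
n_sites, flip_rate, years = 100_000, 0.002, 50

twin_a = rng.random(n_sites) < 0.5    # shared starting methylation pattern
twin_b = twin_a.copy()

for year in range(1, years + 1):
    twin_a ^= rng.random(n_sites) < flip_rate   # independent flips in twin A
    twin_b ^= rng.random(n_sites) < flip_rate   # independent flips in twin B
    if year in (1, 10, 50):
        discordance = np.mean(twin_a != twin_b)
        print(f"year {year:2d}: {discordance:.1%} of sites differ")
```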
Chance events—injuries, infections, infatuations; the haunting trill of that particular nocturne; the smell of that particular madeleine in Paris—impinge on one twin and not the other. Genes are turned “on” and “off” in response to these events, and epigenetic marks are gradually layered above genes. Every genome acquires its own wounds, calluses, and freckles—but these wounds and calluses “exist” only because they have been written into genes. Even the environment signals its presence through the genome. If “nurture” exists, it is only by virtue of its reflection in “nature.” That idea
…
The story comes with a twist. One of the four genes used by Yamanaka to reverse cellular fate is called c-myc. Myc, the rejuvenation factor, is no ordinary gene: it is one of the most forceful regulators of cell growth and metabolism known in biology. Activated abnormally, it can certainly coax an adult cell back into an embryo-like state, thereby enabling Yamanaka’s cell-fate reversal experiment (this function requires the collaboration of the three other genes found by Yamanaka). But myc is also one of the most potent cancer-causing genes known in biology; it is also activated in leukemias
…
Most epigenetic “memories” are the consequence of ancient evolutionary pathways, and cannot be confused with our longing to affix desirable legacies on our children.
Despite Menelaus’s admonitions, the blood of our fathers is lost in us—and so, fortunately, are their foibles and sins. It is an arrangement that we should celebrate more than rue. Genomes and epigenomes exist to record and transmit likeness, legacy, memory, and history across cells and generations. Mutations, the reassortment of genes, and the erasure of memories counterbalance these forces, enabling unlikeness, variation, monstrosity, genius, and reinvention—and the refulgent possibility of new beginnings, generation upon generation.
The universe seeks equilibriums; it prefers to disperse energy, disrupt organization, and maximize chaos. Life is designed to combat these forces. We slow down reactions, concentrate matter, and organize chemicals into compartments; we sort laundry on Wednesdays. “It sometimes seems as if curbing entropy is our quixotic purpose in the universe,” James Gleick wrote. We live in the loopholes of natural laws, seeking extensions, exceptions, and excuses. The laws of nature still mark the outer boundaries of permissibility—but life, in all its idiosyncratic, mad weirdness, flourishes by reading
…
We now know that cells have ancient detectors that recognize viral genes and stamp them with chemical marks, like cancellation signs, to prevent their activation.
Until recently, the capacity to predict fate from the human genome was limited by two fundamental constraints. First, most genes, as Richard Dawkins describes them, are not “blueprints” but “recipes.” They do not specify parts, but processes; they are formulas for forms.
The history of human genetics has reminded us, again and again, that “knowing apart” often begins with an emphasis on “knowing,” but often ends with an emphasis on “parting.” It is not a coincidence that the vast anthropometric projects of Nazi scientists—the obsessive measurement of jaw sizes, head shapes, nose lengths, and heights—were also once legitimized as attempts to “know humans apart.”
Even so, this triangle of limits—high-penetrance genes, extraordinary suffering, and noncoerced, justifiable interventions—has proved to be a useful guideline for acceptable forms of genetic interventions.
In both cases, it seems, the short variant encodes a hyperactive “stress sensor” for psychic susceptibility, but also a sensor most likely to respond to an intervention that targets the susceptibility. The most brittle or fragile forms of psyche are the most likely to be distorted by trauma-inducing environments—but are also the most likely to be restored by targeted interventions. It is as if resilience itself has a genetic core: some humans are born resilient (but are less responsive to interventions), while others are born sensitive (but more likely to respond to changes in their
…
The idea of a “resilience gene” has entranced social engineers. Writing in the New York Times in 2014, the behavioral psychologist Jay Belsky argued, “Should we seek to identify the most susceptible children and disproportionately target them when it comes to investing scarce intervention and service dollars? I believe the answer is yes.” “Some children are—in one frequently used metaphor—like delicate orchids,” Belsky wrote, “they quickly wither if exposed to stress and deprivation, but blossom if given a lot of care and support. Others are more like dandelions; they prove resilient to the
…

