Kindle Notes & Highlights
Read between April 10 and April 13, 2023
In July 1969, as US astronauts made their final preparations to land on the moon, the Soviet minister of energy and electrification called for an aggressive expansion of nuclear construction. He set ambitious targets for a network of new plants across the European part of the Soviet Union, with giant, mass-produced reactors that would be built from the Gulf of Finland to the Caspian Sea.
The USSR was buckling under the strain of decades of central planning, fatuous bureaucracy, massive military spending, and endemic corruption—the start of what would come to be called the Era of Stagnation.
The quality of workmanship at all levels of Soviet manufacturing was so poor that building projects throughout the nation’s power industry were forced to incorporate an extra stage known as “preinstallation overhaul.” Upon delivery from the factory, each piece of new equipment—transformers, turbines, switching gear—was stripped down to the last nut and bolt, checked for faults, repaired, and then reassembled according to the original specifications, as it should have been in the first place. Only then could it be safely installed. Such wasteful duplication of labor added months of delays and…
Behind all the catastrophic failures of the USSR during the Era of Stagnation—beneath the kleptocratic bungling, the nepotism, the surly inefficiencies, and the ruinous waste of the planned economy—lay the monolithic power of the Communist Party. The Party had originated as a single faction among those grappling for power in Russia following the Revolution of 1917, ostensibly to represent the will of the workers, but quickly establishing control of a single-party state—intended to lead the proletariat toward “True Communism.”
Distinct from mere Socialism, True Communism was the Marxist utopia: “a classless society that contains limitless possibilities for human achievement,” an egalitarian dream of self-government by the people.
Decades later, the Party had established its own rigid hierarchy of personal patronage and held the power of appointment over an entire class of influential positions, known collectively as the nomenklatura.
Advancement in many political, economic, and scientific careers was granted only to those who repressed their personal opinions, avoided conflict, and displayed unquestioning obedience to those above them.
Lies and deception were endemic to the system, trafficked in both directions along the chain of management: those lower down passed up reports to their superiors packed with falsified statistics and inflated estimates, of unmet goals triumphantly reached, unfulfilled quotas heroically exceeded.
Eventually the supply problems of the centrally planned economy became so chronic that crops rotted in the fields, and Soviet fishermen watched catches putrefy in their nets, yet the shelves of the Union’s grocery stores remained bare.
When Units Five and Six of the Chernobyl station came online in 1988, Brukhanov would preside over the largest nuclear power complex on earth.
The USSR, hopelessly backward in developing computer technology, lacked simulators with which to train its nuclear engineers, so the young engineers’ work at Chernobyl would be their first practical experience in atomic power.
Mikhail Gorbachev would preside as leader of the Soviet Union. Gorbachev had assumed power in March 1985, ending the long succession of zombie apparatchiks whose declining health, drunkenness, and senility had been concealed from the public by squadrons of increasingly desperate minders. At fifty-four, Gorbachev seemed young and dynamic and found an enthusiastic audience in the West.
He announced plans for economic reorganization—perestroika—and, at the climax of the Party congress in March 1986, talked of the need for glasnost, or open government. A dedicated Socialist, Gorbachev believed that the USSR had lost its way but could be led to the utopia of True Communism by returning to the founding principles of Lenin.
The Era of Stagnation had fomented a moral decay in the Soviet workplace and a sullen indifference to individual responsibility, even in the nuclear industry. The USSR’s economic utopianism did not recognize the existence of unemployment, and overstaffing and absenteeism were chronic problems.
In 1905 Albert Einstein overturned these ideas. He suggested that if atoms could be somehow torn apart, the process would convert their tiny mass into a relatively enormous release of energy. He defined the theory with an equation: the energy released would be equal to the amount of mass lost, multiplied by the speed of light squared: E = mc².
When the nuclei split, their neutrons could fly away at great speed, smashing into other nearby atoms, causing their nuclei to split in turn, releasing even more energy. If enough uranium atoms were gathered in the correct configuration—forming a critical mass—this process could begin sustaining itself, with one atom’s neutrons splitting the nucleus of another, sending more neutrons into a collision course with further nuclei. As it went critical, the resulting chain reaction of splitting atoms—nuclear fission—would liberate unimaginable quantities of energy.
The bomb itself was extremely inefficient: just one kilogram of the uranium underwent fission, and only seven hundred milligrams of mass—the weight of a butterfly—was converted into energy. But it was enough to obliterate an entire city in a fraction of a second.
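The scale of that conversion follows directly from Einstein’s equation. A small illustrative calculation, using the seven hundred milligrams from the passage above and standard physical constants (the TNT conversion factor is not from the book):

```python
# Energy released by converting ~700 mg of mass, via E = m * c^2.
# The mass figure comes from the passage above; c and the TNT
# equivalence factor are standard physical constants.

c = 3.0e8         # speed of light, in m/s
mass_kg = 0.0007  # roughly 700 milligrams, the weight of a butterfly

energy_joules = mass_kg * c**2
kilotons_tnt = energy_joules / 4.184e12  # 1 kiloton of TNT = 4.184e12 J

print(f"{energy_joules:.2e} J")  # ~6.3e13 joules
print(f"{kilotons_tnt:.1f} kt")  # ~15 kilotons of TNT
```

Fifteen kilotons is consistent with the commonly cited yield of the Hiroshima bomb: a butterfly’s weight of matter, annihilated, leveled a city.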
Radiation is produced by the disintegration of unstable atoms.
This dynamic process of nuclear decay is radioactivity; the energy it releases, as atoms shed neutrons in the form of waves or particles, is radiation.
The granite used to build the US Capitol is so radioactive that the building would fail federal safety codes regulating nuclear power plants. All living tissue is radioactive to some degree: human beings, like bananas, emit radiation because both contain small amounts of the radioisotope potassium 40; muscle contains more potassium 40 than other tissue, so men are generally more radioactive than women. Brazil nuts, with a thousand times the average concentration of radium of any organic product, are the world’s most radioactive food.
Radiation is invisible and has neither taste nor smell. Although it’s yet to be proved that exposure to any level of radiation is entirely safe, it becomes manifestly dangerous when the particles and waves it gives off are powerful enough to transform or break apart the atoms that make up the tissues of living organisms. This high-energy radiance is ionizing radiation.
Ionizing radiation takes three principal forms: alpha particles, beta particles, and gamma rays.
Alpha particles are relatively large, heavy, and slow moving and cannot penetrate the skin; even a sheet of paper can stop them.
Beta particles are smaller and faster moving than alpha particles and can penetrate more deeply into living tissue, causing visible burns on the skin and lasting genetic damage.
Gamma rays—high-frequency electromagnetic waves traveling at the speed of light—are the most energetic of all. They can traverse large distances, penetrate anything short of thick pieces of concrete or lead, and destroy electronics. Gamma rays pass straight through a human being without slowing down, smashing through cells like a fusillade of microscopic bullets.
Severe exposure to all ionizing radiation results in acute radiation syndrome (ARS), in which the fabric of the human body is unpicked, rearranged, and destroyed at the most minute levels.
…drunk by people who believed radioactivity gave them energy. In 1903 Marie and Pierre Curie had won the Nobel Prize for the discovery of polonium and radium—an alpha-particle emitter, roughly a million times more radioactive than uranium—which they extracted from metric tonnes of viscous, tarry ore in their Paris laboratory.
More than eighty years later, Curie’s laboratory notes remain so radioactive that they are kept in a lead-lined box.
In the watch factories of New Jersey, Connecticut, and Illinois, the Radium Girls were trained to lick the tips of their brushes into a fine point before dipping them into pots of radium paint. When the jaws and skeletons of the first girls began to rot and disintegrate, their employers suggested they were suffering from syphilis. A successful lawsuit revealed that their managers had understood the risks of working with radium and yet done everything they could to conceal the truth from their employees. It was the first time the public learned the hazards of ingesting radioactive material.
The biological effect of radiation on the human body would eventually be measured in rem (roentgen equivalent man) and determined by a complicated combination of factors: the type of radiation; the duration of total exposure; how much of it penetrates the body, and where; and how susceptible those parts of the body are to radiation damage.
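The combination the passage describes can be sketched as a single multiplication: a dose equivalent in rem is the absorbed dose (in rad) scaled by a quality factor reflecting how damaging each type of radiation is to living tissue. A minimal sketch, using commonly cited approximate quality factors (roughly 1 for beta and gamma, 20 for alpha; the exact values are set by regulators and are an assumption here, not taken from the book):

```python
# Dose equivalent (rem) = absorbed dose (rad) * quality factor (Q).
# Quality factors below are commonly cited approximations, not
# figures from the passage above.
QUALITY_FACTORS = {
    "gamma": 1,   # penetrating electromagnetic waves
    "beta": 1,    # fast electrons
    "alpha": 20,  # heavy, highly ionizing helium nuclei
}

def dose_rem(absorbed_rad: float, radiation_type: str) -> float:
    """Convert an absorbed dose in rad to a dose equivalent in rem."""
    return absorbed_rad * QUALITY_FACTORS[radiation_type]

# The same absorbed dose is far more damaging as alpha radiation
# (when the emitter is inside the body) than as gamma rays.
print(dose_rem(10, "gamma"))  # 10 rem
print(dose_rem(10, "alpha"))  # 200 rem
```

This is why ingesting an alpha emitter, as the Radium Girls did, was so catastrophic even though alpha particles cannot penetrate skin from outside.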
The survivors of the atom bomb attacks on Hiroshima and, three days later, Nagasaki provided the first opportunity to study the effects of acute radiation syndrome on a large number of people.
Of those who lived through the initial explosion in Nagasaki, thirty-five thousand died within twenty-four hours; those suffering from ARS lost their hair within one or two weeks, and then experienced bloody diarrhea before succumbing to infection and high fever. Another thirty-seven thousand died within three months. A similar number survived for longer but, after another three years, developed leukemia; by the end of the 1940s, the disease would be the first cancer linked to radiation.
As profound and terrible as exposure to ionizing radiation might prove for human beings, it’s rarely accompanied by any detectable sensation. A person might be bathed in enough gamma rays to be killed a hundred times over without feeling a thing.
On August 21, 1945, two weeks after the bomb was dropped on Hiroshima, Harry K. Daghlian Jr., a twenty-four-year-old physicist on the Manhattan Project, was conducting an after-hours experiment in Los Alamos, New Mexico, when his hand slipped. The test assembly he had built—a ball of plutonium surrounded by tungsten carbide bricks—went critical.
Twenty-five days later, Daghlian slipped into a coma from which he never awoke—the first person in history to die accidentally from close exposure to nuclear fission.
The first nuclear reactor ever built, assembled by hand beneath the bleachers of the University of Chicago’s disused football field in 1942, was the anvil of the Manhattan Project, the essential first step in creating the fissile material needed to forge the world’s first atomic weapon.
From the start, the Soviet nuclear project was governed by principles of ruthless expedience and paranoid secrecy.
It was not until the end of 1952 that the government signaled its commitment to nuclear power by naming a new design institute dedicated to creating new reactors: the Scientific Research and Design Institute of Energy Technology, known by its Russian acronym NIKIET.
The following year, the USSR tested its first thermonuclear device—a hydrogen bomb, a thousand times more destructive than the atom bomb—and both emerging superpowers became theoretically capable of wiping out humanity entirely. Even Kurchatov was shaken by the power of the new weapon he had created, which had turned the surface of the earth to glass for five kilometers around ground zero.
To the Soviet people, still rebuilding amid the devastation of World War II, the Obninsk reactor showed how the USSR could technologically lead the world in a way that benefited ordinary citizens, bringing heat and light into their homes.
But to generate power steadily inside a nuclear reactor, the behavior of the neutrons must be artificially controlled, to ensure that the chain reaction stays constant and the heat of fission can be harnessed to create electricity.
Should each fission fail to create as many neutrons as the one before, the reactor becomes subcritical, the chain reaction slows and eventually ceases, and the reactor shuts down. But if each generation produces more than one fission, the chain reaction could begin to grow too quickly toward a potentially uncontrollable supercriticality and a sudden and massive release of energy similar to that in a nuclear weapon.
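The difference between these regimes reduces to one number, the effective multiplication factor k: the average number of new fissions each fission causes in the next generation. A toy model of the neutron population (the k values, starting population, and generation count are illustrative, not reactor data):

```python
# Toy chain-reaction model: n[i+1] = n[i] * k, where k is the
# effective neutron multiplication factor. All values illustrative.

def neutron_population(k: float, start: float = 1000, generations: int = 10):
    """Track a neutron population across successive fission generations."""
    population = [start]
    for _ in range(generations):
        population.append(population[-1] * k)
    return population

subcritical = neutron_population(k=0.9)    # dies away: reactor shuts down
critical = neutron_population(k=1.0)       # steady: controlled power output
supercritical = neutron_population(k=1.1)  # grows every generation

print(round(subcritical[-1]))    # the reaction fades out
print(round(supercritical[-1]))  # the reaction runs away
```

A reactor operates by holding k at exactly 1; the control rods described below exist to nudge k above or below that line on demand.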
Withdrawing the control rods farther, or in greater numbers, increases reactivity and thus the amount of heat and power generated, while inserting them farther has the opposite effect.
To generate electricity, the uranium fuel inside a reactor must become hot enough to turn water into steam but not so hot that the fuel itself starts to melt. To prevent this, in addition to control rods and a neutron moderator, the reactor requires a coolant to remove excess heat.
But when it finally went critical in June 1954, Atom Mirny-1 retained another profound drawback the scientists never fixed: a phenomenon known as the positive void coefficient.
A negative void coefficient, by contrast, acts like a dead man’s handle on the reactor, a safety feature of the water-water designs common in the West.
This positive void coefficient remained a fatal defect at the heart of Atom Mirny-1 and overshadowed the operation of every Soviet water-graphite reactor that followed.
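The danger can be sketched as a feedback loop: steam voids in the coolant change reactivity, reactivity changes power, and power changes how much coolant boils into voids. With a positive coefficient the loop amplifies a disturbance; with a negative one it damps it. A deliberately simplified model (every coefficient here is illustrative, not real reactor physics):

```python
# Simplified void-coefficient feedback sketch. Power responds to
# reactivity, and steam voids feed reactivity back into the loop.
# All coefficients are illustrative, not actual reactor parameters.

def run_reactor(void_coefficient: float, steps: int = 20) -> float:
    """Return relative power after an initial steam-void disturbance."""
    power = 1.0   # normalized power level
    voids = 0.1   # initial steam-void disturbance in the coolant
    for _ in range(steps):
        reactivity = void_coefficient * voids
        power *= 1.0 + reactivity  # power follows reactivity
        voids = 0.1 * power        # more power boils more coolant
    return power

positive = run_reactor(void_coefficient=+0.5)  # voids add reactivity
negative = run_reactor(void_coefficient=-0.5)  # voids remove reactivity

print(positive > 1.0)  # True: the disturbance amplifies itself
print(negative < 1.0)  # True: the disturbance damps itself
```

In the negative case, boiling away coolant removes reactivity and the reactor throttles itself back; in the positive case, the same boiling adds reactivity, and power and voids chase each other upward.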
Calder Hall had been constructed to manufacture plutonium for Britain’s own nascent atom bomb program. What electricity it did produce was a costly fig leaf.
On October 9 the two thousand tons of graphite in Windscale Pile Number One caught fire. It burned for two days, releasing radiation across the United Kingdom and Europe and contaminating local dairy farms with high levels of iodine 131.
Under Khrushchev, Soviet scientists began to enjoy unprecedented autonomy, and the public—encouraged to trust unquestioningly in the new gods of science and technology—were kept in the dark.