Kindle Notes & Highlights
Read between December 21 and December 31, 2018
General Curtis LeMay had created an institutional culture at the Strategic Air Command that showed absolutely no tolerance for mistakes. People were held accountable not only for their behavior but for their bad luck. “To err is human,” everyone at the command had been told, “to forgive is not SAC policy.”
I didn’t hate them because they were dumb, I didn’t hate them because they had spilled our blood for nothing, I hated them because of their arrogance . . . because they had convinced themselves that they actually knew what they were doing and that we were too minor to understand the “Big Picture.” I hated my own generals, because they covered up their own gutless inability to stand up to the political masters in Washington and say, “Enough. This is bullshit. Either we fight or we go home.”
Meyer told the Milwaukee Journal that almost every one of the more than two hundred men in his unit regularly smoked hashish. They were often high while handling secret documents and nuclear warheads.
The drug use at Homestead was suspected after a fully armed Russian MiG-17 fighter plane, flown by a Cuban defector, landed there unchallenged, while Air Force One was parked on a nearby runway.
The Polaris base at Holy Loch, Scotland, helped turn the Cowal Peninsula into a center for drug dealing in Great Britain.
And the entire command-and-control system could be shut down by the electromagnetic pulse and the transient radiation effects of a nuclear detonation above the United States. Communications might be impossible for days after a Soviet attack.
President Nixon tried to end the Vietnam War by threatening the use of nuclear weapons, convinced that Eisenhower had employed a similar tactic to end the war in Korea. “I call it the Madman Theory, Bob,” Nixon told his chief of staff, H. R. Haldeman. “I want the North Vietnamese to believe that I’ve reached the point where I might do anything to stop the war.” The secretary of state, the secretary of defense, and the Joint Chiefs of Staff thought it was a bad idea. But Nixon and Kissinger thought the plan might work.
At a meeting of the National Security Council, Iklé expressed his opposition to launch on warning, calling it “accident-prone.” Secretary of State Kissinger disagreed, praising its usefulness as a deterrent. Kissinger felt confident that the command-and-control system could handle it and stressed that “the Soviets must never be able to calculate that you plan to rule out such an attack.”
The president was accompanied everywhere by a military aide carrying the “football”—a briefcase that held the SIOP Decisions Handbook, a list of secret command bunkers throughout the United States, and instructions on how to operate the Emergency Broadcast System. The SIOP Decisions Handbook outlined various attack options, using cartoonlike illustrations to convey the details quickly. It was known as the Black Book.
Eager to defend the civilian control of nuclear weapons from military encroachment, John F. Kennedy and Robert McNamara had fought hard to ensure that only the president could make the ultimate decision. But they hadn’t considered the possibility that the president might be clinically depressed, emotionally unstable, and drinking heavily—like Richard Nixon, during his final weeks in office.
President Carter was determined to end the arms race with the Soviet Union. And he knew more about nuclear weapons than any of his predecessors at the White House, except, perhaps, Eisenhower. Carter had attended the U.S. Naval Academy, served as an officer on submarines, and helped to design the first nuclear propulsion systems for the Navy.
The committee’s views were succinctly expressed in an essay by Richard Pipes, a history professor at Harvard and one of the group’s founders: “Why the Soviet Union Thinks It Could Fight and Win a Nuclear War.” The Soviets were violent, deceitful, authoritarian, and cunning, Pipes argued, and they’d already shown a willingness to commit mass murder on behalf of communism. The downfall of the United States now seemed within their grasp and would be pursued, regardless of the cost.
The window of vulnerability—like the bomber gap and the missile gap before it—provided a strong rationale for increased spending on defense. And like those other scares, it was based more on fear than on facts.
The missile would constantly be moved between twenty-three protective concrete shelters, like a pea in an immense shell game. The Soviet Union would never know which shelter housed a missile. The shelters would be a mile apart. Twenty-two of them would contain fake missiles—and those decoys would also be moved constantly by truck. If the scheme worked, the Soviets would have to use at least forty-six warheads to destroy a single MX missile.
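The forty-six-warhead figure above can be checked with a few lines of arithmetic. This is a sketch of my reading of how the number is reached (the two-warheads-per-shelter doubling is an assumption, not a claim from the source):

```python
# Rough arithmetic for the MX "shell game" basing scheme.
shelters = 23                 # one real missile hidden among 23 shelters
warheads_per_shelter = 2      # assumed: attacker doubles up on each aim point

# Because the attacker cannot tell which shelter holds the real missile,
# every shelter must be destroyed to guarantee killing that one missile.
warheads_needed = shelters * warheads_per_shelter
print(warheads_needed)        # 46
```

The deterrent value lies entirely in the attacker's uncertainty: the cost of destroying one missile scales with the number of indistinguishable shelters, not with the number of missiles.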
AT ABOUT ELEVEN O’CLOCK in the morning on November 9, 1979, the computers at the NORAD headquarters inside Cheyenne Mountain said that the United States was under attack. The huge screen in the underground command center at SAC headquarters showed that Soviet missiles had been launched from submarines off the West Coast. The same message was received by computers in the National Military Command Center at the Pentagon and the Alternate National Military Command Center at Site R inside Raven Rock Mountain. And then more missiles appeared on the screen, launched not only from submarines but from land-based missiles as well.
As the minutes passed without the arrival of Soviet warheads, it became clear that the United States wasn’t under attack. The cause of the false alarm was soon discovered. A technician had put the wrong tape into one of NORAD’s computers. The tape was part of a training exercise—a war game that simulated a Soviet attack on the United States. The computer had transmitted realistic details of the war game to SAC headquarters, the Pentagon, and Site R.
A couple of months after the false alarm, twenty-three security officers assigned to the Combat Operations Center inside Cheyenne Mountain were stripped of their security clearances. According to the Air Force Office of Special Investigations, the security force responsible for protecting the nerve center of America’s command-and-control system was using LSD, marijuana, cocaine, and amphetamines.
At about two thirty in the morning on June 3, 1980, Zbigniew Brzezinski, the president’s national security adviser, was awakened by a phone call from a staff member, General William E. Odom. Soviet submarines have launched 220 missiles at the United States, Odom said. This time a surprise attack wasn’t implausible.
Once again, NORAD’s computers and its early-warning sensors were saying different things. The problem was clearly in one of the computers, but it would be hard to find.
This time technicians found the problem: a defective computer chip in a communications device. NORAD had dedicated lines that connected the computers inside Cheyenne Mountain to their counterparts at SAC headquarters, the Pentagon, and Site R. Day and night, NORAD sent test messages to ensure that those lines were working. The test message was a warning of a missile attack—with zeros always inserted in the space showing the number of missiles that had been launched. The faulty computer chip had randomly put the number 2 in that space, suggesting that 2 missiles, 220 missiles, or 2,200 missiles had been launched.
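The design flaw in those test messages can be sketched in a few lines. The message format and function names below are illustrative assumptions, not NORAD's actual protocol; the point is that a heartbeat reusing the live attack-warning format, distinguished only by a zeroed count field, is one corrupted digit away from looking like a real attack:

```python
# Hypothetical sketch of the test-message flaw (format is an assumption).

def parse_warning(message: str) -> int:
    """Return the missile count carried in a warning message.

    Assumed format: "WARN:<count>" -- a live warning and a line-check
    heartbeat differ only in this count field.
    """
    return int(message.split(":")[1])

def is_attack(message: str) -> bool:
    # Zero missiles means "just a test"; anything else looks real.
    return parse_warning(message) > 0

heartbeat = "WARN:0000"   # routine line-check message
corrupted = "WARN:0200"   # one digit changed by a faulty chip

assert not is_attack(heartbeat)
assert is_attack(corrupted)   # the heartbeat now reads as a live warning
```

A safer design tags test traffic with a distinct message type (or protects the count field with a checksum), so that no single-bit corruption can promote a heartbeat into an attack report.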
With help from his wife, Barbara, and a local contractor, Peurifoy built a bomb shelter underneath the garage at the family home in Albuquerque. Other engineers at Sandia added bomb shelters to their houses, too.
The Fowler Letter’s only immediate effect was to raise the possibility that Glenn Fowler would lose his job.
During the late 1970s, a coded switch was finally placed in the control center of every SAC ballistic missile. It unlocked the missile, not the warhead. And as a final act of defiance, SAC demonstrated the importance of code management to the usefulness of any coded switch. The combination necessary to launch the missiles was the same at every Minuteman site: 00000000.
The misinformation placed them at greater risk. It was also a form of disrespect toward young servicemen and women who were already risking their lives. And it encouraged careless behavior around nuclear weapons. In many ways, denying the safety problems only made them worse.
THE HOSPITAL IN CONWAY REFUSED to admit the injured men, claiming that it lacked the authority to treat Air Force personnel.
The refusal to admit these injured young airmen, at four in the morning, about half an hour away from another hospital, seemed in keeping with the spirit of the entire night.
They knocked on the doors of the mobile homes and told people to leave at once. Despite the disturbing, early-morning sight of two men in battle fatigues and gas masks standing at the front door, most of the homeowners were grateful for the warning. But one man opened the door, pointed a handgun at them, and said, “I’m not going to leave.” They didn’t argue with him.
At the Redstone Arsenal in Huntsville, Alabama, Matthew Arnold was taught how to deactivate chemical and biological weapons. “Chlorine is your friend,” the instructor told the class. The principal ingredient in household bleach would render almost every deadly pathogen, nerve agent, and blister agent harmless.
He had learned how to defuse car bombs and biological weapons, to handle Broken Arrows and dismantle nuclear warheads. He was twenty years old.
The worst effects of the oxidizer would usually appear about five hours after exposure. Like the phosgene gas used as a chemical weapon during the First World War, the oxidizer could kill you in an extremely unpleasant way. It was known as “dry land drowning.”
At least two warheads and half a dozen missiles were damaged. A manufacturing defect or corrosion seemed the most likely explanation for the collapse of the telescoping arms.
But an Air Force investigation later found a different cause: maintenance crews had been goofing around with the load carts, out of sheer boredom, and using them to lift B-52 bombers off the ground.
At Carswell Air Force Base, someone on a loading crew had ignored a tech order and pulled a handle too hard in the cockpit of a B-52. Instead of opening the bomb bay doors, he’d inadvertently released a B-61 hydrogen bomb. It fell about seven feet and hit the runway. When members of the loading crew approached the weapon, they saw that its parachute pack had broken off—and that a red flag had appeared in a little window on the casing. The bomb was armed.
Arnold’s unit handled nuclear weapons all the time, and they rarely thought about the destructive force that could be unleashed. EOD technicians sat on nuclear weapons, casually leaned against them, used them as tables during lunch breaks.
Iklé considered the all-or-nothing philosophy of “assured destruction” to be profoundly immoral, a misnomer more accurately described as “assured genocide.” Aiming nuclear weapons at civilian populations threatened a “form of warfare universally condemned since the Dark Ages—the mass killing of hostages.”
Many of the enlisted men in the 308th thought the Air Force was scapegoating the little guys in order to hide problems with the Titan II and protect the top brass.
NINETEEN EIGHTY-THREE PROVED TO BE one of the most dangerous years of the Cold War. The new leader of the Soviet Union, Yuri Andropov, was old, paranoid, physically ill, and staunchly anti-American. A former head of the KGB, Andropov had for many years played a leading role in the suppression of dissent throughout the Soviet bloc.
An American missile defense system was unlikely to be effective against an all-out Soviet attack. It might, however, prove useful in destroying any Soviet missiles that survived an American first strike.
The Soviet general staff was alerted, and it was Petrov’s job to advise them whether the missile attack was real. Any retaliation would have to be ordered soon. Petrov decided it was a false alarm. An investigation later found that the missile launches spotted by the Soviet satellite were actually rays of sunlight reflected off clouds.
Able Archer 83 ended uneventfully on November 11—and NATO’s defense ministers were totally unaware that their command-and-control drill had been mistaken for the start of a third world war.
If the SRAMs were poorly maintained, simply dropping them on the ground from a height of five or six feet could make them explode—or take off. “The worst probable consequence of continuous degradation . . . is spontaneous ignition of the propellant in a way similar to a normally initiated burn,” an Air Force nuclear safety journal warned. “Naturally, this would be a catastrophe.” The journal advised its readers to “follow procedures and give the weapons a little extra care and respect.”
Although European protest marches had focused mainly on the United States for the previous six years, it was the leadership of Western Europe who most strongly opposed creating a world without nuclear weapons.
For more than forty years, efforts to tame the SIOP, to limit it, reduce it, make it appear logical and reasonable, had failed. “With the possible exception of the Soviet nuclear war plan, this was the single most absurd and irresponsible document I had ever reviewed in my life,” General Butler later recalled.
Two other Soviet officials possessed nuclear codes and footballs: the minister of defense and the chief of the general staff. Both of them supported the coup d’état.
At a reactor in Virginia, a worker cleaning the floor got his shirt caught on the handle of a circuit breaker on the wall. He pulled the shirt off it, tripped the circuit breaker, and shut down the reactor for four days.
A lightbulb slipped out of the hand of a worker at a reactor in California. The bulb hit the control panel, caused a short circuit, turned off sensors, and made the temperature of the core change so rapidly that a meltdown could have occurred.
“Our ability to organize does not match the inherent hazards of some of our organized activities.” What appeared to be the rare exception, an anomaly, a one-in-a-million accident, was actually to be expected. It was normal.
The most dangerous systems had elements that were “tightly coupled” and interactive. They didn’t function in a simple, linear way, like an assembly line. When a problem arose on an assembly line, you could stop the line until a solution was found. But in a tightly coupled system, many things occurred simultaneously—and they could prove difficult to stop. If those things also interacted with each other, it might be hard to know exactly what was happening when a problem arose, let alone know what to do about it.
Few bureaucracies were flexible enough to allow both centralized and decentralized decision making, especially in a crisis that could threaten hundreds or thousands of lives.
“Time and time again, warnings are ignored, unnecessary risks taken, sloppy work done, deception and downright lying practiced,” Perrow found. The instinct to blame the people at the bottom not only protected those at the top, it also obscured an underlying truth. The fallibility of human beings guarantees that no technological system will ever be infallible.