Kindle Notes & Highlights
by Katie Hafner
Read June 11 – July 3, 2022
Eisenhower was the first president to host a White House dinner specifically to single out the scientific and engineering communities as guests of honor, just as the Kennedys would later play host to artists and musicians.
He believed in the value of unfettered science, in its ability to produce remarkable, if not always predictable, results.
Computers had the potential to act as extensions of the whole human being, as tools that could amplify the range of human intelligence and expand the reach of our analytical powers.
Lick’s thoughts about the role computers could play in people’s lives culminated in 1960 with the publication of his seminal paper “Man-Computer Symbiosis.” In it he distilled many of his ideas into a central thesis: A close coupling between humans and “the electronic members of the partnership” would eventually result in cooperative decision making. Moreover, decisions would be made by humans, using computers, without what Lick called “inflexible dependence on predetermined programs.”
“The hope,” Licklider wrote, “is that in not too many years, human brains and computing machines will be coupled . . . tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today.”
His message, already something of a mantra up in Cambridge but still largely unfamiliar to a military audience, was that a computer should be something anyone could interact with directly, eliminating computer operators as the middlemen between users and the solutions to their problems.
That said, Lick hedged his bets. “It will possibly turn out,” he continued, “that only on rare occasions do most or all of the computers in the overall system operate together in an integrated network. It seems to me to be important, nevertheless, to develop a capability for integrated network operation.” And therein lay the seed of Licklider’s grandest vision yet.
In those days, software programs were one-of-a-kind, like original works of art, and not easily transferred from one machine to another.
A machine’s ability to amplify human intellectual power was precisely what Licklider had had in mind while writing his paper on human-machine symbiosis six years earlier.
“The possibility of a war exists, but there is much that can be done to minimize the consequences,” Baran wrote. “If war does not mean the end of the earth in a black-and-white manner, then it follows that we should do those things that make the shade of gray as light as possible: to plan now to minimize potential destruction and to do all those things necessary to permit the survivors of the holocaust to shuck their ashes and reconstruct the economy swiftly.”
Baran’s second big idea was still more revolutionary: Fracture the messages too. By dividing each message into parts, you could flood the network with what he called “message blocks,” all racing over different paths to their destination. Upon their arrival, a receiving computer would reassemble the message bits into readable form.
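The mechanics are simple enough to sketch in a few lines of Python. This is a toy illustration of the idea only, not Baran’s design; the block size and all names here are invented:

import random

BLOCK_SIZE = 8  # bytes per block; arbitrary for the example

def split_into_blocks(message: bytes) -> list[tuple[int, bytes]]:
    # Divide a message into numbered "message blocks."
    return [(seq, message[i:i + BLOCK_SIZE])
            for seq, i in enumerate(range(0, len(message), BLOCK_SIZE))]

def reassemble(blocks: list[tuple[int, bytes]]) -> bytes:
    # The receiving computer sorts the blocks by number and
    # stitches the message back into readable form.
    return b"".join(payload for _, payload in sorted(blocks))

message = b"to shuck their ashes and reconstruct the economy swiftly"
blocks = split_into_blocks(message)
random.shuffle(blocks)  # blocks race over different paths, arriving out of order
assert reassemble(blocks) == message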
By the time Baran had answered all of the concerns raised by the defense, communications, and computer science communities, nearly four years had passed and his volumes numbered eleven.
In the car, Clark sketched out his idea: Leave the host computers out of it as much as possible and instead insert a small computer between each host computer and the network of transmission lines. (This was, by coincidence, precisely what Davies had concluded separately in England.)
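Clark’s division of labor is easy to caricature in code. In the sketch below (purely illustrative; the class names and the two-node “network” are invented, and real IMPs did vastly more), the host hands its message to its IMP and never touches the network itself:

class IMP:
    # A small computer inserted between one host and the network.
    def __init__(self, name: str, network: dict):
        self.name = name
        self.network = network        # maps IMP names to IMP objects
        self.network[name] = self
        self.inbox: list[str] = []

    def send(self, dest: str, message: str) -> None:
        # Routing, buffering, and retransmission would live here,
        # hidden entirely from the host.
        self.network[dest].inbox.append(message)

class Host:
    # A host computer that delegates all networking to its IMP.
    def __init__(self, imp: IMP):
        self.imp = imp

    def send(self, dest: str, message: str) -> None:
        self.imp.send(dest, message)  # the host never touches the lines itself

net: dict = {}
ucla = Host(IMP("UCLA", net))
sri = Host(IMP("SRI", net))
ucla.send("SRI", "LO")
print(sri.imp.inbox)  # ['LO']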
“The process of technological development is like building a cathedral,” remarked Baran years later. “Over the course of several hundred years new people come along and each lays down a block on top of the old foundations, each saying, ‘I built a cathedral.’ Next month another block is placed atop the previous one. Then comes along an historian who asks, ‘Well, who built the cathedral?’ Peter added some stones here, and Paul added a few more. If you are not careful, you can con yourself into believing that you did the most important part. But the reality is that each contribution has to follow . . .”
The firm had also become well known as a place whose hiring philosophy was to recruit MIT dropouts. The idea was that if they could get into MIT they were smart, and if they dropped out, you could get them cheaper.
The potential for trouble falls into three categories: the foreseen, the unforeseen but foreseeable, and the unforeseeable. In the last case engineers are forced to make decisions with their fingers crossed.
Eight months weren’t enough for anyone to build the perfect network. Everyone knew it. But BBN’s job was more limited than that; it was to demonstrate that the network concept could work. Heart was seasoned enough to know that compromises were necessary to get anything this ambitious done on time. Still, the tension between Heart’s perfectionism and his drive to meet deadlines was always with him, and sometimes was apparent to others as an open, unresolved contradiction.
“It’s one thing when you plug into a socket in the wall and electrons flow,” said Bob Kahn. “It’s another thing when you have to figure out, for every electron, which direction it takes.” To Kahn’s way of thinking, this summed up the difficulty of building a network that switched packets of bits, and did so dynamically.
They were engineers and pragmatists. All of their lives they had built things, connected wires, and made concepts real. Their ethos was utilitarian. At its core, all engineering comes down to making tradeoffs between the perfect and the workable.
On his deathbed in 1969, Dwight Eisenhower asked a friend about “my scientists” and said they were “one of the few groups that I encountered in Washington who seemed to be there to help the country and not help themselves.”
Heart was already predicting difficulty in getting the host sites to complete their work on time. He knew how heavily the principal investigators relied on graduate students, and Heart worried that the project could be derailed because of a graduate student’s failure to treat the schedule seriously enough.
Heart’s reviews tended to be ad hoc and ongoing, which isn’t to say that they were easy. “It was like your worst nightmare for an oral exam by someone with psychic abilities,” recalled Bernie Cosell, the team’s ace software troubleshooter. “He could intuit the parts of the design you were least sure of, the places you understood least well, the areas where you were just song-and-dancing, trying to get by, and cast an uncomfortable spotlight on parts you least wanted to work on, meanwhile virtually ignoring the parts you were so proud of for having gotten really nailed down and done right.”
The randomness of the crashes was unusually bad. Intermittent problems of this sort were the devil.
In fact, the absence of clues was one of the most useful clues.
“Request for Comments,” it turned out, was a perfect choice of titles. It sounded at once solicitous and serious.
But in ancient Greek, protokollon meant the first leaf of a volume, a flyleaf attached to the top of a papyrus scroll that contained a synopsis of the manuscript, its authentication, and the date.
There is no small irony in the fact that the first program used over the network was one that made the distant computer masquerade as a terminal. All that work to get two computers talking to each other and they ended up in the very same master-slave situation the network was supposed to eliminate.
Then again, technological advances often begin with attempts to do something familiar. Researchers build trust in new technology by demonstrating that we can use it to do things we understand. Once they’ve done that, the next steps begin to unfold, as people start to think even further ahead. As people assimilate change, the next generation of ideas is already evolving.
For months, Heart’s team and Roberts had been discussing the possibility of connecting many new users to the net without going through a host computer. It appeared they could make it possible to log onto the network, reach a distant host, and control remote resources through a simple terminal device—a Teletype or a CRT with a keyboard—directly connected to an IMP.
The ARPANET was not intended as a message system. In the minds of its inventors, the network was intended for resource-sharing, period. That very little of its capacity was actually ever used for resource-sharing was a fact soon submerged in the tide of electronic mail.
By August 1973, while TCP was still in the design phase, traffic had grown to a daily average of 3.2 million packets.
After the split, TCP would be responsible for breaking up messages into datagrams, reassembling them at the other end, detecting errors, resending anything that got lost, and putting packets back in the right order.
The Internet Protocol, or IP, would be responsible for routing individual datagrams.
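The split can be caricatured in a few lines of Python. This is a toy bearing no resemblance to the real protocols, and every name in it is invented: the “TCP” layer fragments, checks, and reorders, while the “IP” layer handles one datagram at a time and knows nothing about the message it belongs to.

import random

def ip_route(datagram: tuple[int, bytes]) -> tuple[int, bytes]:
    # IP's job: forward a single datagram toward its destination.
    # It neither knows nor cares which message the datagram belongs to.
    return datagram  # a real router would choose a next hop here

def tcp_send(message: bytes, mtu: int = 4) -> list[tuple[int, bytes]]:
    # TCP's job when sending: break the message into numbered datagrams.
    return [(i // mtu, message[i:i + mtu])
            for i in range(0, len(message), mtu)]

def tcp_receive(datagrams: list[tuple[int, bytes]], total: int) -> bytes:
    # TCP's job when receiving: detect loss (and, in real life, request
    # resends) and put everything back in the right order.
    received = dict(datagrams)
    missing = [seq for seq in range(total) if seq not in received]
    if missing:
        raise ValueError(f"lost datagrams, would resend: {missing}")
    return b"".join(received[seq] for seq in range(total))

msg = b"resource sharing"
datagrams = tcp_send(msg)
random.shuffle(datagrams)  # the network may deliver them in any order
delivered = [ip_route(d) for d in datagrams]
assert tcp_receive(delivered, total=len(datagrams)) == msg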
But the networking people at Harvard declined the offer. “They said they couldn’t possibly let a graduate student do something that important,” said Metcalfe. Harvard officials decided to have BBN do it. BBN in turn gave the job to its graduate student in residence, Ben Barker, who recruited John McQuillan, a fellow Harvard graduate student, to help him.
By virtue of its quiet momentum, TCP/IP had prevailed over the official OSI standard. Its success provided an object lesson in technology and how it advances. “Standards should be discovered, not decreed,” said one computer scientist in the TCP/IP faction. Seldom has it worked any other way.

