Kindle Notes & Highlights
Where Wizards Stay Up Late: The Origins of the Internet
by Katie Hafner
Read between April 1 and April 6, 2017
Computers had the potential to act as extensions of the whole human being, as tools that could amplify the range of human intelligence and expand the reach of our analytical powers.
Lick saw a future in which, thanks in large part to the reach of computers, most citizens would be “informed about, and interested in, and involved in, the process of government.” He imagined what he called “home computer consoles” and television sets linked together in a massive network.
He held to the view that computers would naturally continue to be used for what they do best: all of the rote work. And this would free humans to devote energy to making better decisions and developing clearer insights than they would be capable of without computers. Together, Lick suggested, man and machine would perform far more competently than either could alone.
“Consider the situation in which several different centers are netted together, each center being highly individualistic and having its own special language and its own special way of doing things,” he posited. “Is it not desirable or even necessary for all the centers to agree upon some language or, at least, upon some conventions for asking such questions as ‘What language do you speak?’ At this extreme, the problem is essentially the one discussed by science-fiction writers: How do you get communications started among totally uncorrelated sapient beings?”
He would extend the concept of the Intergalactic Network to mean not just a group of people to whom he was sending memos but a universe of interconnected computers over which all might send their memos.
Taylor left Herzfeld’s office on the E-ring and headed back to the corridor that connected to the D-ring and his own office. He glanced at his watch. “Jesus Christ,” he said to himself softly. “That only took twenty minutes.”
Larry Roberts was twenty-nine years old when he walked into the Pentagon as ARPA’s newest draftee. He fit in quickly, and his dislike of idle time soon became legendary.
ability to call for, or call off, the launch of American missiles (called “minimal essential communication”),
AT&T’s officials concluded that Baran didn’t have the first notion of how the telephone system worked. “Their attitude was that they knew everything and nobody outside the Bell System knew anything,” Baran said. “And somebody from the outside couldn’t possibly understand or appreciate the complexity of the system.
AT&T’s answer was to educate. The company began a seminar series on telephony, held for a small group of outsiders, including Baran. The classes lasted for several weeks. “It took ninety-four separate speakers to describe the entire system, since no single individual seemed to know more than a part of the system,” Baran said. “Probably their greatest disappointment was that after all this, they said, ‘Now do you see why it can’t work?’And I said, ‘No.’”
Marill referred to the set of procedures for sending information back and forth as a message “protocol,” prompting a colleague to inquire, “Why do you use that term? I thought it referred to diplomacy.”
This attitude was especially pronounced among researchers from the East Coast universities, who saw no reason to link up with campuses in the West. They were like the upper-crust woman on Beacon Hill who, when told that long-distance telephone service to Texas was available, echoed Thoreau’s famous line: “But what would I possibly have to say to anyone in Texas?”
He called the intermediate computers that would control the network “interface message processors,” or IMPs, which he pronounced “imps.”
Under a contract from ARPA, he was developing a system (called NLS, for oNLine System) that depended on computer-literate communities. He saw the ARPA experimental network as an excellent vehicle for extending NLS to a wide area of distributed collaboration. “I realized there was a ready-made computer community,” Engelbart recalled.
In specifying the network requirements, Roberts was guided by a few basic principles. First, the IMP subnet was to function as a communications system whose essential task was to transfer bits reliably from a source location to a specified destination. Next, the average transit time through the subnet should be less than half a second. Third, the subnet must be able to operate
Roberts would have been content with an inner loop of 1,500 instructions; Crowther and Walden used 150. From that, the two programmers calculated just how quickly the IMP could process each packet. With that information, they were able to predict how many packets could be moved per second. “We actually sat down and wrote the hundred and fifty lines of code, counted them, and then we knew,” Crowther said. They had figured out the kernel.
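To make their back-of-the-envelope arithmetic concrete, here is a minimal sketch of the same calculation. Only the instruction counts come from the text; the per-instruction time is an assumption (the IMP's Honeywell 516 executed an instruction in roughly a microsecond).

```python
# Rough throughput estimate in the spirit of Crowther and Walden's
# calculation. The 150-instruction inner loop is from the text; the
# ~1 microsecond per instruction is an assumed machine speed.
INSTRUCTIONS_PER_PACKET = 150
SECONDS_PER_INSTRUCTION = 1e-6  # assumption, not from the book

time_per_packet = INSTRUCTIONS_PER_PACKET * SECONDS_PER_INSTRUCTION
packets_per_second = 1 / time_per_packet

print(f"~{time_per_packet * 1e6:.0f} microseconds per packet")
print(f"~{packets_per_second:,.0f} packets per second")
```

On those assumptions the inner loop alone permits several thousand packets per second, which is why 150 instructions rather than 1,500 mattered so much.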
When news reached Massachusetts Senator Edward Kennedy’s office just before Christmas that a million-dollar ARPA contract had been awarded to some local boys, Kennedy sent a telegram thanking BBN for its ecumenical efforts and congratulating the company on its contract to build the “Interfaith Message Processor.”
A typical strategy here is to dominate the conversation, not by doing all the talking, but by asking a lot of questions.
Kleinrock, although only ten years older than the rest of his group, had a great reputation in queueing theory (the study of how long people and things spend waiting in lines, how long the lines get, and how to design systems to reduce waiting).
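For a flavor of the mathematics Kleinrock's group worked with, the classic single-server (M/M/1) queue gives closed-form answers to exactly these questions. The arrival and service rates below are invented for illustration; only the formulas are standard.

```python
# M/M/1 queue: Poisson arrivals at rate lam, exponential service at
# rate mu. Standard results: mean time in system W = 1/(mu - lam),
# and Little's law L = lam * W for the mean number in the system.
lam = 8.0   # arrivals per second (made-up example rate)
mu = 10.0   # services per second (must exceed lam for a stable queue)

W = 1 / (mu - lam)  # mean time each job spends waiting plus in service
L = lam * W         # mean number of jobs in the system (Little's law)

print(f"mean time in system: {W:.2f} seconds")
print(f"mean number in system: {L:.1f} jobs")
```

Note how sharply waiting grows as arrivals approach the service rate: raise lam from 8.0 to 9.9 and the mean time in system jumps from 0.5 to 10 seconds.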
As the UCLA guys understood it, BBN was working out some specifications for how to construct such a connection. The host-to-IMP interface had to be built from scratch each time a new site was established around a different computer model. Later, sites using the same model could purchase copies of the custom interface.
FTP was the first application to permit two machines to cooperate as peers instead of treating one as a terminal to the other.
Members of the team would all log on at once, from Palo Alto and Cambridge and L.A. and Salt Lake, and for an hour or two at a stretch trade comments back and forth. Conversing through keyboards and terminals was less spontaneous than speaking, but Bhushan believed it forced clarity into their thinking.
a good hack was a creative or inspired bit of programming.
The first electronic-mail delivery engaging two machines was done one day in 1972 by Ray Tomlinson, a quiet engineer at BBN.
In 1973, Lukasik commissioned an ARPA study that found that three quarters of all traffic on the ARPANET was e-mail. By then, sending e-mail was a simple and nearly trouble-free process.
John Vittal had helped write RFC 733. MSG was far and away the most popular mail program on the ARPANET.
Pentagon-sponsored project at BBN called HERMES. MSG was the original “killer app”—a software application that took the world by storm. Although there was never anything official about it, MSG clearly had the broadest grassroots support.
“It was because of Vittal that we all assimilated network mail into our spinal cords,” recalled Brian Reid. “When I met him years later, I remember being disappointed—as one often is when one meets a living legend—to see that he had two arms and two legs and no rocket pack on his back.”
Jimmy Carter’s presidential campaign used e-mail several times a day in the autumn of 1976. The system they were using was a basic mailbox program, a technology already more than a decade old.
“The key is not in automating the bag/can/truck/person mechanism,” Stefferud said. “It is in bypassing them altogether.”
“I really like seeing an accurate time stamp,” said someone else. “It’s nice to be able to unravel the sequence of comments received in scrambled order.”
“Some people use it blatantly as a kind of one-upmanship. ‘I work longer hours than you do.’”
“Those who will not learn to use this instrument well cannot be saved by an expanded alphabet; they will only afflict us with expanded gibberish.” What did Shakespeare know? ;-) Emoticons and smileys :-), hoisted by the hoi polloi no doubt, grew in e-mail and out into the iconography of our time.
The romance of the Net came not from how it was built or how it worked but from how it was used. By 1980 the Net was far more than a collection of computers and leased lines. It was a place to share work and build friendships and a more open method of communication.
By August 1973, while TCP was still in the design phase, traffic had grown to a daily average of 3.2 million packets. From 1973 to 1975, the Net expanded at the rate of about one new node each month. Growth was proceeding in line with Larry Roberts’s original vision, in which the network was deliberately laden with large resource providers. In this respect, DARPA had succeeded wonderfully.
Illinois was slated to become home to the powerful new ILLIAC IV, a massive, one-of-a-kind high-speed computer under construction at the Burroughs Corporation in Paoli, Pennsylvania. The machine was guaranteed to attract researchers from around the country.
Students on the Urbana campus were convinced the ILLIAC IV was going to be used to simulate bombing scenarios for the Vietnam War and to perform top-secret research on campus. As campus protests erupted over the impending installation, university officials grew concerned about their ability to protect the ILLIAC
But the Center for Advanced Computation already had its IMP and full access to the network. Researchers there took quickly to the newfound ability to exploit remote computing resources—so quickly, in fact, that the Center terminated the $40,000 monthly lease on its own high-powered Burroughs B6700. In its place, the university began contracting for computer services over the ARPANET. By doing this, the computation center cut its computer bill nearly in half.
When Roberts contacted AT&T to see if it wanted to take over the ARPANET, AT&T formed a committee of corporate and Bell Labs staff and studied the idea for months. AT&T could have owned the network as a monopoly service, but in the end declined. “They finally concluded that the packet technology was incompatible with the AT&T network,” Roberts said.
BBN was refusing to release the IMP source code—the original operating program written by the IMP Guys five years earlier.
The intellectual property issue finally boiled over when the engineers who left BBN to start their own company made a pointed request for their former employer to turn over the IMP source code. When BBN refused, they appealed to DARPA. While keeping source code proprietary is usually the prerogative of the company that develops it, the IMP source code was different, for it had been developed by BBN with federal funds. Moreover, BBN was in the midst of starting up the TELENET subsidiary, which would be competing with the company started by the engineers.
A milestone occurred in October 1977, when Cerf and Kahn and a dozen or so others demonstrated the first three-network system with packet radio, the ARPANET, and SATNET, all functioning in concert. Messages traveled from the San Francisco Bay area through a packet-radio net, then the ARPANET, and then a dedicated satellite link to London, back across the packet-satellite network and across the ARPANET again, and finally to the University of Southern California’s Information Sciences Institute (ISI) in Marina del Rey. The packets traveled 94,000 miles without dropping a single bit.
After the split, TCP would be responsible for breaking up messages into datagrams, reassembling them at the other end, detecting errors, resending anything that got lost, and putting packets back in the right order. The Internet Protocol, or IP, would be responsible for routing individual datagrams.
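A toy model makes the division of labor vivid. Everything here is illustrative shorthand, not the real protocols: the “TCP” functions number and reassemble the pieces, while “IP” merely delivers individual datagrams, simulated below by shuffling their order.

```python
import random

def tcp_send(message: str, chunk: int = 8):
    """TCP side: break a message into sequence-numbered datagrams."""
    return [(seq, message[i:i + chunk])
            for seq, i in enumerate(range(0, len(message), chunk))]

def ip_deliver(datagrams):
    """IP side: route datagrams independently; arrival order varies."""
    delivered = list(datagrams)
    random.shuffle(delivered)  # stand-in for independent routing
    return delivered

def tcp_receive(datagrams):
    """TCP side: sort by sequence number to restore the message."""
    return "".join(data for _, data in sorted(datagrams))

message = "a universe of interconnected computers"
assert tcp_receive(ip_deliver(tcp_send(message))) == message
```

Error detection and retransmission, the other TCP duties named above, are omitted to keep the sketch short.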
“I set out to make a new model for the ALOHA system.” Within a few weeks, he was on a Xerox-funded trip to the University of Hawaii. He stayed a month, and before coming home had added an extensive analysis of the ALOHA system to his thesis. It was just the theoretical boost the dissertation needed. When he resubmitted the work, it was accepted. But the ALOHA system gave Metcalfe far more than a doctorate. Xerox PARC was in the process of developing one of the first personal computers, called the Alto.
Metcalfe and Lampson, along with Xerox researchers David Boggs and Chuck Thacker, built their first Alto Aloha system in Bob Taylor’s lab at Xerox PARC. To their great delight, it worked. In May 1973 Metcalfe suggested a name, recalling the hypothetical luminiferous medium invented by nineteenth-century physicists to explain how light passes through empty space. He rechristened the system Ethernet.
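The ALOHA analysis that earned Metcalfe his doctorate, and that motivated Ethernet's randomized access, has a famous summary in two textbook throughput curves. These standard formulas are shown for orientation only; the book does not say Metcalfe's thesis took exactly this form.

```python
import math

def pure_aloha(G):
    """Throughput when stations transmit whenever they like."""
    return G * math.exp(-2 * G)  # peaks at G = 0.5, about 18.4%

def slotted_aloha(G):
    """Throughput when transmissions are aligned to time slots."""
    return G * math.exp(-G)      # peaks at G = 1.0, about 36.8%

# G is the offered load in frames per frame time; the result is the
# fraction of time carrying successful, collision-free frames.
for G in (0.25, 0.5, 1.0, 2.0):
    print(f"G={G:4}: pure={pure_aloha(G):.3f}  slotted={slotted_aloha(G):.3f}")
```

The low ceilings explain the appeal of Ethernet's refinement: listen before transmitting, and back off randomly after a collision.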
NASA had its own network called the Space Physics Analysis Network, or SPAN. Because this growing conglomeration of networks was able to communicate using the TCP/IP protocols, the collection of networks gradually came to be called the “Internet,” borrowing the first word of “Internet Protocol.”
Gateways were the internetworking variation on IMPs, while routers were the mass-produced version of gateways, hooking local area networks to the ARPANET.
But the Internet community—people like Cerf and Kahn and Postel, who had spent years working on TCP/IP—opposed the OSI model from the start. First there were the technical differences, chief among them that OSI had a more complicated and compartmentalized design. And it was a design, never tried. As far as the Internet crowd was concerned, they had actually implemented TCP/IP several times over, whereas the OSI model had never been put to the tests of daily use, and trial and error.
One key development in determining the outcome between TCP/IP and OSI turned out to be the popularity of the UNIX operating system, which had been developed at AT&T’s Bell Laboratories in 1969. Programmers liked UNIX for two primary reasons: Its flexibility let them tailor it to whatever program they were working on, and it was “portable,” meaning it could be made to work on many different computers.
They called their company Sun (for Stanford University Network) Microsystems. The first Sun machines were shipped with the Berkeley version of UNIX, complete with TCP/IP. Berkeley UNIX with TCP/IP would be crucial to the growth of the Internet.
Ethernet—the local area network designed by Bob Metcalfe and his colleagues at Xerox PARC back in 1973—was a practical solution to the problem of how to tie computers together, either on a campus or at a company. Xerox began selling Ethernet as a commercial product in 1980.