The Dream Machine: J. C. R. Licklider and the Revolution That Made Computing Personal
6%
Once again, that last objection loomed larger in 1940 than we can easily understand now. Today we speak of "the computer" as if it were a single thing that had to be invented only once. But as Wiener's list of features suggests, the modern digital computer is actually a combination of at least half a dozen separate inventions, most of which involved not just another gadget but a shift in the way people thought about computing. At the time of Wiener's memo, moreover, it was far from clear whether he or anyone else had put the individual pieces together in the right way; those conceptual …
7%
In the early summer of 1938, his boss asked him if he could build a calculator that would do arithmetic with complex numbers—the kind involving "imaginary" quantities based on the square root of −1. Those quantities had turned out to have some very real applications in the design of AT&T's new coast-to-coast system of long-distance lines, and Bell Labs' computer division—a small team of women armed with desk calculators—was being swamped by complex arithmetic.
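For a sense of why that work swamped the human computers: every single multiplication of two complex numbers takes four real multiplications and two additions, and impedance calculations chain many of them together. A minimal sketch in Python (the impedance values below are invented for illustration; electrical engineers write j for the square root of −1, as Python also does):

```python
# Illustrative only: the kind of complex arithmetic the Complex Computer
# automated. The impedance values are made up for this example.
z_line = complex(3.0, 4.0)    # a hypothetical line impedance, 3 + 4j ohms
z_load = complex(5.0, -2.0)   # a hypothetical load impedance

# A series combination is just complex addition...
z_series = z_line + z_load                          # (8+2j)

# ...while a parallel combination needs complex multiplication and
# division: (a+bj)(c+dj) = (ac - bd) + (ad + bc)j, and so on by hand.
z_parallel = (z_line * z_load) / (z_line + z_load)

print(z_series, z_parallel)
```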
7%
Either way, the Complex Computer was the hit of the meeting. Even Norbert Wiener seemed to find it a revelation. After he'd spent quite a bit of time playing with the Teletype and getting totally exasperated (for some reason he kept trying to make the remote computer divide a number by 0 and produce infinity—an impossible task for any real machine, and one that Stibitz and Williams had explicitly guarded against), Wiener told a colleague that he was now convinced: binary math was the future of computing. Indeed, the Dartmouth presentation may have been what inspired the memo he sent to …
8%
Indeed, Howard Aiken may well have been the first person to realize that programming would eventually become a profession in its own right. Not only would he later convince Harvard to start the first master's degree courses in what would now be called computer science, but he insisted from the beginning that the Mark I project be staffed with trained mathematicians—among them, most notably, a thirty-seven-year-old naval reserve lieutenant named Grace Murray Hopper, formerly a mathematics professor at Vassar College. ("Where the hell have you been?" was Aiken's greeting to her on the day she …
11%
Conversely, they said, if it was valid to think of the nervous system in engineering terms, then it was just as valid to think of machines in biological terms. Look at the fire-control system, correcting its aim after every shot with feedback from radar. The gun and its fire-control system operated in a completely automatic fashion, with no humans in the loop anywhere. And yet the gun seemed guided by a grimly determined intelligence.
12%
McCulloch and Pitts themselves had emphasized that each of their model neurons actually behaved like a small logic circuit of the sort that could be built with a set of electromechanical relays or vacuum tubes. And that, if nothing else, meant that von Neumann could draw his own circuit designs using McCulloch and Pitts's neural-net notation, which would eventually evolve into the standard circuit notation used by computer engineers today.
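The equivalence is easy to demonstrate. Below is a minimal sketch of a McCulloch-Pitts-style threshold neuron in Python; the weights and thresholds are the standard textbook choices, not anything from von Neumann's own designs. With unit weights, a threshold of 2 behaves as an AND gate and a threshold of 1 as an OR gate:

```python
# A minimal McCulloch-Pitts-style neuron: it "fires" (outputs 1) when
# the weighted sum of its binary inputs reaches a threshold.
def mp_neuron(inputs, weights, threshold):
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

# With unit weights, a threshold of 2 makes a two-input AND gate...
def AND(a, b):
    return mp_neuron([a, b], [1, 1], threshold=2)

# ...and a threshold of 1 makes an OR gate.
def OR(a, b):
    return mp_neuron([a, b], [1, 1], threshold=1)

assert AND(1, 1) == 1 and AND(1, 0) == 0
assert OR(0, 1) == 1 and OR(0, 0) == 0
```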
13%
Up until that point, most of the pioneering computers had embodied at least part of the problem-solving process in their actual physical structure. The classic example was Bush's Differential Analyzer, in which the problem was represented by the arrangement of gears and shafts. The brilliant idea behind the stored-program concept was to make a clean split, to separate the problem-solving sequence from the hardware entirely. The act of computation thus became an abstract process that we now know as software, a series of commands encoded in a string of binary 1s and 0s and stored in the …
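A toy illustration of that split, assuming an invented three-instruction machine rather than any historical design: the "program" below is nothing but data sitting in the same memory as its operands, and a simple fetch-execute loop brings it to life.

```python
# A toy stored-program machine (invented for illustration; not the
# EDVAC design). Program and data share one memory: the first cells
# hold encoded instructions, later cells hold operands and results.
memory = [
    ("LOAD", 8),    # cell 0: load memory[8] into the accumulator
    ("ADD", 9),     # cell 1: add memory[9] to the accumulator
    ("STORE", 10),  # cell 2: store the accumulator into memory[10]
    ("HALT", 0),    # cell 3: stop
    0, 0, 0, 0,     # cells 4-7: unused
    20, 22,         # cells 8-9: data
    0,              # cell 10: the result will land here
]

acc, pc = 0, 0
while True:
    op, addr = memory[pc]   # fetch the next instruction from memory
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[10])  # 42: the problem-solving sequence lived entirely in memory
```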
13%
That was probably just as well. However fierce the controversy surrounding its birth, the stored-program concept now ranks as one of the great ideas of the computer age—arguably the great idea. By rendering software completely abstract and decoupling it from the physical hardware, the stored-program concept has had the paradoxical effect of making software into something that is almost physically tangible. Software has become a medium that can be molded, sculpted, and engineered on its own terms. Indeed, as the Yale University computer scientist David Gelernter has pointed out, the modern …
14%
Indeed, young researchers quickly learned not to use the word mind at all, even in casual conversation, lest they be declared "unscientific" and thereby undermine their own careers.
19%
"That doesn't sound like a great idea now—that expectation should influence behavior. But it was anathema to the behaviorists. I remember writing up some of this research and showing it to one psychologist who was a rabid Skinnerian, and he said, 'Well, that's very interesting, but you don't dare publish it. If you write a paper about expectations, then your scientific reputation will be destroyed.'"
20%
But this digital computer was supposed to act as a flight simulator, a machine for which there was never any "answer," just a constantly changing sequence of pilot actions and simulated aircraft responses. So Forrester and his team would have to create not a calculator but a computer that could monitor its inputs constantly, staying ready for whatever might come along; that could respond to events as fast as they occurred, without ever falling behind when things got hectic; and that could keep going until the simulation was over, however long that took. In short, they would have to create the …
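In modern terms, the pattern Forrester's team needed is the fixed-cycle real-time loop. A minimal sketch, with the function names, state, and 50 ms cycle all invented for illustration:

```python
import time

# A sketch of the real-time pattern: instead of computing one answer and
# stopping, the machine samples its inputs on a fixed cycle and updates
# the simulated aircraft state for as long as the simulation runs.
def read_pilot_inputs():
    # Stand-in for sampling the pilot's stick and throttle.
    return {"stick": 0.0, "throttle": 0.5}

def update_aircraft(state, inputs, dt):
    # Stand-in for the flight-dynamics update.
    state["speed"] += inputs["throttle"] * dt
    return state

state = {"speed": 0.0}
CYCLE = 0.05  # 50 ms per frame, chosen arbitrarily for the sketch

for _ in range(10):  # "until the simulation was over"
    start = time.monotonic()
    inputs = read_pilot_inputs()               # monitor inputs constantly
    state = update_aircraft(state, inputs, CYCLE)
    # Sleep out the rest of the cycle so responses keep pace with real
    # time instead of running as fast as the machine happens to go.
    time.sleep(max(0.0, CYCLE - (time.monotonic() - start)))
```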
23%
Forrester and his colleagues could see only one solution, albeit a fabulously extravagant one: spend $1 million on a whole new computer just to test core memory. So that was exactly what they did, with the Pentagon's signing the checks. Work on the Memory Test Computer started in May 1952. And a little over a year later, the first eight-thousand-word bank of fully tested core memory was wired into Whirlwind itself. The results were dramatic: the operating speed doubled, the data-input rate quadrupled, and maintenance time was reduced from four hours per day to two hours per week. Memory had …
24%
The trainers quickly discovered that it was impossible to predict who their best pupils would be—not even professional mathematicians were a sure bet; they often lost patience with the details—but it was very easy to spot the talented ones once they got started. As a general rule of thumb, for example, music teachers proved to be particularly adept. And much to the project leaders' astonishment (this being the 1950s), women often turned out to be more proficient than men at worrying about the details while simultaneously keeping the big picture in mind. One of the project's best programming …
25%
Viewed in retrospect, Turing's test for machine intelligence has to rank as one of the most provocative assertions in all of modern science. To this day, people are still talking about it, writing commentaries on it, and voicing outraged objections to it (most of which he anticipated in his original paper, by the way).† Of course, like so much of Turing's work, the 1950 paper wasn't widely read at the time, and it had essentially no impact on the artificial-intelligence research that was just beginning in the United States.
26%
In short, says Miller, "the scare" led to an even deeper confirmation of the phenomenon: the magical number was 7, all right, but seven chunks, not just seven items. Moreover, he says, this recognition of chunking was what finally led him to make an open break with Skinner and company. Not only did the data prove the existence of mental states—namely, concepts in memory—but they showed that these mental states have structure. Indeed, says Miller, chunking implied that our minds are capable of organizing whole hierarchies of data: each chunk in short-term memory can hold several pieces of …
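The effect is easy to see with a toy example (the digit string below is invented): twenty-four raw digits swamp a seven-item memory, but recoded as six familiar years they fit comfortably within seven chunks.

```python
# Chunking, illustrated: the same raw information packs into far fewer
# chunks once it is recoded into meaningful units (here, years).
digits = "149217761941186518121969"          # 24 items: far beyond 7
chunks = [digits[i:i + 4] for i in range(0, len(digits), 4)]
print(chunks)   # ['1492', '1776', '1941', '1865', '1812', '1969']
print(len(digits), "items ->", len(chunks), "chunks")
```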
29%
And yet, Lick further reasoned, when it came to things that computers did poorly or not at all—intuitive processes such as perception, goal setting, judgment, insight, and all the rest—our human capabilities surpassed the most powerful machines on the planet (and still do). Having spent much of his career trying to untangle the relatively straightforward mechanisms of hearing, Lick could testify to that point personally: these "heuristic" mental functions, as he called them, actually involved a vast amount of information processing. They seemed effortless and intuitive only because that …
30%
So, Lick wondered, what would happen if you put humans and computers together as a system? What would happen if you let the machines take over all those dreary, algorithmic chores they were so good at? How much of your time would be opened up for real creativity?
31%
As promised, he taught Lick to program the LGP-30. (Since the code had to be written in a "hexadecimal" notation, in which the numbers 10 through 15 were abbreviated by the letters F, G, J, K, Q, and W, Lick soon came up with a mnemonic: "For God and Jesus Christ, Quit Worrying.")
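That notation is concrete enough to put in executable form. A small Python sketch, written for this illustration, that renders integers using the LGP-30's digit set as the passage gives it (0 through 9, then F, G, J, K, Q, W for the values 10 through 15):

```python
# The LGP-30's hexadecimal digits, per the passage (and Lick's mnemonic).
LGP30_DIGITS = "0123456789FGJKQW"

def to_lgp30_hex(n):
    """Render a non-negative integer in LGP-30-style hexadecimal."""
    if n == 0:
        return "0"
    out = ""
    while n:
        out = LGP30_DIGITS[n % 16] + out   # peel off the low digit
        n //= 16
    return out

print(to_lgp30_hex(255))   # "WW" rather than the modern "FF"
print(to_lgp30_hex(171))   # "FG" rather than the modern "AB"
```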
32%
"I had certain standards of mathematical precision for what a scientific paper ought to be like," he explained in an interview conducted by the writer Pamela McCorduck in 1974, adding that by "precision" he meant mathematical abstraction, formal definitions, powerful theorems, and rigorous proofs. "But finally it became clear that I wasn't going to solve the artificial intelligence problem in a mathematically rigorous way in reasonable time, so I simply decided to start publishing what I had."
32%
"Every speculation about the mechanism was wrong," McCarthy says, "especially Asimov's Three Laws of Robotics, which are kind of Talmudic laws rather than scientific laws."
33%
Since the users would be sharing the computer's processing time as well as its storage space, McCarthy took to calling his scheme time-sharing. And characteristically, he wasn't too impressed with himself for having thought it up. "Time-sharing to me was one of these ideas that seemed quite inevitable," he says. "When I was first learning about computers, I [thought] that even if [time-sharing] wasn't the way it was already done, surely it must be what everybody had in mind to do." Wrong. Nobody at IBM had even imagined such a thing, not in 1955. It's true that the company was the prime …
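The core of the scheme can be sketched in a few lines. Below is a toy round-robin scheduler, invented for illustration with work measured in abstract units: the processor gives each user's job a brief slice in turn, cycling fast enough that every user seems to have the whole machine.

```python
from collections import deque

# A toy round-robin time-sharing scheduler (invented for illustration).
jobs = deque([("alice", 5), ("bob", 3), ("carol", 4)])  # (user, work units)
QUANTUM = 2  # units of work per time slice

while jobs:
    user, remaining = jobs.popleft()
    done = min(QUANTUM, remaining)
    print(f"{user} runs for {done} unit(s)")
    if remaining - done > 0:
        jobs.append((user, remaining - done))  # unfinished: back of the line
```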
33%
But in artificial-intelligence research, such a demand would be sure death. The "variables" in an AI program somehow had to represent the quicksilver fluidity of mental states in human working memory—the images, concepts, possibilities, goals, and alternatives that the problem solver focuses on at any given instant. And there was no way for the programmer to know in advance how big or how complex these variables should be, because they would constantly be changing as the problem solving advanced.
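In modern terms, the contrast is between storage declared once, in advance, and dynamically allocated structures that grow and shrink as the computation runs. A sketch, with the example data invented and Python lists standing in for the linked lists Lisp would soon provide:

```python
# A conventional language of the era made you fix storage up front...
goals = [None] * 10      # room for exactly 10 goals, and never more

# ...but a problem solver's working memory changes size unpredictably.
# A dynamically allocated structure can grow one cell at a time as the
# problem solving advances, and shrink again just as freely.
working_memory = []
for step in range(25):                     # size unknown in advance
    working_memory.append(("subgoal", step))
working_memory = working_memory[5:]        # discard what's been resolved
print(len(working_memory))
```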
34%
And the other legacy? An undeniable grace, beauty, and power. As a Lisp programmer continued to link simpler functions into more complex ones, he or she would eventually reach a point where the whole program was a function—which, of course, would also be just another list. So to execute that program, the programmer would simply give a command for the list to evaluate itself in the context of all the definitions that had gone before. And in a truly spectacular exercise in self-reference, it would do precisely that. In effect, such a list provided the purest possible embodiment of John von …
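The principle can be miniaturized. In the sketch below (written in Python for this illustration, and nothing like a full Lisp), a "program" is just a nested list, and a tiny evaluator walks the list to execute it:

```python
import operator

# Head of a list names a function; the rest are arguments to evaluate.
ENV = {"+": operator.add, "*": operator.mul}

def evaluate(expr):
    if not isinstance(expr, list):          # a number evaluates to itself
        return expr
    fn = ENV[expr[0]]
    args = [evaluate(e) for e in expr[1:]]  # recursively evaluate sub-lists
    return fn(*args)

# Until it is handed to evaluate, the program below is only data: a list.
program = ["+", 1, ["*", 2, 3]]
print(evaluate(program))   # 7
```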
36%
And digital computer models? In principle, said Lick, they combine the best of both worlds by being both static and dynamic—and more. Indeed, he argued, it is this characteristic that gives software its unique power as a modeling medium. A model sculpted from software is static when it exists as binary code on a disk or a tape. As such, it can be stored, transmitted, archived, and retrieved, just as ordinary text can be. It is, in fact, a kind of text. Like ordinary text, moreover—and unlike the balsa wood of a model airplane, or the electrical circuits of an analog computer, or any other …
37%
Another TX-0 hacker devised what was essentially the first word processor, a program that allowed you to type in your class reports and then format the text for output on the Flexowriter. Since it made the three-million-dollar TX-0 behave like a three-hundred-dollar typewriter—much to the outrage of traditionalists who saw this, too, as a ludicrous waste of computer power—the program became known as Expensive Typewriter. In much the same spirit, freshman Bob Wagner wrote an Expensive Desk Calculator program to help him do his homework for a numerical-analysis class. (Wagner's grade: 0. Using …
38%
Nonetheless, McCarthy's talk had an impact, not least because at the very end of it he finally stated in public what he'd long been mulling over in private: "If computers of the kind I have advocated become the computers of the future," he said, "then computation may someday be organized as a public utility, just as the telephone system is a public utility. We can envisage computer service companies whose subscribers are connected to them by telephone lines. Each subscriber needs to pay only for the capacity that he actually uses, but he has access to all programming languages characteristic …
39%
Except that no one in the United States was laughing: the launch of Sputnik rocked the nation like no event since Pearl Harbor. As one account of the early space race put it, "[Sputnik's] two transmitters would fail twenty-three days after launch—but their arrogant beeping would continue to sound in the American memory for years to come. . . . Gone forever in this country was the myth of American superiority in all things technical and scientific."
46%
Unlike a power utility, which basically just provides a resource that people consume, an information utility lets the users give back to the system. It even lets them create whole new resources in the system. "More than anything else," says Fano, "the system became the repository of the knowledge of the community. And that was a totally new thing."
55%
And that, explained Scantlebury, was the sad part of the story: the powers-that-be at the British Post Office, which had absolute control over the U.K. telecommunications system, had flatly refused to fund Davies's vision of nationwide packet switching. They couldn't even see the point of a demonstration. So, said Scantlebury, having gotten there first, the NPL group would now have to sit back and watch as the Americans did a packet-switched network for real.
56%
"What was strong enough was this idea that packet switching would be more survivable, more robust under damage to the network. If a plane got shot down, or an artillery barrage hit the command center, the messages would still get through. And in a strategic situation—meaning a nuclear attack—the president could still communicate to the missile fields. So I can assure you, to the extent that I was signing the checks, which I was from nineteen sixty-seven on, I was signing them because that was the need I was convinced of."
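The survivability claim is concrete enough to demonstrate. In the sketch below (the five-node mesh and its names are invented for illustration), a breadth-first search finds a route between two endpoints; knock out an intermediate node, and the same search simply finds another way through.

```python
from collections import deque

# A toy packet-switched mesh: each node lists its directly linked neighbors.
links = {
    "president": {"a", "b"},
    "a": {"president", "command_center"},
    "b": {"president", "c"},
    "c": {"b", "command_center"},
    "command_center": {"a", "c"},
}

def route(net, src, dst):
    """Breadth-first search for any surviving path of links."""
    paths, seen = deque([[src]]), {src}
    while paths:
        path = paths.popleft()
        if path[-1] == dst:
            return path
        for nxt in net.get(path[-1], ()):
            if nxt not in seen and nxt in net:   # skip destroyed nodes
                seen.add(nxt)
                paths.append(path + [nxt])
    return None

print(route(links, "president", "command_center"))    # direct, via "a"
damaged = {k: v for k, v in links.items() if k != "a"}  # node "a" destroyed
print(route(damaged, "president", "command_center"))  # reroutes via "b", "c"
```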
60%
The ARPA network—or the Arpanet, as people were now beginning to call it—was finally on line. It had gone from contract award to equipment on site and running in less than nine months.
73%
In fact, says Lampson, he and his colleagues were increasingly coming to realize that the decision to focus on graphics displays undermined the most fundamental premise of time-sharing—namely, that computers are fast and humans are slow. As he would express it in a later account, "[This] relationship holds only when the people are required to play on the machine's terms, seeing information presented slowly and inconveniently, with only the clumsiest control over its form or content. When the machine is required to play the game on the human's terms, presenting a pageful of attractively (or …
75%
But wherever the idea came from, it was an exceedingly potent one—as potent, in its own way, as the idea of an open marketplace guided by Adam Smith's Invisible Hand, or the idea of an open democracy in which all human beings are created equal. If you wanted to be a part of the Internet, you could be: it would be open to anyone willing to deal in the common currency and/or speak the common language, as defined by the interface standard. It was this architecture of openness that would enable the Internet to undergo its explosive growth in the 1990s, when it would expand from a handful of users …
76%
The combination of Alto's wonderful graphics screen and its strange little "mouse" took the arcane abstraction known as software and transformed it into something visible, tangible, almost tactile. Users had the weird and eerily seductive sensation of reaching into the computer, of being able to grab what they saw there and manipulate it by hand. Thanks to the programmers, moreover, they found plenty to manipulate: icons, pop-up menus, drop-down menus, scroll bars, and windows—even overlapping windows that seemed to be stacked on top of one another like real, 3-D objects ("I was in the seminar …
77%
"In the history of art," says Alan Kay, "the most powerful work in any genre is done not by the adults who invent it, but by the first generation of kids who grow up in it. Think of perspective painting during the Renaissance. Well, we were that generation. We were the kids who'd had our Ph.D.s paid for by ARPA."