Kindle Notes & Highlights
Read between October 18 and November 10, 2019
February: Charles Simonyi joins Microsoft, where he describes himself as “the messenger RNA of the PARC virus.”
But today’s popular image of the computer nerd as incipient high-tech millionaire was nobody’s fantasy then. Instead they had been attracted to PARC by the thrill of pioneering.
He was a master of parsimony and the sworn enemy of its opposite, which he called “biggerism.”
Its most arresting element was its human scale. Where the typical computer of this era was the size of two or three refrigerators standing back to back and wired to many more racks of special-purpose hardware, the “Alto” was to be self-contained and small enough to bark a shin on as you wheeled it under your desk. The Alto was interactive, which meant instantly responsive to the user’s demands. Contemporary computers communicated with their users indirectly, through punch cards or teletypes so slow and awkward that a single bleak exchange of query and response required days to complete.
To use a term coined by Alan Kay, the PARC scientist who was one of the machine’s principal conceptualizers, the Alto was to be a “personal computer.”
The governing principle of PARC was that the place existed to give their employer a ten-year head start on the future. They even contrived a shorthand phrase to explain the concept. The Alto, they said, was a time machine.
The Alto’s operating software had not yet been written, so its brains resided temporarily in a commercial minicomputer called a Nova, which was cabled to the Alto’s back panel like a resuscitator to a comatose patient. A few members of the lab had crafted a sort of animated test pattern by converting several drawings of Sesame Street’s Cookie Monster into sequences of digital ones and zeros.
The scientists of PARC changed all that. They took it as their credo that the computer must serve the user rather than the other way around. That it must be easy and intuitive to operate. That it must communicate with the user in human terms and on a human scale, even if at supernatural speeds.
Surf the Internet, send e-mail to a workmate, check your bank account at an ATM equipped with a touch screen, follow the route of a cold front across the Midwest on a TV weather forecaster’s animated map: The pathway to the indispensable technology was blazed by PARC.
When Apple sued Microsoft in 1988 for stealing the “look and feel” of its Macintosh graphical display to use in Windows, Bill Gates’s defense was essentially that both companies had stolen it from Xerox.
One of the most unusual and prolific research facilities in history, PARC was originally conceived in much more modest terms—as a research lab for a computer subsidiary Xerox had recently acquired. How it burst those boundaries in the early 1970s to become something more closely resembling a national resource is part of its special mystique.
The final factor was management. PARC was founded by men whose experience had taught them that the only way to get the best research was to hire the best researchers they could find and leave them unburdened by directives, instructions, or deadlines. For the most part, the computer engineers of PARC were exempt from corporate imperatives to improve Xerox’s existing products. They had a different charge: to lead the company into new and uncharted territory.
That Xerox proved only sporadically willing to follow them is one of the ironies of this story.
Xerox was so indifferent to PARC that it “didn’t even patent PARC’s innovations.”
Another great myth is that Xerox never earned any money from PARC. The truth is that its revenues from one invention alone, the laser printer, have come to billions of dollars—returning its investment in PARC many times over.
What is indisputable is that Xerox did bring together a group of superlatively creative minds at the very moment when they could exert maximal influence on a burgeoning technology, and financed their work with unexampled generosity.
Many moved on to more splendid achievements and some to astounding wealth. But none ever forgot how profoundly their professional lives were changed when Bob Taylor fixed them with his discerning eye and invited them to enlist in his tiny company of believers.
But then, nothing ever pleased him more than functioning as the lodestar of the proceedings while pretending to be nothing but an unassuming bystander.
How and where Taylor acquired his gift for finding and cultivating the most talented researchers in his field no one ever quite figured out. Part of it was instinct. He might not be able to articulate or even understand all the technical details, but somehow he always knew when a researcher or a project would lead to something important, and how to prepare the ground for that person or project to ripen.
“The master often speaks in somewhat inscrutable fashion,” Lampson said to peals of knowing laughter, “with a deeper and more profound interpretation than his humble disciples are able to provide. In retrospect you can really see that the path has been plotted years in advance, and you’ve been following his footsteps all along.”
He settled back, slippers on the coffee table. “I was never interested in the computer as a mathematical device, but as a communications device,” he said, then paused meaningfully.
“It took me a couple of years to get them to come around. The designers said, the display? That’s crazy, the display is peripheral! I said, No, the display is the entire point!”
The frequent relocations had already left their mark on the boy. “You’ve got to make a new set of friends and interact with a new set of prejudices every time,” he recalled.
“We said we were going to the moon, but we were a hell of a long way from getting there,” Taylor recalled. “It was mostly engineering, and sometimes fairly pedestrian engineering. It wasn’t science, and I was much more interested in science.”
But what if the system were designed so the computer was no longer a mute data manipulator, but a participant in a dialogue—something, he had written in that paper, like “a colleague whose competence supplements your own”?
Licklider’s program launched the golden age of government-funded computing research.
By the time he left IPTO in 1964 to return to MIT, Licklider had set in motion numerous trailblazing projects aimed at making the computer more accessible to the user. Studies in graphics pointed toward new ways of displaying computer-generated information. There were initiatives in computer networking and new programming languages. Systems to reorganize the computer’s memory and processing cycles so it might serve many users simultaneously—which was known as time-sharing—brought the per-session cost of building and running these enormous contraptions down to a level that even midsized and…
Licklider left Sutherland with a $15 million budget, a workload that had grown far beyond what a single man could handle, and a suggested deputy: Bob Taylor. At first glance he was a strange choice. Taylor had never taken an advanced course in computing. He would never be able to design hardware or write a software program. But he displayed two qualities Licklider found appealing: an instinctive grasp of the promise of man-computer interaction, and an exceptionally high degree of “people skills.”
Not that the purpose was chiefly to play. Rather, it was to build a network of people mirroring the one he would soon propose for computers. Lifelong professional and personal bonds were forged at these events. They would start the day with a communal breakfast, followed by several hours of discussion in the morning. They would eat lunch together, then were set free until dinner, another communal affair. The day ended with further colloquia.
Each participant got an hour or so to describe his work. Then he would be thrown to the mercy of the assembled court like a flank steak to a pack of ravenous wolves.
At the first session the group piled on an unfortunate wild man from that backwater, the University of Utah, named Alan Kay. Kay had stepped forth in a public session to pitch his vision of a computer you could hold in your hand. He had already coined a name for it: “Dynabook,” a notebook-shaped machine with a display screen and a keyboard you could use to create, edit, and store a very personal sort of literature, music, and art.
Taylor’s original model of a nationwide computer network grew out of his observation that time-sharing was starting to promote the formation of a sort of nationwide computing brotherhood (at this time very few members were women). Whether they were at MIT, Stanford, or UCLA, researchers were all looking for answers to the same general questions. “These people began to know one another, share a lot of information, and ask of one another, ‘How do I use this? Where do I find that?’” Taylor recalled. “It was really phenomenal to see this computer become a medium that stimulated the formation of a…
This concept would develop into the ARPANET. The idea owed something to Licklider, who had earlier proposed what he dryly called an “intergalactic network” of mainframes. During his time at ARPA the notion remained theoretical, however; it was hard enough to get small-scale time-sharing systems to run individually, much less in concert with one another. But Taylor judged that the technology had now progressed far enough to make the concept practical. He did not deceive himself: Building such a system meant overcoming prodigious obstacles. On the other hand, ARPA’s generous umbrella sheltered…
Engelbart’s vision refined and expanded a concept memorably set forth by Dr. Vannevar Bush, an MIT engineering dean and wartime science advisor to Franklin D. Roosevelt. In 1945 Bush had turned his attention to the scientific advances produced in the name of war and to how they might serve the peace. The result was a small masterpiece of scientific augury entitled “As We May Think,” which appeared in the July 1945 issue of The Atlantic Monthly.
“As We May Think” remains one of the few genuinely seminal documents of the computer age. Even today it stands out as a work of meticulous scientific and social analysis. The contemporary reader is struck by its pragmatism and farsightedness, expressed without a hint of platitude or utopianism, those common afflictions of writing about the future.
He sketched out something called the “memex,” which he described as “a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility.” The mechanism of consultation would be “associative indexing…whereby any item may be caused at will to select immediately and automatically another. This is the essential feature of the memex.”
Obsessed with developing new ways for man and computer to interact, Engelbart linked video terminals to mainframes by cable and communicated with the machines via televised images. To allow the user to move the insertion point, or cursor, from place to place in a block of text instantaneously, he outfitted a hollowed-out block of wood with two small wheels fixed at right angles so it could be rolled smoothly over a flat surface. The wheels communicated their motion to potentiometers whose signals in turn were translated by the computer into the placement of the cursor on the screen. From this…
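The two-wheel mechanism described above reduces to simple arithmetic once the potentiometer signals are digitized. The sketch below is my own reconstruction, not Engelbart’s code; the screen size and scaling constant are hypothetical:

    # Fold one sample of wheel motion into the cursor position. Each wheel
    # reports motion along one axis; the computer accumulates the deltas
    # and clamps the result to the screen.
    SCREEN_W, SCREEN_H = 1024, 808  # hypothetical display dimensions

    def update_cursor(x, y, dx_wheel, dy_wheel, counts_per_pixel=4):
        x += dx_wheel / counts_per_pixel
        y += dy_wheel / counts_per_pixel
        return (min(max(x, 0), SCREEN_W - 1), min(max(y, 0), SCREEN_H - 1))

    print(update_cursor(100, 100, dx_wheel=8, dy_wheel=4))  # (102.0, 101.0)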
The pièce de résistance was Engelbart’s implementation of the memex. The screen showed how a user could select a single word in a text document and be instantly transported to the relevant portion of a second document—the essence of hypertext, found today, some thirty years later, on every World Wide Web page and countless word-processed documents.
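The linking idea itself is compact enough to sketch. Below is a minimal illustration of the hypertext behavior just described (my own sketch; the document names and line numbers are invented, not Engelbart’s data):

    # A toy link table: a selected word in one document resolves to a
    # position in another document. All names here are hypothetical.
    links = {
        ("report.txt", "memex"): ("bush_essay.txt", 42),  # word -> (doc, line)
    }

    def follow(doc, word):
        """Return the (document, line) a selected word jumps to, if any."""
        return links.get((doc, word))

    print(follow("report.txt", "memex"))  # ('bush_essay.txt', 42)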
With Lampson on board, Genie picked up momentum. The group tore apart the SDS 930, tacked on new hardware, and wrote an entirely new operating system. “There weren’t any spectacularly new ideas in the project,” Lampson said later. “The point was to try to take ideas that other people had, some of which had been implemented on other machines, and show you could make it all work in a much less grandiose environment.” Genie accomplished its goal, which was to bring time-sharing to the masses by implementing it on the small machine that Taylor and Currie eventually beguiled Palevsky into marketing…
“This was definitely not your two-guys-in-a-garage startup,” said Lampson, who by now held a faculty appointment at Berkeley and set his own name down as a co-founder. It was, however, something infinitely more risky. The BCC pioneers were about to become victims of the “second-system effect.” The theory of second systems was formulated by an IBM executive named Frederick Brooks, whose career supervising large-scale software teams taught him that designers of computer systems tend to build into their second projects all the pet features that tight finances or short deadlines forced them to…
But one man was way ahead of them all. That one had written a doctoral thesis at Utah in 1969 describing an idealized interactive computer called the FLEX machine. He had experimented with powerful displays and with computers networked in intricate configurations. On page after page of his dissertation he lamented the inability of the world’s existing hardware to realize his dream of an interactive personal computer. He set before science the challenge to build the machine he imagined, one with “enough power to outrace your senses of sight and hearing, enough capacity to store thousands of…
Visible within the flood of ideas is the Alan Kay who made computing cool. He declared publicly that it was all right to use three-million-dollar machines to play games and “screw around.” If that meant grad students were blasting digital rocket ships off their computer screens in a game called “Spacewar,” it was all part of the weaving of new technology into the cultural fabric. His unashamed view of the computer as very much a toy liberated many others to explore its genius for procedures other than the parsing of numbers and the sequencing of databases—to see it, in other words, as a…
One factor in his powerful kinship with Bob Taylor was their shared curiosity about what this machine could be made to do, more than how. Notwithstanding his incessant harangues, most of the inspired engineers Taylor recruited to CSL, the Lampsons and Thackers, started out too blindly focused on the issue of what was within their power to actually build. They would ask: What is the next stop on the road? Kay turned the question inside out: Let’s figure out where we want to go, and that will show us how to get there. He never lost sight of the computer’s appropriate station in the world: to…
“People get trapped in thinking that anything in the environment is to be taken as a given. It’s part of the way our nervous system works. But it’s dangerous to take it as a given because then it controls you, rather than the other way around. That’s McLuhan’s insight, one of the bigger ones in the twentieth century. Zen in the twentieth century is about taking things that have been rendered invisible by this process and trying to make them visible again.
“Parents ask me what they should do to help their kids with science. I say, on a walk always take a magnifying glass along. Be a miniature exploratorium….” You would have to know something about his life to recognize this as a scene from his childhood. Kay’s father was a scientist, a physiologist engaged in designing prostheses for arms and legs. “I can certainly remember going on walks with him,” Kay recalled. “And when you go on a walk with a parent who’s interested in science, it’s usually about all the things that you can’t readily see.”
United’s IBM 305 RAMAC was the first one he ever touched. It was huge, specifically designed to manage colossal databases like the fifty-two weeks’ worth of reservations and seating records consigned to Denver’s safekeeping. But what really struck Kay was the primitiveness of its operational routine. The system was serviced by platoons of attendants, full-time menials doing nothing more refined than taking stacks of punch cards from one machine and loading them in the next. To his amazement, digital electronics turned out to be as mindless and labor-intensive as laying a sewer line.
There was an exorbitant discrepancy between the purpose of the machine—which was to simplify human endeavor—and the effort required to realize it. Kay banked the insight.
Nothing of the kind existed in the computer world in the 1960s. Machines differed in shape, size, and architecture down to the circuitry inside their cabinets and the sequences of digital ones and zeros delivering instructions to the central processing unit.
Therefore Kay, who had programmed everything from a Burroughs 5000 at the Air Force Air Training Command to a Control Data 6600 at NCAR, the National Center for Atmospheric Research, was compelled to become a student of computer architectures. Subconsciously his mind was absorbing the principles of programming that would grow a few years hence at PARC into an extraordinary advance in software design. As he recalled later, however, at the moment “I barely saw it.”
The essay forecast that as circuits became more densely packed with microscopic transistors, computing power would grow exponentially in performance and shrink in cost over the years. Moore contended that this trend could be predicted mathematically, so that memory costing $500,000 in 1965 would come all the way down to $3,000 by 1985—an insight so basic to the subsequent growth of the computer industry that it has been known ever since as “Moore’s Law.”
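The arithmetic of that example is easy to verify. The sketch below (my own check, using only the two dollar figures quoted above) computes the halving period implied by a drop from $500,000 to $3,000 over twenty years; it works out to a cost halving roughly every 2.7 years, consistent with Moore’s doubling cadence:

    # Check the implied cost decline of the $500,000 (1965) -> $3,000 (1985)
    # example quoted above. The dollar figures come from the text; the rest
    # is plain arithmetic.
    import math

    cost_1965 = 500_000  # dollars
    cost_1985 = 3_000    # dollars
    years = 1985 - 1965

    halvings = math.log2(cost_1965 / cost_1985)  # ~7.4 halvings of cost
    halving_period = years / halvings            # ~2.7 years per halving

    print(f"{halvings:.1f} halvings over {years} years; "
          f"cost halves about every {halving_period:.1f} years")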