Kindle Notes & Highlights
Read between December 17, 2017 and July 29, 2018
Stevens's strategy was simple: hire the most brilliant experimenters in the country, give them the best equipment money could buy, inspire them to the highest possible standards of intellectual clarity and experimental precision—and work them fourteen hours per day.
He was the founder of artificial intelligence, which for him meant (among other things) a computer that could respond to you in real time, with humanlike common sense. He was likewise the creator of Lisp, an interactive symbol-processing programming language that not only had a compelling mathematical beauty but would let you grow your programs in a much more open-ended, organic manner than batch processing ever could. And he was the inventor of general-purpose computer "time-sharing," a technique that let individual users interact with batch-processing behemoths in a way that looked very much …
Programming was for everyone, he insisted, not just the science and engineering majors. It was a fundamental intellectual skill, like mathematics or English composition. The point of his course was not merely to teach students how to write in Fortran or Algol, Perlis maintained, but also to teach them how to think about processes of all kinds—how to describe them, how to analyze them, and how to build up complex processes out of simpler ones.
Assault the technological frontiers everywhere you can, Ruina urged them. Go out to the university labs, the national labs, the private sector, anywhere. Look for people with ideas that push the envelope. Give them development money. Be generous. Take risks. Cut through the red tape. Do whatever you have to. But do it.
Give people something useful, and they'll use it.
the third great lesson of time-sharing: openness requires trust.
And by all accounts the wildest of the bunch was Utah's Alan Kay, a guy who was so far out in the future that not even this crowd could take him seriously—yet so funny, so glib, and so irrepressible that they listened anyway. When it was his turn to give a presentation, Kay told them about his idea for a "Dynabook," a little computer that you could carry around in one hand like a notebook.
PARC, they called it: the Xerox Palo Alto Research Center.
So in 1957, having had their fill of Shockley, and convinced that they could do far better working with a more tractable material, silicon, eight of his brightest young associates left to form a company of their own: Fairchild Semiconductor.
Among these émigrés was Noyce himself, who teamed up in 1968 with another Fairchild cofounder, Gordon Moore of Moore's law, to found a company they called Intel, short for "integrated electronics" (fortunately for posterity, they rejected other candidate names such as Elcal, for Electronics of California, and Ectek, for Electronic Computer Technology).
"Don't just invent the future; go live in it."
Make that especially Alan Kay. Brilliant, irrepressible, and blessed with a world-class gift for gab—not to mention a handlebar mustache and a grin that gave him the look of a mischievous hippie—the thirty-one-year-old Kay was Taylor's most gleefully subversive agent in SSL.
"My maternal grandmother was a schoolteacher, suffragette, lecturer, and one of the founders of the University of Massachusetts, Amherst," Kay wrote in a later account. "My maternal grandfather, Clifton Johnson, was a fairly well known illustrator, photographer, and writer (100+ books). He was also a musician, and played piano and pipe organ. He died the year I was born [1940], and the family myth is that I am the descendant most like him, both in interests and in temperament."
True learning, he insisted, required the active participation of the learner. True learning was a matter of curiosity and exploration—and the joy of discovering how each new experience fitted in with the web of memories, ideas, feelings, and sensations already in the mind. True learning, he was convinced, was what computers could bring to everyone.
"It is frightfully important for man to communicate with his fellow man,"
"The future," he wrote, "is not to be won by making a lot of minor technological advances and moving them immediately into the Services."
"Unix was the first really general-purpose operating system for minicomputers," he says. "It took off for two reasons. One, it was free. And two, Unix was the first operating system you could get source code [the full list of programming commands] for. You could hack it."
Dennis Ritchie and Ken Thompson had created Unix for their own use back in 1969, after Bell Labs' withdrawal from the Multics partnership, and the project had retained that loose, hands-on feeling ever since.
mainly Unix was tightly written because that was how the two men felt a kernel ought to be: sweet, quick, and clean.
Of course, sometimes you had to get down into the bits and bytes and compose a completely new software tool. But that was pretty easy, too, especially after Ritchie completely rewrote the Unix kernel in an elegant new computer language of his own devising. Since he'd based it on an earlier, experimental language by Thompson, code-named "B," Ritchie code-named his language "C." The name stuck.
But no matter: once the language was ready, Allen quit his job, Gates dropped out of school, they both moved to Albuquerque to be near MITS, and together they formed a little company called Micro Soft to market it.
On the hardware side, this challenge was taken up most famously by the Apple Computer Company, founded in 1976 by Homebrew Computer Club members Steve Wozniak and Steve Jobs, longtime buddies from the Silicon Valley town of Cupertino.
Games were still the most popular choice, of course, but educational software had a large market, too. And business users were beginning to find office-oriented products such as the WordStar word-processing package, the dBase database program, and the first of the must-have killer apps, VisiCalc, an electronic-spreadsheet program for the Apple II that automated the tedious chores of bookkeeping and financial analysis. Businesspeople were soon buying Apple IIs just so they could run the program; Apple later estimated that VisiCalc was responsible for the sale of some twenty-five thousand units …
they announced that Time's Man of the Year for 1982—"the greatest influence for good or evil"—was not a human being at all but a machine: the computer. "In 1982," declared the magazine's cover story, "a cascade of computers beeped and blipped their way into the American office, the American school, the American home. The 'information revolution' that futurists have long predicted has arrived, bringing with it the promise of dramatic changes in the way people live and work, perhaps even in the way they think. America will never be the same."
By now, of course, Steve Jobs's visit to PARC in December of 1979 has long since passed into legend. "Apple's daylight raid," as one writer called it, has become one of the founding myths of the personal-computer revolution, the moment when the nimble young Jason snatched the Golden Fleece from the sleeping dragon.
Kay had recently gone off to Los Angeles on a long-promised sabbatical, partly to be with a new girlfriend and partly to "take organ lessons,"
"Why hasn't this company brought this to market?" Jobs famously shouted, waving his arms around while his engineers did their best to ignore him and focus on how the system worked. "What's going on here? I don't get it!"
"Special attention must be paid to the friendliness of the user interaction and the subtleties that make using the Lisa rewarding and job-enriching."
Despite a decade's head start and more, history was passing them by. The business writers Douglas K. Smith and Robert C. Alexander said it all in the title of their 1988 account, Fumbling the Future: How Xerox Invented, Then Ignored, the First Personal Computer. Indeed, in the pop mythology of the personal-computer era, the story of PARC and the Alto has become an endlessly repeated classic: visionary genius stifled by leaden bureaucracy.
The first and most obvious was a rampant case of feature-itis: they loaded up the Star software with every neat thing they could think of, until it had grown to roughly a million lines of code and was at the ragged edge of what the 1981-vintage hardware could support. Indeed, that was the first thing users noticed: the Star was painfully, maddeningly sloooow.
A second and more fundamental error was the designers' decision to make the Star a closed system, meaning that all the hardware and all the software had to come from Xerox.
the third and perhaps most distressing error was that the designers passed up several chances to do something simpler. "The option was open all through the nineteen-seventies," says Lampson. "We could have shipped something much like the IBM PC, and a short time later shipped something very much like the original Alto."
"Lick was accepted for who he was: an extremely important person in the computer field."
"What amazed me was how many generations of people he influenced," says Malone. "There were people there from Hewlett-Packard, from DEC—from all over the place, all standing up and crediting Lick with giving them a chance to do their best work. It was a universal feeling."
"I pushed the Internet as hard as I did because I thought it was capable of becoming a vital part of the social fabric of the country—and the world," he says. "But it was also clear to me that having the government provide the network indefinitely wasn't going to fly. A network isn't something you can just buy; it's a long-term, continuing expense. And government doesn't do that well.
And then around Christmastime 1990, at CERN, the European Center for Particle Physics in Geneva, Switzerland, an English physicist named Tim Berners-Lee finished the initial coding of a system in which Internet files could be linked via hypertext.
In retrospect, of course, Berners-Lee's combination of hypertext and browsing would prove to be one of those brilliantly simple ideas that change everything. By giving users something to see, it set off a subtle but powerful shift in the psychology of the Internet.
So in the end Berners-Lee settled for a name that was still less than ideal, since its acronym was nine syllables long when spoken. Nonetheless, it did seem to say what he wanted it to say, so he went with it: World Wide Web.
And yet it's fair to ask, did Lick himself really have all that much to do with these developments? If he had never spun his dreams of symbiosis and all the rest, or if he hadn't accepted Jack Ruina's offer to start ARPA's command-and-control program, wouldn't the history of computing still have unfolded pretty much the way it did? Maybe.
"Most of the significant advances in computer technology—including the work that my group did at Xerox PARC—were simply extrapolations of Lick's vision. They were not really new visions of their own. So he was really the father of it all."