Kindle Notes & Highlights
Read between
July 22 - September 30, 2017
Douglas Engelbart had been a voice in the wilderness until then; his own bosses at SRI International, in what would soon become Silicon Valley, thought he was an absolute flake.
With the Luftwaffe making nightly bombing runs over London, and with German U-boats wreaking havoc in the North Atlantic, his most urgent priorities had to be the development of radar, antiaircraft fire control, and antisubmarine warfare. (In 1941 the list would expand to include a supersecret crash program known as the Manhattan Project, which was intended to exploit a newly discovered phenomenon called nuclear fission.)
Hopper would later gain fame both as a teacher and as a pioneer in the development of high-level programming languages. Yet perhaps her best-known contribution came in the summer of 1945, when she and her colleagues were tracking down a glitch in the Mark II and discovered a large moth that had gotten crushed by one of the relay switches and shorted it out. She taped the dead moth into the logbook with the notation "First case of an actual bug being found."
In the intellectual turmoil of Cambridge in 1948, they were beginning to glimpse a way to be completely rigorous and scientific and yet still believe in the existence of mind and the reality of our inner lives.
However, the Latin gubernator turned out to be a corruption of the Greek word for "steersman," kybernetes. And that, Wiener felt, could be transmuted into English very nicely, as cybernetics.
Cybernetics was not an easy book to write, says Selfridge, who watched Wiener struggle through many revisions.
"The first industrial revolution, the revolution of the 'dark, satanic mills,' was the devaluation of the human arm by the competition of machinery," he wrote. "The modern industrial revolution is similarly bound to devalue the human brain, at least in its simpler and more routine decisions."
And Shannon, being human, was not displeased. He was exceedingly disturbed, however, by a related development. Within a year or two of his initial publication, much to his horror, information theory started to become . . . popular. Anybody and everybody seemed to be plunging into the field, many without a clue as to what they were talking about. Scientists were beginning to submit grant applications that included references to information theory, whether or not their proposals actually had anything to do with it. It was becoming a scientific buzzword, much as chaos, artificial intelligence . . .
As a general rule of thumb, for example, music teachers proved to be particularly adept. And much to the project leaders' astonishment (this being the 1950s), women often turned out to be more proficient than men at worrying about the details while simultaneously keeping the big picture in mind. One of the project's best programming groups was 80 percent female.
"The hope is that, in not too many years, human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today."
And most especially, what you needed was interactivity day or night, at work or at home, anytime you had an idea, anytime you wanted.
What ultimately had value was not computer time, he kept insisting, but human time.
Lick, who hated to write but hated even more to disappoint anyone in his professional family, agreed to help out. He handed Elkind the completed manuscript of "Man-Computer Symbiosis" on January 13, 1960.
Future programming languages would be graphical, incorporating gestures and strokes and images in a way that would be so transparent and so intuitive that people would be able to use them with little or no training.
His name was Douglas Engelbart.
Not in the biological "hardware," said Engelbart: at the level of neurons and neurochemistry, all human brains are essentially identical. No, the difference lies in the biological "software," the repertoire of concepts and skills that each of us acquires over a lifetime.
isolated skills and concepts are useless unless they can be organized for a larger purpose:
CTSS did have one great thing going for it: Corbató had designed it as an open system, in the sense that users could modify it, tailor it, and extend it however they wanted. As Corbató himself said, "This open system quality . . . allowed everyone to make the system be their thing, rather than what somebody imposed on them."
"The presence of such errors in a program is not evidence of poor workmanship on the part of the programmers," he wrote. "[The fact is] that no complex program can ever be run through all its possible states or conditions in order to permit its designers to check that what they think ought to happen actually does happen."
software crisis was to apply more and better interactive computing, à la Dynamic Modeling. But to programmers who worked in the commercial sector—which was to say, most programmers—the answer was very different. In their world, software was a product to be gotten out the door, on time and within budget. So their instinctive reaction was to adopt an industrial approach, with an ever-increasing emphasis on planning, discipline, documentation, coordination, and control. Perhaps not surprisingly, this instinct also fitted in perfectly with the batch process-oriented, think-it-through-ahead-of-time . . .
He just didn't think it should be the only approach.
The real significance of computing was to be found not in this gadget or that gadget, but in how the technology was woven into the fabric of human life—how computers could change the way people thought, the way they created, the way they communicated, the way they worked together, the way they organized themselves, even the way they apportioned power and responsibility.
Simula put each of those data structures together with all its procedures in a tightly integrated package, so that each structure "knew" how to respond to commands. In practice, this meant that a Simula programmer could model an oil refinery, say, in much the same way that he or she thought about a real refinery: not as a list of abstract data structures and equally abstract procedures, but in terms of valves, pipes, tanks, and whatever—tangible objects that had well-defined properties and characteristic behaviors. The potential gain in conceptual clarity was enormous.
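The Simula idea described above can be sketched in modern terms. This is a hypothetical illustration in Python, not code from the book: each object bundles its data with the procedures that act on it, so a refinery model reads in terms of tanks and valves rather than abstract records and free-floating subroutines. The class and attribute names here are invented for the example.

```python
# Hypothetical sketch of Simula-style object modeling: each object
# "knows" how to respond to commands sent to it.

class Tank:
    def __init__(self, capacity):
        self.capacity = capacity  # maximum volume the tank can hold
        self.level = 0.0          # current contents

    def fill(self, amount):
        # The tank itself enforces its well-defined behavior:
        # it never fills past capacity.
        self.level = min(self.capacity, self.level + amount)


class Valve:
    def __init__(self, flow_rate):
        self.flow_rate = flow_rate
        self.is_open = False

    def toggle(self):
        self.is_open = not self.is_open


# Simulate one time step of flow through an open valve into a tank.
tank = Tank(capacity=100.0)
valve = Valve(flow_rate=5.0)
valve.toggle()  # open the valve
if valve.is_open:
    tank.fill(valve.flow_rate)
print(tank.level)  # 5.0
```

The conceptual gain the passage describes is visible even in this toy: the simulation loop manipulates tangible objects, and the rules governing each component live inside the component itself.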
"As with Simula leading to OOP [object-oriented programming]," he wrote, "this encounter finally hit me with what the destiny of personal computing really was going to be. Not a personal dynamic vehicle, as in Engelbart's metaphor . . . but something much more profound: a personal dynamic medium."*
"We used to work until we were exhausted, and would sleep until we woke, without regard for alarm clocks or the sun."