Kindle Notes & Highlights
“One of the reasons our group was successful, and got a big jump on others, was that we set up certain limited objectives, namely that we would not produce any new elementary components,” adds Bigelow. “We would try and use the ones which were available for standard communications purposes.
reliable components,
In 1946, on the eve of the transistor, it was uncertain whether the non-zero probability of error in any individual digital transformation would bring a computation involving millions of transformations to a halt.
Not only did the widespread use of the 6J6 mean that it was available inexpensively,
Abraham Wald, who founded sequential analysis while working with our group. Statistical thinking had
Of the final total of 3,474 tubes in the IAS computer, 1,979 were 6J6s.
1951 “Reliable Organizations of Unreliable Elements” and 1952 “Probabilistic Logics and the Synthesis of Reliable Organisms from Unreliable Components,
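Note: these two papers tackle the problem flagged a few highlights back. If each elementary operation errs with probability p, an N-step computation comes out right only with probability (1-p)^N, which is hopeless for millions of steps; von Neumann's answer was redundancy with majority voting. A toy Python sketch of the idea (my own illustration, and it assumes a perfectly reliable voting organ, which the actual paper does not):

import random

def noisy_not(bit, p):
    """An unreliable inverter: gives the wrong answer with probability p."""
    out = 1 - bit
    return out if random.random() > p else 1 - out

def majority(a, b, c):
    """Majority 'organ': output agrees with at least two of its three inputs."""
    return 1 if a + b + c >= 2 else 0

def redundant_not(bit, p):
    """Run three unreliable inverters and vote on the result."""
    return majority(noisy_not(bit, p), noisy_not(bit, p), noisy_not(bit, p))

def error_rate(fn, trials=100_000, p=0.01):
    return sum(fn(0, p) != 1 for _ in range(trials)) / trials

print("single gate :", error_rate(noisy_not))      # about p = 1%
print("voted triple:", error_rate(redundant_not))  # about 3*p^2 = 0.03%

The voted version fails only when at least two of the three copies fail at once, so a 1 percent gate behaves like a 0.03 percent gate.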
In the post exchange, he found a copy of Atomic Energy for Military Purposes, a swiftly declassified nontechnical account of the Manhattan Project by Henry Smyth, chairman of the physics department at Princeton
the problem was how to build a forty-stage shift register, this being at the heart of the machine’s ability to compute.
learn how to design a reliable 40-stage machine with thousands of unreliable components,
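Note: a shift register is just a row of storage stages in which, on every clock pulse, each bit moves one stage along, a new bit enters at one end, and the oldest bit falls out the other; forty stages hold one 40-bit word. A toy Python model (names and structure are mine, not the IAS circuit):

from collections import deque

class ShiftRegister:
    """Toy n-stage shift register (forty stages in the IAS discussion)."""

    def __init__(self, stages=40):
        self.stages = deque([0] * stages)

    def clock(self, bit_in):
        """One clock pulse: every bit moves one stage; return the bit shifted out."""
        bit_out = self.stages.popleft()
        self.stages.append(bit_in & 1)
        return bit_out

reg = ShiftRegister(40)
word = [1, 0, 1, 1] + [0] * 36
out = [reg.clock(b) for b in word]        # forty zeros: the register started empty
out += [reg.clock(0) for _ in range(4)]   # now the 1,0,1,1 pattern re-emerges
print(out[40:44])                         # -> [1, 0, 1, 1]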
specifications to what is now called “worst-case design
mechanical devices, vacuum tubes are weakened by age, not use, and “suffer accidental failures
Optimum reliability can therefore be achieved by operating as few tubes as possible, at maximum speed.
separate signal from noise at every stage of the process—in this case, at the transfer of every single bit—rather than allowing noise to accumulate along the way.
von Neumann started consulting for IBM.” All technical details of the MANIAC and its programming were placed in the public domain,
Much of what was learned in building the wire drive was later applied to an auxiliary 2,048-word magnetic drum, equivalent to a 40-channel wire drive running fixed loops of wire through independent read/write heads.
Williams and Kilburn had demonstrated how a sequence of pulses (in time) could be converted to a pattern of spots (in space) and stored indefinitely as long as the pattern were regenerated periodically by a trace from an electron beam. The spots become positively charged (i.e., deficient in electrons) as a result of secondary electron emission by the phosphor,
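Note: a toy model of that regeneration cycle, with made-up constants (nothing here is measured from a real Williams tube): charge spots leak away each machine cycle, and a bit survives only if the beam sweeps back, reads the spot against a threshold, and rewrites it before the charge decays past the decision level.

DECAY = 0.90       # fraction of charge surviving one cycle (illustrative)
THRESHOLD = 0.5    # read-amplifier decision level
FULL = 1.0

def step(charges, regenerate):
    """One machine cycle: charge leaks; optionally sweep, read, and rewrite every spot."""
    charges = [c * DECAY for c in charges]
    if regenerate:
        charges = [FULL if c > THRESHOLD else 0.0 for c in charges]
    return charges

bits = [1, 0, 1, 1, 0, 0, 1, 0]
kept = [FULL * b for b in bits]
lost = list(kept)
for cycle in range(100):
    kept = step(kept, regenerate=(cycle % 4 == 0))   # refreshed: pattern persists
    lost = step(lost, regenerate=False)              # unrefreshed: the ones fade away
print([int(c > THRESHOLD) for c in kept])   # -> [1, 0, 1, 1, 0, 0, 1, 0]
print([int(c > THRESHOLD) for c in lost])   # -> [0, 0, 0, 0, 0, 0, 0, 0]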
“if noise is ever to be filtered from signal, it must be done at the earliest possible stage.
The 1,024 bits in each cylinder were visible to the naked eye, flickering from one machine cycle to the next
In modern (or once-modern) computers, a cathode-ray tube (CRT) displays the state of a temporary memory buffer whose contents are produced by the central processing unit (CPU). In the MANIAC, however, cathode-ray tubes were the core memory, storing the instructions that drove the operations of the CPU. The
Parallel memory access would make the computer forty times as fast as a serial processor but, in the opinion of numerous skeptics, unlikely to work without one thing or another always going wrong.
Each individual memory tube had its own logbook recording its health history and any idiosyncrasies that arose along the way.
It just isn’t decent for the operator to have to worry about how the machine is built.
RCA, distracted by television, never took the Selectron seriously and failed to give Rajchman, working largely alone, the resources to make it a success.
Rajchman and RCA could have further developed the Selectron, but RCA let the electron-beam-switched memory languish because they wanted to focus on the TV business anyway.
It took years of midnight oil to sort these problems out, but the general trend was for hardware to become more reliable and error-free, while codes grew more complicated, and error-prone.
“His problem was that he was a thinker,” says Atle Selberg, whose wife, Hedi, was hired by von Neumann on September 29, 1950, and remained with the computer project until its termination in 1958. “He wouldn’t leave things alone when other people thought they were finished. Julian was always thinking of doing something a bit more here and there.”
“Von Neumann understood this very deeply,” Bigelow confirms. “So when looking at ENIAC, or some of the early machines which were very inflexible, he saw better than any other man that this was just the first step, and that great improvement would come.”
“Von Neumann singled out the problem of numerical weather prediction for special attention,” Thompson later explained, “as the most complex, interactive, and highly nonlinear problem that had ever been conceived of—one that would challenge the capabilities of the fastest computing devices for many years.”
World War II, with its growing dependence on aircraft, increased the demand for forecasts, while weather radar and radio-equipped weather balloons increased the supply of observational data needed to produce them.
So weather prediction is another really important computational motivation, because of air force troubles.
he imagined, adding that “perhaps some day in the dim future, it will be possible to advance the computations faster than the weather advances, and at a cost less than the saving to mankind due to the information gained.”
Ha, so Richardson had a very real-time goal for computing weather predictions: you can't usefully predict too many hours into the future, yet it takes several hours for humans to perform the calculations.
Eckert and Mauchly were upset that the New York Times had made no mention of the ENIAC, but had mentioned the proposed IAS/RCA computer, which did not even exist. They felt scooped by von Neumann, as they had over the authorship of the EDVAC report, and prevented, by the secrecy imposed on their own project, from voicing a response.
he estimated that once the new computer was operating, “a completely calculated prediction for the entire northern hemisphere should take about 2 hours per day of prediction.”
And it was Charney who did the most to solve Richardson’s third problem: formulating equations whose solutions did not quickly become more unstable than the weather itself.
and we did catch the cyclogenesis. It wasn’t terribly accurate, but there was no question that [we did]. And I always thought that this was a terribly important thing.… I wanted the world to know about that!”
The next six months brought intense activity: the Trinity test, Hiroshima, Nagasaki, the surrender of Japan, and, behind the scenes, the completion of the ENIAC, the first H-bomb calculations, and the launching of the computer project at the IAS.
Canfield solitaire with fifty-two cards will play out successfully? “After spending a lot of time trying to estimate them by pure combinatorial calculations,” he recalls, “I wondered whether a more practical method than ‘abstract thinking’ might not be to lay it out say one hundred times and simply observe and count the number of successful plays.” This, he noted, was a far easier way to arrive at an approximate answer than “to try to compute all the combinatorial possibilities which are an exponentially increasing number so great that, except in very elementary cases, there is no way to
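Note: this is the Monte Carlo method in a nutshell: when the combinatorics are hopeless, deal the game at random many times and take the observed win frequency as the estimate. Canfield itself needs a full solitaire engine, so this Python sketch substitutes a deliberately trivial stand-in condition on a shuffled deck; only the sampling idea is the point, everything else is invented for illustration.

import random

def deal_succeeds(deck):
    """Stand-in for 'the game plays out': none of the first four cards is an ace.
    (A real Canfield run would play the whole game; this just keeps it short.)"""
    return not any(card % 13 == 0 for card in deck[:4])

def monte_carlo(trials=100_000):
    deck = list(range(52))            # four suits of thirteen ranks, 0..51
    wins = 0
    for _ in range(trials):
        random.shuffle(deck)
        wins += deal_succeeds(deck)
    return wins / trials

estimate = monte_carlo()
exact = (48/52) * (47/51) * (46/50) * (45/49)   # about 0.7187, for comparison
print(f"Monte Carlo estimate: {estimate:.4f}   exact: {exact:.4f}")

For the toy condition the exact answer is easy, which is what makes the comparison possible; for the real game it is not, which was Ulam's whole point.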
“Your code was described and was impressive,” von Neumann wrote to Klári from Los Alamos, discussing whether a routine she had developed should be coded as software or hardwired into the machine. “They claim now, however, that making one more, ‘fixed,’ function table is so little work, that they want to do it. It was decided that they will build one, with the order soldered in.”
With the success of Monte Carlo came a sudden demand for a reliable supply of random numbers; there was a shortage of them.
The U.S. Air Force’s Project RAND (progenitor of the RAND Corporation), for whom von Neumann was consulting in Santa Monica, took it upon themselves, in April 1947, to build an electronic roulette wheel and compile a list of one million random numbers, available first as punched cards and later expanded
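Note: the highlights say where the tabulated random numbers came from, not how a machine could make its own; von Neumann's well-known suggestion from this period was the "middle-square" method, in which each pseudorandom number is the middle digits of the square of the previous one. A minimal Python sketch (seed and digit count chosen arbitrarily):

def middle_square(seed, count, digits=4):
    """Von Neumann's middle-square generator: square the current value and keep
    its middle digits as the next value.  Fast and simple, but the sequence
    eventually falls into a short cycle or collapses to zero."""
    value, out = seed, []
    for _ in range(count):
        squared = str(value ** 2).zfill(2 * digits)
        start = (len(squared) - digits) // 2
        value = int(squared[start:start + digits])
        out.append(value)
    return out

print(middle_square(seed=5731, count=10))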
To help determine whether the Super was something the United States should pursue, or be afraid of its enemies pursuing, it was decided to run the big December 1945 ENIAC calculations, and to hold a conference, in April 1946, on the results.
Barricelli “insisted on using punched cards, even when everybody had computer screens,” according to Gaure. “He gave two reasons for this: when you sit in front of a screen your ability to think clearly declines because you’re distracted by irrelevancies, and when you store your data on magnetic media you can’t be sure they’re there permanently,
Claude Shannon with his 1940 PhD thesis on “An Algebra for Theoretical Genetics” (which was followed by a year at IAS), had already built a framework into which the double helix neatly