- In 1942, while discussing machines for directing anti-aircraft guns, George Stibitz, a mathematician at Bell Telephone Laboratories, disliked the term “electrical pulses” and suggested another term: digital.
- The true mechanization of calculation began when inventors devised ways not only to record numbers but to add them, in particular the ability to carry a digit from one column to the next (as in 999 + 1). This began with Pascal’s adding machine of 1642, or arguably with a device built by Wilhelm Schickard in 1623. Leibniz extended Pascal’s invention so that it could multiply as well. The Felt Comptometer, invented in the 1880s, was one of the first commercially successful calculators.
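
  A minimal sketch of that column-by-column carry, written in Python purely as an illustration (the machines themselves were mechanical, of course):

  ```python
  # Toy illustration of column-by-column addition with carry,
  # the mechanism that Pascal's and Schickard's machines automated.
  def add_with_carry(a: str, b: str) -> str:
      width = max(len(a), len(b))
      a, b = a.zfill(width), b.zfill(width)
      digits, carry = [], 0
      for da, db in zip(reversed(a), reversed(b)):  # rightmost column first
          total = int(da) + int(db) + carry
          digits.append(str(total % 10))            # digit that stays in this column
          carry = total // 10                       # digit carried into the next column
      if carry:
          digits.append(str(carry))
      return "".join(reversed(digits))

  print(add_with_carry("999", "1"))  # -> 1000: the carry ripples across every column
  ```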
- Charles Babbage proposed in the 1830s the use of punched cards for his Analytical Engine, an idea he borrowed from the Jacquard loom, which used punched cards to control the weaving of cloth by selectively lifting threads according to a predetermined pattern. Herman Hollerith used this technique to develop a machine for the 1890 US Census. The crucial difference between Jacquard’s and Hollerith’s systems: Jacquard used cards for control, whereas Hollerith used them for storage of data. Eventually IBM’s punched card installations would use the cards for control as well. Before WWII, the control function in a punched card installation was carried out manually by people. The concept of automatic control is the ancestor of what we now call software.
- (1) control, (2) storage, (3) calculation, (4) the use of electric or electronic circuits: these attributes, when combined, make a computer. A fifth attribute, (5) communication, was lacking in the early computers built in the 1930s and 1940s. It was an ARPA project in the 1960s that reoriented the digital computer into an inherently networked device.
- In 1936, Alan Turing proposed the Turing machine, and in 1937 Konrad Zuse (unaware of Turing’s paper) created the first universal machine. Babbage never got that far, but he did anticipate the notion of the universality of a programmable machine. John von Neumann conceived the stored-program principle, in which a single physical memory stores both the instructions (the program) and the data, without any distinction between them. All of the machines so far used electricity to carry signals, but none used electronic devices to do the actual calculations. In 1938, J. V. Atanasoff did so, using vacuum tubes to solve systems of linear equations (and nothing else), the first machine of its kind. In doing so, he said he had considered “analogue” techniques but discarded them (likely the origin of the term analog in computing). Then came the Colossus in 1944 and the ENIAC in 1946 (the first reprogrammable computer, although reprogramming it took days).
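
  A minimal sketch of the stored-program idea as a hypothetical toy machine in Python: a single flat memory holds the instructions and the data side by side, and nothing in the memory itself distinguishes one from the other.

  ```python
  # Toy stored-program machine: one memory array holds both the program
  # (opcode, operand pairs) and the data it operates on.
  memory = [
      ("LOAD", 6),   # addr 0: load the value at address 6 into the accumulator
      ("ADD", 7),    # addr 1: add the value at address 7
      ("STORE", 8),  # addr 2: write the accumulator back to address 8
      ("HALT", 0),   # addr 3: stop
      None, None,    # addrs 4-5: unused
      40, 2,         # addrs 6-7: data, stored no differently than the code above
      0,             # addr 8: the result will be written here
  ]

  acc, pc = 0, 0                # accumulator and program counter
  while True:
      op, arg = memory[pc]      # fetch the next instruction from the same memory
      pc += 1
      if op == "LOAD":
          acc = memory[arg]
      elif op == "ADD":
          acc += memory[arg]
      elif op == "STORE":
          memory[arg] = acc
      elif op == "HALT":
          break

  print(memory[8])  # -> 42
  ```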
- The 1950s and 1960s were the decades of the mainframe, so called because of the large metal frames on which the computer circuits were mounted. The UNIVAC inaugurated the era, but IBM quickly came to dominate the industry. In the 1950s, mainframes used thousands of vacuum tubes; by 1960 these had been replaced by transistors, which consumed less power and were more compact. A person skilled in mathematical analysis would lay out the steps needed to solve a problem, translate those steps into a code that was intrinsic to the machine’s design, punch those codes into paper tape or its equivalent, and run the machine. Through the 1950s, a special kind of program called a compiler, pioneered by Grace Hopper, who led the UNIVAC team, would take human-friendly instructions and translate them into machine code.
- Programming computers remained in the hands of a few specialists who were comfortable with machine code. The breakthrough came in 1957 with Fortran, introduced by IBM, whose syntax was close to ordinary algebra and therefore familiar to engineers. The Fortran compiler generated machine code that was as efficient and fast as code written by human beings (it was around this time that these codes came to be called “languages”). Its major competitor was COBOL, in part because the US DoD adopted it officially; as a result, COBOL became one of the first languages to be standardized to the point where the same program could run on different computers, from different vendors, and produce the same results. As computers took on work of greater complexity, another type of program emerged to replace the human operators who managed the flow of work in a punched card installation: the operating system, which appeared around 1956.
- Vacuum tubes worked through the switching of electrons, excited by a hot filament and moving at high speed within a vacuum. They were very fast but cumbersome, since tubes tended to burn out (like lightbulbs) and required that hot filament. A new invention from Bell Labs in 1947 (using high-purity germanium produced at Purdue, my alma mater) made it possible to move past both bottlenecks: the transistor. Manufacturers began selling transistorized computers in the mid-1950s. The chip was a breakthrough of the early 1960s because it provided a way of placing multiple transistors and other devices on a single wafer of silicon or germanium. As chip density increased, the functions a chip performed became more specialized, and the likelihood that a particular logic chip would find common use among a wide variety of customers got smaller and smaller. In this context, IBM started using the term architecture to refer to the overall design of a computer.
- Those computers lacked communication capabilities. The first computers of the ARPANET were linked to one another in 1969, and by 1971 there were 15 computers on the network. However, as things progressed, there were multiple networks of computers independent of each other (many “internets”). ARPANET was the first, for research and scientific purposes, but there were also AOL, Usenet, BITNET, and many others with different purposes and uses. The main difference between the modern Internet and ARPANET is the social and cultural dimension we have today. Things changed when the National Science Foundation (NSF) made three key decisions in 1986: (1) create a general-purpose network, available to researchers in general; (2) adopt the TCP/IP protocol promulgated by ARPA; (3) fund the construction of a high-speed backbone to connect supercomputer centers and local/regional networks. One of the main recipients of these construction contracts was MCI (today a subsidiary of Verizon), which to this day is one of the principal carriers of Internet backbone traffic. Soon BITNET and Usenet established connections to this major network, along with other domestic and international networks. The original ARPANET became obsolete and was decommissioned in 1990. By 1995, all Internet backbone services were operated by commercial entities.
- In a milestone piece of legislation in 1992, “the Foundation is authorized to foster and support access by the research and education communities to computer networks which may be used substantially for purposes in addition to research and education in the sciences and engineering”. With those three words, “in addition to”, the modern Internet was born, and the NSF’s role in it receded.
- Then Tim Berners-Lee invented the World Wide Web at CERN. It made the Internet more accessible. It had three main components: the URL, HTTP, and HTML. He wrote a program, called a browser, that users installed on their computers to display information transmitted over the Web. And it spread: a group at the NCSA (University of Illinois) created Mosaic, a browser with rich graphics that was integrated with the mouse.
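
  A minimal sketch of how the three pieces fit together, using Python’s standard library and example.com as a stand-in address: the URL names the resource, HTTP is the protocol used to request it, and HTML is the markup that comes back for a browser to render.

  ```python
  from urllib.request import urlopen

  # URL: names where the resource lives (example.com is just a placeholder here).
  url = "http://example.com/"

  with urlopen(url) as response:               # HTTP: the request/response protocol
      html = response.read().decode("utf-8")   # HTML: the markup a browser renders
      print(response.status)                   # e.g. 200 (OK)

  print(html[:60])  # begins with something like "<!doctype html>"
  ```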
- Another big invention was the specialized website for searching the web, which up to that point had been indexed manually and poorly. AltaVista was the first big one, but it was displaced by Google, with its higher-quality ranking algorithm.
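
  A minimal, hypothetical sketch (in Python, with made-up URLs) of the core idea behind the automatic indexing these sites introduced: an inverted index mapping each word to the pages that contain it.

  ```python
  from collections import defaultdict

  # A tiny corpus standing in for crawled web pages (URLs and text are invented).
  pages = {
      "http://example.org/hollerith": "hollerith used punched cards for data storage",
      "http://example.org/jacquard":  "the jacquard loom used punched cards for control",
      "http://example.org/eniac":     "eniac was reprogrammed by rewiring its panels",
  }

  # Build the inverted index: word -> set of pages containing that word.
  index = defaultdict(set)
  for url, text in pages.items():
      for word in text.split():
          index[word].add(url)

  def search(query: str) -> set:
      """Return the pages that contain every word of the query."""
      hits = [index.get(word, set()) for word in query.lower().split()]
      return set.intersection(*hits) if hits else set()

  print(search("punched cards"))  # -> the Hollerith and Jacquard pages
  ```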