The Eye of the Master: A Social History of Artificial Intelligence
4%
‘To make machines look intelligent it was necessary that the sources of their power, the labour force which surrounded and ran them, be rendered invisible.’
4%
The targets of this power (or ‘surveillance capitalism’ in Shoshana Zuboff’s definition) are usually described not as actors possessing autonomy and ‘intelligence’ on their own but as passive subjects of measurement and control. This is a problem of critical theory in general and critical AI studies in particular: although these studies are concerned about the impact of AI on society, they often overlook the role of collective knowledge and labour as the primary source of the very ‘intelligence’ that AI comes to extract, encode, and commodify. Moreover, these studies often fail to see the …
7%
Interestingly, the statistical method of multidimensional projection originated from the fields of psychometrics and eugenics in the late nineteenth century, and was analogous to the technique employed by Charles Spearman for evaluating ‘general intelligence’ in the controversial practice of the intelligence quotient (IQ) test. This is further proof of the social genealogy of AI: the first artificial neural network – the perceptron – was born as the automation not of logical reasoning but of a statistical method originally used to measure intelligence in cognitive tasks and to organise …
9%
Cultural techniques – such as writing, reading, painting, counting, making music – are always older than the concepts that are generated from them. People wrote long before they conceptualized writing or alphabets; millennia passed before pictures and statues gave rise to the concept of the image; and to this day, people sing or make music without knowing anything about tones or musical notation systems. Counting, too, is older than the notion of numbers. To be sure, most cultures counted or performed certain mathematical operations; but they did not necessarily derive from this a concept of …
10%
Numbers, as much as abstract rules and heuristic practices, were key tools in the administration of ancient societies, but they were not invented from nothing: they materially emerged as a form of power through labour and rituals, through discipline and drill.
10%
This study can be taken as a rejoinder to the Platonic numerology that is central to the history of music: before numbers were used to measure the proportions of rhythm, the rhythm of work contributed to the invention of numbers. In the end, these findings cast a different light on the history of mathematics, so much so that one could suspect that algorithmic practices are even older than the concept of number itself.
10%
Damerow argued that learning is based on the construction of ‘mental models’ that fundamentally represent and internalise external actions.
11%
To understand abstraction essentially means understanding what has to be abstracted rather than merely knowing how it has to take place. To understand the abstraction leading to an elegant solution of the problem means understanding how the solution can really be found.
11%
Abstraction always operates within given material constraints and through them: symbols, tools, techniques, and technologies are conceived and realised in relation to limited resources of matter, energy, space, time, and so on. The reality which abstraction is struggling with is not the idealised space of Platonic ideas but the actual living world, made of force fields and conflicts. In this sense, abstraction is also part of the larger social antagonism.
13%
Contrary to the common view that stresses only the separation of hardware and software, digital computing is actually the imbrication, in the same medium of information and instruction, of binary numerals and Boolean logic – one as a complementary form of the other.
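This imbrication can be made concrete with a minimal sketch, not from the book and written in Python purely for illustration: a one-bit half adder, in which binary addition is nothing but a pair of Boolean operations.

```python
# A one-bit half adder: binary arithmetic expressed entirely in Boolean logic.
# The sum bit is an XOR of the inputs, the carry bit an AND -- numeral and
# logical operator imbricated in the same medium.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits and return (sum, carry)."""
    s = a ^ b       # Boolean XOR gives the sum bit
    carry = a & b   # Boolean AND gives the carry bit
    return s, carry

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={carry}")
```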
14%
On the contrary, machine learning algorithms change their internal rules (called parameters) according to the input data. As such, data are no longer passive, so to speak, but become active information that influences the parameters of the step-by-step procedure which is, then, no longer strictly predetermined by the algorithm. The breakthrough of machine learning is exactly about this shift: algorithms for data analytics become dynamic and change their rigid inferential structure to adapt to further properties of data – usually logical and spatial relations.
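As a minimal sketch of this shift (a toy example of my own, not the author’s), the fixed step-by-step procedure below never changes, but its parameters w and b are reshaped by whatever data is fed in – a hypothetical one-variable linear model fitted by gradient descent.

```python
# The procedure (the loop) is rigid and predetermined; the parameters (w, b)
# are not. They are adjusted by the input data, which thereby becomes
# "active information" rather than passive material.

import random

def fit_line(points, lr=0.01, epochs=200):
    w, b = 0.0, 0.0                     # parameters start empty of any rule
    for _ in range(epochs):
        for x, y in points:
            error = (w * x + b) - y     # compare prediction with the data point
            w -= lr * error * x         # the data, not the programmer, moves w
            b -= lr * error             # ... and b
    return w, b

# Toy dataset drawn from y = 2x + 1 with a little noise (hypothetical numbers).
data = [(x, 2 * x + 1 + random.uniform(-0.1, 0.1)) for x in range(10)]
print(fit_line(data))                   # roughly (2.0, 1.0)
```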
15%
The idea of the automatic computer, in the contemporary sense, emerged out of the project to mechanise the mental labour of clerks rather than the old alchemic dream of building thinking automata – although the latter narrative would often be used, in the nineteenth century as much as in the century of corporate AI, to mask the former business.
17%
In more analytical terms, the Babbage principle posits that the abstract diagram of the division of labour helps to organise production while at the same time offering an instrument for measuring the value of labour. In this respect, the division of labour provides the design not only of the machinery but also of the business plan.
17%
The effect of the division of labour, both in mechanical and in mental processes, is, that it enables us to purchase and apply to each process precisely that quantity of skill and knowledge which is required for it: we avoid employing any part of the time of a man who can get eight or ten shillings a day by his skill in tempering needles, in turning a wheel, which can be done for sixpence a day; and we equally avoid the loss arising from the employment of an accomplished mathematician in performing the lowest processes of arithmetic.
20%
That the Analytical Engine could follow analysis means that it could represent and embody the analytical construction of a problem as an algebraist could do. Moreover, that the Analytical Engine had ‘no power of anticipating any analytical relations or truths’ means that it could not exceed or break the chain of reasoning that it was representing and materially embodying – just as today’s algorithms for data analytics that are rebranded as ‘machine learning’ and ‘artificial intelligence’ cannot creatively break the rules on which they are based and, more importantly, cannot consistently invent …
22%
In this debate, Marx was probably one of the most original and acute voices. He came to question the technological determinism according to which the machine would be the prime mover of industrial capitalism. Reversing the common perception of the relation between technology and economy, he argued that technological development (the means of production) is triggered by the division of labour (the relations of production) and not the other way around.
24%
Following Babbage, Marx adopted the idea that the extended division of labour, rather than science, was the inventor of the machine. In this way, Marx reversed Thompson and Hodgskin’s knowledge theory of labour into the more materialistic labour theory of knowledge, in which forms of labour that are spontaneous, unconscious, tacit, and collective are also eventually recognised as producing knowledge.
25%
Tool-makers and machine operators knew that they were contributing to the invention of new technologies. What they were rarely aware of was that they were also contributing to new scientific discoveries. New machines prompt scientific notions and paradigm shifts more often than science happens to invent new technologies from above. As in an example mentioned earlier, it was the steam engine which gave birth to thermodynamics, rather than the other way around. The science of heat and energy transformation developed to improve the steam engine: it was a projection of the lucrative ambitions of …
25%
Tools and machines, however, are never fully transparent in their implications. Machines are born as experiments, and they are often operated without full knowledge of their workings. Science is developed to cover these blind spots in our knowledge of machines, not just of the universe. On the other hand, the perception of nature is often machine based, not simply because of the mediation of instruments on perception but because machines have influenced, indirectly, the ontology of entire scientific paradigms. For example, into the twenty-first century, the standard theory of time remains …
25%
Going back to the industrial age, the historical epistemology of science and technology suggests that we reconsider the project of machine intelligence as a prism reflecting multiple forms of knowledge. Stretching its definition across a larger time scale, the expression ‘machine intelligence’ ultimately acquires at least four meanings: (1) the human knowledge of the machine; (2) the knowledge embodied by the machine’s design; (3) the human tasks automated by the machine; and (4) the new knowledge of the universe made possible by its use.
29%
The introduction of machinery marks a dramatic dialectical turn in the history of labour, whereby the worker ceases to be the subject of the machine and becomes the object of capital: ‘The hand tool makes the worker independent – posits him as proprietor. Machinery – as fixed capital – posits him as dependent, posits him as appropriated.’44 This shift in power between human and machine in the Victorian age is also the inception of a new imagery, in which machines acquire features of the living and the workers those of automata:
35%
Cybernetics unveiled the machinic nature of bureaucracy and, conversely, the bureaucratic role of machines – that is, how they both work as feedback apparatuses to control and capture workers’ know-how. The findings of Alquati’s research can be summarised as follows: (1) labour is the source of information of the industrial cybernetic apparatus, indeed the most valuable part of labour is information; (2) information operates the cybernetic apparatus, gradually improves its design and adds value to the final products; (3) the numeric dimension of cybernetics allows us to translate labour into …
36%
While machine learning textbooks reiterate that McCulloch and Pitts’s idea of artificial neurons was inspired by the structures and behaviour of neurons in the brain, in fact the opposite is true: they saw, in the first instance, biological neurons as technological artefacts. McCulloch and Pitts implicitly envisioned brain physiology as homologous with the communication technology of the age, comprised of electromechanical relays, feedback mechanisms, television scanners, and, notably, telegraph networks. At the 1948 Hixon symposium on cerebral mechanisms, discussed in more detail in the next …
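The claim that McCulloch and Pitts conceived the neuron as a relay can be illustrated with a short hypothetical sketch of their 1943 unit: a threshold gate over binary inputs, which behaves as AND or OR depending only on where the threshold is set. The function name and numbers below are my own, chosen for illustration.

```python
# A McCulloch-Pitts unit: it fires (outputs 1) when the weighted sum of its
# binary inputs reaches a threshold -- the neuron modelled as a switching relay.

def mcp_neuron(inputs, weights, threshold):
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b,
              "AND:", mcp_neuron((a, b), (1, 1), threshold=2),
              "OR:",  mcp_neuron((a, b), (1, 1), threshold=1))
```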
38%
From an epistemological point of view, we here encounter once again a mechanical paradigm that openly aspires to become a paradigm of nature, not of biological laws in this case but of physical ones. More precisely, a finite-state machine, such as the digital computer, is elevated to become the ontological model for the structure of the universe itself.
41%
This book tries to clarify that, rather than designing machines like organisms (biomorphism) as they professed, cyberneticians ultimately envisioned organisms like machines (technomorphism), mirroring their own surrounding social order (sociomorphism). Like the philosophies of nature from earlier centuries (the canonical example being La Mettrie’s L’homme Machine of 1747), cyberneticians projected onto the ontology of nature and the brain the technical composition of their time, made up of telegraph networks, electromechanical relays, feedback systems, and television scanners. …
41%
The analogy between organisms and machines appears, at first glance, to be an issue of epistemic translation between the disciplines of engineering and biology, but, in fact, it points to a more profound attitude of cybernetic engineering: What are the ethical implications of seeing an industrial machine as an organism, a living being? As much as ‘computer science’, cybernetics was not a science but an artificial language, a manual of instructions for machine components – a ‘machine semiotics’ which happened to be forcibly translated into an ontology of nature.
41%
The paradigm of connectionist AI did not win out over symbolic AI because the former is ‘smarter’ or better able to mimic brain structures, but rather because inductive and statistical algorithms are more efficient at capturing the logic of social cooperation than deductive ones. By tracing the evolution from linear to self-organising information, the history of data analytics, machine learning, and AI can begin to be seen in perspective as a grand process of self-organisation within the technosphere, following the transformation of the social order.
47%
The ability of a natural organism to survive in spite of a high incidence of error (which our artificial automata are incapable of) probably requires a very high flexibility and ability of the automaton to watch itself and reorganize itself. And this probably requires a very considerable autonomy of parts. There is a high autonomy of parts in the human nervous system. This autonomy of parts of a system has an effect which is observable in the human nervous system but not in artificial automata.
48%
Seen in perspective, von Neumann pursued a different method of inquiry compared to the other cyberneticians. Against the Platonism and intuitionism then popular in engineering as well, von Neumann maintained a constructivist perspective on language, logic, and mathematics. He believed that these concepts were not inherent or innate, but rather products of historical development: ‘It is only proper to realize that language is largely a historical accident. The basic human languages are traditionally transmitted to us in various forms, but their very multiplicity proves that there is nothing …’
48%
But, rather than reiterating the computationalism of McCulloch, Pitts, and Wiener, von Neumann – himself no romantic – in the end made a remarkable intervention: he reversed the relation between logic and nature, computer and brain, to the point of suggesting that the study of neurophysiology could one day reshape logic altogether.
49%
The peculiar character of the problem of a rational economic order is determined precisely by the fact that the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess. The economic problem of society is thus not merely a problem of how to allocate ‘given’ resources – if ‘given’ is taken to mean given to a single mind which deliberately solves the problem set by these ‘data’. It is rather a problem of how to secure the …
50%
The ‘know how’ consists in the capacity to act according to rules which we may be able to discover but which we need not be able to state in order to obey them … Rules which we cannot state thus do not govern only our actions. They also govern our perceptions, and particularly our perceptions of other people’s actions. The child who speaks grammatically without knowing the rules of grammar not only understands all the shades of meaning expressed by others through following the rules of grammar, but may also be able to correct a grammatical mistake in the speech of others.
50%
While we are clearly often not aware of mental processes because they have not yet risen to the level of consciousness but proceed on what are (both physiologically and psychologically) lower levels, there is no reason why the conscious level should be the highest level, and many grounds which make it probable that, to be conscious, processes must be guided by a supra-conscious order which cannot be the object of its own representations. Mental events may thus be unconscious and uncommunicable because they proceed on too high a level as well as because they proceed on too low a level.
52%
People do behave in the same manner towards things, not because these things are identical in a physical sense, but because they have learnt to classify them as belonging to the same group, because they can put them to the same use or expect from them what to the people concerned is an equivalent effect.
52%
The idea that science breaks up and replaces the system of classification which our sense qualities represent is less familiar, yet this is precisely what Science does … This process of re-classifying ‘objects’ which our senses have already classified in one way, of substituting for the ‘secondary’ qualities in which our senses arrange external stimuli a new classification based on consciously established relations between classes of events is, perhaps, the most characteristic aspect of the procedure of the natural sciences. The whole history of modern Science proves to be a process of …
52%
We have seen that the classification of the stimuli performed by our senses will be based on a system of acquired connexions which reproduce, in a partial and imperfect manner, relations existing between the corresponding physical stimuli. The ‘model’ of the physical world which is thus formed will give only a very distorted reproduction of the relationships existing in that world; and the classification of these events by our senses will often prove to be false, that is, give rise to expectations which will not be borne out by events.
53%
The market may be considered as one of the oldest historical devices for solving simultaneous equations. The interesting thing is that the solving mechanism operates not via a physical but via a social process. It turns out that the social processes as well may serve as a basis for the operation of feedback devices leading to the solution of equations by iteration.
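A minimal sketch of this idea, with hypothetical curves and numbers that are not from the text: a feedback loop that ‘solves’ a pair of supply and demand equations by iteration, nudging the price in proportion to excess demand until the two sides balance.

```python
# Price adjustment as iterative equation solving: the feedback signal is the
# gap between demand and supply, and the iteration converges on the price at
# which the two toy linear curves intersect (p = 15 here).

def demand(p):
    return 100 - 2 * p   # hypothetical demand curve

def supply(p):
    return 10 + 4 * p    # hypothetical supply curve

p = 1.0
for _ in range(50):
    excess = demand(p) - supply(p)   # feedback: positive -> shortage, negative -> glut
    p += 0.1 * excess                # adjust the price in proportion to the gap

print(round(p, 2), round(demand(p), 2), round(supply(p), 2))   # ~15.0 70.0 70.0
```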