New Dark Age Quotes

New Dark Age: Technology and the End of the Future by James Bridle
2,760 ratings, 4.03 average rating, 366 reviews
Showing 1-30 of 38
“Thus paranoia in an age of network excess produces a feedback loop: the failure to comprehend a complex world leads to the demand for more and more information, which only further clouds our understanding – revealing more and more complexity that must be accounted for by ever more byzantine theories of the world. More information produces not more clarity, but more confusion.”
James Bridle, New Dark Age: Technology and the End of the Future
“The crisis of global warming is a crisis of the mind, a crisis of thought, a crisis in our ability to think another way to be. Soon, we shall not be able to think at all.”
James Bridle, New Dark Age: Technology and the End of the Future
“we find ourselves today connected to vast repositories of knowledge, and yet we have not learned to think. In fact, the opposite is true: that which was intended to enlighten the world in practice darkens it. The abundance of information and the plurality of worldviews now accessible to us through the internet are not producing a coherent consensus reality, but one riven by fundamentalist insistence on simplistic narratives, conspiracy theories, and post-factual politics. It is on this contradiction that the idea of a new dark age turns: an age in which the value we have placed upon knowledge is destroyed by the abundance of that profitable commodity, and in which we look about ourselves in search of new ways to understand the world.”
James Bridle, New Dark Age: Technology and the End of the Future
“Uncertainty, mathematically and scientifically understood, is not the same as unknowing. Uncertainty, in scientific, climatological terms, is a measure of precisely what we do know. And as our computational systems expand, they show us ever more clearly how much we do not know.”
James Bridle, New Dark Age: Technology and the End of the Future
“Technology is not mere tool making and tool use: it is the making of metaphors.”
James Bridle, New Dark Age: Technology and the End of the Future
“We connect to the cloud; we work in it; we store and retrieve stuff from it; we think through it. We pay for it and only notice it when it breaks.”
James Bridle, New Dark Age: Technology and the End of the Future
“All contemporary computation stems from this nexus: military attempts to predict and control the weather, and thus to control the future.”
James Bridle, New Dark Age: Technology and the End of the Future
“In what could be taken as the founding statement of computational thought, he wrote: ‘All stable processes we shall predict. All unstable processes we shall control.’”
James Bridle, New Dark Age: Technology and the End of the Future
“Like an air control system mistaking a flock of birds for a fleet of bombers, software is unable to distinguish between its model of the world and reality – and, once conditioned, neither are we.”
James Bridle, New Dark Age: Technology and the End of the Future
“the big data fallacy is the logical outcome of scientific reductionism: the belief that complex systems can be understood by dismantling them into their constituent pieces and studying each in isolation. And this reductionist approach would hold if it did in practice keep pace with our experiences; in reality, it is proving to be insufficient.”
James Bridle, New Dark Age: Technology and the End of the Future
“We pay for it and only notice it when it breaks.”
James Bridle, New Dark Age: Technology and the End of the Future
“What is needed is not new technology, but new metaphors: a metalanguage for describing the world that complex systems have wrought.”
James Bridle, New Dark Age: Technology and the End of the Future
“A simply functional understanding of systems is insufficient; one needs to be able to think about histories and consequences too. Where did these systems come from, who designed them and what for, and which of these intentions still lurk within them today?”
James Bridle, New Dark Age: Technology and the End of the Future
“Catch-22 exemplifies the dilemma of rational actors caught up within the machinations of vast, irrational systems. Within such systems, even rational responses lead to irrational outcomes. The individual is aware of the irrationality but loses all power to act in their own interest. Faced with the roiling tide of information, we attempt to gain some kind of control over the world by telling stories about it: we attempt to master it through narratives. These narratives are inherently simplifications, because no one story can account for everything that's happening; the world is too complex for simple stories. Instead of accepting this, the stories become ever more baroque and bifurcated, ever more convoluted and open-ended. Thus paranoia in an age of network excess produces a feedback loop: the failure to comprehend a complex world leads to the demand for more and more information, which only further clouds our understanding - revealing more and more complexity that must be accounted for by ever more byzantine theories of the world. More information produces not more clarity, but more confusion.”
James Bridle, New Dark Age: Technology and the End of the Future
“Today the cloud is the central metaphor of the internet: a global system of great power and energy that nevertheless retains the aura of something noumenal and numinous, something almost impossible to grasp. We connect to the cloud; we work in it; we store and retrieve stuff from it; we think through it. We pay for it and only notice it when it breaks. It is something we experience all the time without really understanding what it is or how it works. It is something we are training ourselves to rely upon with only the haziest of notions about what is being entrusted, and what it is being entrusted to.

Downtime aside, the first criticism of this cloud is that it is a very bad metaphor. The cloud is not weightless; it is not amorphous, or even invisible, if you know where to look for it. The cloud is not some magical faraway place, made of water vapor and radio waves, where everything just works. It is a physical infrastructure consisting of phone lines, fibre optics, satellites, cables on the ocean floor, and vast warehouses filled with computers, which consume huge amounts of water and energy and reside within national and legal jurisdictions. The cloud is a new kind of industry, and a hungry one. The cloud doesn't just have a shadow; it has a footprint. Absorbed into the cloud are many of the previously weighty edifices of the civic sphere: the places where we shop, bank, socialize, borrow books, and vote. Thus obscured, they are rendered less visible and less amenable to critique, investigation, preservation and regulation.

Another criticism is that this lack of understanding is deliberate. There are good reasons, from national security to corporate secrecy to many kinds of malfeasance, for obscuring what's inside the cloud. What evaporates is agency and ownership: most of your emails, photos, status updates, business documents, library and voting data, health records, credit ratings, likes, memories, experiences, personal preferences, and unspoken desires are in the cloud, on somebody else's infrastructure. There's a reason Google and Facebook like to build data centers in Ireland (low taxes) and Scandinavia (cheap energy and cooling). There's a reason global, supposedly post-colonial empires hold onto bits of disputed territory like Diego Garcia and Cyprus, and it's because the cloud touches down in these places, and their ambiguous status can be exploited. The cloud shapes itself to geographies of power and influence, and it serves to reinforce them. The cloud is a power relationship, and most people are not on top of it.

These are valid criticisms, and one way of interrogating the cloud is to look where its shadow falls: to investigate the sites of data centers and undersea cables and see what they tell us about the real disposition of power at work today. We can seed the cloud, condense it, and force it to give up some of its stories. As it fades away, certain secrets may be revealed. By understanding the way the figure of the cloud is used to obscure the real operation of technology, we can start to understand the many ways in which technology itself hides its own agency - through opaque machines and inscrutable code, as well as physical distance and legal constructs. And in turn, we may learn something about the operation of power itself, which was doing this sort of thing long before it had clouds and black boxes in which to hide itself.”
James Bridle, New Dark Age: Technology and the End of the Future
“This is the magic of big data. You don’t really need to know or understand anything about what you’re studying; you can simply place all of your faith in the emergent truth of digital information. In one sense, the big data fallacy is the logical outcome of scientific reductionism: the belief that complex systems can be understood by dismantling them into their constituent pieces and studying each in isolation. And this reductionist approach would hold if it did in practice keep pace with our experiences; in reality, it is proving to be insufficient.”
James Bridle, New Dark Age: Technology and the End of the Future
“In the case of the airport, code both facilitates and coproduces the environment. Prior to visiting an airport, passengers engage with an electronic booking system – such as SABRE – that registers their data, identifies them, and makes them visible to other systems, such as check-in desks and passport control. If, when they find themselves at the airport, the system becomes unavailable, it is not a mere inconvenience. Modern security procedures have removed the possibility of paper identification or processing: software is the only accepted arbiter of the process. Nothing can be done; nobody can move. As a result, a software crash revokes the building’s status as an airport, transforming it into a huge shed filled with angry people. This is how largely invisible computation coproduces our environment – its critical necessity revealed only in moments of failure, like a kind of brain injury.”
James Bridle, New Dark Age: Technology and the End of the Future
“But perhaps the most obvious is that despite the sheer volume of information that exists online – the plurality of moderating views and alternative explanations – conspiracy theories and fundamentalism don’t merely survive, they proliferate. As in the nuclear age, we learn the wrong lesson over and over again. We stare at the mushroom cloud, and see all of this power, and we enter into an arms race all over again.”
James Bridle, New Dark Age: Technology and the End of the Future
“The weakness of ‘learning to code’ alone might be argued in the opposite direction too: you should be able to understand technological systems without having to learn to code at all, just as one should not need to be a plumber to take a shit, nor to live without fear that your plumbing system might be trying to kill you. The possibility that your plumbing system is indeed trying to kill you should not be discounted either: complex computational systems provide much of the infrastructure of contemporary society, and if they are not safe for people to use, no amount of education in just how bad they are will save us in the long run.”
James Bridle, New Dark Age: Technology and the End of the Future
“Over the last century, technological acceleration has transformed our planet, our societies, and ourselves, but it has failed to transform our understanding of these things. The reasons for this are complex, and the answers are complex too, not least because we ourselves are utterly enmeshed in technological systems, which shape in turn how we act and how we think. We cannot stand outside them; we cannot think without them.”
James Bridle, New Dark Age: Technology and the End of the Future
“By inherent, I mean the notion that they emerged, ex nihilo, from the things we created rather than involving our own actions as part of that co-creation.”
James Bridle, New Dark Age: Technology and the End of the Future
“Just as the availability of vast computational power drives the implementation of global surveillance, so its logic has come to dictate how we respond to it, and to other existential threats to our cognitive and physical well-being. The demand for some piece of evidence that will allow us to assert some hypothesis with 100 per cent certainty overrides our ability to act in the present. Consensus - such as the broad scientific agreement around the urgency of the climate crisis - is disregarded in the face of the smallest quantum of uncertainty. We find ourselves locked in a kind of stasis, demanding that Zeno's arrow hit the target even as the atmosphere before it warms and thickens. The insistence upon some ever-insufficient confirmation creates the deep strangeness of the present moment: everybody knows what's going on, and nobody can do anything about it.

Reliance on the computational logics of surveillance to derive truth about the world leaves us in a fundamentally precarious and paradoxical position. Computational knowing requires surveillance, because it can only produce its truth from the data available to it directly. In turn, all knowing is reduced to that which is computationally knowable, so all knowing becomes a form of surveillance. Thus computational logic denies our ability to think the situation, and to act rationally in the absence of certainty. It is also purely reactive, permitting action only after sufficient evidence has been gathered and forbidding action in the present when it is most needed.

The operation of surveillance, and our complicity in it, is one of the most fundamental characteristics of the new dark age, because it insists on a kind of blind vision: everything is illuminated, but nothing is seen. We have become convinced that throwing light upon the subject is the same thing as thinking it, and thus having agency over it. But the light of computation just as easily renders us powerless - either through information overload, or a false sense of security. It is a lie we have been sold by the seductive power of computational thinking.

Our vision is increasingly universal, but our agency is ever more reduced. We know more and more about the world, while being less and less able to do anything about it. The resulting sense of helplessness, rather than giving us pause to reconsider our assumptions, seems to be driving us deeper into paranoia and social disintegration: more surveillance, more distrust, an ever-greater insistence on the power of images and computation to rectify a situation that is produced by our unquestioning belief in their authority.

Surveillance does not work, and neither does righteous exposure. There is no final argument to be made on either side, no clinching statement that will ease our conscience and change the minds of our opponents. There is no smoking gun, no total confirmation or clear denial. The Glomar response, rather than the dead words of a heedless bureaucracy, turns out to be the truest description of the world that we can articulate.”
James Bridle, New Dark Age: Technology and the End of the Future
“We don't and cannot understand everything, but we are capable of thinking it. The ability to think without claiming, or even seeking, to fully understand is key to survival in a new dark age because, as we shall see, it is often impossible to understand. Technology is and can be a guide and helpmate in this thinking, providing we do not privilege its output: computers are not here to give us answers, but are tools for asking questions. Understanding a technology deeply and systemically often allows us to remake its metaphors in the service of other ways of thinking.”
James Bridle, New Dark Age: Technology and the End of the Future
“We often struggle to conceive of and describe the scope and scale of new technologies, meaning that we have trouble even thinking them. What is needed is not new technology, but new metaphors: a metalanguage for describing the world that complex systems have wrought. A new shorthand is required, one that simultaneously acknowledges and addresses the reality of a world in which people, politics, culture and technology are utterly enmeshed. We have always been connected - unequally, illogically, and some more than others - but entirely and inevitably. What changes in the network is that this connection is visible and undeniable. We are confronted at all times by the radical interconnectedness of things and our selves, and we must reckon with this realization in new ways. It is insufficient to speak of the internet or amorphous technologies, alone and unaccountable, as causing or accelerating the chasm in our understanding and agency. For want of a better term, I use the word 'network' to include us and our technologies in one vast system - to include human and nonhuman agency and understanding, knowing and unknowing, within the same agential soup. The chasm is not between us and our technologies, but within the network itself, and it is through the network that we come to know it.

Finally, systemic literacy permits, performs, and responds to critique. The systems that we will be discussing are too critical to be thought, understood, designed and enacted by the few, especially when those few all too easily align themselves with, or are subsumed by, older elites and power structures. There is a concrete and causal relationship between the complexity of the systems we encounter every day; the opacity with which most of those systems are constructed or described; and fundamental, global issues of inequality, violence, populism and fundamentalism. All too often, new technologies are presented as inherently emancipatory. But this is itself an example of computational thinking, of which we are all guilty. Those of us who have been early adopters and cheerleaders of new technologies, who have experienced their manifold pleasures and benefitted from their opportunities, and who have consequently argued, often naively, for their wider implementation, are in no less danger from their uncritical deployment. But the argument for critique cannot be made from individual threats, nor from identification with the less fortunate or less knowledgeable. Individualism and empathy are both insufficient in the network. Survival and solidarity must be possible without understanding.”
James Bridle, New Dark Age: Technology and the End of the Future
“Computational thinking is an extension of what others have called solutionism: the belief that any given problem can be solved by the application of computation. Whatever the practical or social problem we face, there is an app for it. But solutionism is insufficient too; this is one of the things that our technology is trying to tell us. Beyond this error, computational thinking supposes - often at an unconscious level - that the world really is like the solutionists propose. It internalizes solutionism to the degree that it is impossible to think or articulate the world in terms that are not computable. Computational thinking is predominant in the world today, driving the worst trends in our society and interactions, and must be opposed by a real systemic literacy. If philosophy is that fraction of human thought dealing with that which cannot be explained by the sciences, then systemic literacy is the thinking that deals with a world that is not computable, while acknowledging that it is irrevocably shaped and informed by computation.

The weakness of 'learning to code' alone might be argued in the opposite direction too: you should be able to understand technological systems without having to learn to code at all, just as one should not need to be a plumber to take a shit, nor to live without fear that your plumbing might be trying to kill you. The possibility that your plumbing system is indeed trying to kill you should not be discounted either: complex computational systems provide much of the infrastructure of contemporary society, and if they are not safe for people to use, no amount of education in just how bad they are will save us in the long run.”
James Bridle, New Dark Age: Technology and the End of the Future
“One of the arguments often made in response to weak public understanding of technology is a call to increase technological education - in its simplest formulation, to learn to code. Such a call is made frequently by politicians, technologists, pundits and business leaders, and it is often advanced in nakedly functional and pro-market terms: the information economy needs more programmers, and young people need jobs in the future. This is a good start, but learning to code is not enough, just as learning to plumb a sink is not enough to understand the complex interactions between water tables, political geography, aging infrastructure, and social policy that define, shape and produce actual life support systems in society. A simply functional understanding of systems is insufficient; one needs to be able to think about histories and consequences too. Where did these systems come from, who designed them and what for, and which of these intentions still lurk within them today?”
James Bridle, New Dark Age: Technology and the End of the Future
“Our technologies are complicit in the greatest challenges we face today: an out-of-control economic system that immiserates many and continues to widen the gap between rich and poor; the collapse of political and societal consensus across the globe resulting in increasing nationalisms, social divisions, ethnic conflicts and shadow wars; and a warming climate which existentially threatens us all.

Across the sciences and society, in politics and education, in warfare and commerce, new technologies do not merely augment our abilities, but actively shape and direct them, for better and for worse. It is increasingly necessary to be able to think new technologies in different ways, and to be critical of them, in order to meaningfully participate in that shaping and directing. If we do not understand how complex technologies function, how systems of technologies interconnect, and how systems of systems interact, then we are powerless within them, and their potential is more easily captured by selfish elites and inhuman corporations. Precisely because these technologies interact with one another in unexpected and often-strange ways, and because we are completely entangled with them, this understanding cannot be limited to the practicalities of how things work: it must be extended to how things came to be, and how they continue to function in the world in ways that are often invisible and interwoven. What is required is not understanding, but literacy.

True literacy in systems consists of much more than simple understanding, and might be understood and practiced in multiple ways. It goes beyond a system's functional use to comprehend its context and consequences. It refuses to see the application of any one system as a cure-all, insisting upon the interrelationships of systems and the inherent limitations of any single solution. It is fluent not only in the language of a system, but in its metalanguage - the language it uses to talk about itself and to interact with other systems - and is sensitive to the limitations and the potential uses and abuses of that metalanguage. It is, crucially, capable of both performing and responding to critique.”
James Bridle, New Dark Age: Technology and the End of the Future
“a general study undertaken by Nature found that 70 per cent of scientists had failed to replicate the findings of other researchers. Across the board, from medicine to psychology, biology to environmental sciences, researchers are coming to the realisation that many of the foundations of their research may be flawed.”
James Bridle, New Dark Age: Technology and the End of the Future
“There’s an idea in the science-fiction community called steam-engine time, which is what people call it when suddenly twenty or thirty different writers produce stories about the same idea. It’s called steam-engine time because nobody knows why the steam engine happened when it did. Ptolemy demonstrated the mechanics of the steam engine, and there was nothing technically stopping the Romans from building big steam engines. They had little toy steam engines, and they had enough metalworking skill to build big steam tractors. It just never occurred to them to do it.”
James Bridle, New Dark Age: Technology and the End of the Future
“As we shall see, technology’s increasing inability to predict the future – whether that’s the fluctuating markets of digital stock exchanges, the outcomes and applications of scientific research, or the accelerating instability of the global climate – stems directly from these misapprehensions about the neutrality and comprehensibility of computation.”
James Bridle, New Dark Age: Technology and the End of the Future
