Kindle Notes & Highlights
by Ed Finn
Read between March 12 - March 17, 2018
we are no longer identified according to metrics we might choose ourselves (e.g., what we elect to share on a consumer survey) but according to a set of behavioral choices whose consequences are largely unknown.
Martin Heidegger’s notion of enframing. In very simple terms, Heidegger argued that technologies (and our social world in general) tend to nudge us into certain modes of thinking about what is possible and what can be revealed about the universe. We see a hammer and we think about what we can hammer with it; but a hammer could also be used to open a bottle, to prop open a door, to hold down papers on a windy day.
cybernetics: over the course of his career, Wiener grew increasingly concerned with the consequences of cybernetics in implementation, particularly around automation and labor.30 The cybernetic ideal of the feedback loop and the organism as an informational entity could also be applied to Manichean systems that manipulate human participants for unsavory or merely dishonest ends.
Viewed more broadly, the interface layer is a colonization of the quiet backwaters of contemporary capitalism—the remobilization of goods and spaces after they have already been consumed or deployed.
Behind the facade of the facile, friendly computational interface, there is a world of labor translation, remediation, and exploitation at work.
Christian Sandvig has pointed out the fascinating history of the cloud metaphor, which began life as an icon computer engineers in the 1970s would use in system flowcharts: “The term comes from the symbology of the network diagram, where a cloud symbol indicates a part of the diagram whose internal details are irrelevant.”40 By definition, then, the cloud was an abstraction, a way to bracket off less interesting aspects of a system.
(during a summer heat wave, it was deemed better, or more efficient, to line up ambulances at one Amazon warehouse than install air-conditioning).
The final telos of algorithmic labor is the work that abstracts physical and cultural infrastructure away altogether.
Dedicated “turkers” earn about $5 an hour, and many of them reside in the United States (with India making up the second largest contingent).57 But it’s disingenuous even to speak of workers in the typical meaning of that term, or an hourly wage, since the system’s atomization offers no stability, regularity, or persistence of particular forms of labor, leading to huge variance in the amount of time it takes to complete a particular task. Just as many computer servers in the cloud sit idle (in 2012, McKinsey & Company estimated the percentage as 90 percent when measured by power consumption),
…
the Mechanical Turk marketplace is designed to eliminate almost any kind of expertise or specialization among workers, and thereby any real bargaining power.
Debates about the ethical impact of mechanization are anything but novel. The original chess-playing Turk, for example, inspired florid prose and rapt audiences during its long run of public appearances in the United States in the heyday of industrialization.
many viewed the rise of automation as a force for ethical good, inspiring (or forcing) workers to conduct themselves with the same dedication as the machines they attended. This is what one British economist called “moral machinery” in 1835: a system of managerial interventions to enhance the industrial system’s natural tendencies toward order and productivity among human workers.
Smith argues in Moral Sentiments that social cohesion and the effective functioning of a marketplace depend on a logic of virtuous action that has imagination at its root.
The foundation of a moral society, he argues, is the constant imaginative work we all do to map out the territory of empathy, creating a pragmatic sense of justice and shared experience that guides economic and social behavior.
Quantifying empathy, often into a completely abstracted five-star scale, encourages us to leave the heavy lifting to the algorithms.
That hunger for emotional contact, for a space where we can imagine directly, marks another disparity between abstraction and implementation. The gulf between the imaginative empathy of human and machine actors in culture comes down to the construction of value. Just as Smith sought to put economic practice on a foundation of intersubjective, empathetic understanding, we are now struggling to define the fundamental structure of value in an algorithmic world.
HFT offers one of the purest examples of algorithms that are fundamentally altering an existing culture machine, that venerable assemblage of math, social practices, faith-based communities, and arbitrage that we call “the market.”
Much more than securities, these systems trade information, which has always been an essential role of the markets. But they atomize and encrypt that information in a new series of black boxes—the switching systems, trade-matching engines, and algorithms themselves—that allow HFT firms and the major Wall Street firms that deal with them to create a kind of subaltern skim economy.
You might argue that consciousness is the act of constructing a narrative from the inaccessible gutter of human cognition—the ways that we stitch together a personal history from fragments of sensory input and inference.
trying to understand computational space is a different challenge entirely, requiring us to voyage not forward but deeper into time.
Both the encyclopédistes and Google would argue that their projects do not create hierarchy but model it—that their knowledge ontologies are simply more effective maps for structures that already existed in culture. And yet, as Eco suggests, in both instances the structure they created quickly became an ordering mechanism of its own, shaping the cultural space it was designed to observe.
For Diderot and d’Alembert, it was the book that politically, philosophically, and perhaps epistemologically paved the way for the French Revolution. For Google, it is a future of machines that understand, converse, and anticipate—a future of a complete computational ontology of knowledge.
We are haunted by the shoes, the cars, the vacations that we have not yet purchased much more directly than we are by the hidden shadows of our digital selves that marketing companies carefully maintain.
the web has moved from a romantic to a rational phase, leaving behind the homebrew sublime of early hand-coded sites for the elaborate corporate order of branded, automated page serving that dominates today.20
Bitcoin’s notionally trust-free system ends up demanding two different kinds of trust: first, faith in the algorithm itself, especially in its transparent underpinnings in the blockchain. And second, Bitcoin encourages participants to band together into computing collectives, creating a shared community around the arbitrary calculation of proof-of-work solutions.
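The "arbitrary calculation of proof-of-work solutions" mentioned above can be sketched in a few lines. This is a deliberately simplified toy (real Bitcoin applies SHA-256 twice to a binary block header and compares against a 256-bit target, not a string prefix), but it shows the basic shape of the work mining collectives perform: brute-force search for a nonce that makes a hash fall below a difficulty threshold.

```python
import hashlib

def proof_of_work(block_data: str, difficulty: int) -> int:
    """Search for a nonce such that SHA-256(block_data + nonce)
    begins with `difficulty` leading zero hex digits.

    Illustrative sketch only: real Bitcoin uses double SHA-256
    over a binary header and a numeric target, but the search
    loop has the same structure."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Finding a solution is expensive; verifying it is one hash.
nonce = proof_of_work("example block", 3)
print(nonce)
```

The asymmetry in the last comment is the point: anyone can check a claimed solution with a single hash, while producing one requires thousands of attempts on average, which is what makes the "work" a credible (if energy-hungry) basis for consensus.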
Bitcoin, despite its pretensions to democracy, is fundamentally a technologically elite system that inserts computational value into the capitalistic foundations of currency.
As Edward Castronova argues, if significant numbers of people started using virtual currencies without the backing of traditional sovereign states, it could create “a general dynamic state of decline” where those still using dollars would have to take on ever-increasing tax obligations to support the infrastructure of the state.
The central tenets of Bitcoin’s system for creating value epitomize similar shifts in all corners of cultural production, where daily business seems the same but the majority votes, central authorities, and validation structures are being subsumed by algorithmic processes.
In parallel, the quest for knowledge is subservient to the quest for methods, for better technical systems.
creating the belletristic equivalent of monoculture crops that have limited resilience to major changes in the ecosystem.
Computationalism makes cultural practices not just computational but programmable—susceptible to centralized editing and revision.
Programmable culture turns the public sphere inside out: the cultural data that used to make up the arena of “common concern” is increasingly privatized.
The algorithmic process of crowdfunding rewards those who master the methods of privatized publicity: a strong introductory video, frequent updates, tiered reward structures, and effective use of social media to raise awareness.
Habermas was quick to point out that the bourgeois public sphere faded away with the rise of mass media and modern advertising, but the ideal vision of disinterested public discourse persists in the aspirations of journalists, social entrepreneurs, Wikipedians, and many others who see the Internet as a new platform for collective engagement around “common concerns.”
As we invest more of our lives, our public and private conversations, our faith and beliefs, in algorithmic culture machines, we invest in the idea that truth resides in analytics, abstraction, and data-mining.
Comforted in the knowledge that we can always Google it later, we have gradually accepted that the arbitrage of information is more significant than the information itself.
It’s never been so easy to pretend to know so much without actually knowing anything. We pick topical, relevant bits from Facebook, Twitter or emailed news alerts, and then regurgitate them. … What matters to us, awash in petabytes of data, is not necessarily having actually consumed this content firsthand but simply knowing that it exists—and having a position on it, being able to engage in the chatter about it.
The seduction of this public processing, the blockchain-like generation of cultural discourse, lies precisely in its visibility, its thrilling transparency. Cultural processing (whether of Bitcoin transactions, Facebook likes, click-driven political activism, or Wikipedia updates) becomes its own spectacle, distracting us from the invisible sides of the system, like a magic trick performed in a glass box.
Our participation wears grooves deeper into the system and continually trains us in the fine arts of process-driven valuation.
Replacing the public sphere with the programmable sphere is ultimately a substitution of one form of reading for another: a new idiomatic frame to put on the world. Leaving behind the readers of the Spectator in an eighteenth-century London coffeehouse, we find the readers of algorithmic process, interpreting Twitter feeds and web traffic counts over Wi-Fi at Starbucks. The grammar of the new algorithmic sphere obscures certain operations while making others more visible, and the spectacle of the blockchain is merely one of the newer ways that we have invested these cultural systems with
…
The Turing test was in many ways a demonstration of the absurdity of establishing a metric for intelligence; the best we can do is have a conversation and see how effective a machine is at emulating a human.
Lem seems to hint that the ultimate drive of instrumental reason, of continually interrogating our world in the quest for answers, may only be madness: fantasies that we project onto the world in order to construct a story about truth, just as the characters in the novel must grapple with figures projected from their pasts.
This book has traced a series of encounters with (possible) imaginative algorithms, noting the growing cognitive traffic between biological, cultural, and computational structures of thinking. Google Now and the design goal of “anticipation” are forms of imaginative thinking—a process for envisioning possible futures and bringing them into greater possibility through implementation.
Aside from the most simplistic cases, we will never know how algorithms know what they know.
augmented imagination, the transformative work that humans and machines can only do together.
Whatever imagination is, we know that the focal lenses of our tools inflect and change
Figure 6.1 Vannevar Bush’s Memex.
As complexity scientist Sam Arbesman argues in The Half-Life of Facts, the vast majority of human knowledge is contingent in this way, and the pace of epistemological change is accelerating.