Kindle Notes & Highlights
The Mechanical Turk offered the impressive illusion of a machine that could make decisions for itself, that seemed to be smarter than a human, though a human ultimately controlled it.
Over the two centuries since its invention, the device has become a prevalent metaphor for technological manipulation. It represents the human lurking behind the facade of seemingly advanced technology as well as the ability of such devices to deceive us about the way they work.
Algorithm is usually shorthand for “algorithmic recommendations,” the digital mechanisms that absorb piles of user data, push it through a set of equations, and spit out a result deemed most relevant to preset goals.
Algorithmic recommendations shape the vast majority of our experiences in digital spaces by considering our previous actions and selecting the pieces of content that will most suit our patterns of behavior. They are supposed to interpret and then show us what we want to see.
Designed and maintained by the engineers of monopolistic tech companies, and running on data that we users continuously provide by logging in each day, the technology is both constructed by us and dominates us, manipulating our perceptions and attention. The algorithm always wins.
Algorithmic recommendations dictate genres of culture by rewarding particular tropes with promotion in feeds, based on what immediately attracts the most attention.
Each platform develops its own stylistic archetype, which is informed not just by aesthetic preferences but by biases of race, gender, and politics as well as by the fundamental business model of the corporation that owns it.
The culture that thrives in Filterworld tends to be accessible, replicable, participatory, and ambient. It can be shared across wide audiences and retain its meaning across different groups, who tweak it slightly to their own ends.
It is also pleasant or average enough that it can be ignored and unobtrusively fade into the background, oftentimes going unnoticed until you look for it.
Filterworld culture is ultimately homogenous, marked by a pervasive sense of sameness even when its artifacts aren’t literally the same. It perpetuates itself to the point of boredom.
Algorithms are manipulative; the apps guide us through physical space to places that have adopted digitally popular aesthetics, winning attention and ratings from other users. Higher ratings bring yet more algorithmic promotion and thus more visitors.
“Globalization takes place only in capital and data. Everything else is damage control.”
And yet for all that data, algorithmic feeds oftentimes misunderstand us, connecting us to the wrong people or recommending the wrong kinds of content, encouraging habits that we don’t want.
This imbalance induces a state of passivity: We consume what the feeds recommend to us without engaging too deeply with the material.
We write tweets, post on Facebook, and take Instagram photos in forms we know will grab attention and attract likes or clicks, which drive revenue for the tech companies.
Our natural reaction is to seek out culture that embraces nothingness, that blankets and soothes rather than challenges or surprises, as powerful artwork is meant to do. Our capacity to be moved, or even to be interested and curious, is depleted.
The dominance of algorithmic feeds is a relatively recent phenomenon. In the early days of social networks like Twitter, Facebook, Instagram, and Tumblr, the sites’ content feeds were more or less chronological.
As the platforms grew to millions and billions of users over the 2010s, and users connected with more people at once, fully chronological feeds became cumbersome and weren’t always interesting. You might miss a popular or compelling post just because you weren’t scrolling at the right time.
The motivation for that switch was less usability than profit. The more time users spend on an app, the more data they produce, the more easily they can be tracked, and the more efficiently their attention can be sold to advertisers.
TikTok, which launched in the United States in 2018, achieved its major innovation by making its main “For You” feed almost entirely algorithmic.
TikTok quickly became the fastest-growing social network ever, reaching more than 1.5 billion users in less than five years, and its competitors, struggling to catch up, have followed suit into algorithmification.
In place of the human gatekeepers and curators of culture, the editors and DJs, we now have a set of algorithmic gatekeepers.
The outcome of such algorithmic gatekeeping is the pervasive flattening that has been happening across culture. By flatness I mean homogenization but also a reduction into simplicity: the least ambiguous, least disruptive, and perhaps least meaningful pieces of culture are promoted the most.
The author observes the difference between hitting a button to instantly tune into the station and wiggling a knob back and forth, navigating through static, and eventually finding the perfect analog position. The latter might be less precise and less convenient, but it’s slightly more magical and humane.
The culture of Filterworld is the culture of presets, established patterns that get repeated again and again. The technology limits us to certain modes of consumption; you can’t stray outside of the lines.
Lovelace also realized that the repeating mechanical processes that the machine enabled could be applied to fields beyond mathematics.
In the 1990s and 2000s, computer programming began to take its place alongside basic math and science as a skill that was necessary for a child’s complete education.
According to Turing, the device would be able to perform any kind of calculation and at any scale without needing to be reconfigured. It had its own internal logical language that could be adapted to different ends, to solve any type of problem.
It would execute algorithms. He hinted at the way that machine learning algorithms today evolve over time, incorporating adjustments without human decision-making.
For Turing, however, the word computer referred not to the machine but to the person doing the computing, once again emphasizing that organic element.
Any algorithm, in the historical sense of a mathematical process, can be calculated by such a Turing Machine. And any computational system that can compute anything that a Turing Machine can is said to be “Turing-complete.”
What Turing correctly concluded was that any computing machine would be able to do the work of any other—even Charles Babbage’s nineteenth-century Analytical Engine could theoretically perform the complex tasks that our laptops do now, if given infinite scale and time.
All algorithms are engines of automation, and, as Ada Lovelace predicted, automation has now moved into many facets of our lives beyond pure mathematics.
“The ‘message’ of any medium or technology is the change of scale or pace or pattern that it introduces into human affairs,” McLuhan wrote. In our case, the medium is the algorithmic feed; it has scaled and sped up humanity’s interconnection across the world to an unimaginable degree. Its message is that on some level, our collective consumption habits, translated into data, run together into sameness.
All recommendation algorithms work by gathering a set of raw data. The overall term for that dataset is signal, the collected inputs that are fed into the machine. The signal data might include a user’s past purchases on Amazon or how many other users favorited a particular song on Spotify.
The primary signal fed into many social media recommendations is engagement, which describes how users are interacting with a piece of content. That might come in the form of likes, retweets, or plays—any kind of button found next to a post.
The signal is fed through a data transformer that puts it into usable packages, set to be processed by different kinds of algorithms.
A social calculator might be used to add information about how users relate to one another within a single platform.
Then comes the specific equation of an individual algorithm. In today’s platforms, there is very rarely only one set algorithm—there are many.
Those algorithms are also weighed against each other; hybrid filtering combines multiple such techniques. Finally, the output is the recommendation itself, the next song in the automated playlist or the ordered list of posts.
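As a rough illustration of the pipeline described above, here is a minimal sketch in Python of weighing a content-based score against a collaborative one. The function names, the tag-overlap scoring, and the weights are all hypothetical, chosen only to make the idea concrete; no real platform's system is this simple.

```python
# A toy hybrid-filtering sketch: two signals, weighed against each other.
# All names and weights here are illustrative assumptions.

def content_score(user_tags, item_tags):
    """Overlap between a user's interest tags and an item's tags."""
    shared = set(user_tags) & set(item_tags)
    return len(shared) / max(len(item_tags), 1)

def collaborative_score(item_id, neighbor_likes):
    """Fraction of similar users ('neighbors') who liked this item."""
    likes = sum(1 for liked in neighbor_likes.values() if item_id in liked)
    return likes / max(len(neighbor_likes), 1)

def hybrid_rank(user_tags, items, neighbor_likes, w_content=0.4, w_collab=0.6):
    """Weigh the two signals and return item ids, best first."""
    scored = []
    for item_id, tags in items.items():
        score = (w_content * content_score(user_tags, tags)
                 + w_collab * collaborative_score(item_id, neighbor_likes))
        scored.append((score, item_id))
    return [item for _, item in sorted(scored, reverse=True)]
```

In practice the weights themselves are tuned, and often learned, against whatever metric the platform optimizes for, which is exactly where the commercial decision enters.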
It’s important to remember that how the Facebook feed works is a commercial decision, the same as a food manufacturer deciding which ingredients to use. Algorithms also change over time, refining themselves using machine learning.
We users fundamentally do not understand how algorithmic recommendations work on a day-to-day basis. Their equations, variables, and weights are not public because technology companies have little incentive to publicize them.
Content-based filtering has significant drawbacks. It requires the material to be translated into data that the machine can understand, such as text; it lacks serendipity because it can filter only by the terms that the user inputs; and it does not measure inherent quality.
Social information filtering bypasses those problems because it is instead driven by the actions of human users, who evaluate content on their own—using judgments both quantitative and qualitative.
The similarity of one user’s taste to another was calculated using statistical correlation. The researchers designed a system called Ringo to make music recommendations using an email list.
The fourth algorithm, and the most effective according to the researchers, matched users based on whether they rated the same things either positively or negatively. In other words, their taste matched. Similarity was the best variable.
Ringo’s innovation was how it acknowledged that the best recommendations, or the best indications of relevance, were likely to come from other humans rather than analysis of the content itself. It represented a scaling up of human taste.
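The Ringo-style approach can be sketched as a Pearson correlation computed over the items two users have both rated. This is only a minimal illustration of "statistical correlation" as a taste-matching measure; the data structures (dicts mapping item to rating) are assumptions for the sketch.

```python
import math

def pearson_similarity(ratings_a, ratings_b):
    """Pearson correlation over items both users have rated.
    Ratings are dicts mapping item -> numeric score.
    Returns a value in [-1, 1]; positive means similar taste."""
    common = set(ratings_a) & set(ratings_b)
    if len(common) < 2:
        return 0.0  # not enough overlap to compare taste
    xs = [ratings_a[i] for i in common]
    ys = [ratings_b[i] for i in common]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs)
                    * sum((y - my) ** 2 for y in ys))
    return num / den if den else 0.0
```

A recommender built on this would find a user's highest-correlation neighbors and suggest items those neighbors rated highly, which is the "scaling up of human taste" the passage describes.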
PageRank worked by measuring how many times a website was linked to by other sites, similar to the way academic papers cite key pieces of past research. The more links, the more important a page was likely to be.
PageRank mingled a form of collaborative filtering with content filtering. By linking various pages, human users had already formed a subjective map of recommendations that the algorithm could incorporate.
Decades later, PageRank has become almost tyrannical, a system that dominates how and when websites are seen. It’s vital for a business or resource to make it to that first page of Google Search results by adapting to the PageRank algorithm.
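The link-counting idea behind PageRank can be sketched as a simple iterative calculation: each page repeatedly passes its score along its outbound links. This toy version assumes a small link graph and omits refinements (such as handling of pages with no outbound links) that production search engines include.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank. A page's score is fed by the scores of
    pages linking to it, each linker's score split among its
    outbound links. `links` maps page -> list of linked pages."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base score (the damping term)...
        new = {p: (1 - damping) / len(pages) for p in pages}
        # ...plus shares of the scores of the pages that link to it.
        for page, targets in links.items():
            if not targets:
                continue  # dangling pages ignored in this sketch
            share = damping * rank[page] / len(targets)
            for t in targets:
                new[t] += share
        rank = new
    return rank
```

Run on a tiny graph, the page with the most inbound links from well-ranked pages comes out on top, which is the dynamic that search-engine optimization tries to game.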