Filterworld: How Algorithms Flattened Culture
Kindle Notes & Highlights
Read between February 10 - February 11, 2024
1%
The message of many things in America is “Like this or die.” —George W. S. Trow
1%
“the Mechanical Turk.” It was a gift created to impress the Habsburg empress, Maria Theresa of Austria. Von Kempelen’s nigh magical machine could play and win a game of chess against a human opponent simply by means of internal clockwork gears and belts. As seen in historical etchings, the Mechanical Turk was a large wooden cabinet, about four feet wide, two and a half feet deep, and three feet tall, with doors exposing the elaborate machinery inside. On top sat a humanoid automaton the size of a child, dressed in a robe and turban and sporting a dramatic mustache, leaning over a chessboard.
1%
What the Mechanical Turk could not actually do, however, was play chess. There was no artificial intelligence driving the machine, no set of gears that mechanically determined its next move. Instead, a short-statured human pilot curled himself inside the cabinet. He was a chess expert who could observe the game by means of magnet-connected markers underneath the board that corresponded to the pieces on top—marking the locations of the pawns, the knights, the king as the game was played. The pilot maneuvered the automaton’s hand by means of levers and strings to grab the pieces and move them, ...more
2%
Over the two centuries since its invention, the device has become a prevalent metaphor for technological manipulation. It represents the human lurking behind the facade of seemingly advanced technology as well as the ability of such devices to deceive us about the way they work. (In 2005, Amazon named its service for accomplishing digital tasks, like tagging photos or cleaning data, using an invisible marketplace of outsourced human labor “Mechanical Turk.”)
2%
Algorithm is usually shorthand for “algorithmic recommendations,” the digital mechanisms that absorb piles of user data, push it through a set of equations, and spit out a result deemed most relevant to preset goals. Algorithms dictate the websites we find in Google Search results; the stories we see
2%
Algorithmic recommendations shape the vast majority of our experiences in digital spaces by considering our previous actions and selecting the pieces of content that will most suit our patterns of behavior. They are supposed to interpret and then show us what we want to see.
2%
Algorithmic recommendations are the latest iteration of the Mechanical Turk: a series of human decisions that have been dressed up and automated as technological ones, at an inhuman scale and speed. Designed and maintained by the engineers of monopolistic tech companies, and running on data that we users continuously provide by logging in each day, the technology is both constructed by us and dominates us, manipulating our perceptions and attention. The algorithm always wins.
2%
Filterworld, the title of this book, is my word for the vast, interlocking, and yet diffuse network of algorithms that influence our lives today, which has had a particularly dramatic impact on culture and the ways it is distributed and consumed.
2%
Algorithmic recommendations dictate genres of culture by rewarding particular tropes with promotion in feeds, based on what immediately attracts the most attention.
3%
Each platform develops its own stylistic archetype, which is informed not just by aesthetic preferences but by biases of race, gender, and politics as well as by the fundamental business model of the corporation that owns it.
3%
Filterworld culture is ultimately homogenous, marked by a pervasive sense of sameness even when its artifacts aren’t literally the same. It perpetuates itself to the point of boredom.
3%
“harmonization of tastes.”
3%
Algorithms are manipulative; the apps guide users through physical space to places that have adopted digitally popular aesthetics, winning attention and ratings from other users. With higher ratings come yet more algorithmic promotion and thus more visitors.
3%
Indian literary theorist Gayatri Spivak wrote in 2012, “Globalization takes place only in capital and data. Everything else is damage control.”
3%
The homogenous culture is the inevitable reaction to the damage of that spread, a way of coping with or adapting to it.
3%
Filterworld and its slick sameness can induce a breathtaking, near-debilitating sense of anxiety. The sameness feels inescapable, alienating even as it is marketed as desirable. “Surveillance capitalism,” as the scholar Shoshana Zuboff has labeled it, is how tech companies monetize the constant absorption of our personal data, an intensification of the attention economy.
3%
And yet for all that data, algorithmic feeds oftentimes misunderstand us, connecting us to the wrong people or recommending the wrong kinds of content, encouraging habits that we don’t want. The network of algorithms makes so many decisions for us, and yet we have little way of talking back to it or changing how it works. This imbalance induces a state of passivity: We consume what the feeds recommend to us without engaging too deeply with the material. We also adapt the way we present ourselves online to its incentives.
3%
On the other side of our algorithmic anxiety is a state of numbness. The dopamine rushes become inadequate, and the noise and speed of the feeds overwhelming. Our natural reaction is to seek out culture that embraces nothingness, that blankets and soothes rather than challenges or surprises, as powerful artwork is meant to do. Our capacity to be moved, or even to be interested and curious, is depleted.
4%
The motivation for that switch was less usability than profit. The more time users spend on an app, the more data they produce, the more easily they can be tracked, and the more efficiently their attention can be sold to advertisers. Feeds have become increasingly algorithmic over time, particularly in the watershed moment of the mid-2010s.
4%
At least for the major corporations that comprise most of the Internet, the algorithmic tide shows no sign of reversing. In place of the human gatekeepers and curators of culture, the editors and DJs, we now have a set of algorithmic gatekeepers.
4%
Attention becomes the only metric by which culture is judged, and what gets attention is dictated by equations developed by Silicon Valley engineers. The outcome of such algorithmic gatekeeping is the pervasive flattening that has been happening across culture. By flatness I mean homogenization but also a reduction into simplicity: the least ambiguous, least disruptive, and perhaps least meaningful pieces of culture are promoted the most. Flatness is the lowest common denominator, an averageness that has never been the marker of humanity’s proudest cultural creations.
4%
Somehow, Crystal, a 1980 Japanese novel by Yasuo Tanaka.
4%
The culture of Filterworld is the culture of presets, established patterns that get repeated again and again. The technology limits us to certain modes of consumption; you can’t stray outside of the lines. “Maniac fun,” as Yuri says, is gone—that is to say, a certain degree of originality, unprecedentedness, creativity, and surprise disappears when so much weighs on culture’s ability to spread through digital feeds.
4%
We can dispel their influence only by understanding them—by opening the cabinet of the Mechanical Turk to reveal the operator inside.
4%
Algorithms are key to the history of early mathematics. Around 300 BCE, the Greek philosopher Euclid recorded in his treatise Elements what is called the Euclidean algorithm, a way of finding the greatest common divisor of two or more numbers.
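The procedure Euclid recorded can be sketched in a few lines of Python. This is not from the book, just a minimal illustration of the method described above: repeatedly replace the larger number with the remainder of dividing it by the smaller, until the remainder reaches zero.

```python
def gcd(a: int, b: int) -> int:
    """Greatest common divisor via the Euclidean algorithm.

    At each step, (a, b) becomes (b, a mod b); the last
    nonzero value is the greatest common divisor.
    """
    while b:
        a, b = b, a % b
    return a


# For example, gcd(48, 18): 48 mod 18 = 12, 18 mod 12 = 6,
# 12 mod 6 = 0, so the answer is 6.
```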
5%
But the actual word algorithm comes from a single person—or at least his birthplace.
6%
“Turing-complete.” All programming languages, for example, are Turing-complete because they can model any kind of equation. (Even the spreadsheet software Excel became Turing-complete in 2021.)
6%
What Turing correctly concluded was that any computing machine would be able to do the work of any other—even Charles Babbage’s nineteenth-century Analytical Engine could theoretically perform the complex tasks that our laptops do now, if given infinite scale and time.
6%
There is something of the clash between mechanical rules and human operation within Turing’s life, too.
6%
the law is its own kind of algorithm, deciding judgment based on an implacable set of rules.
6%
no matter how complex, an algorithm remains in its essence an equation: a method to arrive at a desired conclusion, whether it’s a Sumerian diagram to divide an amount of grain equally among several men or the Facebook feed determining which post to show you first when you open the website. All algorithms are engines of automation, and, as Ada Lovelace predicted, automation has now moved into many facets of our lives beyond pure mathematics.
9%
Google’s authority. “Knowledge itself is power,” Francis Bacon wrote in the sixteenth century, but in the Internet era, sorting knowledge might be even more powerful. Information is now easy to find in abundance; making sense of it, knowing which information is useful, is much harder.
10%
The technology is not at issue—one can no more blame an algorithm itself for bad recommendations than blame a bridge for its engineering flaws. And some degree of reordering is necessary to make the vast stores of content on digital platforms comprehensible. The negative aspects of Filterworld might have emerged because the technology has been applied too widely, without enough consideration for the experience of the user, rather than for the advertisers targeting them. The recommendations, such as they are, don’t work for us anymore; rather, we are increasingly alienated by them.
10%
Facebook today is a frenetic highway with exits and on-ramps every few seconds; in the aughts it was more like a high school rec room where only a few people could hang out at a time. You built a profile, updated your status on the profile, and joined groups around common interests—but not much else.
12%
Culture is meant to be communal and requires a certain degree of consistency across audiences; without communality, it loses some of its essential impact.
12%
In 1933, the Japanese novelist Junichiro Tanizaki memorialized another moment of technological change when he wrote In Praise of Shadows, a book-length essay about electric lights arriving in Tokyo. The metaphorical switch had flipped; within Tanizaki’s lifetime (he was born in 1886), electric lights had gone from unknown in his country to ubiquitous, thanks to the intrusion of the West beginning in 1867, in a wave of increasing globalization and subsequent clashes of cultures. The Westerner’s “quest for a brighter light never ceases,” Tanizaki wrote. In the essay, Tanizaki mourned the unique ...more
13%
time. With new technology, the miraculous quickly becomes mundane, any glitch in its function is felt as bothersome, and finally it becomes ignorable, the miracle forsaken. We forget that life wasn’t always this way, that we couldn’t directly speak to people across long distances, that ceiling lights didn’t make every room bright, or that we didn’t have our information and media automatically filtered by machines. Such is the presence algorithmic feeds now have in our lives; the algorithm is often unconsidered, part of the furniture, noticed only when it doesn’t function in the way it’s ...more
15%
Algorithmic anxiety places the burden of action on the individual, not the business—the user must change their behavior or risk disappearing. Users sometimes complain of being “shadowbanned” when their posts or content on a platform suddenly lack the same level of engagement as before.
15%
The algorithm was a specter that haunted any encounter with digital platforms and their increasingly intrusive presence in our lives. That is not to say we understood what exactly algorithms were doing, per se: “Just as the fear of heights is not about heights, algorithmic anxiety is not simply about algorithms,” de Vries said.
15%
To move forward, we must disentangle the effects of algorithmic recommendations as technology from the ways that we have habitually adopted them as the primary gatekeepers of our online communication. Algorithms, after all, are inextricable from the data they run on, which has been created and is constantly refreshed by humans. Actual influence coexists with the fear of influence, which is equally manipulative. Algorithms entered our lives by promising to make decisions for us, to anticipate our thoughts and desires. Filterworld represents the establishment of the psychic world of ...more
17%
If the Amazon bookstore represented the triumph of algorithmic logic, then McNally was the pinnacle of human tastemakers, the word we often use for the people who sort and select the culture that we consume. Booksellers are tastemakers, but so are librarians who recommend titles for their patrons, professional buyers for lifestyle boutiques, radio-station DJs, movie booking agents who advocate on behalf of films to theaters nationwide, and concert programmers who book bands for venues. These tastemakers all provide an interface between the creators of culture and its consumers. They constantly ...more
17%
Through our algorithmic feeds, we get only the Amazon retail experience, not the McNally curatorial eye.
17%
The “taste” of tastemakers means personal preference, the discernment that we all apply to figure out what we like, whether in music, fashion, food, or literature. We make constant decisions to listen to, read, or wear one thing instead of another.
17%
Taste is a word for how we measure culture and judge our relationship to it. If something suits our taste, we feel close to it and identify with it, as well as form relationships with other people based on it, the way customers commune over clothing labels (either loving or hating a particular brand). Intentionally bad taste might be just as compelling as good taste, as the author Rax King described in her book Tacky: “Tackiness is joyfulness.” But in its origins, taste is a much deeper philosophical concept. It borders on morality, representing an innate sense of what is good in the world.
17%
Voltaire wrote, “In order to have taste, it is not enough to see and to know what is beautiful in a given work. One must feel beauty and be moved by it. It is not even enough to feel, to be moved in a vague way: it is essential to discern the different shades of feeling.” Taste goes beyond superficial observation, beyond identifying something as “cool.” Taste requires experiencing the creation in its entirety and evaluating one’s own authentic emotional response to it, parsing its effect. (Taste is not passive; it requires effort.) Montesquieu, who was a baron and a judge in addition to a ...more
18%
Love, money, and beauty could all be as easily lost as gained, and gaining may not always be better than losing. Absence must be appreciated as much as presence. “Iki is understood as a superior form of taste,” Kuki wrote.
18%
“Something can surprise us because it excites wonder, or because it is new or unexpected,” he wrote—it exists outside the realm of what we already know we like. “Our soul often experiences pleasure when it feels something it cannot analyze, or when an object appears quite different from what it knows it to be.” Understanding this feeling of surprise can take time. Taste is not necessarily instantaneous and changes as you consider and digest the experience of an artwork: “We become aware of the presence of great beauty when something inspires us with a surprise which at first is only mild, but ...more
18%
If taste indeed must be deeply felt, requires time to engage with, and benefits from the surprise that comes from the unfamiliar, then it seems that technology could not possibly replicate it, because algorithmic feeds run counter to these fundamental qualities.
18%
Taste can also feel more like a cause for concern than a source of personal fulfillment. A selection made based on your own personal taste might be embarrassing if it unwittingly clashes with the norms of the situation at hand, like wearing athleisure to the office or bright colors to a somber funeral.
19%
As taste requires surprise, it also thrives on challenge and risk, treading too far in a particular direction. Safety may avoid embarrassment, but it’s also boring.