Kindle Notes & Highlights
Filterworld
by Kyle Chayka
Read between August 7 and August 14, 2024
Algorithmic recommendations shape the vast majority of our experiences in digital spaces by considering our previous actions and selecting the pieces of content that will most suit our patterns of behavior. They are supposed to interpret and then show us what we want to see.
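A minimal sketch of the mechanism this highlight describes, scoring new content against a record of past behavior and surfacing whatever best matches it; the items, tags, and ranking rule below are hypothetical illustrations, not anything the book or a real platform specifies.

```python
# Hypothetical sketch: rank candidate content by overlap with the tags a user
# has already engaged with, then surface the top matches. All data is invented.
from collections import Counter

def recommend(history, candidates, top_n=3):
    """Rank candidates by how many of their tags the user has engaged with before."""
    taste = Counter(tag for item in history for tag in item["tags"])
    def score(item):
        return sum(taste[tag] for tag in item["tags"])
    return sorted(candidates, key=score, reverse=True)[:top_n]

history = [{"tags": ["minimal", "coffee"]}, {"tags": ["coffee", "travel"]}]
candidates = [
    {"title": "Latte art reel", "tags": ["coffee", "minimal"]},
    {"title": "Noise album", "tags": ["experimental"]},
    {"title": "Cafe tour vlog", "tags": ["coffee", "travel"]},
]
print(recommend(history, candidates, top_n=2))  # the familiar rises; the unfamiliar sinks
```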
Surely there is more to my identity as a consumer of culture?
Algorithmic recommendations are the latest iteration of the Mechanical Turk: a series of human decisions that have been dressed up and automated as technological ones, at an inhuman scale and speed. Designed and maintained by the engineers of monopolistic tech companies, and running on data that we users continuously provide by logging in each day, the technology is both constructed by us and dominates us, manipulating our perceptions and attention. The algorithm always wins.
Filterworld culture is ultimately homogenous, marked by a pervasive sense of sameness even when its artifacts aren’t literally the same. It perpetuates itself to the point of boredom.
Filterworld and its slick sameness can induce a breathtaking, near-debilitating sense of anxiety. The sameness feels inescapable, alienating even as it is marketed as desirable.
In place of the human gatekeepers and curators of culture, the editors and DJs, we now have a set of algorithmic gatekeepers.
Attention becomes the only metric by which culture is judged, and what gets attention is dictated by equations developed by Silicon Valley engineers. The outcome of such algorithmic gatekeeping is the pervasive flattening that has been happening across culture. By flatness I mean homogenization but also a reduction into simplicity: the least ambiguous, least disruptive, and perhaps least meaningful pieces of culture are promoted the most. Flatness is the lowest common denominator, an averageness that has never been the marker of humanity’s proudest cultural creations.
The culture of Filterworld is the culture of presets, established patterns that get repeated again and again. The technology limits us to certain modes of consumption; you can’t stray outside of the lines. “Maniac fun,” as Yuri says, is gone—that is to say, a certain degree of originality, unprecedentedness, creativity, and surprise disappears when so much weighs on culture’s ability to spread through digital feeds.
Where Project Cybersyn suggested that the world run by data might be coherent and graspable, contained within a room, we now know that it is abstract and diffuse, everywhere and nowhere at once. We’re encouraged to forget the presence of algorithms.
Even in 1992, engineers at Xerox’s Palo Alto Research Center (better known as PARC) were already overwhelmed by it. They sought to solve the problem of “the increasing use of electronic mail, which is resulting in users being inundated by a huge stream of incoming documents,” David Goldberg, David Nichols, Brian M. Oki, and Douglas Terry wrote in a 1992 paper.
Quality is subjective; data alone, in the absence of human judgment, can go only so far in gauging it.
Information is now easy to find in abundance; making sense of it, knowing which information is useful, is much harder.
The negative aspects of Filterworld might have emerged because the technology has been applied too widely, with more consideration for the advertisers targeting users than for the experience of the users themselves.
The advent of Filterworld has seen a breakdown in monoculture. It has some advantages—we can each consume a wider range of media than ever before—but it also has negative consequences. Culture is meant to be communal and requires a certain degree of consistency across audiences; without communality, it loses some of its essential impact.
Social networks and streaming services have become the primary way a significant percentage of the global population metabolizes information, whether it’s music, entertainment, or art. We now live in an era of algorithmic culture.
Today, it is difficult to think of creating a piece of culture that is separate from algorithmic feeds, because those feeds control how it will be exposed to billions of consumers in the international digital audience.
We worry that our posts either won’t be seen by the right people or will be seen by too many if selected for virality, exposing us to strangers. There’s an emotional fallout to this quest for attention: we end up both overstimulated and numb, much like a glassy-eyed slots player waiting for matching symbols to come up.
Exploitation is disguised as an accidental glitch instead of an intentional corporate policy.
In her 2019 dissertation titled Algorithmic Anxiety in Contemporary Art, the scholar Patricia de Vries defined algorithmic anxiety as a condition in which “the possible self is perceived to be circumscribed, bounded, and governed by algorithmic regimes.”
Filterworld represents the establishment of the psychic world of algorithms—not just how they work, but how we users have come to rely on them, allowing them to displace our own agency, even as we come to resent their looming presence.
Like many things that operate at the scale of the Internet, the bookstore was inhuman.
Enshiu derided his own taste as too mainstream to be truly great. Yet catering to “the taste of the majority” might be the single goal of algorithmic feeds—a majority based on data.
Taste is not necessarily instantaneous and changes as you consider and digest the experience of an artwork: “We become aware of the presence of great beauty when something inspires us with a surprise which at first is only mild, but which continues, increases, and finally turns into admiration.”
If taste indeed must be deeply felt, requires time to engage with, and benefits from the surprise that comes from the unfamiliar, then it seems that technology could not possibly replicate it, because algorithmic feeds run counter to these fundamental qualities.
Consumption without taste is just undiluted, accelerated capitalism.
The overall digital environment is dictated by tech companies with ruthlessly capitalist, expansionary motives, which do not provide the most fertile ground for culture.
Not only does the feed try to guess what you like, it may not understand when your preferences move on or evolve.
There was also an algorithmic feedback loop, in which what is popular becomes more popular.
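A toy simulation of the feedback loop described here, in which exposure is allotted in proportion to current popularity so that early leads compound; the numbers and update rule are assumptions for illustration only, not platform data.

```python
# Toy "popular gets more popular" loop: each round, the feed picks a song with
# probability proportional to its current play count, then that count grows.
import random

random.seed(0)  # deterministic for illustration
popularity = {"song_a": 10, "song_b": 9, "song_c": 8}  # a nearly even start

for _ in range(1000):
    songs = list(popularity)
    weights = list(popularity.values())
    picked = random.choices(songs, weights=weights)[0]  # exposure tracks popularity
    popularity[picked] += 1  # each play makes the next recommendation more likely

print(popularity)  # a small early lead tends to snowball into dominance
```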
Instead, taste amounts to a form of consumerism in which what you buy or watch is the last word on your identity and dictates your future consumption as well.
So, per Spotify’s algorithm, one might assume that country music is equated with men, a formula that still held even when Watson made a playlist called “Country Music by Musicians with Vaginas.”
In the matter of personal taste, knowing what you like is difficult, but it’s equally hard to know that you don’t like or don’t want something when it’s being so strenuously presented as “For You.”
While political filter bubbles silo users into opposing factions by disagreement, cultural recommendations bring them together toward the goal of building larger and larger audiences for the lowest-common-denominator material.
You can accrue a laboriously curated digital library of music only to have it thrown into disarray when the app changes.
They need to reach us where we are, and where we are is leaning back in the feed, not paying too much attention, both accepting of the newest algorithmic recommendation and likely to flip away at a moment’s dissatisfaction. There is no choice but to adapt.
Culture is continuously refined according to the excesses of data generated by digital platforms, which offer a second-by-second record of what audiences are engaging with and when and how.
“Culture is no longer made. It is simply curated from existing culture, refined, and regurgitated back at us. The algorithms cut off the possibility of new discovery,” wrote Paul Skallas, an anti-technology lifestyle influencer, bemoaning the 2010s’ plague of movie sequels and endless continuations of Marvel superhero franchises.
In so many cases, the culture disseminated through algorithmic feeds is designed either to produce a sensory void or to be flattened into the background of life, an insidious degradation of the status of art into something more like wallpaper.
Her comment echoed another paradoxical message of Filterworld and algorithmic recommendations: You are unique, just like everybody else.
Those years saw the dawning mainstream awareness of globalization, the way that colonialism and capitalism had led to a planet that was more interconnected and felt smaller than ever before.
Where you were physically mattered less, both to your experiences and your affinities, than which channels of media you were consuming.
Chaotic diversity was unprofitable, just as personal taste makes for inefficient digital consumption.
A place’s uniqueness only attracts more tourists, and the increasing flow of travelers gradually grinds it into dust; they arrive to consume its character as a product and leave it ever more degraded.
When I lived in Bushwick, I saw a constant stream of French tourists being guided along the sparsely populated streets of the industrial neighborhood as if it were the Louvre, marveling at murals that eventually were replaced by hand-painted paid advertising.
A particular dish might be so elaborately visual that it functions more as an image than as food. That’s how a New York City bar called Black Tap became famous around 2016 for its elaborate milkshakes, which came so encrusted with candy and other accessories (even an entire piece of cake) that they were barely edible but looked appropriately dramatic in a photo for Instagram.
When her café started selling coffee online, Facebook and Instagram seemed to throttle their reach—unless they bought ads and boosted the social media company’s own profits.
Not everyone can easily be perceived as generic or fit into seemingly frictionless spaces. Convenience for one group of users doesn’t mean convenience for all of them.
But now-ubiquitous algorithmic recommendations have also made the left side of the long-tail graph that much larger, as they route even uninterested consumers toward the standardized set of content that has already proven popular.
A digital file can serve an infinite number of consumers, as long as the server space exists to host it. Larger audiences don’t change the experience. But physical places are not scalable.
version of each destination—a shallow, superficially interactive image that ultimately did not reflect reality. In fact, the image occluded reality, made it harder to find. That feeling—that truth and authenticity are evasive and we are discouraged from seeking them by a series of digital mediations—seems to be at the heart of Filterworld.
The number of thumbs-ups on a Facebook post or Tweet was a representation of how many people it reached, how many people were inspired to click a button by your little self-promotional bulletin or opinionated missive. But was that a representation of a post’s quality? Should we evaluate ourselves by this arbitrary new metric? The questions loomed, but given the utility of new social networks and the instant reach they afforded, it was easy to ignore such quandaries and just keep posting.