Kindle Notes & Highlights
Algorithmic recommendations shape the vast majority of our experiences in digital spaces by considering our previous actions and selecting the pieces of content that will most suit our patterns of behavior. They are supposed to interpret and then show us what we want to see.
Whether visual art, music, film, literature, or choreography, algorithmic recommendations and the feeds that they populate mediate our relationship to culture, guiding our attention toward the things that fit best within the structures of digital platforms.
The culture that thrives in Filterworld tends to be accessible, replicable, participatory, and ambient.
Filterworld and its slick sameness can induce a breathtaking, near-debilitating sense of anxiety. The sameness feels inescapable, alienating even as it is marketed as desirable.
Attention becomes the only metric by which culture is judged, and what gets attention is dictated by equations developed by Silicon Valley engineers.
Technology often appears to belong to the distant future right up until the moment the switch flips, and the leap forward becomes totally mundane, a simple fact of daily life.
Content creators from marginalized groups, who don’t have the same access to media and attention as, say, a white, private-school-educated, professionally trained dancer like D’Amelio, have a harder time benefiting from the tides of Filterworld.
Maybe the Amazon store’s uncanniness stemmed from how it confronted me with my own lack of freedom, demonstrating just how much algorithms push us into not thinking for ourselves.
I was overwhelmed, which might be the default state of consumers in Filterworld: surrounded by superabundant content, but inspired by none of it.
Taste is a word for how we measure culture and judge our relationship to it.
If taste indeed must be deeply felt, requires time to engage with, and benefits from the surprise that comes from the unfamiliar, then it seems that technology could not possibly replicate it, because algorithmic feeds run counter to these fundamental qualities.
When taste is too standardized, it is degraded.
Consumption without taste is just undiluted, accelerated capitalism.
Taste, in the end, is its own kind of algorithm that classifies things as good or bad.
Ultimately, such Netflix recommendations are less about finding the content that suits a user’s preferences and more about presenting what’s already popular or accessible, an illusion of taste.
It’s not that great art needs to be inherently offensive; rather, when everything conforms to established expectations, we miss out on culture that is truly progressive and uncomfortable, that might subvert categories rather than fit neatly into them.
The Internet has been increasingly enclosed into a series of bubbles, self-reinforcing spaces in which it becomes harder to find a diverse range of perspectives.
I appreciated the effect—after all, I had sought out that experience of sameness and achieved it. But there was also something missing: I wasn’t surprising myself with the unfamiliar while traveling, just reaffirming the superiority of my own sense of taste by finding it in a new place. Maybe that’s why it felt hollow.
Part of the fear of algorithmically driven art is the obviation of the artist: If viable art can be created or curated by computer, what is the point of the humans producing it?
Today, it can often feel like there is no creativity without attention, and no attention without the accelerant of algorithmic recommendations.
Cultural flattening is one consequence. But the same mechanism is also what makes our public political discourse more and more extreme, because conflict and controversy light up the feed and attract likes in a way that subtlety and ambiguity never will.
“Art will adjust to meet the attention span of contemporary life,”
“The problems of homogeneity are not just that it is boring; the most or least offensive stuff rises to the top, because that gets clicks,”
Algorithmic feeds help automatically distribute misinformation and can speed ideological radicalization, feeding users ever more extreme content in a single category.
While previous generations might have had dance halls or independent radio stations to help them discover new music during their formative teenage years, and young people in the twenty-first century have TikTok feeds and Spotify playlists, millennials in the late 1990s and early 2000s had online forums and MP3 piracy. These required much more labor to find what you like and consume it than the frictionless avenues of algorithmic feeds. While avoiding that labor may be convenient, it also makes our personal tastes flimsier, less hard-won.
The path of chasing something that will appeal to, or at least avoid offending, the highest number of people leads to homogeneity. And that homogeneity is inevitably cast in the mold of dominant groups: white, cisgender, heterosexual.
We turn to art to seek connection, yet algorithmic feeds give us pure consumption.