Kindle Notes & Highlights
Filterworld by Kyle Chayka
Read between September 1 and November 17, 2024
Algorithmic recommendations are the latest iteration of the Mechanical Turk: a series of human decisions that have been dressed up and automated as technological ones, at an inhuman scale and speed. Designed and maintained by the engineers of monopolistic tech companies, and running on data that we users continuously provide by logging in each day, the technology is both constructed by us and dominates us, manipulating our perceptions and attention. The algorithm always wins.
Each platform develops its own stylistic archetype, which is informed not just by aesthetic preferences but by biases of race, gender, and politics as well as by the fundamental business model of the corporation that owns it.
Filterworld culture is ultimately homogenous, marked by a pervasive sense of sameness even when its artifacts aren’t literally the same. It perpetuates itself to the point of boredom.
Attention becomes the only metric by which culture is judged, and what gets attention is dictated by equations developed by Silicon Valley engineers. The outcome of such algorithmic gatekeeping is the pervasive flattening that has been happening across culture. By flatness I mean homogenization but also a reduction into simplicity: the least ambiguous, least disruptive, and perhaps least meaningful pieces of culture are promoted the most. Flatness is the lowest common denominator, an averageness that has never been the marker of humanity’s proudest cultural creations.
With new technology, the miraculous quickly becomes mundane, any glitch in its function is felt as bothersome, and finally it becomes ignorable, the miracle forsaken. We forget that life wasn’t always this way, that we couldn’t directly speak to people across long distances, that ceiling lights didn’t make every room bright, or that we didn’t have our information and media automatically filtered by machines. Such is the presence algorithmic feeds now have in our lives; the algorithm is often unconsidered, part of the furniture, noticed only when it doesn’t function in the way it’s supposed to, …
The bookstore selection driven by the average of all of Amazon’s data was curiously homogenous and ultimately boring. It had been aggressively filtered in advance to appeal to me—or at least myself as a generic consumer—with abundant reassurances that other people did like the books on display. But I wasn’t excited or encouraged to page through any of them. Rather, I was overwhelmed, which might be the default state of consumers in Filterworld: surrounded by superabundant content, but inspired by none of it.
If taste indeed must be deeply felt, requires time to engage with, and benefits from the surprise that comes from the unfamiliar, then it seems that technology could not possibly replicate it, because algorithmic feeds run counter to these fundamental qualities.
But the more automated an algorithmic feed is, the more passive it makes us as consumers, and the less need we feel to build a collection, to preserve what matters to us. We give up the responsibility of collecting. Over the past two decades, the collecting of culture—whether films on DVD, albums on vinyl, or books on a shelf—has shifted from being a necessity to appearing as an indulgent luxury. Why would I bother worrying about what I have access to at hand when digital platforms advertise their ability to provide access to everything, forever, whenever I want? The problem is that there is …
The line between catering to algorithmic feeds enough and relying on them too much is a hard one to walk. The temptation to court the extreme attention, and thus profit, is always there, but the cumulative effect of so much reliance on automated feeds is a kind of desensitization. We end up not being able to imagine culture operating any other way than algorithmically.
It’s very possible to be interested in something but not like it, in the case of a difficult piece of music or an abstract painting. A piece of art can provoke you and leave you confused or perturbed but still drawn in.
I found that the way to fight the generic is to seek the specific, whatever you are drawn toward. You don’t need to be a credentialed or professionalized expert to be a connoisseur. You don’t need to monetize your opinion as an influencer for it to be legitimate. The algorithm promises to supplant your taste and outsource it for you, like a robotic limb, but all it takes to form your own taste is thought, intention, and care.
In Filterworld, the most popular culture is also the most desiccated. It is streamlined and averaged until, like a vitamin pill, it may contain the necessary ingredients but lacks any sense of brilliance or vitality. This process happens not by force, in the way of a mold stamping metal, but by compliance, as creators voluntarily shape their work to pursue the motivation of algorithmic exposure and access to audiences. This is not to say the creators are cynical; they have few other options, because garnering attention on digital platforms is the most reliable method of earning a living in the …
Our phones and feeds absorb so much of our attention and dominate so many of our preferences that stepping out of their conveniently predetermined paths and choosing an experience not immediately engaging feels somewhat radical.
We cannot just rid ourselves of algorithmic influence, even if we wanted to, since the technology has already inexorably shaped our era. But the first step of escaping the algorithms’ grip is recognizing it. By moving away from the mindset of passive consumption and thinking about a post-algorithmic digital ecosystem, we begin to construct that alternative, demonstrating that the influence of algorithms is neither inevitable nor permanent.