Filterworld: How Algorithms Flattened Culture
Read between February 28 - April 11, 2024
2%
(In 2005, Amazon named its service for accomplishing digital tasks, like tagging photos or cleaning data, using an invisible marketplace of outsourced human labor “Mechanical Turk.”)
3%
Through their feeds, they are consuming similar kinds of digital content, no matter where they live, and so their preferences are shaped in that image. Algorithms are manipulative; the apps guide them through physical space to places that have adopted digitally popular aesthetics, winning attention and ratings from other users. With higher ratings come yet more algorithmic promotion and thus more visitors.
3%
Filterworld and its slick sameness can induce a breathtaking, near-debilitating sense of anxiety. The sameness feels inescapable, alienating even as it is marketed as desirable.
3%
We also adapt the way we present ourselves online to its incentives. We write tweets, post on Facebook, and take Instagram photos in forms we know will grab attention.
4%
Attention becomes the only metric by which culture is judged, and what gets attention is dictated by equations developed by Silicon Valley engineers.
4%
The written explanation of each Babylonian algorithm ended with the same phrase: “This is the procedure.” That line emphasizes an inherent quality of algorithms: they can be repeated, equally applicable and effective every time a given situation occurs. An acolyte of Silicon Valley today might describe them as scalable.
8%
The data they take in is used for gradual self-improvement to encourage even more engagement; the machine adapts to users and users adapt to the machine.
8%
Quality is subjective; data alone, in the absence of human judgment, can go only so far in gauging it.
18%
Building your own sense of taste, that set of subconscious principles by which you identify what you like, is an uphill battle compared to passively consuming whatever content feeds deliver to you.
19%
Safety may avoid embarrassment, but it’s also boring.
Note: Safety in taste
20%
The force of algorithmic pressure is not theoretical. It’s not a gloomy dystopian future but, rather, a pervading force that is already influencing cultural consumers and creators.
20%
Lately, trend cycles have accelerated into “microtrends” that come and go in a matter of weeks; she feels like she’s missing out when friends cite a meme or video she hasn’t seen. (The anxiety of not keeping up with the algorithm.)
21%
“I just want to know that what I like is what I actually like.”
21%
There was more diversity and originality in a physical walk down the Manchester sidewalk than in her digital feeds, which all ran together.
22%
Given the priorities of Spotify’s recommender system at the time, the content that was the most generic succeeded best. This is how algorithmic normalization happens.
31%
Like a corporatized form of Buddhism, the implied answer to anxiety is to learn not to desire differentiation in the first place, to simply be satisfied with whatever is placed in front of you. The cultivation of taste is discouraged because taste is inefficient in terms of maximizing engagement.
97%
Filterworld consists of one fundamental, unavoidable reality: never in human history have so many people experienced the same things, the same pieces of content disseminated instantly through the feeds, to our individual screens. Every consequence flows from that fact.
99%
But the first step toward escaping the algorithms’ grip is recognizing it. By moving away from the mindset of passive consumption and thinking about a post-algorithmic digital ecosystem, we begin to construct that alternative, demonstrating that the influence of algorithms is neither inevitable nor permanent.