Kindle Notes & Highlights
by Kyle Chayka
Read between February 28 and April 11, 2024
(In 2005, Amazon gave the name "Mechanical Turk" to its service for accomplishing digital tasks, like tagging photos or cleaning data, through an invisible marketplace of outsourced human labor.)
Through their feeds, they are consuming similar kinds of digital content, no matter where they live, and so their preferences are shaped in that image. Algorithms are manipulative; the apps guide them through physical space to places that have adopted digitally popular aesthetics, winning attention and ratings from other users. With higher ratings come yet more algorithmic promotion and thus more visitors.
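The loop described here, where higher ratings bring more promotion and thus more visitors, can be sketched as a toy simulation. This is a hypothetical illustration of the rich-get-richer dynamic, not any real platform's ranking code; the function name and parameters are invented for the example.

```python
import random

random.seed(0)

def simulate_feed(n_items=5, n_views=1000):
    # Every item (a cafe, a video, a post) starts with one rating.
    ratings = [1] * n_items
    for _ in range(n_views):
        # The "algorithm": surface items in proportion to their
        # existing ratings, so already-popular items are shown more.
        shown = random.choices(range(n_items), weights=ratings)[0]
        # Each visit adds another rating, compounding the lead.
        ratings[shown] += 1
    return ratings

final = simulate_feed()
# Near-identical starting points end up with very unequal attention.
print(sorted(final, reverse=True))
```

Even though all five items begin indistinguishable, promoting by past engagement lets small early leads snowball, which is the mechanism behind the uniform storefronts and feeds the passage describes.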
We also adapt the way we present ourselves online to its incentives. We write tweets, post on Facebook, and take Instagram photos in forms we know will grab attention.
Attention becomes the only metric by which culture is judged, and what gets attention is dictated by equations developed by Silicon Valley engineers.
The written explanation of each Babylonian algorithm ended with the same phrase: “This is the procedure.” That line emphasizes an inherent quality of algorithms: they can be repeated, equally applicable and effective every time a given situation occurs. An acolyte of Silicon Valley today might describe them as scalable.
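One commonly cited Babylonian procedure, the iterative method for approximating square roots, shows this repeatability concretely: the same steps apply every time the situation occurs. This sketch is an illustrative example of that class of algorithm, not necessarily the specific tablets the passage refers to.

```python
def babylonian_sqrt(n, iterations=10):
    """Approximate the square root of a positive number n by
    repeating one fixed procedure: average the current guess
    with n divided by that guess. 'This is the procedure.'"""
    guess = n / 2 if n > 1 else 1.0
    for _ in range(iterations):
        guess = (guess + n / guess) / 2
    return guess

# The identical procedure works for any positive input --
# repeatable, or in Silicon Valley terms, scalable.
print(babylonian_sqrt(2))
print(babylonian_sqrt(9))
```

The procedure converges quickly regardless of the input, which is exactly the quality the tablets' closing formula emphasizes.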
The data they take in is used for gradual self-improvement to encourage even more engagement; the machine adapts to users and users adapt to the machine.
Quality is subjective; data alone, in the absence of human judgment, can go only so far in gauging it.
Building your own sense of taste, that set of subconscious principles by which you identify what you like, is an uphill battle compared to passively consuming whatever content feeds deliver to you.
The force of algorithmic pressure is not theoretical. It’s not a gloomy dystopian future but, rather, a pervading force that is already influencing cultural consumers and creators.
Lately, trend cycles have accelerated into “microtrends” that come and go in a matter of weeks; she feels like she’s missing out when friends cite a meme or video she hasn’t seen. (The anxiety of not keeping up with the algorithm.)
“I just want to know that what I like is what I actually like.”
There was more diversity and originality in a physical walk down the Manchester sidewalk than in her digital feeds, which all ran together.
Given the priorities of Spotify’s recommender system at the time, the content that was the most generic succeeded best. This is how algorithmic normalization happens.
Like a corporatized form of Buddhism, the implied answer to anxiety is to learn not to desire differentiation in the first place, to simply be satisfied with whatever is placed in front of you. The cultivation of taste is discouraged because taste is inefficient in terms of maximizing engagement.
Filterworld consists of one fundamental, unavoidable reality: never in human history have so many people experienced the same things, the same pieces of content disseminated instantly through the feeds, to our individual screens. Every consequence flows from that fact.
But the first step of escaping the algorithms’ grip is recognizing it. By moving away from the mindset of passive consumption and thinking about a post-algorithmic digital ecosystem, we begin to construct that alternative, demonstrating that the influence of algorithms is neither inevitable nor permanent.