Kindle Notes & Highlights
by Kyle Chayka
Read between October 21 – December 6, 2024
Algorithmic recommendations shape the vast majority of our experiences in digital spaces by considering our previous actions and selecting the pieces of content that will most suit our patterns of behavior. They are supposed to interpret and then show us what we want to see.
Designed and maintained by the engineers of monopolistic tech companies, and running on data that we users continuously provide by logging in each day, the technology is both constructed by us and dominates us, manipulating our perceptions and attention. The algorithm always wins.
Each platform develops its own stylistic archetype, which is informed not just by aesthetic preferences but by biases of race, gender, and politics as well as by the fundamental business model of the corporation that owns it.
As the Indian literary theorist Gayatri Spivak wrote in 2012, “Globalization takes place only in capital and data. Everything else is damage control.”
On the other side of our algorithmic anxiety is a state of numbness. The dopamine rushes become inadequate, and the noise and speed of the feeds overwhelming. Our natural reaction is to seek out culture that embraces nothingness, that blankets and soothes rather than challenges or surprises, as powerful artwork is meant to do. Our capacity to be moved, or even to be interested and curious, is depleted.
Attention becomes the only metric by which culture is judged, and what gets attention is dictated by equations developed by Silicon Valley engineers. The outcome of such algorithmic gatekeeping is the pervasive flattening that has been happening across culture. By flatness I mean homogenization but also a reduction into simplicity: the least ambiguous, least disruptive, and perhaps least meaningful pieces of culture are promoted the most. Flatness is the lowest common denominator, an averageness that has never been the marker of humanity’s proudest cultural creations.
Algorithms are key to the history of early mathematics.
When we talk about “the algorithm,” it often feels like a force that began to exist only recently, in the era of social networks. But we’re discussing a technology with a history and legacy that has slowly formed over centuries, long before the Internet existed.
It’s important to remember that how the Facebook feed works is a commercial decision, the same as a food manufacturer deciding which ingredients to use.
The inability to evaluate quality brings to mind artificial intelligence: New tools like ChatGPT seem to be able to understand and generate meaningful language, but really, they only repeat patterns inherent in the preexisting data they are trained on. Quality is subjective; data alone, in the absence of human judgment, can go only so far in gauging it.
Algorithmic feeds are sometimes more formally and literally labeled “recommender systems,” for the simple act of choosing a piece of content.
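The phrase "recommender system" has a concrete engineering meaning behind it. As a purely illustrative sketch (not how any named platform actually works, and with invented item names), the simplest form scores candidate items by how often they co-occur with items a user has already engaged with, then surfaces the highest-scoring ones:

```python
# Minimal, hypothetical co-occurrence recommender. Real platform systems
# are far more complex; everything named here is illustrative only.
from collections import Counter
from itertools import combinations

# Hypothetical engagement histories: user -> items they interacted with.
histories = {
    "a": {"cat_video", "lofi_stream", "recipe"},
    "b": {"cat_video", "lofi_stream"},
    "c": {"cat_video", "news_clip"},
}

# Count how often each ordered pair of items shares a user's history.
co_occurrence = Counter()
for items in histories.values():
    for x, y in combinations(sorted(items), 2):
        co_occurrence[(x, y)] += 1
        co_occurrence[(y, x)] += 1

def recommend(seen, k=2):
    """Rank unseen items by total co-occurrence with the user's history."""
    scores = Counter()
    for item in seen:
        for (x, y), n in co_occurrence.items():
            if x == item and y not in seen:
                scores[y] += n
    return [item for item, _ in scores.most_common(k)]

print(recommend({"cat_video"}))  # items most associated with cat_video
```

The sketch makes the book's point concrete: the system never judges quality, it only counts correlations in past behavior and extrapolates them forward.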
The negative aspects of Filterworld might have emerged because the technology has been applied too widely, without enough consideration for the experience of the user, rather than for the advertisers targeting them. The recommendations, such as they are, don’t work for us anymore; rather, we are increasingly alienated by them.
The algorithmic feed itself is not consistent or on a linear path toward some ultimate perfection. It changes with a company’s priorities.
Euphemisms have emerged for terms that trigger the algorithm to block or slow down a video: “unalive” for kill, “SA” for sexual assault, “spicy eggplant” instead of vibrator, as the journalist Taylor Lorenz documented in the Washington Post. Such vocabulary was nicknamed “algospeak”: speech molded in the image of the algorithm.
(After the program ended, it also emerged that Facebook had lied about the traffic the videos were getting, inflating the numbers up to nine times, according to a lawsuit.)
Culture is meant to be communal and requires a certain degree of consistency across audiences; without communality, it loses some of its essential impact.
Social networks and streaming services have become the primary way a significant percentage of the global population metabolizes information, whether it’s music, entertainment, or art. We now live in an era of algorithmic culture.
Monopolistic growth is more important to these entities than the quality of user experience and certainly more important than the equitable distribution of culture through the services’ feeds.
(A digital platform has none of the curatorial responsibility of, say, an art museum.)
Andrew Bosworth, a deputy of Mark Zuckerberg’s at Facebook, wrote in a 2016 internal memo: “So we connect more people. That can be bad if they make it negative. Maybe it costs someone a life by exposing them to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.”
Quality is subjective, of course, but the host’s sentiment speaks to how users can feel misunderstood and misjudged by algorithmic evaluations. “It’s like an exam, but you don’t know what’s going to be on this exam, or how to score well on this exam,” Jhaver explained. And it’s not just the users who don’t know what’s going on. Jhaver continued: “At the end of the day, even the people who create the algorithms cannot tell you which factor was responsible for which decision; the complexity of the algorithm is so high that disentangling different factors is just not possible.”
Gig-economy platforms like Airbnb have long promised flexible work and alternative ways of making or supplementing a living, but they also created a new form of labor in the need to stay up to date on changes in algorithmic priorities.
Algorithmic anxiety is something of a contemporary plague. It induces an OCD-ish tendency in many users toward hyperawareness and the need to repeat the same rituals, because when these rituals “work,” the effect is so compelling, resulting in both a psychological dopamine rush from receiving attention and a potential economic reward if your online presence is monetized.
Algorithmic anxiety happens because there is a dramatically asymmetrical relationship between user and algorithm. For the individual user, trying to predict or dictate the outcome of the algorithm is like trying to control the tide. To continue the metaphor, all users can do is surf the wave that’s already formed. There is little incentive for companies to assuage this anxiety because a user’s confusion can be beneficial to business.
Behind each rearrangement I felt the hand that selected the book, an individual intelligence, rather than a singular formula. Browsing was a way of discovering new things; one could argue that Amazon’s formula of “if you like this, you’ll like that” functions similarly, but the connections at McNally were less direct and literal. They expanded the shopper’s idea of what a particular category could contain.
If taste indeed must be deeply felt, requires time to engage with, and benefits from the surprise that comes from the unfamiliar, then it seems that technology could not possibly replicate it, because algorithmic feeds run counter to these fundamental qualities.
The Netflix algorithm slots users into particular “taste communities,” of which there are more than two thousand.
In 2023, Netflix was streaming fewer than four thousand films, a lower total than what one of the larger Blockbuster stores stocked before that company disappeared, often upwards of six thousand films.
This is wild to me. Freed from the constraints of physical space and real estate, Netflix still offers only about two-thirds of what a large Blockbuster store once carried.
The recommendations create an illusion of diversity and depth that doesn’t exist in reality.
When we embrace ambience, we lose the meaning of the finite and the discrete.
The successor to Brian Eno’s Music for Airports is the YouTube stream “lofi hip hop radio—beats to relax/study to,” which was created by a DJ named Dmitri going by the username ChilledCow in 2015.
Even in the nineteenth century, Tarde predicted that in the future stylistic difference would be based not on “diversity in space” but “diversity in time.”
Spivak is correct that what really flows across the planet are various forms of money and information: investments, corporations, infrastructure, server farms, and the combined data of all the digital platforms, sluicing invisibly like wind or ocean currents between nations. We users voluntarily pump our own information through this same system, turning ourselves into flowing commodities, too.
In the place of physical hotels and airports, we have Twitter, Facebook, Instagram, and TikTok as spaces of congregation that erase differences.
These installations became utterly exhausting to encounter by the end of the decade. So-called Instagram Museums arose that made the taking of the photo the intentional goal of the experience, as if the only point of going to the Louvre was to take a selfie in front of the Mona Lisa.
In that, they served a very clear purpose. Instagram walls or experiences attracted visitors to a locale and kept them engaged by giving them an activity to perform with their phones, like a restaurant providing coloring books for kids. It was a concession to the fact of our growing addictions—you can’t just go somewhere; you must document your experience of it.
We users are what makes social media run, and yet we also aren’t given full control over the relationships we develop on the platforms, in large part because algorithmic recommendations are so dominant.
Not everyone is able to easily be perceived as generic or fit into seemingly frictionless spaces. Convenience for one group of users doesn’t mean convenience for all of them.
The incentive isn’t to offer a particularly unique experience; it’s to convince the most website visitors possible to click the Buy button, to engage with the content.
Leaked data from the video-game streaming platform Twitch suggested that only the top 0.01 percent of Twitch creators were able to earn the median U.S. income.
But physical places are not scalable.
The trade-off that Chesky implied was that the more movement there was, the less identity there would be. Identity, too, is a matter of content.
Collectors of ancient Chinese painting habitually added their names to favored paintings by stamping them with seals—akin to an approving tweet or a thumbs-up like. Masterpieces acquired bunches of stamps over centuries, with some even layered on top of the painted landscape itself.
Art itself—not to mention artists as people—tends not to be bound by the quest for likability, and yet likes are what the current tyranny of quantification prioritizes most.
To fit into digital feeds, in order to attract those pernicious likes and further promote itself as much as possible, culture has to be content first and art second—if at all.
Emily in Paris is cognizant of how life has been taken over by social media feeds and the need for any piece of culture—fashion line, retail store, public art installation—to spark content creation.
In 2022, a United States survey found that 54 percent of the respondents, ages thirteen to thirty-eight, would become influencers if given the opportunity. Another survey of three thousand children in 2019 found that 30 percent of them would choose to be a YouTuber—another kind of influencer—over other careers like professional athlete, musician, and astronaut.
Every user is famous to their followers.
Where the term blogger described a literal activity of writing, “influencing” is closer to the financial side of what’s going on. It’s a sales job, convincing audiences to buy something, first a vision of aspirational lifestyle and then the products that make it up.