All of these small decisions used to be made one at a time by humans: A newspaper editor decided which stories to put on the front page, and a magazine photo editor selected photographs to publish; a film programmer picked out which films to play in a theater’s season; an independent radio station DJ assembled playlists of songs that fit their own mood and the particular vibe of a day or a place. While these decisions were of course subject to various social and economic forces, the person in charge of them ensured a basic level of quality, or even safety, that can be missing from the algorithmic feed.
In 2019, the writer Jia Tolentino similarly identified “Instagram face,” the “distinctly white but ambiguously ethnic” mix of features made popular on the platform and enabled by plastic surgery: “It has catlike eyes and long, cartoonish lashes; it has a small, neat nose and full, lush lips.”
Filterworld culture is ultimately homogeneous, marked by a pervasive sense of sameness even when its artifacts aren’t literally the same. It perpetuates itself to the point of boredom.
Filterworld and its slick sameness can induce a breathtaking, near-debilitating sense of anxiety. The sameness feels inescapable, alienating even as it is marketed as desirable. “Surveillance capitalism,” as the scholar Shoshana Zuboff has labeled it, is how tech companies monetize the constant absorption of our personal data, an intensification of the attention economy.
Attention becomes the only metric by which culture is judged, and what gets attention is dictated by equations developed by Silicon Valley engineers. The outcome of such algorithmic gatekeeping is the pervasive flattening that has been happening across culture. By flatness I mean homogenization, but also a reduction to simplicity: the least ambiguous, least disruptive, and perhaps least meaningful pieces of culture are promoted the most. Flatness is the lowest common denominator, an averageness that has never been the marker of humanity’s proudest cultural creations.
Any algorithm, in the historical sense of a mathematical process, can be carried out by such a Turing machine. And any computational system that can compute everything a Turing machine can is said to be “Turing-complete.” Nearly every general-purpose programming language, for example, is Turing-complete: given unbounded memory, each can simulate a Turing machine, and so can express any algorithm.
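As a minimal sketch of what that equivalence means (the function names and the little transition table here are my own illustration, not anything from the text), a few lines of Python suffice to simulate a simple Turing machine; any language that can express this loop can, in principle, compute whatever a Turing machine can:

```python
# A minimal Turing machine simulator: a transition table, a tape, and a loop.
def run_turing_machine(transitions, tape, state="start", halt="halt"):
    """transitions maps (state, symbol) -> (new_state, write_symbol, move)."""
    tape = dict(enumerate(tape))  # sparse tape; unvisited cells read as blank "_"
    head = 0
    while state != halt:
        symbol = tape.get(head, "_")
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Example machine: flip every bit, halting at the first blank cell.
flip_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine(flip_bits, "1011"))  # -> 0100
```

The machine itself is trivial; the point is the shape of the loop: read a symbol, consult a finite table, write, move. Everything a computer does reduces to that.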
In March 2016, the Instagram feed began switching from a chronological to an algorithmic arrangement. The change was tested on small groups of users then rolled out to more and more, until it hit everyone. The increasingly out-of-order feed induced a sense of confusion and anxiety akin to the feeling of someone rearranging the furniture in your house without your knowledge. Before, by scrolling through the feed, you were moving back in time. But suddenly, a post from two days ago appeared at the top of your feed.
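The change described above amounts to swapping the feed’s sort key. A hypothetical sketch (the posts and scores are invented for illustration) shows how the same three posts reorder when ranked by a predicted-engagement score rather than by recency:

```python
# Invented posts: "age_hours" is time since posting, "score" a hypothetical
# predicted-engagement value of the kind a ranking model might assign.
posts = [
    {"caption": "latte art",    "age_hours": 2,  "score": 0.31},
    {"caption": "beach sunset", "age_hours": 14, "score": 0.55},
    {"caption": "puppy",        "age_hours": 49, "score": 0.92},  # two days old
]

chronological = sorted(posts, key=lambda p: p["age_hours"])         # newest first
algorithmic = sorted(posts, key=lambda p: p["score"], reverse=True)  # "best" first

print([p["caption"] for p in chronological])
print([p["caption"] for p in algorithmic])  # the two-day-old post jumps to the top
```

Under the chronological sort, scrolling down moves you back in time; under the score sort, the two-day-old post lands on top, which is exactly the disorientation the paragraph describes.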
We forget that life wasn’t always this way, that we couldn’t directly speak to people across long distances, that ceiling lights didn’t make every room bright, or that we didn’t have our information and media automatically filtered by machines.
Such is the presence algorithmic feeds now have in our lives; the algorithm is often unconsidered, part of the furniture, noticed only when it doesn’t function in the way it’s supposed to, like traffic lights or running water.
In other words, for a piece of culture to be commercially successful, it must already have traction on digital platforms.
Filterworld represents the establishment of the psychic world of algorithms—not just how they work, but how we users have come to rely on them, allowing them to displace our own agency, even as we come to resent their looming presence.
A joke written on Twitter by a Google engineer named Chet Haase in 2017 pinpoints the problem: “A machine learning algorithm walks into a bar. The bartender asks, ‘What’ll you have?’ The algorithm says, ‘What’s everyone else having?’ ” The punch line is that in algorithmic culture, the right choice is always what the majority of other people have already chosen. But even if everyone else ordered one, maybe you’re just not in the mood for a whiskey sour.
In other cases, there’s the friend who always knows the right wine to bring to dinner, the friend tuned into the most relevant fashion brands, or the friend who recommends television shows worth watching. Taste is a word for how we measure culture and judge our relationship to it. If something suits our taste, we feel close to it and identify with it, as well as form relationships with other people based on it, the way customers commune over clothing labels (either loving or hating a particular brand).
Taste goes beyond superficial observation, beyond identifying something as “cool.” Taste requires experiencing the creation in its entirety and evaluating one’s own authentic emotional response to it, parsing its effect.
Taste is an abstract, ineffable, unstable thing. A listener to music or reader of a book cannot truly tell if they will enjoy something before they experience it; pleasure in a piece of art is never guaranteed. So when encountering an artwork, we immediately evaluate it by some set of mental principles, and, hopefully, find the beauty in it, feel affirmed, even if we can’t quite describe what that beauty is or how exactly we determined it in the first place.
Taste is not necessarily instantaneous and changes as you consider and digest the experience of an artwork: “We become aware of the presence of great beauty when something inspires us with a surprise which at first is only mild, but which continues, increases, and finally turns into admiration.”
If taste indeed must be deeply felt, requires time to engage with, and benefits from the surprise that comes from the unfamiliar, then it seems that technology could not possibly replicate it, because algorithmic feeds run counter to these fundamental qualities.
Fashion, to take one example, is often strongest as an art form when it doesn’t follow the rules and chase averages. Part of its appeal lies in breaking with the social code: wearing something unexpected or strange, even at times challenging your own taste. It’s something that no automated recommendation alone can approximate.
Jada Watson, a professor at the University of Ottawa who studies country radio airplay, tried the feature for herself and ended up with a similar result: She went through twelve refreshes of all men. Even though, for research purposes, Watson uses Spotify solely to listen to women artists, she found that “within the first 200 songs (19 refreshes), only 6 songs (3%) by women and 5 (3%) by male-female ensembles were included (all emerging after 121 songs by male artists).”
So, per Spotify’s algorithm, one might assume that country music is equated with men, a formula that still held even when Watson made a playlist called “Country Music by Musicians with Vaginas.” A biased algorithm defined the genre in a biased way.
Students were then asked how much they would pay for a given song; the higher the star rating, the more they were willing to pay. Each added star resulted in a 10 to 15 percent increase in willingness to pay for the song. The experiment showed how the perception of recommendation could skew the perceived value of a given piece of culture, making it seem more likable or significant. The flaw intensifies due to the self-reinforcing loop of algorithmic recommendations.
Over time, the system will “provide less diverse recommendations,” as Zhang told the podcast Planet Money. Eventually, she said, it will “provide similar items to everybody, like, regardless of personal taste.” Hence the homogenization we are experiencing today.
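The narrowing dynamic Zhang describes can be caricatured in a few lines. This is my own toy model, not the system she studied: each round, an item is recommended with probability proportional to the square of its past clicks, so already-popular items get boosted superlinearly, and a single early leader soon captures nearly all attention:

```python
import random

def simulate_feedback_loop(n_items=50, rounds=10_000, seed=0):
    """Toy rich-get-richer loop: recommend items weighted by clicks**2."""
    rng = random.Random(seed)
    clicks = [1] * n_items  # every item starts with one "seed" click
    for _ in range(rounds):
        # The platform favors already-popular items superlinearly.
        item = rng.choices(range(n_items), weights=[c * c for c in clicks])[0]
        clicks[item] += 1  # each recommendation makes the item more likely next time
    return clicks

clicks = simulate_feedback_loop()
top_share = max(clicks) / sum(clicks)
print(f"top item's share of all clicks: {top_share:.0%}")
```

Fifty items start out identical, yet one of them ends up with the overwhelming majority of clicks, purely because small random advantages compound through the loop. Real recommenders are far subtler, but the convergence toward “similar items for everybody” follows the same mechanism.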
But the more automated an algorithmic feed is, the more passive it makes us as consumers, and the less need we feel to build a collection, to preserve what matters to us. We give up the responsibility of collecting.
Over the past two decades, the collecting of culture—whether films on DVD, albums on vinyl, or books on a shelf—has shifted from being a necessity to appearing as an indulgent luxury.
“Ownership is the most intimate relationship that one can have to objects,” Benjamin wrote. “Not that they have come alive in him; it is he who lives in them.” In other words, we often discover, and even rediscover, ourselves in what we keep around us.
There’s no way to access older, favored versions of Spotify or Instagram the way it was possible in decades past to stick with an outdated software version by declining updates. The apps now mostly exist in the cloud, accessed online by the user, and the company wholly controls how they work. This instability only intensifies the cultural flattening going on, since users can’t store or revisit their past experiences within their original context. All that exists is a relentlessly changing digital present tense.
Streaming services distinguish between two styles of consumption to analyze their users, as executives have described to me. There are “lean-in” moments, when the user is paying attention, choosing what to consume and actively judging the result. And there are “lean-back” moments, when users coast along with content running in the background, not worrying too much about what it is or what plays next.
Algorithmic recommendations push us toward the latter category, in which we are fed culture like foie-gras ducks, with more regard for volume than quality—because volume, sheer time spent, is what makes money for the platform through targeted advertising.
The gimmick worked; in 2021 TikTok passed a billion monthly active users and took its place as a successor in the line of massive social networks that host our digital lives. Its success meant that all-algorithmic feeds increasingly became the default, and it began to create a new era of digital fame and cultural success as well.
Patti Smith has written that as she got older, she still went to cafés to write and ordered a cappuccino each time, but didn’t need to actually drink the cappuccino; its presence on the table next to her notebook was enough to spark creativity. For me, the space itself was inspiration enough.
That’s how a New York City bar called Black Tap became famous around 2016 for its elaborate milkshakes, which came so encrusted with candy and other accessories (even an entire piece of cake) that they were barely edible but looked appropriately dramatic in a photo for Instagram. In fact, the shakes were designed not by a chef but by the restaurant’s social media manager. They were first served only at special events for social media influencers, but became even more popular with regular customers, who could turn them into content just the same. They were meant to be photographed more than eaten.
May observed an effect that might be called follower inflation. High follower numbers correlate less and less to actual engagement over time, as the platform’s priorities change and as some active accounts leave, or the same content tricks stop working. It’s a familiar feeling for all of us who have been on Instagram over the past decade. While it might hurt your ego to receive fewer likes on a selfie, it’s a real financial problem when that follower footprint is how a business makes money, whether it’s a café attracting visitors or an influencer selling sponsored content.
That’s what Anca Ungureanu was trying to do in Bucharest. “We are a coffee shop where you can meet people like you, people that have interests like you,” she said. Her comment made me think that a certain amount of homogeneity might be an unavoidable consequence of algorithmic globalization, simply because so many like-minded people are now moving through the same physical spaces, influenced by the same digital platforms. The sameness has a way of compounding.
Of course, Iceland wasn’t beautiful because of Instagram, the way cafés were intentionally designed to be Instagram-ready. It was beautiful because of various accidents of geology and geography. But Instagram framed and highlighted its natural assets, and the platform’s recommendations propelled iconic images of Icelandic hot spots to the top of the feed for hundreds of millions of users, turning those images into the de facto representations of the country.
The specter of our online presences haunts every moment, causing us to constantly judge which scenes in our lives are worth broadcasting to our personal audiences, offering them up to the feed.
This need to corral an audience in advance by succeeding on social media can be explained by the useful phrase “content capital.” Established by the scholar Kate Eichhorn in her 2022 monograph Content, it describes the Internet-era state in which “one’s ability to engage in work as an artist or a writer is increasingly contingent on one’s content capital; that is, on one’s ability to produce content not about one’s work but about one’s status as an artist, writer, or performer.” In other words, the emphasis is not on the thing itself but on the aura that surrounds it, the ancillary material that accompanies the work.
You could talk to so many people at once on social media! But that exposure became enervating, too, and I missed the previous sense of intimacy, the Internet as a private place—a hideout from real life, rather than the determining force of real life.
Instagram and WhatsApp are just two of Facebook’s dozens of acquisitions. Google similarly acquired YouTube in 2006 and turned the video-uploading site into a media-consumption juggernaut, a replacement for cable television. Other social networks didn’t survive. Tumblr, for example, once on par with Twitter and Facebook, was bought by Yahoo in 2013 for $1.1 billion. Yet it suffered through years of mismanagement and declining growth, barely changing its initial product.
If the 2000s saw the emergence of the mainstream Internet, and the 2010s saw the rise and domination of massive digital platforms, then the next decade seems likely to embrace decentralization once more. Agency might be the watchword: the ability of an individual user to dictate how they publish and view content.
GDPR recognizes that these days, we are our data—data both documents what we have done and influences what we are able to do, or are most likely to do, in the future, oftentimes through algorithmic decisions. Thus, we should have some of the same kinds of control over it, and rights to it, that we have over our physical bodies.
Frictionlessness is always the Filterworld ideal—as soon as you slow down, you might just reconsider what you’re clicking on and giving your data away. “Friction allows people to think about their actions,” she continued, a point that applies just as well to Spotify radio or the TikTok feed. If you think too much, you might stop.
Meta announced that both Facebook and Instagram would add options for users to opt out of algorithmic recommendations entirely, removing the possibility of automated personalization. But that option would only be available for users in the EU—because the US has been much slower to adopt such legislation. When I saw the headlines, I was jealous. It was suddenly as if only EU residents could breathe pollution-free air.
The most powerful choice might be the simplest one: Stop lending your attention to platforms that exploit it. There is a way to do that while continuing to use digital technology, sticking to websites and companies that treat users better. We can return to a more DIY Internet. But the more dramatic option is to log out entirely and figure out how to sustain culture offline once more.
By 2023, social media seems to