There are two forces shaping our tastes. As I described previously, the first is our independent pursuit of what we individually enjoy, while the second is our awareness of what most other people appear to like, the dominant mainstream.
Algorithmic feeds further reinforce the presence of that mainstream, against which our personal choices are evaluated.
Taste, in the end, is its own kind of algorithm that classifies things as good or bad.
So it can be difficult to distinguish that organic social code from the software code of recommendation algorithms, though it is vital to do so.
The former is mercurial and driven by elite gatekeepers, a powerful group built up over a century of modern cultural industries and riddled with its own blind spots and biases, including those of gender and race.
Yet the human flaws may become even more dramatic in an algorithmic ecosystem when the actions of mass audiences dictate what can easily be seen. Racism, sexism, and other forms of bias are a de facto part of that equation.
The overall digital environment is dictated by tech companies with ruthlessly capitalist, expansionary motives, which do not provide the most fertile ground for culture.
Fashion, to take one example, is often strongest as an art form when it doesn’t follow the rules and chase averages. Part of its appeal lies in breaking with the social code: wearing something unexpected or strange, even at times challenging your own taste.
Algorithmic feeds are a double-edged sword: A marginalized fashion designer might find a way to game the Instagram algorithm and spark their own popularity without waiting to be noticed by a white editor who might be biased against them. But they are then conforming to the tenets of a tech company even more powerful and more blinkered than the editor.
On the consumer side, the bombardment of recommendations can induce a kind of hypnosis that makes listening to, watching, or buying a product all but inevitable—whether it truly aligns with your taste or not.
Algorithmic anxiety is fueled in part by the scourge of targeted online advertising, which uses the same kind of algorithms as the feeds.
Since ads are the primary way that many digital platforms and online publications make money, they are everywhere, interrupting articles, popping up with autoplay videos.
Instead of accomplishing the goal of sustaining my attention, the recommendations force me to confront the aesthetics’ lack of context and meaning.
Yet algorithmic recommendations also have a way of warping cultural creators’ intentions for what they put out into the feed, changing their relationship to their own work as well.
This is how algorithmic normalization happens. Normal is a word for the unobtrusive and average, whatever won’t provoke negative reactions.
As fewer people see the content that doesn’t get promoted, there is less awareness of it and less incentive for creators to produce it—in part because it wouldn’t be financially sustainable.
The bounds of aesthetic acceptability become tighter and tighter until all that’s left is a column in the middle. While popular styles shift, like moving targets, the centralization and normalization persist.
Filterworld can be fascistic, in that the algorithmic feeds tend to create templates of how things are supposed to be, always informed by inherent biases—a bracketing of reality that is then fulfilled by users creating content that fits the mold.
And because Spotify implacably controls how listeners interact with the music, it does not have to incentivize musicians the way major labels once did, with rich record deals or other perks.
The idea of taste became a series of ever more granular preferences, liking A instead of B, rather than a deeper-seated, holistic sense of self.
Even more striking than the recommendations themselves is the fact that Netflix also algorithmically changes the thumbnail art on all its content, tailoring it to the specific user.
Users have observed, with justifiable anxiety, that Netflix’s home page displays only thumbnails featuring people of their own skin color, even though the platform theoretically does not track their race.
By changing the show’s thumbnail in such an aggressive way, the platform is manipulating the users, not recommending what they might like but altering the presentation of the same content to make it appear more similar to their preferences.
The algorithm “defaulted to recommending content with a high likelihood of producing user engagement and did so under the guise of personalization.”
Ultimately, such Netflix recommendations are less about finding the content that suits a user’s preferences and more about presenting what’s already popular or accessible, an illusion of taste.
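To make that dynamic concrete, here is a minimal, hypothetical sketch of engagement-driven thumbnail selection, modeled as an epsilon-greedy bandit. Every name and number below is an assumption for illustration; Netflix’s actual artwork system is proprietary and far more sophisticated.

```python
# Hypothetical sketch: picking a title's thumbnail to maximize clicks,
# not to reflect the user's taste. Epsilon-greedy bandit, illustrative only.
import random
from collections import defaultdict

class ThumbnailBandit:
    """Shows, per title, the thumbnail variant with the best observed
    click-through rate, exploring alternatives a fraction of the time."""

    def __init__(self, variants, epsilon=0.1):
        self.variants = variants        # candidate artworks for one title
        self.epsilon = epsilon          # exploration rate
        self.shows = defaultdict(int)   # times each variant was displayed
        self.clicks = defaultdict(int)  # times each variant was clicked

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(self.variants)  # explore occasionally
        # Exploit: pick the highest click-through rate so far
        # (unseen variants get an optimistic 1.0 so they get tried).
        return max(
            self.variants,
            key=lambda v: self.clicks[v] / self.shows[v] if self.shows[v] else 1.0,
        )

    def record(self, variant, clicked):
        self.shows[variant] += 1
        if clicked:
            self.clicks[variant] += 1

# Simulated audience with invented click probabilities per artwork.
bandit = ThumbnailBandit(["romance_art", "comedy_art", "thriller_art"])
for _ in range(1000):
    art = bandit.choose()
    clicked = random.random() < {"romance_art": 0.12,
                                 "comedy_art": 0.08,
                                 "thriller_art": 0.05}[art]
    bandit.record(art, clicked)
print(bandit.choose())  # almost always "romance_art" after enough trials
```

The point of the sketch is that nothing in it consults the user’s taste: the system simply converges on whichever artwork draws the most clicks, so the “personalization” is engagement maximization.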
The recommendations create an illusion of diversity and depth that doesn’t exist in reality.
Taste’s moral capacity, the idea that it generally leads an individual toward a better society as well as better culture, is being lost. Instead, taste amounts to a form of consumerism in which what you buy or watch is the last word on your identity and dictates your future consumption as well.
In guiding us into particular categories through soft coercion, the Netflix algorithm ends up defining our taste as only one fixed thing, made more rigid by every successive interaction on the platform, moving deeper into a pigeonhole.
That lack of confrontation is concerning. It’s not that great art needs to be inherently offensive; rather, when everything conforms to established expectations, we miss out on culture that is truly progressive and uncomfortable, that might subvert categories rather than fit neatly into them.
McBride discovered that the playlist-recommendation function wasn’t based on the user’s listening habits at all, but on the title of the playlist.
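A hypothetical sketch of what McBride describes: a recommender keyed to the words in a playlist’s title rather than to anything the user has actually played. The catalog and keyword map below are invented for illustration.

```python
# Illustrative only: recommendations driven by the playlist's name,
# while the listening history is accepted but never consulted.
CATALOG = {
    "chill": ["Lo-Fi Sunset", "Ambient Drift"],
    "workout": ["Power Sprint", "Beast Mode"],
    "focus": ["Deep Work Piano", "White Noise Study"],
}

def recommend(playlist_title, listening_history):
    """Suggest songs by matching title keywords against the catalog."""
    suggestions = []
    for word in playlist_title.lower().split():
        suggestions.extend(CATALOG.get(word, []))
    return suggestions

# Two users with opposite listening habits get identical suggestions
# for identically named playlists.
print(recommend("Chill Focus Mix", ["Death Metal Anthem"]))
print(recommend("Chill Focus Mix", ["Baroque Cello Suite"]))
```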
“Corrupt personalization is the process by which your attention is drawn to interests that are not your own,” Sandvig wrote. The recommendation system “serves a commercial interest that is often at odds with our interests.”
Other examples of corrupt personalization include Amazon suggesting its own in-house brands before other results in its marketplace and Google Search prioritizing the company’s other products, like Google Maps, as the best sources of information.
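A minimal sketch of how such self-preferencing could work inside a ranking function, assuming a hypothetical OWN_BRAND_BOOST added to a relevance score; neither Amazon’s nor Google’s actual ranking logic is public.

```python
# Illustrative only: a relevance score quietly inflated for the
# platform's own products before results are sorted.
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    relevance: float  # how well the item matches the query, 0..1
    in_house: bool    # True if sold under the platform's own brand

OWN_BRAND_BOOST = 0.3  # hypothetical thumb on the scale

def rank(results):
    return sorted(
        results,
        key=lambda i: i.relevance + (OWN_BRAND_BOOST if i.in_house else 0),
        reverse=True,
    )

results = [Item("BrandX Batteries", 0.9, False),
           Item("Platform Basics Batteries", 0.7, True)]
for item in rank(results):
    print(item.name)  # the in-house brand outranks a more relevant rival
```

The user sees a single ordered list and has no way to tell relevance apart from the boost; that opacity is what makes the personalization “corrupt.”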
In the matter of personal taste, knowing what you like is difficult, but it’s equally hard to recognize that you don’t like or don’t want something when it is being so strenuously presented as “For You.”
While political filter bubbles silo users into opposing factions by disagreement, cultural recommendations bring them together toward the goal of building larger and larger audiences for the lowest-common-denominator material.
We often build our senses of personal taste by saving pieces of culture: slowly building a collection of what matters to us, a monument to our preferences, like a bird constructing a nest. But the more automated an algorithmic feed is, the more passive it makes us as consumers, and the less need we feel to build a collection, to preserve what matters to us. We give up the responsibility of collecting.
The collector is the only one who decides how to arrange their possessions, ordering books by author, title, theme, or even color of the cover—and they stay in the same places they’re put. That’s not true of our digital cultural interfaces, which follow the whims and priorities of the technology companies that own them.
The interfaces follow the company’s incentives, pushing its own products first and foremost, or changing familiar patterns to manipulate users into trying a new feature.
But it’s very difficult to feel such ownership for what we collect on the Internet; we can’t be stewards of the culture we appreciate in the same way as Benjamin. We don’t actually own it and can’t guarantee accessing it in the same way each time.
This instability only intensifies the cultural flattening going on, since users can’t store or revisit their past experiences within their original context. All that exists is a relentlessly changing digital present tense.
The shifting sands of digital technology have robbed our collections of their meaning. They appear only as nostalgic ruins, the remains of once-inhabited metropolises gone silent.
When we find something meaningful enough to save, to add to our collection, the action both etches it a little deeper into our hearts and creates a context around the artifact itself, whether text, song, image, or video. The context is not just for ourselves but for other people, the knit-together, shared context of culture at large.
There are “lean-in” moments, when the user is paying attention, choosing what to consume and actively judging the result. And there are “lean-back” moments, when users coast along with content running in the background, not worrying too much about what it is or what plays next.
As consumers become increasingly passive, failing to exercise distinctly cultivated tastes, artists are forced to contend even more with algorithmic pressures, because working through the feed is the only way they can reach the scale of audience and engagement that they need to make a living.
Musical.ly merged into TikTok, which became popular for its quick video clips of music and dance, a holdover from its predecessor.
What also set TikTok apart was its primary “For You” feed, which is almost entirely algorithmic. Users were not encouraged to choose whom to follow; they just trusted the decisions of the equation.
Its success meant that all-algorithmic feeds increasingly became the default, ushering in a new era of digital fame and cultural success as well.
Kabvina’s videos, the algorithmic feed, and the rapacious audience formed a feedback loop. He called it “instant gratification”: “I can post on TikTok and in ten minutes I can check back and see thirty thousand people have watched it.”
For independent creators, the algorithm takes the place of bosses and performance reviews; it’s a real-time authority gauging your success at adapting to its definition of compelling content, which is always shifting.
But never before Filterworld has creative taste, which is the way that artists evaluate their own work as much as how consumers evaluate their appreciation of it, been so influenced by data and the granular measurement of attention.
Just as algorithmic recommendations slot users into categories of consumption, supplanting their personal tastes, they also sort cultural output into categories, which creators run up against. These categories are the house styles of Filterworld, in which greatness is defined by optimization rather than dramatic creative leaps into the unknown.