Kindle Notes & Highlights
On the other side of our algorithmic anxiety is a state of numbness. The dopamine rushes become inadequate, and the noise and speed of the feeds overwhelming. Our natural reaction is to seek out culture that embraces nothingness, that blankets and soothes rather than challenges or surprises, as powerful artwork is meant to do. Our capacity to be moved, or even to be interested and curious, is depleted.
The culture of Filterworld is the culture of presets, established patterns that get repeated again and again. The technology limits us to certain modes of consumption; you can’t stray outside of the lines. “Maniac fun,” as Yuri says, is gone—that is to say, a certain degree of originality, unprecedentedness, creativity, and surprise disappears when so much weighs on culture’s ability to spread through digital feeds.
Culture is meant to be communal and requires a certain degree of consistency across audiences; without communality, it loses some of its essential impact.
Proust’s narrator describes the telephone as “a supernatural instrument before whose miracles we used to stand amazed, and which we now employ without giving it a thought, to summon our tailor or to order an ice cream.”
With new technology, the miraculous quickly becomes mundane, any glitch in its function is felt as bothersome, and finally it becomes ignorable, the miracle forsaken.
We forget that life wasn’t always this way, that we couldn’t directly speak to people across long distances, that ceiling lights didn’t make every room bright, or that we didn’t have our information and media automatically filtered by machines.
To move forward, we must disentangle the effects of algorithmic recommendations as technology from the ways that we have habitually adopted them as the primary gatekeepers of our online communication.
Maybe the Amazon store’s uncanniness stemmed from how it confronted me with my own lack of freedom, demonstrating just how much algorithms push us into not thinking for ourselves.
The punch line is that in algorithmic culture, the right choice is always what the majority of other people have already chosen. But even if everyone else was, maybe you’re just not in the mood for a whiskey sour.
If taste indeed must be deeply felt, requires time to engage with, and benefits from the surprise that comes from the unfamiliar, then it seems that technology could not possibly replicate it, because algorithmic feeds run counter to these fundamental qualities.
If you find something boring, perhaps too subtle, you just keep scrolling, and there’s no time for a greater sense of admiration to develop—one is increasingly encouraged to lean into impatience and superficiality in all things.
Building your own sense of taste, that set of subconscious principles by which you identify what you like, is an uphill battle compared to passively consuming whatever content feeds deliver to you.
Consumption without taste is just undiluted, accelerated capitalism.
There are two forces forming our tastes. As I described previously, the first is our independent pursuit of what we individually enjoy, while the second is our awareness of what it appears that most other people like, the dominant mainstream.
“There’s a pressure to be normal. That pressure is just saying: Be the same, whatever’s familiar to you is safe and somehow makes you feel like part of the group,” Krukowski said. “There’s a horrible vanishing point to that, and that is fascism.”
“You may not use social media, but it’s using you. You’re writing in tweets, like it or not.”
“Algorithms are limiting the future to the past,”
“The twenty-first century is oppressed by a crushing sense of finitude and exhaustion. It doesn’t feel like the future.”
This perception that culture is stuck and plagued by sameness is indeed due to the omnipre...
Like a corporatized form of Buddhism, the implied answer to anxiety is to learn not to desire differentiation in the first place, to simply be satisfied with whatever is placed in front of you. The cultivation of taste is discouraged because taste is inefficient in terms of maximizing engagement.
“The irony of it all is that these spaces are supposed to represent spaces of individuality, but they’re incredibly monotonous,” Gonzalez said. Her comment echoed another paradoxical message of Filterworld and algorithmic recommendations: You are unique, just like everybody else.
This is an eye-opener. We always describe nice new coffee shops as "unique," but in fact they all follow the same rule: be Instagram-worthy.
I got a recommendation from a friend living in Tokyo to visit Rokuyosha, a kissaten-style café that opened in Kyoto in 1950.
A particular dish might be so elaborately visual that it functions more as an image than as food.
“We’ve put a lot of time and energy into creating beautiful content. But as a result of that algorithm, we find we’re not necessarily hitting as many eyeballs as we think we could or should, and sometimes that can be a little disheartening.”
Popularity has less to do with how the coffee tastes than how it looks in an Instagram photo.
“Just going somewhere to visually consume it because it’s beautiful is not really what tourism should be about,”
The most common comment on a dramatic travel Instagram shot is always “Where is this?,” implying another question: How can I get there, too?
you are more likely to like a clip of a TV show that you’ve seen many times before than a clip of one you haven’t seen, which would require sitting through the video and evaluating it.
Today, it can often feel like there is no creativity without attention, and no attention without the accelerant of algorithmic recommendations.
“We’re so plugged in that we’re almost not plugged into ourselves,”
Digital life became increasingly templated, a set of boxes to fill in rather than a canvas to cover in your own image.
I missed the previous sense of intimacy, the Internet as a private place—a hideout from real life, rather than the determining force of real life.
To break down Filterworld, change has to happen on the industrial level, at the scale of the tech companies themselves.
But I began to think that as much as the feeds brought me things I never would have seen or heard otherwise, my overdependence on them was also cutting me off from a different realm of experiences that I had forgotten about over the course of the decade: the encounter with scarcity rather than infinity, the process of judging and choosing for myself what I wanted to see in a given moment, without the option to scroll away.
“I believe that my job is not to tell people what’s good and what’s bad, but rather, it’s to stimulate their own critical sense,”
Algorithmic feeds disrupt curated juxtapositions and make it that much harder to interpret the broad swath of culture, to figure out which themes join things together and which aspects set them apart.
“It’s a question of making something available to someone who otherwise wouldn’t have known about it,” Cavalconte said. “You don’t know what you want until you’ve got it.”
“If you are not paying for it, you’re not the customer; you’re the product being sold.”
We turn to art to seek connection, yet algorithmic feeds give us pure consumption.
Culture is built on personal recommendations, not automated ones, as we share, interpret, and respond to the things that we love.
To resist Filterworld, we must become our own curators once more and take responsibility for what we’re consuming.
“The ultimate, hidden truth of the world is that it is something that we make, and could just as easily make differently.”