Filterworld: How Algorithms Flattened Culture
The Amazon website began using collaborative filtering as early as 1998 to recommend products for customers to buy. Rather than attempting to measure similar profiles of users to approximate taste, as Ringo did, the system worked by determining which items were likely to be purchased in tandem—a rattle with a baby bottle, for example.
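The book doesn't spell out Amazon's actual implementation; as a rough illustration only, here is a minimal Python sketch of item-to-item collaborative filtering that counts co-purchases across toy baskets (the basket data and the `recommend` helper are invented for the example):

```python
from collections import defaultdict
from itertools import combinations

# Toy purchase history: each basket is one customer's set of items bought together.
baskets = [
    {"rattle", "baby bottle", "bib"},
    {"rattle", "baby bottle"},
    {"baby bottle", "pacifier"},
    {"novel", "bookmark"},
]

# Count how often each ordered pair of items shows up in the same basket.
co_counts = defaultdict(int)
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(item, top_n=3):
    """Items most often purchased alongside `item`, most frequent first."""
    scores = {b: n for (a, b), n in co_counts.items() if a == item}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("rattle"))  # -> ['baby bottle', 'bib']
```

The point of the item-to-item approach is visible even in this toy: nothing about the shopper is modeled, only which items travel together.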
Amazon found that the personalized product recommendations were much more effective in terms of click-throughs and sales than unpersonalized marketing tactics like banner advertisements and lists of bestselling products, which can’t be as tightly targeted.
Zuckerberg’s Facebook tied online identity coherently and consistently to the offline world. The platform encouraged users to use their real names rather than arcane aliases and influenced real-life plans in the small world of college:
Facebook’s Like button, with its signature thumbs-up, was introduced in 2009, providing one form of data on how interested a user might be in a particular piece of content. User engagement, measured by likes, comments, and one account’s previous interactions with another, factored into the order of the feed.
The EdgeRank scores were not permanently assigned once, like the outcome of a basketball game in a tournament, but changed instant to instant.
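EdgeRank's exact formula isn't given in the text; a commonly cited simplification scores each post by summing, over its interactions, affinity × interaction-type weight × time decay, which also shows why the ordering shifts from moment to moment. A minimal sketch, with made-up weights and a hypothetical `rank_feed` helper:

```python
import math

# Hypothetical interaction weights; the real values were never public.
EDGE_WEIGHTS = {"like": 1.0, "comment": 4.0, "share": 8.0}

def edge_score(affinity, edge_type, age_hours, half_life_hours=24.0):
    """One interaction's contribution: affinity x type weight x time decay."""
    decay = math.exp(-math.log(2) * age_hours / half_life_hours)
    return affinity * EDGE_WEIGHTS[edge_type] * decay

def rank_feed(posts, now_hours):
    """Sort post ids by the sum of their decayed interaction scores.

    `posts` is a list of dicts like
    {"id": "p1", "edges": [(affinity, "like", created_at_hours), ...]}.
    Because decay depends on `now_hours`, rerunning this later reorders the feed.
    """
    scored = []
    for post in posts:
        total = sum(
            edge_score(aff, etype, now_hours - created)
            for aff, etype, created in post["edges"]
        )
        scored.append((total, post["id"]))
    return [pid for _, pid in sorted(scored, reverse=True)]

feed = rank_feed(
    [
        {"id": "wedding_post", "edges": [(0.9, "comment", 10), (0.9, "like", 10)]},
        {"id": "news_link", "edges": [(0.3, "share", 2)]},
    ],
    now_hours=12,
)
print(feed)  # -> ['wedding_post', 'news_link']
```

Because every score is recomputed against the current time and the latest interactions, there is no final standings table, only a ranking that keeps moving.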
The algorithmic feed itself is not consistent or on a linear path toward some ultimate perfection. It changes with a company’s priorities.
The relationship was almost oppositional; only if you “gamed” the algorithm would you be heard. You could no longer rely on users who had followed or friended you seeing your posts.
At another point, it became clear that writing text that resembled a marriage announcement and comments that said “congratulations” pushed posts to the top of the feed. So I began sharing my articles with fake weddings or other life milestones.
More recently, on TikTok, euphemisms have emerged for terms that trigger the algorithm to block or slow down a video: “unalive” for kill, “SA” for sexual assault, “spicy eggplant” instead of vibrator,
Such vocabulary was nicknamed “algospeak”: speech molded in the image of the algorithm.
Around 2015, Facebook decided to prioritize video content, so the recommendation algorithm promoted videos much more than it did previously.
In 2016, Facebook added “reactions” to posts, so that viewers could respond with a range of emoticons rather than just the Like button.
But that change backfired, too, when incendiary content—posts that received many angry-face reactions, for example, like rage-inducing political stories—was getting too much promotion and souring the tone of the entire site.
Almost every major social network followed the same path over the 2010s. Filterworld began taking shape in the middle of the decade when algorithmification intensified.
Larger cultural consequences, unexpected by users and perhaps by the companies themselves, followed this shift—the way that damming a river changes an entire ecosystem.
It’s impossible to know what someone else is seeing at a given time, and thus harder to feel a sense of community with others online, the sense of collectivity you might feel when watching a movie in a theater or sitting down for a prescheduled cable TV show.
Culture is meant to be communal and requires a certain degree of consistency across audiences; without communality, it loses some of its essential impact.
With new technology, the miraculous quickly becomes mundane, any glitch in its function is felt as bothersome, and finally it becomes ignorable, the miracle forsaken.
Such is the presence algorithmic feeds now have in our lives; the algorithm is often unconsidered, part of the furniture, noticed only when it doesn’t function in the way it’s supposed to, like traffic lights or running water.
Social networks and streaming services have become the primary way a significant percentage of the global population metabolizes information, whether it’s music, entertainment, or art.
Today, it is difficult to think of creating a piece of culture that is separate from algorithmic feeds, because those feeds control how it will be exposed to billions of consumers in the international digital audience.
And it is even more difficult to think of consuming something outside of algorithmic feeds, because their recommendations inevitably influence what is shown on television, played on the radio, and published in books, even if those experiences are not contained within feeds.
Under algorithmic feeds, the popular becomes more popular, and the obscure becomes even less visible. Success or failure is accelerated.
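The "popular becomes more popular" claim is a feedback loop, and a purely illustrative simulation (all numbers invented) makes the acceleration concrete: when exposure is proportional to existing engagement, small early leads compound.

```python
import random

random.seed(0)

# Toy feedback loop: each round, a post's chance of being shown is proportional
# to the engagement it already has, so small early leads compound.
engagement = [1] * 10            # ten posts, all starting equal
for _ in range(10_000):
    shown = random.choices(range(10), weights=engagement)[0]
    engagement[shown] += 1       # exposure converts directly into more engagement

print(sorted(engagement, reverse=True))
# Typically a few posts absorb most of the attention while the rest stay near 1.
```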
The absence of attention inevitably raises the question of what the feed will promote, tacitly encouraging safer choices, urging conformity.
It’s often not the original creators of a meme or trend who get credit, attention, and thus financial gain from its popularity in an algorithmic feed.
We’re encouraged to overlook algorithmic processes, but their glitches remind us of their unearned authority.
Algorithmic anxiety describes the burgeoning awareness that we must constantly contend with automated technological processes beyond our understanding and control, whether in our Facebook feeds, Google Maps driving directions, or Amazon product promotions.
(The inconsistency of algorithmic promotion forces us to engage with it and stress about it even more, like repeatedly pulling a slot machine lever to hit the jackpot.)
The hosts’ reaction to such algorithmic anxiety, the researchers found, was to develop “folk theories”—superstitious tricks that were meant to goose more algorithmic promotion and better search results—the same way I used to post my article links with fake wedding announcements.
There is little incentive for companies to assuage this anxiety because a user’s confusion can be beneficial to business. When a company’s product is ineffective or a user encounters difficulty, it can be blamed on the opaque entity of “the algorithm,” which is perceived as external to both the users and the company itself, since they are likened to opaque “black boxes.”
Users often fear that their account specifically has been blocked without warning or recourse by some decision-maker; but the algorithmic priorities may simply have silently changed, and traffic is no longer flowing in their direction.
The possibilities that we perceive for ourselves—our modes of expression and creation—now exist within the structures of digital platforms.
It builds to a sense that, since we users cannot control the technology, we may as well succumb to the limits of algorithmic culture and view it as inevitable. Many users have already entered such a state of despair, both dissatisfied and unable to imagine an alternative.
It was hard to discern a sense of taste in Amazon Books; there was no spirit there with which I could identify. Instead, the overall narrative was driven wholly by the market and whatever provoked attention.
One reason the Amazon bookstore felt so strange was that it represented a blatant intrusion of the Internet’s algorithmic logic into what we call “real life.”
These tastemakers all provide an interface between the creators of culture and its consumers. They constantly gather and judge new material to determine how and why it may resonate with audiences—a process that now falls under the broad banner of the word curation.
With Netflix’s home page, Facebook’s feed, and Spotify’s automated radio, there is no direct influence from editor, DJ, or booker, but, rather, a mathematical processing of crowdsourced data stretching to encompass every user on the site.
The “taste” of tastemakers means personal preference, the discernment that we all apply to figure out what we like, whether in music, fashion, food, or literature. We make constant decisions to listen to, read, or wear one thing instead of another.
But in its origins, taste is a much deeper philosophical concept. It borders on morality, representing an innate sense of what is good in the world.
Voltaire wrote, “In order to have taste, it is not enough to see and to know what is beautiful in a given work. One must feel beauty and be moved by it. It is not even enough to feel, to be moved in a vague way: it is essential to discern the different shades of feeling.”
Taste is a fundamental part of the self; developing or indulging it means constructing a firmer sense of self. It becomes the basis for identity.
Art itself was not meant to be generic or cater to a broad audience, Okakura argued in a discussion of tearoom design: “That the tearoom should be built to suit some individual taste is an enforcement of the principle of vitality in art.”
Yet catering to “the taste of the majority” might be the single goal of algorithmic feeds—a majority based on data.
Montesquieu crucially argued that surprise, which can be alienating or challenging, like a particularly ugly wabi-sabi Japanese tea vessel, is a fundamental element of taste.
Understanding this feeling of surprise can take time. Taste is not necessarily instantaneous and changes as you consider and digest the experience of an artwork: “We become aware of the presence of great beauty when something inspires us with a surprise which at first is only mild, but which continues, increases, and finally turns into admiration.”
If taste indeed must be deeply felt, requires time to engage with, and benefits from the surprise that comes from the unfamiliar, then it seems that technology could not possibly replicate it, because algorithmic feeds run counter to these fundamental qualities.
The feed structure also discourages users from spending too much time with any one piece of content. If you find something boring, perhaps too subtle, you just keep scrolling, and there’s no time for a greater sense of admiration to develop—one is increasingly encouraged to lean into impatience and superficiality in all things.
Building your own sense of taste, that set of subconscious principles by which you identify what you like, is an uphill battle compared to passively consuming whatever content feeds deliver to you.
We are free to choose anything. Yet the choice we often make is to not have a choice, to have our purview shaped by automated feeds, which may be based on the aggregate actions of humans but are not human in themselves.
As taste requires surprise, it also thrives on challenge and risk, treading too far in a particular direction. Safety may avoid embarrassment, but it’s also boring.