The Amazon website began using collaborative filtering as early as 1998 to recommend products for customers to buy. Rather than attempting to measure similar profiles of users to approximate taste, as Ringo did, the system worked by determining which items were likely to be purchased in tandem—a rattle with a baby bottle, for example.
Amazon found that the personalized product recommendations were much more effective in terms of click-throughs and sales than unpersonalized marketing tactics like banner advertisements and lists of bestselling products, which can’t be as tightly targeted.
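Amazon's actual recommender is proprietary; a minimal sketch of the idea described above (counting which items are purchased in tandem, with invented order data) might look like:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical order histories; each inner list is one customer's basket.
orders = [
    ["rattle", "baby bottle", "bib"],
    ["baby bottle", "rattle"],
    ["novel", "bookmark"],
    ["rattle", "baby bottle", "novel"],
]

# Count how often each pair of items appears in the same basket.
co_counts = defaultdict(int)
for basket in orders:
    for a, b in combinations(sorted(set(basket)), 2):
        co_counts[(a, b)] += 1

def recommend(item, k=3):
    """Return up to k items most often bought alongside `item`."""
    scores = {}
    for (a, b), n in co_counts.items():
        if a == item:
            scores[b] = scores.get(b, 0) + n
        elif b == item:
            scores[a] = scores.get(a, 0) + n
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("rattle"))  # "baby bottle" ranks first: bought together in 3 baskets
```

Unlike Ringo's approach of matching similar users, this item-to-item tally only needs the co-purchase counts, which is part of why it scales to a catalog the size of Amazon's.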
Zuckerberg’s Facebook tied online identity coherently and consistently to the offline world. The platform encouraged users to use their real names rather than arcane aliases and influenced real-life plans in the small world of college.
Facebook’s Like button, with its signature thumbs-up, was introduced in 2009, providing one form of data on how interested a user might be in a particular piece of content. User engagement, measured by likes, comments, and one account’s previous interactions with another, factored into the order of the feed.
The EdgeRank scores were not assigned once and for all, like the final score of a tournament basketball game, but changed from instant to instant.
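EdgeRank's exact formula was never published, but the commonly described version sums, over each interaction ("edge") on a post, affinity times edge weight times a time decay. A sketch under those assumptions (the specific weights and half-life here are invented) shows why a score changes from instant to instant:

```python
import time

# EdgeRank-style score: each interaction ("edge") on a post contributes
# user affinity x edge-type weight x time decay. Weights are assumed.
EDGE_WEIGHTS = {"like": 1.0, "comment": 4.0, "share": 6.0}

def edge_score(affinity, edge_type, age_seconds, half_life=3600.0):
    """One edge's contribution; exponential decay halves it every hour."""
    decay = 0.5 ** (age_seconds / half_life)
    return affinity * EDGE_WEIGHTS[edge_type] * decay

def post_score(edges, now):
    """Sum edge contributions; `edges` holds (affinity, type, timestamp)."""
    return sum(edge_score(a, t, now - ts) for a, t, ts in edges)

now = time.time()
fresh = [(0.9, "comment", now - 60)]        # close friend commented a minute ago
stale = [(0.9, "comment", now - 6 * 3600)]  # the same edge, six hours old
assert post_score(fresh, now) > post_score(stale, now)  # scores decay over time
```

Because the decay term depends on the current moment, recomputing the feed even seconds later reorders it, which is exactly the instant-to-instant behavior described above.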
The algorithmic feed itself is not consistent or on a linear path toward some ultimate perfection. It changes with a company’s priorities.
The relationship was almost oppositional; only if you “gamed” the algorithm would you be heard. You could no longer rely on users who had followed or friended you seeing your posts.
At another point, it became clear that writing text that resembled a marriage announcement and comments that said “congratulations” pushed posts to the top of the feed. So I began sharing my articles with fake weddings or other life milestones.
More recently, on TikTok, euphemisms have emerged for terms that trigger the algorithm to block or slow down a video: “unalive” for kill, “SA” for sexual assault, “spicy eggplant” instead of vibrator.
Such vocabulary was nicknamed “algospeak”: speech molded in the image of the algorithm.
Around 2015, Facebook decided to prioritize video content, so the recommendation algorithm promoted videos much more than it did previously.
In 2016, Facebook added “reactions” to posts, so that viewers could respond with a range of emoticons rather than just the Like button.
But that change backfired, too: incendiary content, such as rage-inducing political stories that drew many angry-face reactions, received outsized promotion and soured the tone of the entire site.
Almost every major social network followed the same path over the 2010s. Filterworld began taking shape in the middle of the decade when algorithmification intensified.
Larger cultural consequences, unexpected by users and perhaps by the companies themselves, followed this shift—the way that damming a river changes an entire ecosystem.
It’s impossible to know what someone else is seeing at a given time, and thus harder to feel a sense of community with others online, the sense of collectivity you might feel when watching a movie in a theater or sitting down for a prescheduled cable TV show.
Culture is meant to be communal and requires a certain degree of consistency across audiences; without communality, it loses some of its essential impact.
With new technology, the miraculous quickly becomes mundane, any glitch in its function is felt as bothersome, and finally it becomes ignorable, the miracle forsaken.
Such is the presence algorithmic feeds now have in our lives; the algorithm is often unconsidered, part of the furniture, noticed only when it doesn’t function in the way it’s supposed to, like traffic lights or running water.
Social networks and streaming services have become the primary way a significant percentage of the global population metabolizes information, whether it’s music, entertainment, or art.
Today, it is difficult to think of creating a piece of culture that is separate from algorithmic feeds, because those feeds control how it will be exposed to billions of consumers in the international digital audience.
And it is even more difficult to think of consuming something outside of algorithmic feeds, because their recommendations inevitably influence what is shown on television, played on the radio, and published in books, even if those experiences are not contained within feeds.
Under algorithmic feeds, the popular becomes more popular, and the obscure becomes even less visible. Success or failure is accelerated.
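A toy simulation (not any platform's actual ranking) makes this feedback loop concrete: if the feed surfaces a post with probability proportional to the attention it already has, early leads compound.

```python
import random

random.seed(7)

# Toy model: ten posts start with one view each. Each round, the feed
# surfaces one post with probability proportional to its current views,
# so early advantages compound (a rich-get-richer process).
views = [1] * 10
for _ in range(10_000):
    winner = random.choices(range(10), weights=views)[0]
    views[winner] += 1

views.sort(reverse=True)
print(views)
# A few posts typically capture most of the views while the rest stay
# near their starting counts: success and failure are both accelerated.
```

Which post ends up on top depends on chance in the early rounds, not on any intrinsic quality, which is part of why the resulting hierarchy can feel so arbitrary.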
The prospect of going unnoticed inevitably raises the question of what the feed will promote, tacitly encouraging safer choices and urging conformity.
It’s often not the original creators of a meme or trend who get credit, attention, and thus financial gain from its popularity in an algorithmic feed.
We’re encouraged to overlook algorithmic processes, but their glitches remind us of their unearned authority.
Algorithmic anxiety describes the burgeoning awareness that we must constantly contend with automated technological processes beyond our understanding and control, whether in our Facebook feeds, Google Maps driving directions, or Amazon product promotions.
(The inconsistency of algorithmic promotion forces us to engage with it and stress about it even more, like repeatedly pulling a slot machine lever in hopes of hitting the jackpot.)
The hosts’ reaction to such algorithmic anxiety, the researchers found, was to develop “folk theories”—superstitious tricks that were meant to goose more algorithmic promotion and better search results—the same way I used to post my article links with fake wedding announcements.
There is little incentive for companies to assuage this anxiety, because a user’s confusion can be good for business. When a product fails or a user runs into difficulty, the fault can be pinned on “the algorithm,” an entity perceived as external to both the users and the company itself, since algorithms are so routinely likened to opaque “black boxes.”
Users often fear that their account specifically has been blocked without warning or recourse by some decision-maker; but the algorithmic priorities may simply have silently changed, and traffic is no longer flowing in their direction.
The possibilities that we perceive for ourselves—our modes of expression and creation—now exist within the structures of digital platforms.
It builds to a sense that, since we users cannot control the technology, we may as well succumb to the limits of algorithmic culture and view it as inevitable. Many users have already entered such a state of despair, both dissatisfied and unable to imagine an alternative.
It was hard to discern a sense of taste in Amazon Books; there was no spirit there with which I could identify. Instead, the overall narrative was driven wholly by the market and whatever provoked attention.
One reason the Amazon bookstore felt so strange was that it represented a blatant intrusion of the Internet’s algorithmic logic into what we call “real life.”
These tastemakers all provide an interface between the creators of culture and its consumers. They constantly gather and judge new material to determine how and why it may resonate with audiences—a process that now falls under the broad banner of the word curation.
With Netflix’s home page, Facebook’s feed, and Spotify’s automated radio, there is no direct influence from editor, DJ, or booker, but, rather, a mathematical processing of crowdsourced data stretching to encompass every user on the site.
The “taste” of tastemakers means personal preference, the discernment that we all apply to figure out what we like, whether in music, fashion, food, or literature. We make constant decisions to listen to, read, or wear one thing instead of another.
But in its origins, taste is a much deeper philosophical concept. It borders on morality, representing an innate sense of what is good in the world.
Voltaire wrote, “In order to have taste, it is not enough to see and to know what is beautiful in a given work. One must feel beauty and be moved by it. It is not even enough to feel, to be moved in a vague way: it is essential to discern the different shades of feeling.”
Taste is a fundamental part of the self; developing or indulging it means constructing a firmer sense of self. It becomes the basis for identity.
Art itself was not meant to be generic or cater to a broad audience, Okakura argued in a discussion of tearoom design: “That the tearoom should be built to suit some individual taste is an enforcement of the principle of vitality in art.”
Yet catering to “the taste of the majority” might be the single goal of algorithmic feeds—a majority based on data.
Montesquieu crucially argued that surprise, which can be alienating or challenging, like a particularly ugly wabi-sabi Japanese tea vessel, is a fundamental element of taste.
Understanding this feeling of surprise can take time. Taste is not necessarily instantaneous and changes as you consider and digest the experience of an artwork: “We become aware of the presence of great beauty when something inspires us with a surprise which at first is only mild, but which continues, increases, and finally turns into admiration.”
If taste indeed must be deeply felt, requires time to engage with, and benefits from the surprise that comes from the unfamiliar, then it seems that technology could not possibly replicate it, because algorithmic feeds run counter to these fundamental qualities.
The feed structure also discourages users from spending too much time with any one piece of content. If you find something boring, perhaps too subtle, you just keep scrolling, and there’s no time for a greater sense of admiration to develop—one is increasingly encouraged to lean into impatience and superficiality in all things.
Building your own sense of taste, that set of subconscious principles by which you identify what you like, is an uphill battle compared to passively consuming whatever content feeds deliver to you.
We are free to choose anything. Yet the choice we often make is to not have a choice, to have our purview shaped by automated feeds, which may be based on the aggregate actions of humans but are not human in themselves.
As taste requires surprise, it also thrives on challenge and risk, on treading too far in a particular direction. Safety may avoid embarrassment, but it’s also boring.