Filterworld: How Algorithms Flattened Culture
Kindle Notes & Highlights
Read between October 21 and December 6, 2024
57%
The app gradually lost its identity as a relatively austere space for expressing your own taste. “At no point were these decisions made to make the platform a better place for the people creating the content,” Janelle said. “They were purely decisions made by the growth team and by the business team to figure out how they could expand.”
57%
The constant need to figure out the next big social media platform is reminiscent of early silent-film stars trying to make the switch to talkies in the early twentieth century, or theater actors moving to television: not everyone made it, and not everyone’s artistic approach functioned in the new medium.
58%
For so many career paths in Filterworld, following the demands of various feeds has become an almost unavoidable commandment. The pressure is so great that the promotional content has a way of superseding the actual craft.
58%
Kate Eichhorn in her 2022 monograph Content,
Ruth
[TBR]
59%
the excess content demanded by algorithmic feeds more often gets in the way of art, because it sucks up an increasingly high percentage of a creator’s time.
Ruth
Hate this for us.
65%
Users changing their behavior can only go so far; we can’t trust that the mechanisms will ever prioritize our well-being over sparking more engagement that drives advertising revenue. Users can exert only certain kinds of agency within digital platforms. They can pursue a specific theme of content, for example, but can’t alter the equation of the recommendation algorithm. We don’t have enough alternative options to navigate the Internet outside of algorithmic feeds, in part because the Internet is now so dominated by just a few companies.
66%
America Online (AOL) incorporated Usenet in 1993, and the sudden influx of noobs became known as “Eternal September.” The implication of the phrase was that the wrong kind of user was suddenly dominant in these once-niche groups.
68%
Innovation starts at a small level, with a group of users or a new app enabling new forms of behavior, like publishing a new kind of content or creating new communities. For a time, digital culture feels organic and exciting, different from what came before it. But then those new behaviors and features are adopted by larger companies, by copying, by business pressure, or by mergers and acquisitions. Users jump on board with newfound excitement about their old apps, but the newness fades as the innovations are monetized to death. Any joy in the new forms of expression is ruthlessly exploited, […]
68%
Authentic online cultures are always being ruined, as users inevitably complain, but also always emerging again.
Ruth
This is a key point in life: nothing stays the same forever and you never know what is around the next bend.
70%
But algorithms can misinterpret language. Gade showed me a case in which a model was assigning the word gay a very negative connotation, meaning that content that included it wasn’t getting prioritized. That could be a complete mistake if the word is meant positively—or perhaps it should be interpreted as neutral. If automated content moderation or recommender systems misinterpret a word or behavior, “You could potentially silence an entire demographic,” Gade said. That’s why being able to see what’s happening around a given algorithmic decision is so important.
71%
Just as digital platforms aren’t responsible for explaining their algorithmic feeds, they also don’t take responsibility for what the feeds promote—they separate themselves from the outcomes of their recommender systems. They can do that because of the United States’ Telecommunications Act of 1996, which included the Communications Decency Act, with a provision known as Section 230.
71%
it has also allowed the tech companies that have supplanted traditional media businesses to operate without the safeguards of traditional media.
71%
The conflicting cases brought up a fundamental paradox: Internet services that did nothing to filter the content going to users were legally protected, while services that did try to filter the content, even just for basic quality or safety, were not protected. It was riskier for Internet companies to try to influence content at all.
73%
Facebook outsources much of its human moderation to a company called Accenture, which employs thousands of moderators around the world, including in countries like Portugal and Malaysia. These laborers face daily exposure to deaths on camera, recorded abuse, and child pornography. They keep the feeds clean for the rest of us, exposing themselves to psychic harm the way trash pickers digging through international electronic waste dumped in Ghana and elsewhere are exposed to poisonous chemicals. The toxic material doesn’t just magically vanish because of the mediation of the algorithm. Once […]
Ruth
I appreciate the author making the connection between our toxic content and our toxic waste, and how the consequences of both are pushed onto more marginalized communities to deal with.
82%
Those forums were “communities of consumption,” a term that academics have used to describe the diverse groups of people that congregate online around a particular shared pursuit, whether swapping product tips or discussing avant-garde literature. One paper described communities of consumption as a form of “mutual learning”—we collectively figure out what it is that we’re looking for and how to find it.
82%
connoisseurship referred to amateur collectors who could tell which artist painted a work based solely on looking at it.
83%
That encouraged shallowness of consumption contributes to the overall flatness of culture in Filterworld.
84%
What we gain with algorithmic feeds in terms of availability—having instant access to a broad range of material to be scanned at will—we lose in connoisseurship, which requires depth and intention. It’s ultimately a form of deep appreciation, for what the artist has done as well as the capacities of our own tastes.
85%
We should talk even more about the things we like, experience them together, and build up our own careful collections of likes and dislikes. Not for the sake of fine-tuning an algorithm, but for our collective satisfaction.
94%
We turn to art to seek connection, yet algorithmic feeds give us pure consumption.
95%
I found that the way to fight the generic is to seek the specific, whatever you are drawn toward. You don’t need to be a credentialed or professionalized expert to be a connoisseur. You don’t need to monetize your opinion as an influencer for it to be legitimate.
Ruth
Wish I could double-highlight the last sentence there: You don't need to monetize your opinion as an influencer for it to be legitimate.
99%
The anthropologist David Graeber once wrote: “The ultimate, hidden truth of the world is that it is something that we make, and could just as easily make differently.” The same is true of the Internet.