Kindle Notes & Highlights
by Kyle Chayka
Read between February 10 - February 11, 2024
There is an element of elitism at play in any evaluation that casts social media as the opposite of art. Not everyone has access to the traditional, more acceptable routes of art making: Ivy League universities, literary magazines, Chelsea galleries.
Later, she adapted her store to the literary-influencer side of TikTok, called Booktok, too, making space for shooting videos instead of still images.
What Depp has less of a grasp on is which books become popular on Bookstagram or Booktok.
“Whatever it is that you’re consuming just becomes an expression of your self; it exists only insofar as it can describe you,” she said. On the platform, books are popularized less as texts to read than as purchasable lifestyle accessories, visual symbols of an identity. Such is the narcissism encouraged by Filterworld.
“They don’t realize the curation isn’t being done by some third party on the Internet, because the way they see all information is something that’s been curated by an invisible hand on the Internet,” she said. The line between catering to algorithmic feeds enough and relying on them too much is a hard one to walk. The temptation to court the extreme attention, and thus profit, is always there, but the cumulative effect of so much reliance on automated feeds is a kind of desensitization. We end up not being able to imagine culture operating any other way than algorithmically.
The philosopher Byung-Chul Han has described how people living in post-Internet society may “no longer have an unconscious.”
“Sometimes people don’t know what they want until you show it to them. There is some part of me that resists just giving people exactly what they ask for,” Hallie said.
Trends and platforms always change, but she could be sure she knew where her work was going: “If I adapt to every trend, if I hop on every new platform and try to build a following there, I’m going to be building sandcastle after sandcastle. If the algorithm is failing us now, that means it was never stable. It was like a fair-weather friend.”
Walker formally cited social media as potentially lethal, a medical danger.
Russell’s death was part of the human toll of algorithmic overreach, when content moves too quickly at too vast a scale to be moderated by hand. No magazine’s editor would have published a flood of such depression content, nor would a television channel broadcast it. But the algorithmic feed could assemble an instant, on-demand collection, delivering what Russell may have found most engaging even though it was harmful for her.
Users changing their behavior can only go so far; we can’t trust that the mechanisms will ever prioritize our well-being over sparking more engagement that drives advertising revenue.
Still, the blog made me understand what it meant to have a digital shadow self, a version of your life and personality that only existed online. At that time, it felt like a radical innovation, a refreshing novelty. I could control how I presented myself online.
Facebook would acquire the company, or it would make sure its growth slowed down by cutting off the start-up’s access to Facebook’s much larger platform, with its business software and social data. The strategy was “buy or bury.”
Already, when Facebook bought Instagram, it felt as though the walls of the Internet were closing in a little tighter around us users. The broad expanse of possibility, of messiness, on a network like Geocities or the personal expression of Tumblr was shut down. Digital life became increasingly templated, a set of boxes to fill in rather than a canvas to cover in your own image. (You don’t redesign how your Facebook profile looks; you just change your avatar.) I felt a certain sense of loss, but at first the trade-off of creativity for broadcast reach seemed worthwhile: You could talk to so…
Messaging “is one of the most dangerous beachheads to morph into Facebook,”
In early 2014, Facebook agreed to acquire WhatsApp for $19 billion—though WhatsApp had most recently been valued at $1.5 billion—an incredibly high price to pay for a start-up company.
Instagram and WhatsApp are just two of Facebook’s dozens of acquisitions. Google similarly acquired YouTube in 2006 and turned the video-uploading site into a media-consumption juggernaut, a replacement for cable television. Other social networks didn’t survive. Tumblr, for example, once on par with Twitter and Facebook, was bought by Yahoo in 2013 for $1.1 billion.
We’re asked to use tools to build our own spaces, to freely express ourselves, and then commanded to fit within a preset palette determined by a social network. Yet as soon as one standard becomes dominant, it seems to lose its grip. There is no teleological arc for digital platforms; they don’t move in one direction toward perfection, the way hard drives have been able to store more and more data over time. Instead, it is cyclical, swinging between different strategies of centralization and decentralization like a pendulum.
For a time, digital culture feels organic and exciting, different from what came before it. But then those new behaviors and features are adopted by larger companies, whether through copying, business pressure, or mergers and acquisitions. Users jump on board with newfound excitement about their old apps, but the newness fades as the innovations are monetized to death. Any joy in the new forms of expression is ruthlessly exploited, most often in the form of increased advertising.
Authentic online cultures are always being ruined, as users inevitably complain, but also always emerging again. No platform is ever completely safe. The biggest incumbent could be threatened by a tiny newcomer, simply because of a slight technological evolution—like Snapchat’s ephemeral posts or TikTok’s wholly algorithmic feed—or the unavoidable fact that people simply get bored, and technology, like fashion, must constantly change to maintain its hold over its users’ attention.
Still, the Internet in its current era has never looked more monolithic. Individual websites have been subsumed into ever-flowing feeds. All content has to fit into the same few molds. Content creators may have their choice of platform, but the platforms themselves increasingly resemble one another and func...
Should social networks be treated like newspapers and television channels, responsible for everything hosted within their domains? They have long escaped that responsibility. Or should they be classified more like telephone lines, theoretically neutral transmitters of information? But they are decidedly not neutral, given their algorithmic judgments. Or perhaps social media belongs in the category of vice industries, with tightly regulated limits meant for the safety of individuals who might otherwise abuse it. After all, so many users are addicted.
No matter how we classify the digital platforms that make up Filterworld, it’s clear that they need some form of regulation. As users, we only feel the consequences of their structures and adapt our behavior to them.
The quickest way to change how digital platforms work may be to mandate transparency: forcing the companies to explain how and when their algorithmic recommendations are working. Transparency would at least give users more information about the decisions constantly being made about what to show us. And if we know how algorithms work, perhaps we’ll be better able to resist their influence and make our own decisions.
As recently as 2021, Nick Clegg, the president of global affairs at Facebook’s parent company Meta (he was previously the deputy prime minister of the United Kingdom), was still projecting a path toward more transparency. “You should be able to talk back to the algorithm and consciously adjust or ignore the predictions it makes,” he wrote in a post on Medium titled “You and the Algorithm: It Takes Two to Tango.” According to Clegg, it should be possible to “alter your personal algorithm in the cold light of day, through breathing spaces built into the design of the platform.”
There is no pure form of culture that happens outside of technological influence, nor is there a singular best way to consume culture. We cannot just rid ourselves of algorithmic influence, even if we wanted to, since the technology has already inexorably shaped our era. But the first step of escaping the algorithms’ grip is recognizing it. By moving away from the mindset of passive consumption and thinking about a post-algorithmic digital ecosystem, we begin to construct that alternative, demonstrating that the influence of algorithms is neither inevitable nor permanent. Eventually…