Kindle Notes & Highlights
Read between March 14 - March 16, 2025
You may have heard of the male gaze, a concept developed by media scholars to describe how, in a patriarchal society, art, media, and other forms of representation are created with a male viewer in mind. The male gaze decides which subjects are desirable and worthy of attention, and it determines how they are to be judged.
You may also be familiar with the white gaze, which similarly privileges the representation and stories of white Europeans and their descendants.
Inspired by these terms, the coded gaze describes the ways in which the priorities, preferences, and prejudices of those who have the power to shape technology can propagate harm, such as discrimination and erasure. We can en...
Generative AI products are only one manifestation of AI. Predictive AI systems are already used to determine who gets a mortgage, who gets hired, who gets admitted to college, and who gets medical treatment—but products like ChatGPT have brought AI to new levels of public engagement and awareness.
Can we make room for the best of what AI has to offer while also resisting its perils?
We swap fallible human gatekeepers for machines that are also flawed but assumed to be objective. And when machines fail, the people who often have the least resources and most limited access to power structures are those who have to experience the worst outcomes.
Do we need this AI system or this data in the first place, or does it allow us to direct money at inadequate technical Band-Aids without addressing much larger systemic societal issues?
AI will not solve poverty, because the conditions that lead to societies that pursue profit over people are not technical.
As Dr. Rumman Chowdhury reminds us in her work on AI accountability, the moral outsourcing of hard decisions to machines does not solve the underlying social dilemmas.
My parents taught me that the unknown was an invitation to learn, not a menacing dimension to avoid. Ignorance was a starting place to enter deeper realms of understanding.
An algorithm, at its most basic definition, is a sequence of instructions used to achieve a specific goal.
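As a minimal illustration of that definition (this example and the function name `find_max` are mine, not the book's), an algorithm is just an ordered sequence of steps toward a goal, here finding the largest number in a list:

```python
def find_max(numbers):
    """Return the largest value in a non-empty list of numbers."""
    largest = numbers[0]        # step 1: assume the first value is the largest
    for n in numbers[1:]:       # step 2: inspect each remaining value in order
        if n > largest:         # step 3: keep whichever value is bigger
            largest = n
    return largest              # step 4: report the result

print(find_max([3, 7, 2, 9, 4]))  # -> 9
```

The point of the sketch is that each instruction is explicit and repeatable; the goal (the "specific goal" in the definition) is what gives the sequence its meaning.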
“What is the value of the work I am doing?”
Too often I have put my needs last to please other people. Too often I have said no to my own happiness as if it were some noble sacrifice to be a martyr for a cause.
It is not a badge of honor to be burnt out. It is not a sign of fortitude to overcommit.
When individuals framed by society as inherently unworthy based on their gender, race, and background succeed anyway and outshine the competition, the establishment seeks to restore the old order. The old order is crumbling: after enduring a pandemic, continuously experiencing injustice no matter the level of success, and embracing their inherent dignity, no accolades required, queens like you are embracing refusal.
The work before the work after doing the work was heavy.
Being erased by machines had become a familiar story to me from a personal standpoint, but this media erasure had a much longer shadow.
Coded Bias centered so many women and especially women of color as experts on the social implications of AI and algorithmic bias.
Symbolic annihilation describes the absence or lack of representation of a particular group. In journalism, whose work is highlighted sends a message about who is viewed as an authority and the face of expertise.
If news coverage depicts Black people only as victims, it perpetuates a harmful trope that we lack agency to make meaningful change.
Complete erasure is one way to “invisibilize” a group, yet inclusion that builds on stereotypical representation can also be harmful. It is not enough just to be seen if you and people like you are rendered in disempowering terms or through disempowering frames.
If AJL launched a #selfiesforinclusion campaign that improved facial recognition, would we inadvertently be subjecting more vulnerable populations to unfair scrutiny?
If we do not improve the systems and they continue to be used, what are the implications of having innocent people identified as criminal suspects?