Unmasking AI: My Mission to Protect What Is Human in a World of Machines
Kindle Notes & Highlights
28%
Faithful to the data the model was trained on, it filtered out résumés indicating a candidate was a woman. This was the by-product of prior human decisions that favored men. Amazon did not adopt the initial system after its engineers were unable to remove the gender bias. The choice to stop is a viable and necessary option.
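The mechanism described above can be sketched in a few lines. This is a hypothetical toy model, not Amazon's actual system: a logistic regression trained on synthetic "historical" hiring decisions that favored men ends up assigning a negative weight to a feature that merely signals the candidate is a woman.

```python
import math
import random

random.seed(0)

# Synthetic "historical" data: feature 0 = skill score, feature 1 = 1.0 if
# the résumé signals the candidate is a woman (e.g. a women's club listed).
data = []
for _ in range(2000):
    skill = random.random()
    woman = random.random() < 0.5
    # Biased past decisions: women were hired far less often at equal skill.
    hired = random.random() < (0.2 * skill if woman else 0.7 * skill)
    data.append(((skill, 1.0 if woman else 0.0), 1.0 if hired else 0.0))

# Minimal logistic regression trained by batch gradient descent.
w = [0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(200):
    gw = [0.0, 0.0]
    gb = 0.0
    for x, y in data:
        z = w[0] * x[0] + w[1] * x[1] + b
        p = 1.0 / (1.0 + math.exp(-z))
        err = p - y
        gw[0] += err * x[0]
        gw[1] += err * x[1]
        gb += err
    n = len(data)
    w[0] -= lr * gw[0] / n
    w[1] -= lr * gw[1] / n
    b -= lr * gb / n

# Faithful to its biased training data, the model learns a negative weight
# on the "woman" feature: it has learned to screen women out.
print(w[1] < 0)
```

The point of the sketch is that no one programmed the penalty explicitly; the model inherits it from the skewed labels, which is exactly why removing such a bias after the fact is so hard.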
28%
The work of Nina Jablonski on the distribution of skin pigmentation around the world shows the majority of the world’s populations have skin that would be classified on the darker end of most skin classification scales.
28%
Colorism is a stepchild of white supremacy that is seldom discussed.
29%
For example, ableism, which privileges able-bodied individuals, is another kind of power shadow often lurking in datasets, particularly those used for computer vision. For pedestrian tracking datasets, few have data that specifically include individuals who use assistive devices. Just as the past dwells in our data, so too do power shadows that show existing social hierarchies on the basis of race, gender, ability, and more. Relying on convenient data collection methods by collecting what is most popular and most readily available will reflect existing power structures.
29%
The existing gold standards did not represent the full sepia spectrum of humanity. Skewed gold standard benchmark datasets led to a false sense of universal progress based on assessing the performance of facial recognition technologies on only a small segment of humanity.
30%
turkers,
30%
After deciding to focus on gender classification for the sake of the technical simplicity of binary classification, I still had to deal with the notion of race.
30%
I needed to factor in more than gender categories in my experiments, and so I began an unexpected exploration into an area of study known as ethnic enumeration. I learned that across the world the rationale for categorizing people by race and ethnicity differed, and even the use of the terms race, ethnicity, or color to define the categorization varied.
30%
In places like Canada, the term “visible minority” is used, a term that acknowledges that outward appearance is what is used to make assumptions about race that hold real social consequences. In the Canadian case, and in places like the United Kingdom and the United States, racial categorization is used by the government to better understand where to allocate resources and support minority groups or those who might face discrimination.
30%
I was also surprised to learn about the wide variety of ethnic groups in Europe, as I had been socialized to use the broad label of white for people with European roots.
31%
West African smile.
31%
Alan, whose partner was a Frenchman, and I also talked about our amusement at seeing French and English animosity as we navigated Oxford University during our Rhodes Scholarships. To us they all looked “white,” just as to me Alan looked “Asian,” and depending on the part of the world I was in I looked “Black,” or for people with more experience with “Black” faces, I looked “West African.” No one ever guessed I was a mixture of Ashanti and Dagua tribes.
31%
My parents grew up without race being a primary social issue. When we moved to the United States, it took some time before they would link negative experiences to race. If I received bad treatment, they wouldn’t immediately assume it was because of race, but instead they would want to know the details of the situation. I grew up in Oxford, Mississippi, so my racial consciousness was shaped by the context of Black people being minorities and being stereotyped. When my elementary school classmate Billy invited the white kids in our class to his birthday party, but excluded me, I was pretty sure ...more
31%
Some studies were so crude it was almost comical—their labels included “white” and “non-white.” Others tried to borrow from existing classification systems and used labels like “caucasoid” and “negroid,” classifications that have roots in eugenics and scientific racism.
31%
I decided instead of looking at race I wanted to look for a more objective measure, which is when I started to focus on not just demographic attributes like gender and race, but also phenotypic attributes, namely the color of someone’s skin. Since face-based gender classification using photos relied on imaging technology, and since skin responds to light, focusing on the color of skin seemed to be a way to be more specific and objective. So, I began looking at ways people have classified skin.
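One widely used attempt at an objective skin-color measure, from dermatology rather than from the book itself, is the Individual Typology Angle (ITA), computed from the CIELAB L* (lightness) and b* (yellow-blue) values of a skin patch. The sketch below uses the standard ITA formula and the approximate category cut-offs commonly cited in the dermatology literature; it is illustrative of "ways people have classified skin," not necessarily the scale the author ultimately adopted.

```python
import math

def ita(L_star, b_star):
    """Individual Typology Angle in degrees: arctan((L* - 50) / b*)."""
    return math.degrees(math.atan2(L_star - 50.0, b_star))

def ita_category(angle):
    """Approximate ITA skin-tone categories (common dermatology cut-offs)."""
    if angle > 55:
        return "very light"
    if angle > 41:
        return "light"
    if angle > 28:
        return "intermediate"
    if angle > 10:
        return "tan"
    if angle > -30:
        return "brown"
    return "dark"

# Example: a patch with L* = 70, b* = 15 has ITA ≈ 53°, i.e. "light".
print(ita_category(ita(70.0, 15.0)))
```

Even this "objective" measure depends on imaging conditions (lighting, camera calibration), which is part of why skin-based measurement is more specific than racial labels but still not free of human choices.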
32%
Having the ability to define classification systems is in itself a power.
33%
IBM came under fire for its use of a subset of YFCC100M, a dataset released by Yahoo containing 100 million photos shared under a Creative Commons license on its Flickr platform.[8] Many people did not know their faces were being used in a research database that had been repurposed for IBM’s Diversity in Faces (DiF) dataset, which took a subset of about 1 million images from YFCC100M.
35%
Artists who already struggle to make a living based on their creative practice have expressed alarm at the use of their work to fuel generative AI systems.
36%
Even if the image is deleted, the many copies of the dataset already made still contain the images. This is why when Meta (then Facebook) announced the deletion of nearly 1 billion faceprints, there was pushback alongside the celebration. The celebration was around the fact that a major tech company deleting the faceprints was an acknowledgment of the risks associated with face-based technologies that many organizations, including the AJL, had been highlighting. But we do not know if someone at the organization made a secret backup copy, and we may never know. Facebook’s actions provided a ...more
36%
We should mandate deep data deletions to prevent the development of AI harms that stem from the collection of unconsented data.
36%
Because so much of modern life is based on interacting through the internet, and because governments increasingly encourage the use of digital systems, choosing not to use the internet is like attempting to live off-grid. At some point you will likely feel forced to participate, particularly in emergency situations.
36%
potholes
37%
The elephant can be perceived as many things depending on whether you touch the tail, the leg, or the trunk.
38%
The point remains: For machine learning models data is destiny, because the data provides the model with the representation of the world as curated by the makers of the system. Just as the kinds of labels that are chosen reflect human decisions, the kind of data that is made available is also a reflection of those who have the power to collect and decide which data is used to train a system.
38%
Guessing at a ground truth is already a sign you are on shaky ground.
38%
tween
38%
I had mixed feelings. On one hand, I was intentionally rejecting fitting gendered expectations about how I “should” look or sound as a “girl child.” So in that regard I had succeeded. However, having my name questioned compounded all the teasing I already faced and made me feel self-conscious.
38%
Social clubs and sororities established by elite Black organizations in the United States even used the brown paper bag test as a form of social exclusion. Being too black, darker than the brown paper bag, meant rejection based on skin color.[1]
39%
For people considered multiracial, there is also exclusion based on never quite belonging in any particular group. Knowing the impact of skin color and having particular sensitivity since my own skin is on the darker end of the spectrum, I also experienced the phenomenon of people being offended if they were perceived as darker than they perceived themselves to be.
39%
At home my mother filled our house with beautiful dark-skinned people, including photos of my relatives, and told me to ignore the nonsense. It was still painful to know that I was considered less than by some simply because of my skin color. Not to mention that the assumption that because I was from Africa I must have the darkest skin was ignorant of the ethnic and skin diversity across the continent.
39%
The subject of colorism can be taboo, as some see it as divisive in the push for racial justice. The cruelty of colorism is that it recapitulates social rejection and exclusion based on race into a hierarchy based on skin color.
39%
I wondered if that lack was because some of the leading Black voices, on the privileged end of colorism, did not see it as a topic worthy of discussion, were uncomfortable addressing their own color privilege, or saw the topic as dirty laundry to be kept out of the spotlight.
39%
Melanin comes in three flavors: eumelanin, pheomelanin, and neuromelanin. Eumelanin and pheomelanin affect skin color and hair color, while neuromelanin affects the color of the brain.
40%
AI reflects both our aspirations and our limitations. Our human limitations provide ample reasons for humility about the capabilities of the AI systems we create. Algorithmic justice necessitates questioning the arbiters of truth, because those with the power to build AI systems do not have a monopoly on truth.
40%
Getting on the bad side of one of these companies could have serious career consequences.
42%
lobbed
45%
bluegrass
47%
Longtermist thinking isn’t confined to late-night ruminations by eccentrics. For example, Oxford philosopher Nick Bostrom posed the paper clip thought experiment as a way of illustrating why he believes in the need to plan ways to safeguard against superintelligence that emerges from machines. The thought experiment goes as follows: When humans give an AI system a goal to reach, we do not have full control over how that AI system will reach that goal.
47%
Paper clip production, like computer vision systems mistaking blueberry cupcakes for chihuahuas, might seem like an inconsequential, trivial, or cute example at best.
48%
You can be excoded when a hospital uses AI for triage and leaves you without care, or uses a clinical algorithm that precludes you from receiving a life-saving organ transplant.[6] You can be excoded when you are denied a loan based on algorithmic decision-making.[7] You can be excoded when your résumé is automatically screened out and you are denied the opportunity to compete for the remaining jobs that are not replaced by AI systems.[8] You can be excoded when a tenant screening algorithm denies you access to housing.[9]
48%
No one is immune from being excoded, and those already marginalized are at greater risk.
49%
Angela Bassett, who was in her late fifties at the time of the photo, was estimated by IBM’s system to be between eighteen and twenty-four years old. (Maybe not all algorithmic bias was that bad.)
50%
Until making the white mask fail demo, I thought of tech demonstrations as celebrations of what machines could do. If a demonstration included a failure, the demo gods had failed you.
51%
The dehumanizing portrayal of largely undressed enslaved individuals in these “scientific” studies complemented the ongoing cultural and political denigration of Black people in the United States, justifying and naturalizing their subordination and brutalization. Aware of the power of images and the stories they can tell, Sojourner Truth used the power of photography to portray herself using the dress code associated with middle-class white women of the time. It was this image of Truth, intentionally wearing what were considered to be quintessentially feminine garments, that I submitted to ...more
51%
Truth became a powerful orator pushing for abolition and women’s rights while also pointing out contradictions in the arguments used by white women to justify these rights in her “Ain’t I A Woman?” speech. She used her voice to push for change, but she also used her image to support herself financially by selling cartes de visite (collectible cards with photographs and messages that were a form of mass communication in the 1860s). Beyond providing financial support, Truth’s images also provided support for ending slavery, joining the ongoing project of using photography to present Black people ...more
51%
Truth and Douglass skillfully reused influential technology to shatter dehumanizing portraits that were constructed using the same tools. They showed that counter-demos do not just demonstrate but also demolish assumptions by offering real-world examples that shake the status quo. Similarly, AI systems can be used as tools of oppression as well as tools of liberation.
51%
In 2009, YouTube user wzamen01 posted a viral video of an HP laptop with a face tracking feature. The video has received over 3 million views and more than sixty-five hundred comments at the time of writing. The video application shown was supposed to pan along with the movement of the face in the video stream. While the system worked fine for the person with lighter skin, in the frame referred to as “Wanda,” for the darker-skinned person, the pan feature did not work. The person referred to as “Desi” states, “I’m Black. I think my blackness is interfering with the computer’s ability to follow ...more
52%
FREDERICK DOUGLASS REMINDS US THAT the stories we evoke through imagery can allow those whose power does not rest on vast material resources to make change: Poets, prophets, and reformers are all picture-makers—and this ability is the secret of their power and of their achievements. They see what ought to be by the reflection of what is, and endeavor to remove the contradiction.[*2]
52%
I was concerned that the subjectivity of my poetry would be viewed in opposition to the objectivity of my technical research. If it appeared that I already had a conclusion in mind before gathering and analyzing the data, I risked being considered a biased and thus less credible researcher. My future as an AI researcher was at stake.
52%
On this day I would convene with world leaders and executives of tech companies to offer advice on how to manage the pitfalls of artificial intelligence and deliver a gift. But first I had to get into the building.