Race After Technology Quotes

Race After Technology: Abolitionist Tools for the New Jim Code by Ruha Benjamin
2,435 ratings, 4.26 average rating, 308 reviews
Race After Technology Quotes Showing 1-30 of 53
“Invisibility, with regard to Whiteness, offers immunity. To be unmarked by race allows you to reap the benefits but escape responsibility for your role in an unjust system.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“the New Jim Code”: the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era.8”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“In a classic study of how names impact people’s experience on the job market, researchers show that, all other things being equal, job seekers with White-sounding first names received 50 percent more callbacks from employers than job seekers with Black-sounding names.5 They calculated that the racial gap was equivalent to eight years of relevant work experience, which White applicants did not actually have; and the gap persisted across occupations, industry, employer size – even when employers included the “equal opportunity” clause in their ads.6 With emerging technologies we might assume that racial bias will be more scientifically rooted out. Yet, rather than challenging or overcoming the cycles of inequity, technical fixes too often reinforce and even deepen the status quo. For example, a study by a team of computer scientists at Princeton examined whether a popular algorithm, trained on human writing online, would exhibit the same biased tendencies that psychologists have documented among humans. They found that the algorithm associated White-sounding names with “pleasant” words and Black-sounding names with “unpleasant” ones.7 Such findings demonstrate what I call “the New Jim Code”: the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“Better never means better for everyone … It always means worse, for some.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“by focusing mainly on individuals’ identities and overlooking the norms and structures of the tech industry, many diversity initiatives offer little more than cosmetic change, demographic percentages on a company pie chart, concealing rather than undoing the racist status quo.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“If blind people admit to seeing race, why do sighted people pretend not to see it?”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“Ultimately the danger of the New Jim Code positioning is that existing social biases are reinforced – yes. But new methods of social control are produced as well. Does this mean that every form of technological prediction or personalization has racist effects? Not necessarily. It means that, whenever we hear the promises of tech being extolled, our antennae should pop up to question what all that hype of “better, faster, fairer” might be hiding and making us ignore. And, when bias and inequity come to light, “lack of intention” to harm is not a viable alibi. One cannot reap the reward when things go right but downplay responsibility when they go wrong.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“Zeros and ones, if we are not careful, could deepen the divides between haves and have-nots, between the deserving and the undeserving – rusty value judgments embedded in shiny new systems.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“As Kamal Sinclair – emerging media researcher and artist – posits: Story and narrative are the code for humanity’s operating system. We have used stories to communicate knowledge, prescribe behavior, and imagine our futures since our earliest days. Story and narrative inform how we design everything from technology to social systems. They shape the norms in which we perform our identities, even perhaps the mutations of our DNA and perceptions of reality. Stories are the first step in the process of how we imagine our reality; they literally make our reality.84”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“That new tools are coded in old biases is surprising only if we equate technological innovation with social progress.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“As Margaret Atwood writes, “Better never means better for everyone … It always means worse, for some.””
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“The selective outrage follows longstanding patterns of neglect and normalizes anti-Blackness as the weather, as Christina Sharpe notes, whereas non-Black suffering is treated as a disaster.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“How nice it must be … to be so tired of living mortally that one dreams of immortality. Like so many other “posts” (postracial, postcolonial, etc.), posthumanism grows out of the Man’s experience.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“Social and legal codes, like their byte-size counterparts, are not neutral; nor are all codes created equal. They reflect particular perspectives and forms of social organization that allow some people to assert themselves – their assumptions, interests, and desires – over others. From the seemingly mundane to the extraordinary, technical systems offer a mirror to the wider terrain of struggle over the forces that govern our lives.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“It is one thing to capitalize on the coolness of a Black artist to sell (overpriced) products and quite another to engage the cultural specificity of Black people enough to enhance the underlying design of a widely used technology. This is why the notion that tech bias is “unintentional” or “unconscious” obscures the reality – that there is no way to create something without some intention and intended user in mind”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“While “Luddite” is often used today as a term of disparagement for anyone who is presumed to oppose (or even question!) automation, the Luddite response was actually directed at the manner in which machinery was rolled out, without consideration for its negative impact on workers and society overall.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“Take, for instance, a parody project that begins by subverting the anti-Black logics embedded in new high-tech approaches to crime prevention (Figure 5.2). Instead of using predictive policing techniques to forecast street crime, the White-Collar Early Warning System flips the script by creating a heat map that flags city blocks where financial crimes are likely to occur.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“In The Enigma of Diversity, sociologist Ellen Berrey explains how those who champion a flat conception of diversity have “redefined racial progress for the post-civil rights era, from a legal fight for equal rights to a celebration of cultural difference as a competitive advantage” in which everyone (theoretically) wins.24 More pointedly, the late political scientist Lee Ann Fujii reminds us that the lack of genuine diversity “maintains a racialized way of seeing the world. This lens of default Whiteness is assumed to be neutral, unraced, and ungendered, and therefore ‘scientifically’ sound. But Whiteness is anything but.”25 Finally, legal scholar Nancy Leong details how institutions commodify racial diversity for their own benefit, defining, in her analysis, racial capitalism as “the process of deriving social and economic value from the racial identity of another person.”26”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“During an interview with Diversity Inc.’s director of research and product development, she walked me through a typical presentation used to pitch the value of the company’s software to prospective clients. I learned that their products are especially valuable to those industries not allowed to collect ethno-racial data directly from individuals because of civil rights legislation that attempts to curb how these data are used to discriminate. But now those who work in finance, housing, and healthcare can use predictive software programs to ascertain information that they cannot request directly. The US Health Insurance Portability and Accountability Act (HIPAA) privacy rule, for example, strictly monitors the collection, storage, and communication of individuals’ “protected health information,” among other features of the law. This means that pharmaceutical companies, which market to different groups, need indirect methods to create customer profiles, because they cannot collect racial-ethnic data directly. This is where Diversity Inc. comes in. Its software programs target customers not only on the basis of race and ethnicity, but also on the basis of socioeconomic status, gender, and a growing list of other attributes. However, the company does not refer to “race” anywhere in their product descriptions. Everything is based on individuals’ names, we are told. “A person’s name is data,” according to the director of research and product development. She explains that her clients typically supply Diversity Inc. with a database of client names and her team builds knowledge around it. The process, she says, has a 96 percent accuracy rate, because so many last names are not shared across racial–ethnic groups – a phenomenon sociologists call “cultural segregation.”18”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“After it was revealed that the Miami Police Department used images of Black men for target practice, a movement of clergy and other activists initiated the hashtag #UseMeInstead, circulating their own, predominantly White photos. In another form of subversive visualization, activists called out the way media outlets circulate unflattering photos of Black youths murdered by police or White vigilantes. They used the hashtag #IfTheyGunnedMeDown and asked the question “Which one would they use?” with dueling photos of themselves looking stereotypically “thuggish” (e.g. not smiling, wearing a hoodie, throwing up hand signs, smoking, or holding alcohol) and “respectable” (e.g. smiling, wearing a graduation gown or suit, playing with a baby, or wearing a military uniform).”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“In this way DNA Dreams brings to life the dystopian nightmare we encounter in the 1997 film Gattaca, in which the main character Vincent, played by Ethan Hawke, narrates: “I belonged to a new underclass, no longer determined by social status or the color of your skin. No, we have discrimination down to a science.”46 As in so much science fiction, the Whiteness of the main protagonist is telling. Not only does it deflect attention away from the fact that, in the present, many people already live a version of the dystopia represented in the film in future tense. The “unbearable Whiteness” of sci-fi expresses itself in the anxiety underlying so many dystopian visions that, if we keep going down this road, “We’re next.”47 Whether it’s Keanu Reeves in The Matrix, Matt Damon in Elysium, Chris Evans in Snowpiercer – all characters whose Whiteness, maleness, straightness, and (let’s just admit) cuteness would land them at the top of the present social order – they all find themselves in a fictional future among the downtrodden. Viewers, in turn, are compelled to identify with the future oppression of subordinated White people without necessarily feeling concern for the “old” underclasses in our midst.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“As public schools in the United States began desegregating and students of different skin tones were photographed for yearbooks in the same frame, the technical fixes that could be employed when a Black child was photographed alone were not useful. In particular, Black parents, objecting to the fact that their children’s facial features were rendered blurry, demanded higher-quality images.20 But the photographic industry did not fully take notice until companies that manufactured brown products like chocolate and wooden furniture began complaining that photographs did not depict their goods with enough subtlety, showcasing the varieties of chocolate and of grains in wood.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“The show brilliantly depicts how the default Whiteness of tech development, a superficial corporate diversity ethos, and the prioritization of efficiency over equity work together to ensure that innovation produces social containment.5 The fact that Black employees are unable to use the elevators, doors, and water fountains or turn the lights on is treated as a minor inconvenience in service to a greater good. The absurdity goes further when, rather than removing the sensors, the company “blithely installs separate, manually operated drinking fountains for the convenience of the black employees,”6 an incisive illustration of the New Jim Code wherein tech advancement, posed as a solution, conjures a prior racial regime in the form of separate water fountains.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“Men are shown ads for high-income jobs much more frequently than are women, and tutoring for what is known in the United States as the Scholastic Aptitude Test (SAT) is priced more highly for customers in neighborhoods with a higher density of Asian residents: “From retail to real estate, from employment to criminal justice, the use of data mining, scoring and predictive software … is proliferating … [And] when software makes decisions based on data, like a person’s zip code, it can reflect, or even amplify, the results of historical or institutional discrimination.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“Take the credit associated with the aforementioned categories of playing video games and buying diapers. There are many ways to parse the values embedded in the distinction between the “idle” and the “responsible” citizen so that it lowers the scores of gamers and increases the scores of diaper changers. There is the ableist logic, which labels people who spend a lot of time at home as “unproductive,” whether they play video games or deal with a chronic illness; the conflation of economic productivity and upright citizenship is ubiquitous across many societies.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“Like segregated water fountains of a previous era, the discriminatory soap dispenser offers a window onto a wider social terrain.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“As it happens, the term “stereotype” offers a useful entry point for thinking about the default settings of technology and society. It first referred to a practice in the printing trade whereby a solid plate called a “stereo” (from the ancient Greek adjective stereos, “firm,” “solid”) was used to make copies. The duplicate was called a “stereotype.”47 The term evolved; in 1850 it designated an “image perpetuated without change” and in 1922 was taken up in its contemporary iteration, to refer to shorthand attributes and beliefs about different groups. The etymology of this term, which is so prominent in everyday conceptions of racism, urges a more sustained investigation of the interconnections between technical and social systems.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“The intention to harm or exclude may guide some technical design decisions. Yet even when they do, these motivations often stand in tension with aims framed more benevolently. Even police robots who can use lethal force while protecting officers from harm are clothed in the rhetoric of public safety.35 This is why we must separate “intentionality” from its strictly negative connotation in the context of racist practices, and examine how aiming to “do good” can very well coexist with forms of malice and neglect.36 In fact a do-gooding ethos often serves as a moral cover for harmful decisions. Still, the view that ill intent is always a feature of racism is common: “No one at Google giggled while intentionally programming its software to mislabel black people.”37 Here McWhorter is referring to photo-tagging software that classified dark-skinned users as “gorillas.” Having discovered no bogeyman behind the screen, he dismisses the idea of “racist technology” because that implies “designers and the people who hire them are therefore ‘racists.’” But this expectation of individual intent to harm as evidence of racism is one that scholars of race have long rejected.38”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“Consider the court decision in the case against one Mr. Henry Davis, who was charged with destruction of property for bleeding on police uniforms after officers incorrectly identified him as having an outstanding warrant and then beat him into submission: On and/or about the 20th day of September 20, 2009 at or near 222 S. Florissant within the corporate limits of Ferguson, Missouri, the above-named defendant did then and there unlawfully commit the offense of “property damage” to wit did transfer blood to the uniform.80 When Davis sued the officers, the judge tossed out the case, saying: “a reasonable officer could have believed that beating a subdued and compliant Mr. Davis while causing a concussion, scalp lacerations, and bruising with almost no permanent damage, did not violate the Constitution.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
“Such fictional accounts find their real-life counterpart in “electronic sweatshops,” where companies such as Apple, HP, and Dell treat humans like automata, reportedly requiring Chinese workers to complete tasks every three seconds over a 12-hour period, without speaking or using the bathroom.”
Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code
