Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor
The housing first approach emerges instead from the understanding that it is difficult to attend to other challenges if you are not stably housed.
Kathy Reid: An order of precedence
suspicion, or judicial process of any kind.”10 Operation Talon and other initiatives like it use administrative data to turn social service offices into extensions of the criminal justice system.
Mobile and integrated administrative data can turn any street corner, any tent encampment, or any service provider into a site for a sting operation.
The unhoused in Los Angeles are thus faced with a difficult trade-off: admitting risky, or even illegal, behavior on the VI-SPDAT can snag you a higher ranking on the priority list for permanent supportive housing. But it can also open you up to law enforcement scrutiny. Coordinated entry is not just a system for managing information or matching demand to supply. It is a surveillance system for sorting and criminalizing the poor.
In contrast, in new data-based surveillance, the target often emerges from the data. The targeting comes after the data collection, not before.
In his prescient 1993 book, The Panoptic Sort, communication scholar Oscar Gandy of the University of Pennsylvania also suggests that automated sorting of digital personal information is a kind of triage. But he pushes further, pointing out that the term is derived from the French trier, which means to pick over, cull, or grade marketable produce.
As a system of moral valuation, coordinated entry is a machine for producing rationalization, for helping us convince ourselves that only the most deserving people are getting help. Those judged “too risky” are coded for criminalization. Those who fall through the cracks face prisons, institutions, or death.
The proponents of the coordinated entry system, like many who seek to harness computational power for social justice, tend to find affinity with systems engineering approaches to social problems. These perspectives assume that complex controversies can be solved by getting correct information where it needs to go as efficiently as possible. In this model, political conflict arises primarily from a lack of information. If we just gather all the facts, systems engineers assume, the correct answers to intractable policy problems like homelessness will be simple, uncontroversial, and widely …
Systems engineering can help manage big, complex social problems. But it doesn’t build houses, and it may not prove sufficient to overcome deep-seated prejudice against the poor, especially poor people of color. “Algorithms are intrinsically stupid,” said public interest lawyer, homeless advocate, and emeritus professor of law at UCLA Gary Blasi. “You can’t build any algorithm that can handle as many variables, and levels of nuance, and complexity as human beings present.”
“Fraud is too strong a word,” said Blasi. “But homelessness is not a systems engineering problem. It’s a carpentry problem.”
“I’m a criminal,” he said, “just for existing on the face of the earth.”
The AFST, like all risk models, offers only probabilities, not perfect prediction. Though it might be able to identify patterns and trends, it is routinely wrong about individual cases.
Data scientist Cathy O’Neil has written that “models are opinions embedded in mathematics.”8 Models are useful because they let us strip out extraneous information and focus only on what is most critical to the outcomes we are trying to predict. But they are also abstractions. Choices about what goes into them reflect the priorities and preoccupations of their creators. Human decision-making is reflected in three key components of the AFST: outcome variables, predictive variables, and validation data.
Outcome variables are what you measure to indicate the phenomenon you are trying to predict.
AFST uses two related variables—called proxies—as stand-ins for child maltreatment. The first proxy is community re-referral,
The second proxy is child placement, when a call to the hotline about a child is screened in and results in the child being placed in foster care within two years.
Predictive variables are the bits of data within a data set that are correlated with the outcome variables.
ran a statistical procedure called a stepwise probit regression,
Validation data is used to see how well your model performs.
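These highlights describe the standard anatomy of a model like this: an outcome variable (here, a proxy for maltreatment), predictive variables drawn from administrative data, and validation data held out to check performance. The sketch below is a minimal illustration of how those pieces fit together, assuming Python with numpy and statsmodels; the variable names and data are invented, the forward-stepwise selection is only one common variant of a stepwise probit regression, and none of this is the AFST's actual code or data.

```python
# Minimal, illustrative sketch (not the AFST's actual code or variables):
# it shows how outcome variable, predictive variables, and validation data
# fit together. Synthetic data stands in for administrative records.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

# Predictive variables: columns of the data set (illustrative names only).
X = np.column_stack([
    rng.normal(size=n),   # e.g. prior calls to the hotline
    rng.normal(size=n),   # e.g. length of benefit receipt
    rng.normal(size=n),   # e.g. household size
])
# Outcome variable: a binary proxy (e.g. "re-referred within two years").
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 1).astype(int)

# Hold out validation data to see how well the model performs.
X_train, X_valid = X[:1500], X[1500:]
y_train, y_valid = y[:1500], y[1500:]

# Forward stepwise selection for a probit model, choosing variables by AIC.
selected, remaining = [], list(range(X.shape[1]))
current_aic = np.inf
while remaining:
    aics = []
    for j in remaining:
        cols = selected + [j]
        fit = sm.Probit(y_train, sm.add_constant(X_train[:, cols])).fit(disp=0)
        aics.append((fit.aic, j))
    best_aic, best_j = min(aics)
    if best_aic >= current_aic:
        break  # no candidate improves the model; stop adding variables
    current_aic = best_aic
    selected.append(best_j)
    remaining.remove(best_j)

# Validate: the model yields predicted probabilities, not certainties.
final = sm.Probit(y_train, sm.add_constant(X_train[:, selected])).fit(disp=0)
probs = final.predict(sm.add_constant(X_valid[:, selected]))
accuracy = ((probs > 0.5) == y_valid).mean()
print(f"selected variables: {selected}, validation accuracy: {accuracy:.2f}")
```

Even in this toy version, the fitted model returns only probabilities for the held-out cases, which is the point the surrounding highlights make: a model can capture aggregate patterns while remaining routinely wrong about individuals.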
But Pat reminds me that I should be concerned with false negatives as well—when the AFST scores a child at low risk though the allegation or immediate risk to the child might be severe.
Because variables describing their behavior have not been defined or included in the regression, crucial pieces of the child maltreatment puzzle might be omitted from the AFST. It could be missing the crucial “summer” variable that links ice cream and shark attacks.
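A small synthetic example makes the omitted-variable point concrete. It is purely illustrative (the numbers are made up), but it shows how two series driven by a common, unmeasured factor, the “summer” of the metaphor, can look strongly related until that factor is brought back into the analysis.

```python
# Purely illustrative, synthetic numbers: two series that both rise in
# summer look strongly correlated overall, but the link disappears once
# the omitted "summer" variable is taken into account.
import numpy as np

rng = np.random.default_rng(1)
day = np.arange(365)
summer = (day > 150) & (day < 240)          # roughly June through August

ice_cream = 50 + 40 * summer + rng.normal(0, 5, day.size)    # sales rise in summer
sharks = 0.2 + 1.5 * summer + rng.normal(0, 0.3, day.size)   # attacks rise in summer

print("overall correlation:", round(np.corrcoef(ice_cream, sharks)[0, 1], 2))
for label, mask in [("summer only", summer), ("rest of year", ~summer)]:
    r = np.corrcoef(ice_cream[mask], sharks[mask])[0, 1]
    print(label, "correlation:", round(r, 2))
```

Conditioning on the omitted variable collapses the apparent relationship, which is the risk the passage describes when crucial variables are never defined or included in the regression.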
The AFST focuses all its predictive power and computational might on call screening, the step it can experimentally control, rather than concentrating on referral, the step where racial disproportionality is actually entering the system.
In other words, the activity that introduces the most racial bias into the system is the very way the model defines maltreatment.
But unlike other historically disadvantaged groups, the poor are not widely recognized as a legally protected class, so the disproportionate and discriminatory attention paid to poor families by child welfare offices goes largely unchallenged.
Nearly all of the indicators of child neglect are also indicators of poverty: lack of food, inadequate housing, unlicensed childcare, unreliable transportation, utility shutoffs, homelessness, lack of health care. “The vast, vast majority of cases are neglect, stem[ming] from people who have difficult, unsafe neighborhoods to live in,” said Catherine Volponi, director of the Juvenile Court Project, which provides pro bono legal support for parents facing CYF investigation or termination of their parental rights. “We have housing issues, we have inadequate medical care, we have drugs and …
We might call this poverty profiling. Like racial profiling, poverty profiling targets individuals for extra scrutiny based not on their behavior but rather on a personal characteristic: living in poverty. Because the model confuses parenting while poor with poor parenting, the AFST views parents who reach out to public programs as risks to their children.
Sarah’s schedule is filled with appointments with helping professionals she needs to please with displays of servility.
Parenting while poor means parenting in public.
The same willingness to reach out for support by poor and working-class families, because they are asking for public resources, labels them a risk to their children in the AFST, even though CYF sees requesting resources as a positive attribute of parents.17
Cherna’s administration wants to identify those families who could use help earlier, when interventions could make the most difference. But community members wonder if data collected with the best of intentions might be used against them in the future.
But the assumption that academics speaking out against the way their research is used will have a significant impact on public policy or agency practice is naïve.
Our denial runs deep. It is the only way to explain a basic fact about the United States: in the world’s largest economy, the majority of us will experience poverty.
Our public policy fixates on attributing blame for poverty rather than remedying its effects or abolishing its causes. The obsession with “personal responsibility” makes our social safety net conditional on being morally blameless. As political theorist Yascha Mounk argues in his 2017 book, The Age of Responsibility, our vast and expensive public service bureaucracy primarily functions to investigate whether individuals’ suffering might be their own fault.
This myopic focus on what’s new leads us to miss the important ways that digital tools are embedded in old systems of power and privilege.
While automated social exclusion is growing across the country, it has key weaknesses as a strategy of class-based oppression. So, when direct diversion fails, the digital poorhouse creates something more insidious: a moral narrative that criminalizes most of the poor while providing life-saving resources to a lucky handful.
This is a recipe for law enforcement fishing expeditions. The integration of policing and homeless services blurs the boundary between the maintenance of economic security and the investigation of crime, between poverty and criminality, tightening a net of constraint that tracks and traps the unhoused. This net requires data-based infrastructure to surround and systems of moral classification to sift.
The digital poorhouse doesn’t just exclude, it sweeps millions of people into a system of control that compromises their humanity and their self-determination.
Under the new regime of prediction, you are impacted not only by your own actions, but by the actions of your lovers, housemates, relatives, and neighbors.
Kathy Reid: Relationality
Prediction, unlike classification, is intergenerational.
The impacts of predictive models are thus exponential. Because prediction relies on networks and spans generations, its harm has the potential to spread like a contagion, from the initial point of contact to relatives and friends, to friends’ networks, rushing through whole communities like a virus.
Just as the county poorhouse was suited to the Industrial Revolution, and scientific charity was uniquely appropriate for the Progressive Era, the digital poorhouse is adapted to the particular circumstances of our time.
Today, the digital poorhouse responds to what Barbara Ehrenreich has described as a “fear of falling” in the professional middle class. Desperate to preserve their status in the face of the collapse of the working class below them, the grotesque expansion of wealth above them, and the increasing demographic diversity of the country, Ehrenreich writes, the white professional middle class has largely abandoned ideals of justice, equity, and fairness.3
Containment in the physical institution of a county poorhouse had the unintentional result of creating class solidarity across race, gender, and national origin. When we sit at a common table, we might see similarities in our experiences, even if we are forced to eat gruel. Surveillance and digital social sorting drive us apart as smaller and smaller microgroups are targeted for different kinds of aggression and control.
were difficult to scale.
Google’s infrastructure has been integrated into so many systems that it has an internal momentum that is hard to arrest.
We create a society that has no use for the disabled or the elderly, and then are cast aside when we are hurt or grow old. We measure human worth based only on the ability to earn a wage, and suffer in a world that undervalues care and community. We base our economy on exploiting the labor of racial and ethnic minorities, and watch lasting inequities snuff out human potential. We see the world as inevitably riven by bloody competition and are left unable to recognize the many ways we cooperate and lift each other up.
While the algorithms that drive this target-marketing don’t explicitly use race to make decisions—a practice outlawed by the Fair Housing Act of 1968—a category like “Ethnic Second-City Strugglers” is clearly a proxy for both race and class.6 Disadvantaged communities are then targeted for subprime lending, payday loans, or other exploitative financial products.
I asked Bruce Noel, the regional office director in Allegheny County, if he’s concerned that the intake workers he manages might be training an algorithm that will eventually replace them. “No,” he insisted. “There will never be a replacement for that human being and that connection.” But in a very real sense, humans have already been removed from the driver’s seat of human services.
freedom from