Kindle Notes & Highlights
suspicion, or judicial process of any kind.”10 Operation Talon and other initiatives like it use administrative data to turn social service offices into extensions of the criminal justice system.
Mobile and integrated administrative data can turn any street corner, any tent encampment, or any service provider into a site for a sting operation.
The unhoused in Los Angeles are thus faced with a difficult trade-off: admitting risky, or even illegal, behavior on the VI-SPDAT can snag you a higher ranking on the priority list for permanent supportive housing. But it can also open you up to law enforcement scrutiny. Coordinated entry is not just a system for managing information or matching demand to supply. It is a surveillance system for sorting and criminalizing the poor.
In contrast, in new data-based surveillance, the target often emerges from the data. The targeting comes after the data collection, not before.
In his prescient 1993 book, The Panoptic Sort, communication scholar Oscar Gandy of the University of Pennsylvania also suggests that automated sorting of digital personal information is a kind of triage. But he pushes further, pointing out that the term is derived from the French trier, which means to pick over, cull, or grade marketable produce.
As a system of moral valuation, coordinated entry is a machine for producing rationalization, for helping us convince ourselves that only the most deserving people are getting help. Those judged “too risky” are coded for criminalization. Those who fall through the cracks face prisons, institutions, or death.
The proponents of the coordinated entry system, like many who seek to harness computational power for social justice, tend to find affinity with systems engineering approaches to social problems. These perspectives assume that complex controversies can be solved by getting correct information where it needs to go as efficiently as possible. In this model, political conflict arises primarily from a lack of information. If we just gather all the facts, systems engineers assume, the correct answers to intractable policy problems like homelessness will be simple, uncontroversial, and widely …
Systems engineering can help manage big, complex social problems. But it doesn’t build houses, and it may not prove sufficient to overcome deep-seated prejudice against the poor, especially poor people of color. “Algorithms are intrinsically stupid,” said Gary Blasi, a public interest lawyer, homeless advocate, and emeritus professor of law at UCLA. “You can’t build any algorithm that can handle as many variables, and levels of nuance, and complexity as human beings present.”
“Fraud is too strong a word,” said Blasi. “But homelessness is not a systems engineering problem. It’s a carpentry problem.”
“I’m a criminal,” he said, “just for existing on the face of the earth.”
The AFST, like all risk models, offers only probabilities, not perfect prediction. Though it might be able to identify patterns and trends, it is routinely wrong about individual cases.
Data scientist Cathy O’Neil has written that “models are opinions embedded in mathematics.”8 Models are useful because they let us strip out extraneous information and focus only on what is most critical to the outcomes we are trying to predict. But they are also abstractions. Choices about what goes into them reflect the priorities and preoccupations of their creators. Human decision-making is reflected in three key components of the AFST: outcome variables, predictive variables, and validation data.
Outcome variables are what you measure to indicate the phenomenon you are trying to predict.
AFST uses two related variables—called proxies—as stand-ins for child maltreatment. The first proxy is community re-referral, when a call to the hotline about a child is initially screened out but the hotline receives another call about the same child within two years.
The second proxy is child placement, when a call to the hotline about a child is screened in and results in the child being placed in foster care within two years.
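To make the proxy construction concrete, here is a minimal sketch of how two such outcome variables could be derived from a table of hotline calls. The toy table, its column names (child_id, call_date, screened_in, placed_date), and the exact two-year window logic are assumptions for illustration, not the county's actual data schema or code.

```python
import pandas as pd

# Hypothetical hotline-call records; layout and column names are invented.
calls = pd.DataFrame({
    "child_id":    [1, 1, 2, 3, 3],
    "call_date":   pd.to_datetime(["2015-01-10", "2016-06-01", "2015-03-05",
                                   "2015-07-20", "2018-09-01"]),
    "screened_in": [False, True, True, False, False],
    "placed_date": pd.to_datetime([None, None, "2015-09-01", None, None]),
})

TWO_YEARS = pd.Timedelta(days=730)

rows = []
for child_id, group in calls.sort_values("call_date").groupby("child_id"):
    first, later = group.iloc[0], group.iloc[1:]
    # Proxy 1 ("community re-referral"): the first call is screened out,
    # but another call about the same child arrives within two years.
    re_referral = bool(
        (not first["screened_in"])
        and ((later["call_date"] - first["call_date"]) <= TWO_YEARS).any()
    )
    # Proxy 2 ("placement"): the call is screened in and the child is
    # placed in foster care within two years.
    placement = bool(
        first["screened_in"]
        and pd.notna(first["placed_date"])
        and (first["placed_date"] - first["call_date"]) <= TWO_YEARS
    )
    rows.append({"child_id": child_id,
                 "re_referral": re_referral,
                 "placement": placement})

outcomes = pd.DataFrame(rows)
print(outcomes)
```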
Predictive variables are the bits of data within a data set that are correlated with the outcome variables.
The model’s designers ran a statistical procedure called a stepwise probit regression, which adds or drops candidate predictive variables one at a time and keeps only those that measurably improve the model’s fit against the proxy outcomes.
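As a rough sketch of what a stepwise probit regression involves, the example below runs greedy forward selection with statsmodels: starting from an intercept-only probit model, it adds whichever candidate predictor most improves the fit (measured here by AIC) and stops when nothing helps. The synthetic data, the candidate variable names, and the AIC criterion are all assumptions for illustration; they are not the AFST's actual inputs or selection rule.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

# Synthetic administrative features; the names are hypothetical stand-ins.
X = pd.DataFrame({
    "prior_referrals": rng.poisson(1.0, n),
    "public_benefits": rng.integers(0, 2, n),
    "parent_age":      rng.normal(30, 6, n),
    "noise_feature":   rng.normal(0, 1, n),
})
# Synthetic proxy outcome, loosely driven by two of the features.
signal = 0.6 * X["prior_referrals"] + 0.8 * X["public_benefits"] - 2.0
y = (rng.random(n) < 1 / (1 + np.exp(-signal))).astype(int)

def forward_stepwise_probit(X, y):
    """Greedy forward selection of predictors for a probit model, by AIC."""
    selected, remaining = [], list(X.columns)
    best_aic = sm.Probit(y, np.ones((len(y), 1))).fit(disp=0).aic  # intercept only
    while remaining:
        # Try adding each remaining candidate and keep the best improvement.
        trials = [
            (sm.Probit(y, sm.add_constant(X[selected + [col]])).fit(disp=0).aic, col)
            for col in remaining
        ]
        aic, col = min(trials)
        if aic >= best_aic:        # no candidate improves the fit; stop
            break
        best_aic = aic
        selected.append(col)
        remaining.remove(col)
    return selected, best_aic

print(forward_stepwise_probit(X, y))
```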
Validation data is used to see how well your model performs.
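Continuing the synthetic example above, a minimal sketch of the validation step: fit the probit model on one slice of the data, then score a held-out slice and check how well the scores rank the actual outcomes (ROC AUC via scikit-learn). The 70/30 split and the AUC metric are illustrative assumptions, not the AFST's documented validation procedure.

```python
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
import statsmodels.api as sm

# X and y are the synthetic features and proxy outcome from the previous sketch.
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.3, random_state=0
)

model = sm.Probit(y_train, sm.add_constant(X_train)).fit(disp=0)
scores = model.predict(sm.add_constant(X_valid))  # predicted probabilities

# How well do the scores rank actual outcomes on data the model never saw?
print("validation AUC:", roc_auc_score(y_valid, scores))
```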
But Pat reminds me that I should be concerned with false negatives as well—when the AFST scores a child at low risk though the allegation or immediate risk to the child might be severe.
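Once a cutoff is applied to the scores, every error falls on one side or the other: a false positive (a family flagged although no outcome follows) or a false negative of the kind Pat describes (a child scored low risk despite real danger). A short sketch, reusing the held-out scores from the validation example above and an arbitrary threshold:

```python
from sklearn.metrics import confusion_matrix

# scores and y_valid come from the validation sketch above.
THRESHOLD = 0.5                      # arbitrary cutoff, for illustration only
flagged = scores >= THRESHOLD

tn, fp, fn, tp = confusion_matrix(y_valid, flagged).ravel()
print(f"false positives (flagged, but no outcome followed): {fp}")
print(f"false negatives (scored low risk, but outcome occurred): {fn}")
```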
Because variables describing their behavior have not been defined or included in the regression, crucial pieces of the child maltreatment puzzle might be omitted from the AFST. It could be missing the crucial “summer” variable that links ice cream and shark attacks.
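The ice-cream-and-shark-attacks image describes omitted-variable bias: two measures that both depend on a third, unmeasured factor look linked to each other when that factor is left out of the model. A small simulation with invented numbers shows the effect: the raw correlation is strong, but it disappears once the hidden "summer" variable is held fixed.

```python
import numpy as np

rng = np.random.default_rng(1)
days = 1000
summer = rng.integers(0, 2, days)          # the omitted "season" variable

# Both series depend on summer, not on each other.
ice_cream_sales = 50 + 40 * summer + rng.normal(0, 5, days)
shark_attacks = rng.poisson(0.2 + 1.0 * summer)

# Pooled over all days the two look strongly related...
print("overall corr:", np.corrcoef(ice_cream_sales, shark_attacks)[0, 1])
# ...but the link disappears once the omitted variable is held fixed.
for s in (0, 1):
    mask = summer == s
    print(f"corr when summer={s}:",
          np.corrcoef(ice_cream_sales[mask], shark_attacks[mask])[0, 1])
```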
The AFST focuses all its predictive power and computational might on call screening, the step it can experimentally control, rather than concentrating on referral, the step where racial disproportionality is actually entering the system.
In other words, the activity that introduces the most racial bias into the system is the very way the model defines maltreatment.
But unlike other historically disadvantaged groups, the poor are not widely recognized as a legally protected class, so the disproportionate and discriminatory attention paid to poor families by child welfare offices goes largely unchallenged.
Nearly all of the indicators of child neglect are also indicators of poverty: lack of food, inadequate housing, unlicensed childcare, unreliable transportation, utility shutoffs, homelessness, lack of health care. “The vast, vast majority of cases are neglect, stem[ming] from people who have difficult, unsafe neighborhoods to live in,” said Catherine Volponi, director of the Juvenile Court Project, which provides pro bono legal support for parents facing CYF investigation or termination of their parental rights. “We have housing issues, we have inadequate medical care, we have drugs and …
We might call this poverty profiling. Like racial profiling, poverty profiling targets individuals for extra scrutiny based not on their behavior but rather on a personal characteristic: living in poverty. Because the model confuses parenting while poor with poor parenting, the AFST views parents who reach out to public programs as risks to their children.
Sarah’s schedule is filled with appointments with helping professionals she needs to please with displays of servility.
Parenting while poor means parenting in public.
Because poor and working-class families ask for public resources, the same willingness to reach out for support labels them as a risk to their children in the AFST, even though CYF sees requesting resources as a positive attribute of parents.17
Cherna’s administration wants to identify those families who could use help earlier, when interventions could make the most difference. But community members wonder if data collected with the best of intentions might be used against them in the future.
But the assumption that academics speaking out against the way their research is used will have a significant impact on public policy or agency practice is naïve.
Our denial runs deep. It is the only way to explain a basic fact about the United States: in the world’s largest economy, the majority of us will experience poverty at some point in our lives.
Our public policy fixates on attributing blame for poverty rather than remedying its effects or abolishing its causes. The obsession with “personal responsibility” makes our social safety net conditional on being morally blameless. As political theorist Yascha Mounk argues in his 2017 book, The Age of Responsibility, our vast and expensive public service bureaucracy primarily functions to investigate whether individuals’ suffering might be their own fault.
This myopic focus on what’s new leads us to miss the important ways that digital tools are embedded in old systems of power and privilege.
While automated social exclusion is growing across the country, it has key weaknesses as a strategy of class-based oppression. So, when direct diversion fails, the digital poorhouse creates something more insidious: a moral narrative that criminalizes most of the poor while providing life-saving resources to a lucky handful.
This is a recipe for law enforcement fishing expeditions. The integration of policing and homeless services blurs the boundary between the maintenance of economic security and the investigation of crime, between poverty and criminality, tightening a net of constraint that tracks and traps the unhoused. This net requires data-based infrastructure to surround and systems of moral classification to sift.
The digital poorhouse doesn’t just exclude, it sweeps millions of people into a system of control that compromises their humanity and their self-determination.
Prediction, unlike classification, is intergenerational.
The impacts of predictive models are thus exponential. Because prediction relies on networks and spans generations, its harm has the potential to spread like a contagion, from the initial point of contact to relatives and friends, to friends’ networks, rushing through whole communities like a virus.
Just as the county poorhouse was suited to the Industrial Revolution, and scientific charity was uniquely appropriate for the Progressive Era, the digital poorhouse is adapted to the particular circumstances of our time.
Today, the digital poorhouse responds to what Barbara Ehrenreich has described as a “fear of falling” in the professional middle class. Desperate to preserve their status in the face of the collapse of the working class below them, the grotesque expansion of wealth above them, and the increasing demographic diversity of the country, Ehrenreich writes, the white professional middle class has largely abandoned ideals of justice, equity, and fairness.3
Containment in the physical institution of a county poorhouse had the unintentional result of creating class solidarity across race, gender, and national origin. When we sit at a common table, we might see similarities in our experiences, even if we are forced to eat gruel. Surveillance and digital social sorting drive us apart as smaller and smaller microgroups are targeted for different kinds of aggression and control.
were difficult to scale.
Google’s infrastructure has been integrated into so many systems that it has an internal momentum that is hard to arrest.
We create a society that has no use for the disabled or the elderly, and then are cast aside when we are hurt or grow old. We measure human worth based only on the ability to earn a wage, and suffer in a world that undervalues care and community. We base our economy on exploiting the labor of racial and ethnic minorities, and watch lasting inequities snuff out human potential. We see the world as inevitably riven by bloody competition and are left unable to recognize the many ways we cooperate and lift each other up.
While the algorithms that drive this target-marketing don’t explicitly use race to make decisions—a practice outlawed by the Fair Housing Act of 1968—a category like “Ethnic Second-City Strugglers” is clearly a proxy for both race and class.6 Disadvantaged communities are then targeted for subprime lending, payday loans, or other exploitative financial products.
I asked Bruce Noel, the regional office director in Allegheny County, if he’s concerned that the intake workers he manages might be training an algorithm that will eventually replace them. “No,” he insisted. “There will never be a replacement for that human being and that connection.” But in a very real sense, humans have already been removed from the driver’s seat of human services.
freedom from

